High-throughput computing as an enabler of black hole science
The stunning new image of a supermassive black hole in the center of the Milky Way was created by eight telescopes, 300 international astronomers and more than 5 million computational tasks. This Morgridge Institute article describes how the Wisconsin-based Open Science Pool helped make sense of it all.
Expanding, uniting, and enhancing CLAS12 computing with OSG’s fabric of services
A mutually beneficial partnership between Jefferson Lab and the OSG Consortium, at both the organizational and individual levels, has delivered substantial benefits for the CLAS12 Experiment.
NIAID/ACE - OSG collaboration leads to a successful virtual training session
The U.S. National Institute of Allergy and Infectious Diseases (NIAID) and the African Centers for Excellence in Bioinformatics and Data-Intensive Science (ACE) partnered with the OSG Consortium to host a virtual high throughput computing training session for graduate students from Makerere University and the University of Sciences, Techniques, and Technologies of Bamako (USTTB).
Learning and adapting with OSG: Investigating the strong nuclear force
David Swanson Memorial Award winner Connor Natzke’s journey with the OSG Consortium began in 2019, when he attended the OSG User School as a student. Today, nearly three years later, Natzke has executed 600,000 simulations with the help of OSG staff and prior OSG programming. These simulations, each submitted as a job, logged over 135,000 core hours provided by the Open Science Pool (OSPool). Natzke’s history with the OSG Consortium reflects a pattern of learning, adapting, and improving that translates to the acceleration and expansion of scientific discovery.
80,000 jobs, 40 billion base pairs, and 20 bats – all in 4 weeks
An evolutionary biologist at the American Museum of Natural History (AMNH) used HTC services provided by the OSG to unlock a genomic basis for convergent evolution in bats.
Antimatter: Using HTC to study very rare processes
Anirvan Shukla, a User School participant in 2016, spoke at this year’s Showcase about how high throughput computing has transformed his research on antimatter over the last five years.
Using HTC for a simulation study on cross-validation for model evaluation in psychological science
During the OSG School Showcase, Hannah Moshontz, a postdoctoral fellow in UW-Madison’s Department of Psychology, described her experience using high throughput computing (HTC) for the very first time while taking on an entirely new project within the field of psychology.
Transforming research with high throughput computing
During the OSG Virtual School Showcase, three different researchers shared how high throughput computing has made lasting impacts on their work.
Scaling virtual screening to ultra-large virtual chemical libraries
Kicking off last week’s OSG User School Showcase, Spencer Ericksen, a researcher at the University of Wisconsin-Madison’s Carbone Cancer Center, described how high throughput computing (HTC) has made his work in early-stage drug discovery vastly more scalable.
OSG fuels a student-developed computing platform to advance RNA nanomachines
How undergraduates at the University of Nebraska-Lincoln developed a science gateway that enables researchers to build RNA nanomachines for therapeutic, engineering, and basic science applications.
How to Transfer 460 Terabytes? A File Transfer Case Study
Greg Daues at the National Center for Supercomputing Applications (NCSA) needed to transfer 460 terabytes of files from the National Institute of Nuclear and Particle Physics (IN2P3) in Lyon, France, to Urbana, Illinois, for a project with FNAL, CC-IN2P3, and the Rubin Data Production team. He turned to the HTCondor High Throughput Computing system, not to run computationally intensive jobs, as many do, but to manage the hundreds of thousands of I/O-bound transfers.
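The pattern described here, scheduling I/O-bound transfer jobs through HTCondor rather than compute jobs, can be sketched with a minimal submit description file. This is an illustrative sketch, not the team's actual configuration: the wrapper script `transfer_one.sh` and the manifest `manifest.txt` are hypothetical.

```
# Hypothetical HTCondor submit description for I/O-bound transfer jobs.
# transfer_one.sh (assumed) copies one remote file to a local destination.
universe        = vanilla
executable      = transfer_one.sh
arguments       = $(remote_url) $(local_dest)

# Modest resource requests: these jobs wait on the network, not the CPU.
request_cpus    = 1
request_memory  = 512MB

log             = transfer.log
output          = out/$(Cluster).$(Process).out
error           = err/$(Cluster).$(Process).err

# Queue one job per line of a manifest listing "remote_url, local_dest" pairs,
# letting HTCondor throttle, retry, and track hundreds of thousands of transfers.
queue remote_url, local_dest from manifest.txt
```

Treating each transfer as a job gives you HTCondor's bookkeeping for free: failed transfers show up as held or restarted jobs rather than silent gaps in the dataset.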
SDSC and IceCube Center Conduct GPU Cloudburst Experiment
The San Diego Supercomputer Center (SDSC) and the Wisconsin IceCube Particle Astrophysics Center (WIPAC) at the University of Wisconsin–Madison successfully completed a computational experiment as part of a multi-institution collaboration that marshalled all GPUs (graphics processing units) available for sale globally across Amazon Web Services, Microsoft Azure, and the Google Cloud Platform.
DES's Expanding Universe
DES looks to the Open Science Pool to provide a portion of its computing power. Over the last year, DES has gained roughly 4.58 million hours (522 years) of computing from the OSG.
Astronomy archives are creating new science every day
The TRAPPIST-1 system contains a total of seven planets, all around the size of Earth. Three of them — TRAPPIST-1e, f, and g — dwell in their star’s so-called habitable zone. The habitable zone, or Goldilocks zone, is a band around every star where astronomers have calculated that temperatures are just right, not too hot and not too cold, for liquid water to pool on the surface of an Earth-like world.
Machine learning insights into molecular science using the Open Science Pool
Computation has extended what researchers can investigate in chemistry, biology, and materials science. Studies of complex systems like proteins or nanocomposites face common challenges and can draw on similar machine learning techniques.
VERITAS and OSG explore extreme window into the universe
Understanding the universe has always fascinated mankind. The VERITAS Cherenkov telescope array unravels its secrets by detecting very-high-energy gamma rays from astrophysical sources.
For neuroscientist Chris Cox, the OSG helps process mountains of data
Whether exploring how the brain is fooled by fake news or explaining the decline of knowledge in dementia, cognitive neuroscientists like Chris Cox are relying more on high-throughput computing resources like the Open Science Pool to understand how the brain makes sense of information.
Free Supercomputing for Research - Scott Cole introduces you to OSG
Scott Cole, a neuroscience PhD student at the University of California San Diego, wrote an article, which appeared in Python Weekly, detailing how to get up and running on the Open Science Pool.
OSG helps LIGO scientists confirm Einstein's unproven theory
LIGO’s detectors search for gravitational waves from deep space. With two detectors, researchers can use differences in the wave’s arrival times to constrain the source location in the sky. LIGO’s first data run of its advanced gravitational wave detectors began in September 2015 and ran through January 12, 2016. The first gravitational waves were detected on September 14, 2015 by both detectors.
The OSG Consortium is pleased to see OSG user spotlights reposted on the websites of other organizations. Reposted spotlights should credit the OSG website and the original authors and link to the original posting. Any alterations to the text or images for reposting should be agreed upon with the OSG Communications team. Please email [email protected].