High-throughput computing as an enabler of black hole science
The stunning new image of a supermassive black hole at the center of the Milky Way was created by eight telescopes, 300 international astronomers, and more than 5 million computational tasks. This Morgridge Institute article describes how the Wisconsin-based Open Science Pool helped make sense of it all.
The OSG Consortium
Established in 2005, the OSG Consortium operates a fabric of distributed High Throughput Computing (dHTC) services in support of the National Science & Engineering community. The research collaborations, campuses, national laboratories, and software providers that form the consortium are unified in their commitment to advance open science via these services.
Proposal (NSF 23-526)
Campuses with awards from the NSF Campus Cyberinfrastructure (CC*) Program play an important role in supporting Open Science. To date, 25 CC* campuses contribute to the processing and storage capacity of the Open Science Pool (OSPool) that is harnessed weekly by more than 2M jobs.
NSF GEO OSE Proposal (NSF 23-534)
The PATh Facility provides CPU and GPU high throughput computing capacity for the NSF community. Proposals to the NSF GEO Open Science Ecosystem solicitation can request credits on the Facility. Please reach out to [email protected] for any questions or consultations.
Open Science Pool
Any researcher performing Open Science in the US can become an OSPool user. The OSPool provides its users with fair-share access (no allocation needed!) to processing and storage capacity contributed by university campuses, government-supported supercomputing institutions, and research collaborations. Using state-of-the-art distributed computing technologies, the OSPool is designed to support High Throughput workloads that consist of large ensembles of independent computations.
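As an illustration of this ensemble-style workload, a minimal HTCondor submit file might look like the sketch below; the script name and resource requests are hypothetical, not a prescribed OSPool configuration.

```
# run_sim.sub -- hypothetical HTCondor submit file for an ensemble of
# independent simulation tasks
executable     = run_sim.sh          # user-provided script (assumed)
arguments      = $(Process)          # each job receives a unique task index
log            = sim.log
output         = sim_$(Process).out
error          = sim_$(Process).err
request_cpus   = 1
request_memory = 2GB
request_disk   = 4GB
queue 100                            # submit 100 independent jobs
```

Submitting this with `condor_submit run_sim.sub` would queue 100 independent jobs that HTCondor matches to contributed OSPool capacity as it becomes available.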
Open Science Data Federation (OSDF)
The Open Science Data Federation (OSDF) enables users and institutions to share data files and storage capacity, making them both accessible in dHTC environments such as the OSPool.
- Provides campuses and researchers with the ability to manage their data files, input and output, in support of running their dHTC workloads.
- Improves file access performance, resource consumption and reliability.
- The OSG Connect service provides researchers with a default of 500GB of storage space on the OSDF.
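For instance, input data staged on the OSDF can be referenced directly from an HTCondor submit file via an `osdf://` URL, so files are fetched through the federation's caches rather than from the submit node; the path below is illustrative, not a real dataset.

```
# Fetch a job's input from the OSDF at job start (hypothetical path)
transfer_input_files = osdf:///ospool/PROTECTED/username/reference_data.tar.gz
```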
OSG All-Hands (AH) Meetings
If you are a current or potential consumer, provider, or executive of dHTC services, you are welcome to attend the annual OSG All-Hands (AH) Meetings. The more than 250 attendees of the 2022 meeting heard 38 presentations: researchers reviewed their use of OSG services, executives discussed the current state and future plans of those services, and administrators reported on the adoption of HTC by faculty and students on their campuses.
OSG By Maps
LIGO's Search for Gravitational Wave Signals Using HTCondor
Cody Messick, a Postdoc at the Massachusetts Institute of Technology (MIT) working for the LIGO Lab, describes LIGO's use of HTCondor to search for new gravitational wave sources.
The role of HTC in advancing population genetics research
Postdoctoral researcher Parul Johri uses OSG services, the HTCondor Software Suite, and the population genetics simulation program SLiM to investigate historical patterns of genetic variation.
What We Do
The OSG facilitates access to distributed high throughput computing for research in the US. The resources accessible through the OSG are contributed by the community, organized by the OSG, and governed by the OSG consortium. In the last 12 months, we have provided more than 1.2 billion CPU hours to researchers across a wide variety of projects.
Submit Locally, Run Globally
Researchers can run jobs on the OSG from their home institution or via OSG's centrally operated submission service, OSG Connect (available for US-based research and scholarship).
Sharing Is Key
Sharing is a core principle of the OSG. Over 100 million CPU hours delivered on the OSG in the past year were opportunistic, contributed by university campuses, government-supported supercomputing facilities and research collaborations. Sharing allows individual researchers to access larger computing resources and large organizations to keep their utilization high.
The OSG consists of computing and storage elements at over 100 individual sites spanning the United States. These sites, primarily at universities and national labs, range in size from a few hundred to tens of thousands of CPU cores.
The OSG Software Stack
The OSG provides an integrated software stack to enable high throughput computing; visit our technical documents website for information.
Coordinating CI Services
NSF’s Blueprint for National Cyberinfrastructure Coordination Services lays out the need for coordination services to bring together the distributed elements of a national CI ecosystem. It highlights OSG as providing distributed high throughput computing services to the U.S. research community.
Are you a resource provider wanting to join our collaboration? Contact us: [email protected].
Are you a user wanting more computing resources? Check with your local computing providers, or consider using OSG Connect (available to US-based academic/government/non-profit research projects).
For any other inquiries, reach us at: [email protected].
To see the breadth of the OSG impact, explore our accounting portal.
The activities of the OSG Consortium are supported by multiple projects and in-kind contributions from members. Significant funding is provided through:
The Partnership to Advance Throughput Computing (PATh) is an NSF-funded (#2030508) project to address the needs of the rapidly growing community embracing Distributed High Throughput Computing (dHTC) technologies and services to advance their research.
The Institute for Research and Innovation in Software for High Energy Physics (IRIS-HEP) is an NSF-funded (#1836650) software institute established to meet the software and computing challenges of the HL-LHC, through R&D for the software for acquiring, managing, processing and analyzing HL-LHC data.