High-throughput computing as an enabler of black hole science
The stunning new image of a supermassive black hole at the center of the Milky Way was created by eight telescopes, 300 international astronomers, and more than 5 million computational tasks. This Morgridge Institute article describes how the Wisconsin-based Open Science Pool helped make sense of it all.
The OSG Consortium
Established in 2005, the OSG Consortium operates a fabric of distributed High Throughput Computing (dHTC) services in support of the National Science & Engineering community. The research collaborations, campuses, national laboratories, and software providers that form the consortium are unified in their commitment to advance open science via these services.
NSF (CC*) Solicitation (NSF 23-526)
Campuses with awards from the NSF Campus Cyberinfrastructure (CC*) Program play an important role in supporting Open Science. To date, 25 CC* campuses contribute to the processing and storage capacity of the Open Science Pool (OSPool), which is harnessed by more than 2 million jobs each week.
Open Science Pool
Any researcher performing Open Science in the US can become an OSPool user. The OSPool provides its users with fair-share access (no allocation needed!) to processing and storage capacity contributed by university campuses, government-supported supercomputing institutions, and research collaborations. Using state-of-the-art distributed computing technologies, the OSPool is designed to support high-throughput workloads consisting of large ensembles of independent computations.
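As an illustration of such an ensemble, an OSPool workload is typically described in an HTCondor submit file that queues many independent instances of the same computation. The sketch below is hypothetical (the script name, input files, and resource requests are illustrative, not prescribed by the OSPool):

```
# Hypothetical HTCondor submit file: 100 independent instances of an
# analysis script, one per input file. $(Process) runs from 0 to 99.
executable     = analyze.sh
arguments      = input_$(Process).dat

transfer_input_files = input_$(Process).dat

request_cpus   = 1
request_memory = 2GB
request_disk   = 4GB

log    = analyze.log
output = analyze_$(Process).out
error  = analyze_$(Process).err

queue 100
```

Submitting this file with `condor_submit` from an Access Point places all 100 jobs in the queue at once; the pool then matches each job to contributed capacity as it becomes available.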
Open Science Data Federation (OSDF)
The Open Science Data Federation (OSDF) enables users and institutions to share data files and storage capacity, making them both accessible in dHTC environments such as the OSPool.
- Provides campuses and researchers with the ability to manage their data files, input and output, in support of running their dHTC workloads.
- Improves file access performance, resource consumption and reliability.
- OSG-Operated Access Points provide researchers with a default of 500 GB of storage space on the OSDF.
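In practice, files published to the OSDF can be referenced directly from a job's submit file. The sketch below is illustrative only: the `osdf:///` namespace paths shown are hypothetical, and the actual paths depend on your Access Point and storage allocation:

```
# Hypothetical sketch: stage a shared input from the OSDF and write
# a result back. The osdf:/// paths are placeholders, not real namespaces.
executable           = analyze.sh
transfer_input_files = osdf:///ospool/ap40/data/alice/reference.db

# Map a local output file to an OSDF destination after the job completes.
transfer_output_remaps = "results.tar.gz = osdf:///ospool/ap40/data/alice/results.tar.gz"

queue
```

Referencing data this way lets HTCondor fetch inputs from the nearest OSDF cache rather than from the Access Point itself, which is what improves transfer performance and reliability at scale.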
OSG All-Hands (AH) Meetings
You are invited to Throughput Computing 2023 — an event that combines HTCondor Week 2023 and the OSG All-Hands Meeting 2023 — in beautiful Madison, Wisconsin, the week of July 10–14. OSG Consortium staff, campuses, and researchers will discuss topics including the state of the OSG, OSG services, science enabled by the OSPool, and campus services and perspectives.
The annual All-Hands Meeting works to foster relationships between the researchers and campuses using OSG services and service providers in the OSG Consortium. The more than 250 attendees of the 2022 meeting were offered 38 presentations from researchers who reviewed their use of OSG services, executives who discussed the current state and future plans for the OSG services, and administrators who reported on the adoption of HTC by faculty and students on their campuses.
OSPool as a Tool for Advancing Research in Computational Chemistry
Assistant Professor Eric Jonas uses OSG resources to understand the structure of molecules based on their measurements and derived properties.
Distributed Computing at the African School of Physics 2022 Workshop
Over 50 students chose to participate in a distributed computing workshop at the 7th biennial African School of Physics (ASP) 2022 at Nelson Mandela University in Gqeberha, South Africa.
What We Do
The OSG facilitates access to distributed high throughput computing for research in the US. The resources accessible through the OSG are contributed by the community, organized by the OSG, and governed by the OSG consortium. In the last 12 months, we have provided more than 1.2 billion CPU hours to researchers across a wide variety of projects.
Submit Locally, Run Globally
Researchers can run jobs on OSG from their home institution or an OSG-Operated Access Point (available for US-based research and scholarship).
Sharing Is Key
Sharing is a core principle of the OSG. Over 100 million CPU hours delivered on the OSG in the past year were opportunistic, contributed by university campuses, government-supported supercomputing facilities and research collaborations. Sharing allows individual researchers to access larger computing resources and large organizations to keep their utilization high.
The OSG consists of computing and storage elements at over 100 individual sites spanning the United States. These sites, primarily at universities and national labs, range in size from a few hundred to tens of thousands of CPU cores.
The OSG Software Stack
The OSG provides an integrated software stack to enable high throughput computing; visit our technical documents website for information.
Coordinating CI Services
NSF’s Blueprint for National Cyberinfrastructure Coordination Services lays out the need for coordination services to bring together the distributed elements of a national CI ecosystem. It highlights OSG as providing distributed high throughput computing services to the U.S. research community.
Are you a resource provider wanting to join our collaboration? Contact us: [email protected].
Are you a user wanting more computing resources? Check with your local computing providers, or consider using an OSG-Operated Access Point, which has access to the OSPool (available to US-based academic, government, and non-profit research projects).
For any other inquiries, reach us at: [email protected].
To see the breadth of the OSG impact, explore our accounting portal.
The activities of the OSG Consortium are supported by multiple projects and in-kind contributions from members. Significant funding is provided through:
The Partnership to Advance Throughput Computing (PATh) is an NSF-funded (#2030508) project to address the needs of the rapidly growing community embracing Distributed High Throughput Computing (dHTC) technologies and services to advance their research.
The Institute for Research and Innovation in Software for High Energy Physics (IRIS-HEP) is an NSF-funded (#1836650) software institute established to meet the software and computing challenges of the HL-LHC, through R&D for the software for acquiring, managing, processing and analyzing HL-LHC data.