OSPool As a Tool for Advancing Research in Computational Chemistry
Assistant Professor Eric Jonas uses OSG resources to understand the structure of molecules based on their measurements and derived properties.
The OSG Consortium
Established in 2005, the OSG Consortium operates a fabric of distributed High Throughput Computing (dHTC) services in support of the National Science & Engineering community. The research collaborations, campuses, national laboratories, and software providers that form the consortium are unified in their commitment to advance open science via these services.
Any researcher performing Open Science in the US can become an OSPool user. The OSPool provides its users with fair-share access (no allocation needed!) to processing and storage capacity contributed by university campuses, government-supported supercomputing institutions, and research collaborations. Using state-of-the-art distributed computing technologies, the OSPool is designed to support high throughput workloads that consist of large ensembles of independent computations.
The Open Science Data Federation (OSDF) enables users and institutions to share data files and storage capacity, making them both accessible in dHTC environments such as the OSPool.
- Provides campuses and researchers with the ability to manage their data files, input and output, in support of running their dHTC workloads.
- Improves file access performance and reliability while reducing resource consumption.
- OSG-Operated Access Points provide researchers with a default of 500GB of storage space on the OSDF.
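As an illustration, files staged on the OSDF can be pulled into a job directly from an HTCondor submit file using an osdf:// URL. This is a sketch only; the namespace path below is a hypothetical placeholder, not a real allocation:

```
# Fetch an input file from the OSDF when the job starts
# (the /ospool/apXX/data/... path is a made-up example path).
transfer_input_files = osdf:///ospool/apXX/data/alice/reference_genome.fa
```

Referencing OSDF URLs this way lets the data federation handle caching and delivery, rather than shipping large inputs from the Access Point with every job.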
OSG All-Hands (AH) Meetings
You are invited to Throughput Computing 2024 — an event that combines the OSG All-Hands Meeting and HTCondor Week — in beautiful Madison, Wisconsin, the week of July 8-12. OSG Consortium staff, campuses, and researchers will discuss topics including the state of the OSG, OSG services, science enabled by the OSPool, and campus services and perspectives.
The annual All-Hands Meeting fosters relationships among the researchers and campuses that use OSG services and the service providers in the OSG Consortium. The more than 350 attendees of the 2023 meeting heard 30 presentations: researchers reviewed their use of OSG services, executives discussed the current state and future plans of those services, and administrators reported on the adoption of HTC by faculty and students on their campuses.
Using HTC expanded the scale of research using noninvasive measurements of tendons and ligaments. By combining this technique with the computing power of high throughput computing (HTC), researchers can run thousands of simulations to study the pathology of tendons and ligaments.
Training a dog and training a robot aren't so different. In the Hanna Lab, researchers use high throughput computing as a critical tool for training robots with reinforcement learning.
What We Do
The OSG facilitates access to distributed high throughput computing for research in the US. The resources accessible through the OSG are contributed by the community, organized by the OSG, and governed by the OSG consortium. In the last 12 months, we have provided more than 1.2 billion CPU hours to researchers across a wide variety of projects.
Submit Locally, Run Globally
Researchers can run jobs on OSG from their home institution or an OSG-Operated Access Point (available for US-based research and scholarship).
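As a rough sketch (not official OSG documentation), a high throughput workload on an Access Point is typically described by an HTCondor submit file that queues many independent jobs. The script and file names here are hypothetical placeholders:

```
# analyze.sh and the sample_*.dat inputs are hypothetical examples
executable    = analyze.sh
arguments     = sample_$(Process).dat

transfer_input_files = sample_$(Process).dat

# Per-job resource requests help the pool match jobs to contributed capacity
request_cpus   = 1
request_memory = 2GB
request_disk   = 2GB

output = logs/job_$(Process).out
error  = logs/job_$(Process).err
log    = logs/job.log

# Queue 100 independent jobs, one per input file
queue 100
```

Submitting this file with condor_submit places all 100 jobs in the local queue, and they run as shared capacity across the pool becomes available.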
Sharing Is Key
Sharing is a core principle of the OSG. Over 100 million CPU hours delivered on the OSG in the past year were opportunistic, contributed by university campuses, government-supported supercomputing facilities and research collaborations. Sharing allows individual researchers to access larger computing resources and large organizations to keep their utilization high.
The OSG consists of computing and storage elements at over 100 individual sites spanning the United States. These sites, primarily at universities and national labs, range in size from a few hundred to tens of thousands of CPU cores.
The OSG Software Stack
The OSG provides an integrated software stack to enable high throughput computing; visit our technical documents website for information.
Coordinating CI Services
NSF’s Blueprint for National Cyberinfrastructure Coordination Services lays out the need for coordination services to bring together the distributed elements of a national CI ecosystem. It highlights OSG as providing distributed high throughput computing services to the U.S. research community.
Are you a resource provider wanting to join our collaboration? Contact us: [email protected].
Are you a user wanting more computing resources? Check with your local computing providers, or consider using an OSG-Operated Access Point, which has access to the OSPool (available to US-based academic/govt/non-profit research projects).
For any other inquiries, reach us at: [email protected].
To see the breadth of the OSG impact, explore our accounting portal.
The activities of the OSG Consortium are supported by multiple projects and in-kind contributions from members. Significant funding is provided through:
The Partnership to Advance Throughput Computing (PATh) is an NSF-funded (#2030508) project to address the needs of the rapidly growing community embracing Distributed High Throughput Computing (dHTC) technologies and services to advance their research.
The Institute for Research and Innovation in Software for High Energy Physics (IRIS-HEP) is an NSF-funded (#1836650) software institute established to meet the software and computing challenges of the HL-LHC, through R&D for the software for acquiring, managing, processing and analyzing HL-LHC data.