
Collaborations Between Two National Laboratories and the OSG Consortium Propel Nuclear and High-Energy Physics Forward


Seeking to unlock the secrets of the “glue” binding visible matter in the universe, the ePIC Collaboration stands at the forefront of innovation. Led by a collective of hundreds of scientists and engineers, the Electron-Proton/Ion Collider (ePIC) Collaboration was formed to design, build, and operate the first experiment at the Electron-Ion Collider (EIC).

The OSG Consortium

Established in 2005, the OSG Consortium operates a fabric of distributed High Throughput Computing (dHTC) services in support of the National Science & Engineering community. The research collaborations, campuses, national laboratories, and software providers that form the consortium are unified in their commitment to advance open science via these services.

We are here to help with your CC* Proposal (NSF 24-530)!

Campuses with awards from the NSF Campus Cyberinfrastructure (CC*) Program play an important role in supporting Open Science. To date, 37 CC* campuses contribute to the processing and storage capacity of the Open Science Pool (OSPool), which handles more than 3 million jobs every week.

Email Us How We Can Help

Open Science Pool

Any researcher performing Open Science in the US can become an OSPool user. The OSPool provides its users with fair-share access (no allocation needed!) to processing and storage capacity contributed by university campuses, government-supported supercomputing institutions, and research collaborations. Using state-of-the-art distributed computing technologies, the OSPool is designed to support High Throughput workloads that consist of large ensembles of independent computations.
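To make the "large ensembles of independent computations" model concrete: OSPool jobs are described with HTCondor submit files, where a single `queue N` statement fans one job description out into many independent jobs. The sketch below is illustrative only — the script name, input files, and resource requests are hypothetical placeholders, not a prescribed template.

```
# Hypothetical HTCondor submit file for an OSPool-style ensemble:
# one description, 100 independent jobs, each with its own input.
executable            = analyze.sh              # hypothetical user script
arguments             = input_$(Process).dat    # $(Process) = 0..99
transfer_input_files  = input_$(Process).dat

log                   = ensemble.log
output                = job_$(Process).out
error                 = job_$(Process).err

request_cpus          = 1
request_memory        = 2GB
request_disk          = 2GB

queue 100
```

Submitted with `condor_submit`, this creates 100 jobs that HTCondor schedules independently across whatever OSPool capacity is available — the essence of high-throughput computing.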

Open Science Data Federation (OSDF)

The Open Science Data Federation (OSDF) enables users and institutions to share data files and storage capacity, making them both accessible in dHTC environments such as the OSPool.

  • Provides campuses and researchers with the ability to manage their data files, input and output, in support of running their dHTC workloads.
  • Improves file access performance, resource consumption and reliability.
  • OSG-Operated Access Points provide researchers with a default of 500GB of storage space on the OSDF.
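As a sketch of how a job might use the OSDF in practice, HTCondor submit files on OSG-Operated Access Points can reference OSDF-hosted files directly with `osdf://` URLs, letting the federation handle caching and delivery. The path below is purely illustrative.

```
# Hypothetical snippet: staging a shared input file through the OSDF
# in an HTCondor submit file (the /ospool path shown is illustrative).
transfer_input_files = osdf:///ospool/ap20/data/alice/reference.tar.gz
```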

Throughput Computing 2025

You are invited to Throughput Computing Week 2025 (HTC25) in beautiful Madison, Wisconsin, the week of June 2-6, 2025. Registration will open in January 2025.

In July 2024, CHTC and the OSG Consortium hosted their second annual Throughput Computing Week in Madison, Wisconsin, joined in person and remotely by 388 participants. 2024 themes included dedicated sessions on campus cyberinfrastructure, talks on AI and machine learning enabled by high throughput computing, and tutorials and presentations on the new Pelican Platform project.

Read about some of the highlights from Throughput Computing Week 2024

View HTC24 Contributions, Recordings and Slides

If you are a campus, researcher, scientific collaboration, or government representative interested in throughput computing, please consider joining the annual Throughput Computing Week.

What We Do


The OSG facilitates access to distributed high throughput computing for research in the US. The resources accessible through the OSG are contributed by the community, organized by the OSG, and governed by the OSG consortium. In the last 12 months, we have provided more than 1.2 billion CPU hours to researchers across a wide variety of projects.

Submit Locally, Run Globally


Researchers can run jobs on OSG from their home institution or an OSG-Operated Access Point (available for US-based research and scholarship).

Sharing Is Key


Sharing is a core principle of the OSG. Over 100 million CPU hours delivered on the OSG in the past year were opportunistic, contributed by university campuses, government-supported supercomputing facilities and research collaborations. Sharing allows individual researchers to access larger computing resources and large organizations to keep their utilization high.

Resource Providers


The OSG consists of computing and storage elements at over 100 individual sites spanning the United States. These sites, primarily at universities and national labs, range in size from a few hundred to tens of thousands of CPU cores.

The OSG Software Stack


The OSG provides an integrated software stack to enable high throughput computing; visit our technical documents website for information.

Coordinating CI Services


NSF’s Blueprint for National Cyberinfrastructure Coordination Services lays out the need for coordination services to bring together the distributed elements of a national CI ecosystem. It highlights OSG as providing distributed high throughput computing services to the U.S. research community.

Find Us!

Are you a resource provider wanting to join our collaboration? Contact us: [email protected].

Are you a user wanting more computing resources? Check with your 'local' computing providers, or consider using an OSG-Operated Access Point which has access to the OSPool (available to US-based academic/govt/non-profit research projects).

For any other inquiries, reach us at: [email protected].

To see the breadth of the OSG impact, explore our accounting portal.

Support

The activities of the OSG Consortium are supported by multiple projects and in-kind contributions from members. Significant funding is provided through:


PATh

The Partnership to Advance Throughput Computing (PATh) is an NSF-funded (#2030508) project to address the needs of the rapidly growing community embracing Distributed High Throughput Computing (dHTC) technologies and services to advance their research.

IRIS-HEP

The Institute for Research and Innovation in Software for High Energy Physics (IRIS-HEP) is an NSF-funded (#1836650) software institute established to meet the software and computing challenges of the HL-LHC, through R&D for the software for acquiring, managing, processing and analyzing HL-LHC data.