OSG | A national, distributed computing partnership for data-intensive research

80,000 jobs, 40 billion base pairs, and 20 bats, all in 4 weeks

An evolutionary biologist at the American Museum of Natural History (AMNH) used HTC services provided by the OSG to unlock a genomic basis for convergent evolution in bats.


The OSG Consortium

The OSG is a consortium of research collaborations, campuses, national laboratories, and software providers dedicated to the advancement of all open science via the practice of distributed High Throughput Computing (dHTC), and to the advancement of its state of the art. Established in 2005, the OSG operates a fabric of dHTC services for the national science and engineering community.

We are here to help with your NSF (CC*) Proposal!

Campuses with awards from the NSF Campus Cyberinfrastructure (CC*) Program play an important role in the OSG Consortium. The current chair-elect of the OSG Council is Eric Sedore of Syracuse University. To date, 19 CC* campuses contribute to the more than 5 million core hours delivered weekly by the Open Science Pool (OSPool).

As part of the PATh project we have accumulated significant experience working with campuses that apply to the CC* Program, providing letters of support and consulting for:

  • CC*-required resource sharing via the OSPool for the Campus Compute category
  • Gathering science drivers and planning local computing services

Please contact us at [email protected] with any questions you may have about OSG and the CC* Program!

What We Do

The OSG facilitates access to distributed high throughput computing for research in the US. The resources accessible through the OSG are contributed by the community, organized by the OSG, and governed by the OSG Consortium. In the last 12 months, we have provided more than 1.2 billion CPU hours to researchers across a wide variety of projects.

Submit Locally, Run Globally

Researchers can run jobs on OSG from their home institution or through OSG's centrally operated submission service, OSG Connect (available for US-based research and scholarship).
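OSG's dHTC services are built on HTCondor, so job submission from a local access point or OSG Connect typically starts from a submit description file like the sketch below (the executable name, filenames, and resource requests here are hypothetical placeholders, not a prescribed configuration):

```
# my_job.sub -- a minimal, hypothetical HTCondor submit description
executable = my_analysis.sh
arguments  = $(Process)

# Per-job stdout/stderr plus a shared event log
output = job.$(Process).out
error  = job.$(Process).err
log    = job.log

# Tell HTCondor what each job needs so it can be matched to a slot
request_cpus   = 1
request_memory = 1GB
request_disk   = 1GB

# Queue ten independent jobs, numbered 0 through 9
queue 10
```

Running `condor_submit my_job.sub` queues the ten jobs; HTCondor then matches each one to available resources, which on the OSPool may be at any participating site.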

Sharing Is Key

Sharing is a core principle of the OSG. Over 100 million CPU hours delivered on the OSG in the past year were opportunistic: the underlying resources would have sat powered on but idle were it not for the OSG. Sharing allows individual researchers to access larger computing resources and large organizations to keep their utilization high.

Resource Providers

The OSG consists of computing and storage elements at over 100 individual sites spanning the United States. These sites, primarily at universities and national labs, range in size from a few hundred to tens of thousands of CPU cores.

The OSG Software Stack

The OSG provides an integrated software stack to enable high throughput computing; visit our technical documentation website for more information.

Coordinating CI Services

NSF’s Blueprint for National Cyberinfrastructure Coordination Services lays out the need for coordination services to bring together the distributed elements of a national CI ecosystem. It highlights OSG as providing distributed high throughput computing services to the U.S. research community.

Find Us!

Are you a resource provider wanting to join our collaboration? Contact us: [email protected].

Are you a user wanting more computing resources? Check with your 'local' computing providers, or consider using OSG Connect (available to US-based academic, government, and non-profit research projects).

For any other inquiries, reach us at: [email protected].

To see the breadth of the OSG impact, explore our accounting portal.


The activities of the OSG Consortium are supported by multiple projects and in-kind contributions from members. Significant funding is provided through:


The Partnership to Advance Throughput Computing (PATh) is an NSF-funded (#2030508) project to address the needs of the rapidly growing community embracing Distributed High Throughput Computing (dHTC) technologies and services to advance their research.


The Institute for Research and Innovation in Software for High Energy Physics (IRIS-HEP) is an NSF-funded (#1836650) software institute established to meet the software and computing challenges of the HL-LHC, through R&D on software for acquiring, managing, processing, and analyzing HL-LHC data.