Thursday Exercise 3.1: Using a Web Proxy for Large Shared Input
Continuing the series of exercises blasting mouse genetic sequences, the objective of this exercise is to use a web proxy to stage the large database, which will be downloaded by each of the many jobs that use the split input files from the last exercise (Exercise 2.3).
Setup
- Make sure you are logged into training.osgconnect.net.
- Make sure you are in the directory from the previous exercise (Exercise 2.3), named thur-blast-data (see the command below if you need to change into it).
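If you are not already there, you can change into the directory; the path below assumes it sits in your home directory:
user@training $ cd ~/thur-blast-data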
Place the Large File on the Proxy
First, you'll need to put the pdbaa_files.tar.gz file into the Stash web directory. Use the following command:
user@training $ cp pdbaa_files.tar.gz ~/stash/public/
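To confirm that the copy worked, you can list the file in the public Stash directory:
user@training $ ls -lh ~/stash/public/pdbaa_files.tar.gz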
Test a download of the file
Once the file is placed in the ~/stash/public directory, it can be downloaded from a corresponding URL such as http://stash.osgconnect.net/~username/pdbaa_files.tar.gz, where username is your username on training.osgconnect.net.
Using the above convention, and from any other directory on training.osgconnect.net, you can test the download of your pdbaa_files.tar.gz file with a command like the following:
user@training $ wget http://stash.osgconnect.net/~username/pdbaa_files.tar.gz
You may realize that you've been using wget to download files from a web proxy for many of the previous exercises at the school!
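As an optional sanity check, you can compare the size of the downloaded copy against the original in your Stash directory; the two sizes should match:
user@training $ ls -l pdbaa_files.tar.gz ~/stash/public/pdbaa_files.tar.gz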
Run a New Test Job
Now, you'll repeat the last exercise (with a single input query file), but have the job download the pdbaa_files.tar.gz file from the web proxy instead of having HTCondor transfer the file from the submit server.
Modify the submit file and wrapper script
In the wrapper script, we have to add some special lines so that we can pull from Stash using the HTTP proxy. In blast_wrapper.sh, we will have to add commands to pull the data file:
#!/bin/bash
# Copy the pdbaa_files.tar.gz to the worker node
# Add the -S argument, so we can see if it was a cache HIT or MISS
wget -S http://stash.osgconnect.net/~username/pdbaa_files.tar.gz
tar xvzf pdbaa_files.tar.gz
./blastx -db pdbaa -query mouse.fa -out mouse.fa.result
rm pdbaa.*
The new wget line will download pdbaa_files.tar.gz from Stash, using the closest cache (wget looks at the environment variable http_proxy to find the nearest cache).
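If you are curious which proxy a particular job used, one optional addition to the wrapper (not required for this exercise) is a line that prints the proxy setting on the worker node:
# optional: show which cache wget will be directed to
echo "http_proxy is set to: $http_proxy"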
In your submit file, you will need to remove pdbaa_files.tar.gz from transfer_input_files, because we are using HTTP proxies!
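For reference, the transfer_input_files line might now look something like the sketch below; the remaining files are assumptions based on what the wrapper uses, so match them to your own submit file from the previous exercise:
# pdbaa_files.tar.gz is no longer listed; the wrapper downloads it through the proxy
transfer_input_files = blastx, mouse.fa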
Submit the test job
You may wish to first remove the log, result, output, and error files from the previous tests, which will be overwritten when the new test job completes.
user@training $ rm *.error *.out *.result *.log
Submit the test job!
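Submission works the same as in the previous exercises; the submit file name below is just a placeholder for whatever you named yours:
user@training $ condor_submit blast.sub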
When the job starts, the wrapper will download the pdbaa_files.tar.gz file from the web proxy. If the job runs for longer than two minutes, you can assume that it will complete successfully, and then continue with the rest of the exercise.
After the job completes, examine the *.error file generated by the submission. At the top of the file, you will find something like:
--2018-07-09 17:00:34--  http://stash.osgconnect.net/~dweitzel/pdbaa_files.tar.gz
Resolving cmsprxy01.colorado.edu (cmsprxy01.colorado.edu)... 192.12.238.152
Connecting to cmsprxy01.colorado.edu (cmsprxy01.colorado.edu)|192.12.238.152|:3128... connected.
Proxy request sent, awaiting response...
  HTTP/1.0 200 OK
  Server: nginx/1.10.2
  Date: Mon, 09 Jul 2018 16:58:50 GMT
  Content-Type: application/octet-stream
  Content-Length: 22105124
  Last-Modified: Mon, 09 Jul 2018 15:56:19 GMT
  ETag: "5b4385a3-1514c24"
  Accept-Ranges: bytes
  Age: 104
  X-Cache: HIT from cmsprxy01.colorado.edu
  Via: 1.1 cmsprxy01.colorado.edu:3128 (squid/frontier-squid-2.7.STABLE9-27.1.osg33.el7)
  Connection: keep-alive
  Proxy-Connection: keep-alive
Length: 22105124 (21M) [application/octet-stream]
...
Notice the X-Cache line. It says it was a cache HIT from the proxy cmsprxy01.colorado.edu. Yay! You successfully used a proxy to cache data near your worker node! Note that the name of the cache server in your output may be different.
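If you would rather not open the error file by hand, a quick grep pulls out just the cache status line (assuming the *.error naming used above):
user@training $ grep X-Cache *.error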
Run all 100 Jobs!
If all of the previous tests have gone okay, you can prepare to run all 100 jobs that will use the split input files. To make sure you're not going to generate too much data, use the size of the files from the previous test to calculate how much total data you're going to add to the thur-blast-data directory for 100 jobs.
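For example, you can list the files produced by the single test job and multiply their combined size by roughly 100 (the exact file names depend on your submit file):
user@training $ ls -lh mouse.fa.result *.out *.error *.log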
Make sure you remove pdbaa_files.tar.gz from the transfer_input_files in the split submit file.
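One quick way to double-check is to grep the split submit file for the tarball name; the command should print nothing (the submit file name below is a placeholder for yours):
user@training $ grep pdbaa_files.tar.gz blast_split.sub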
Submit all 100 jobs! They may take a while to all complete, but it will still be faster than the many hours it would have taken to blast the single, large mouse_rna.fa file without splitting it up. In the meantime, as long as the first several jobs are running for longer than two minutes, you can move on to the next exercise.
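While you wait, you can watch the queue drain as jobs finish:
user@training $ condor_q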