Fast Downloading of Grib, Part 2


Jan 2, 2019: The data servers are changing their URLs from http:// to https://. A version of the script with the new URLs was released 12/31/2018. You may need to get a newer version of cURL if you have problems.

Wrappers @ NCDC

While the procedure detailed in Part 1 is straightforward, it could be easier. I don't like looking for and typing out URLs, and writing loops takes time too. Less experienced people like it even less. Dan Swank wrote a nice wrapper to download data from the North American Regional Reanalysis (NARR). It worked so well that during May 2006, 95% of the NCDC-NOMADS downloads were done using cURL.

Wrappers @ NCEP (NOMADS):

At NCEP, we wanted people to (1) get forecasts using partial-http transfers rather than ftp2u, and (2) move off the nomad servers to the more reliable NCO servers. So this script was born. We wanted the script to be easy to use, easy to reconfigure, easy to install, and able to work with Windows.
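The "partial-http transfer" mentioned above is just an HTTP byte-range request: the client asks the server for only the bytes of the GRIB file that hold the wanted record. A minimal Python sketch of building such a request follows; the URL is a made-up placeholder and this fragment is illustrative, not part of the perl script.

```python
import urllib.request

def byte_range_request(url, start, end):
    """Build an HTTP request asking the server for only bytes start..end
    (inclusive) of the file -- the partial-http transfer used to pull a
    single GRIB record out of a large file."""
    req = urllib.request.Request(url)
    req.add_header("Range", "bytes=%d-%d" % (start, end))
    return req

# Placeholder URL: substitute a real GRIB file on an HTTP(S) server.
req = byte_range_request("https://example.com/gfs.t00z.pgrbf00", 0, 1023)
print(req.get_header("Range"))   # bytes=0-1023
```

Opening this request with `urllib.request.urlopen` would return only that byte range (HTTP 206), which is exactly what cURL's `-r`/`--range` option does.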

Requirements:
  1. perl
  2. cURL

Installation:
  1. The cURL executable needs to be downloaded and put in a directory on your $PATH.
  2. The first line of the script should point to the location of the local perl interpreter.
  3. Non-Windows users can set the $windows flag to "thankfully no" in the script for more efficiency.

Note: some Windows setups will need to type: 

      perl data DATE HR0 HR1 DHR VARS LEVS DIRECTORY

DATE = start time of the forecast YYYYMMDDHH
       note: HH should be 00 06 12 or 18

HR0 = first forecast hour wanted
HR1 = last forecast hour wanted
DHR = forecast hour increment (forecast every 3, 6, 12, or 24 hours)
VARS = list of variables or "all"
        ex. HGT:TMP:OZONE
        ex. all
LEVS = list of levels, blanks replaced by an underscore, or "all"
        ex. 500_mb:200_mb:surface
        ex. all
DIRECTORY = directory in which to put the output

example:  perl data 2006101800 0 12 6 UGRD:VGRD 200_mb .
example:  perl data 2006101800 0 12 6 UGRD:VGRD 200_mb:500_mb:1000_mb .
example:  perl data 2006101800 0 12 12 all surface .
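The DATE/HR0/HR1/DHR arguments simply define a loop over forecast hours, one file per hour. That expansion can be sketched in a few lines of Python; the file-name pattern below is a made-up illustration, not the server's actual layout.

```python
def forecast_hours(hr0, hr1, dhr):
    """Expand HR0, HR1, DHR into the list of forecast hours to fetch."""
    return list(range(hr0, hr1 + 1, dhr))

def file_names(date, hr0, hr1, dhr):
    """One file name per forecast hour.  The pattern is only a
    placeholder for whatever the real server uses."""
    return ["gfs.%s.f%03d" % (date, fh) for fh in forecast_hours(hr0, hr1, dhr)]

print(forecast_hours(0, 12, 6))               # [0, 6, 12]
print(file_names("2006101800", 0, 12, 6))
```

So the first example above (0 12 6) touches the 0-, 6-, and 12-hour forecasts, and the script handles this loop for you.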

regex metacharacters: ( ) . ^ * [ ] $ +

The script uses perl regular expressions (regex) for string
matching.  Consequently, regex metacharacters must be escaped when
they are part of the search string.  For example, trying to find
the following layer

      "entire atmosphere (considered as a single_layer)"

will not work because the parentheses are metacharacters.  The following
techniques will work.

   Quoting the ( and ) characters:

       data 2012053000 0 6 3 TCDC "entire atmosphere \(considered as a single layer\)" .
       data 2012053000 0 6 3 TCDC entire_atmosphere_\\\(considered_as_a_single_layer\\\) .

   Using a period (which matches any character) to match the ( and ) characters:

       data 2012053000 0 6 3 TCDC "entire atmosphere .considered as a single layer." .
       data 2012053000 0 6 3 TCDC entire_atmosphere_.considered_as_a_single_layer. .
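The effect of the metacharacters can be seen directly in Python's regex engine, which behaves the same as perl's for these patterns; the inventory line below is a shortened, made-up example.

```python
import re

line = ":TCDC:entire atmosphere (considered as a single layer):"

# Unescaped parentheses are grouping metacharacters, so the literal
# "(considered ... layer)" text is never matched:
assert re.search("entire atmosphere (considered as a single layer)", line) is None

# Escaping the parentheses makes them literal, so the match succeeds:
assert re.search(r"entire atmosphere \(considered as a single layer\)", line)

# A period matches any single character, including "(" and ")":
assert re.search("entire atmosphere .considered as a single layer.", line)

print("all matches behave as described")
```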
The script is based on the download scripts from Part 1. The advantage here is that the URL is built in, as is the looping over the forecast hours.

Metalanguage for data DATE HR0 HR1 DHR VARS LEVS DIRECTORY

# convert VARS and LEVS into regular expressions
  if (VARS == "all") {
    VARS = ""
  }
  else {
    VARS = substitute(VARS,':','|')
    VARS = substitute(VARS,'_',' ')
    VARS = ":(VARS):"
  }

  if (LEVS == "all") {
    LEVS = ""
  }
  else {
    LEVS = substitute(LEVS,':','|')
    LEVS = substitute(LEVS,'_',' ')
    LEVS = ":(LEVS)"
  }

# loop over all forecast hours

  for fhour = HR0, HR1, DHR
     URL = URL_name(DATE, fhour)
     URLinv = URL_name(DATE, fhour) + ".idx"

     inventory_array[] = get_inv(URLinv)
     for i = inventory_array[0] .. inventory_array[last]
        if (regex_match(LEVS, inventory_array[i]) and regex_match(VARS, inventory_array[i])) {
           add inventory_array[i] to the list of fields to download
        }
     download the selected fields from URL
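The conversion and filtering steps of the metalanguage can be rendered in a few lines of Python. This is an illustrative sketch, not the script's actual code, and the inventory lines are abbreviated, made-up samples of the ":VAR:level:" fields.

```python
import re

def to_regex(arg, trailing_colon):
    """Mimic the VARS/LEVS conversion: "all" matches every line; otherwise
    ':' becomes the regex alternation '|' and '_' becomes a space.
    VARS patterns end with ':' (":(VARS):"); LEVS patterns do not."""
    if arg == "all":
        return ""                       # empty pattern matches everything
    body = arg.replace(":", "|").replace("_", " ")
    return ":(%s)%s" % (body, ":" if trailing_colon else "")

def select(inventory, vars_arg, levs_arg):
    """Keep the inventory lines matching both the VARS and LEVS patterns."""
    vre = re.compile(to_regex(vars_arg, True))
    lre = re.compile(to_regex(levs_arg, False))
    return [line for line in inventory if vre.search(line) and lre.search(line)]

inventory = [
    "1:0:d=2006101800:HGT:500 mb:anl",
    "2:4211:d=2006101800:TMP:500 mb:anl",
    "3:8422:d=2006101800:TMP:surface:anl",
]
print(select(inventory, "HGT:TMP", "500_mb"))   # first two lines only
```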

Advanced Users

A user asked whether it is possible to mix the variables and levels. For example, TMP @ 500 mb, plus HGT @ (200 and 700 mb). Of course you could run the script twice, but that wouldn't be efficient.

It is possible because the script uses regular expressions, and regular expressions are very powerful. All you need to remember is that the script converts the colon and underscore to a vertical bar and a space, respectively, in the VARS/LEVS arguments.

Unix/Linux: data 2006111500 0 12 12 all 'TMP.500 mb|HGT.(200 mb|700 mb)'  data_dir

Windows: data 2006111500 0 12 12 all "TMP.500 mb|HGT.(200 mb|700 mb)"  C:\unix\
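Checking the mixed pattern against a few made-up inventory fields shows which records it selects (Python's regex engine behaves like perl's here):

```python
import re

pattern = re.compile("TMP.500 mb|HGT.(200 mb|700 mb)")

assert pattern.search(":TMP:500 mb:")        # selected
assert pattern.search(":HGT:200 mb:")        # selected
assert pattern.search(":HGT:700 mb:")        # selected
assert not pattern.search(":TMP:700 mb:")    # not requested
assert not pattern.search(":HGT:500 mb:")    # not requested

print("pattern selects exactly the requested var/level pairs")
```

The period stands in for the colon separator, as in the metacharacter examples above.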

Other GRIB Data sets

One purpose of the script is to provide a simple way of downloading grib data using the partial-http downloading protocol. The code was written so that it should be easily adapted to other grib+inv datasets.

Wrappers @ NCEP (NCO):
NCO (NCEP Central Operations) also has a wrapper.

Created: 10/2006, Updated: 5/2012

NOAA/ National Weather Service
National Centers for Environmental Prediction
Climate Prediction Center
5830 University Research Court
College Park, Maryland 20740
Climate Prediction Center Web Team
Page last modified: December 21, 2006