DYCOMS-II Satellite: GOES-10 Channel 1 (Visible) 1KM Sectored Data in Hierarchical Data Format (HDF)

1.0 General Information

The GOES-10 Channel 1 (Visible) 1KM Sectored Data in Hierarchical Data Format (HDF) is one of several satellite data sets collected as part of the Dynamics and Chemistry of Marine Stratocumulus Phase II: Entrainment Studies (DYCOMS-II) project by the University Corporation for Atmospheric Research/Joint Office for Science Support (UCAR/JOSS). UCAR/JOSS used its SeaSpace system to retrieve data from the Geostationary Operational Environmental Satellite 10 (GOES-10) and developed sectors over the DYCOMS-II region. This data set consists of the sectored 1-km data from Channel 1 (visible; 0.65 um) of the GOES-10 satellite. The products cover the period from 1-30 July 2001 and span the DYCOMS-II sector (27-37N, 114-126W).

During the times the NSF/NCAR C-130 aircraft was in flight (see http://www.joss.ucar.edu/dycoms/catalog/missions/ for flight times), the data are typically available every 15 minutes, and rapid scan imagery is often available as well (most often during the afternoon and evening hours). For the times the NSF/NCAR C-130 was not in flight, data are available every three hours (00, 03, 06, ... UTC). All images are in HDF format. A brief description of HDF is given in Section 3.1.

2.0 Data Contact

Scot Loehrer (loehrer@ucar.edu)

3.0 Product Information

UCAR/JOSS used its SeaSpace system to retrieve data from GOES-10 and developed sectors over the DYCOMS-II region. This data set consists of the sectored data from Channel 1 (visible; 0.65 um) of the GOES-10 satellite, converted to HDF format. During DYCOMS-II field operations, GOES-10 imagery ingested by the SeaSpace system was archived in real time in Terascan Data Format (TDF) for later processing.
From the archived GOES-10 imagery (TDF), sectors covering the DYCOMS-II region were generated using the SeaSpace "gsubset" utility. These sectors were then converted to HDF format using the SeaSpace "tdftohdf" utility. Each HDF file contains data for every line and sample pixel in the original TDF file. To geo-reference the image data, two data fields containing the latitude and longitude of every line and sample were added to the HDF files using the SeaSpace "angles" utility.

3.1 Brief Description of Hierarchical Data Format (HDF)

The following sections provide a brief description of the Hierarchical Data Format, or HDF. A more complete summary and technical information regarding HDF are available from the National Center for Supercomputing Applications (NCSA) HDF home page at http://hdf.ncsa.uiuc.edu.

Hierarchical Data Format (HDF) is a file format developed by the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign (UIUC) and was designed to accommodate scientific data sets. HDF facilitates the sharing of data in a distributed environment and is widely used within the scientific community ("A Short Introduction to HDF," Workshop on Multidisciplinary Applications and Interoperability, Wright-Patterson Air Force Base, May 1997; retrieved February 4, 2002 from http://hdf.ncsa.uiuc.edu/training/hdfintro_mapint97/sld002.htm). HDF can accommodate large data sets, many different data types and structures, and numerous metadata types. HDF binaries and libraries are portable, its data storage is fast and efficient, and additional features can be added as needed. Furthermore, HDF is freely available and uses open standards ("A Short Introduction to HDF," Workshop on Multidisciplinary Applications and Interoperability, Wright-Patterson Air Force Base, May 1997; retrieved February 4, 2002 from http://hdf.ncsa.uiuc.edu/training/hdfintro_mapint97/sld003.htm).
Many NetCDF utilities (e.g., ncdump) will work on HDF files if compiled and linked against the HDF library. Precompiled HDF utilities and binaries for various platforms are available for download from NCSA's HDF home page at http://hdf.ncsa.uiuc.edu. The following examples use ncdump compiled for Solaris and downloaded from NCSA's HDF web site. HDF files can also be read by the Interactive Data Language (IDL). In addition, the HDF package includes C and Fortran 77 routines that can read and write HDF files as well as perform mathematical operations on the HDF data fields.

3.2 DYCOMS-II GOES-10 1KM Visible HDF file format

Below are samples from the header and data records contained in an HDF satellite image file. The header describes the image dimensions, the type and number of variables contained in the HDF file, the units for each variable, and any scaling, offset, or error factors needed when rederiving calibrated variables. The following sections use output from:

    .../4.1r3_solaris/bin/ncdump -h g10.2001182.0000.1KM.dycoms.hdf

This section details the dimensions and channels for each image contained in the HDF file. For the DYCOMS-II 1KM visible imagery, each image consists of 1001 lines by 2201 samples. Visible data are denoted by the "gvar_ch1" descriptor.

    dimensions:
            gvar_ch1_line = 1001 ;
            gvar_ch1_samp = 2201 ;

This section describes how the image data for the visible channel (gvar_ch1) are stored in the data section of the HDF file. The image data are in units of "albedo*100%". The relationship between a value iy stored in the dataset and the actual value y is defined as:

    y = scale_factor * (iy - add_offset)

The variables add_offset_err and scale_factor_err contain the potential errors in add_offset and scale_factor, respectively.
    variables:
            byte gvar_ch1(gvar_ch1_line, gvar_ch1_samp) ;
                    gvar_ch1:long_name = "gvar_ch1" ;
                    gvar_ch1:units = "albedo*100%" ;
                    gvar_ch1:valid_range = '\0', '\377' ;
                    gvar_ch1:_FillValue = '\377' ;
                    gvar_ch1:scale_factor = 0.4000000059604645 ;
                    gvar_ch1:scale_factor_err = 0. ;
                    gvar_ch1:add_offset = 0. ;
                    gvar_ch1:add_offset_err = 0. ;
                    gvar_ch1:calibrated_nt = 21 ;

The following is a portion of the output generated by the command ".../4.1r3_solaris/bin/ncdump -v gvar_ch1 g10.2001182.0000.1KM.dycoms.hdf". This is a sample of the gvar_ch1 (visible) data field.

    data:
     gvar_ch1 = 6, 5, 6, 6, 6, 5, 6, 5, 4, 6, 5, 5, 4, 6, 4, 5, 5, 5, 6, 4, 6,
        6, 4, 5, 6, 4, 5, 6, 5, 5, 6, 4, 5, 5, 5, 7, 6, 5, 5, 5, 4, 5, 4, 6,
        5, 6, 5, 6, 5, 6, 5, 5, 5, 5, 6, 6, 5, 5, 6, 6, 4, 5, 7, 4, 5, 6, 6,
        6, 6, 6, 5, 6, 4, 4, 6, 5, 4, 5, 4, 5, 6, 5, 6, 4, 6, 4, 5, 6, 5, 5,
        6, 4, 6, 4, 6, 4, 5, 5, 4, 4, 5, 5, 5, 5, 6, 4, 6, 4, 4, 6, 4, 7, 6,
        5, 5, 5, 5, 7, 5, 4, 5, 5, 5, 5, 5, 4, 6, 5, 5, 5, 5, 5, 6, 5, 6, 5,
        6, 5, 4, 5, 5, 5, 5, 6, 5, 6, 4, 5, 5, 5, 5, 6, 3, 5, 6, 5, 5, 4, 5,
        5, 5, 5, 4, 6, 5, 6, 6, 5, 4, 6, 5, 5, 4, 6, 5, 5, 5, 5, 6, 5, 6, 7,
        5, 5, 5, 6, 5, 6, 5, 5, 6, 5, 4, ...

The following is a sample of the output generated by the command ".../4.1r3_solaris/bin/ncdump -h g10.2001182.0000.1KM.dycoms.hdf".

            float latitude(gvar_ch1_line, gvar_ch1_samp) ;
                    latitude:long_name = "latitude" ;
                    latitude:units = "degrees" ;
                    latitude:valid_range = -3.4028235e+38f, 3.4028235e+38f ;
                    latitude:_FillValue = -3.4028235e+38f ;
                    latitude:scale_factor = 1. ;
                    latitude:scale_factor_err = 0. ;
                    latitude:add_offset = 0. ;
                    latitude:add_offset_err = 0. ;
                    latitude:calibrated_nt = 5 ;

The following is a portion of the output generated by the command ".../4.1r3_solaris/bin/ncdump -v latitude g10.2001182.0000.1KM.dycoms.hdf".
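As a worked example of the calibration relationship above, the following Python sketch converts raw gvar_ch1 byte counts to albedo*100% using the scale_factor, add_offset, and _FillValue attributes from the header. The constants are copied from the ncdump output; the function name is illustrative, not part of any HDF library.

```python
# Sketch: apply the documented calibration y = scale_factor * (iy - add_offset)
# to raw gvar_ch1 byte counts. Attribute values are copied from the ncdump
# header above; the helper name is ours, not part of the HDF library.

SCALE_FACTOR = 0.4000000059604645  # gvar_ch1:scale_factor
ADD_OFFSET = 0.0                   # gvar_ch1:add_offset
FILL_VALUE = 255                   # gvar_ch1:_FillValue ('\377' octal = 255)

def calibrate(raw_count):
    """Convert a raw gvar_ch1 count to albedo*100%; None for fill pixels."""
    if raw_count == FILL_VALUE:
        return None
    return SCALE_FACTOR * (raw_count - ADD_OFFSET)

# The first sample values in the data record (6, 5, 4) correspond to
# albedos of roughly 2.4%, 2.0%, and 1.6%.
print(calibrate(6), calibrate(255))
```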
    data:
     latitude = 38.26001, 38.26011, 38.26022, 38.26032, 38.26043, 38.26054,
        38.26064, 38.26075, 38.26085, 38.26096, 38.26107, 38.26117, 38.26128,
        38.26139, 38.26149, 38.2616, 38.26171, 38.26181, 38.26192, 38.26203,
        38.26213, 38.26224, 38.26235, 38.26245, 38.26257, 38.26267, 38.26278,
        38.26289, 38.263, 38.2631, 38.26321, 38.26332, 38.26343, 38.26354,
        38.26365, 38.26376, 38.26386, 38.26397, 38.26408, 38.26419, 38.2643,
        38.26441, 38.26452, 38.26463, 38.26474, 38.26485, 38.26496, 38.26507,
        38.26518, ...

The following is a sample of the output generated by the command ".../4.1r3_solaris/bin/ncdump -h g10.2001182.0000.1KM.dycoms.hdf".

            float longitude(gvar_ch1_line, gvar_ch1_samp) ;
                    longitude:long_name = "longitude" ;
                    longitude:units = "degrees" ;
                    longitude:valid_range = -3.4028235e+38f, 3.4028235e+38f ;
                    longitude:_FillValue = -3.4028235e+38f ;
                    longitude:scale_factor = 1. ;
                    longitude:scale_factor_err = 0. ;
                    longitude:add_offset = 0. ;
                    longitude:add_offset_err = 0. ;
                    longitude:calibrated_nt = 5 ;

The following is a sample of the output generated by the command ".../4.1r3_solaris/bin/ncdump -v longitude g10.2001182.0000.1KM.dycoms.hdf".

    data:
     longitude = -128.1357, -128.1288, -128.1218, -128.1149, -128.108,
        -128.1011, -128.0941, -128.0872, -128.0803, -128.0733, -128.0664,
        -128.0595, -128.0526, -128.0456, -128.0387, -128.0318, -128.0248,
        -128.0179, -128.011, -128.004, -127.9971, -127.9902, -127.9832,
        -127.9763, -127.9694, -127.9624, -127.9555, -127.9486, -127.9416,
        -127.9347, -127.9278, -127.9208, -127.9139, -127.907, -127.9,
        -127.8931, -127.8862, -127.8792, -127.8723, -127.8654, -127.8584,
        -127.8515, -127.8446, -127.8376, -127.8307, -127.8237, -127.8168,
        -127.8099, -127.8029, -127.796, -127.789, -127.7821, -127.7752,
        -127.7682, ...

The HDF files in this dataset contain only the data from the GOES-10 visible channel plus the latitudes and longitudes for every line and sample.
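Because every pixel carries its own latitude and longitude, locating the pixel nearest a point of interest reduces to a nearest-neighbor search over the two fields. The following Python sketch shows the idea; the tiny grids are illustrative stand-ins for the real 1001 x 2201 arrays, and the crude squared-degree distance is adequate only over a small sector like this one.

```python
# Sketch: find the (line, sample) index of the pixel nearest a target
# lat/lon, given the per-pixel latitude/longitude fields stored in the
# HDF file. The small grids below are illustrative, not real file data.

def nearest_pixel(lat_grid, lon_grid, target_lat, target_lon):
    """Return the (line, samp) pair minimizing squared degree distance."""
    best, best_d2 = None, float("inf")
    for i, (lat_row, lon_row) in enumerate(zip(lat_grid, lon_grid)):
        for j, (lat, lon) in enumerate(zip(lat_row, lon_row)):
            d2 = (lat - target_lat) ** 2 + (lon - target_lon) ** 2
            if d2 < best_d2:
                best, best_d2 = (i, j), d2
    return best

lats = [[32.0, 32.0, 32.0],
        [31.0, 31.0, 31.0]]
lons = [[-122.0, -121.0, -120.0],
        [-122.0, -121.0, -120.0]]
print(nearest_pixel(lats, lons, 31.2, -120.9))  # -> (1, 1)
```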
4.0 Quality Control Procedures

UCAR/JOSS conducted no quality checks on these data.

5.0 File Naming Convention

The file names are structured as follows:

    goes-10.yyyyddd.hhmm.1KM.dycoms.hdf

where:

    goes-10  is the satellite used
    yyyy     is the four-digit year
    ddd      is the three-digit Julian day (day of year)
    hh       is the hour (UTC)
    mm       is the minute
    1KM      is the product's resolution

The files on the NCAR Mass Store are gzipped and therefore have a .gz extension at the end of the file name.

6.0 References

HDF Reference Manual, Version 4.1r2, June 1998, NCSA, University of Illinois at Urbana-Champaign.

National Center for Supercomputing Applications (NCSA) HDF home page at http://hdf.ncsa.uiuc.edu.

6.1 Links to HDF documentation and sources

The National Center for Supercomputing Applications (NCSA) HDF home page at http://hdf.ncsa.uiuc.edu is the best place to begin searching for HDF utilities, binaries, and documentation.

This URL, located at NCSA's HDF home page, contains numerous links to online HDF documentation and tutorials: http://hdf.ncsa.uiuc.edu/present.html

This URL contains links to HDF4 documentation, binaries, utilities, etc.: http://hdf.ncsa.uiuc.edu/hdf4.html
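The timestamp embedded in the naming convention above can be recovered with Python's standard datetime module. A sketch follows; note that the sample files shown in Section 3.2 carry a "g10" prefix rather than "goes-10", so the parser keys on the date/time fields rather than on the prefix. The helper name is illustrative.

```python
# Sketch: recover the observation time (UTC) from a file name following
# the yyyyddd.hhmm convention described above. Works for either the
# "goes-10." or the "g10." prefix seen in the sample files.

from datetime import datetime

def parse_file_time(filename):
    """Return the datetime encoded in a *.yyyyddd.hhmm.* file name."""
    parts = filename.split(".")
    # parts[1] = yyyyddd (year + Julian day), parts[2] = hhmm
    return datetime.strptime(parts[1] + parts[2], "%Y%j%H%M")

print(parse_file_time("g10.2001182.0000.1KM.dycoms.hdf"))
# -> 2001-07-01 00:00:00 (day 182 of 2001 is 1 July)
```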