H4CF Conversion Toolkit User's Guide
*************************************

This document describes how to use the H4CF library and the h4tonccf conversion tool.

It was last updated on June 15, 2013 for version 1.0.beta.

CONTENTS
========
- How to Use H4CF Library APIs
- How to Use HDF4-to-NetCDF Conversion Tool
- Examples
- Editing the Converted NetCDF File

----------------------------
How to Use H4CF Library APIs
----------------------------
 In general, here's how to use the APIs:

   Step 1: Use h4cf_open() to open an HDF4 file or an HDF-EOS2 file and 
           initialize library.
   Step 2: Use h4cf_get_vars() to obtain all variables. Each variable 
           corresponds to a SDS/Vdata field in the opened HDF4/HDF-EOS2 file.
   Step 3: For the variables obtained in Step 2, one can further obtain their 
           data values through h4cf_get_var_value() or their attributes 
           through h4cf_get_var_attrs().
   Step 4: Use h4cf_get_file_attrs() to retrieve all the file attributes.
   Step 5: Close the library through h4cf_close().

   For detailed usage of the APIs, please refer to the code under 
   the "examples" sub-directory in the source distribution.

-----------------------------------------
How to Use HDF4-to-NetCDF Conversion Tool
-----------------------------------------

  h4tonccf is a conversion tool that uses the H4CF conversion library APIs to 
convert an HDF4 (including HDF-EOS2) file to either a NetCDF-3 or a NetCDF-4 
classic file.

  You can execute h4tonccf as follows:

        h4tonccf filename.hdf [filename.nc]
  
  The output NetCDF file name is optional. If it is omitted, the HDF4 base 
  file name plus the .nc extension will be used as the NetCDF file name.

  For example, if you don't provide an output file name:

	$ ./h4tonccf CER_ISCCP-D2like-Day_Aqua-FM3-MODIS_Beta1_023030.200612.hdf

  The "CER_ISCCP-D2like-Day_Aqua-FM3-MODIS_Beta1_023030.200612.nc" file will be 
  created.
	
  On the other hand, if you provide an output file name "z.nc",

	$ ./h4tonccf AIRS.2002.12.31.001.L2.CC_H.v5.0.14.0.G07282131425.hdf z.nc

  "z.nc" file will be created.

---------
Examples
---------

  We provide four examples for four sample HDF4 files - one HDF-EOS2, one HDF4 
SDS, one HDF4 Vdata, and one HDF-EOS2+HDF4 hybrid. "Hybrid" means the file was 
originally created with HDF-EOS2 APIs but was later modified with HDF4 APIs. 
Any object that was modified by HDF4 APIs may no longer be accessible through 
HDF-EOS2 APIs. Such hybrid files are often found among NASA HDF-EOS2 products.

  For example, the examples/hdf-eos2/ directory in the distribution provides 
"geo.hdf" for you to test. The "geo.hdf" file contains a 2D "temp" dataset and 
some file attributes.

  The four example programs for the "geo.hdf" file are:

  1) "read" -  obtain the all data values of "temp".
  2) "read_subset" - obtain the partial data values of "temp".
  3) "read_var_attrs" - obtain the variable attributes.
  4) "read_file_attrs" - obtain the file attributes.

  The four example programs share the code that initializes and closes the 
library; they differ in which data-access APIs they call.

  Because Makefiles for these examples are not generated by autoconf, you can 
build the executables after editing Makefile.template as follows:

        $ cd examples/hdf-eos2/
        $ vi Makefile.template # Edit the Makefile template to change paths. 
        $ make -f Makefile.template

  You can execute example programs for "geo.hdf" one by one as follows:

        $ ./read

The output will be like:

Row: 0
0       1       2       3       4       5       6       7       8       9      10       11      12      13      14      15      16      17      18      19     20       21      22      23      24      25      26      27      28      29     30       31      32      33      34      35

......

Row: 17
612     613     614     615     616     617     618     619     620     621    622      623     624     625     626     627     628     629     630     631    632      633     634     635     636     637     638     639     640     641    642      643     644     645     646     647

        $ ./read_subset

The output will be like:

        vals[0,0]=0     vals[0,1]=2     vals[0,2]=4     vals[0,3]=6
        vals[1,0]=72    vals[1,1]=74    vals[1,2]=76    vals[1,3]=78
        vals[2,0]=144   vals[2,1]=146   vals[2,2]=148   vals[2,3]=150
        vals[3,0]=216   vals[3,1]=218   vals[3,2]=220   vals[3,3]=222

        $ ./read_var_attrs

The output will be like:

        _FillValue = -999

        $ ./read_file_attrs

The output will be like:

        HDFEOSVersion = HDFEOS_V2.16
        StructMetadata_0 = GROUP=SwathStructure
        END_GROUP=SwathStructure
        GROUP=GridStructure
        GROUP=GRID_1
                GridName="grid1"
                XDim=36
                YDim=18
                UpperLeftPointMtrs=(-180000000.000000,90000000.000000)
                LowerRightMtrs=(180000000.000000,-90000000.000000)
                Projection=GCTP_GEO
                GROUP=Dimension
                END_GROUP=Dimension
                GROUP=DataField
                        OBJECT=DataField_1
                                DataFieldName="temp"
                                DataType=DFNT_FLOAT64
                                DimList=("YDim","XDim")
                        END_OBJECT=DataField_1
                END_GROUP=DataField
                GROUP=MergedFields
                END_GROUP=MergedFields
        END_GROUP=GRID_1
        END_GROUP=GridStructure
        GROUP=PointStructure
        END_GROUP=PointStructure
        END

---------------------------------
Editing the Converted NetCDF File
---------------------------------
  Although the H4CF Conversion Toolkit satisfies the key requirements of the 
CF conventions that make NetCDF visualization tools work, there are some 
non-CF-compliant attributes that the toolkit does not convert. To standardize 
such attributes, we recommend using either NcML or NCO.

  For example, if you carefully examine the plot of the OBPG MODIS-T variable 
"l3m_data" from the converted file T20000322000060.L3m_MO_NSST_4.nc* with the 
Panoply tool, you will notice that the colorbar scale has no unit and the land 
area is covered with fill values. This is because the "units" attribute is not 
provided for "l3m_data" as a dataset attribute and the "_FillValue" attribute 
is missing in the original HDF4 file.

  You can add (or overwrite) the "units" and "_FillValue" attributes and their 
values by writing a simple NcML file as follows:
--------------------------------------------------------------------------------
<?xml version="1.0" encoding="UTF-8"?>
<netcdf location="T20000322000060.L3m_MO_NSST_4.nc"   
  xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2">
  <variable name="l3m_data" type="int">
    <attribute type="string"  name="units" value="deg-C" />
    <attribute type="int"  name="_FillValue" value="65535" />
  </variable>
</netcdf>
--------------------------------------------------------------------------------
  Then, you can open the NcML file directly with Panoply after putting the 
NcML file and the converted NetCDF file into the same directory. For details,
please refer to examples/ncml/README.txt document in the distribution.

  You can also add or revise an attribute of a NetCDF file by using the 
"ncatted" command in NCO.
--------------------------------------------------------------------------------
  $ ncatted -a 'units,l3m_data,c,c,deg-C' \
   T20000322000060.L3m_MO_NSST_4.nc T20000322000060.L3m_MO_NSST_4.nco.nc
  $ ncatted -a '_FillValue,l3m_data,c,i,65535' \
   T20000322000060.L3m_MO_NSST_4.nco.nc
--------------------------------------------------------------------------------
   In the above example, the first command creates a new file (i.e., 
T20000322000060.L3m_MO_NSST_4.nco.nc) with the units attribute added, and the 
second command modifies that new file in place. For details about "ncatted", 
please visit

    http://nco.sourceforge.net/nco.html#xmp_ncatted

  The table below summarizes some NASA products that need editing after 
conversion. Please note that we could not test all NASA products, so the table 
is far from complete. Please let us know at hdfeos.org/forum if you find other 
products that need editing.
|==============================================================================|
| NASA Data Center | Products        | Notes                                   |
|==============================================================================|
|                  |                 |                                         |
| GES DISC         | AIRS            | Add units attribute.                    |
| LAADS            | MOD04/MYD04     | Change the scale factor value.          |
| LP DAAC          | MOD09GA/MYD09GA | Change the scale factor value.          |
| NSIDC            | AMSR_E_L2A      | Rename scale factor.                    |
| OBPG             | CZCS/MODIS      | Add scale/fill value/offset attributes. |
|__________________|_________________|_________________________________________|


*This HDF4 sample file is available through the testsuite/data.nasa1/download.sh
 script. Please download it with the script and convert it with the h4tonccf 
 tool.
