Instructions for the Installation of H4CF Conversion Toolkit Software
=====================================================================

  This file provides instructions for installing the H4CF Conversion Toolkit
software.  

  If you have any problems with the installation, please use the HDF-EOS forum:

  http://hdfeos.org/forums/

------------
REQUIREMENTS
------------

Requirements for Library
------------------------

 HDF4 and HDF-EOS2 libraries are required.
 The HDF4 library may need external libraries such as jpeg
 and zlib depending on your system.

Requirements for Conversion Tool
---------------------------------

  Note: 
  This software has been updated to the 1.4 release, which is a minimal 
  maintenance release. The dependent library versions are listed in 
  doc/RELEASE.txt. The paragraphs below describe how to build the conversion 
  toolkit for the 1.3 release. If you have trouble building the h4tonccf 
  utility, check the README file under the utility/ directory for tips.


  To use the HDF-EOS2/HDF4 to NetCDF-CF conversion tool (h4tonccf) to generate 
the netCDF files, the netcdf C library must be installed.

  For this release, we have tested against netcdf-4.7.3 (built with HDF5 1.10.5), 
hdfeos-2.19, hdf-4.2.15, and the corresponding jpeg and zlib libraries.

  We also added minimal/experimental MS Visual Studio support in this release. 
  To install and build on Windows with MS Visual Studio, please read 
  INSTALL_WINDOWS.txt under the windows/ directory. This installation document 
  is for Linux/Unix and MacOS users.

  The HDF4 library MUST be built with the netCDF option DISABLED. Otherwise, 
the HDF4 to netCDF-CF conversion tool may fail to compile or to work 
correctly. When you configure the HDF4 library, add the "--disable-netcdf" 
option to the ./configure command.
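  As a sketch, an HDF4 configure invocation with netCDF disabled might look
  like the following; all paths are placeholders for your own installation
  prefixes:

```shell
# Hypothetical HDF4 build with its built-in netCDF API disabled,
# as required by the h4tonccf conversion tool. All paths are placeholders.
cd hdf-4.2.15
./configure --prefix=/usr/local/hdf4 \
            --with-jpeg=/usr/local/jpeg \
            --with-zlib=/usr/local/zlib \
            --disable-netcdf
make && make install
```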

  It's possible that your system's default netcdf library is out of date
 and not suitable for h4tonccf; the netcdf library version must be higher 
than 4.3.0. In that case, install a newer netcdf library yourself, and 
please don't forget to update your PATH environment variable to include the 
path of the installed netcdf tools:

  $ export PATH=/netcdf_prefix/bin:$PATH

  The above step is required if you're going to run the conversion tool tests
 through the 'make check' command.
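  One way to check whether an installed netcdf library is new enough is to
  compare its version string against the 4.3.0 minimum. The sketch below
  assumes the version string has already been obtained (for example from
  'nc-config --version') and is passed in as the first argument; it uses
  'sort -V' for the version comparison:

```shell
# Sketch: compare a netcdf version string against the minimum required
# by h4tonccf. Pass the installed version as $1; 4.7.3 (the version this
# release was tested against) is used as a stand-in default.
required=4.3.0
installed=${1:-4.7.3}
# sort -V orders version strings numerically; if the minimum sorts first
# and the two differ, the installed version exceeds the requirement.
lowest=$(printf '%s\n' "$required" "$installed" | sort -V | head -n1)
if [ "$lowest" = "$required" ] && [ "$installed" != "$required" ]; then
  echo "netcdf $installed satisfies the > $required requirement"
else
  echo "netcdf $installed is too old; install a version higher than $required"
fi
```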

  Building the netCDF library with --disable-netcdf-4 is fine; h4tonccf will 
then generate NetCDF-3 format files. However, this is not recommended because 
the NetCDF-3 format doesn't support unsigned integer types, so an overflow 
may occur when converting unsigned 32-bit integer data.

-----
BUILD
-----

  Decompress and unpack the source tar file. You will see 3 main sub-directories:
 src/, utility/, and examples/. To build and install the library and utility 
tools, run the configure script with a command like this:

        $ tar zxvf h4cf-1.3.tar.gz
        $ cd h4cf-1.3/src
        $ ./configure \
                --prefix=/your installation path \
                --with-jpeg=/your jpeg library path \
                --with-zlib=/your zlib library path \
                --with-szlib=/your szip library path \
                --with-hdf4=/your HDF4 library path \
                --with-hdfeos2=/your hdfeos2 library path \
                --with-hdf5=/your hdf5 library path \
                --with-netcdf=/your netcdf library path
        $ make 
        $ make check                # Run test suite (optional).
        $ make install

  If HDF4 and HDF-EOS2 libraries are not built with szip, the --with-szlib 
option doesn't need to be provided. 

  If jpeg and zlib are provided by the system, the --with-jpeg and --with-zlib 
options can also be omitted. 

  If you want to install the conversion library only and your system doesn't 
have the netcdf library, the --with-netcdf option doesn't need to be provided. 
If your system already has the netcdf library in your standard system path,
you can skip building the conversion tool by specifying --with-netcdf=no 
explicitly.

  If your netcdf library is built with --disable-netcdf4, the --with-hdf5 option
doesn't need to be provided.
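  Putting these simplifications together, a minimal configure line for a
  system whose jpeg and zlib come from the system, whose HDF4/HDF-EOS2 were
  built without szip, and whose netcdf was built with --disable-netcdf4,
  might look like this sketch (all paths are placeholders):

```shell
# Hypothetical minimal configure: system jpeg/zlib, no szip, and a
# NetCDF-3-only netcdf build (so no --with-hdf5 is needed).
# All paths are placeholders for your own installation prefixes.
./configure --prefix=/usr/local/h4cf \
            --with-hdf4=/usr/local/hdf4 \
            --with-hdfeos2=/usr/local/hdfeos2 \
            --with-netcdf=/usr/local/netcdf
```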

  We also provide simple example code showing how to use the C++ high-level
library APIs to retrieve values of variables and attributes. Because Makefiles 
for the examples are not generated by autoconf, generate the executables as 
follows after editing Makefile.template:

        $ cd h4cf-1.3/examples/hdf-eos2/ # See below for more choices.
        $ vi Makefile.template # Edit the Makefile template to change paths. 
        $ make -f Makefile.template

  This will generate four executables, which are "read", "read_subset", 
"read_var_attrs", and "read_file_attrs". "read" will obtain all values of a 
variable. "read_subset" will obtain only the subset of a variable. 
"read_var_attrs" will obtain attributes of a variable. "read_file_attrs" will 
obtain attributes of a file. 

  We also provide examples/hdf4/sds, examples/hdf4/vdata, and examples/hybrid 
directories with the same four example programs. 

----
TEST
----

  If the netcdf library is not included during configuration, the 'make check'
command will not test the h4tonccf conversion tool; it will test only h4cf.

  If the conversion utility (h4tonccf) test fails, please make sure that ncdump 
is the one from the netcdf package that you built the tool with:

  $ which ncdump

  If you have multiple installations of the netcdf library, update your PATH
environment variable so that the correct installation of ncdump appears in the 
output of the above command, then run 'make check' again. 
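  For example, if your chosen netcdf installation lives under a custom prefix
  (the path below is a placeholder), you could prepend its bin/ directory and
  re-run the check:

```shell
# Sketch: put the intended netcdf bin/ directory first on PATH so its
# ncdump shadows any system copy, then re-run the test suite.
# /opt/netcdf is a placeholder prefix.
export PATH=/opt/netcdf/bin:$PATH
hash -r            # clear the shell's cached command lookups
which ncdump       # should now print the ncdump under /opt/netcdf/bin
make check
```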

  Alternatively, you can manually edit testsuite/h4cfTest.nc.at or 
testsuite/h4cfTest.nc3.at autotest script to include the full path in front
of the ncdump command. See the comments in the h4cfTest.nc.at or h4cfTest.nc3.at
 file for details.

  Since our toolkit is for NASA HDF-EOS2/HDF4 products, we provide a way to test
real NASA sample data with h4tonccf. Please follow the steps below to test 
NASA data.

  0. Make sure that you have built the h4tonccf conversion tool with 
     the latest netcdf library. If you disabled NetCDF-4 output, you'll need
     26G of disk space for converted NASA NetCDF-3 files. Otherwise, you'll 
     need 8G of disk space for converted NASA NetCDF-4 files.

  1. Make sure that you have an additional 10G of disk space, then download 
     the NASA HDF files using

      $ cd testsuite/data.nasa1 && ./download.sh

   It will download all test data from The HDF Group FTP site and put it
   under the testsuite/data.nasa1/ directory.


  2. Copy the NASA data autotest script over the original autotest script. 

    2.1  If you used NetCDF-4 option for h4tonccf, update h4cfTest.nc.at:
     
      $ cp testsuite/h4cfTest.nc.nasa1.at testsuite/h4cfTest.nc.at

    2.2  If you used NetCDF-3 option for h4tonccf, update h4cfTest.nc3.at:
    
      $ cp testsuite/h4cfTest.nc3.nasa1.at testsuite/h4cfTest.nc3.at

  3. Run 'make check'. This will test the downloaded NASA files from Step 1 
     instead of the sample files in the distribution.

  4. You can find all converted NASA files under testsuite/tmp directory.

  5. For Linux, you can use valgrind to check whether the toolkit has memory 
     leaks.

      $ cp testsuite/h4cfTest.nc.valgrind.at testsuite/h4cfTest.nc.at
      $ make check

     The valgrind output is saved as /tmp/h4cf.valgrind.log. Grepping that 
     log for "definitely lost: " should show 0 bytes in 0 blocks for all 
     tests:

      $ grep "definitely lost: " /tmp/h4cf.valgrind.log