BRAMS
                      Minimal Installation
                      
1. Introduction

	The BRAMS minimal installation does not require some external libraries, such as:
	
	a) GRIB2 (WGRIB2): used to read GFS/NCEP model analyses directly;
	b) NETCDF4 with CDF5: used to read GEOS/NASA model analyses directly;
	c) HDF5: used to read some fixed data, such as NDVI.
	
	The minimal installation may be used if you read analysis data in GrADS
	format generated from a global model, fixed data in binary or GrADS format,
	or data in BRAMS's native vfm format generated by the dprep software.
	
2. Prerequisites:

	BRAMS must be compiled with a Fortran compiler. The model was tested
	with the GNU (gfortran & gcc), Intel (ifort, icc), Portland (pgf90, pgcc),
	IBM, NEC and Cray compilers.
	
	BRAMS works only in parallel mode. One can run the model using a single
	processor/core, but must do it using MPI with the mpirun command.
	If you don't have an MPI library installed, we recommend downloading and
	installing the latest stable release of MPICH.
	Also take care to choose the version appropriate for your OS.
	
	example:
	
	> cd ~/install
	> wget http://www.mpich.org/static/downloads/3.3.2/mpich-3.3.2.tar.gz
	> tar -zxvf mpich-3.3.2.tar.gz
	> cd mpich-3.3.2
	> ./configure --disable-fast CFLAGS=-O2 FFLAGS=-O2 CXXFLAGS=-O2 FCFLAGS=-O2 \
	            --prefix=/opt/mpich3
	> make
	> sudo make install
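	
	If you install MPICH under /opt/mpich3 as above, you may also want to add
	its bin directory to your PATH so that the mpirun command used later is
	found (a minimal sketch for bash):
	
	> export PATH=/opt/mpich3/bin:$PATH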
	
	NOTE: If your system uses a compiler other than GNU, use it when
	configuring MPICH.
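	
	For example, with the Intel compilers the same configure might look like
	this (just a sketch; adjust the compiler names to the ones available on
	your system):
	
	> ./configure --disable-fast CC=icc CXX=icpc FC=ifort F77=ifort \
	            CFLAGS=-O2 FFLAGS=-O2 CXXFLAGS=-O2 FCFLAGS=-O2 \
	            --prefix=/opt/mpich3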
	
3. Getting the Model:

	BRAMS is under version control using Subversion. The stable and beta test
	releases are on the INPE/CPTEC FTP server. To get the latest test release:
	
	> wget http://ftp.cptec.inpe.br/pesquisa/bramsrd/BRAMS/releases/beta/BRAMS_GEOS.tgz
	
4. Unpacking & Compiling the Model:

	Unpack the model:
	
	> tar -xzvf BRAMS_GEOS.tgz
	
	Go to the build folder:
	
	> cd BRAMS_GEOS/build
	
	Run the configure script:
	
	> ./configure --program-prefix=BRAMS_5.4 --prefix=/home/<user> --enable-jules \
	   --with-chem=RELACS_TUV --with-aer=SIMPLE --with-fpcomp=/opt/mpich3/bin/mpif90 \
	   --with-cpcomp=/opt/mpich3/bin/mpicc --with-fcomp=gfortran --with-ccomp=gcc 
	
	Notes: 
	a) --prefix=/home/<user> - you must choose your <user> installation area;
	b) In the example above we use an MPICH3 and gfortran (GNU) installation;
	c) In the example we use the MPI installed in the /opt/mpich3 area.
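	
	If you want to see the other options accepted by the BRAMS configure
	script, the usual convention should work (assuming the script follows it):
	
	> ./configure --help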
	
	Make and Make install the model:
	
	> make
	> make install
	
	Note: The binary executable will be installed in the bin folder created under
	the --prefix you specified. This will be your "run" area.
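	
	You can quickly check that the executable was installed (it is the one
	called brams-5.4 in the run examples of section 9):
	
	> ls /home/<user>/bin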
	
5. Getting fixed data and tables

	The model uses some fixed files such as topography, vegetation, NDVI, etc.
	You must get the data and put it in a folder to be read by the model.
	
	5.1 Getting table data:
	
	Get the table data and put it in the bin folder created during the install:
	
	> cd /home/<user>/bin 
	> wget http://ftp.cptec.inpe.br/pesquisa/bramsrd/BRAMS/files/bramsTablesRev2000.tgz  
	> tar -xzvf bramsTablesRev2000.tgz 
	
	Note: the table data size is 405MB
	
	5.2 Getting the fix data:
	
	> wget http://ftp.cptec.inpe.br/pesquisa/bramsrd/BRAMS/files/datafix.tar
	> tar -xvf datafix.tar
	
	Note: the datafix data size is 3.5GB
	
	All the data inside datafix is compressed with bzip2. Please decompress
	(bunzip2) all the data in all sub-folders.
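	
	One way to decompress everything in a single command (a sketch using the
	standard find and bunzip2 utilities; <datafix folder> is wherever you
	unpacked datafix.tar):
	
	> find <datafix folder> -name "*.bz2" -exec bunzip2 {} \;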
	
6. Getting the Initial and boundary conditions:

	The model needs initial and boundary conditions to run. The data can be
	obtained from the INPE/CPTEC FTP site.
	
	Create a datain subfolder inside your bin folder:
	
	> mkdir datain
	> cd datain
	
	Get all the data in http://ftp.cptec.inpe.br/pesquisa/bramsrd/BRAMS/data/GRADS/20200515/
	for the test (gra & ctl files) for 15 May 2020.
	
	You can get data for just one day of forecast:
	
	from IC2020051500.* to IC2020051600.* and SM.GEOS.202005150000.*
	
	or for up to 10 days of forecast.
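	
	One way to download the whole test set for that day is wget's recursive
	mode (a sketch; check the downloaded files afterwards):
	
	> wget -r -np -nd -R "index.html*" \
	    http://ftp.cptec.inpe.br/pesquisa/bramsrd/BRAMS/data/GRADS/20200515/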
	
7. The tmp folder.

	BRAMS uses the JULES model for the surface. JULES needs a temporary
	directory to run. Please create it inside your bin folder and export the
	TMPDIR variable before running the model:
	
	> cd /home/<user>/bin 
	> mkdir tmp
	> export TMPDIR=./tmp
	
8. The NAMELIST RAMSIN

	BRAMS uses a namelist called RAMSIN by default. We provide a minimal
	RAMSIN to be modified by the user; get it from the FTP area:
	
	> wget http://ftp.cptec.inpe.br/pesquisa/bramsrd/BRAMS/files/RAMSIN_MINIMAL
	
	Let's go over some important RAMSIN variables:
	
	* RUNTYPE  = 'MAKESFC': BRAMS is executed in 3 phases: MAKESFC, MAKEVFILE and
	INITIAL. The first two phases prepare the model for the run: they create the
	fixed data and the initialization data for the area of interest. You must run
	MAKESFC and MAKEVFILE using just 1 processor (or core). INITIAL is the time
	integration phase; it reads the data produced by the 2 previous phases and
	integrates forward. The INITIAL phase may be executed with the maximum number
	of processors possible (it depends on the area configuration).
	
	* TIMMAX   = 24., : defines how long the forecast will run (in hours);
	
	* IMONTH1  = 05,     ! Month
	* IDATE1   = 15,     ! Day
	* IYEAR1   = 2020,   ! Year
	* ITIME1   = 0000,   ! GMT of model TIME = 0.
	The variables above define the initial integration time.
	
	* NGRIDS   = 1, : Number of grids to run. Version 5.x of BRAMS only
	admits 1 grid.
 
	* NNXP     = 80, 318, 154, ! Number of x gridpoints
	* NNYP     = 80, 346, 102, ! Number of y gridpoints
	* NNZP     = 35,  38,  38, ! Number of z gridpoints
	* NZG      = 7,            ! Number of soil layers
	* NZS      = 1,            ! Maximum number of snow layers
	* DELTAX   = 10000.,       ! X grid spacing
	* DELTAY   = 10000.,       ! Y grid spacing
	* DELTAZ   = 100.,         ! Z grid spacing (set to 0. to use ZZ)
	* DZRAT    = 1.1,          ! Vertical grid stretch ratio
	* POLELAT  = -21.16,       ! Latitude  of pole point
	* POLELON  = -44.93,       ! Longitude of pole point
	* CENTLAT  = -21.16,       ! Latitude  of grid center
	* CENTLON  = -44.93,       ! Longitude of grid center
    
	The variables above define the size of the area (80 x 80 points, which with
	a 10 km spacing gives a domain of roughly 800 x 800 km), the number of
	vertical levels (35), the number of soil and snow layers, the grid spacing
	in meters (in this case 10 by 10 km), the first vertical grid spacing
	(100 m, DELTAZ) and the stretch ratio applied to the levels above the first
	one (DZRAT). CENTLAT and CENTLON are the center of the domain area and are
	usually the same as POLELAT and POLELON. Moving the pole point away from
	the center causes geographical distortion of the area.
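	
	For example (hypothetical values, roughly the city of São Paulo), to center
	the same domain over another point you would change only these four
	variables:
	
	* POLELAT  = -23.55,       ! Latitude  of pole point
	* POLELON  = -46.63,       ! Longitude of pole point
	* CENTLAT  = -23.55,       ! Latitude  of grid center
	* CENTLON  = -46.63,       ! Longitude of grid center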
	
	NOTES:
	i. IF YOU CHANGE ONE OF THE VARIABLES ABOVE YOU MUST RUN THE 3 PHASES OF
	THE MODEL AGAIN!
	ii. The size defined by default is small enough to run on an ordinary laptop.
	iii. If you run a test with the same area/time/date configuration, you only
	need to run the INITIAL phase again.
	
	* DTLONG   = 60., : Coarse grid long timestep. This is an important value!
	It tells the model integration the delta t used to advance in time.
	Smaller values are safer, but the model will take much longer to finish.
	Larger values may cause model instability and a crash. If you are using
	DYNCORE_FLAG = 0 (leapfrog method), pay attention to the SSCOURN value
	printed on screen when the model runs: the value must be close to 1. If you
	are using DYNCORE_FLAG = 2 (Runge-Kutta), the value may be close to 4. If
	the model crashes with a Courant number blow-up, it may be necessary to
	decrease DTLONG.
	
	NOTE: If you change DELTAX or DELTAY, the value of DTLONG must be changed
	accordingly! For example, if you halve DELTAX and DELTAY to 5000 m, DTLONG
	should be roughly halved as well (e.g. 30 s).
	
	* FRQANL   = 3600., : Time between outputs, in seconds (in this case 1 hour).
	On some computer systems this value is relevant because I/O operations are
	involved. You can test the model without I/O by setting IPOS = 0 and
	IOUTPUT = 0. By default we use only the post-processed GrADS files and turn
	IOUTPUT off. For IPOS = 2 the output will be the variables listed in the
	$POST section (close to the end of RAMSIN) and the output files are defined
	by GPREFIX, in this case GPREFIX = './dataout/forecast'.
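	
	If GPREFIX points to './dataout/forecast', make sure the dataout folder
	exists in your run area before the INITIAL phase (a small sketch, assuming
	the model does not create it by itself):
	
	> cd /home/<user>/bin
	> mkdir dataout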
	
9. Running the Model:

	You must run the 3 phases of the model.
	
	Phase #1) Edit RAMSIN and change RUNTYPE to 'MAKESFC'.
	Then submit the run with only one processor (or core):
	
	mpirun -np 1 brams-5.4 -f RAMSIN_MINIMAL
	
	Pay attention to the screen output. Errors may occur for many reasons.
	Check it carefully in case of problems.
	
	Phase #2) Edit RAMSIN and change RUNTYPE to 'MAKEVFILE'.
	Then submit the run with only one processor (or core):
	
	mpirun -np 1 brams-5.4 -f RAMSIN_MINIMAL
	
	Phase #3) Edit RAMSIN and change RUNTYPE to 'INITIAL'.
	Then submit the run with many processors (or cores):
	
	mpirun -np 40 brams-5.4 -f RAMSIN_MINIMAL
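	
	If you also want to keep the screen output discussed in section 10 in a
	file, one option is to pipe it through tee (a sketch):
	
	mpirun -np 40 brams-5.4 -f RAMSIN_MINIMAL 2>&1 | tee brams.log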
	
	NOTE: The number of cores you can use depends on the machine architecture and
	on the model grid you defined. If you use more cores than the model can
	support for one grid, you must tune the grid. In general the problem occurs
	when the number of grid columns per core is smaller than the halo of the MPI
	communication. A small area with few columns should not be used with many
	cores. The model will notify you when this problem is present.
	
	To see the model's forecast, just look inside the GPREFIX output folder for
	the generated GrADS files.
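	
	For example (the exact file names depend on GPREFIX and on the run date):
	
	> cd /home/<user>/bin/dataout
	> ls forecast*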
	
10. The dump in screen

	The BRAMS model prints some important information during the run of each phase.
	Check it to find errors or mistakes in the configuration. One important piece
	of information is in the following lines:
	
	 === Initial timestep info: ngrid, nndtrat, nnacoust,     dtlongn, sscourn,  sspct ===
                               1        1         4       60.000    4.204   1.000

	The sscourn number indicates the stability of the model; in the case above it
	is 4.204. It depends on the dtlongn value (60.000). This value is OK for
	DYNCORE_FLAG = 2, and the model should work fine.
	
	When the time integration begins, the model will print information about each
	timestep using colored output:
	
	Timestep #     1; Sim Time     60.0 [s]; Wall Time     14.875 [s]; DT     60.000 [s]; sscourn      4.204
	Timestep #     2; Sim Time    120.0 [s]; Wall Time      0.983 [s]; DT     60.000 [s]; sscourn      4.204
	Timestep #     3; Sim Time    180.0 [s]; Wall Time      0.947 [s]; DT     60.000 [s]; sscourn      4.204
	Timestep #     4; Sim Time    240.0 [s]; Wall Time      0.998 [s]; DT     60.000 [s]; sscourn      4.204
	Timestep #     5; Sim Time    300.0 [s]; Wall Time      1.061 [s]; DT     60.000 [s]; sscourn      4.204
	Timestep #     6; Sim Time    360.0 [s]; Wall Time      0.906 [s]; DT     60.000 [s]; sscourn      4.204
	...
	
	If something happens during the integration it will be shown on screen:
	notices in green, warnings in yellow and errors in red.
	
11. Contact and further info: