Index: /trunk/MESOSCALE/LMD_MM_MARS/SIMU/callphys.def
===================================================================
--- /trunk/MESOSCALE/LMD_MM_MARS/SIMU/callphys.def	(revision 223)
+++ /trunk/MESOSCALE/LMD_MM_MARS/SIMU/callphys.def	(revision 223)
@@ -0,0 +1,92 @@
+General options
+~~~~~~~~~~~~~~~
+tracer    (Run with or without tracer transport ?)
+F
+diurnal   (Diurnal cycle ?  if diurnal=F, diurnal averaged solar heating)
+T
+season    (Seasonal cycle ? if season=F, Ls stays constant like in "start")
+T
+lwrite    (want some more output on the screen ?) 
+F
+stats     (Saving statistics in file "cumul" ?)
+F
+calleofdump (Saving EOF profiles in file "profiles" for Climate Database ?)
+F
+Dust scenario. Used if the dust is prescribed (i.e. if tracer=F or active=F)
+~~~~~~~~~~~~~
+iaervar  (=1 Dust opt. depth read in startfi; =2 Viking scenario; =3 MGS scenario 
+4        (=4 Mars Year 24 from TES assimilation)
+iddist  (Dust vertical distribution: =0: old distrib. (Pollack90) 
+3       (=1: top set by "topdustref"; =2: Viking scenario; =3 MGS scenario )
+topdustref (Dust top altitude (km). Matters only if iddist=1)
+55.
+Physical Parameterizations :
+~~~~~~~~~~~~~~~~~~~~~~~~~~
+callrad   (call radiative transfer ?)
+T
+callnlte (call NLTE radiative schemes ?   matters only if callrad=T)
+F
+callnirco2 (call CO2 NIR absorption ?   matters only if callrad=T)
+T
+calldifv  (call turbulent vertical diffusion ?)
+T
+calladj   (call convective adjustment ?)
+F 
+callcond  (call CO2 condensation ?)
+T
+callsoil  (call thermal conduction in the soil ?)
+T
+calllott  (call Lott's gravity wave/subgrid topography scheme ?)
+F
+Radiative transfer options :
+~~~~~~~~~~~~~~~~~~~~~~~~~~
+iradia    (the rad.transfer is computed every "iradia" physical timestep)
+10
+callg2d   (Output of the exchange coefficient matrix ? for diagnostic only)
+F
+rayleigh  (Rayleigh scattering : should be =F for now)
+F
+Tracer (dust water, ice and/or chemical species) options (use if tracer=T) :
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+dustbin    (DUST: Transported dust ? (if >0, uses q(1) to q(dustbin))
+0
+active     (DUST: Radiatively active dust ? (uses q(1) to q(dustbin))
+F
+doubleq    (DUST: needs dustbin=1, use mass q(1) and nb q(2) mr to predict dust size ?)
+F
+lifting    (DUST: lifted by GCM surface winds ?)
+F
+dustdevil  (DUST: lifted by dust devils ?)
+F
+scavenging (DUST: Scavenging by CO2 snowfall ?)
+F
+sedimentation (DUST/WATERICE: Gravitational sedimentation ?)
+F
+iceparty   (WATERICE: Water cycle includes water ice mixing ratio q(nqmx-1))
+F
+activice   (WATERICE: Radiatively active transported atmospheric water ice ?)
+F
+water      (WATER: Compute water cycle using q(nqmx) )
+F
+caps       (WATER: put the current permanent caps at both poles)
+F
+photochem  (PHOTOCHEMISTRY: chemical species included)
+F
+Thermospheric options (relevant if tracer=T) :
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+callthermos  (call thermosphere ?)
+F
+thermoswater  (WATER: included without cycle, only if water=F)
+F
+callconduct  (call thermal conduction ?     matters only if callthermos=T)
+F
+calleuv  (call EUV heating ?                matters only if callthermos=T)
+F
+callmolvis  (call molecular viscosity ?     matters only if callthermos=T)
+F
+callmoldiff  (call molecular diffusion ?    matters only if callthermos=T)
+F
+thermochem  (call thermospheric photochemistry ?  matters only if callthermos=T)
+F
+solarcondate (date for solar flux calculation: 1985 < date < 2002)
+1993.4       (Solar min=1996.4 ave=1993.4 max=1990.6)
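For readers unfamiliar with the `callphys.def` layout above: each keyword line (with its parenthesized comment) is immediately followed by a line holding the value. A minimal shell sketch of how such a flag can be read; the `get_flag` helper and the two-entry sample file are hypothetical, not part of the repository:

```shell
# Sketch only: callphys.def pairs each keyword line with its value on the
# next line. Build a two-entry sample in the same layout, then read a flag
# with awk. "get_flag" is a hypothetical helper, not a repository script.
cat > callphys.def <<'EOF'
tracer    (Run with or without tracer transport ?)
F
diurnal   (Diurnal cycle ?)
T
EOF
get_flag() { awk -v key="$1" '$1 == key { getline; print; exit }' callphys.def; }
get_flag tracer    # prints F
get_flag diurnal   # prints T
```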
Index: /trunk/MESOSCALE/LMD_MM_MARS/SIMU/namelist.input_full
===================================================================
--- /trunk/MESOSCALE/LMD_MM_MARS/SIMU/namelist.input_full	(revision 222)
+++ /trunk/MESOSCALE/LMD_MM_MARS/SIMU/namelist.input_full	(revision 223)
@@ -17,11 +17,8 @@
  io_form_boundary = 2        !! (*) Choice of NETCDF for ouputs
  debug_level      = 0        !! (*) Verbose level
- !!
- !! OPTIONAL
- !!
+ !!!!! OPTIONAL !!!!!!!!!!!!!!!
  interval_seconds = 3700     !! (p2) Frequency of large-scale fields update (s)
  input_from_file = T,        !! (n)(p2) Initialize a given domain with an input file
  /
-
 
  &domains
@@ -33,7 +30,5 @@
  e_vert = 61,                !! (r)(p2) Number of vertical levels
  p_top_requested = 5         !! (p3) Chosen value of pressure at the top of the model
- !!
- !! OPTIONAL
- !!
+ !!!!! OPTIONAL !!!!!!!!!!!!!!!
  time_step_fract_num = 0     !! Additional fraction to time_step: numerator 
  time_step_fract_den = 1     !! Additional fraction to time_step: denominator
@@ -53,7 +48,5 @@
 
  &physics
- !!
- !! OPTIONAL
- !!
+ !!!!! OPTIONAL !!!!!!!!!!!!!!!
  radt = 1,                   !! Ratio between physical and dynamical time step
  mars =  0,                  !! (r)(p2) Configuration of tracers: 
@@ -71,11 +64,9 @@
 
  &dynamics
- !!
- !! OPTIONAL
- !!
+ !!!!! OPTIONAL !!!!!!!!!!!!!!!
  time_step_sound = 6,        !! Ratio of time step dynamic/acoustic integration
                              !!   NB: an increase could help solve instabilities
  non_hydrostatic = T,        !! Integrate in non-hydrostatic/hydrostatic mode
- pd_scalar = F,              !! Positive-definite advection scheme for tracers 
+ pd_scalar = T,              !! Positive-definite advection scheme for tracers 
  !!
  diff_opt = 1                !! (*d) Diffusion option [set to 0 if LES or GCM]
@@ -93,7 +84,5 @@
 
  &bdy_control
- !!
- !! OPTIONAL
- !!
+ !!!!! OPTIONAL !!!!!!!!!!!!!!!
  specified = T,              !! (n)(p3) Boundary conditions specified by GCM 
  nested = F,                 !! (n)(p3) Boundary conditions from parent domain
@@ -109,4 +98,5 @@
  /
 
+ !!!!! DO NOT MODIFY !!!!!!!!!!
  &grib2
  /
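For clarity, once the `&dynamics` hunk above is applied (including the `pd_scalar` default switched to `T`), that block of `namelist.input_full` reads as follows. This is a sketch assembled only from the context and added lines shown in the hunk; any settings outside the hunk are assumed unchanged:

```fortran
 &dynamics
 !!!!! OPTIONAL !!!!!!!!!!!!!!!
 time_step_sound = 6,        !! Ratio of time step dynamic/acoustic integration
                             !!   NB: an increase could help solve instabilities
 non_hydrostatic = T,        !! Integrate in non-hydrostatic/hydrostatic mode
 pd_scalar = T,              !! Positive-definite advection scheme for tracers
 !!
 diff_opt = 1                !! (*d) Diffusion option [set to 0 if LES or GCM]
```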
Index: /trunk/MESOSCALE/LMD_MM_MARS/SIMU/namelist.input_minim
===================================================================
--- /trunk/MESOSCALE/LMD_MM_MARS/SIMU/namelist.input_minim	(revision 222)
+++ /trunk/MESOSCALE/LMD_MM_MARS/SIMU/namelist.input_minim	(revision 223)
@@ -1,31 +1,30 @@
  &time_control
- start_year       = 2024,    !! (p1) Start Martian Year (20XX for MY XX)
- start_month      = 07,      !! (p1) Start Martian Month 
- start_day        = 01,      !! (p1) Start Martian Day 
- start_hour       = 06,      !! (p1) Start Martian Hour (at longitude 0)
- end_year         = 2024,    !! (p1) End Martian Year (20XX for MY XX)
- end_month        = 07,      !! (p1) End Martian Month
- end_day          = 02,      !! (p1) End Martian Day
- end_hour         = 06,      !! (p1) End Martian Hour (at longitude 0)
- history_interval    = 37,   !! Frequency of outputs (37 --> 3700s = 1 Martian hour)  
- frames_per_outfile  = 24,   !! Size of time dimension in files 
- restart          = .false.  !! (*) Output restart files ?
- restart_interval = 8880     !! (*) Frequency of output restart files ?
- io_form_history  = 2        !! (*) Choice of NETCDF for ouputs
- io_form_restart  = 2        !! (*) Choice of NETCDF for ouputs     
- io_form_input    = 2        !! (*) Choice of NETCDF for ouputs
- io_form_boundary = 2        !! (*) Choice of NETCDF for ouputs
- debug_level      = 0        !! (*) Verbose level
+ start_year       = 2024,    
+ start_month      = 07,      
+ start_day        = 01,    
+ start_hour       = 06,      
+ end_year         = 2024,    
+ end_month        = 07,     
+ end_day          = 02,     
+ end_hour         = 06,     
+ history_interval    = 37,   
+ frames_per_outfile  = 24,  
+ restart          = .false. 
+ restart_interval = 8880     
+ io_form_history  = 2      
+ io_form_restart  = 2          
+ io_form_input    = 2      
+ io_form_boundary = 2       
+ debug_level      = 0       
  /
 
-
  &domains
- time_step   = 50            !! Dynamical timestep
- dx = 20000,                 !! (p2) Horizontal resolution
- dy = 20000,                 !! (p2) Horizontal resolution (should be equal to dx)
- e_we   = 61,                !! (r)(p2) Number of longitude grid points
- e_sn   = 61,                !! (r)(p2) Number of latitude grid points
- e_vert = 61,                !! (r)(p2) Number of vertical levels
- p_top_requested = 5         !! (p3) Chosen value of pressure at the top of the model
+ time_step   = 50          
+ dx = 20000,               
+ dy = 20000,               
+ e_we   = 61,              
+ e_sn   = 61,             
+ e_vert = 61,             
+ p_top_requested = 5      
  /
 
@@ -45,6 +44,6 @@
  /
 
- &namelist_quilt              !! (*)
- nio_tasks_per_group = 0,     !! (*)
- nio_groups = 1,              !! (*)
- /                            !! (*)
+ &namelist_quilt              
+ nio_tasks_per_group = 0,     
+ nio_groups = 1,              
+ /                            
Index: /trunk/MESOSCALE/LMD_MM_MARS/SRC/WPS/wps_mars/namelist.wps_TEST
===================================================================
--- /trunk/MESOSCALE/LMD_MM_MARS/SRC/WPS/wps_mars/namelist.wps_TEST	(revision 222)
+++ /trunk/MESOSCALE/LMD_MM_MARS/SRC/WPS/wps_mars/namelist.wps_TEST	(revision 223)
@@ -1,38 +1,35 @@
 &share				      	
- wrf_core = 'ARW',                    !!   [do not modify: choice of dynamical core]
- max_dom = 1,                         !! number of simulation domains 
- start_date = '0000-00-00_00:00:00'   !! YYYY-MM-DD_HH:mm:ss start date
- end_date   = '1111-11-11_11:11:11'   !! YYYY-MM-DD_HH:mm:ss end date
- interval_seconds = 3700              !! frequency of GCM updates [1 Mars hour = 3700 s]
- io_form_geogrid = 2,                 !!   [do not modify: choice of NETCDF outputs]	
- debug_level = 0,                     !! verbose level of the programs 
- opt_output_from_geogrid_path='./'    !! location of the geogrid outputs
+ wrf_core = 'ARW',                   !!   [do not modify: choice of dynamical core]
+ max_dom = 1,                        !! number of simulation domains 
+ start_date = '0000-00-00_00:00:00'  !! YYYY-MM-DD_HH:mm:ss start date
+ end_date   = '1111-11-11_11:11:11'  !! YYYY-MM-DD_HH:mm:ss end date
+ interval_seconds = 3700             !! frequency of GCM updates [1 Mars hour = 3700 s]
+ io_form_geogrid = 2,                !!   [do not modify: choice of NETCDF outputs]	
+ debug_level = 0,                    !! verbose level of the programs 
+ opt_output_from_geogrid_path='./'   !! location of the geogrid outputs
 /
 
-
 &geogrid
- parent_id         =   1,         !! number identifying the related parent domain
- parent_grid_ratio =   1,         !! ratio between parent and nested domains
- i_parent_start    =   1,         !! x-position of the southwest corner of nest
- j_parent_start    =   1,         !! y-position of the southwest corner of nest
- e_we              =  61,         !! number of longitude grid points
- e_sn              =  61,         !! number of latitude grid points
- geog_data_res     = 'gcm'        !! choice of static data sources
-                                  !! NB: possible: '64ppd', '32ppd', ...
-                                  !! NB: please glance at geogrid/GEOGRID.TBL
- dx = 20000,                      !! resolution (meters) in the x-dimension	
- dy = 20000,                      !! resolution (meters) in the y-dimension	
- map_proj = 'mercator',           !! map projection: 'mercator', 'lambert' or 'polar'
- ref_lat   =  -12.,               !! north latitude of the center of the domain 
- ref_lon   =  239.,               !! east longitude of the center of the domain
- truelat1  =  0.0,                !! (lambert or polar) lat position of projection cone
- truelat2  =  0.0,                !!   [do not modify]
- stand_lon =  0.0,                !! (lambert or polar) lon position of projection cone
- geog_data_path = './WPS_GEOG',   !!   [do not modify: symbolic link in the WPS folder]
+ parent_id         =   1,        !! number identifying the related parent domain
+ parent_grid_ratio =   1,        !! ratio between parent and nested domains
+ i_parent_start    =   1,        !! x-position of the southwest corner of nest
+ j_parent_start    =   1,        !! y-position of the southwest corner of nest
+ e_we              =  61,        !! number of longitude grid points
+ e_sn              =  61,        !! number of latitude grid points
+ geog_data_res     = 'gcm'       !! static data sources: '64ppd','32ppd',... cf.GEOGRID.TBL 
+ dx = 20000,                     !! resolution (meters) in the x-dimension	
+ dy = 20000,                     !! resolution (meters) in the y-dimension	
+ map_proj = 'mercator',          !! map projection: 'mercator', 'lambert' or 'polar'
+ ref_lat   =  -12.,              !! north latitude of the center of the domain 
+ ref_lon   =  239.,              !! east longitude of the center of the domain
+ truelat1  =  0.0,               !! (lambert or polar) lat position of projection cone
+ truelat2  =  0.0,               !!   [do not modify]
+ stand_lon =  0.0,               !! (lambert or polar) lon position of projection cone
+ geog_data_path = './WPS_GEOG',  !!   [do not modify: symbolic link in the WPS folder]
 /
 
 &metgrid
- fg_name = './WPSFEED/LMD'        !!   [do not modify: symbolic link in the WPS folder]
- io_form_metgrid = 2,             !!   [do not modify: choice of NETCDF outputs]
+ fg_name = './WPSFEED/LMD'       !!   [do not modify: symbolic link in the WPS folder]
+ io_form_metgrid = 2,            !!   [do not modify: choice of NETCDF outputs]
  opt_output_from_metgrid_path='./WRFFEED/current'  !!   [do not modify: symbolic link]
 /
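The placeholder `start_date`/`end_date` values in `&share` (`0000-00-00_00:00:00` and `1111-11-11_11:11:11`) are meant to be replaced with real `YYYY-MM-DD_HH:mm:ss` dates before geogrid runs. A hedged shell sketch of such a substitution; the sample file and the example dates (taken from the `namelist.input_minim` start/end settings above) are illustrative, not a repository script:

```shell
# Sketch: substitute the placeholder dates in a namelist.wps-style file.
# The sample file below and the example dates are illustrative only.
cat > namelist.wps <<'EOF'
 start_date = '0000-00-00_00:00:00'  !! YYYY-MM-DD_HH:mm:ss start date
 end_date   = '1111-11-11_11:11:11'  !! YYYY-MM-DD_HH:mm:ss end date
EOF
# GNU sed in-place edit; two s/// commands separated by ';'
sed -i "s/0000-00-00_00:00:00/2024-07-01_06:00:00/; s/1111-11-11_11:11:11/2024-07-02_06:00:00/" namelist.wps
cat namelist.wps
```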
Index: /trunk/MESOSCALE_DEV/MANUAL/SRC/compile_exec.tex
===================================================================
--- /trunk/MESOSCALE_DEV/MANUAL/SRC/compile_exec.tex	(revision 222)
+++ /trunk/MESOSCALE_DEV/MANUAL/SRC/compile_exec.tex	(revision 223)
@@ -11,5 +11,5 @@
 
 \sk
-Any simulation that will be carried out with the LMD Martian Mesoscale Model comprises of the five following steps. More details will be given on the various steps when needed, but it is important at this stage to have this structure in mind. 
+Any simulation that will be carried out with the LMD Martian Mesoscale Model comprises the five following steps. More details are given on the various steps in the following chapters, but it is important at this stage to have this structure in mind. 
 
 \sk 
@@ -29,5 +29,5 @@
 
 \sk
-Please take the time to check the contents of the \ttt{LMD\_MM\_MARS} directories\footnote{If you used method~$2$, you will probably notice that other directories than~\ttt{LMD\_MM\_MARS} are present in \ttt{\$LMDMOD}, but those are not important at this stage.} and sub-directories through the following command lines:
+Please take the time to check the contents of the \ttt{LMD\_MM\_MARS} directories\footnote{If you used method~$2$, you will probably notice that other directories than~\ttt{LMD\_MM\_MARS} are present in \ttt{\$MOD}, but those are not important at this stage.} and sub-directories through the following command lines:
 \begin{verbatim}
 ls $MMM ; ls $MMM/*
@@ -87,6 +87,6 @@
 \item ask the user about compilation settings;
 \item retrieve some additional information about the system;
-\item create a directory \ttt{\$LMDMOD/LMD\_MM\_MARS/DIRCOMP} which name depends\footnote{For example, a \ttt{DIRCOMP} directory named \ttt{g95\_32\_single} is created if the user requested a \ttt{g95} compilation of the code for single-domain simulations on a 32bits machine.} on the kind of compiler you are using, on whether your system is 32 or 64 bits, on whether sequential or parallel computations are planned and on the kind of simulations (idealized or real-case); 
-\item generate with \ttt{copy\_model} a directory \ttt{DIRCOMP/WRFV2} containing links to \ttt{SRC/WRFV2} sources\footnote{A note to developers: this method ensures that any change to the model sources would be propagated to all the different \ttt{DIRCOMP} installation folders.};
+\item create a directory \ttt{\$MOD/LMD\_MM\_MARS/your\_compdir} whose name depends\footnote{For example, a \ttt{your\_compdir} directory named \ttt{g95\_32\_single} is created if the user requested a \ttt{g95} compilation of the code for single-domain simulations on a 32-bit machine.} on the kind of compiler you are using, on whether your system is 32- or 64-bit, on whether sequential or parallel computations are planned and on the kind of simulations (idealized or real-case); 
+\item generate with \ttt{copy\_model} a directory \ttt{your\_compdir/WRFV2} containing links to \ttt{SRC/WRFV2} sources\footnote{A note to developers: this method ensures that any change to the model sources would be propagated to all the different \ttt{your\_compdir} installation folders.};
 \item execute the WRF \ttt{configure} script with the correct option;
 \item tweak the resulting \ttt{configure.wrf} file to include a link towards the Martian physics and various patches and specific compilation options;
@@ -138,5 +138,5 @@
 
 \mk
-Note that the \ttt{makemeso -h} command lists the various options that can be used in the \ttt{makemeso} script. Most options should be used only by advanced users and some of them will be described in the following chapters. At this stage, the only option of \ttt{makemeso} which can be useful to you is \ttt{-f} which forces the model to be recompiled from scratch. If you already compiled the model succesfully, but the model fails to compile a few days later for reasons unrelated to your operations on your system or on the model file, we recommend you to use the \ttt{-f} option in \ttt{makemeso} to try top recompile the model\footnote{A more extreme solution if \ttt{makemeso -f} does not solve your problem is to remove the corresponding \ttt{DIRCOMP} directory.}.
+Note that the \ttt{makemeso -h} command lists the various options that can be used in the \ttt{makemeso} script. Most options should be used only by advanced users and some of them will be described in the following chapters. At this stage, the only option of \ttt{makemeso} which can be useful to you is \ttt{-f}, which forces the model to be recompiled from scratch. If you already compiled the model successfully, but the model fails to compile a few days later for reasons unrelated to your operations on your system or on the model files, we recommend using the \ttt{-f} option in \ttt{makemeso} to try to recompile the model\footnote{A more extreme solution if \ttt{makemeso -f} does not solve your problem is to remove the corresponding \ttt{your\_compdir} directory.}.
 
 \scriptsize
@@ -149,5 +149,5 @@
 
 \sk
-We assume here that you had successfully compiled the model with \ttt{makemeso} at the end of the previous section and you had based your answers to the \ttt{makemeso} script on the indications in brackets. You should then find in the \ttt{DIRCOMP} directory one \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe} executable and one \ttt{wrf\_x61\_y61\_z61\_d1\_t1\_p1.exe} executable.
+We assume here that you have successfully compiled the model with \ttt{makemeso} at the end of the previous section and based your answers to the \ttt{makemeso} script on the indications in brackets. You should then find in the \ttt{your\_compdir} directory one \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe} executable and one \ttt{wrf\_x61\_y61\_z61\_d1\_t1\_p1.exe} executable.
 
 \sk
@@ -159,5 +159,5 @@
 %
 \begin{verbatim}
-cp LMD_MM_MARS_TESTCASE.tar.gz $LMDMOD/LMD_MM_MARS/
+cp LMD_MM_MARS_TESTCASE.tar.gz $MOD/LMD_MM_MARS/
 tar xzvf LMD_MM_MARS_TESTCASE.tar.gz
 cd TESTCASE
Index: /trunk/MESOSCALE_DEV/MANUAL/SRC/foreword.tex
===================================================================
--- /trunk/MESOSCALE_DEV/MANUAL/SRC/foreword.tex	(revision 222)
+++ /trunk/MESOSCALE_DEV/MANUAL/SRC/foreword.tex	(revision 223)
@@ -1,15 +1,12 @@
-\chapter{Foreword}
+\chapter*{Foreword}
 
 \vk
-Welcome! This manual describes how to use the Laboratoire de M\'et\'eorologie Dynamique (LMD) Martian Mesoscale Model. Many thanks for looking forward to using this model. Developping the LMD Martian Mesoscale Model required countless hours of hard work! A significant part of the model development and validation have been funded by ESA and CNES which are acknowledged here.
+\paragraph{Welcome!} This manual describes how to use the Laboratoire de M\'et\'eorologie Dynamique (LMD) Martian Mesoscale Model. Many thanks for looking forward to using this model, whose development required countless hours of hard work! A significant part of the model development and validation has been funded by ESA and CNES, which are acknowledged here.
 
-\mk
-The main contact to reach at LMD to become an user of the model is Aymeric SPIGA (main developper, \href{mailto:aymeric.spiga@upmc.fr}{\nolinkurl{aymeric.spiga@upmc.fr}}). Alternative contacts at LMD for mesoscale modeling inquiries are Ehouarn MILLOUR~\url{ehouarn.millour@lmd.jussieu.fr} or Fran\c cois FORGET~\url{francois.forget@lmd.jussieu.fr}. The model is distributed freely to academic partners in the frame of scientific collaborations, but not to industrial and commercial partners. At any event, we are open both to new scientific collaboration projects and contractual proposals.
+\paragraph{Contact} The main contact to reach at LMD to become a user of the model is Aymeric SPIGA (main developer, \href{mailto:aymeric.spiga@upmc.fr}{\nolinkurl{aymeric.spiga@upmc.fr}}). Alternative contacts at LMD for mesoscale modeling inquiries are Ehouarn MILLOUR~\url{ehouarn.millour@lmd.jussieu.fr} or Fran\c cois FORGET~\url{francois.forget@lmd.jussieu.fr}. We are open to questions and suggestions on new scientific collaborations, teaching/outreach actions or contractual proposals.
 
-\mk
-[To our academic partners] Please cite the reference paper \textit{Spiga and Forget} [2009]\nocite{Spig:09} if you'd like to refer to the LMD Martian Mesoscale Model in one of your publication. If your paper makes use of specific simulations carried out with the LMD Martian Mesoscale Model, please consider including A. Spiga as a co-author of your work and asking for help with writing the part related to mesoscale modeling. If you have any idea of specific simulations and wonder if it is ever possible to perform those with the LMD Martian Mesoscale Model, please do not hesitate to ask! If your study requires a significant work on a peculiar Martian physical parameterization, please do not hesitate to tell us about it and we would determine additional participants in the LMD team.
+\paragraph{Copyright (LMD)} The LMD Martian Mesoscale Model sources are made available on the condition that we make no representations or warranties regarding the reliability or validity of the model predictions nor the use to which such model predictions should be put, disclaim any and all responsibility for any errors or inaccuracies in the model predictions and bear no responsibility for any use made of these model predictions by any party. Scientific use of LMD Martian Mesoscale Model simulations is freely allowed provided that the reference paper \textit{Spiga and Forget} [2009]\nocite{Spig:09} is correctly quoted in all publications and that we are kept informed of usage and developments. If your paper makes use of specific simulations carried out with the LMD Martian Mesoscale Model, please consider including Aymeric SPIGA as a co-author of your work and asking, if needed, for help with writing the part related to mesoscale modeling. If your study requires additional work on a specific Martian physical parameterization, please consider including other members of the LMD team in addition to Aymeric SPIGA. The LMD Martian Mesoscale Model may not be put to any commercial use without specific authorization. 
 
-\mk
-Part of the LMD Martian Mesoscale Model is based on the terrestrial model WRF which is in the public domain. If you are an user of the LMD Martian Mesoscale Model, you are therefore an user of the WRF model. Please take a minute to fill in the WRF registration form so that the WRF development team knows about the people using their model: \url{http://www.mmm.ucar.edu/wrf/users/download/wrf-regist.php}. \noindent \scriptsize \emph{WRF was developed at the National Center for Atmospheric Research (NCAR) which is operated by the University Corporation for Atmospheric Research (UCAR). NCAR and UCAR make no proprietary claims, either statutory or otherwise, to this version and release of WRF and consider WRF to be in the public domain for use by any person or entity for any purpose without any fee or charge. UCAR requests that any WRF user include this notice on any partial or full copies of WRF. WRF is provided on an "AS IS" basis and any warranties, either express or implied, including but not limited to implied warranties of non-infringement, originality, merchantability and fitness for a particular purpose, are disclaimed. In no event shall UCAR be liable for any damages, whatsoever, whether direct, indirect, consequential or special, that arise out of or in connection with the access, use or performance of WRF, including infringement actions. WRF is a registered trademark of the University Corporation for Atmospheric Research (UCAR).} \normalsize
+\paragraph{Copyright (WRF)} Part of the LMD Martian Mesoscale Model is based on the terrestrial model WRF which is in the public domain. If you are a user of the LMD Martian Mesoscale Model, you are therefore a user of the WRF model. Please take a minute to fill in the WRF registration form so that the WRF development team knows about the people using their model: \url{http://www.mmm.ucar.edu/wrf/users/download/wrf-regist.php}. \noindent \scriptsize \emph{WRF was developed at the National Center for Atmospheric Research (NCAR) which is operated by the University Corporation for Atmospheric Research (UCAR). NCAR and UCAR make no proprietary claims, either statutory or otherwise, to this version and release of WRF and consider WRF to be in the public domain for use by any person or entity for any purpose without any fee or charge. UCAR requests that any WRF user include this notice on any partial or full copies of WRF. WRF is provided on an "AS IS" basis and any warranties, either express or implied, including but not limited to implied warranties of non-infringement, originality, merchantability and fitness for a particular purpose, are disclaimed. In no event shall UCAR be liable for any damages, whatsoever, whether direct, indirect, consequential or special, that arise out of or in connection with the access, use or performance of WRF, including infringement actions. WRF is a registered trademark of the University Corporation for Atmospheric Research (UCAR).} \normalsize
 
 \clearemptydoublepage
Index: /trunk/MESOSCALE_DEV/MANUAL/SRC/installation.tex
===================================================================
--- /trunk/MESOSCALE_DEV/MANUAL/SRC/installation.tex	(revision 222)
+++ /trunk/MESOSCALE_DEV/MANUAL/SRC/installation.tex	(revision 223)
@@ -16,5 +16,5 @@
 \item your computer is connected to the internet;
 \item you have~\ttt{200 Mo} free disk space available;
-\item your OS is Linux\footnote{The model was also successfully compiled on MacOSX; ``howto" information is available upon request but could have become obsolete on recent versions of Apple hardware and software. It is probably possible to compile the model on Windows using Cygwin but this has not implemented nor tested. You are kindly advised to install a Linux distribution on your computer (e.g. Ubuntu, Debian, Fedora, ...).} with a decent set of basic commmands (\ttt{sed}, \ttt{awk}, \ldots);
+\item your OS is Linux\footnote{The model was also successfully compiled on MacOSX; ``howto" information is available upon request but could have become obsolete on recent versions of Apple hardware and software. It is probably possible to compile the model on Windows using Cygwin but this has not been implemented nor tested. You are kindly advised to install a Linux distribution on your computer (e.g. Ubuntu, Debian, Fedora, ...).} with a decent set of basic commands (\ttt{sed}, \ttt{awk}, \ldots);
 \item at least one of the following Fortran compilers is installed on your computer
 \begin{itemize}
@@ -45,24 +45,5 @@
 
 \sk
-Parallel computations with the Message Passing Interface (MPI) standard are supported by the LMD Martian Mesoscale Model. If you want to use this capability, you would have to add the installation of MPICH2 as a additional prerequisite. Once the installation is completed, it is required to define the environment variable \ttt{\$WHERE\_MPI} to point in your \ttt{mpich} \ttt{bin} directory, even if you added the \ttt{\$LMDMOD/MPI/mpich2-1.0.8/bin} directory to your \ttt{\$PATH} variable. 
-
-\begin{finger}
-\item \scriptsize Here is a brief ``how-to" to install MPICH2, although this surely does not replace reading carefully installation notes and choosing what installation suits best your system. Please download the current stable version of the sources (e.g. we choose here an old version \ttt{mpich2-1.0.8.tar.gz} to illustrate the commands) on the MPICH2 website \url{http://www.mcs.anl.gov/research/projects/mpich2} and install the MPICH2 utilities by the following commands:
-\begin{verbatim}
-mkdir $LMDMOD/MPI
-mv mpich2-1.0.8.tar.gz $LMDMOD/MPI
-cd $LMDMOD/MPI
-tar xzvf mpich2-1.0.8.tar.gz
-cd mpich2-1.0.8
-./configure --prefix=$PWD --with-device=ch3:nemesis > conf.log 2> conferr.log &
-# please wait...
-make > mk.log 2> mkerr.log &
-declare -x WHERE_MPI=$LMDMOD/MPI/mpich2-1.0.8/bin
-\end{verbatim}
-\normalsize
-\end{finger}
-
-\sk
-\subsection{Compiling the terrestrial WRF model}
+\subsection{Compiling the terrestrial WRF model}\label{terrestrial}
 
 \sk The LMD Martian Mesoscale Model is based on the terrestrial NCEP/NCAR ARW-WRF Mesoscale Model. As a first step towards the compilation of the Martian version, we advise you to check that the terrestrial model compiles on your computer with either \ttt{g95} or \ttt{pgf90} or \ttt{ifort}. On the ARW-WRF website \url{http://www.mmm.ucar.edu/wrf/users/download/get\_source.html}, you will be allowed to freely download the model after a quick registration process (click on ``New users"). Make sure to download the version 2.2 of the WRF model and copy the \ttt{WRFV2.2.TAR.gz} archive to your current working directory. Then please extract the model sources and configure the compilation process:
@@ -82,5 +63,5 @@
 
 \sk
-If the compilation is successful, the file \ttt{log\_error} should be empty or only reporting few warnings). In the \ttt{main} folder two executables \ttt{ideal.exe} and \ttt{run.exe} should be found and allow you to run the test simulation:
+If the compilation is successful, the file \ttt{log\_error} should be empty (or only report a few warnings). In the \ttt{main} folder two executables \ttt{ideal.exe} and \ttt{run.exe} should be found and allow you to run\footnote{If you compiled the model with \ttt{g95}, \ttt{ideal.exe} will probably complain about an error reading the namelist. Please move the line \ttt{non\_hydrostatic} below the line \ttt{v\_sca\_adv\_order} in the \ttt{namelist.input} file to solve the problem.} the test simulation:
 \begin{verbatim}
 cd test/em_hill2d_x
@@ -90,32 +71,22 @@
 
 \sk
-During the simulation, the time taken by the computer to perform integrations at each dynamical timestep  is displayed in the standard output. The simulation should end with a message \ttt{SUCCESS COMPLETE WRF}. The model results are stored in a \ttt{wrfout} netCDF data file you might like to browse with a \ttt{NETCDF}-compliant software such as \ttt{ncview}, or read with your favorite graphical software. 
-%
-\begin{finger} \item If you compiled the model with \ttt{g95}, \ttt{ideal.exe} will probably complain about an error reading the namelist. Please move the line \ttt{non\_hydrostatic} below the line \ttt{v\_sca\_adv\_order} in the \ttt{namelist.input} file to solve the problem. \end{finger}
+During the simulation, the time taken by the computer to perform the integrations at each dynamical timestep is displayed in the standard output. The simulation should end with the message \ttt{SUCCESS COMPLETE WRF}. The model results are stored in a \ttt{wrfout} netCDF data file, which you can browse with a netCDF-compliant tool such as \ttt{ncview}, or read with your favorite graphical software. Once you have checked that the terrestrial WRF model compiles and runs well on your system, you can delete all files related to the operations done in this section~\ref{terrestrial}.
 
 \mk
 \section{Main installation of the model sources}
 
-\sk
-\subsection{Method 1: You were given a \ttt{LMD\_MM\_MARS.tar.gz} archive}
-
-\sk
-Please set the environment variable \ttt{\$LMDMOD} to point at the directory where you will install the model and define the environment variable \ttt{\$MMM} as \ttt{\$LMDMOD/LMD\_MM\_MARS}. Copy the \ttt{LMD\_MM\_MARS.tar.gz} file in the \ttt{\$LMDMOD} directory and extract the files. Then execute the \ttt{prepare} script that would do some necessary installation tasks\footnote{Deflate the various compressed archives contained into \ttt{LMD\_MM\_MARS}, download the ARW-WRF sources from the web, apply a (significant) ``Martian patch" to these sources and build the structure of your \ttt{LMD\_MM\_MARS} directory} for you:
+\paragraph{Method 1: You were given a \ttt{LMD\_MM\_MARS.tar.gz} archive} Please set the environment variable \ttt{\$MOD} to point to the directory where you will install the model and define the environment variable \ttt{\$MMM} as \ttt{\$MOD/LMD\_MM\_MARS}. Copy the \ttt{LMD\_MM\_MARS.tar.gz} file into the \ttt{\$MOD} directory and extract the files. Then execute the \ttt{prepare} script, which performs all installation tasks\footnote{Deflate the various compressed archives contained in \ttt{LMD\_MM\_MARS}, download the ARW-WRF sources from the web, apply a (significant) ``Martian patch" to these sources and build the structure of your \ttt{LMD\_MM\_MARS} directory.} for you:
 %
 \begin{verbatim}       
-declare -x LMDMOD=/disk/user/MODELS
-declare -x MMM=$LMDMOD/LMD_MM_MARS
-cp LMD_MM_MARS.tar.gz $LMDMOD
-cd $LMDMOD
+declare -x MOD=/disk/user/MODELS
+declare -x MMM=$MOD/LMD_MM_MARS
+cp LMD_MM_MARS.tar.gz $MOD
+cd $MOD
 tar xzvf LMD_MM_MARS.tar.gz
-cd $LMDMOD/LMD_MM_MARS
+cd $MOD/LMD_MM_MARS
 ./SRC/SCRIPTS/prepare  ## or simply ./prepare if the script is in LMD_MM_MARS
 \end{verbatim}
 
-\sk
-\subsection{Method 2: You were given a \ttt{svn} link \ttt{the\_link}}
-
-\sk
-\emph{You must have Subversion (\ttt{svn}) installed on your system to follow this method}. Please use the name of our server repository combined to an \ttt{svn checkout} command to get the model sources\footnote{At this stage, it is essential to have registered to the WRF website (see foreword) because our server contains some part of the ARW-WRF sources.}. Please also set the environment variable \ttt{\$LMDMOD} and \ttt{\$MMM}:
+\paragraph{Method 2: You were given a \ttt{svn} link \ttt{the\_link}} \emph{You must have Subversion (\ttt{svn}) installed on your system to follow this method}. Please use the name of our server repository combined with an \ttt{svn checkout} command to get the model sources\footnote{At this stage, it is essential to have registered with the WRF website (see foreword) because our server contains some parts of the ARW-WRF sources.}. Please also set the environment variables \ttt{\$MOD} and \ttt{\$MMM}. The first download of the model sources could take a while. Compared to method~$1$, this method~$2$ using \ttt{svn} allows you to easily get the latest updates and bug fixes of the LMD Martian Mesoscale Model from the development team\footnote{If you are not interested in this feature, please replace the command line featuring \ttt{svn checkout} with \ttt{svn export the\_link/LMDZ.MARS the\_link/MESOSCALE}.}.
 
 \begin{verbatim}
@@ -124,12 +95,7 @@
 svn update LMDZ.MARS MESOSCALE
 cd MESOSCALE
-declare -x LMDMOD=$PWD
-declare -x MMM=$LMDMOD/LMD_MM_MARS
-\end{verbatim}
-
-\sk
-The first download of the model sources could be a bit long. Compared to method~$1$, this method~$2$ using \ttt{svn} would allow you to easily get the latest updates and bug fixes done on the LMD Martian Mesoscale Model by the development team\footnote{If you are not interested by this feature, please replace the command line featuring \ttt{svn checkout} by this command line \ttt{svn export the\_link/LMDZ.MARS the\_link/MESOSCALE} }:
-
-\begin{verbatim}
+declare -x MOD=$PWD
+declare -x MMM=$MOD/LMD_MM_MARS
+## to get latest updates later on
 cd the_name_of_your_local_destination_folder
 svn update LMDZ.MARS MESOSCALE
@@ -137,3 +103,25 @@
 \end{verbatim}
 
+\mk
+\section{Parallel computations (optional)}
+
+\sk
+Parallel computations with the Message Passing Interface (MPI) standard are supported by the LMD Martian Mesoscale Model. If you want to use this capability, you have to add the installation of MPICH2 as an additional prerequisite. Once the installation is completed, it is required to define the environment variable \ttt{\$WHERE\_MPI} to point to your \ttt{mpich} \ttt{bin} directory, even if you added the \ttt{\$MOD/MPI/mpich2-1.0.8/bin} directory to your \ttt{\$PATH} variable.
+
+\begin{finger}
+\item \scriptsize Here is a brief ``how-to" for installing MPICH2, although this surely does not replace reading the installation notes carefully and choosing the installation that best suits your system. Please download the current stable version of the sources (e.g. we choose here an old version, \ttt{mpich2-1.0.8.tar.gz}, to illustrate the commands) on the MPICH2 website \url{http://www.mcs.anl.gov/research/projects/mpich2} and install the MPICH2 utilities with the following commands:
+\begin{verbatim}
+mkdir $MOD/MPI
+mv mpich2-1.0.8.tar.gz $MOD/MPI
+cd $MOD/MPI
+tar xzvf mpich2-1.0.8.tar.gz
+cd mpich2-1.0.8
+./configure --prefix=$PWD --with-device=ch3:nemesis > conf.log 2> conferr.log &
+wait   ## make sure configure has completed before running make
+make > mk.log 2> mkerr.log &
+wait   ## make sure the build has completed
+declare -x WHERE_MPI=$MOD/MPI/mpich2-1.0.8/bin
+\end{verbatim}
+\normalsize
+\end{finger}
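+
+\sk
+Once MPICH2 is installed and the model compiled with MPI support, a parallel simulation can typically be launched as sketched below. This is only an illustrative sketch: the number of processes is arbitrary, and we assume here that the \ttt{mpirun} launcher is available in your \ttt{\$WHERE\_MPI} directory (adapt the command to your own MPI installation).
+
+\begin{verbatim}
+$WHERE_MPI/mpirun -np 4 wrf.exe   ## example with 4 processes
+\end{verbatim}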
+
 \clearemptydoublepage
Index: /trunk/MESOSCALE_DEV/MANUAL/SRC/keep
===================================================================
--- /trunk/MESOSCALE_DEV/MANUAL/SRC/keep	(revision 222)
+++ /trunk/MESOSCALE_DEV/MANUAL/SRC/keep	(revision 223)
@@ -21,4 +21,9 @@
 
 
+\item Defining several domains yields
+distinct files 
+\ttt{geo\_em.d01.nc},
+\ttt{geo\_em.d02.nc},
+\ttt{geo\_em.d03.nc}\ldots
 
 \mk
Index: /trunk/MESOSCALE_DEV/MANUAL/SRC/namelist.wps_NEST
===================================================================
--- /trunk/MESOSCALE_DEV/MANUAL/SRC/namelist.wps_NEST	(revision 223)
+++ /trunk/MESOSCALE_DEV/MANUAL/SRC/namelist.wps_NEST	(revision 223)
@@ -0,0 +1,1 @@
+link ../../../MESOSCALE/LMD_MM_MARS/SRC/WPS/wps_mars/namelist.wps_NEST
Index: /trunk/MESOSCALE_DEV/MANUAL/SRC/namelist.wps_TEST
===================================================================
--- /trunk/MESOSCALE_DEV/MANUAL/SRC/namelist.wps_TEST	(revision 223)
+++ /trunk/MESOSCALE_DEV/MANUAL/SRC/namelist.wps_TEST	(revision 223)
@@ -0,0 +1,1 @@
+link ../../../MESOSCALE/LMD_MM_MARS/SRC/WPS/wps_mars/namelist.wps_TEST
Index: /trunk/MESOSCALE_DEV/MANUAL/SRC/parameters.tex
===================================================================
--- /trunk/MESOSCALE_DEV/MANUAL/SRC/parameters.tex	(revision 222)
+++ /trunk/MESOSCALE_DEV/MANUAL/SRC/parameters.tex	(revision 223)
@@ -51,5 +51,5 @@
 
 \sk
-\subsection{Advice on filling \ttt{namelist.input}}
+\subsection{Advice on filling \ttt{namelist.input}}\label{namelist}
 
 \paragraph{Test case} An interesting exercise is to analyze comparatively the \ttt{TESTCASE/namelist.input} file (cf. section~\ref{sc:arsia}) with the reference \ttt{namelist.input\_full} given above, so that you could understand which settings are being made in the Arsia Mons simulation. Then you could try to modify parameters in the \ttt{namelist.input} file and re-run the model to start getting familiar with the various settings. Given that the test case relies on pre-computed initial and boundary conditions, not all parameters can be changed in the \ttt{namelist.input} file.
@@ -57,5 +57,5 @@
 \paragraph{Syntax} Please pay attention to rigorous syntax while editing your personal \ttt{namelist.input} file to avoid reading errors. If the model complains about this at runtime, start again from the available template \ttt{\$MMM/SIMU/namelist.input\_full}.
 
-\paragraph{Time management} Usually a Martian user would like to start/end the mesoscale simulation at a given solar aerocentric longitude~$L_s$ or a given sol in the Martian year. In the \ttt{namelist.input} file, the settings for starting/ending time must be done in the form year/month/day with each month corresponding to a ``slice" of~$30^{\circ}$~$L_s$\footnote{Information on Martian calendars: \url{http://www-mars.lmd.jussieu.fr/mars/time/solar_longitude.html}.}. The file~\ttt{\$MMM/SIMU/calendar} (reproduced in appendix~\ref{calendar}) is here to help the user to perform the conversion prior to filling the \ttt{namelist.input} file. In the above example of \ttt{namelist.input\_minim}, the simulation with the LMD Martian Mesoscale Model takes place on month~7 and day~1, which corresponds, according to the \ttt{calendar} file, to~$L_s \sim 180^{\circ}$. In the Arsia Mons test case, the simulation with the LMD Martian Mesoscale Model takes place on month~1 and day~4, which corresponds, according to the \ttt{calendar} file, to~$L_s \sim 2^{\circ}$.
+\paragraph{Time management} Usually a Martian user would like to start/end the mesoscale simulation at a given solar aerocentric longitude~$L_s$ or a given sol in the Martian year\footnote{Information on Martian calendars: \url{http://www-mars.lmd.jussieu.fr/mars/time/solar_longitude.html}.}. In the \ttt{namelist.input} file, the settings for starting/ending time must be given in the form year/month/day, with each month corresponding to a ``slice" of~$30^{\circ}$~$L_s$. The file~\ttt{\$MMM/SIMU/calendar} (reproduced in appendix~\ref{calendar}) helps the user perform the conversion prior to filling the \ttt{namelist.input} file. In the above example of \ttt{namelist.input\_minim}, the simulation with the LMD Martian Mesoscale Model takes place on month~7 and day~1, which corresponds, according to the \ttt{calendar} file, to~$L_s \sim 180^{\circ}$. In the Arsia Mons test case, the simulation with the LMD Martian Mesoscale Model takes place on month~1 and day~4, which corresponds, according to the \ttt{calendar} file, to~$L_s \sim 2^{\circ}$.
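+
+As a rough sketch of the month convention, the month number can be recovered from~$L_s$ with integer arithmetic as below; the \ttt{calendar} file remains the authoritative reference, since the day within the month must also be determined there.
+
+\begin{verbatim}
+Ls=180                              ## solar longitude in degrees
+echo "month = $(( Ls / 30 + 1 ))"   ## prints: month = 7
+\end{verbatim}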
 
 \mk
Index: /trunk/MESOSCALE_DEV/MANUAL/SRC/preproc.tex
===================================================================
--- /trunk/MESOSCALE_DEV/MANUAL/SRC/preproc.tex	(revision 222)
+++ /trunk/MESOSCALE_DEV/MANUAL/SRC/preproc.tex	(revision 223)
@@ -14,18 +14,18 @@
 
 \sk
-First and foremost, since the preprocessing utilities could generate (or involve) files of quite significant sizes, it is necessary to define a directory where these files would be stored. Such a directory (e.g. \ttt{/bigdisk/user}) must be linked with the name \ttt{\$TMPDIR} as follows. In addition, three directories \ttt{GCMINI}, \ttt{WPSFEED}, \ttt{WRFFEED} have to be defined in \ttt{\$LMDMOD/TMPDIR} as indicated below.
-
-\begin{verbatim}
-ln -sf /bigdisk/user $LMDMOD/TMPDIR
-mkdir $LMDMOD/TMPDIR/GCMINI
-mkdir $LMDMOD/TMPDIR/WPSFEED
-mkdir $LMDMOD/TMPDIR/WRFFEED
-\end{verbatim}
-
-\sk
-A second prerequisite to the installation of the preprocessing tools is that the LMD Martian Mesoscale Model was compiled at least once. If this is not the case, please compile the model with the \ttt{makemeso} command described in section~\ref{sc:makemeso}. The compilation process created an installation directory adapted to your particular choice of compiler$+$machine (what we named \ttt{DIRCOMP} for illustration in section~\ref{sc:makemeso}, which could be for instance \ttt{g95\_32\_single}). The preprocessing tools will also be installed in this directory. Please type the following commands:
-
-\begin{verbatim}
-cd $LMDMOD/LMD_MM_MARS/g95_32_single/   ## or any of your install directory
+First and foremost, since the preprocessing utilities can generate (or involve) files of quite significant sizes, it is necessary to define a directory where these files will be stored. Such a directory (e.g. \ttt{/bigdisk/user}) must be linked as \ttt{\$MOD/TMPDIR} as follows. In addition, three directories \ttt{GCMINI}, \ttt{WPSFEED} and \ttt{WRFFEED} have to be created in \ttt{\$MOD/TMPDIR} as indicated below.
+
+\begin{verbatim}
+ln -sf /bigdisk/user $MOD/TMPDIR
+mkdir $MOD/TMPDIR/GCMINI
+mkdir $MOD/TMPDIR/WPSFEED
+mkdir $MOD/TMPDIR/WRFFEED
+\end{verbatim}
+
+\sk
+A second prerequisite to the installation of the preprocessing tools is that the LMD Martian Mesoscale Model has been compiled at least once. If this is not the case, please compile the model with the \ttt{makemeso} command described in section~\ref{sc:makemeso}. The compilation process creates an installation directory adapted to your particular choice of compiler$+$machine (what we named \ttt{your\_compdir} for illustration in section~\ref{sc:makemeso}, which could be for instance \ttt{g95\_32\_single}). The preprocessing tools will also be installed in this directory. Please type the following commands:
+
+\begin{verbatim}
+cd $MOD/LMD_MM_MARS/g95_32_single/   ## or any of your install directory
 ln -sf ../SRC/SCRIPTS/prepare_ini .
 ./prepare_ini
@@ -59,8 +59,8 @@
 
 \sk
-All the static data (topography, thermal inertia, albedo) needed to initialize the model are included in the \ttt{\$LMDMOD/LMD\_MM\_MARS/WPS\_GEOG} directory. By default, only coarse-resolution datasets\footnote{ Corresponding to the fields stored in the file \ttt{surface.nc} known by LMD-MGCM users: \url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc} } are available, but the directory also contains sources and scripts to install finer resolution datasets: 32 and/or 64 pixel-per-degree (ppd) MOLA topography (\ttt{mola\_topo32} and \ttt{mola\_topo64}), 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo (\ttt{albedo\_TES}), 20 ppd TES thermal inertia (\ttt{thermal\_TES}). The role of the \ttt{build\_static} script is to automatically download these datasets from the web (namely PDS archives) and convert them to an acceptable format for a future use by the preprocessing utilities:
-
-\begin{verbatim}
-cd $LMDMOD/LMD_MM_MARS
+All the static data (topography, thermal inertia, albedo) needed to initialize the model are included in the \ttt{\$MOD/LMD\_MM\_MARS/WPS\_GEOG} directory. By default, only coarse-resolution datasets\footnote{Corresponding to the fields stored in the file \ttt{surface.nc} known by LMD-MGCM users: \url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc}} are available, but the directory also contains sources and scripts to install finer-resolution datasets: 32 and/or 64 pixel-per-degree (ppd) MOLA topography (\ttt{mola\_topo32} and \ttt{mola\_topo64}), 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo (\ttt{albedo\_TES}), and 20 ppd TES thermal inertia (\ttt{thermal\_TES}). The role of the \ttt{build\_static} script is to automatically download these datasets from the web (namely PDS archives) and convert them to a format acceptable to the preprocessing utilities:
+
+\begin{verbatim}
+cd $MOD/LMD_MM_MARS
 ./build_static
 \end{verbatim}
@@ -78,13 +78,13 @@
 
 \sk
-The LMD Martian GCM is supposed to be run to compute meteorological fields that will be used as initial and boundary conditions each one or two Martian hours to the limited-area LMD Martian Mesoscale Model. Hence the LMD Martian GCM must be compiled in your system (see the LMD-MGCM user manual for further details \url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}). If you did not get the model using the \ttt{svn} method, please request us to send you an archive containing the LMD-MGCM named \ttt{LMDZ.MARS.meso.tar.gz}, which you have to extract in the \ttt{\$LMDMOD} directory. If you got the model using \ttt{svn}, you do not have to request this file. In the \ttt{\$LMDMOD/LMDZ.MARS} directory, a script named \ttt{compile} can be found and must be used \emph{on the system you plan to run the mesoscale model on} to compile the GCM. The \ttt{compile} script is actually just a wrapper for the \ttt{makegcm} script which compile the GCM for you; the default \ttt{makegcm} script only works with Portland Group Fortran compiler \ttt{pgf90} but scripts allowing to compile the model using other Fortran compilers (including \ttt{g95} or \ttt{ifort}) are available upon request. The following commands must be used and should yield the compilation of two executables \ttt{newstart.e} and \ttt{gcm.e}:
-
-\begin{verbatim}
-cd $LMDMOD/LMDZ.MARS
+The LMD Martian GCM is run to compute the meteorological fields that will be used, every one or two Martian hours, as initial and boundary conditions for the limited-area LMD Martian Mesoscale Model. Hence the LMD Martian GCM must be compiled on your system (see the LMD-MGCM user manual for further details: \url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}). If you did not get the model using the \ttt{svn} method, please request from us an archive containing the LMD-MGCM named \ttt{LMDZ.MARS.meso.tar.gz}, which you have to extract in the \ttt{\$MOD} directory. If you got the model using \ttt{svn}, you do not have to request this file. In the \ttt{\$MOD/LMDZ.MARS} directory, a script named \ttt{compile} can be found and must be used \emph{on the system you plan to run the mesoscale model on} to compile the GCM. The \ttt{compile} script is actually just a wrapper for the \ttt{makegcm} script, which compiles the GCM for you; the default \ttt{makegcm} script only works with the Portland Group Fortran compiler \ttt{pgf90}, but scripts to compile the model with other Fortran compilers (including \ttt{g95} or \ttt{ifort}) are available upon request. The following commands should yield the compilation of two executables, \ttt{newstart.e} and \ttt{gcm.e}:
+
+\begin{verbatim}
+cd $MOD/LMDZ.MARS
 ./compile
 \end{verbatim}
 
 \sk
-The other necessary operation to prepare the LMD-MGCM for step~1 is to store a set of initial states for the LMD-MGCM to start with -- based on previous typical LMD-MGCM runs having reached equilibrium after ten years of integration. A reference database can be found in the following online big archive~\url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz}. This archive must be extracted somewhere on a disk that would be accessible by the system you plan to run the mesoscale model on; the absolute link of the \ttt{STARTBASE\_64\_48\_32\_t2} directory on your disk must be reported in the beginning of the script~\ttt{\$LMDMOD/LMDZ.MARS/myGCM/launch\_gcm} (variable \ttt{startbase}). If those operations went well, please try the command line~\ttt{echo 22 | launch\_gcm} which should launch the GCM integrations on your system.
+The other necessary operation to prepare the LMD-MGCM for step~1 is to store a set of initial states for the LMD-MGCM to start with, based on previous typical LMD-MGCM runs having reached equilibrium after ten years of integration. A reference database can be found in the following large online archive:~\url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz}. This archive must be extracted somewhere on a disk accessible from the system you plan to run the mesoscale model on; the absolute path of the \ttt{STARTBASE\_64\_48\_32\_t2} directory on your disk must be set at the beginning of the script~\ttt{\$MOD/LMDZ.MARS/myGCM/launch\_gcm} (variable \ttt{startbase}). If those operations went well, please try the command line~\ttt{echo 22 | launch\_gcm}, which should launch the GCM integrations on your system.
 
 \mk
@@ -110,4 +110,98 @@
 \end{center}
 
+\sk
+\subsection{Step 1: Running the GCM and converting data}
+
+\sk
+Here we assume that the user has chosen a given Martian sol or $L_s$ at which to start the mesoscale simulation. As already mentioned in section~\ref{namelist}, the file \ttt{\$MMM/SIMU/calendar} (or see appendix~\ref{calendar}) can help with this choice (sol$\rightarrow$$L_s$$\rightarrow$mesoscale date and vice-versa). In addition, the user has to identify the last sol before the desired starting sol that has $99$ in the first column: such sols are the ones for which an initial starting file for the GCM is available. Then please set the number of GCM simulated days \ttt{nday} in \ttt{\$MOD/LMDZ.MARS/myGCM/run.def} accordingly: suppose you want to start a mesoscale simulation at sol~9 and run it for 4~sols; then, according to the \ttt{calendar} file, sol~8 is the closest sol before sol~9 to be in the database, so \ttt{nday} must be at least~$5$. For optimal forcing at the boundaries, we advise you to write the meteorological fields to the \ttt{diagfi.nc} file at least every two hours, i.e. \ttt{ecritphy} is $40$ or~$80$ in \ttt{\$MOD/LMDZ.MARS/myGCM/run.def}. Finally, the GCM run can be launched using the following commands and should produce a netCDF data file named \ttt{diagfi.nc}\footnote{If the fields \ttt{emis}, \ttt{co2ice}, \ttt{q01}, \ttt{q02}, \ttt{tsoil} are missing in the \ttt{diagfi.nc} file, they are replaced by the respective default values $0.95$, $0$, $0$, $0$, tsurf at the end of preprocessing step 1.}:
+
+\begin{verbatim}
+cd $MOD/LMDZ.MARS/myGCM
+./launch_gcm    ## answer: your desired starting sol for the simulations
+\end{verbatim}
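+
+For the sol~9 example above, the relevant \ttt{run.def} settings would look like the sketch below (we assume here simple \ttt{keyword=value} assignments for illustration; please check the actual syntax against your own \ttt{run.def} file):
+
+\begin{verbatim}
+nday=5        ## at least 5 sols: start file at sol 8, mesoscale run over sols 9-12
+ecritphy=40   ## write fields to diagfi.nc at least every two hours (40 or 80)
+\end{verbatim}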
+
+
+\sk
+Once the GCM simulation is finished, the programs in the \ttt{PREP\_MARS} directory allow the user to convert the data from the netCDF \ttt{diagfi.nc} file into separate binary datafiles for each date contained in \ttt{diagfi.nc}, according to the formatting needed by the preprocessing programs at step 2. These programs can be executed with the following commands; if everything went well with the conversion,
+the directory \ttt{\$MOD/TMPDIR/WPSFEED} should contain files whose names start with \ttt{LMD:}.
+
+\begin{verbatim}
+cd $MOD/LMD_MM_MARS/your_install_dir/PREP_MARS
+echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
+./readmeteo.exe < readmeteo.def
+\end{verbatim}
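+
+A quick way to verify the conversion is to list the contents of \ttt{WPSFEED} (the exact file names depend on the dates contained in \ttt{diagfi.nc}):
+
+\begin{verbatim}
+ls $MOD/TMPDIR/WPSFEED/LMD:*   ## should list one file per date
+\end{verbatim}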
+
+\sk
+\subsection{Step 2: Interpolation on the regional domain}
+
+\sk
+\paragraph{Step 2a} In the \ttt{WPS} directory, the \ttt{geogrid.exe} program allows you to define the mesoscale simulation domain, to horizontally interpolate the topography, thermal inertia and albedo fields at the domain resolution and to calculate useful fields such as topographical slopes. Please execute the commands:
+
+\begin{verbatim}
+cd $MMM/your_install_dir/WPS
+ln -sf $MMM/TESTCASE/namelist.wps .   # test case (or use your customized file)
+./geogrid.exe
+\end{verbatim}
+
+The result of \ttt{geogrid.exe} -- and thus the definition of the mesoscale domain -- can be checked in the netCDF file \ttt{geo\_em.d01.nc} (using for instance \ttt{ncview}, your favorite graphical interface for netCDF files, or python-based scripts as in section~\ref{postproc}). If you are unhappy with the results or want to change the location of the mesoscale domain on the planet, the horizontal resolution, the number of grid points\ldots, please modify the parameter file \ttt{namelist.wps}, the contents of which are reproduced and commented on the next page, and execute \ttt{geogrid.exe} again.
+
+\begin{finger}
+\item No input meteorological data are actually needed to execute \ttt{geogrid.exe}, so this step~2a can be carried out before step~1. It is probably a good idea to prepare step~2 by choosing the mesoscale simulation domain while the GCM computations of step~1 are running.
+\item More details about the database and more options of interpolation could be found in the file \ttt{geogrid/GEOGRID.TBL} (for advanced users only).
+\item Two examples are given in Figure~\ref{vallespolar}.
+\end{finger}
+
+\footnotesize
+\codesource{namelist.wps_TEST}
+\normalsize
+
+\begin{figure}[h!] 
+\begin{center}
+\includegraphics[width=0.48\textwidth]{valles.png}
+\includegraphics[width=0.48\textwidth]{LMD_MMM_d1_20km_domain_100.png} 
+\end{center}
+\caption{\label{vallespolar} (Left plot) An example of Mercator domain in the Valles Marineris region as simulated by \textit{Spiga and Forget} [2009, their section 3.3]; relevant parameters in \ttt{namelist.wps} are: \ttt{e\_we = 401}, \ttt{e\_sn = 121}, \ttt{dx = 12000}, \ttt{dy = 12000}, \ttt{map\_proj='mercator'}, \ttt{ref\_lat = -8}, \ttt{ref\_lon = -68}. (Right plot) An example of north polar domain with stereographic projection; relevant parameters in \ttt{namelist.wps} are: \ttt{e\_we = 117}, \ttt{e\_sn = 117}, \ttt{dx = 20000}, \ttt{dy = 20000}, \ttt{map\_proj='polar'}, \ttt{ref\_lat = 90}, \ttt{ref\_lon = 0.1}, \ttt{truelat1  =  90}, \ttt{stand\_lon =  0.1}.}
+\end{figure}
+
+
+
+\sk
+\paragraph{Step 2b} Once the \ttt{geo\_em} file(s) are generated, the \ttt{metgrid.exe} program horizontally interpolates the meteorological fields to the mesoscale domain, similarly to what \ttt{geogrid.exe} does for the surface data (interpolation options can be modified by advanced users in \ttt{metgrid/METGRID.TBL}). The program then writes the results in \ttt{met\_em} files, also collecting the static fields and domain parameters included in the \ttt{geo\_em} file(s). If everything went well with the commands below, the directory \ttt{\$MOD/TMPDIR/WRFFEED} should contain \ttt{met\_em.*} files.
+
+\begin{verbatim}
+cd $MMM/your_install_dir/WPS
+mkdir WRFFEED/current
+./metgrid.exe
+\end{verbatim}
+
+\sk
+\subsection{Step 3: Vertical interpolation on mesoscale levels}
+
+\sk
+The last preprocessing step before being able to run the mesoscale simulation at step~4 is to execute \ttt{real.exe} to perform the interpolation from the vertical levels of the GCM to the vertical levels defined in the mesoscale model. This program also prepares the final initial state for the simulation in files called \ttt{wrfinput} and the boundary conditions in files called \ttt{wrfbdy}. To successfully execute \ttt{real.exe}, you need the \ttt{met\_em.*} files and the \ttt{namelist.input} file to be in the same directory as \ttt{real.exe}. Parameters in \ttt{namelist.input} controlling the behavior of the vertical interpolation are those labelled with \ttt{(p3)} in the detailed list introduced in chapter~\ref{zeparam}. 
+
+\begin{verbatim}
+cd $MOD/TESTCASE   ## or anywhere you would like to run the simulation
+ln -sf $MOD/TMPDIR/WRFFEED/met_em* .
+./real.exe
+\end{verbatim}
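+
+If \ttt{real.exe} completed successfully, the initial and boundary condition files should now be present in the simulation directory and can be checked with a simple listing:
+
+\begin{verbatim}
+ls wrfinput* wrfbdy*   ## one wrfinput file per domain, plus the wrfbdy file
+\end{verbatim}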
+
+\sk
+The final message of \ttt{real.exe} should confirm the success of the process, and you are now ready to launch the integrations of the LMD Martian Mesoscale Model with the \ttt{wrf.exe} command, as in section~\ref{sc:arsia}.
+
+\begin{finger}
+\item \textbf{When you modify either \ttt{namelist.wps} or \ttt{namelist.input}, make sure that the common parameters are exactly the same in both files (especially when running nested simulations); otherwise either the \ttt{real.exe} or the \ttt{wrf.exe} command will exit with an error message. Obviously, the dates sent to \ttt{launch\_gcm} and written in both \ttt{namelist.input} and \ttt{namelist.wps} should also all be the same.}
+\end{finger}
 
 \clearemptydoublepage
Index: /trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual_txt.tex
===================================================================
--- /trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual_txt.tex	(revision 222)
+++ /trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual_txt.tex	(revision 223)
@@ -1,230 +1,2 @@
-
-
-\mk
-\subsubsection{Meteorological data}
-
-\ttt{launch\_gcm}
-
-\mk
-The preprocessing tools generate initial and boundary conditions
-from the \ttt{diagfi.nc} outputs of LMD-MGCM simulations.
-%
-If you would like to run a mesoscale simulation at a given
-season, you need to first run a GCM simulation and output
-the meteorological fields at the considered season.
-%
-For optimal forcing at the boundaries, we advise you
-to write the meteorological fields to the 
-\ttt{diagfi.nc} file at least each two hours.
-%
-Please also make sure that the following fields
-are stored in the NETCDF \ttt{diagfi.nc} file:
-
-\footnotesize
-\codesource{contents_diagfi}
-
-\normalsize
-\begin{finger}
-\item If the fields 
-\ttt{emis}, 
-\ttt{co2ice}, 
-\ttt{q01}, 
-\ttt{q02}, 
-\ttt{tsoil} 
-are missing in the \ttt{diagfi.nc} file, 
-they are replaced by respective default 
-values $0.95$, $0$, $0$, $0$, tsurf.
-\end{finger}
-
-\mk
-\marge An example of input meteorological file 
-\ttt{diagfi.nc} file can be downloaded
-at \url{http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz}.
-%
-Please deflate the archive and copy the \ttt{diagfi.nc} file
-in \ttt{\$LMDMOD/TMPDIR/GCMINI}.
-%
-Such a file can then be used to define the initial
-and boundary conditions, and we will go 
-through the three preprocessing steps.
-
-\mk
-\subsection{Preprocessing steps} 
-
-\mk
-\subsubsection{Step 1: Converting GCM data}
-
-\mk
-The programs in the \ttt{PREP\_MARS} directory
-convert the data from the NETCDF \ttt{diagfi.nc}
-file into separated binary datafiles for each
-date contained in \ttt{diagfi.nc}, according to
-the formatting needed by the 
-preprocessing programs at step 2.
-%
-These programs can be executed by the following
-commands: 
-\begin{verbatim}
-cd $LMDMOD/LMD_MM_MARS/your_install_dir/PREP\_MARS
-echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
-./readmeteo.exe < readmeteo.def
-\end{verbatim}
-%
-\marge If every went well with the conversion,
-the directory \ttt{\$LMDMOD/TMPDIR/WPSFEED}
-should contain files named \ttt{LMD:}.
-
-\mk
-\subsubsection{2: Interpolation on the regional domain}
-
-\mk
-In the \ttt{WPS} directory, the \ttt{geogrid.exe} program allows 
-you to define the mesoscale simulation domain
-to horizontally interpolate the topography, 
-thermal inertia and albedo fields at the domain
-resolution and to calculate useful fields
-such as topographical slopes.%\pagebreak
-
-\mk
-\marge Please execute the commands:
-%
-\begin{verbatim}
-cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
-ln -sf ../../TESTCASE/namelist.wps .   # test case
-./geogrid.exe
-\end{verbatim}
-%
-\marge The result of \ttt{geogrid.exe} 
--- and thus the definition of the mesoscale
-domain -- can be checked in the NETCDF
-file \ttt{geo\_em.d01.nc}.
-%
-A quick check can be performed using the command line
-\begin{verbatim}
-ncview geo_em.d01.nc
-\end{verbatim} 
-\marge if \ttt{ncview} is installed, or the \ttt{IDL}
-script \ttt{out\_geo.pro}
-\begin{verbatim}
-idl
-IDL> out_geo, field1='TOPO'
-IDL> out_geo, field1='TI'
-IDL> SPAWN, 'ghostview geo_em.d01_HGT_M.ps &'
-IDL> SPAWN, 'ghostview geo_em.d01_THERMAL_INERTIA.ps &'
-IDL> exit
-\end{verbatim}
-\marge if the demo version of \ttt{IDL} is installed.
-%
-Of course, if your favorite graphical tool supports
-the NETCDF standard, you may use it to check the
-domain definition in \ttt{geo\_em.d01.nc}.
-
-\mk
-\marge If you are unhappy with the results, or if
-you want to change 
-the location of the mesoscale domain on the planet, 
-the horizontal resolution,
-the number of grid points \ldots,
-please modify the parameter
-file \ttt{namelist.wps} and execute \ttt{geogrid.exe} again.
-%
-Here are the contents of \ttt{namelist.wps}:
-%
-\codesource{namelist.wps_TEST} 
-
-\begin{finger}
-%
-\item No input meteorological data 
-are actually needed to execute \ttt{geogrid.exe}.
-%
-\item More details about the database and
-more interpolation options can be
-found in the file \ttt{geogrid/GEOGRID.TBL}.
-%
-\item Defining several domains yields
-distinct files 
-\ttt{geo\_em.d01.nc},
-\ttt{geo\_em.d02.nc},
-\ttt{geo\_em.d03.nc}\ldots
-\end{finger}
-
-\mk
-\marge Once the \ttt{geo\_em} file(s) are generated,
-the \ttt{metgrid.exe} program performs
-the same kind of horizontal interpolation 
-of the meteorological fields to the mesoscale 
-domain as \ttt{geogrid.exe} performed 
-for the surface data.
-%
-The program then writes the results to
-\ttt{met\_em} files and also collects
-the static fields and domain parameters
-included in the \ttt{geo\_em} file(s).
-%
-Please type the following commands:
-\begin{verbatim}
-cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
-./metgrid.exe
-\end{verbatim}
-%
-\marge If everything went well,
-the directory \ttt{\$LMDMOD/TMPDIR/WRFFEED}
-should contain the \ttt{met\_em.*} files.
-
-\mk
-\subsubsection{Step 3: Vertical interpolation on mesoscale levels}
-
-\mk
-\marge The last step is to execute \ttt{real.exe}
-to perform the interpolation from the vertical
-levels of the GCM to the vertical levels 
-defined in the mesoscale model.
-%
-This program also prepares the final initial
-state for the simulation in files called
-\ttt{wrfinput} and the boundary conditions
-in files called \ttt{wrfbdy}.
-
-\mk
-\marge To successfully execute \ttt{real.exe}, 
-you need the \ttt{met\_em.*} files
-and the \ttt{namelist.input} file
-to be in the same directory as \ttt{real.exe}.
-%
-Parameters in \ttt{namelist.input} 
-controlling the behavior of the vertical interpolation 
-are those labelled with \ttt{(p3)} in the detailed
-list introduced in the previous chapter.
-
-\mk
-\marge Please type the following commands
-to prepare files for the Arsia Mons test case
-(or your personal test case if you changed
-the parameters in \ttt{namelist.wps}):
-\begin{verbatim}
-cd $LMDMOD/TESTCASE
-ln -sf $LMDMOD/WRFFEED/met_em* .
-./real.exe
-\end{verbatim}
-
-\mk
-\marge The final message from \ttt{real.exe}
-should confirm the success of the process; you
-are then ready to launch the integrations
-of the LMD Martian Mesoscale Model again
-with the \ttt{wrf.exe} command as in section
-\ref{sc:arsia}.
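Before launching `wrf.exe`, it can be useful to confirm that the initial and boundary condition files were actually written and are non-empty. The sketch below is illustrative and not part of the model distribution; the file names `wrfinput_d01` and `wrfbdy_d01` follow usual WRF naming conventions and may differ in your setup.

```shell
# Illustrative helper (assumed names, adapt to your build): verify that
# real.exe produced non-empty output files in the current directory.
check_real_output() {
  status=0
  for f in "$@"; do
    if [ -s "$f" ]; then          # -s: file exists and has size > 0
      echo "OK: $f was written"
    else
      echo "MISSING or empty: $f" >&2
      status=1
    fi
  done
  return $status
}
# Example usage:
# check_real_output wrfinput_d01 wrfbdy_d01
```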
-
-\begin{finger}
-\item When you modify either 
-\ttt{namelist.wps} or \ttt{namelist.input},
-make sure that the common parameters
-are exactly the same in both files
-(especially when running nested simulations),
-otherwise \ttt{real.exe} or \ttt{wrf.exe}
-will exit with an error message.
-\end{finger}
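A simple way to guard against such mismatches is to extract and compare the shared parameters mechanically. The following shell sketch is illustrative only (not an LMD tool); the parameter name `e_we` in the example is just one of the grid parameters the two namelists typically share.

```shell
# Illustrative helpers (not part of the model distribution):
# compare one "name = value" parameter across two Fortran-namelist files.
get_param() {
  # print the value following "name =" (first occurrence only)
  sed -n "s/^[[:space:]]*$2[[:space:]]*=[[:space:]]*\(.*\)/\1/p" "$1" | head -1
}
check_same() {
  a=$(get_param "$1" "$3")
  b=$(get_param "$2" "$3")
  if [ "$a" = "$b" ]; then
    echo "OK: $3 = $a in both files"
  else
    echo "MISMATCH: $3 is '$a' in $1 but '$b' in $2" >&2
    return 1
  fi
}
# Example usage:
# check_same namelist.wps namelist.input e_we
```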
-%\pagebreak
-
 
 
@@ -248,4 +20,6 @@
 
 \section{Grid nesting}\label{nests}
+
+\codesource{namelist.wps_NEST} 
 
 \section{Tracers}
Index: /trunk/MESOSCALE_DEV/MANUAL/SRC/whatis.tex
===================================================================
--- /trunk/MESOSCALE_DEV/MANUAL/SRC/whatis.tex	(revision 222)
+++ /trunk/MESOSCALE_DEV/MANUAL/SRC/whatis.tex	(revision 223)
@@ -2,5 +2,5 @@
 
 \vk
-This chapter comprises the excerpts from \textit{Spiga and Forget} [2009]\nocite{Spig:09} dedicated to a general scientific and technical description of the LMD Martian Mesoscale Model, of its design and capabilities. Further details can be found in the reference paper \textit{Spiga and Forget} [2009]\nocite{Spig:09} and subsequent papers about mesoscale applications: e.g., \textit{Spiga and Lewis} [2010]\nocite{Spig:10dust} and \textit{Spiga et al.} [2011]\nocite{Spig:11ti}. An introduction to Large-Eddy Simulations can be found in \textit{Spiga et al.} [2010]\nocite{Spig:10bl}. Figure~\ref{modelstructure} summarizes the main points detailed in this introduction.
+This chapter comprises excerpts from \textit{Spiga and Forget} [2009]\nocite{Spig:09} dedicated to a general scientific and technical description of the LMD Martian Mesoscale Model, its design and its capabilities. Further details can be found in the reference paper \textit{Spiga and Forget} [2009]\nocite{Spig:09} and in subsequent papers about mesoscale applications: e.g., \textit{Spiga and Lewis} [2010]\nocite{Spig:10dust} and \textit{Spiga et al.} [2011]\nocite{Spig:11ti}. An introduction to Large-Eddy Simulations can be found in \textit{Spiga et al.} [2010]\nocite{Spig:10bl}. Figure~\ref{modelstructure} summarizes the main points detailed in this introduction.
 
 \begin{center}
