Index: trunk/MESOSCALE_DEV/MANUAL/SRC/LMD_MMM_d1_20km_HGT_UV_10m-ALS_Ls8_LT1.sh
===================================================================
--- trunk/MESOSCALE_DEV/MANUAL/SRC/LMD_MMM_d1_20km_HGT_UV_10m-ALS_Ls8_LT1.sh	(revision 230)
+++ trunk/MESOSCALE_DEV/MANUAL/SRC/LMD_MMM_d1_20km_HGT_UV_10m-ALS_Ls8_LT1.sh	(revision 230)
@@ -0,0 +1,1 @@
+/donnees/aslmd/MODELES/MESOSCALE_DEV/PLOT/PYTHON/scripts/winds.py -f wrfout_d01_2024-01-17_02:00:00 -i 4 -v HGT -n -1 -m -1500. -M 20000. -s 2 
Index: trunk/MESOSCALE_DEV/MANUAL/SRC/LMD_MMM_d1_20km_UV_HGT_Ls2_LT1.sh
===================================================================
--- trunk/MESOSCALE_DEV/MANUAL/SRC/LMD_MMM_d1_20km_UV_HGT_Ls2_LT1.sh	(revision 228)
+++ 	(revision )
@@ -1,1 +1,0 @@
-/donnees/aslmd/MODELES/MESOSCALE_DEV/PLOT/PYTHON/scripts/winds.py -f wrfout_d01_2024-01-04_02:00:00 -i 4 -v HGT -n -1 -m -1500. -M 20000. -s 2 
Index: trunk/MESOSCALE_DEV/MANUAL/SRC/compile_exec.tex
===================================================================
--- trunk/MESOSCALE_DEV/MANUAL/SRC/compile_exec.tex	(revision 228)
+++ trunk/MESOSCALE_DEV/MANUAL/SRC/compile_exec.tex	(revision 230)
@@ -29,5 +29,5 @@
 
 \sk
-Please take the time to check the contents of the \ttt{LMD\_MM\_MARS} directories\footnote{If you used method~$2$, you will probably notice that other directories than~\ttt{LMD\_MM\_MARS} are present in \ttt{\$MOD}, but those are not important at this stage.} and sub-directories through the following command lines:
+Please take the time to check the contents of the \ttt{LMD\_MM\_MARS} directories\footnote{If you used method~$2$, you will probably notice that other directories than~\ttt{LMD\_MM\_MARS} are present in \ttt{\$MESO}, but those are not important at this stage.} and sub-directories through the following command lines:
 \begin{verbatim}
 ls $MMM ; ls $MMM/*
@@ -87,5 +87,5 @@
 \item ask the user about compilation settings;
 \item retrieve some additional information about the system;
-\item create a directory \ttt{\$MOD/LMD\_MM\_MARS/your\_compdir} which name depends\footnote{For example, a \ttt{your\_compdir} directory named \ttt{g95\_32\_single} is created if the user requested a \ttt{g95} compilation of the code for single-domain simulations on a 32bits machine.} on the kind of compiler you are using, on whether your system is 32 or 64 bits, on whether sequential or parallel computations are planned and on the kind of simulations (idealized or real-case); 
+\item create a directory \ttt{\$MESO/LMD\_MM\_MARS/your\_compdir} which name depends\footnote{For example, a \ttt{your\_compdir} directory named \ttt{g95\_32\_single} is created if the user requested a \ttt{g95} compilation of the code for single-domain simulations on a 32bits machine.} on the kind of compiler you are using, on whether your system is 32 or 64 bits, on whether sequential or parallel computations are planned and on the kind of simulations (idealized or real-case); 
 \item generate with \ttt{copy\_model} a directory \ttt{your\_compdir/WRFV2} containing links to \ttt{SRC/WRFV2} sources\footnote{A note to developers: this method ensures that any change to the model sources would be propagated to all the different \ttt{your\_compdir} installation folders.};
 \item execute the WRF \ttt{configure} script with the correct option;
@@ -152,6 +152,5 @@
 
 \sk
-In order to test the compiled executables, a ready-to-use test case (with pre-generated initial and boundary conditions) is proposed in the \ttt{LMD\_MM\_MARS\_TESTCASE.tar.gz}
-archive that you can download in the following FTP site \url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/LMD_MM_MARS_TESTCASE.tar.gz}. This test case simulates the hydrostatic atmospheric flow around Arsia Mons (Figure~\ref{arsia}) during half a sol with constant thermal inertia, albedo and dust opacity\footnote{Though the simulation reproduces some reasonable features of the mesoscale circulation around Arsia Mons (e.g. slope winds), it should not be used for scientific purpose, for the number of grid points is unsufficient for single-domain simulation and the integration time is below the necessary spin-up time.}.
+In order to test the compiled executables, a ready-to-use test case (with pre-generated initial and boundary conditions) is proposed in the \ttt{LMD\_MM\_MARS\_TESTCASE.tar.gz} archive, which can be downloaded from the following FTP site: \url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/LMD_MM_MARS_TESTCASE.tar.gz}. This test case simulates the hydrostatic atmospheric flow around Arsia Mons (Figure~\ref{arsia}) during half a sol in springtime with constant thermal inertia, albedo and dust opacity\footnote{Though the simulation reproduces some reasonable features of the mesoscale circulation around Arsia Mons (e.g. slope winds), it should not be used for scientific purposes, as the number of grid points is insufficient for a single-domain simulation and the integration time is below the necessary spin-up time.}.
 
 \sk
@@ -159,5 +158,5 @@
 %
 \begin{verbatim}
-cp LMD_MM_MARS_TESTCASE.tar.gz $MOD/LMD_MM_MARS/
+cp LMD_MM_MARS_TESTCASE.tar.gz $MESO/LMD_MM_MARS/
 tar xzvf LMD_MM_MARS_TESTCASE.tar.gz
 cd TESTCASE
@@ -167,5 +166,5 @@
 
 \sk
-The files contained in \ttt{TESTCASE} prior to launching the simulations with the \ttt{wrf.exe} command illustrate which files are needed to perform step 4, i.e. running a LMD Martian Mesoscale Model simulation\footnote{For the test case presented here, a file named \ttt{dustopacity.def} is needed because for the sake of simplicity of this test case, we set idealized uniform dust opacity. The files \ttt{real.exe} and \ttt{namelist.wps} are included in the \ttt{TESTCASE} folder for further reference but not needed at this stage.}. 
+The files contained in \ttt{TESTCASE} prior to launching the simulations with the \ttt{wrf.exe} command illustrate which files are needed to perform step 4, i.e. running an LMD Martian Mesoscale Model simulation\footnote{For the test case presented here, a file named \ttt{dustopacity.def} is needed because, for simplicity, this test case uses an idealized uniform dust opacity. The file \ttt{namelist.wps} is included in the \ttt{TESTCASE} folder for further reference but is not needed at this stage.}. 
 \begin{itemize}
 \item \ttt{namelist.input}: text file containing parameters for the dynamical core
@@ -178,5 +177,5 @@
 \begin{figure}[h!]
 \includegraphics[width=0.5\textwidth]{arsiadomain.png} 
-\includegraphics[width=0.5\textwidth]{LMD_MMM_d1_20km_UV_HGT_Ls2_LT1_100.png}
+\includegraphics[width=0.5\textwidth]{LMD_MMM_d1_20km_HGT_UV_10m-ALS_Ls8_LT1_100.png}
 \caption{\label{arsia} [Left plot] Simulation domain defined in the test case proposed as a demonstrator for running the LMD Martian Mesoscale Model. [Right plot] Nighttime winds predicted by the model~$10$~m above the surface. Both plots have been generated by command-line scripts written in~\ttt{python + numpy + matplotlib} (see chapter~\ref{postproc}).}
 \end{figure}
Index: trunk/MESOSCALE_DEV/MANUAL/SRC/installation.tex
===================================================================
--- trunk/MESOSCALE_DEV/MANUAL/SRC/installation.tex	(revision 228)
+++ trunk/MESOSCALE_DEV/MANUAL/SRC/installation.tex	(revision 230)
@@ -76,17 +76,17 @@
 \section{Main installation of the model sources}
 
-\paragraph{Method 1: You were given a \ttt{LMD\_MM\_MARS.tar.gz} archive} Please set the environment variable \ttt{\$MOD} to point at the directory where you will install the model and define the environment variable \ttt{\$MMM} as \ttt{\$MOD/LMD\_MM\_MARS}. Copy the \ttt{LMD\_MM\_MARS.tar.gz} file in the \ttt{\$MOD} directory and extract the files. Then execute the \ttt{prepare} script that would do all installation tasks\footnote{Deflate the various compressed archives contained into \ttt{LMD\_MM\_MARS}, download the ARW-WRF sources from the web, apply a (significant) ``Martian patch" to these sources and build the structure of your \ttt{LMD\_MM\_MARS} directory} for you:
+\paragraph{Method 1: You were given a \ttt{LMD\_MM\_MARS.tar.gz} archive} Please set the environment variable \ttt{\$MESO} to point at the directory where you will install the model and define the environment variable \ttt{\$MMM} as \ttt{\$MESO/LMD\_MM\_MARS}. Copy the \ttt{LMD\_MM\_MARS.tar.gz} file into the \ttt{\$MESO} directory and extract the files. Then execute the \ttt{prepare} script, which performs all installation tasks\footnote{It deflates the various compressed archives contained in \ttt{LMD\_MM\_MARS}, downloads the ARW-WRF sources from the web, applies a (significant) ``Martian patch'' to these sources and builds the structure of your \ttt{LMD\_MM\_MARS} directory.} for you:
 %
 \begin{verbatim}       
-declare -x MOD=/disk/user/MODELS
-declare -x MMM=$MOD/LMD_MM_MARS
-cp LMD_MM_MARS.tar.gz $MOD
-cd $MOD
+declare -x MESO=/disk/user/MODELS
+declare -x MMM=$MESO/LMD_MM_MARS
+cp LMD_MM_MARS.tar.gz $MESO
+cd $MESO
 tar xzvf LMD_MM_MARS.tar.gz
-cd $MOD/LMD_MM_MARS
+cd $MESO/LMD_MM_MARS
 ./SRC/SCRIPTS/prepare  ## or simply ./prepare if the script is in LMD_MM_MARS
 \end{verbatim}
 
-\paragraph{Method 2: You were given a \ttt{svn} link \ttt{the\_link}} \emph{You must have Subversion (\ttt{svn}) installed on your system to follow this method}. Please use the name of our server repository combined to an \ttt{svn checkout} command to get the model sources\footnote{At this stage, it is essential to have registered to the WRF website (see foreword) because our server contains some part of the ARW-WRF sources.}. Please also set the environment variable \ttt{\$MOD} and \ttt{\$MMM}. The first download of the model sources could be a bit long. Compared to method~$1$, this method~$2$ using \ttt{svn} would allow you to easily get the latest updates and bug fixes done on the LMD Martian Mesoscale Model by the development team\footnote{If you are not interested by this feature, please replace the command line featuring \ttt{svn checkout} by this command line \ttt{svn export the\_link/LMDZ.MARS the\_link/MESOSCALE} }.
+\paragraph{Method 2: You were given a \ttt{svn} link \ttt{the\_link}} \emph{You must have Subversion (\ttt{svn}) installed on your system to follow this method}. Please use the name of our server repository combined with an \ttt{svn checkout} command to get the model sources\footnote{At this stage, it is essential to have registered on the WRF website (see foreword) because our server contains some parts of the ARW-WRF sources.}. Please also set the environment variables \ttt{\$MESO} and \ttt{\$MMM}. The first download of the model sources may take a while. Compared to method~$1$, this method~$2$ using \ttt{svn} allows you to easily get the latest updates and bug fixes of the LMD Martian Mesoscale Model from the development team\footnote{If you are not interested in this feature, please replace the command line featuring \ttt{svn checkout} with the command line \ttt{svn export the\_link/LMDZ.MARS the\_link/MESOSCALE}.}.
 
 \begin{verbatim}
@@ -95,6 +95,6 @@
 svn update LMDZ.MARS MESOSCALE
 cd MESOSCALE
-declare -x MOD=$PWD
-declare -x MMM=$MOD/LMD_MM_MARS
+declare -x MESO=$PWD  ## put the absolute path in your .bashrc
+declare -x MMM=$MESO/LMD_MM_MARS
 ## to get latest updates later on
 cd the_name_of_your_local_destination_folder
@@ -107,12 +107,12 @@
 
 \sk
-Parallel computations with the Message Passing Interface (MPI) standard are supported by the LMD Martian Mesoscale Model. If you want to use this capability, you would have to add the installation of MPICH2 as a additional prerequisite. Once the installation is completed, it is required to define the environment variable \ttt{\$WHERE\_MPI} to point in your \ttt{mpich} \ttt{bin} directory, even if you added the \ttt{\$MOD/MPI/mpich2-1.0.8/bin} directory to your \ttt{\$PATH} variable. 
+Parallel computations with the Message Passing Interface (MPI) standard are supported by the LMD Martian Mesoscale Model. If you want to use this capability, you will have to install MPICH2 as an additional prerequisite. Once the installation is completed, it is required to define the environment variable \ttt{\$WHERE\_MPI} to point to your \ttt{mpich} \ttt{bin} directory, even if you added the \ttt{\$your\_software\_dir/MPI/mpich2-1.0.8/bin} directory to your \ttt{\$PATH} variable. 
 
 \begin{finger}
 \item \scriptsize Here is a brief ``how-to" to install MPICH2, although this surely does not replace reading carefully installation notes and choosing what installation suits best your system. Please download the current stable version of the sources (e.g. we choose here an old version \ttt{mpich2-1.0.8.tar.gz} to illustrate the commands) on the MPICH2 website \url{http://www.mcs.anl.gov/research/projects/mpich2} and install the MPICH2 utilities by the following commands:
 \begin{verbatim}
-mkdir $MOD/MPI
-mv mpich2-1.0.8.tar.gz $MOD/MPI
-cd $MOD/MPI
+mkdir $your_software_dir/MPI    
+mv mpich2-1.0.8.tar.gz $your_software_dir/MPI/
+cd $your_software_dir/MPI
 tar xzvf mpich2-1.0.8.tar.gz
 cd mpich2-1.0.8
@@ -120,5 +120,5 @@
 # please wait...
 make > mk.log 2> mkerr.log &
-declare -x WHERE_MPI=$MOD/MPI/mpich2-1.0.8/bin
+declare -x WHERE_MPI=$your_software_dir/MPI/mpich2-1.0.8/bin
 \end{verbatim}
 \normalsize
Index: trunk/MESOSCALE_DEV/MANUAL/SRC/preproc.tex
===================================================================
--- trunk/MESOSCALE_DEV/MANUAL/SRC/preproc.tex	(revision 228)
+++ trunk/MESOSCALE_DEV/MANUAL/SRC/preproc.tex	(revision 230)
@@ -14,11 +14,11 @@
 
 \sk
-First and foremost, since the preprocessing utilities could generate (or involve) files of quite significant sizes, it is necessary to define a directory where these files would be stored. Such a directory (e.g. \ttt{/bigdisk/user}) must be linked with the name \ttt{\$TMPDIR} as follows. In addition, three directories \ttt{GCMINI}, \ttt{WPSFEED}, \ttt{WRFFEED} have to be defined in \ttt{\$MOD/TMPDIR} as indicated below.
-
-\begin{verbatim}
-ln -sf /bigdisk/user $MOD/TMPDIR
-mkdir $MOD/TMPDIR/GCMINI
-mkdir $MOD/TMPDIR/WPSFEED
-mkdir $MOD/TMPDIR/WRFFEED
+First and foremost, since the preprocessing utilities can generate (or involve) quite large files, it is necessary to define a directory where these files will be stored. Such a directory (e.g. \ttt{/bigdisk/user}) must be linked as \ttt{\$MESO/TMPDIR} as follows. In addition, three directories \ttt{GCMINI}, \ttt{WPSFEED}, \ttt{WRFFEED} have to be created in \ttt{\$MESO/TMPDIR} as indicated below.
+
+\begin{verbatim}
+ln -sf /bigdisk/user $MESO/TMPDIR
+mkdir $MESO/TMPDIR/GCMINI
+mkdir $MESO/TMPDIR/WPSFEED
+mkdir $MESO/TMPDIR/WRFFEED
 \end{verbatim}
 
@@ -27,5 +27,5 @@
 
 \begin{verbatim}
-cd $MOD/LMD_MM_MARS/g95_32_single/   ## or any of your install directory
+cd $MMM/g95_32_single/   ## or whichever install directory you use
 ln -sf ../SRC/SCRIPTS/prepare_ini .
 ./prepare_ini
@@ -59,9 +59,9 @@
 
 \sk
-All the static data (topography, thermal inertia, albedo) needed to initialize the model are included in the \ttt{\$MOD/LMD\_MM\_MARS/WPS\_GEOG} directory. By default, only coarse-resolution datasets\footnote{ Corresponding to the fields stored in the file \ttt{surface.nc} known by LMD-MGCM users: \url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc} } are available, but the directory also contains sources and scripts to install finer resolution datasets: 32 and/or 64 pixel-per-degree (ppd) MOLA topography (\ttt{mola\_topo32} and \ttt{mola\_topo64}), 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo (\ttt{albedo\_TES}), 20 ppd TES thermal inertia (\ttt{thermal\_TES}). The role of the \ttt{build\_static} script is to automatically download these datasets from the web (namely PDS archives) and convert them to an acceptable format for a future use by the preprocessing utilities:
-
-\begin{verbatim}
-cd $MOD/LMD_MM_MARS
-./build_static
+All the static data (topography, thermal inertia, albedo) needed to initialize the model are included in the \ttt{\$MMM/WPS\_GEOG} directory. By default, only coarse-resolution datasets\footnote{ Corresponding to the fields stored in the file \ttt{surface.nc} known by LMD-MGCM users: \url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc} } are available, but the directory also contains sources and scripts to install finer-resolution datasets: 32 and/or 64 pixel-per-degree (ppd) MOLA topography (\ttt{mola\_topo32} and \ttt{mola\_topo64}), 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo (\ttt{albedo\_TES}), and 20 ppd TES thermal inertia (\ttt{thermal\_TES}). The role of the \ttt{build\_static} script is to automatically download these datasets from the web (namely PDS archives) and convert them to a format suitable for later use by the preprocessing utilities:
+
+\begin{verbatim}
+cd $MMM
+./SRC/SCRIPTS/build_static
 \end{verbatim}
 
@@ -78,13 +78,13 @@
 
 \sk
-The LMD Martian GCM is supposed to be run to compute meteorological fields that will be used as initial and boundary conditions each one or two Martian hours to the limited-area LMD Martian Mesoscale Model. Hence the LMD Martian GCM must be compiled in your system (see the LMD-MGCM user manual for further details \url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}). If you did not get the model using the \ttt{svn} method, please request us to send you an archive containing the LMD-MGCM named \ttt{LMDZ.MARS.meso.tar.gz}, which you have to extract in the \ttt{\$MOD} directory. If you got the model using \ttt{svn}, you do not have to request this file. In the \ttt{\$MOD/LMDZ.MARS} directory, a script named \ttt{compile} can be found and must be used \emph{on the system you plan to run the mesoscale model on} to compile the GCM. The \ttt{compile} script is actually just a wrapper for the \ttt{makegcm} script which compile the GCM for you; the default \ttt{makegcm} script only works with Portland Group Fortran compiler \ttt{pgf90} but scripts allowing to compile the model using other Fortran compilers (including \ttt{g95} or \ttt{ifort}) are available upon request. The following commands should yield the compilation of two executables \ttt{newstart.e} and \ttt{gcm.e}:
-
-\begin{verbatim}
-cd $MOD/LMDZ.MARS
+The LMD Martian GCM is run to compute the meteorological fields that are used, every one or two Martian hours, as initial and boundary conditions for the limited-area LMD Martian Mesoscale Model. Hence the LMD Martian GCM must be compiled on your system (see the LMD-MGCM user manual for further details: \url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}). If you did not get the model using the \ttt{svn} method, please ask us to send you an archive containing the LMD-MGCM named \ttt{LMDZ.MARS.meso.tar.gz}, which you have to extract in the \ttt{\$MESO} directory. If you got the model using \ttt{svn}, you do not have to request this file. In the \ttt{\$MESO/LMDZ.MARS} directory, a script named \ttt{compile} can be found and must be used \emph{on the system you plan to run the mesoscale model on} to compile the GCM. The \ttt{compile} script is actually just a wrapper for the \ttt{makegcm} script, which compiles the GCM for you; the default \ttt{makegcm} script only works with the Portland Group Fortran compiler \ttt{pgf90}, but scripts to compile the model with other Fortran compilers (including \ttt{g95} or \ttt{ifort}) are available upon request. The following commands should yield the compilation of two executables, \ttt{newstart.e} and \ttt{gcm.e}:
+
+\begin{verbatim}
+cd $MESO/LMDZ.MARS
 ./compile
 \end{verbatim}
 
 \sk
-The other necessary operation to prepare the LMD-MGCM for step~1 is to store a set of initial states for the LMD-MGCM to start with -- based on previous typical LMD-MGCM runs having reached equilibrium after ten years of integration. A reference database can be found in the following online big archive~\url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz}. This archive must be extracted somewhere on a disk that would be accessible by the system you plan to run the mesoscale model on; the absolute link of the \ttt{STARTBASE\_64\_48\_32\_t2} directory on your disk must be reported in the beginning of the script~\ttt{\$MOD/LMDZ.MARS/myGCM/launch\_gcm} (variable \ttt{startbase}). If those operations went well, please try the command line~\ttt{echo 22 | launch\_gcm} which should launch the GCM integrations on your system.
+The other necessary operation to prepare the LMD-MGCM for step~1 is to store a set of initial states for the LMD-MGCM to start from -- based on previous typical LMD-MGCM runs having reached equilibrium after ten years of integration. A reference database can be found in the following large online archive:~\url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz}. This archive must be extracted somewhere on a disk accessible from the system you plan to run the mesoscale model on; the absolute path of the \ttt{STARTBASE\_64\_48\_32\_t2} directory on your disk must be set at the beginning of the script~\ttt{\$MESO/LMDZ.MARS/myGCM/launch\_gcm} (variable \ttt{startbase}). If those operations went well, please try the command line~\ttt{echo 22 | launch\_gcm}, which should launch the GCM integrations on your system.
 
 \mk
@@ -114,8 +114,8 @@
 
 \sk
-Here we assume that the user has chosen a given Martian sol or $L_s$ on which to start the mesoscale simulation. As already mentionned in section~\ref{namelist}, the file \ttt{\$MMM/SIMU/calendar} (or see appendix~\ref{calendar}) can help with this choice (sol$\rightarrow$$L_s$$\rightarrow$mesoscale date and vice-versa). In addition, the user has to check which sol is before the one wanted for simulation start and has $99$ in the first column: such sols are the ones for which an initial starting file for the GCM is available. Then please set the number of GCM simulated days \ttt{nday} in \ttt{\$MOD/LMDZ.MARS/myGCM/run.def} accordingly: suppose the user you want to start a mesoscale simulation at sol~9 during 4~sols, then according to the \ttt{calendar} file, sol~8 is the closest file before sol~9 to be in the database, so \ttt{nday} must be at least~$5$. For optimal forcing at the boundaries, we advise you to write the meteorological fields to the \ttt{diagfi.nc} file at least each two hours, i.e. \ttt{ecritphy} is $40$ or~$80$ in \ttt{\$MOD/LMDZ.MARS/myGCM/run.def}. Eventually the GCM run can be launched using the following commands and should produce a netCDF data file named \ttt{diagfi.nc}\footnote{If the fields \ttt{emis}, \ttt{co2ice}, \ttt{q01}, \ttt{q02}, \ttt{tsoil} are missing in the \ttt{diagfi.nc} file, they are replaced by respective default values $0.95$, $0$, $0$, $0$, tsurf in the end of preprocessing step 1.}:
-
-\begin{verbatim}
-cd $MOD/LMDZ.MARS/myGCM
+Here we assume that the user has chosen a given Martian sol or $L_s$ on which to start the mesoscale simulation. As already mentioned in section~\ref{namelist}, the file \ttt{\$MMM/SIMU/calendar} (or see appendix~\ref{calendar}) can help with this choice (sol$\rightarrow$$L_s$$\rightarrow$mesoscale date and vice-versa). In addition, the user has to check which sol before the one wanted for the simulation start has $99$ in the first column: such sols are the ones for which an initial starting file for the GCM is available. Then please set the number of GCM simulated days \ttt{nday} in \ttt{\$MESO/LMDZ.MARS/myGCM/run.def} accordingly: suppose you want to start a mesoscale simulation at sol~9 lasting 4~sols; then, according to the \ttt{calendar} file, sol~8 is the closest sol before sol~9 to be in the database, so \ttt{nday} must be at least~$5$. For optimal forcing at the boundaries, we advise you to write the meteorological fields to the \ttt{diagfi.nc} file at least every two hours, i.e. \ttt{ecritphy} is $40$ or~$80$ in \ttt{\$MESO/LMDZ.MARS/myGCM/run.def}. Finally, the GCM run can be launched using the following commands and should produce a netCDF data file named \ttt{diagfi.nc}\footnote{If the fields \ttt{emis}, \ttt{co2ice}, \ttt{q01}, \ttt{q02}, \ttt{tsoil} are missing in the \ttt{diagfi.nc} file, they are replaced by the respective default values $0.95$, $0$, $0$, $0$, \ttt{tsurf} at the end of preprocessing step 1.}:
+
+\begin{verbatim}
+cd $MESO/LMDZ.MARS/myGCM
 ./launch_gcm    ## answer: your desired starting sol for the simulations
 \end{verbatim}
@@ -127,5 +127,5 @@
 	%%
 	%Please deflate the archive and copy the \ttt{diagfi.nc} file
-	%in \ttt{\$MOD/TMPDIR/GCMINI}.
+	%in \ttt{\$MESO/TMPDIR/GCMINI}.
 	%%
 	%Such a file can then be used to define the initial
@@ -135,11 +135,13 @@
 \sk
 Once the GCM simulation is finished, the programs in the \ttt{PREP\_MARS} directory allow the user to convert the data from the NETCDF \ttt{diagfi.nc} file into separate binary datafiles for each date contained in \ttt{diagfi.nc}, according to the formatting needed by the preprocessing programs at step 2. These programs can be executed with the following commands; if everything went well with the conversion,
-the directory \ttt{\$MOD/TMPDIR/WPSFEED} should contain files named \ttt{LMD:}. 
-
-\begin{verbatim}
-cd $MOD/LMD_MM_MARS/your_install_dir/PREP\_MARS
+the directory \ttt{\$MESO/TMPDIR/WPSFEED} should contain files named \ttt{LMD:}. 
+
+\begin{verbatim}
+cd $MMM/your_install_dir/PREP_MARS
 echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
 ./readmeteo.exe < readmeteo.def
 \end{verbatim}
+
+%%% compile_and_exec ????
 
 \sk
@@ -178,5 +180,5 @@
 
 \sk
-\paragraph{Step 2b} Once the \ttt{geo\_em} file(s) are generated, the \ttt{metgrid.exe} program performs a similar horizontal interpolation of the meteorological fields to the mesoscale domain as the one performed by \ttt{geogrid.exe} for the surface data (interpolation options can be modified by advanced users in \ttt{metgrid/METGRID.TBL}). Then the program writes the results in \ttt{met\_em} files and also collects the static fields and domain parameters included in the \ttt{geo\_em} file(s). If everything went well with the commands below, the directory \ttt{\$MOD/TMPDIR/WRFFEED} should contain \ttt{met\_em.*} files.
+\paragraph{Step 2b} Once the \ttt{geo\_em} file(s) are generated, the \ttt{metgrid.exe} program performs a horizontal interpolation of the meteorological fields to the mesoscale domain, similar to the one performed by \ttt{geogrid.exe} for the surface data (interpolation options can be modified by advanced users in \ttt{metgrid/METGRID.TBL}). The program then writes the results in \ttt{met\_em} files and also collects the static fields and domain parameters included in the \ttt{geo\_em} file(s). If everything went well with the commands below, the directory \ttt{\$MESO/TMPDIR/WRFFEED} should contain \ttt{met\_em.*} files.
 
 \begin{verbatim}
@@ -193,6 +195,6 @@
 
 \begin{verbatim}
-cd $MOD/TESTCASE   ## or anywhere you would like to run the simulation
-ln -sf $MOD/TMPDIR/WRFFEED/met_em* .
+cd $MESO/TESTCASE   ## or anywhere you would like to run the simulation
+ln -sf $MESO/TMPDIR/WRFFEED/met_em* .
 ./real.exe
 \end{verbatim}
@@ -201,4 +203,5 @@
 The final message of the \ttt{real.exe} should claim the success of the processes and you are now ready to launch the integrations of the LMD Martian Mesoscale Model again with the \ttt{wrf.exe} command as in section \ref{sc:arsia}.
 
+\sk
 \begin{finger}
 \item \textbf{ When you modify either \ttt{namelist.wps} or \ttt{namelist.input}, make sure that the common parameters are exactly similar in both files (especially when running nested simulations) otherwise either \ttt{real.exe} or \ttt{wrf.exe} command will exit with an error message. Also, obviously the dates sent to \ttt{launch\_gcm} and written in both \ttt{namelist.input} and \ttt{namelist.wps} should be all the same. }
Index: trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual_txt.tex
===================================================================
--- trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual_txt.tex	(revision 228)
+++ trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual_txt.tex	(revision 230)
@@ -5,4 +5,8 @@
 
 \chapter{Starting simulations from scratch: a summary}
+
+This chapter is a quick guide to a complete model execution,
+given as a plain list of commands.
+
 
 \mk
@@ -15,4 +19,6 @@
 To be completed
 \end{remarque}
+
+talk about xeyes
 
 
