\chapter{Compiling the model and running a test case}
\vk
This chapter is also meant for first-time users of the LMD Martian Mesoscale Model. We describe how to compile the program and run a test case.
\mk
\section{Main compilation step}
\label{sc:makemeso}
\mk
In order to compile the model, execute the \ttt{makemeso} compilation script in the \ttt{LMD\_MM\_MARS}\linebreak directory
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./makemeso
\end{verbatim}
%
\marge and answer the questions about
\begin{asparaenum}[1.]%[\itshape Q1\upshape)]
\item compiler choice (and number of processors if using MPI)
\item number of grid points in longitude [61]
\item number of grid points in latitude [61]
\item number of vertical levels [61]
\item number of tracers [1]
\item number of domains [1]
\end{asparaenum}
%\mk
\begin{finger}
\item The first time you compile the model, you will probably wonder what to reply to questions $2$ to $6$ \ldots simply type the answers given in brackets to compile an executable suitable for the test case described below.
\item Suppose you compiled a version of the model for a given set of answers to questions $1$ to $6$ in order to run a specific simulation. If you would like to run another simulation in which at least one of these parameters changes, the model needs to be recompiled\footnote{This recompilation each time the number of grid points, tracers or domains is modified is imposed by the LMD physics code. The WRF dynamical core alone is much more flexible.} with \ttt{makemeso}.
\item When you use parallel computations, please bear in mind that with $2$ (resp. $4$, $6$, $8$, $16$) processors the whole domain is split into $2$ (resp. $2$, $3$, $4$, $4$) tiles in the latitude direction and $1$ (resp. $2$, $2$, $2$, $4$) tile in the longitude direction. Thus make sure that the number of grid points minus $1$ in each direction can be divided by the corresponding number of tiles. For instance, the default $61 \times 61$ grid is suitable for $8$ processors, since $60$ is divisible by both $4$ and $2$.
\item If you use grid nesting, note that no more than $4$ processors can be used.
\end{finger}
\mk
\marge \ttt{makemeso} is an automated script which performs the following series of tasks:
\begin{citemize}
\item determine whether the machine is 32- or 64-bit;
\item ask the user about the compilation settings;
\item create a corresponding directory \ttt{\$LMDMOD/LMD\_MM\_MARS/DIRCOMP};
\begin{finger}
\item For example, a \ttt{DIRCOMP} directory named \ttt{g95\_32\_single} is created if the user requested a \ttt{g95} compilation of the code for single-domain simulations on a 32-bit machine.
\end{finger}
\item generate with \ttt{copy\_model} a directory \ttt{DIRCOMP/WRFV2} containing links to the \ttt{SRC/WRFV2} sources;
\begin{finger}
\item This method ensures that any change to the model sources is propagated to all the \ttt{DIRCOMP} installation folders.
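\item A quick way to verify that the links were created (the \ttt{g95\_32\_single} directory below is only an example; adapt it to your own installation) is simply
\begin{verbatim}
ls -l $LMDMOD/LMD_MM_MARS/g95_32_single/WRFV2
\end{verbatim}
\marge which should list entries pointing back to \ttt{SRC/WRFV2}.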
\end{finger}
\item execute the WRF \ttt{configure} script with the correct option;
\item tweak the resulting \ttt{configure.wrf} file to include a link to the Martian physics;
\item calculate the total number of horizontal grid points handled by the LMD physics;
\item duplicate the LMD physics sources if nesting is activated;
\begin{finger}
\item The model presently supports 3 nests, but more nests can be included by adapting the following files:
\begin{verbatim}
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_inifis3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_physiq3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/duplicate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/generate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/makegcm*   ## search for 'nest'
\end{verbatim}%\pagebreak
\end{finger}
\item compile the LMD physical packages with the appropriate \ttt{makegcm} command and collect the compiled objects in the library \ttt{liblmd.a};
\begin{finger}
\item During this step, which can be a bit long (especially if you defined more than one domain), the \ttt{makemeso} script provides you with the full path to the text file \ttt{log\_compile\_phys}, in which you can check the compilation progress and possible errors.
%
At the end of the process, you will find an error message associated with the generation of the final executable.
%
Please do not pay attention to it: the compilation of the LMD sources is meant to generate a library of compiled objects called \ttt{liblmd.a}, not a program.
\end{finger}
\item compile the modified Martian ARW-WRF solver, including the \ttt{liblmd.a} library;
\begin{finger}
\item The first time the model is compiled, this step can be quite long.
%
The \ttt{makemeso} script provides you with a \ttt{log\_compile} text file where the progress of the compilation can be checked and a \ttt{log\_error} text file listing errors and warnings during compilation.
%
A list of warnings related to \ttt{grib} utilities (not used in the Martian model) may appear; they have no impact on the final executables.
\item The compilation with \ttt{g95} might be unsuccessful due to problems with files related to the terrestrial microphysics.
%
In that case, please type the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/SRC
tar xzvf g95.tar.gz
cp -f g95/WRFV2_g95_fix/* WRFV2/phys/
cd $LMDMOD/LMD_MM_MARS
\end{verbatim}
\marge then recompile the model with the \ttt{makemeso} command.
\end{finger}
\item rename the executables in agreement with the settings provided by the user.
\begin{finger}
\item If you answered the \ttt{makemeso} questions with the aforementioned bracketed parameters, you should find two executables in the \ttt{DIRCOMP} directory:
\begin{verbatim}
real_x61_y61_z61_d1_t1_p1.exe
wrf_x61_y61_z61_d1_t1_p1.exe
\end{verbatim}
%
The directory also contains a text file in which the answers to the questions are stored, which allows you to re-run the script without the ``questions to the user'' step:
\begin{verbatim}
./makemeso < makemeso_x61_y61_z61_d1_t1_p1
\end{verbatim}
\end{finger}
\end{citemize}
\mk
\section{Running a simple test case}
\label{sc:arsia}
\mk
We assume that you have successfully compiled the model as described in the previous section, using the bracketed answers to the \ttt{makemeso} questions.
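\mk
\marge Before going further, you can quickly check that the two executables are present, assuming the example \ttt{g95\_32\_single} installation directory (adapt the directory name to your own compiler and machine):
%
\begin{verbatim}
ls -l $LMDMOD/LMD_MM_MARS/g95_32_single/*_x61_y61_z61_d1_t1_p1.exe
\end{verbatim}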
\mk
\marge In order to test the compiled executables, a ready-to-use test case (with pre-generated initial and boundary conditions) is provided in the \ttt{LMD\_MM\_MARS\_TESTCASE.tar.gz} archive, which you can download at \url{http://www.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/LMD_MM_MARS_TESTCASE.tar.gz}.
%
This test case simulates the hydrostatic atmospheric flow around Arsia Mons during half a sol with constant thermal inertia, albedo and dust opacity.
\begin{finger}
\item Though the simulation reproduces some reasonable features of the mesoscale circulation around Arsia Mons (e.g. slope winds), it should not be used for scientific purposes: the number of grid points is insufficient for a single-domain simulation and the integration time is shorter than the necessary spin-up time.
\end{finger}
%\pagebreak
\marge To launch the test simulation, please type the following commands, replacing the \ttt{g95\_32\_single} directory with the corresponding value on your system:
%
\begin{verbatim}
cp LMD_MM_MARS_TESTCASE.tar.gz $LMDMOD/LMD_MM_MARS/
tar xzvf LMD_MM_MARS_TESTCASE.tar.gz
cd TESTCASE
ln -sf ../g95_32_single/wrf_x61_y61_z61_d1_t1_p1.exe wrf.exe
nohup wrf.exe > log_wrf &
\end{verbatim}
\begin{finger}
\item If you compiled the model using MPICH2, the commands to launch a simulation are slightly different:
%
\begin{verbatim}
[simulation on 2 processors on 1 machine]
mpd &                                  # first-time only (or after a reboot)
                                       # NB: may request the creation of a file .mpd.conf
mpirun -np 2 wrf.exe < /dev/null &     # NB: mpirun is only a link to mpiexec
tail -20 rsl.out.000?                  # to check the outputs
\end{verbatim}
\begin{verbatim}
[simulation on 16 processors on 4 connected machines]
echo barry.lmd.jussieu.fr > ~/mpd.hosts
echo white.lmd.jussieu.fr >> ~/mpd.hosts
echo loves.lmd.jussieu.fr >> ~/mpd.hosts
echo tapas.lmd.jussieu.fr >> ~/mpd.hosts
ssh barry.lmd.jussieu.fr               # make sure that ssh to other machines
                                       # is possible without authentication
mpdboot -f ~/mpd.hosts -n 4
mpdtrace
mpirun -l -np 16 wrf.exe < /dev/null & # NB: mpirun is only a link to mpiexec
tail -20 rsl.out.00??                  # to check the outputs
\end{verbatim}
\end{finger}
\mk
\chapter{Setting the simulation parameters}
\mk
In this chapter, we describe how to set the various parameters defining a given simulation.
%
As can be inferred from the contents of the \ttt{TESTCASE} directory, two parameter files are needed to run the model:
\begin{enumerate}
\item The parameters related to the dynamical part of the model are set in the file \ttt{namelist.input}, according to the ARW-WRF namelist formatting.
\item The parameters related to the physical part of the model are set in the file \ttt{callphys.def}, according to the LMD-MGCM formatting.
\end{enumerate}
\mk
\section{Dynamical settings}
\mk
\ttt{namelist.input} controls the behavior of the dynamical core in the LMD Martian Mesoscale Model.
%
Compared to the file ARW-WRF users are familiar with\footnote{A description of this file can be found in \ttt{SRC/WRFV2/run/README.namelist}.}, the \ttt{namelist.input} of the LMD Martian Mesoscale Model is much shorter.
%
The only mandatory parameters in this file are information on time control\footnote{More information on the adopted Martian calendar: \url{http://www-mars.lmd.jussieu.fr/mars/time/solar_longitude.html}} and domain definition.
\mk
\marge The minimal version of the \ttt{namelist.input} file corresponds to standard simulations with the model.
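\mk
\marge As a purely illustrative sketch of the format (the entries below are placeholders; the authoritative template \ttt{namelist.input\_full} is reproduced further below), such a minimal file only fills the \ttt{time\_control} and \ttt{domains} categories, following the Fortran namelist syntax inherited from ARW-WRF:
%
\begin{verbatim}
&time_control
 ...             start/end dates of the simulation (Martian calendar)
/
&domains
 ...             number of grid points, resolution, dynamical timestep
/
\end{verbatim}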
%
It is however possible to modify optional parameters if needed, as is the case in the \ttt{namelist.input} associated with the Arsia Mons test case (e.g. the parameter \ttt{non\_hydrostatic} is set to false to assume hydrostatic equilibrium, whereas standard simulations are non-hydrostatic).
\mk
\marge A detailed description of the \ttt{namelist.input} file is given below\footnote{You may find the corresponding file in \ttt{SIMU/namelist.input\_full}.}.
%
Comments on each of the parameters are provided, with the following labels:
\begin{citemize}
\item \ttt{(*)} denotes parameters not to be modified,
\item \ttt{(r)} indicates parameters whose modification requires recompiling the model,
\item \ttt{(n)} describes parameters involved when nested domains are defined,
\item \ttt{(p1)}, \ttt{(p2)}, \ttt{(p3)} indicate parameters whose modification requires a new processing of the initial and boundary conditions (see next chapter),
\item \ttt{(*d)} denotes dynamical parameters whose modification yields non-standard simulations -- please read \ttt{SRC/WRFV2/run/README.namelist} and use with caution.
\end{citemize}
%
If omitted, the optional parameters are set to the default values indicated below.\pagebreak
\centers{\ttt{-- file: namelist.input\_full --}}\codesource{namelist.input_full}\centers{\ttt{-- end file: namelist.input\_full --}}
\begin{finger}
\item Please pay attention to rigorous syntax while editing your personal \ttt{namelist.input} file, to avoid reading errors.
\item To modify the default values (or even add personal parameters) in the \ttt{namelist.input} file, edit the \ttt{SRC/WRFV2/Registry/Registry.EM} file.
%
You will then have to recompile the model with \ttt{makemeso}; answer \ttt{y} to the last question.
\end{finger}
\mk
\marge In case you run simulations with \ttt{max\_dom} nested domains, you have to set \ttt{max\_dom} parameters wherever there is a ``,'' in the above list.
%
Here is an example of the resulting syntax of the \ttt{time\_control}, \ttt{domains} and \ttt{bdy\_control} categories in \ttt{namelist.input}:
%
\codesource{OMG_namelist.input}
\section{Physical settings}
\mk
\ttt{callphys.def} controls the behavior of the physical parameterizations in the LMD Martian\linebreak Mesoscale Model.
%
The organization of this file is identical to that of the corresponding file in the LMD Martian GCM, whose user manual can be found at \url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}.
\mk
\marge Please find below the contents of \ttt{callphys.def}:
%
\centers{\ttt{-- file: callphys.def --}}\codesource{callphys.def}\centers{\ttt{-- end file: callphys.def --}}
\mk
\begin{finger}
\item Note that in the given example the convective adjustment, the gravity wave parameterization, and the NLTE schemes are turned off, as is usually the case in typical Martian tropospheric mesoscale simulations.
\item \ttt{iradia} sets the frequency (in dynamical timesteps) at which the radiative computations are performed.
\item Modifying \ttt{callphys.def} requires recompiling the model only if the number of tracers is changed.
\item If you run a simulation with, say, $3$ domains, please ensure that you have defined three files: \ttt{callphys.def}, \ttt{callphys\_d2.def} and \ttt{callphys\_d3.def}.
\end{finger}
\mk
\chapter{Preprocessing utilities}
\mk
In the previous chapter, we described the simulation settings in the \ttt{namelist.input} file.
%
We saw that any modification of the parameters labelled with \ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)} requires the initial and boundary conditions and/or the domain definition to be recomputed before running the model again.
%
As a result, you were probably unable to change many of the parameters of the Arsia Mons test case (proposed in section \ref{sc:arsia}), in which the initial and boundary conditions -- as well as the simulation domain -- were predefined.
\mk
\marge In this chapter, we describe the installation and use of the preprocessing tools to define the simulation domain, calculate an initial atmospheric state and prepare the boundary conditions for the chosen simulation time.
%
This necessary step will allow you to run your own simulations at the specific season and region you are interested in, with full ability to modify any of the parameters in \ttt{namelist.input}.
\mk
\section{Installing the preprocessing utilities}
\mk
First and foremost, since the preprocessing utilities can generate (or involve) files of significant size, it is necessary to define a directory where these files will be stored.
%
Such a directory (e.g. \ttt{/bigdisk/user}) must be linked as follows
%
\begin{verbatim}
ln -sf /bigdisk/user $LMDMOD/TMPDIR
\end{verbatim}
\mk
\marge A second prerequisite to the installation of the preprocessing tools is that the LMD Martian Mesoscale Model has been compiled at least once.
%
If this is not the case, please compile the model with the \ttt{makemeso} command (see section \ref{sc:makemeso}).
\mk
\marge The compilation process created an installation directory adapted to your particular choice of compiler$+$machine.
%
The preprocessing tools will also be installed in this directory.
%
Please type the following commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/g95_32_single/   ## or any install directory
ln -sf ../prepare_ini .
./prepare_ini
\end{verbatim}
\mk
\marge The script \ttt{prepare\_ini} plays the same role for the preprocessing tools as \ttt{copy\_model} does for the model sources: files are simply linked to their actual location in the \ttt{SRC} folder.
%
Once you have executed \ttt{prepare\_ini}, please check that two folders were generated: \ttt{PREP\_MARS} and \ttt{WPS}.
\mk
\marge In the \ttt{PREP\_MARS} directory, please compile the programs \ttt{create\_readmeteo.exe} and \ttt{readmeteo.exe}, using the compiler mentioned in the name of the current installation directory:
%
\begin{verbatim}
echo $PWD
cd PREP_MARS/
./compile [or] ./compile_g95
ls -lt create_readmeteo.exe readmeteo.exe
cd ..
\end{verbatim}
\mk
\marge In the \ttt{WPS} directory, please compile the programs \ttt{geogrid.exe} and \ttt{metgrid.exe}:
\begin{verbatim}
cd WPS/
./configure   ## select your compiler + 'NO GRIB2' option
./compile
ls -lt geogrid.exe metgrid.exe
\end{verbatim}
\mk
\marge Apart from the executables you just compiled, the preprocessing utilities include \ttt{real.exe}, which was compiled by the \ttt{makemeso} script along with the mesoscale model executable \ttt{wrf.exe}.
%
\ttt{real.exe} should be copied or linked in the simulation directory (e.g. \ttt{TESTCASE} for the Arsia Mons test case) so that it lies at the same level as \ttt{namelist.input}.
\begin{finger}
\item Even though the executable is named e.g. \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, this program is not tied to the specific \ttt{makemeso} parameters -- contrary to the \ttt{wrf.exe} executable.
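For instance, assuming the \ttt{g95\_32\_single} installation directory and the \ttt{TESTCASE} simulation directory (adapt the paths to your own setup), the link could be created as follows:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/TESTCASE
ln -sf ../g95_32_single/real_x61_y61_z61_d1_t1_p1.exe real.exe
\end{verbatim}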
%
We simply found that renaming the executable \ttt{real.exe} (its content being possibly identical between installations if the model sources were not modified) is a practical way to avoid confusion between executables compiled at different times.
\end{finger}
\mk
\section{Running the preprocessing utilities}
\mk
When you run a simulation with \ttt{wrf.exe}, the program attempts to read the initial state in the files \ttt{wrfinput\_d01}, \ttt{wrfinput\_d02}, \ldots (one file per domain) and the parent domain boundary conditions in \ttt{wrfbdy\_d01}.
%
The whole chain of data conversion and interpolation needed to generate those files is summarized in the diagram on the next page.
%
Three distinct preprocessing steps are necessary to generate the final files.
%
As described in the previous chapter, some modifications of the \ttt{namelist.input} file [e.g. start/end dates, labelled with \ttt{(p1)}] require a complete reprocessing from step $1$ to step $3$ before the simulation can be launched again, whereas other changes [e.g. model top, labelled with \ttt{(p3)}] only require a quick reprocessing at step $3$, keeping the files generated at the end of step $2$.
\mk
\subsection{Input data}
\mk
\subsubsection{Static data}
\mk
All the static data (topography, thermal inertia, albedo) needed to initialize the model are included in the \ttt{\$LMDMOD/LMD\_MM\_MARS/WPS\_GEOG} directory.
%
By default, only coarse-resolution datasets\footnote{Corresponding to the fields stored in the file \ttt{surface.nc} known to LMD-MGCM users: \url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc}} are available, but the directory also contains sources and scripts to install finer-resolution datasets:
\begin{citemize}
\item 32 and/or 64 pixel-per-degree (ppd) MOLA topography [\textit{Smith et al.}, 2001]\nocite{Smit:01mola},
\item 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo [\textit{Christensen et al.}, 2001]\nocite{Chri:01},
\item 20 ppd TES thermal inertia [\textit{Putzig and Mellon}, 2007]\nocite{Putz:07}.
\end{citemize}
\pagebreak
\includepdf[pages=1,offset=25mm -20mm]{diagramme.pdf}
\mk
\marge The role of the \ttt{build\_static} script is to automatically download these datasets from the web (namely the PDS archives) and convert them to a format suitable for later use by the preprocessing utilities:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./build_static
\end{verbatim}
%
\begin{finger}
\item Please install the free software \ttt{octave}\footnote{Available at \url{http://www.gnu.org/software/octave}} on your system to be able to use the \ttt{build\_static} script.
%
Another solution is to browse into each of the directories contained within \ttt{WPS\_GEOG}, download the data with the shell scripts and execute the \ttt{.m} scripts with either \ttt{octave} or the commercial software \ttt{matlab} (just replace \ttt{\#} by \ttt{\%}).
%
\item If you do not manage to execute the \ttt{build\_static} script, converted ready-to-use datafiles are available upon request.
%
\item Building the MOLA 64 ppd topographical database can take quite a long time, so this step is not performed by default by the \ttt{build\_static} script. If you would like to build this database, please remove the \ttt{exit} command in the script, just above the commands related to the MOLA 64 ppd data.
%
\item The resulting \ttt{WPS\_GEOG} directory can reach a size of several hundred MB.
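A quick way to check how much disk space it uses, assuming the default location, is:
\begin{verbatim}
du -sh $LMDMOD/LMD_MM_MARS/WPS_GEOG
\end{verbatim}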
%
You might move this folder to a place with more disk space available, but then be sure to create in \ttt{\$LMDMOD/LMD\_MM\_MARS} a link to the new location of the directory.
\end{finger}
\mk
\subsubsection{Meteorological data}
\mk
The preprocessing tools generate initial and boundary conditions from the \ttt{diagfi.nc} outputs of LMD-MGCM simulations.
%
If you would like to run a mesoscale simulation at a given season, you first need to run a GCM simulation and output the meteorological fields at the considered season.
%
For optimal forcing at the boundaries, we advise you to write the meteorological fields to the \ttt{diagfi.nc} file at least every two hours.
%
Please also make sure that the following fields are stored in the NETCDF \ttt{diagfi.nc} file:
\footnotesize
\codesource{contents_diagfi}
\normalsize
\begin{finger}
\item If the fields \ttt{emis}, \ttt{co2ice}, \ttt{q01}, \ttt{q02}, \ttt{tsoil} are missing in the \ttt{diagfi.nc} file, they are replaced by the default values $0.95$, $0$, $0$, $0$ and \ttt{tsurf}, respectively.
\end{finger}
\mk
\marge An example input meteorological file \ttt{diagfi.nc} can be downloaded at \url{http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz}.
%
Please extract the archive and copy the \ttt{diagfi.nc} file into \ttt{\$LMDMOD/TMPDIR/GCMINI}.
%
Such a file can then be used to define the initial and boundary conditions, and we will go through the three preprocessing steps.
\mk
\subsection{Preprocessing steps}
\mk
\subsubsection{Step 1: Converting GCM data}
\mk
The programs in the \ttt{PREP\_MARS} directory convert the data from the NETCDF \ttt{diagfi.nc} file into separate binary datafiles for each date contained in \ttt{diagfi.nc}, in the format needed by the preprocessing programs at step 2.
%
These programs can be executed with the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/PREP_MARS
echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
./readmeteo.exe < readmeteo.def
\end{verbatim}
%
\marge If everything went well with the conversion, the directory \ttt{\$LMDMOD/TMPDIR/WPSFEED} should contain files whose names begin with \ttt{LMD:}.
\mk
\subsubsection{Step 2: Interpolation on the regional domain}
\mk
In the \ttt{WPS} directory, the \ttt{geogrid.exe} program allows you to define the mesoscale simulation domain, to horizontally interpolate the topography, thermal inertia and albedo fields to the domain resolution, and to calculate useful fields such as topographical slopes.%\pagebreak
\mk
\marge Please execute the commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
ln -sf ../../TESTCASE/namelist.wps .   # test case
./geogrid.exe
\end{verbatim}
%
\marge The result of \ttt{geogrid.exe} -- and thus the definition of the mesoscale domain -- can be checked in the NETCDF file \ttt{geo\_em.d01.nc}.
%
A quick check can be performed with the command
\begin{verbatim}
ncview geo_em.d01.nc
\end{verbatim}
\marge if \ttt{ncview} is installed, or with the \ttt{IDL} script \ttt{out\_geo.pro}
\begin{verbatim}
idl
IDL> out_geo, field1='TOPO'
IDL> out_geo, field1='TI'
IDL> SPAWN, 'ghostview geo_em.d01_HGT_M.ps &'
IDL> SPAWN, 'ghostview geo_em.d01_THERMAL_INERTIA.ps &'
IDL> exit
\end{verbatim}
\marge if the demo version of \ttt{IDL} is installed.
%
Of course, if your favorite graphical tool supports the NETCDF standard, you might use it to check the domain definition in \ttt{geo\_em.d01.nc}.
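\mk
\marge If no graphical tool is at hand, a minimal check of the domain dimensions and global attributes can be done with \ttt{ncdump}, a command-line utility distributed with the standard NETCDF library (assuming it is installed on your system):
%
\begin{verbatim}
ncdump -h geo_em.d01.nc
\end{verbatim}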
\mk
\marge If you are unhappy with the results, or if you want to change the location of the mesoscale domain on the planet, the horizontal resolution, the number of grid points \ldots, please modify the parameter file \ttt{namelist.wps} and execute \ttt{geogrid.exe} again.
%
Here are the contents of \ttt{namelist.wps}:
%
\codesource{namelist.wps_TEST}
\begin{finger}
%
\item No input meteorological data are actually needed to execute \ttt{geogrid.exe}.
%
\item More details about the database and more interpolation options can be found in the file \ttt{geogrid/GEOGRID.TBL}.
%
\item Defining several domains yields distinct files \ttt{geo\_em.d01.nc}, \ttt{geo\_em.d02.nc}, \ttt{geo\_em.d03.nc}\ldots
\end{finger}
\mk
\marge Once the \ttt{geo\_em} file(s) are generated, the \ttt{metgrid.exe} program horizontally interpolates the meteorological fields to the mesoscale domain, in the same way as \ttt{geogrid.exe} does for the surface data.
%
The program then writes the results in \ttt{met\_em} files and also collects the static fields and domain parameters included in the \ttt{geo\_em} file(s).
%
Please type the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
./metgrid.exe
\end{verbatim}
%
\marge If everything went well, the directory \ttt{\$LMDMOD/TMPDIR/WRFFEED} should contain the \ttt{met\_em.*} files.
\mk
\subsubsection{Step 3: Vertical interpolation on mesoscale levels}
\mk
\marge The last step is to execute \ttt{real.exe} to perform the interpolation from the vertical levels of the GCM to the vertical levels defined in the mesoscale model.
%
This program also prepares the final initial state for the simulation in files called \ttt{wrfinput} and the boundary conditions in files called \ttt{wrfbdy}.
\mk
\marge To successfully execute \ttt{real.exe}, you need the \ttt{met\_em.*} files and the \ttt{namelist.input} file to be in the same directory as \ttt{real.exe}.
%
Parameters in \ttt{namelist.input} controlling the behavior of the vertical interpolation are those labelled with \ttt{(p3)} in the detailed list introduced in the previous chapter.
\mk
\marge Please type the following commands to prepare files for the Arsia Mons test case (or your personal test case if you changed the parameters in \ttt{namelist.wps}):
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/TESTCASE
ln -sf $LMDMOD/TMPDIR/WRFFEED/met_em* .
./real.exe
\end{verbatim}
\mk
\marge The final message from \ttt{real.exe} should confirm the success of the process, and you are now ready to launch the integrations of the LMD Martian Mesoscale Model with the \ttt{wrf.exe} command, as in section \ref{sc:arsia}.
\begin{finger}
\item When you modify either \ttt{namelist.wps} or \ttt{namelist.input}, make sure that the common parameters are exactly the same in both files (especially when running nested simulations); otherwise the \ttt{real.exe} or \ttt{wrf.exe} commands will exit with an error message.
\end{finger}
%\pagebreak
\chapter{Starting simulations from scratch}
\mk
\section{Running your own GCM simulations}
\begin{remarque}
To be completed.
\end{remarque}
\mk
\section{Complete simulations with \ttt{runmeso}}
\begin{remarque}
To be completed.
\end{remarque}
\chapter{Outputs}
\mk
\section{Postprocessing utilities and graphics}
\begin{remarque}
To be completed. Do-it-all \ttt{idl} scripts would be described here!
\end{remarque}
\mk
\section{Modify the outputs}
\begin{remarque}
To be completed.
Though the method is different, we kept all the convenient aspects of \ttt{writediagfi}.
\end{remarque}
\chapter{Frequently Asked Questions}
\begin{finger}
\item Which timestep should I choose to avoid crashes of the model?
\item In Martian simulations, why can't I define boundary conditions every 6 hours as on Earth?
\item Help! I get strange assembler errors or ILM errors while compiling!
\item Is it possible to run the model on a specific configuration that is not supported?
\item Why do I have to define four fewer rows in the parent domain when performing nested runs?
\item I am kind of nostalgic for early/middle Mars. How could I run mesoscale simulations at low/high obliquity?
\item Why does \ttt{real.exe} crash when the model top pressure is lower than $2$~Pa?
\item Can I use two-way nesting?
\end{finger}
\begin{remarque}
To be completed.
\end{remarque}