\chapter{Compiling the model and running a test case}

\vk
This chapter is also meant for first-time users of the LMD Martian Mesoscale Model. We describe how to compile the program and run a test case.

\mk
\section{Main compilation step}
\label{sc:makemeso}

\mk
In order to compile the model, execute the \ttt{makemeso} compilation script
in the \ttt{LMD\_MM\_MARS}\linebreak directory:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./makemeso
\end{verbatim}
%
\marge and answer the questions about
\begin{asparaenum}[1.]%[\itshape Q1\upshape)]
\item compiler choice (and number of processors if using MPI)
\item number of grid points in longitude [61]
\item number of grid points in latitude [61]
\item number of vertical levels [61]
\item number of tracers [1]
\item number of domains [1]
\end{asparaenum}

%\mk
\begin{finger}
\item The first time you compile the model, you will probably wonder what to reply
to questions $2$ to $6$ \ldots type the answers given in brackets to compile an executable suitable
for the test case given below.
\item Suppose you compiled a version of the model for a given set of parameters $1$ to $6$
to run a specific simulation.
If you would like to run another simulation
with at least one of parameters $1$ to $6$
subject to change, the model needs to be recompiled\footnote{This
necessary recompilation each time the number of grid points,
tracers and domains is modified is imposed by the LMD physics code.
The WRF dynamical core alone is much more flexible.} with \ttt{makemeso}.
\item When you use parallel computations, please bear in mind that with
$2$ (resp. $4$, $6$, $8$, $16$) processors the whole domain is separated
into $2$ (resp. $2$, $3$, $4$, $4$) tiles over
the latitude direction and $1$ (resp. $2$, $2$, $2$, $4$) tile(s) over the longitude direction.
Thus make sure that the number of grid points minus $1$ in each direction
can be divided by the aforementioned number of tiles over the considered
direction.
\item If you use grid nesting, note that no more than $4$ processors can be used.
\end{finger}
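
\mk
\marge As a worked example of the tiling rule above (assuming the default $61 \times 61$ grid):
with $8$ processors, the domain is split into $4$ tiles over the latitude direction and
$2$ tiles over the longitude direction; since $61 - 1 = 60$ is divisible by both $4$ and $2$,
the default grid is suitable for an $8$-processor run.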

\mk
\marge \ttt{makemeso} is an automated script which performs
the following series of tasks:
\begin{citemize}
\item determine if the machine is 32 or 64 bits;
\item ask the user about the compilation settings;
\item create a corresponding directory \ttt{\$LMDMOD/LMD\_MM\_MARS/DIRCOMP};
\begin{finger}
\item For example, a \ttt{DIRCOMP} directory named \ttt{g95\_32\_single}
is created if the user requested
a \ttt{g95} compilation of the code for single-domain simulations
on a 32-bit machine.
\end{finger}
\item generate with \ttt{copy\_model} a directory \ttt{DIRCOMP/WRFV2} containing links to the \ttt{SRC/WRFV2} sources;
\begin{finger}
\item This method ensures that any change to the model sources is
propagated to all the different \ttt{DIRCOMP} installation folders.
\end{finger}
\item execute the WRF \ttt{configure} script with the correct option;
\item tweak the resulting \ttt{configure.wrf} file to include a link towards the Martian physics;
\item calculate the total number of horizontal grid points handled by the LMD physics;
\item duplicate the LMD physical sources if nesting is activated;
\begin{finger}
\item The model presently supports 3 nests, but more nests
can be included by adapting the following files:
\begin{verbatim}
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_inifis3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_physiq3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/duplicate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/generate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/makegcm*  ## search for 'nest'
\end{verbatim}%\pagebreak
\end{finger}
\item compile the LMD physical packages with the appropriate \ttt{makegcm} command
and collect the compiled objects in the library \ttt{liblmd.a};
\begin{finger}
\item During this step, which can be a bit long,
especially if you defined more than one domain,
the \ttt{makemeso} script provides you with the full path to
the text file \ttt{log\_compile\_phys} in which you can check the
compilation progress and possible errors.
%
At the end of the process, you will find an
error message associated with the generation of the
final executable.
%
Please do not pay attention to it: the compilation of the LMD
sources is meant to generate a library of
compiled objects called \ttt{liblmd.a}, not a program.
\end{finger}
\item compile the modified Martian ARW-WRF solver, including
the \ttt{liblmd.a} library;
\begin{finger}
\item The first time the model is compiled, this
step can be quite long.
%
The \ttt{makemeso} script provides you with a \ttt{log\_compile}
text file where the progress of the compilation can be checked
and a \ttt{log\_error} text file listing errors and warnings
during compilation.
%
A list of warnings related to \ttt{grib}
utilities (not used in the Martian model)
may appear; these have no impact on the
final executables.
\item The compilation with \ttt{g95} might be unsuccessful
due to some problems with files related to terrestrial microphysics.
%
Please type the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/SRC
tar xzvf g95.tar.gz
cp -f g95/WRFV2_g95_fix/* WRFV2/phys/
cd $LMDMOD/LMD_MM_MARS
\end{verbatim}
\marge then recompile the model with the \ttt{makemeso} command.
\end{finger}
\item rename the executables in agreement with the
settings provided by the user.
\begin{finger}
\item If you answered the \ttt{makemeso} questions with the
aforementioned bracketed values, you should find in the
\ttt{DIRCOMP} directory two executables:
\begin{verbatim}
real_x61_y61_z61_d1_t1_p1.exe
wrf_x61_y61_z61_d1_t1_p1.exe
\end{verbatim}
%
The directory also contains a text file
in which the answers to the questions are stored, which
allows you to re-run the script without the
``questions to the user'' step:
\begin{verbatim}
./makemeso < makemeso_x61_y61_z61_d1_t1_p1
\end{verbatim}
\end{finger}
\end{citemize}

\mk
\section{Running a simple test case}
\label{sc:arsia}

\mk
We suppose that you have successfully compiled
the model as described in the previous section,
using the bracketed answers
to the \ttt{makemeso} questions.

\mk
\marge In order to test the compiled executables,
a ready-to-use test case
(with pre-generated initial and boundary
conditions) is proposed
in the \ttt{LMD\_MM\_MARS\_TESTCASE.tar.gz}
archive you can download at
\url{http://www.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/LMD_MM_MARS_TESTCASE.tar.gz}.
%
This test case simulates the hydrostatic
atmospheric flow around Arsia Mons during half a sol
with constant thermal inertia, albedo
and dust opacity.

\begin{finger}
\item Though the simulation reproduces some reasonable
features of the mesoscale circulation around Arsia
Mons (e.g. slope winds), it should not be used
for scientific purposes: the number of grid points
is insufficient for a single-domain simulation
and the integration time is shorter than the necessary spin-up time.
\end{finger}
%\pagebreak

\marge To launch the test simulation, please type
the following commands, replacing the
\ttt{g95\_32\_single} directory with its corresponding
value on your system:
%
\begin{verbatim}
cp LMD_MM_MARS_TESTCASE.tar.gz $LMDMOD/LMD_MM_MARS/
cd $LMDMOD/LMD_MM_MARS
tar xzvf LMD_MM_MARS_TESTCASE.tar.gz
cd TESTCASE
ln -sf ../g95_32_single/wrf_x61_y61_z61_d1_t1_p1.exe wrf.exe
nohup wrf.exe > log_wrf &
\end{verbatim}
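
\mk
\marge While the model is running, you can monitor the integration log and the
output files being written (a suggested check; with default settings the
output files follow the usual WRF naming \ttt{wrfout*}):
\begin{verbatim}
tail -f log_wrf     # follow the integration log
ls -lt wrfout*      # list the output files being written
\end{verbatim}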

\begin{finger}
\item If you compiled the model using MPICH2,
the commands to launch a simulation are slightly different:
%
\begin{verbatim}
[simulation on 2 processors on 1 machine]
mpd &      # first-time only (or after a reboot)
           # NB: may request the creation of a file .mpd.conf
mpirun -np 2 wrf.exe < /dev/null &      # NB: mpirun is only a link to mpiexec
tail -20 rsl.out.000?     # to check the outputs
\end{verbatim}
\begin{verbatim}
[simulation on 16 processors on 4 connected machines]
echo barry.lmd.jussieu.fr > ~/mpd.hosts
echo white.lmd.jussieu.fr >> ~/mpd.hosts
echo loves.lmd.jussieu.fr >> ~/mpd.hosts
echo tapas.lmd.jussieu.fr >> ~/mpd.hosts
ssh barry.lmd.jussieu.fr   # make sure that ssh to the other machines
                           # is possible without authentication
mpdboot -f ~/mpd.hosts -n 4
mpdtrace
mpirun -l -np 16 wrf.exe < /dev/null &   # NB: mpirun is only a link to mpiexec
tail -20 rsl.out.00??     # to check the outputs
\end{verbatim}
\end{finger}


\mk
\chapter{Setting the simulation parameters}

\mk
In this chapter, we describe how to set the various parameters
defining a given simulation.
%
As can be inferred from the contents of the \ttt{TESTCASE} directory,
two parameter files are needed to run the model:
\begin{enumerate}
\item The parameters related to the dynamical part of the model are set
in the file \ttt{namelist.input} according to the ARW-WRF namelist formatting.
\item The parameters related to the physical part of the model are set
in the file \ttt{callphys.def} according to the LMD-MGCM formatting.
\end{enumerate}

\mk
\section{Dynamical settings}

\mk
\ttt{namelist.input} controls the behavior of the dynamical core
in the LMD Martian Mesoscale Model.
%
Compared to the file ARW-WRF users are familiar with\footnote{
%%%
A description of this file can be found in \ttt{SRC/WRFV2/run/README.namelist}.
%%%
}, the \ttt{namelist.input} in the LMD Martian Mesoscale Model
is much shorter.
%
The only mandatory parameters in this file
are information on time control\footnote{
%%%
More information on the adopted Martian calendar:
\url{http://www-mars.lmd.jussieu.fr/mars/time/solar_longitude.html}
%%%
} and domain definition.

\mk
\marge The minimal version of the \ttt{namelist.input}
file corresponds to standard simulations with the model.
%
It is however possible to modify optional parameters
if needed, as is the case in the \ttt{namelist.input}
associated with the Arsia Mons test case
(e.g. the parameter \ttt{non\_hydrostatic} is set to false
to assume hydrostatic equilibrium, whereas standard
simulations are non-hydrostatic).

\mk
\marge A detailed description of the \ttt{namelist.input} file is given below\footnote{
%%%
You may find the corresponding file in \ttt{SIMU/namelist.input\_full}.
%%%
}.
%
Comments on each of the parameters are provided,
with the following labels:
\begin{citemize}
\item \ttt{(*)} denotes parameters not to be modified,
\item \ttt{(r)} indicates parameters whose modification requires a new compilation of the model,
\item \ttt{(n)} describes parameters involved when nested domains are defined,
\item \ttt{(p1)}, \ttt{(p2)}, \ttt{(p3)} mention parameters whose modification requires a new processing
of initial and boundary conditions (see next chapter),
\item \ttt{(*d)} denotes dynamical parameters whose modification implies
non-standard simulations -- please read \ttt{SRC/WRFV2/run/README.namelist}
and use with caution.
\end{citemize}
%
If omitted, the optional parameters are set to the default
values indicated below.\pagebreak

\centers{\ttt{-- file: namelist.input\_full --}}\codesource{namelist.input_full}\centers{\ttt{-- end file: namelist.input\_full --}}

\begin{finger}
\item Please pay attention to rigorous syntax while
editing your personal \ttt{namelist.input} file
to avoid reading errors.
\item To modify the default values (or even add
personal parameters) in the \ttt{namelist.input} file,
edit the \ttt{SRC/WRFV2/Registry/Registry.EM} file.
%
You will then have to recompile the model with \ttt{makemeso};
answer \ttt{y} to the last question.
\end{finger}
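
\mk
\marge As an illustration (a hypothetical, simplified entry; the exact field
list is documented in the comments of \ttt{Registry.EM} itself), namelist
parameters are declared in the Registry through \ttt{rconfig} lines giving the
type, the name, the namelist section, the number of entries (\ttt{1} or
\ttt{max\_domains}) and the default value:
\begin{verbatim}
# type      name          where set          entries   default
rconfig   integer   my_new_flag   namelist,domains   1    0
\end{verbatim}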

\mk
\marge In case you run simulations with \ttt{max\_dom}
nested domains, you have to set \ttt{max\_dom} parameters
wherever there is a ``,'' in the above list.
%
Here is an example of the resulting syntax of the
\ttt{time\_control}, \ttt{domains} and \ttt{bdy\_control}
categories in \ttt{namelist.input}:
%
\codesource{OMG_namelist.input}
\section{Physical settings}

\mk
\ttt{callphys.def} controls the behavior of the physical parameterizations
in the LMD Martian\linebreak Mesoscale Model.
%
The organization of this file is identical
to that of the corresponding file in the LMD Martian GCM, whose
user manual can be found at
\url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}.

\mk
\marge Please find below the contents of \ttt{callphys.def}:
%
\centers{\ttt{-- file: callphys.def --}}\codesource{callphys.def}\centers{\ttt{-- end file: callphys.def --}}

\mk
\begin{finger}
\item Note that in the given example
the convective adjustment,
the gravity wave parameterization,
and the NLTE schemes are turned off, as is
usually the case in typical Martian tropospheric
mesoscale simulations.
\item \ttt{iradia} sets the frequency
(in dynamical timesteps) at which
the radiative computations are performed.
\item Modifying \ttt{callphys.def} only implies
recompiling the model if the number of tracers is different.
\item If you run a simulation with, say, $3$ domains,
please ensure that you defined the three files
\ttt{callphys.def}, \ttt{callphys\_d2.def} and \ttt{callphys\_d3.def},
as shown in the sketch below.
\end{finger}
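
\mk
\marge A minimal way to generate the per-domain files (assuming the physical
settings are meant to be the same in all domains) is simply to duplicate the
parent file:
\begin{verbatim}
cp callphys.def callphys_d2.def
cp callphys.def callphys_d3.def
\end{verbatim}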

\mk
\chapter{Preprocessing utilities}

\mk
In the previous chapter, we described the simulation settings
in the \ttt{namelist.input} file.
%
We saw that any modification of the parameters
labelled with \ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)}
requires the initial and boundary conditions
and/or the domain definition to be recomputed prior to running the model again.
%
As a result, you were probably unable to change many of the parameters
of the Arsia Mons test case (proposed in section \ref{sc:arsia}) in which
the initial and boundary conditions -- as well as the simulation
domain -- were predefined.

\mk
\marge In this chapter, we describe the installation and use of the preprocessing tools to
define the simulation domain, calculate an initial atmospheric state
and prepare the boundary conditions for the chosen simulation time.
%
This necessary step will eventually allow you to run your own simulations at the specific season and region
you are interested in, with complete freedom to modify any of the parameters in \ttt{namelist.input}.

\mk
\section{Installing the preprocessing utilities}

\mk
First and foremost, since the preprocessing utilities may generate
(or involve) quite large files, it is necessary
to define a directory where these files will be stored.
%
Such a directory (e.g. \ttt{/bigdisk/user}) must be linked as follows:
%
\begin{verbatim}
ln -sf /bigdisk/user $LMDMOD/TMPDIR
\end{verbatim}
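
\mk
\marge The preprocessing steps described below expect several subdirectories
under \ttt{\$LMDMOD/TMPDIR}; if they do not already exist on your system, a
reasonable precaution (an assumption based on the paths used in the rest of
this chapter) is to create them now:
\begin{verbatim}
mkdir -p $LMDMOD/TMPDIR/GCMINI $LMDMOD/TMPDIR/WPSFEED $LMDMOD/TMPDIR/WRFFEED
\end{verbatim}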

\mk
\marge A second prerequisite to the installation of the preprocessing tools is that the LMD Martian
Mesoscale Model was compiled at least once.
%
If this is not the case, please compile
the model with the \ttt{makemeso} command
(see section \ref{sc:makemeso}).

\mk
\marge The compilation process created an
installation directory adapted to your
particular choice of compiler$+$machine.
%
The preprocessing tools will also
be installed in this directory.
%
Please type the following commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/g95_32_single/   ## or any install directory
ln -sf ../prepare_ini .
./prepare_ini
\end{verbatim}

\mk
\marge The script \ttt{prepare\_ini} plays, for the preprocessing tools,
a role equivalent to that of \ttt{copy\_model} for the model sources:
files are simply linked to their actual location in the \ttt{SRC} folder.
%
Once you have executed \ttt{prepare\_ini}, please check that
two folders were generated: \ttt{PREP\_MARS} and \ttt{WPS}.

\mk
\marge In the \ttt{PREP\_MARS} directory, please compile
the programs \ttt{create\_readmeteo.exe} and \ttt{readmeteo.exe},
using the compiler mentioned in the name of the current
installation directory:
%
\begin{verbatim}
echo $PWD
cd PREP_MARS/
./compile [or] ./compile_g95
ls -lt create_readmeteo.exe readmeteo.exe
cd ..
\end{verbatim}

\mk
\marge In the \ttt{WPS} directory, please compile
the programs \ttt{geogrid.exe} and \ttt{metgrid.exe}:
\begin{verbatim}
cd WPS/
./configure   ## select your compiler + 'NO GRIB2' option
./compile
ls -lt geogrid.exe metgrid.exe
\end{verbatim}

\mk
\marge Apart from the executables you just compiled,
the preprocessing utilities include \ttt{real.exe},
which was compiled by the \ttt{makemeso} script
along with the mesoscale model executable \ttt{wrf.exe}.
%
\ttt{real.exe} should be copied or linked into the
simulation directory (e.g. \ttt{TESTCASE} for the
Arsia Mons test case) so that it lies at the same level as
\ttt{namelist.input}, as sketched below.

\begin{finger}
\item Even though the executable is named
e.g. \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, this program
is not tied to the specific \ttt{makemeso}
parameters -- contrary to the \ttt{wrf.exe} executable.
%
We just found that renaming the (possibly identical,
if the model sources were not modified)
\ttt{real.exe} was a practical way to avoid confusion
between executables compiled at different times.
\end{finger}
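
\mk
\marge For instance, for the Arsia Mons test case compiled with the bracketed
default answers (paths to be adapted to your own installation directory):
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/TESTCASE
ln -sf ../g95_32_single/real_x61_y61_z61_d1_t1_p1.exe real.exe
\end{verbatim}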

\mk
\section{Running the preprocessing utilities}

\mk
When you run a simulation with \ttt{wrf.exe},
the program attempts to read the initial state
from the files
\ttt{wrfinput\_d01},
\ttt{wrfinput\_d02}, \ldots
(one file per domain)
and the parent domain boundary conditions
from \ttt{wrfbdy\_d01}.
%
The whole chain of data conversion and
interpolation needed to generate those
files is summarized in the diagram on the next
page.
%
Three distinct preprocessing steps are
necessary to generate the final files.
%
As described in the previous section,
some modifications in the \ttt{namelist.input} file
[e.g. start/end dates labelled with \ttt{(p1)}]
require a complete reprocessing from step $1$ to step $3$
to successfully launch the simulation,
whereas other changes
[e.g. model top labelled with \ttt{(p3)}]
only require a quick reprocessing at step $3$, keeping
the files generated at the end of step $2$
the same.

\mk
\subsection{Input data}

\mk
\subsubsection{Static data}

\mk
All the static data
(topography, thermal inertia, albedo)
needed to initialize the model
are included in the \ttt{\$LMDMOD/LMD\_MM\_MARS/WPS\_GEOG} directory.
%
By default, only coarse-resolution datasets\footnote{
%%%
Corresponding to the fields stored in the
file \ttt{surface.nc} known to LMD-MGCM users:
\url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc}
%%%
} are available, but the directory also contains sources and scripts
to install finer-resolution datasets:
\begin{citemize}
\item 32 and/or 64 pixel-per-degree (ppd) MOLA topography [\textit{Smith et al.}, 2001]\nocite{Smit:01mola},
\item 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo [\textit{Christensen et al.}, 2001]\nocite{Chri:01},
\item 20 ppd TES thermal inertia [\textit{Putzig and Mellon}, 2007]\nocite{Putz:07}
\end{citemize}
\pagebreak
\includepdf[pages=1,offset=25mm -20mm]{diagramme.pdf}

\mk
\marge The role of the \ttt{build\_static} script is to
automatically download these datasets from the web
(namely PDS archives) and convert them to a
format acceptable for later use by the
preprocessing utilities:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./build_static
\end{verbatim}
%
\begin{finger}
\item Please install the \ttt{octave}
free software\footnote{
%%%
Available at \url{http://www.gnu.org/software/octave}
%%%
} on your system to be able to use the
\ttt{build\_static} script.
%
Another solution is to browse into each of the
directories contained within \ttt{WPS\_GEOG}, download the
data with the shell scripts and execute the \ttt{.m} scripts with either
\ttt{octave} or the commercial software \ttt{matlab}
(just replace \ttt{\#} by \ttt{\%}).
%
\item If you do not manage to execute the \ttt{build\_static} script,
converted ready-to-use datafiles are available upon request.
%
\item Building the MOLA 64 ppd topographical
database can be quite long, so this process is
not performed by default by the \ttt{build\_static} script.
If you would like to build this database,
please remove the \ttt{exit} command in the script, just above
the commands related to the MOLA 64 ppd dataset.
%
\item The resulting \ttt{WPS\_GEOG} directory can reach a size
of several hundred megabytes.
%
You might want to move such a folder to a place
with more disk space available, but then be
sure to create in \ttt{\$LMDMOD/LMD\_MM\_MARS}
a link to the new location
of the directory, as sketched below.
\end{finger}
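
\mk
\marge For example (an illustrative sequence, assuming \ttt{/bigdisk/user} is
the large-capacity location defined earlier):
\begin{verbatim}
mv $LMDMOD/LMD_MM_MARS/WPS_GEOG /bigdisk/user/
ln -sf /bigdisk/user/WPS_GEOG $LMDMOD/LMD_MM_MARS/WPS_GEOG
\end{verbatim}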

\mk
\subsubsection{Meteorological data}

\mk
The preprocessing tools generate initial and boundary conditions
from the \ttt{diagfi.nc} output files of LMD-MGCM simulations.
%
If you would like to run a mesoscale simulation at a given
season, you first need to run a GCM simulation and output
the meteorological fields at the considered season.
%
For optimal forcing at the boundaries, we advise you
to write the meteorological fields to the
\ttt{diagfi.nc} file at least every two hours.
%
Please also make sure that the following fields
are stored in the NETCDF \ttt{diagfi.nc} file:

\footnotesize
\codesource{contents_diagfi}

\normalsize
\begin{finger}
\item If the fields
\ttt{emis},
\ttt{co2ice},
\ttt{q01},
\ttt{q02},
\ttt{tsoil}
are missing from the \ttt{diagfi.nc} file,
they are replaced by the respective default
values $0.95$, $0$, $0$, $0$, \ttt{tsurf}.
\end{finger}

\mk
\marge An example of input meteorological file
\ttt{diagfi.nc} can be downloaded
at \url{http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz}.
%
Please deflate the archive and copy the \ttt{diagfi.nc} file
into \ttt{\$LMDMOD/TMPDIR/GCMINI}, as shown below.
%
Such a file can then be used to define the initial
and boundary conditions, and we will go
through the three preprocessing steps.
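
\mk
\marge A possible command sequence (assuming \ttt{wget} is available on your
system):
\begin{verbatim}
cd $LMDMOD/TMPDIR/GCMINI
wget http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz
tar xzvf diagfi.nc.tar.gz
\end{verbatim}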

\mk
\subsection{Preprocessing steps}

\mk
\subsubsection{Step 1: Converting GCM data}

\mk
The programs in the \ttt{PREP\_MARS} directory
convert the data from the NETCDF \ttt{diagfi.nc}
file into separate binary datafiles for each
date contained in \ttt{diagfi.nc}, according to
the formatting needed by the
preprocessing programs at step 2.
%
These programs can be executed with the following
commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/PREP_MARS
echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
./readmeteo.exe < readmeteo.def
\end{verbatim}
%
\marge If everything went well with the conversion,
the directory \ttt{\$LMDMOD/TMPDIR/WPSFEED}
should contain files whose names begin with \ttt{LMD:}, as can be checked with the listing below.
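
\mk
\marge A quick way to verify this (a simple listing; the exact file names
depend on the dates contained in \ttt{diagfi.nc}):
\begin{verbatim}
ls -lt $LMDMOD/TMPDIR/WPSFEED/
\end{verbatim}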

\mk
\subsubsection{Step 2: Interpolation on the regional domain}

\mk
In the \ttt{WPS} directory, the \ttt{geogrid.exe} program allows
you to define the mesoscale simulation domain,
to horizontally interpolate the topography,
thermal inertia and albedo fields at the domain
resolution, and to calculate useful fields
such as topographical slopes.%\pagebreak

\mk
\marge Please execute the commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
ln -sf ../../TESTCASE/namelist.wps .   # test case
./geogrid.exe
\end{verbatim}
%
\marge The result of \ttt{geogrid.exe}
-- and thus the definition of the mesoscale
domain -- can be checked in the NETCDF
file \ttt{geo\_em.d01.nc}.
%
A quick check can be performed using the command line
\begin{verbatim}
ncview geo_em.d01.nc
\end{verbatim}
\marge if \ttt{ncview} is installed, or the \ttt{IDL}
script \ttt{out\_geo.pro}
\begin{verbatim}
idl
IDL> out_geo, field1='TOPO'
IDL> out_geo, field1='TI'
IDL> SPAWN, 'ghostview geo_em.d01_HGT_M.ps &'
IDL> SPAWN, 'ghostview geo_em.d01_THERMAL_INERTIA.ps &'
IDL> exit
\end{verbatim}
\marge if the demo version of \ttt{IDL} is installed.
%
Of course, if your favorite graphical tool supports
the NETCDF standard, you may use it to check the
domain definition in \ttt{geo\_em.d01.nc}.

\mk
\marge If you are unhappy with the results or
you want to change
the location of the mesoscale domain on the planet,
the horizontal resolution,
the number of grid points \ldots,
please modify the parameter
file \ttt{namelist.wps} and execute \ttt{geogrid.exe} again.
%
Here are the contents of \ttt{namelist.wps}:
%
\codesource{namelist.wps_TEST}

\begin{finger}
%
\item No input meteorological data
are actually needed to execute \ttt{geogrid.exe}.
%
\item More details about the database and
more interpolation options can be
found in the file \ttt{geogrid/GEOGRID.TBL}.
%
\item Defining several domains yields
distinct files
\ttt{geo\_em.d01.nc},
\ttt{geo\_em.d02.nc},
\ttt{geo\_em.d03.nc}\ldots
\end{finger}

\mk
\marge Once the \ttt{geo\_em} file(s) are generated,
the \ttt{metgrid.exe} program performs
a horizontal interpolation
of the meteorological fields to the mesoscale
domain, similar to the one performed by \ttt{geogrid.exe}
for the surface data.
%
The program then writes the results in
\ttt{met\_em} files and also collects
the static fields and domain parameters
included in the \ttt{geo\_em} file(s).
%
Please type the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
./metgrid.exe
\end{verbatim}
%
\marge If everything went well,
the directory \ttt{\$LMDMOD/TMPDIR/WRFFEED}
should contain the \ttt{met\_em.*} files.

\mk
\subsubsection{Step 3: Vertical interpolation on mesoscale levels}

\mk
\marge The last step is to execute \ttt{real.exe}
to perform the interpolation from the vertical
levels of the GCM to the vertical levels
defined in the mesoscale model.
%
This program also prepares the final initial
state for the simulation in files called
\ttt{wrfinput} and the boundary conditions
in files called \ttt{wrfbdy}.

\mk
\marge To successfully execute \ttt{real.exe},
you need the \ttt{met\_em.*} files
and the \ttt{namelist.input} file
to be in the same directory as \ttt{real.exe}.
%
The parameters in \ttt{namelist.input}
controlling the behavior of the vertical interpolation
are those labelled with \ttt{(p3)} in the detailed
list introduced in the previous chapter.

\mk
\marge Please type the following commands
to prepare files for the Arsia Mons test case
(or your personal test case if you changed
the parameters in \ttt{namelist.wps}):
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/TESTCASE
ln -sf $LMDMOD/TMPDIR/WRFFEED/met_em* .
./real.exe
\end{verbatim}
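
\mk
\marge Once \ttt{real.exe} has completed, the initial state and boundary
condition files should be present in the simulation directory (a suggested
check, assuming a single domain):
\begin{verbatim}
ls -lt wrfinput_d01 wrfbdy_d01
\end{verbatim}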

\mk
\marge The final message of \ttt{real.exe}
should report the success of the processing, and you
are now ready to launch the integrations
of the LMD Martian Mesoscale Model again
with the \ttt{wrf.exe} command, as in section
\ref{sc:arsia}.

\begin{finger}
\item When you modify either
\ttt{namelist.wps} or \ttt{namelist.input},
make sure that the common parameters
are exactly the same in both files
(especially when running nested simulations),
otherwise either the \ttt{real.exe} or the \ttt{wrf.exe}
command will exit with an error message.
An example of such shared parameters is sketched below.
\end{finger}
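
\mk
\marge For instance (an illustrative fragment following the standard ARW-WRF
namelist conventions; parameter names and values should be checked against
your own \ttt{namelist.wps} and \ttt{namelist.input\_full} files), the domain
dimensions and grid spacing must match between the two files:
\begin{verbatim}
[namelist.wps, &geogrid section]     [namelist.input, &domains section]
e_we = 61,                           e_we = 61,
e_sn = 61,                           e_sn = 61,
dx   = 20000,                        dx   = 20000,
dy   = 20000,                        dy   = 20000,
\end{verbatim}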
%\pagebreak


\chapter{Starting simulations from scratch}

\mk
\section{Running your own GCM simulations}

\begin{remarque}
To be completed
\end{remarque}

\mk
\section{Complete simulations with \ttt{runmeso}}

\begin{remarque}
To be completed
\end{remarque}


\chapter{Outputs}

\mk
\section{Postprocessing utilities and graphics}

\begin{remarque}
To be completed. Do-it-all \ttt{idl} scripts
will be described here!
\end{remarque}

\mk
\section{Modifying the outputs}

\begin{remarque}
To be completed.
Though the method is different,
we kept all the convenient aspects of \ttt{writediagfi}.
\end{remarque}

\chapter{Frequently Asked Questions}


\begin{finger}
\item Which timestep should I choose to avoid crashes of the model?
\item In Martian simulations, why can't I define boundary conditions every 6 hours as on Earth?
\item Help! I get strange assembler errors or ILM errors while compiling!
\item Is it possible to run the model on a specific configuration that is not supported?
\item Why do I have to define four fewer rows in the parent domain
when performing nested runs?
\item I am kind of nostalgic for early/middle Mars. How could I run
mesoscale simulations at low/high obliquity?
\item Why does \ttt{real.exe} crash when the model top pressure is
lower than $2$~Pa?
\item Can I use two-way nesting?
\end{finger}

\begin{remarque}
To be completed.
\end{remarque}