\chapter{Introducing the model}

\mk
\begin{finger}
\item Please first read the document ``Design and
Performance of the LMD Martian Mesoscale Model"
to learn what the model is,
what kind of results can be obtained,
and how these results compare with
available data and independent simulations.
\end{finger}

\begin{remarque}
To be completed with a description
of the dynamics/physics driver.
\end{remarque}

\chapter{First steps toward running the model}

\mk
This chapter is meant for first-time users of the LMD Martian Mesoscale Model.
%
We describe how to install the model on your system, compile the program and run a test case.
%
Experience with either the terrestrial WRF mesoscale model or the LMD Martian GCM is not absolutely required,
although it would help you get through the installation process more easily.

\mk
\section{Prerequisites}

\mk
\subsection{General requirements}

\mk
In order to install the LMD Martian Mesoscale Model, please ensure that:
\begin{citemize}
\item your computer is connected to the internet;
\item your OS is Linux\footnote{
%%%%%%%%%%%%%%
The model was also successfully compiled on MacOSX;
``howto" information is available upon request.
%%%%%%%%%%%%%%
} with a decent set of basic commands (\ttt{sed}, \ttt{awk}, \ldots);
\item your Fortran compiler is the PGI commercial compiler \ttt{pgf90} or the
free compiler\footnote{
%%%%%%%%%%%%%%
Sources and binaries available on \url{http://www.g95.org}
%%%%%%%%%%%%%%
} \ttt{g95};
\item your C compiler is \ttt{gcc} and C development libraries are included;
\item \ttt{bash}, \ttt{m4} and \ttt{perl} are installed on your computer;
\item \ttt{NETCDF} libraries have been compiled \emph{on your system}.
\end{citemize}
%
\begin{finger}
\item You might also find it useful -- though not mandatory -- to install on your system:
\begin{citemize}
\item the \ttt{ncview} utility\footnote{
%%%%%%
\url{http://meteora.ucsd.edu/~pierce/ncview\_home\_page.html}
%%%%%%
}, which is a nice tool to visualize the contents of a NETCDF file;
\item the \ttt{IDL} demo version\footnote{
%%%%%%
\url{http://www.ittvis.com/ProductServices/IDL.aspx}
%%%%%%
}, which is used by the plot utilities provided with the model.
\end{citemize}
\end{finger}
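
\mk
\marge If the \ttt{NETCDF} libraries are not yet compiled on your system, a
minimal build from source might look as follows (a sketch assuming a classical
\ttt{NETCDF} 3.x source archive; the version number and installation path are
illustrative):
\begin{verbatim}
tar xzvf netcdf-3.6.3.tar.gz
cd netcdf-3.6.3
./configure --prefix=/disk/user/netcdf   ## installation path of your choice
make                                     ## compile the libraries
make test                                ## optional: check the build
make install
\end{verbatim}
\begin{finger}
\item Depending on the distribution, the include files may end up in a
directory named \ttt{include} rather than \ttt{inc}; adapt the
\ttt{\$NCDFINC} variable defined below accordingly.
\end{finger}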

\mk
\marge Three environment variables associated with the \ttt{NETCDF} libraries must be defined:
\begin{verbatim}
declare -x NETCDF=/disk/user/netcdf
declare -x NCDFLIB=$NETCDF/lib
declare -x NCDFINC=$NETCDF/inc
\end{verbatim}

\begin{finger}
\item All command lines in this document are given in \ttt{bash};
\ttt{csh} users should use e.g. \ttt{setenv NETCDF /disk/user/netcdf} instead.
\end{finger}


\mk
\marge You also need the environment variable \ttt{\$LMDMOD} to point
to the directory where you will install the model (e.g. \ttt{/disk/user/MODELS}):
\begin{verbatim}
declare -x LMDMOD=/disk/user/MODELS
\end{verbatim}
%
\begin{finger}
\item Please check that $\sim 200$~MB of free disk space is available in \ttt{/disk}.
\end{finger}

\mk
\subsection{Parallel computations}

\mk
\marge Parallel computations with the Message Passing Interface (MPI) standard are supported by
the ARW-WRF mesoscale model.
%
If you want to use this capability in the LMD Martian Mesoscale Model,
you will need to install MPICH2 as an additional prerequisite.

\mk
\marge Please download the current stable version of the sources
(e.g. \ttt{mpich2-1.0.8.tar.gz}) from the MPICH2 website
\url{http://www.mcs.anl.gov/research/projects/mpich2}
and install the MPICH2 utilities with the following commands:
%
\begin{verbatim}
mkdir $LMDMOD/MPI
mv mpich2-1.0.8.tar.gz $LMDMOD/MPI
cd $LMDMOD/MPI
tar xzvf mpich2-1.0.8.tar.gz
cd mpich2-1.0.8
./configure --prefix=$PWD --with-device=ch3:nemesis > conf.log 2> conferr.log &
# please wait...
make > mk.log 2> mkerr.log &
declare -x WHERE_MPI=$LMDMOD/MPI/mpich2-1.0.8/bin
\end{verbatim}
%
\begin{finger}
\item Even if you add the \ttt{\$LMDMOD/MPI/mpich2-1.0.8/bin}
directory to your \ttt{\$PATH} variable, defining the environment
variable \ttt{\$WHERE\_MPI} is still required
to ensure a successful compilation of the model.
\end{finger}
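
\mk
\marge Before going further, you may want to check that the MPICH2 utilities
were correctly built (a quick sanity check; the exact version string depends
on the archive you downloaded):
\begin{verbatim}
ls $WHERE_MPI/mpirun $WHERE_MPI/mpif90   ## both executables should be listed
$WHERE_MPI/mpich2version                 ## should print the MPICH2 version
\end{verbatim}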

\mk
\subsection{Compiling the terrestrial WRF model}

\mk
The LMD Martian Mesoscale Model is based on the terrestrial NCEP/NCAR ARW-WRF Mesoscale Model.
%
As a first step towards the compilation of the Martian version, we advise you to check that the terrestrial
model compiles on your computer with either \ttt{g95} or \ttt{pgf90}.

\mk
\marge On the ARW-WRF website \url{http://www.mmm.ucar.edu/wrf/users/download/get\_source.html}, you will be allowed
to freely download the model after a quick registration process (click on ``New users").
%
Make sure to download version 2.2 of the WRF model and copy the
\ttt{WRFV2.2.TAR.gz} archive to the \ttt{\$LMDMOD} folder.

\mk
\marge Then please extract the model sources and configure the compilation process:
\begin{verbatim}
cd $LMDMOD
tar xzvf WRFV2.2.TAR.gz
cd WRFV2
./configure
\end{verbatim}

\mk
\marge The \ttt{configure} script analyzes your architecture
and proposes several compilation options.
%
Make sure to choose the ``single-threaded, no nesting"
option related to either \ttt{g95} (should be option $13$ on a $32$-bit Linux PC)
or \ttt{pgf90} (should be option $1$ on a $32$-bit Linux PC).

\mk
\marge The next step is to compile the WRF model by choosing the kind of
simulations you would like to run.
%
A simple and direct test consists of compiling
the idealized case of a 2D flow impinging on a small hill:
\begin{verbatim}
./compile em_hill2d_x > log_compile 2> log_error &
\end{verbatim}
%
\begin{finger}
\item In case you encounter problems compiling the ARW-WRF model,
please read the documentation on the website
\url{http://www.mmm.ucar.edu/wrf/users},
contact the WRF helpdesk or search the web for your error message.
\end{finger}%\pagebreak

\mk
\marge If the compilation was successful
(the file \ttt{log\_error} should be empty
or only report a few warnings), you should find
in the \ttt{main} folder two executables,
\ttt{ideal.exe} and \ttt{wrf.exe},
that allow you to run the test
simulation:
\begin{verbatim}
cd test/em_hill2d_x
./ideal.exe
./wrf.exe
\end{verbatim}
%
During the simulation, the time taken by the computer
to perform the integration of each dynamical timestep
is displayed in the standard output.
%
The simulation should end with the message \ttt{SUCCESS COMPLETE WRF}.
%
The model results are stored in a \ttt{wrfout} data file
you might like to browse with a \ttt{NETCDF}-compliant software
such as \ttt{ncview}.
%
\begin{finger}
\item If you compiled the model with \ttt{g95}, \ttt{ideal.exe} will
probably complain about an error reading the namelist.
%
Please move the line \ttt{non\_hydrostatic} below the line \ttt{v\_sca\_adv\_order}
in the \ttt{namelist.input} file to solve the problem.
\end{finger}
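
\mk
\marge For instance, a quick look at the results can be obtained with
commands such as the following (a sketch: the exact name of the \ttt{wrfout}
file depends on the start date set in \ttt{namelist.input}):
\begin{verbatim}
ncdump -h wrfout_d01_0001-01-01_00:00:00   ## list dimensions and variables
ncview wrfout_d01_0001-01-01_00:00:00      ## browse the fields graphically
\end{verbatim}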

\mk
\section{Compiling the Martian model}

\mk
\subsection{Extracting and preparing the sources}

\mk
To start the installation of the Martian mesoscale model,
download the archive \ttt{LMD\_MM\_MARS.tar.gz}
(click on \url{http://www.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/LMD_MM_MARS.tar.gz}
or use the \ttt{wget} command).
%
Copy the archive to the \ttt{\$LMDMOD} directory and extract the files:
\begin{verbatim}
cp LMD_MM_MARS.tar.gz $LMDMOD
cd $LMDMOD
tar xzvf LMD_MM_MARS.tar.gz
\end{verbatim}

\mk
\marge Execute the \ttt{prepare} script,
which performs several necessary preparatory tasks for you:
deflating the various compressed archives contained in \ttt{LMD\_MM\_MARS},
downloading the ARW-WRF sources from the web,
applying a (quite significant) ``Martian patch" to these sources
and building the final structure of your \ttt{LMD\_MM\_MARS} directory:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./prepare
\end{verbatim}

\mk
\marge Please check the contents of the \ttt{LMD\_MM\_MARS} directory:
\begin{citemize}
\item seven \ttt{bash} scripts:
\ttt{build\_static},
\ttt{copy\_model},
\ttt{makemeso},
\ttt{prepare},
\ttt{prepare\_ini},\linebreak
\ttt{prepare\_post},
\ttt{save\_all};
\item the sources directory \ttt{SRC};
\item the static data directory \ttt{WPS\_GEOG};
\item the simulation utilities directory \ttt{SIMU};
\end{citemize}
%
\marge and check that the \ttt{LMD\_MM\_MARS/SRC} directory contains:
\begin{citemize}
\item the model main sources in \ttt{WRFV2},
\item the preprocessing sources in \ttt{WPS} and \ttt{PREP\_MARS},
\item the postprocessing sources in \ttt{ARWpost},
\item three \ttt{tar.gz} archives and two information text files.
\end{citemize}

\mk
\subsection{Main compilation step}
\label{sc:makemeso}

\mk
In order to compile the model, execute the \ttt{makemeso} compilation script
in the \ttt{LMD\_MM\_MARS}\linebreak directory
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./makemeso
\end{verbatim}
%
\marge and answer the questions about
\begin{asparaenum}[1.]%[\itshape Q1\upshape)]
\item compiler choice (and number of processors if using MPI)
\item number of grid points in longitude [61]
\item number of grid points in latitude [61]
\item number of vertical levels [61]
\item number of tracers [1]
\item number of domains [1]
\end{asparaenum}
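
\mk
\marge For the test case described below, a session might look as follows
(a sketch: the exact wording of the questions may differ from one version of
\ttt{makemeso} to another):
\begin{verbatim}
./makemeso
 1. compiler choice           : g95
 2. grid points in longitude  : 61
 3. grid points in latitude   : 61
 4. vertical levels           : 61
 5. number of tracers         : 1
 6. number of domains         : 1
\end{verbatim}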

%\mk
\begin{finger}
\item The first time you compile the model, you will probably wonder what to reply
to questions $2$ to $6$: type the answers given in brackets to compile an executable suitable
for the test case given below.
\item Suppose you compiled a version of the model for a given set of parameters $1$ to $6$.
If you would like to run another simulation
with at least one of the parameters $1$ to $6$
changed, the model needs to be recompiled\footnote{This
necessary recompilation each time the number of grid points,
tracers or domains is modified is imposed by the LMD physics code.
The WRF dynamical core alone is much more flexible.} with \ttt{makemeso}.
\item When you use parallel computations, please bear in mind that with
$2$ (resp. $4$, $6$, $8$, $16$) processors the whole domain is divided
into $2$ (resp. $2$, $3$, $4$, $4$) tiles over
the latitude direction and $1$ (resp. $2$, $2$, $2$, $4$) tile over the longitude direction.
Thus make sure that the number of grid points minus $1$ in each direction
can be divided by the aforementioned number of tiles over the considered
direction.
For instance, with the default $61 \times 61$ grid, $60$ is divisible by
$2$, $3$ and $4$, so any of the aforementioned numbers of processors can be used.
\item If you use grid nesting, note that no more than $4$ processors can be used.
\end{finger}

\mk
\marge The \ttt{makemeso} script is automated and performs
the following series of tasks:
\begin{citemize}
\item determine if the machine is 32 or 64 bits;
\item ask the user about the compilation settings;
\item create a corresponding directory \ttt{\$LMDMOD/LMD\_MM\_MARS/DIRCOMP};
\begin{finger}
\item For example, a \ttt{DIRCOMP} directory named \ttt{g95\_32\_single}
is created if the user requested
a \ttt{g95} compilation of the code for single-domain simulations
on a 32-bit machine.
\end{finger}
\item generate with \ttt{copy\_model} a directory \ttt{DIRCOMP/WRFV2} containing links to the \ttt{SRC/WRFV2} sources;
\begin{finger}
\item This method ensures that any change to the model sources is
propagated to all the different \ttt{DIRCOMP} installation folders.
\end{finger}
\item execute the WRF \ttt{configure} script with the correct option;
\item tweak the resulting \ttt{configure.wrf} file to include a link towards the Martian physics;
\item calculate the total number of horizontal grid points handled by the LMD physics;
\item duplicate the LMD physical sources if nesting is activated;
\begin{finger}
\item The model presently supports 3 nests, but more nests
can be included by adapting the following files:
\begin{verbatim}
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_inifis3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_physiq3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/duplicate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/generate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/makegcm*  ## search for 'nest'
\end{verbatim}%\pagebreak
\end{finger}
\item compile the LMD physical packages with the appropriate \ttt{makegcm} command
and collect the compiled objects in the library \ttt{liblmd.a};
\begin{finger}
\item This step can take some time,
especially if you defined more than one domain.
The \ttt{makemeso} script provides you with the full path to
the text file \ttt{log\_compile\_phys}, in which you can check the
compilation progress and possible errors.
%
At the end of the process, you will find an
error message associated with the generation of the
final executable.
%
Please do not pay attention to this: the compilation of the LMD
sources is meant to generate a library of
compiled objects called \ttt{liblmd.a} instead of a program.
\end{finger}
\item compile the modified Martian ARW-WRF solver, including
the \ttt{liblmd.a} library;
\begin{finger}
\item The first time the model is compiled, this
step can be quite long.
%
The \ttt{makemeso} script provides you with a \ttt{log\_compile}
text file where the progress of the compilation can be checked
and a \ttt{log\_error} text file listing errors and warnings
during compilation.
%
A list of warnings related to \ttt{grib}
utilities (not used in the Martian model)
may appear and has no impact on the
final executables.
\item The compilation with \ttt{g95} might be unsuccessful
due to some problems with files related to terrestrial microphysics.
%
Please type the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/SRC
tar xzvf g95.tar.gz
cp -f g95/WRFV2_g95_fix/* WRFV2/phys/
cd $LMDMOD/LMD_MM_MARS
\end{verbatim}
\marge then recompile the model with the \ttt{makemeso} command.
\end{finger}
\item change the name of the executables in agreement with the
settings provided by the user.
\begin{finger}
\item If you answered the \ttt{makemeso} questions using the
aforementioned parameters in brackets, you should have in the
\ttt{DIRCOMP} directory two executables:
\begin{verbatim}
real_x61_y61_z61_d1_t1_p1.exe
wrf_x61_y61_z61_d1_t1_p1.exe
\end{verbatim}
%
The directory also contains a text file
in which the answers to the questions are stored, which
allows you to re-run the script without the
``questions to the user" step:
\begin{verbatim}
./makemeso < makemeso_x61_y61_z61_d1_t1_p1
\end{verbatim}
\end{finger}
\end{citemize}

\mk
\section{Running a simple test case}
\label{sc:arsia}

\mk
We assume that you successfully compiled
the model at the end of the previous section,
using the answers in brackets
to the \ttt{makemeso} questions.

\mk
\marge In order to test the compiled executables,
a ready-to-use test case
(with pre-generated initial and boundary
conditions) is proposed
in the \ttt{LMD\_MM\_MARS\_TESTCASE.tar.gz}
archive you can download at
\url{http://www.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/LMD_MM_MARS_TESTCASE.tar.gz}.
%
This test case simulates the hydrostatic
atmospheric flow around Arsia Mons during half a sol
with constant thermal inertia, albedo
and dust opacity.

\begin{finger}
\item Though the simulation reproduces some reasonable
features of the mesoscale circulation around Arsia
Mons (e.g. slope winds), it should not be used
for scientific purposes: the number of grid points
is insufficient for a single-domain simulation
and the integration time is below the necessary spin-up time.
\end{finger}
%\pagebreak

\marge To launch the test simulation, please type
the following commands, replacing the
\ttt{g95\_32\_single} directory with its corresponding
value on your system:
%
\begin{verbatim}
cp LMD_MM_MARS_TESTCASE.tar.gz $LMDMOD/LMD_MM_MARS/
cd $LMDMOD/LMD_MM_MARS
tar xzvf LMD_MM_MARS_TESTCASE.tar.gz
cd TESTCASE
ln -sf ../g95_32_single/wrf_x61_y61_z61_d1_t1_p1.exe wrf.exe
nohup ./wrf.exe > log_wrf &
\end{verbatim}


\begin{finger}
\item If you compiled the model using MPICH2,
the commands to launch a simulation are slightly different:
%
\begin{verbatim}
[simulation on 2 processors on 1 machine]
mpd &      # first-time only (or after a reboot)
           # NB: may request the creation of a file .mpd.conf
mpirun -np 2 wrf.exe < /dev/null &      # NB: mpirun is only a link to mpiexec
tail -20 rsl.out.000?     # to check the outputs
\end{verbatim}
\begin{verbatim}
[simulation on 16 processors on 4 connected machines]
echo barry.lmd.jussieu.fr > ~/mpd.hosts
echo white.lmd.jussieu.fr >> ~/mpd.hosts
echo loves.lmd.jussieu.fr >> ~/mpd.hosts
echo tapas.lmd.jussieu.fr >> ~/mpd.hosts
ssh barry.lmd.jussieu.fr   # make sure that ssh to the other machines
                           # is possible without authentication
mpdboot -f ~/mpd.hosts -n 4
mpdtrace
mpirun -l -np 16 wrf.exe < /dev/null &   # NB: mpirun is only a link to mpiexec
tail -20 rsl.out.00??     # to check the outputs
\end{verbatim}
\end{finger}
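
\mk
\marge If \ttt{mpd} complains about a missing \ttt{.mpd.conf} file, it can
be created as follows (a sketch; the secret word is an arbitrary string of
your choice):
\begin{verbatim}
echo "secretword=your_secret_word" > ~/.mpd.conf
chmod 600 ~/.mpd.conf     ## mpd requires restrictive permissions
\end{verbatim}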


\mk
\chapter{Setting the simulation parameters}

\mk
In this chapter, we describe how to set the various parameters
defining a given simulation.
%
As can be inferred from the contents of the \ttt{TESTCASE} directory,
two parameter files are needed to run the model:
\begin{enumerate}
\item The parameters related to the dynamical part of the model can be set
in the file \ttt{namelist.input}, according to the ARW-WRF namelist formatting.
\item The parameters related to the physical part of the model can be set
in the file \ttt{callphys.def}, according to the LMD-MGCM formatting.
\end{enumerate}

\mk
\section{Dynamical settings}

\mk
\ttt{namelist.input} controls the behavior of the dynamical core
in the LMD Martian Mesoscale Model.
%
Compared to the file the ARW-WRF users are familiar with\footnote{
%%%
A description of this file can be found in \ttt{SRC/WRFV2/run/README.namelist}.
%%%
}, the \ttt{namelist.input} in the LMD Martian Mesoscale Model
is much shorter.
%
The only mandatory parameters in this file
are information on time control\footnote{
%%%
More information on the adopted Martian calendar:
\url{http://www-mars.lmd.jussieu.fr/mars/time/solar_longitude.html}
%%%
} and domain definition.

\mk
\marge The minimal version of the \ttt{namelist.input}
file corresponds to standard simulations with the model.
%
It is however possible to modify optional parameters
if needed, as is the case in the \ttt{namelist.input}
associated with the Arsia Mons test case
(e.g. the parameter \ttt{non\_hydrostatic} is set to false
to assume hydrostatic equilibrium, whereas standard
simulations are non-hydrostatic).

\mk
\marge A detailed description of the \ttt{namelist.input} file is given below\footnote{
%%%
You may find the corresponding file in \ttt{SIMU/namelist.input\_full}.
%%%
}.
%
Comments on each of the parameters are provided,
with the following labels:
\begin{citemize}
\item \ttt{(*)} denotes parameters not to be modified,
\item \ttt{(r)} indicates parameters whose modification implies recompiling the model,
\item \ttt{(n)} describes parameters involved when nested domains are defined,
\item \ttt{(p1)}, \ttt{(p2)}, \ttt{(p3)} mention parameters whose modification implies a new processing
of initial and boundary conditions (see next chapter),
\item \ttt{(*d)} denotes dynamical parameters whose modification implies
non-standard simulations -- please read \ttt{SRC/WRFV2/run/README.namelist}
and use with caution.
\end{citemize}
%
If omitted, the optional parameters are set to the default
values indicated below.\pagebreak

\centers{\ttt{-- file: namelist.input\_full --}}\codesource{namelist.input_full}\centers{\ttt{-- end file: namelist.input\_full --}}

\begin{finger}
\item Please pay attention to rigorous syntax while
editing your personal \ttt{namelist.input} file,
to avoid reading errors.
\item To modify the default values (or even add
personal parameters) in the \ttt{namelist.input} file,
edit the \ttt{SRC/WRFV2/Registry/Registry.EM} file.
%
You will then have to recompile the model with \ttt{makemeso};
answer \ttt{y} to the last question.
\end{finger}

\mk
\marge In case you run simulations with \ttt{max\_dom}
nested domains, you have to set \ttt{max\_dom} values
wherever there is a ``," in the above list.
%
Here is an example of the resulting syntax of the
\ttt{time\_control}, \ttt{domains} and \ttt{bdy\_control}
categories in \ttt{namelist.input}:
%
\codesource{OMG_namelist.input}

\section{Physical settings}

\mk
\ttt{callphys.def} controls the behavior of the physical parameterizations
in the LMD Martian\linebreak Mesoscale Model.
%
The organization of this file is exactly the same
as that of the corresponding file in the LMD Martian GCM, whose
user manual can be found at
\url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}.

\mk
\marge Please find below the contents of \ttt{callphys.def}:
%
\centers{\ttt{-- file: callphys.def --}}\codesource{callphys.def}\centers{\ttt{-- end file: callphys.def --}}

\mk
\begin{finger}
\item Note that in the given example
the convective adjustment,
the gravity wave parameterization,
and the NLTE schemes are turned off, as is
usually the case in typical Martian tropospheric
mesoscale simulations.
\item \ttt{iradia} sets the frequency
(in dynamical timesteps) at which
the radiative computations are performed.
\item Modifying \ttt{callphys.def} requires recompiling
the model only if the number of tracers changes.
\item If you run a simulation with, say, $3$ domains,
please ensure that you defined the three files
\ttt{callphys.def}, \ttt{callphys\_d2.def} and \ttt{callphys\_d3.def},
as shown in the sketch below.
\end{finger}
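
\mk
\marge For instance, when the physical settings are meant to be the same in
all domains, the additional files can simply be created as copies (a minimal
sketch for the $3$-domain example above):
\begin{verbatim}
cp callphys.def callphys_d2.def   ## settings for nest 2
cp callphys.def callphys_d3.def   ## settings for nest 3
\end{verbatim}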

\mk
\chapter{Preprocessing utilities}

\mk
In the previous chapter, we described the simulation settings
in the \ttt{namelist.input} file.
%
We saw that any modification of the parameters
labelled with \ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)}
requires the initial and boundary conditions
and/or the domain definition to be recomputed prior to running the model again.
%
As a result, you were probably unable to change many of the parameters
of the Arsia Mons test case (proposed in section \ref{sc:arsia}) in which
the initial and boundary conditions -- as well as the simulation
domain -- were predefined.

\mk
\marge In this chapter, we describe the installation and use of the preprocessing tools to
define the simulation domain, calculate an initial atmospheric state
and prepare the boundary conditions for the chosen simulation time.
%
This necessary step will eventually allow you to run your own simulations at the specific season and region
you are interested in, with a complete ability to modify any of the parameters in \ttt{namelist.input}.

\mk
\section{Installing the preprocessing utilities}

\mk
First and foremost, since the preprocessing utilities can generate
(or involve) quite large files, it is necessary
to define a directory where these files will be stored.
%
Such a directory (e.g. \ttt{/bigdisk/user}) must be linked as follows:
%
\begin{verbatim}
ln -sf /bigdisk/user $LMDMOD/TMPDIR
\end{verbatim}

\mk
\marge A second prerequisite to the installation of the preprocessing tools is that the LMD Martian
Mesoscale Model was compiled at least once.
%
If this is not the case, please compile
the model with the \ttt{makemeso} command
(see section \ref{sc:makemeso}).

\mk
\marge The compilation process created an
installation directory adapted to your
particular choice of compiler$+$machine.
%
The preprocessing tools will also
be installed in this directory.
%
Please type the following commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/g95_32_single/   ## or any install directory
ln -sf ../prepare_ini .
./prepare_ini
\end{verbatim}

\mk
\marge The script \ttt{prepare\_ini} plays the same role for the preprocessing tools
as \ttt{copy\_model} does for the model sources:
files are simply linked to their actual location in the \ttt{SRC} folder.
%
Once you have executed \ttt{prepare\_ini}, please check that
two folders were generated: \ttt{PREP\_MARS} and \ttt{WPS}.

\mk
\marge In the \ttt{PREP\_MARS} directory, please compile
the programs \ttt{create\_readmeteo.exe} and \ttt{readmeteo.exe},
using the compiler mentioned in the name of the current
installation directory:
%
\begin{verbatim}
echo $PWD
cd PREP_MARS/
./compile [or] ./compile_g95
ls -lt create_readmeteo.exe readmeteo.exe
cd ..
\end{verbatim}

\mk
\marge In the \ttt{WPS} directory, please compile
the programs \ttt{geogrid.exe} and \ttt{metgrid.exe}:
\begin{verbatim}
cd WPS/
./configure   ## select your compiler + 'NO GRIB2' option
./compile
ls -lt geogrid.exe metgrid.exe
\end{verbatim}

\mk
\marge Apart from the executables you just compiled,
the preprocessing utilities include \ttt{real.exe},
which was compiled by the \ttt{makemeso} script
along with the mesoscale model executable \ttt{wrf.exe}.
%
\ttt{real.exe} should be copied or linked in the
simulation directory (e.g. \ttt{TESTCASE} for the
Arsia Mons test case) to be at the same level as
\ttt{namelist.input}.

\begin{finger}
\item Even though the executable is named
e.g. \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, this program
is not related to the specific \ttt{makemeso}
parameters -- contrary to the \ttt{wrf.exe} executable.
%
We just found that renaming the (possibly identical,
if the model sources were not modified)
\ttt{real.exe} was a practical way to avoid confusion
between executables compiled at different times.
\end{finger}

\mk
\section{Running the preprocessing utilities}

\mk
When you run a simulation with \ttt{wrf.exe},
the program attempts to read the initial state
from the files
\ttt{wrfinput\_d01},
\ttt{wrfinput\_d02}, \ldots
(one file per domain)
and the parent domain boundary conditions
from \ttt{wrfbdy\_d01}.
%
The whole chain of data conversion and
interpolation needed to generate those
files is summarized in the diagram on the next
page.
%
Three distinct preprocessing steps are
necessary to generate the final files.
%
As described in the previous chapter,
some modifications in the \ttt{namelist.input} file
[e.g. start/end dates, labelled with \ttt{(p1)}]
require a complete reprocessing from step $1$ to step $3$
to successfully launch the simulation,
whereas other changes
[e.g. model top, labelled with \ttt{(p3)}]
only require a quick reprocessing at step $3$, keeping
the files generated at the end of step $2$
unchanged.

\mk
\subsection{Input data}

\mk
\subsubsection{Static data}

\mk
All the static data
(topography, thermal inertia, albedo)
needed to initialize the model
are included in the \ttt{\$LMDMOD/LMD\_MM\_MARS/WPS\_GEOG} directory.
%
By default, only coarse-resolution datasets\footnote{
%%%
Corresponding to the fields stored in the
file \ttt{surface.nc} known by LMD-MGCM users:
\url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc}
%%%
} are available, but the directory also contains sources and scripts
to install finer-resolution datasets:
\begin{citemize}
\item 32 and/or 64 pixel-per-degree (ppd) MOLA topography [\textit{Smith et al.}, 2001]\nocite{Smit:01mola},
\item 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo [\textit{Christensen et al.}, 2001]\nocite{Chri:01},
\item 20 ppd TES thermal inertia [\textit{Putzig and Mellon}, 2007]\nocite{Putz:07}.
\end{citemize}
\pagebreak
\includepdf[pages=1,offset=25mm -20mm]{diagramme.pdf}

\mk
\marge The role of the \ttt{build\_static} script is to
automatically download these datasets from the web
(namely PDS archives) and convert them to a
format acceptable for later use by the
preprocessing utilities:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./build_static
\end{verbatim}
%
\begin{finger}
\item Please install the \ttt{octave}
free software\footnote{
%%%
Available at \url{http://www.gnu.org/software/octave}
%%%
} on your system to be able to use the
\ttt{build\_static} script.
%
Another solution is to browse each of the
directories contained within \ttt{WPS\_GEOG}, download the
data with the shell scripts and execute the \ttt{.m} scripts with either
\ttt{octave} or the commercial software \ttt{matlab}
(just replace \ttt{\#} with \ttt{\%}).
%
\item If you do not manage to execute the \ttt{build\_static} script,
converted ready-to-use datafiles are available upon request.
%
\item Building the MOLA 64ppd topographical
database can be quite long, so this process is
not performed by default by the \ttt{build\_static} script.
If you would like to build this database,
please remove the \ttt{exit} command in the script, just above
the commands related to the MOLA 64ppd.
%
\item The resulting \ttt{WPS\_GEOG} directory can reach a size
of several hundred MB.
%
You might wish to move such a folder to a place
with more disk space available, but then be
sure to create in \ttt{\$LMDMOD/LMD\_MM\_MARS}
a link to the new location
of the directory.
\end{finger}

\mk
\subsubsection{Meteorological data}

\mk
The preprocessing tools generate initial and boundary conditions
from the \ttt{diagfi.nc} outputs of LMD-MGCM simulations.
%
If you would like to run a mesoscale simulation at a given
season, you first need to run a GCM simulation and output
the meteorological fields at the considered season.
%
For optimal forcing at the boundaries, we advise you
to write the meteorological fields to the
\ttt{diagfi.nc} file at least every two hours.
%
Please also make sure that the following fields
are stored in the NETCDF \ttt{diagfi.nc} file:

\footnotesize
\codesource{contents_diagfi}

\normalsize
\begin{finger}
\item If the fields
\ttt{emis},
\ttt{co2ice},
\ttt{q01},
\ttt{q02},
\ttt{tsoil}
are missing in the \ttt{diagfi.nc} file,
they are replaced by the respective default
values $0.95$, $0$, $0$, $0$ and \ttt{tsurf}.
\end{finger}

\mk
\marge An example of an input meteorological
\ttt{diagfi.nc} file can be downloaded
from \url{http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz}.
%
Please deflate the archive and place the \ttt{diagfi.nc} file
in \ttt{\$LMDMOD/TMPDIR/GCMINI}, as shown below.
%
Such a file can then be used to define the initial
and boundary conditions, and we will now go
through the three preprocessing steps.
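
\mk
\marge Assuming \ttt{wget} is available on your system, the example file can
be retrieved and put in place with the following commands:
\begin{verbatim}
cd $LMDMOD/TMPDIR
mkdir -p GCMINI           ## in case the directory does not exist yet
cd GCMINI
wget http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz
tar xzvf diagfi.nc.tar.gz
\end{verbatim}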

\mk
\subsection{Preprocessing steps}

\mk
\subsubsection{Step 1: Converting GCM data}

\mk
The programs in the \ttt{PREP\_MARS} directory
convert the data from the NETCDF \ttt{diagfi.nc}
file into separate binary datafiles for each
date contained in \ttt{diagfi.nc}, according to
the formatting needed by the
preprocessing programs at step 2.
%
These programs can be executed with the following
commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/PREP_MARS
echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
./readmeteo.exe < readmeteo.def
\end{verbatim}
%
\marge If everything went well with the conversion,
the directory \ttt{\$LMDMOD/TMPDIR/WPSFEED}
should contain files whose names begin with \ttt{LMD:}.
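
\mk
\marge A quick way to check that the conversion succeeded (the exact file
names include the dates extracted from \ttt{diagfi.nc}):
\begin{verbatim}
ls $LMDMOD/TMPDIR/WPSFEED/LMD:*
\end{verbatim}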

\mk
\subsubsection{Step 2: Interpolation on the regional domain}

\mk
In the \ttt{WPS} directory, the \ttt{geogrid.exe} program allows
you to define the mesoscale simulation domain,
to horizontally interpolate the topography,
thermal inertia and albedo fields at the domain
resolution, and to calculate useful fields
such as topographical slopes.%\pagebreak

\mk
\marge Please execute the commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
ln -sf ../../TESTCASE/namelist.wps .   # test case
./geogrid.exe
\end{verbatim}
%
\marge The result of \ttt{geogrid.exe}
-- and thus the definition of the mesoscale
domain -- can be checked in the NETCDF
file \ttt{geo\_em.d01.nc}.
%
A quick check can be performed using the command line
\begin{verbatim}
ncview geo_em.d01.nc
\end{verbatim}
\marge if \ttt{ncview} is installed, or the \ttt{IDL}
script \ttt{out\_geo.pro}
\begin{verbatim}
idl
IDL> out_geo, field1='TOPO'
IDL> out_geo, field1='TI'
IDL> SPAWN, 'ghostview geo_em.d01_HGT_M.ps &'
IDL> SPAWN, 'ghostview geo_em.d01_THERMAL_INERTIA.ps &'
IDL> exit
\end{verbatim}
\marge if the demo version of \ttt{IDL} is installed.
%
Of course, if your favorite graphical tool supports
the NETCDF standard, you might use it to check the
domain definition in \ttt{geo\_em.d01.nc}.

\mk
\marge If you are unhappy with the results or
you want to change
the location of the mesoscale domain on the planet,
the horizontal resolution,
the number of grid points, \ldots,
please modify the parameter
file \ttt{namelist.wps} and run \ttt{geogrid.exe} again.
%
Here are the contents of \ttt{namelist.wps}:
%
\codesource{namelist.wps_TEST}

\begin{finger}
%
\item No input meteorological data
are actually needed to execute \ttt{geogrid.exe}.
%
\item More details about the database and
more interpolation options can be
found in the file \ttt{geogrid/GEOGRID.TBL}.
%
\item Defining several domains yields
distinct files
\ttt{geo\_em.d01.nc},
\ttt{geo\_em.d02.nc},
\ttt{geo\_em.d03.nc}\ldots
\end{finger}

\mk
\marge Once the \ttt{geo\_em} file(s) are generated,
the \ttt{metgrid.exe} program performs
a horizontal interpolation
of the meteorological fields to the mesoscale
domain, similar to the one performed by \ttt{geogrid.exe}
for the surface data.
%
Then the program writes the results in
\ttt{met\_em} files and also collects
the static fields and domain parameters
included in the \ttt{geo\_em} file(s).
%
Please type the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
./metgrid.exe
\end{verbatim}
%
\marge If everything went well,
the directory \ttt{\$LMDMOD/TMPDIR/WRFFEED}
should contain the \ttt{met\_em.*} files.

\mk
\subsubsection{Step 3: Vertical interpolation on mesoscale levels}

\mk
\marge The last step is to execute \ttt{real.exe}
to perform the interpolation from the vertical
levels of the GCM to the vertical levels
defined in the mesoscale model.
%
This program also prepares the final initial
state for the simulation in files called
\ttt{wrfinput} and the boundary conditions
in files called \ttt{wrfbdy}.

\mk
\marge To successfully execute \ttt{real.exe},
you need the \ttt{met\_em.*} files
and the \ttt{namelist.input} file
to be in the same directory as \ttt{real.exe}.
%
Parameters in \ttt{namelist.input}
controlling the behavior of the vertical interpolation
are those labelled with \ttt{(p3)} in the detailed
list introduced in the previous chapter.

\mk
\marge Please type the following commands
to prepare files for the Arsia Mons test case
(or your personal test case if you changed
the parameters in \ttt{namelist.wps}):
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/TESTCASE
ln -sf $LMDMOD/TMPDIR/WRFFEED/met_em* .
./real.exe
\end{verbatim}

\mk
\marge The final message of \ttt{real.exe}
should report a successful completion, and you
are now ready to launch the integrations
of the LMD Martian Mesoscale Model again
with the \ttt{wrf.exe} command as in section
\ref{sc:arsia}.
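
\mk
\marge Before launching \ttt{wrf.exe}, you may check that the initial and
boundary condition files were indeed generated (one \ttt{wrfinput} file per
domain, following the naming convention recalled at the beginning of this
section):
\begin{verbatim}
ls -l wrfinput_d01 wrfbdy_d01
\end{verbatim}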

\begin{finger}
\item When you modify either
\ttt{namelist.wps} or \ttt{namelist.input},
make sure that the common parameters
are exactly the same in both files
(especially when running nested simulations),
otherwise either \ttt{real.exe} or \ttt{wrf.exe}
will exit with an error message.
\end{finger}
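
\mk
\marge A simple way to compare the common parameters in both files is, for
instance (a sketch assuming the standard WRF parameter names \ttt{e\_we},
\ttt{e\_sn}, \ttt{dx} and \ttt{dy}, which appear in both files):
\begin{verbatim}
grep -E 'e_we|e_sn|dx|dy' namelist.input namelist.wps
\end{verbatim}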
%\pagebreak


\chapter{Starting simulations from scratch}

\mk
\section{Running your own GCM simulations}

\begin{remarque}
To be completed
\end{remarque}

\mk
\section{Complete simulations with \ttt{runmeso}}

\begin{remarque}
To be completed
\end{remarque}


\chapter{Outputs}

\mk
\section{Postprocessing utilities and graphics}

\begin{remarque}
To be completed. Do-it-all \ttt{idl} scripts
will be described here!
\end{remarque}

\mk
\section{Modifying the outputs}

\begin{remarque}
To be completed.
Though the method is different,
we kept all the convenient aspects of \ttt{writediagfi}.
\end{remarque}

\chapter{Frequently Asked Questions}


\begin{finger}
\item Which timestep should I choose to avoid crashes of the model?
\item In Martian simulations, why can't I define boundaries every 6 hours as on Earth?
\item Help! I get strange assembler errors or ILM errors while compiling!
\item Is it possible to run the model on a specific configuration that is not supported?
\item Why do I have to define four fewer rows in the parent domain
when performing nested runs?
\item I am kind of nostalgic for early/middle Mars. How could I run
mesoscale simulations at low/high obliquity?
\item Why does \ttt{real.exe} crash when the model top pressure is
lower than $2$~Pa?
\item Can I use two-way nesting?
\end{finger}

\begin{remarque}
To be completed.
\end{remarque}