% source: trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual_txt.tex @ r219
% last changed in r219 by aslmd (MESOSCALE: user manual. added chapter about simulation parameters.)


\mk
\chapter{Preprocessing utilities}

\mk
In the previous chapter, we described the simulation settings
in the \ttt{namelist.input} file.
%
We saw that any modification of the parameters
labelled with \ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)}
requires the initial and boundary conditions
and/or the domain definition to be recomputed prior to running the model again.
%
As a result, you were probably unable to change many of the parameters
of the Arsia Mons test case (proposed in section \ref{sc:arsia}) in which
the initial and boundary conditions -- as well as the simulation
domain -- were predefined.

\mk
\marge In this chapter, we describe the installation and use of the preprocessing tools to
define the simulation domain, calculate an initial atmospheric state
and prepare the boundary conditions for the chosen simulation time.
%
This necessary step will allow you to run your own simulations at the specific season and region
you are interested in, with complete freedom to modify any of the parameters in \ttt{namelist.input}.

\mk
\section{Installing the preprocessing utilities}

\mk
First and foremost, since the preprocessing utilities can generate
(or involve) files of significant size, it is necessary
to define a directory where these files will be stored.
%
Such a directory (e.g. \ttt{/bigdisk/user}) must be linked as follows:
%
\begin{verbatim}
ln -sf /bigdisk/user $LMDMOD/TMPDIR
\end{verbatim}

\mk
\marge A second prerequisite to the installation of the preprocessing tools is that the LMD Martian
Mesoscale Model has been compiled at least once.
%
If this is not the case, please compile
the model with the \ttt{makemeso} command
(see section \ref{sc:makemeso}).

\mk
\marge The compilation process created an
installation directory adapted to your
particular choice of compiler$+$machine.
%
The preprocessing tools will also
be installed in this directory.
%
Please type the following commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/g95_32_single/   ## or any install directory
ln -sf ../prepare_ini .
./prepare_ini
\end{verbatim}

\mk
\marge The \ttt{prepare\_ini} script plays the same role for the preprocessing tools
as \ttt{copy\_model} does for the model sources:
files are simply linked to their actual location in the \ttt{SRC} folder.
%
Once you have executed \ttt{prepare\_ini}, please check that
two folders were generated: \ttt{PREP\_MARS} and \ttt{WPS}.

\mk
\marge In the \ttt{PREP\_MARS} directory, please compile
the programs \ttt{create\_readmeteo.exe} and \ttt{readmeteo.exe},
using the compiler mentioned in the name of the current
installation directory:
%
\begin{verbatim}
echo $PWD
cd PREP_MARS/
./compile [or] ./compile_g95
ls -lt create_readmeteo.exe readmeteo.exe
cd ..
\end{verbatim}

\mk
\marge In the \ttt{WPS} directory, please compile
the programs \ttt{geogrid.exe} and \ttt{metgrid.exe}:
\begin{verbatim}
cd WPS/
./configure   ## select your compiler + 'NO GRIB2' option
./compile
ls -lt geogrid.exe metgrid.exe
\end{verbatim}

\mk
\marge Apart from the executables you just compiled,
the preprocessing utilities include \ttt{real.exe},
which was compiled by the \ttt{makemeso} script
along with the mesoscale model executable \ttt{wrf.exe}.
%
\ttt{real.exe} should be copied or linked into the
simulation directory (e.g. \ttt{TESTCASE} for the
Arsia Mons test case) so that it lies at the same level as
\ttt{namelist.input}.

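\marge For instance, assuming the \ttt{g95\_32\_single} installation
directory and the Arsia Mons test case (please adapt the paths and the
executable name to your own setup), the link can be created as follows:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/TESTCASE
ln -sf ../g95_32_single/real_x61_y61_z61_d1_t1_p1.exe real.exe
\end{verbatim}
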
\begin{finger}
\item Even though the executable bears a name such as
\ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, this program
does not depend on the specific \ttt{makemeso}
parameters -- contrary to the \ttt{wrf.exe} executable.
%
Renaming \ttt{real.exe} in this way (the binaries may well be identical
if the model sources were not modified) is simply a practical way
to avoid confusion between executables compiled at different times.
\end{finger}

\mk
\section{Running the preprocessing utilities}

\mk
When you run a simulation with \ttt{wrf.exe},
the program attempts to read the initial state
in the files
\ttt{wrfinput\_d01},
\ttt{wrfinput\_d02}, \ldots
(one file per domain)
and the parent domain boundary conditions
in \ttt{wrfbdy\_d01}.
%
The whole chain of data conversion and
interpolation needed to generate those
files is summarized in the diagram on the next
page.
%
Three distinct preprocessing steps are
necessary to generate the final files.
%
As described in the previous chapter,
some modifications of the \ttt{namelist.input} file
[e.g. start/end dates, labelled with \ttt{(p1)}]
require a complete reprocessing from step $1$ to step $3$
to successfully launch the simulation,
whereas other changes
[e.g. model top, labelled with \ttt{(p3)}]
only require a quick reprocessing at step $3$, keeping
the files generated at the end of step $2$
unchanged.

\mk
\subsection{Input data}

\mk
\subsubsection{Static data}

\mk
All the static data
(topography, thermal inertia, albedo)
needed to initialize the model
are included in the \ttt{\$LMDMOD/LMD\_MM\_MARS/WPS\_GEOG} directory.
%
By default, only coarse-resolution datasets\footnote{
%%%
Corresponding to the fields stored in the
file \ttt{surface.nc} known by LMD-MGCM users:
\url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc}
%%%
} are available, but the directory also contains sources and scripts
to install finer-resolution datasets:
\begin{citemize}
\item 32 and/or 64 pixel-per-degree (ppd) MOLA topography [\textit{Smith et al.}, 2001]\nocite{Smit:01mola},
\item 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo [\textit{Christensen et al.}, 2001]\nocite{Chri:01},
\item 20 ppd TES thermal inertia [\textit{Putzig and Mellon}, 2007]\nocite{Putz:07}.
\end{citemize}
\pagebreak
\includepdf[pages=1,offset=25mm -20mm]{diagramme.pdf}

\mk
\marge The role of the \ttt{build\_static} script is to
automatically download these datasets from the web
(namely PDS archives) and convert them to a
format suitable for later use by the
preprocessing utilities:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./build_static
\end{verbatim}
%
\begin{finger}
\item Please install the free software \ttt{octave}\footnote{
%%%
Available at \url{http://www.gnu.org/software/octave}
%%%
} on your system to be able to use the
\ttt{build\_static} script.
%
Another solution is to browse each of the
directories contained within \ttt{WPS\_GEOG}, download the
data with the shell scripts and execute the \ttt{.m} scripts with either
\ttt{octave} or the commercial software \ttt{matlab}
(just replace \ttt{\#} by \ttt{\%}).
%
\item If you do not manage to execute the \ttt{build\_static} script,
converted ready-to-use datafiles are available upon request.
%
\item Building the MOLA 64 ppd topographical
database can take quite a long time, so this process is
not performed by default by the \ttt{build\_static} script.
If you would like to build this database,
please remove the \ttt{exit} command in the script, just above
the commands related to the MOLA 64 ppd data.
%
\item The resulting \ttt{WPS\_GEOG} directory can reach a size
of several hundred megabytes.
%
You may move this folder to a location
with more disk space available, but then be
sure to create in \ttt{\$LMDMOD/LMD\_MM\_MARS}
a link to the new location
of the directory.
\end{finger}
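
\mk
\marge The move described in the last remark can be performed
as follows, assuming \ttt{/bigdisk/user} is the location with
more disk space (please adapt the path to your own system):
%
\begin{verbatim}
mv $LMDMOD/LMD_MM_MARS/WPS_GEOG /bigdisk/user/WPS_GEOG
ln -sf /bigdisk/user/WPS_GEOG $LMDMOD/LMD_MM_MARS/WPS_GEOG
\end{verbatim}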

\mk
\subsubsection{Meteorological data}

\mk
The preprocessing tools generate initial and boundary conditions
from the \ttt{diagfi.nc} output files of LMD-MGCM simulations.
%
If you would like to run a mesoscale simulation at a given
season, you first need to run a GCM simulation and output
the meteorological fields at the considered season.
%
For optimal forcing at the boundaries, we advise you
to write the meteorological fields to the
\ttt{diagfi.nc} file at least every two hours.
%
Please also make sure that the following fields
are stored in the NETCDF \ttt{diagfi.nc} file:

\footnotesize
\codesource{contents_diagfi}

\normalsize
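
\mk
\marge If the \ttt{ncdump} utility from the NetCDF distribution
is installed on your system, a quick way to list the fields
actually stored in the file is:
%
\begin{verbatim}
ncdump -h diagfi.nc | grep float
\end{verbatim}
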
\begin{finger}
\item If the fields
\ttt{emis},
\ttt{co2ice},
\ttt{q01},
\ttt{q02},
\ttt{tsoil}
are missing in the \ttt{diagfi.nc} file,
they are replaced by the default values
$0.95$, $0$, $0$, $0$ and \ttt{tsurf}, respectively.
\end{finger}

\mk
\marge An example input meteorological file
\ttt{diagfi.nc} can be downloaded
at \url{http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz}.
%
Please extract the archive and copy the \ttt{diagfi.nc} file
into \ttt{\$LMDMOD/TMPDIR/GCMINI}.
%
This file can then be used to define the initial
and boundary conditions, and we will go
through the three preprocessing steps.
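
\mk
\marge Assuming \ttt{wget} is available on your system, these
operations can be carried out as follows:
%
\begin{verbatim}
cd $LMDMOD/TMPDIR/GCMINI
wget http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz
tar xzvf diagfi.nc.tar.gz
\end{verbatim}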

\mk
\subsubsection{Running your own GCM simulations}

\begin{remarque}
To be completed
\end{remarque}

\mk
\subsection{Preprocessing steps}

\mk
\subsubsection{Step 1: Converting GCM data}

\mk
The programs in the \ttt{PREP\_MARS} directory
convert the data from the NETCDF \ttt{diagfi.nc}
file into separate binary datafiles for each
date contained in \ttt{diagfi.nc}, according to
the formatting needed by the
preprocessing programs at step 2.
%
These programs can be executed with the following
commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/PREP_MARS
echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
./readmeteo.exe < readmeteo.def
\end{verbatim}
%
\marge If everything went well with the conversion,
the directory \ttt{\$LMDMOD/TMPDIR/WPSFEED}
should contain files whose names begin with \ttt{LMD:}.

\mk
\subsubsection{Step 2: Interpolation on the regional domain}

\mk
In the \ttt{WPS} directory, the \ttt{geogrid.exe} program allows
you to define the mesoscale simulation domain,
to horizontally interpolate the topography,
thermal inertia and albedo fields at the domain
resolution, and to calculate useful fields
such as topographical slopes.%\pagebreak

\mk
\marge Please execute the commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
ln -sf ../../TESTCASE/namelist.wps .   # test case
./geogrid.exe
\end{verbatim}
%
\marge The result of \ttt{geogrid.exe}
-- and thus the definition of the mesoscale
domain -- can be checked in the NETCDF
file \ttt{geo\_em.d01.nc}.
%
A quick check can be performed using the command line
\begin{verbatim}
ncview geo_em.d01.nc
\end{verbatim}
\marge if \ttt{ncview} is installed, or the \ttt{IDL}
script \ttt{out\_geo.pro}
\begin{verbatim}
idl
IDL> out_geo, field1='TOPO'
IDL> out_geo, field1='TI'
IDL> SPAWN, 'ghostview geo_em.d01_HGT_M.ps &'
IDL> SPAWN, 'ghostview geo_em.d01_THERMAL_INERTIA.ps &'
IDL> exit
\end{verbatim}
\marge if the demo version of \ttt{IDL} is installed.
%
Of course, if your favorite graphical tool supports
the NETCDF standard, you may use it to check the
domain definition in \ttt{geo\_em.d01.nc}.


\mk
\marge If you are unhappy with the results, or if you want to change
the location of the mesoscale domain on the planet,
the horizontal resolution,
the number of grid points, \ldots,
please modify the parameter
file \ttt{namelist.wps} and execute \ttt{geogrid.exe} again.
%
Here are the contents of \ttt{namelist.wps}:
%
\codesource{namelist.wps_TEST}

\begin{finger}
%
\item No input meteorological data
are actually needed to execute \ttt{geogrid.exe}.
%
\item More details about the database and
more interpolation options can be
found in the file \ttt{geogrid/GEOGRID.TBL}.
%
\item Defining several domains yields
distinct files
\ttt{geo\_em.d01.nc},
\ttt{geo\_em.d02.nc},
\ttt{geo\_em.d03.nc}\ldots
\end{finger}

\mk
\marge Once the \ttt{geo\_em} file(s) are generated,
the \ttt{metgrid.exe} program performs
a horizontal interpolation
of the meteorological fields to the mesoscale
domain, similar to the one performed by \ttt{geogrid.exe}
for the surface data.
%
The program then writes the results in
\ttt{met\_em} files and also collects
the static fields and domain parameters
included in the \ttt{geo\_em} file(s).
%
Please type the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
./metgrid.exe
\end{verbatim}
%
\marge If everything went well,
the directory \ttt{\$LMDMOD/TMPDIR/WRFFEED}
should contain the \ttt{met\_em.*} files.

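\mk
\marge If \ttt{ncview} is installed, the result of this
interpolation step can be quickly inspected, e.g.:
%
\begin{verbatim}
ls -lt $LMDMOD/TMPDIR/WRFFEED
ncview $LMDMOD/TMPDIR/WRFFEED/met_em.d01*
\end{verbatim}
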
\mk
\subsubsection{Step 3: Vertical interpolation on mesoscale levels}

\mk
\marge The last step is to execute \ttt{real.exe}
to perform the interpolation from the vertical
levels of the GCM to the vertical levels
defined in the mesoscale model.
%
This program also prepares the final initial
state for the simulation in files called
\ttt{wrfinput} and the boundary conditions
in files called \ttt{wrfbdy}.

\mk
\marge To successfully execute \ttt{real.exe},
you need the \ttt{met\_em.*} files
and the \ttt{namelist.input} file
to be in the same directory as \ttt{real.exe}.
%
The parameters in \ttt{namelist.input}
controlling the behavior of the vertical interpolation
are those labelled with \ttt{(p3)} in the detailed
list introduced in the previous chapter.

\mk
\marge Please type the following commands
to prepare files for the Arsia Mons test case
(or your personal test case if you changed
the parameters in \ttt{namelist.wps}):
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/TESTCASE
ln -sf $LMDMOD/TMPDIR/WRFFEED/met_em* .
./real.exe
\end{verbatim}

\mk
\marge The final message of \ttt{real.exe}
should confirm the success of the process; you
are now ready to launch the integrations
of the LMD Martian Mesoscale Model again
with the \ttt{wrf.exe} command, as in section
\ref{sc:arsia}.

\begin{finger}
\item When you modify either
\ttt{namelist.wps} or \ttt{namelist.input},
make sure that the common parameters
are exactly the same in both files
(especially when running nested simulations),
otherwise either \ttt{real.exe} or \ttt{wrf.exe}
will exit with an error message.
\end{finger}
%\pagebreak


\chapter{Starting simulations from scratch: a summary}

\mk
\section{Complete simulations with \ttt{runmeso}}

You will notice that the namelists need to be changed
according to the model compilation settings,
in order to minimize errors and help the user.

\begin{remarque}
To be completed
\end{remarque}


\chapter{Advanced use}

\section{Grid nesting}\label{nests}

\section{Tracers}

\section{New physics}


\chapter{Outputs}

\mk
\section{Postprocessing utilities and graphics}\label{postproc}

\begin{remarque}
To be completed. Do-it-all \ttt{idl} scripts
would be described here!
\end{remarque}

\mk
\section{Modifying the outputs}

\begin{remarque}
To be completed.
Though the method is different,
we kept all the convenient aspects of \ttt{writediagfi}.
\end{remarque}

\chapter{Frequently Asked Questions}


\begin{finger}
\item Which timestep should I choose to avoid crashes of the model?
\item In the Martian simulations, why can't I define boundaries every 6 hours as on Earth?
\item Help! I get strange assembler errors or ILM errors while compiling!
\item Is it possible to run the model on a specific configuration that is not supported?
\item Why do I have to define four fewer rows in the parent domain
when performing nested runs?
\item I am kind of nostalgic for early/middle Mars. How could I run
mesoscale simulations at low/high obliquity?
\item Why does \ttt{real.exe} crash when the model top pressure is
lower than $2$~Pa?
\item Can I use two-way nesting?
\end{finger}

\begin{remarque}
To be completed.
\end{remarque}