\chapter{Compiling the model and running a test case}

\vk
This chapter is also meant for first-time users of the LMD Martian Mesoscale Model. We describe how to compile the program and run a test case.

\mk
\section{Main compilation step}
\label{sc:makemeso}

\mk
In order to compile the model, execute the \ttt{makemeso} compilation script
in the \ttt{LMD\_MM\_MARS}\linebreak directory
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./makemeso
\end{verbatim}
%
\marge and answer the questions about
\begin{asparaenum}[1.]%[\itshape Q1\upshape)]
\item compiler choice (and number of processors if using MPI)
\item number of grid points in longitude [61]
\item number of grid points in latitude [61]
\item number of vertical levels [61]
\item number of tracers [1]
\item number of domains [1]
\end{asparaenum}

%\mk
\begin{finger}
\item The first time you compile the model, you will probably wonder what to reply
to questions $2$ to $6$ \ldots type the answers given in brackets to compile an executable suitable
for the test case given below.
\item Suppose you compiled a version of the model for a given set of parameters $1$ to $6$
in order to run a specific simulation.
If you would like to run another simulation
with at least one of parameters $1$ to $6$
changed, the model needs to be recompiled\footnote{This
recompilation each time the number of grid points,
tracers or domains is modified is imposed by the LMD physics code.
The WRF dynamical core alone is much more flexible.} with \ttt{makemeso}.
\item When you use parallel computations, please bear in mind that with
$2$ (resp. $4$, $6$, $8$, $16$) processors the whole domain is split
into $2$ (resp. $2$, $3$, $4$, $4$) tiles over
the latitude direction and $1$ (resp. $2$, $2$, $2$, $4$) tile(s) over the longitude direction.
Thus make sure that the number of grid points minus $1$ in each direction
can be divided by the aforementioned number of tiles over the considered
direction (e.g. with the default $61$-point grid, $61 - 1 = 60$ is divisible
by $2$, $3$ and $4$, so all the processor counts listed above are acceptable).
\item If you use grid nesting, note that no more than $4$ processors can be used.
\end{finger}

\mk
\marge The \ttt{makemeso} script is automated and performs
the following series of tasks:
%It is useful to detail and comment the tasks performed by the \ttt{makemeso} script:
\begin{citemize}
\item determine whether the machine is 32 or 64 bits;
\item ask the user about the compilation settings;
\item create a corresponding directory \ttt{\$LMDMOD/LMD\_MM\_MARS/DIRCOMP};
\begin{finger}
\item For example, a \ttt{DIRCOMP} directory named \ttt{g95\_32\_single}
is created if the user requested
a \ttt{g95} compilation of the code for single-domain simulations
on a 32-bit machine.
\end{finger}
\item generate with \ttt{copy\_model} a directory \ttt{DIRCOMP/WRFV2} containing links to the \ttt{SRC/WRFV2} sources;
\begin{finger}
\item This method ensures that any change to the model sources is
propagated to all the different \ttt{DIRCOMP} installation folders.
\end{finger}
\item execute the WRF \ttt{configure} script with the correct option;
\item tweak the resulting \ttt{configure.wrf} file to include a link towards the Martian physics;
\item calculate the total number of horizontal grid points handled by the LMD physics;
\item duplicate the LMD physical sources if nesting is activated;
\begin{finger}
\item The model presently supports 3 nests, but more nests
can be included by adapting the following files:
\begin{verbatim}
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_inifis3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_physiq3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/duplicate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/generate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/makegcm*   ## search for 'nest'
\end{verbatim}%\pagebreak
\end{finger}
\item compile the LMD physical packages with the appropriate \ttt{makegcm} command
and collect the compiled objects in the library \ttt{liblmd.a};
\begin{finger}
\item During this step, which can be somewhat long,
especially if you defined more than one domain,
the \ttt{makemeso} script provides you with the full path to
the text file \ttt{log\_compile\_phys} in which you can check the
compilation progress and possible errors.
%
At the end of the process, you will find an
error message associated with the generation of the
final executable.
%
Please do not pay attention to this, as the compilation of the LMD
sources is meant to generate a library of
compiled objects called \ttt{liblmd.a} instead of a program.
\end{finger}
\item compile the modified Martian ARW-WRF solver, including
the \ttt{liblmd.a} library;
\begin{finger}
\item The first time the model is compiled, this
step can be quite long.
%
The \ttt{makemeso} script provides you with a \ttt{log\_compile}
text file where the progress of the compilation can be checked
and a \ttt{log\_error} text file listing errors and warnings
during compilation.
%
A list of warnings related to \ttt{grib}
utilities (not used in the Martian model)
may appear; these have no impact on the
final executables.
\item The compilation with \ttt{g95} might be unsuccessful
due to some problems with files related to terrestrial microphysics.
%
Please type the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/SRC
tar xzvf g95.tar.gz
cp -f g95/WRFV2_g95_fix/* WRFV2/phys/
cd $LMDMOD/LMD_MM_MARS
\end{verbatim}
\marge then recompile the model with the \ttt{makemeso} command.
\end{finger}
\item change the name of the executables in agreement with the
settings provided by the user.
\begin{finger}
\item If you choose to answer the \ttt{makemeso} questions using the
aforementioned parameters in brackets, you should have in the
\ttt{DIRCOMP} directory two executables:
\begin{verbatim}
real_x61_y61_z61_d1_t1_p1.exe
wrf_x61_y61_z61_d1_t1_p1.exe
\end{verbatim}
%
The directory also contains a text file
in which the answers to the questions are stored, which
allows you to re-run the script without the
``questions to the user'' step:
\begin{verbatim}
./makemeso < makemeso_x61_y61_z61_d1_t1_p1
\end{verbatim}
\end{finger}
\end{citemize}

\mk
\section{Running a simple test case}
\label{sc:arsia}

\mk
We suppose that you have successfully compiled
the model as described in the previous section
and that you used the answers in brackets
to the \ttt{makemeso} questions.

\mk
\marge In order to test the compiled executables,
a ready-to-use test case
(with pre-generated initial and boundary
conditions) is proposed
in the \ttt{LMD\_MM\_MARS\_TESTCASE.tar.gz}
archive you can download at
\url{http://www.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/LMD_MM_MARS_TESTCASE.tar.gz}.
%
This test case simulates the hydrostatic
atmospheric flow around Arsia Mons during half a sol
with constant thermal inertia, albedo
and dust opacity.

\begin{finger}
\item Though the simulation reproduces some reasonable
features of the mesoscale circulation around Arsia
Mons (e.g. slope winds), it should not be used
for scientific purposes, since the number of grid points
is insufficient for a single-domain simulation
and the integration time is shorter than the necessary spin-up time.
\end{finger}
%\pagebreak

\marge To launch the test simulation, please type
the following commands, replacing the
\ttt{g95\_32\_single} directory with its corresponding
value on your system:
%
\begin{verbatim}
cp LMD_MM_MARS_TESTCASE.tar.gz $LMDMOD/LMD_MM_MARS/
cd $LMDMOD/LMD_MM_MARS
tar xzvf LMD_MM_MARS_TESTCASE.tar.gz
cd TESTCASE
ln -sf ../g95_32_single/wrf_x61_y61_z61_d1_t1_p1.exe wrf.exe
nohup wrf.exe > log_wrf &
\end{verbatim}
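\marge While the model is running, its progress can be monitored by inspecting the log
file and the output files written in the \ttt{TESTCASE} directory (the \ttt{wrfout*}
naming below follows the usual ARW-WRF convention and is given as an indication only):
\begin{verbatim}
tail -f log_wrf       # follow the model log as the run proceeds
ls -lt wrfout_d01*    # output files (one per domain) growing during the run
\end{verbatim}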

%tar xzvf wrfinput.tar.gz

\begin{finger}
\item If you compiled the model using MPICH2,
the command to launch a simulation is slightly different:
%
\begin{verbatim}
[simulation on 2 processors on 1 machine]
mpd &                  # first-time only (or after a reboot)
                       # NB: may request the creation of a file .mpd.conf
mpirun -np 2 wrf.exe < /dev/null &   # NB: mpirun is only a link to mpiexec
tail -20 rsl.out.000?                # to check the outputs
\end{verbatim}
\begin{verbatim}
[simulation on 16 processors on 4 connected machines]
echo barry.lmd.jussieu.fr > ~/mpd.hosts
echo white.lmd.jussieu.fr >> ~/mpd.hosts
echo loves.lmd.jussieu.fr >> ~/mpd.hosts
echo tapas.lmd.jussieu.fr >> ~/mpd.hosts
ssh barry.lmd.jussieu.fr             # make sure that ssh to the other machines
                                     # is possible without authentication
mpdboot -f ~/mpd.hosts -n 4
mpdtrace
mpirun -l -np 16 wrf.exe < /dev/null &   # NB: mpirun is only a link to mpiexec
tail -20 rsl.out.00??                    # to check the outputs
\end{verbatim}
\end{finger}


\mk
\chapter{Setting the simulation parameters}

\mk
In this chapter, we describe how to set the various parameters
defining a given simulation.
%
As can be inferred from the contents of the \ttt{TESTCASE} directory,
two parameter files are needed to run the model:
\begin{enumerate}
\item The parameters related to the dynamical part of the model can be set
in the file \ttt{namelist.input} according to the ARW-WRF namelist formatting.
\item The parameters related to the physical part of the model can be set
in the file \ttt{callphys.def} according to the LMD-MGCM formatting.
\end{enumerate}

\mk
\section{Dynamical settings}

\mk
\ttt{namelist.input} controls the behavior of the dynamical core
in the LMD Martian Mesoscale Model.
%
Compared to the file ARW-WRF users are familiar with\footnote{
%%%
A description of this file can be found in \ttt{SRC/WRFV2/run/README.namelist}.
%%%
}, the \ttt{namelist.input} file in the LMD Martian Mesoscale Model
is much shorter.
%
The only mandatory parameters in this file
are information on time control\footnote{
%%%
More information on the adopted Martian calendar:
\url{http://www-mars.lmd.jussieu.fr/mars/time/solar_longitude.html}
%%%
} and domain definition.

\mk
\marge The minimal version of the \ttt{namelist.input}
file corresponds to standard simulations with the model.
%
It is however possible to modify optional parameters
if needed, as is the case in the \ttt{namelist.input}
associated with the Arsia Mons test case
(e.g. the parameter \ttt{non\_hydrostatic} is set to false
to assume hydrostatic equilibrium, whereas standard
simulations are non-hydrostatic).

\mk
\marge A detailed description of the \ttt{namelist.input} file is given below\footnote{
%%%
You may find the corresponding file in \ttt{SIMU/namelist.input\_full}.
%%%
}.
%
Comments on each of the parameters are provided,
with the following labels:
\begin{citemize}
\item \ttt{(*)} denotes parameters not to be modified,
\item \ttt{(r)} indicates parameters whose modification requires a recompilation of the model,
\item \ttt{(n)} describes parameters involved when nested domains are defined,
\item \ttt{(p1)}, \ttt{(p2)}, \ttt{(p3)} mention parameters whose modification requires a new processing
of initial and boundary conditions (see next chapter),
\item \ttt{(*d)} denotes dynamical parameters whose modification implies
non-standard simulations -- please read \ttt{SRC/WRFV2/run/README.namelist}
and use with caution.
\end{citemize}
%
If omitted, the optional parameters are set to the default
values indicated below.\pagebreak

\centers{\ttt{-- file: namelist.input\_full --}}\codesource{namelist.input_full}\centers{\ttt{-- end file: namelist.input\_full --}}

\begin{finger}
\item Please pay attention to rigorous syntax while
editing your personal \ttt{namelist.input} file
to avoid reading errors.
\item To modify the default values (or even add
personal parameters) in the \ttt{namelist.input} file,
edit the \ttt{SRC/WRFV2/Registry/Registry.EM} file.
%
You will then have to recompile the model with \ttt{makemeso};
answer \ttt{y} to the last question.
\end{finger}

\mk
\marge In case you run simulations with \ttt{max\_dom}
nested domains, you have to set \ttt{max\_dom} parameters
wherever there is a ``,'' in the above list.
%
Here is an example of the resulting syntax of the
\ttt{time\_control}, \ttt{domains} and \ttt{bdy\_control}
categories in \ttt{namelist.input}:
%
\codesource{OMG_namelist.input}

\section{Physical settings}

\mk
\ttt{callphys.def} controls the behavior of the physical parameterizations
in the LMD Martian\linebreak Mesoscale Model.
%
The organization of this file is exactly the same as
that of the corresponding file in the LMD Martian GCM, whose
user manual can be found at
\url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}.

\mk
\marge Please find below the contents of \ttt{callphys.def}:
%
\centers{\ttt{-- file: callphys.def --}}\codesource{callphys.def}\centers{\ttt{-- end file: callphys.def --}}

\mk
\begin{finger}
\item Note that in the given example
the convective adjustment,
the gravity wave parameterization,
and the NLTE schemes are turned off, as is
usually the case in typical Martian tropospheric
mesoscale simulations.
\item \ttt{iradia} sets the frequency
(in dynamical timesteps) at which
the radiative computations are performed.
\item Modifying \ttt{callphys.def} only requires recompiling
the model if the number of tracers is changed.
\item If you run a simulation with, say, $3$ domains,
please ensure that you defined three files
\ttt{callphys.def}, \ttt{callphys\_d2.def} and \ttt{callphys\_d3.def}
(see the sketch just below this list).
\end{finger}
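\marge If the nested domains are meant to use the same physical settings as the
parent domain, a simple way to create these additional files (given here as a
suggestion; adapt the settings per domain if needed) is:
\begin{verbatim}
cp callphys.def callphys_d2.def
cp callphys.def callphys_d3.def
\end{verbatim}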

\mk
\chapter{Preprocessing utilities}

\mk
In the previous chapter, we described the simulation settings
in the \ttt{namelist.input} file.
%
We saw that any modification of the parameters
labelled with \ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)}
requires the initial and boundary conditions
and/or the domain definition to be recomputed prior to running the model again.
%
As a result, you were probably unable to change many of the parameters
of the Arsia Mons test case (proposed in section \ref{sc:arsia}) in which
the initial and boundary conditions -- as well as the simulation
domain -- were predefined.

\mk
\marge In this chapter, we describe the installation and use of the preprocessing tools to
define the simulation domain, calculate an initial atmospheric state
and prepare the boundary conditions for the chosen simulation time.
%
This necessary step will eventually allow you to run your own simulations at the specific season and region
you are interested in, with complete freedom to modify any of the parameters in \ttt{namelist.input}.

\mk
\section{Installing the preprocessing utilities}

\mk
First and foremost, since the preprocessing utilities can generate
(or involve) files of quite significant size, it is necessary
to define a directory where these files will be stored.
%
Such a directory (e.g. \ttt{/bigdisk/user}) must be linked as follows
%
\begin{verbatim}
ln -sf /bigdisk/user $LMDMOD/TMPDIR
\end{verbatim}
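\marge The preprocessing steps described below refer to the subdirectories
\ttt{GCMINI}, \ttt{WPSFEED} and \ttt{WRFFEED} within \ttt{\$LMDMOD/TMPDIR}.
Should these subdirectories not be created automatically by the tools on your
system, they can simply be created by hand:
\begin{verbatim}
mkdir -p $LMDMOD/TMPDIR/GCMINI $LMDMOD/TMPDIR/WPSFEED $LMDMOD/TMPDIR/WRFFEED
\end{verbatim}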

\mk
\marge A second prerequisite to the installation of the preprocessing tools is that the LMD Martian
Mesoscale Model has been compiled at least once.
%
If this is not the case, please compile
the model with the \ttt{makemeso} command
(see section \ref{sc:makemeso}).

\mk
\marge The compilation process created an
installation directory adapted to your
particular choice of compiler$+$machine.
%
The preprocessing tools will also
be installed in this directory.
%
Please type the following commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/g95_32_single/    ## or any install directory
ln -sf ../prepare_ini .
./prepare_ini
\end{verbatim}

\mk
\marge The \ttt{prepare\_ini} script plays the same role for the preprocessing tools
as \ttt{copy\_model} does for the model sources:
files are simply linked to their actual location in the \ttt{SRC} folder.
%
Once you have executed \ttt{prepare\_ini}, please check that
two folders were generated: \ttt{PREP\_MARS} and \ttt{WPS}.

\mk
\marge In the \ttt{PREP\_MARS} directory, please compile
the programs \ttt{create\_readmeteo.exe} and \ttt{readmeteo.exe},
using the compiler mentioned in the name of the current
installation directory:
%
\begin{verbatim}
echo $PWD
cd PREP_MARS/
./compile [or] ./compile_g95
ls -lt create_readmeteo.exe readmeteo.exe
cd ..
\end{verbatim}

\mk
\marge In the \ttt{WPS} directory, please compile
the programs \ttt{geogrid.exe} and \ttt{metgrid.exe}:
\begin{verbatim}
cd WPS/
./configure    ## select your compiler + 'NO GRIB2' option
./compile
ls -lt geogrid.exe metgrid.exe
\end{verbatim}

\mk
\marge Apart from the executables you just compiled,
the preprocessing utilities include \ttt{real.exe},
which was compiled by the \ttt{makemeso} script
along with the mesoscale model executable \ttt{wrf.exe}.
%
\ttt{real.exe} should be copied or linked into the
simulation directory (e.g. \ttt{TESTCASE} for the
Arsia Mons test case) so that it lies at the same level as
\ttt{namelist.input}.
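\marge For instance, for the Arsia Mons test case and the \ttt{g95\_32\_single}
installation used as an example above (adapt the directory names to your own
installation):
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/TESTCASE
ln -sf ../g95_32_single/real_x61_y61_z61_d1_t1_p1.exe real.exe
\end{verbatim}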

\begin{finger}
\item Even though the executable is named
e.g. \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, this program
does not depend on the specific \ttt{makemeso}
parameters -- contrary to the \ttt{wrf.exe} executable.
%
We simply found that renaming the (possibly identical,
if the model sources were not modified)
\ttt{real.exe} was a practical way to avoid confusion
between executables compiled at different times.
\end{finger}

\mk
\section{Running the preprocessing utilities}

\mk
When you run a simulation with \ttt{wrf.exe},
the program attempts to read the initial state
from the files
\ttt{wrfinput\_d01},
\ttt{wrfinput\_d02}, \ldots
(one file per domain)
and the parent domain boundary conditions
from \ttt{wrfbdy\_d01}.
%
The whole chain of data conversion and
interpolation needed to generate those
files is summarized in the diagram on the next
page.
%
Three distinct preprocessing steps are
necessary to generate the final files.
%
As described in the previous chapter,
some modifications in the \ttt{namelist.input} file
[e.g. start/end dates labelled with \ttt{(p1)}]
require a complete reprocessing from step $1$ to step $3$
to successfully launch the simulation,
whereas other changes
[e.g. model top labelled with \ttt{(p3)}]
only require a quick reprocessing at step $3$, keeping
the files generated at the end of step $2$
unchanged.

\mk
\subsection{Input data}

\mk
\subsubsection{Static data}

\mk
All the static data
(topography, thermal inertia, albedo)
needed to initialize the model
are included in the \ttt{\$LMDMOD/LMD\_MM\_MARS/WPS\_GEOG} directory.
%
By default, only coarse-resolution datasets\footnote{
%%%
Corresponding to the fields stored in the
file \ttt{surface.nc} known to LMD-MGCM users:
\url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc}
%%%
} are available, but the directory also contains sources and scripts
to install finer-resolution datasets:
\begin{citemize}
\item 32 and/or 64 pixel-per-degree (ppd) MOLA topography [\textit{Smith et al.}, 2001]\nocite{Smit:01mola},
\item 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo [\textit{Christensen et al.}, 2001]\nocite{Chri:01},
\item 20 ppd TES thermal inertia [\textit{Putzig and Mellon}, 2007]\nocite{Putz:07}.
\end{citemize}
\pagebreak
\includepdf[pages=1,offset=25mm -20mm]{diagramme.pdf}

\mk
\marge The role of the \ttt{build\_static} script is to
automatically download these datasets from the web
(namely PDS archives) and convert them to a
format acceptable to the
preprocessing utilities:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./build_static
\end{verbatim}
%
\begin{finger}
\item Please install the \ttt{octave}
free software\footnote{
%%%
Available at \url{http://www.gnu.org/software/octave}
%%%
} on your system to be able to use the
\ttt{build\_static} script.
%
Another solution is to browse each of the
directories contained within \ttt{WPS\_GEOG}, download the
data with the shell scripts and execute the \ttt{.m} scripts with either
\ttt{octave} or the commercial software \ttt{matlab}
(just replace \ttt{\#} by \ttt{\%}).
%
\item If you do not manage to execute the \ttt{build\_static} script,
converted ready-to-use datafiles are available upon request.
%
\item The building of the MOLA 64 ppd topographical
database can be quite long, so this process is
not performed by default by the \ttt{build\_static} script.
If you would like to build this database,
please remove the \ttt{exit} command in the script, just above
the commands related to the MOLA 64 ppd data.
%
\item The resulting \ttt{WPS\_GEOG} directory can reach a size
of several hundred megabytes.
%
You might want to move this folder to a place
with more disk space available, but then be
sure to create in \ttt{\$LMDMOD/LMD\_MM\_MARS}
a link to the new location
of the directory (as sketched just below this list).
\end{finger}
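\marge For instance, assuming the same large disk as in the installation step is used
(\ttt{/bigdisk/user} is only an example; adapt the path to your system):
\begin{verbatim}
mv $LMDMOD/LMD_MM_MARS/WPS_GEOG /bigdisk/user/WPS_GEOG
ln -sf /bigdisk/user/WPS_GEOG $LMDMOD/LMD_MM_MARS/WPS_GEOG
\end{verbatim}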

\mk
\subsubsection{Meteorological data}

\mk
The preprocessing tools generate initial and boundary conditions
from the \ttt{diagfi.nc} output files of LMD-MGCM simulations.
%
If you would like to run a mesoscale simulation at a given
season, you first need to run a GCM simulation and output
the meteorological fields at the considered season.
%
For optimal forcing at the boundaries, we advise you
to write the meteorological fields to the
\ttt{diagfi.nc} file at least every two hours.
%
Please also make sure that the following fields
are stored in the NETCDF \ttt{diagfi.nc} file:

\footnotesize
\codesource{contents_diagfi}

\normalsize
\begin{finger}
\item If the fields
\ttt{emis},
\ttt{co2ice},
\ttt{q01},
\ttt{q02},
\ttt{tsoil}
are missing from the \ttt{diagfi.nc} file,
they are replaced by the default
values $0.95$, $0$, $0$, $0$ and \ttt{tsurf}, respectively.
\end{finger}
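\marge A quick way to check which fields are present in your \ttt{diagfi.nc} file,
assuming the standard netCDF utilities are installed on your system, is:
\begin{verbatim}
ncdump -h diagfi.nc | grep float    # list the variables stored in the file
\end{verbatim}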

\mk
\marge An example input meteorological file
\ttt{diagfi.nc} can be downloaded
at \url{http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz}.
%
Please deflate the archive and copy the \ttt{diagfi.nc} file
into \ttt{\$LMDMOD/TMPDIR/GCMINI}.
%
Such a file can then be used to define the initial
and boundary conditions, and we will go
through the three preprocessing steps.
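\marge For instance, assuming \ttt{wget} is available on your system, downloading and
installing this example file may be done as follows:
\begin{verbatim}
cd $LMDMOD/TMPDIR/GCMINI
wget http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz
tar xzvf diagfi.nc.tar.gz
\end{verbatim}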

\mk
\subsection{Preprocessing steps}

\mk
\subsubsection{Step 1: Converting GCM data}

\mk
The programs in the \ttt{PREP\_MARS} directory
convert the data from the NETCDF \ttt{diagfi.nc}
file into separate binary datafiles for each
date contained in \ttt{diagfi.nc}, according to
the formatting needed by the
preprocessing programs at step 2.
%
These programs can be executed with the following
commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/PREP_MARS
echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
./readmeteo.exe < readmeteo.def
\end{verbatim}
%
\marge If everything went well with the conversion,
the directory \ttt{\$LMDMOD/TMPDIR/WPSFEED}
should contain files named \ttt{LMD:*}.
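\marge This can be checked with a simple listing:
\begin{verbatim}
ls $LMDMOD/TMPDIR/WPSFEED/
\end{verbatim}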

\mk
\subsubsection{Step 2: Interpolation on the regional domain}

\mk
In the \ttt{WPS} directory, the \ttt{geogrid.exe} program allows
you to define the mesoscale simulation domain,
to horizontally interpolate the topography,
thermal inertia and albedo fields at the domain
resolution, and to calculate useful fields
such as topographical slopes.%\pagebreak

\mk
\marge Please execute the commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
ln -sf ../../TESTCASE/namelist.wps .    # test case
./geogrid.exe
\end{verbatim}
%
\marge The result of \ttt{geogrid.exe}
-- and thus the definition of the mesoscale
domain -- can be checked in the NETCDF
file \ttt{geo\_em.d01.nc}.
%
A quick check can be performed using the command line
\begin{verbatim}
ncview geo_em.d01.nc
\end{verbatim}
\marge if \ttt{ncview} is installed, or the \ttt{IDL}
script \ttt{out\_geo.pro}
\begin{verbatim}
idl
IDL> out_geo, field1='TOPO'
IDL> out_geo, field1='TI'
IDL> SPAWN, 'ghostview geo_em.d01_HGT_M.ps &'
IDL> SPAWN, 'ghostview geo_em.d01_THERMAL_INERTIA.ps &'
IDL> exit
\end{verbatim}
\marge if the demo version of \ttt{IDL} is installed.
%
Of course, if your favorite graphical tool supports
the NETCDF standard, you might use it to check the
domain definition in \ttt{geo\_em.d01.nc}.

\mk
\marge If you are unhappy with the results or
you want to change
the location of the mesoscale domain on the planet,
the horizontal resolution,
the number of grid points \ldots,
please modify the parameter
file \ttt{namelist.wps} and execute \ttt{geogrid.exe} again.
%
Here are the contents of \ttt{namelist.wps}:
%
\codesource{namelist.wps_TEST}

\begin{finger}
%
\item No input meteorological data
are actually needed to execute \ttt{geogrid.exe}.
%
\item More details about the database and
more interpolation options can be
found in the file \ttt{geogrid/GEOGRID.TBL}.
%
\item Defining several domains yields
distinct files
\ttt{geo\_em.d01.nc},
\ttt{geo\_em.d02.nc},
\ttt{geo\_em.d03.nc}\ldots
\end{finger}

\mk
\marge Once the \ttt{geo\_em} file(s) are generated,
the \ttt{metgrid.exe} program performs
a horizontal interpolation
of the meteorological fields to the mesoscale
domain, similar to the one performed by \ttt{geogrid.exe}
for the surface data.
%
The program then writes the results in
\ttt{met\_em} files and also collects
the static fields and domain parameters
included in the \ttt{geo\_em} file(s).
%
Please type the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
./metgrid.exe
\end{verbatim}
%
\marge If everything went well,
the directory \ttt{\$LMDMOD/TMPDIR/WRFFEED}
should contain the \ttt{met\_em.*} files.

\mk
\subsubsection{Step 3: Vertical interpolation on mesoscale levels}

\mk
\marge The last step is to execute \ttt{real.exe}
to perform the interpolation from the vertical
levels of the GCM to the vertical levels
defined in the mesoscale model.
%
This program also prepares the final initial
state for the simulation in files called
\ttt{wrfinput} and the boundary conditions
in files called \ttt{wrfbdy}.

\mk
\marge To successfully execute \ttt{real.exe},
you need the \ttt{met\_em.*} files
and the \ttt{namelist.input} file
to be in the same directory as \ttt{real.exe}.
%
Parameters in \ttt{namelist.input}
controlling the behavior of the vertical interpolation
are those labelled with \ttt{(p3)} in the detailed
list introduced in the previous chapter.

\mk
\marge Please type the following commands
to prepare files for the Arsia Mons test case
(or your personal test case if you changed
the parameters in \ttt{namelist.wps}):
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/TESTCASE
ln -sf $LMDMOD/TMPDIR/WRFFEED/met_em* .
./real.exe
\end{verbatim}
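\marge Once \ttt{real.exe} has completed, the initial and boundary condition files
should be present in the simulation directory; a quick way to verify this is:
\begin{verbatim}
ls -lt wrfinput_d01 wrfbdy_d01
\end{verbatim}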

\mk
\marge The final message from \ttt{real.exe}
should confirm the success of the process, and you
are now ready to launch the integrations
of the LMD Martian Mesoscale Model again
with the \ttt{wrf.exe} command as in section
\ref{sc:arsia}.

\begin{finger}
\item When you modify either
\ttt{namelist.wps} or \ttt{namelist.input},
make sure that the common parameters
are exactly the same in both files
(especially when running nested simulations),
otherwise either \ttt{real.exe} or \ttt{wrf.exe}
will exit with an error message.
\end{finger}
%\pagebreak


\chapter{Starting simulations from scratch}

\mk
\section{Running your own GCM simulations}

\begin{remarque}
To be completed
\end{remarque}

\mk
\section{Complete simulations with \ttt{runmeso}}

\begin{remarque}
To be completed
\end{remarque}


\chapter{Outputs}

\mk
\section{Postprocessing utilities and graphics}

\begin{remarque}
To be completed. Do-it-all \ttt{idl} scripts
will be described here!
\end{remarque}

\mk
\section{Modifying the outputs}

\begin{remarque}
To be completed.
Though the method is different,
we kept all the convenient aspects of \ttt{writediagfi}.
\end{remarque}

\chapter{Frequently Asked Questions}


\begin{finger}
\item Which timestep should I choose to avoid crashes of the model?
\item In Martian simulations, why can't I define boundary conditions every 6 hours as on Earth?
\item Help! I get strange assembler errors or ILM errors while compiling!
\item Is it possible to run the model on a specific configuration that is not supported?
\item Why do I have to define four fewer rows in the parent domain
when performing nested runs?
\item I am kind of nostalgic for early/middle Mars. How could I run
mesoscale simulations at low/high obliquity?
\item Why does \ttt{real.exe} crash when the model top pressure is
lower than $2$~Pa?
\item Can I use two-way nesting?
\end{finger}

\begin{remarque}
To be completed.
\end{remarque}