
\chapter{Introducing the model}

\mk
\begin{finger}
\item Please first read the document ``Design and Performance of the
LMD Martian Mesoscale Model'' to learn what the model is, what kind of
results can be obtained, and how these results compare with available
data and independent simulations.
\end{finger}

\begin{remarque}
To be completed with a description
of the dynamics/physics driver.
\end{remarque}

\chapter{First steps toward running the model}

\mk
This chapter is meant for first-time users of the LMD Martian Mesoscale Model.
%
We describe how to install the model on your system, compile the program and run a test case.
%
Experience with either the terrestrial WRF mesoscale model or the LMD Martian
GCM is not absolutely required, although it would help you get through the
installation process more easily.

\mk
\section{Prerequisites}

\mk
\subsection{General requirements}

\mk
In order to install the LMD Martian Mesoscale Model, please ensure that:
\begin{citemize}
\item your computer is connected to the internet;
\item your OS is Linux\footnote{
%%%%%%%%%%%%%%
The model was also successfully compiled on Mac OS X;
``howto'' information is available upon request.
%%%%%%%%%%%%%%
} with a decent set of basic commands (\ttt{sed}, \ttt{awk}, \ldots);
\item your Fortran compiler is the PGI commercial compiler \ttt{pgf90} or the GNU
free compiler\footnote{
%%%%%%%%%%%%%%
Sources and binaries available at \url{http://www.g95.org}
%%%%%%%%%%%%%%
} \ttt{g95};
\item your C compiler is \ttt{gcc} and the C development libraries are installed;
\item \ttt{bash}, \ttt{m4} and \ttt{perl} are installed on your computer;
\item the \ttt{NETCDF} libraries have been compiled \emph{on your system}.
\end{citemize}
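\mk
\marge A quick way to verify most of these prerequisites (a minimal sketch,
assuming the tools are available in your \ttt{\$PATH}) is to query their
versions:
\begin{verbatim}
gcc --version
g95 --version    # or, for the PGI compiler: pgf90 -V
bash --version
m4 --version
perl -v
\end{verbatim}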
%
\begin{finger}
\item You might also find it useful -- though not mandatory -- to install on your system:
\begin{citemize}
\item the \ttt{ncview} utility\footnote{
%%%%%%
\url{http://meteora.ucsd.edu/~pierce/ncview\_home\_page.html}
%%%%%%
}, which is a nice tool to visualize the contents of a NETCDF file;
\item the \ttt{IDL} demo version\footnote{
%%%%%%
\url{http://www.ittvis.com/ProductServices/IDL.aspx}
%%%%%%
}, which is used by the plot utilities provided with the model.
\end{citemize}
\end{finger}

\mk
\marge Three environment variables associated with the \ttt{NETCDF} libraries must be defined:
\begin{verbatim}
declare -x NETCDF=/disk/user/netcdf
declare -x NCDFLIB=$NETCDF/lib
declare -x NCDFINC=$NETCDF/inc
\end{verbatim}

\begin{finger}
\item All command lines in this document are given in \ttt{bash}.
\end{finger}
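\mk
\marge To check that these variables point to a working installation
(a minimal sketch, assuming the standard library and include file names;
adapt the paths if your installation differs), you may type:
\begin{verbatim}
ls $NCDFLIB/libnetcdf*     # the compiled libraries should be listed
ls $NCDFINC/netcdf.inc     # the Fortran include file should be listed
\end{verbatim}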

%%[csh] setenv NETCDF /disk/user/netcdf
%%[csh] setenv NCDFLIB $NETCDF/lib
%%[csh] setenv NCDFINC $NETCDF/inc

\mk
\marge You also need the environment variable \ttt{\$LMDMOD} to point
to the directory where you will install the model (e.g. \ttt{/disk/user/MODELS}):
\begin{verbatim}
declare -x LMDMOD=/disk/user/MODELS
\end{verbatim}
%[csh] setenv LMDMOD /disk/user/MODELS
%
\begin{finger}
\item Please check that $\sim 200$~MB of free disk space is available
in \ttt{/disk}, as can be done with the command shown below.
\end{finger}
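\mk
\marge For instance:
\begin{verbatim}
df -h /disk     # check the available disk space
\end{verbatim}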

\mk
\subsection{Parallel computations}

\mk
\marge Parallel computations with the Message Passing Interface (MPI) standard are supported by
the ARW-WRF mesoscale model.
%
If you want to use this capability in the LMD Martian Mesoscale Model,
you need to install MPICH2 as an additional prerequisite.

\mk
\marge Please download the current stable version of the sources
(e.g. \ttt{mpich2-1.0.8.tar.gz}) from the MPICH2 website
\url{http://www.mcs.anl.gov/research/projects/mpich2}
and install the MPICH2 utilities with the following commands:
%
\begin{verbatim}
mkdir $LMDMOD/MPI
mv mpich2-1.0.8.tar.gz $LMDMOD/MPI
cd $LMDMOD/MPI
tar xzvf mpich2-1.0.8.tar.gz
cd mpich2-1.0.8
./configure --prefix=$PWD --with-device=ch3:nemesis > conf.log 2> conferr.log &
# please wait...
make > mk.log 2> mkerr.log &
declare -x WHERE_MPI=$LMDMOD/MPI/mpich2-1.0.8/bin
\end{verbatim}
%
\begin{finger}
\item Even if you add the \ttt{\$LMDMOD/MPI/mpich2-1.0.8/bin}
directory to your \ttt{\$PATH} variable, defining the environment
variable \ttt{\$WHERE\_MPI} is still required
to ensure a successful compilation of the model.
\end{finger}
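\mk
\marge A quick sanity check of the MPICH2 installation (a minimal sketch;
it assumes the \ttt{mpd} daemon can be started, which may first require
creating a \ttt{.mpd.conf} file, as mentioned in the test-case section
below) is to run a trivial command on two processes:
\begin{verbatim}
declare -x PATH=$WHERE_MPI:$PATH
mpd &                        # launch the MPD daemon
mpiexec -np 2 hostname       # should print the machine name twice
\end{verbatim}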

\mk
\subsection{Compiling the terrestrial WRF model}

\mk
The LMD Martian Mesoscale Model is based on the terrestrial NCEP/NCAR ARW-WRF Mesoscale Model.
%
As a first step towards the compilation of the Martian version, we advise you to check that the terrestrial
model compiles on your computer with either \ttt{g95} or \ttt{pgf90}.

\mk
\marge On the ARW-WRF website \url{http://www.mmm.ucar.edu/wrf/users/download/get\_source.html}, you can
freely download the model after a quick registration process (click on ``New users'').
%
Make sure to download version 2.2 of the WRF model and copy the
\ttt{WRFV2.2.TAR.gz} archive to the \ttt{\$LMDMOD} folder.

\mk
\marge Then please extract the model sources and configure the compilation process:
\begin{verbatim}
cd $LMDMOD
tar xzvf WRFV2.2.TAR.gz
cd WRFV2
./configure
\end{verbatim}

\mk
\marge The \ttt{configure} script analyzes your architecture
and proposes several possible compilation options.
%
Make sure to choose the ``single-threaded, no nesting''
option related to either \ttt{g95} (should be option $13$ on a $32$-bit Linux PC)
or \ttt{pgf90} (should be option $1$ on a $32$-bit Linux PC).

\mk
\marge The next step is to compile the WRF model, choosing the kind of
simulations you would like to run.
%
A simple and direct test consists in compiling
the idealized case of a 2D flow impinging on a small hill:
\begin{verbatim}
./compile em_hill2d_x > log_compile 2> log_error &
\end{verbatim}
%
\begin{finger}
\item In case you encounter problems compiling the ARW-WRF model,
please read the documentation on the website
\url{http://www.mmm.ucar.edu/wrf/users},
contact the WRF helpdesk or search the web for your error message.
\end{finger}%\pagebreak

\mk
\marge If the compilation was successful
(the file \ttt{log\_error} should be empty
or report only a few warnings), you should find
in the \ttt{main} folder two executables,
\ttt{ideal.exe} and \ttt{wrf.exe},
which allow you to run the test
simulation:
\begin{verbatim}
cd test/em_hill2d_x
./ideal.exe
./wrf.exe
\end{verbatim}
%
During the simulation, the time taken by the computer
to perform the integrations at each dynamical timestep
is displayed on standard output.
%
The simulation should end with the message \ttt{SUCCESS COMPLETE WRF}.
%
The model results are stored in a \ttt{wrfout} data file
you might like to browse with \ttt{NETCDF}-compliant software
such as \ttt{ncview}.
%
\begin{finger}
\item If you compiled the model with \ttt{g95}, \ttt{ideal.exe} will
probably complain about an error reading the namelist.
%
Please move the line \ttt{non\_hydrostatic} below the line \ttt{v\_sca\_adv\_order}
in the \ttt{namelist.input} file to solve the problem.
\end{finger}

\mk
\section{Compiling the Martian model}

\mk
\subsection{Extracting and preparing the sources}

\mk
To start the installation of the Martian mesoscale model,
download the archive \ttt{LMD\_MM\_MARS.tar.gz}
(click on \url{http://www.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/LMD_MM_MARS.tar.gz}
or use the \ttt{wget} command).
%
Copy the archive into the \ttt{\$LMDMOD} directory and extract the files:
\begin{verbatim}
cp LMD_MM_MARS.tar.gz $LMDMOD
cd $LMDMOD
tar xzvf LMD_MM_MARS.tar.gz
\end{verbatim}

\mk
\marge Execute the \ttt{prepare} script,
which performs the necessary preparatory tasks for you:
deflating the various compressed archives contained in \ttt{LMD\_MM\_MARS},
downloading the ARW-WRF sources from the web,
applying a (quite significant) ``Martian patch'' to these sources
and building the final structure of your \ttt{LMD\_MM\_MARS} directory:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./prepare
\end{verbatim}

\mk
\marge Please check the contents of the \ttt{LMD\_MM\_MARS} directory:
\begin{citemize}
\item seven \ttt{bash} scripts:
\ttt{build\_static},
\ttt{copy\_model},
\ttt{makemeso},
\ttt{prepare},
\ttt{prepare\_ini},\linebreak
\ttt{prepare\_post},
\ttt{save\_all};
\item the sources directory \ttt{SRC};
\item the static data directory \ttt{WPS\_GEOG};
\item the simulation utilities directory \ttt{SIMU};
\end{citemize}
%
\marge and check that the \ttt{LMD\_MM\_MARS/SRC} directory contains:
\begin{citemize}
\item the model main sources in \ttt{WRFV2};
\item the preprocessing sources in \ttt{WPS} and \ttt{PREP\_MARS};
\item the postprocessing sources in \ttt{ARWpost};
\item three \ttt{tar.gz} archives and two information text files. %\ttt{saved} and \ttt{datesave}
\end{citemize}

\mk
\subsection{Main compilation step}
\label{sc:makemeso}

\mk
In order to compile the model, execute the \ttt{makemeso} compilation script
in the \ttt{LMD\_MM\_MARS}\linebreak directory
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./makemeso
\end{verbatim}
%
\marge and answer the questions about
\begin{asparaenum}[1.]%[\itshape Q1\upshape)]
\item compiler choice (and number of processors if using MPI)
\item number of grid points in longitude [61]
\item number of grid points in latitude [61]
\item number of vertical levels [61]
\item number of tracers [1]
\item number of domains [1]
\end{asparaenum}

%\mk
\begin{finger}
\item The first time you compile the model, you will probably wonder what
to reply to questions $2$ to $6$; type the answers given in brackets to
compile an executable suitable for the test case given below.
\item Suppose you compiled a version of the model for a given set of parameters $1$ to $6$
to run a specific simulation.
If you would like to run another simulation
with at least one of parameters $1$ to $6$
changed, the model needs to be recompiled\footnote{This
necessary recompilation each time the number of grid points,
tracers or domains is modified is imposed by the LMD physics code.
The WRF dynamical core alone is much more flexible.} with \ttt{makemeso}.
\item When you use parallel computations, please bear in mind that with
$2$ (resp. $4$, $6$, $8$, $16$) processors the whole domain is split
into $2$ (resp. $2$, $3$, $4$, $4$) tiles in
the latitude direction and $1$ (resp. $2$, $2$, $2$, $4$) tile(s) in the longitude direction.
Thus make sure that the number of grid points minus $1$ in each direction
is divisible by the aforementioned number of tiles in the considered
direction.
For instance, the default $61 \times 61$ grid can be run on $8$ processors,
since $(61-1)/4 = 15$ and $(61-1)/2 = 30$.
\item If you use grid nesting, note that no more than $4$ processors can be used.
\end{finger}

\mk
\marge The \ttt{makemeso} script is an automated tool which performs
the following series of tasks:
\begin{citemize}
\item determine whether the machine is 32 or 64 bits;
\item ask the user about the compilation settings;
\item create a corresponding directory \ttt{\$LMDMOD/LMD\_MM\_MARS/DIRCOMP};
\begin{finger}
\item For example, a \ttt{DIRCOMP} directory named \ttt{g95\_32\_single}
is created if the user requested
a \ttt{g95} compilation of the code for single-domain simulations
on a 32-bit machine.
\end{finger}
\item generate with \ttt{copy\_model} a directory \ttt{DIRCOMP/WRFV2} containing links to the \ttt{SRC/WRFV2} sources;
\begin{finger}
\item This method ensures that any change to the model sources is
propagated to all the different \ttt{DIRCOMP} installation folders.
\end{finger}
\item execute the WRF \ttt{configure} script with the correct option;
\item tweak the resulting \ttt{configure.wrf} file to include a link towards the Martian physics;
\item calculate the total number of horizontal grid points handled by the LMD physics;
\item duplicate the LMD physical sources if nesting is activated;
\begin{finger}
\item The model presently supports 3 nests, but more nests
can be included by adapting the following files:
\begin{verbatim}
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_inifis3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_physiq3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/duplicate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/generate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/makegcm* ## search for 'nest'
\end{verbatim}%\pagebreak
\end{finger}
\item compile the LMD physical packages with the appropriate \ttt{makegcm} command
and collect the compiled objects in the library \ttt{liblmd.a};
\begin{finger}
\item During this step, which can be a bit long,
especially if you defined more than one domain,
the \ttt{makemeso} script provides you with the full path to
the text file \ttt{log\_compile\_phys}, in which you can check the
compilation progress and possible errors.
%
At the end of the process, you will find an
error message associated with the generation of the
final executable.
%
Please do not pay attention to it: the compilation of the LMD
sources is meant to generate a library of
compiled objects called \ttt{liblmd.a}, not a program.
\end{finger}
\item compile the modified Martian ARW-WRF solver, including
the \ttt{liblmd.a} library;
\begin{finger}
\item The first time the model is compiled, this
step can be quite long.
%
The \ttt{makemeso} script provides you with a \ttt{log\_compile}
text file, where the progress of the compilation can be checked,
and a \ttt{log\_error} text file listing errors and warnings
encountered during compilation.
%
A list of warnings related to \ttt{grib}
utilities (not used in the Martian model)
may appear and has no impact on the
final executables.
\item The compilation with \ttt{g95} might be unsuccessful
due to some problems with files related to terrestrial microphysics.
%
Please type the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/SRC
tar xzvf g95.tar.gz
cp -f g95/WRFV2_g95_fix/* WRFV2/phys/
cd $LMDMOD/LMD_MM_MARS
\end{verbatim}
\marge then recompile the model with the \ttt{makemeso} command.
\end{finger}
\item change the names of the executables in agreement with the
settings provided by the user.
\begin{finger}
\item If you answered the \ttt{makemeso} questions with the
aforementioned bracketed parameters, you should have in the
\ttt{DIRCOMP} directory two executables:
\begin{verbatim}
real_x61_y61_z61_d1_t1_p1.exe
wrf_x61_y61_z61_d1_t1_p1.exe
\end{verbatim}
%
The directory also contains a text file
in which the answers to the questions are stored, which
allows you to re-run the script without the
``questions to the user'' step:
\begin{verbatim}
./makemeso < makemeso_x61_y61_z61_d1_t1_p1
\end{verbatim}
\end{finger}
\end{citemize}

\mk
\section{Running a simple test case}
\label{sc:arsia}

\mk
We suppose that you have successfully compiled
the model as described in the previous section,
using the answers given in brackets
to the \ttt{makemeso} questions.

\mk
\marge In order to test the compiled executables,
a ready-to-use test case
(with pre-generated initial and boundary
conditions) is provided
in the \ttt{LMD\_MM\_MARS\_TESTCASE.tar.gz}
archive, which you can download at
\url{http://www.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/LMD_MM_MARS_TESTCASE.tar.gz}.
%
This test case simulates the hydrostatic
atmospheric flow around Arsia Mons during half a sol,
with constant thermal inertia, albedo
and dust opacity.

\begin{finger}
\item Though the simulation reproduces some reasonable
features of the mesoscale circulation around Arsia
Mons (e.g. slope winds), it should not be used
for scientific purposes: the number of grid points
is insufficient for a single-domain simulation
and the integration time is shorter than the necessary spin-up time.
\end{finger}
%\pagebreak

\marge To launch the test simulation, please type
the following commands, replacing the
\ttt{g95\_32\_single} directory with its corresponding
value on your system:
%
\begin{verbatim}
cp LMD_MM_MARS_TESTCASE.tar.gz $LMDMOD/LMD_MM_MARS/
cd $LMDMOD/LMD_MM_MARS
tar xzvf LMD_MM_MARS_TESTCASE.tar.gz
cd TESTCASE
ln -sf ../g95_32_single/wrf_x61_y61_z61_d1_t1_p1.exe wrf.exe
nohup ./wrf.exe > log_wrf &
\end{verbatim}

%tar xzvf wrfinput.tar.gz

\begin{finger}
\item If you compiled the model using MPICH2,
the commands to launch a simulation are slightly different:
%
\begin{verbatim}
[simulation on 2 processors on 1 machine]
mpd &    # first-time only (or after a reboot)
         # NB: may request the creation of a file .mpd.conf
mpirun -np 2 wrf.exe < /dev/null &    # NB: mpirun is only a link to mpiexec
tail -20 rsl.out.000?                 # to check the outputs
\end{verbatim}
\begin{verbatim}
[simulation on 16 processors on 4 connected machines]
echo barry.lmd.jussieu.fr > ~/mpd.hosts
echo white.lmd.jussieu.fr >> ~/mpd.hosts
echo loves.lmd.jussieu.fr >> ~/mpd.hosts
echo tapas.lmd.jussieu.fr >> ~/mpd.hosts
ssh barry.lmd.jussieu.fr   # make sure that ssh to the other machines
                           # is possible without authentication
mpdboot -f ~/mpd.hosts -n 4
mpdtrace
mpirun -l -np 16 wrf.exe < /dev/null &   # NB: mpirun is only a link to mpiexec
tail -20 rsl.out.00??                    # to check the outputs
\end{verbatim}
\end{finger}


\mk
\chapter{Setting the simulation parameters}

\mk
In this chapter, we describe how to set the various parameters
defining a given simulation.
%
As can be inferred from the contents of the \ttt{TESTCASE} directory,
two parameter files are needed to run the model:
\begin{enumerate}
\item The parameters related to the dynamical part of the model can be set
in the file \ttt{namelist.input} according to the ARW-WRF namelist formatting.
\item The parameters related to the physical part of the model can be set
in the file \ttt{callphys.def} according to the LMD-MGCM formatting.
\end{enumerate}

\mk
\section{Dynamical settings}

\mk
\ttt{namelist.input} controls the behavior of the dynamical core
in the LMD Martian Mesoscale Model.
%
Compared to the file ARW-WRF users are familiar with\footnote{
%%%
A description of this file can be found in \ttt{SRC/WRFV2/run/README.namelist}.
%%%
}, the \ttt{namelist.input} in the LMD Martian Mesoscale Model
is much shorter.
%
The only mandatory parameters in this file
are information on time control\footnote{
%%%
More information on the adopted Martian calendar:
\url{http://www-mars.lmd.jussieu.fr/mars/time/solar_longitude.html}
%%%
} and domain definition.

\mk
\marge The minimal version of the \ttt{namelist.input}
file corresponds to standard simulations with the model.
%
It is however possible to modify optional parameters
if needed, as is the case in the \ttt{namelist.input}
associated with the Arsia Mons test case
(e.g. the parameter \ttt{non\_hydrostatic} is set to false
to assume hydrostatic equilibrium, whereas standard
simulations are non-hydrostatic).

\mk
\marge A detailed description of the \ttt{namelist.input} file is given below\footnote{
%%%
You may find the corresponding file in \ttt{SIMU/namelist.input\_full}.
%%%
}.
%
Comments on each of the parameters are provided,
with the following labels:
\begin{citemize}
\item \ttt{(*)} denotes parameters not to be modified;
\item \ttt{(r)} indicates parameters whose modification implies recompiling the model;
\item \ttt{(n)} describes parameters involved when nested domains are defined;
\item \ttt{(p1)}, \ttt{(p2)}, \ttt{(p3)} mention parameters whose modification implies a new processing
of initial and boundary conditions (see next chapter);
\item \ttt{(*d)} denotes dynamical parameters whose modification implies
non-standard simulations -- please read \ttt{SRC/WRFV2/run/README.namelist}
and use with caution.
\end{citemize}
%
If omitted, the optional parameters are set to their default
values indicated below.\pagebreak

\centers{\ttt{-- file: namelist.input\_full --}}\codesource{namelist.input_full}\centers{\ttt{-- end file: namelist.input\_full --}}

\begin{finger}
\item Please pay attention to rigorous syntax while
editing your personal \ttt{namelist.input} file,
to avoid reading errors.
\item To modify the default values (or even add
personal parameters) in the \ttt{namelist.input} file,
edit the \ttt{SRC/WRFV2/Registry/Registry.EM} file.
%
You will then have to recompile the model with \ttt{makemeso};
answer \ttt{y} to the last question.
\end{finger}

\mk
\marge In case you run simulations with \ttt{max\_dom}
nested domains, you have to set \ttt{max\_dom} values
wherever there is a ``,'' in the list above.
%
Here is an example of the resulting syntax of the
\ttt{time\_control}, \ttt{domains} and \ttt{bdy\_control}
categories in \ttt{namelist.input}:
%
\codesource{OMG_namelist.input}

\section{Physical settings}

\mk
\ttt{callphys.def} controls the behavior of the physical parameterizations
in the LMD Martian\linebreak Mesoscale Model.
%
The organization of this file is identical
to that of the corresponding file in the LMD Martian GCM, whose
user manual can be found at
\url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}.

\mk
\marge Please find below the contents of \ttt{callphys.def}:
%
\centers{\ttt{-- file: callphys.def --}}\codesource{callphys.def}\centers{\ttt{-- end file: callphys.def --}}

\mk
\begin{finger}
\item Note that in the given example
the convective adjustment,
the gravity wave parameterization,
and the NLTE schemes are turned off, as is
usually the case in typical Martian tropospheric
mesoscale simulations.
\item \ttt{iradia} sets the frequency
(in dynamical timesteps) at which
the radiative computations are performed.
\item Modifying \ttt{callphys.def} implies
recompiling the model only if the number of tracers is changed.
\item If you run a simulation with, say, $3$ domains,
please ensure that you defined three files,
\ttt{callphys.def}, \ttt{callphys\_d2.def} and \ttt{callphys\_d3.def},
as sketched below.
\end{finger}
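\mk
\marge For instance, if the physical settings are meant to be the same
in all three domains (a minimal sketch; edit the copies afterwards if the
settings should differ between domains), you may simply duplicate
\ttt{callphys.def}:
\begin{verbatim}
cp callphys.def callphys_d2.def
cp callphys.def callphys_d3.def
\end{verbatim}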

\mk
\chapter{Preprocessing utilities}

\mk
In the previous chapter, we described the simulation settings
in the \ttt{namelist.input} file.
%
We saw that any modification of the parameters
labelled with \ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)}
implies that the initial and boundary conditions
and/or the domain definition have to be recomputed prior to running the model again.
%
As a result, you were probably unable to change many of the parameters
of the Arsia Mons test case (proposed in section \ref{sc:arsia}), in which
the initial and boundary conditions -- as well as the simulation
domain -- were predefined.

\mk
\marge In this chapter, we describe the installation and use of the preprocessing tools to
define the simulation domain, calculate an initial atmospheric state
and prepare the boundary conditions for the chosen simulation time.
%
This necessary step will eventually allow you to run your own simulations at the specific season and region
you are interested in, with complete freedom to modify any of the parameters in \ttt{namelist.input}.

\mk
\section{Installing the preprocessing utilities}

\mk
First and foremost, since the preprocessing utilities could generate
(or involve) files of quite significant sizes, it is necessary
to define a directory where these files will be stored.
%
Such a directory (e.g. \ttt{/bigdisk/user}) must be linked as follows:
%
\begin{verbatim}
ln -sf /bigdisk/user $LMDMOD/TMPDIR
\end{verbatim}

\mk
\marge A second prerequisite to the installation of the preprocessing tools is that the LMD Martian
Mesoscale Model has been compiled at least once.
%
If this is not the case, please compile
the model with the \ttt{makemeso} command
(see section \ref{sc:makemeso}).

\mk
\marge The compilation process created an
installation directory adapted to your
particular choice of compiler$+$machine.
%
The preprocessing tools will also
be installed in this directory.
%
Please type the following commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/g95_32_single/ ## or any install directory
ln -sf ../prepare_ini .
./prepare_ini
\end{verbatim}

\mk
\marge The script \ttt{prepare\_ini} plays for the preprocessing tools
a role equivalent to that of \ttt{copy\_model} for the model sources:
files are simply linked to their actual location in the \ttt{SRC} folder.
%
Once you have executed \ttt{prepare\_ini}, please check that
two folders were generated: \ttt{PREP\_MARS} and \ttt{WPS}.

\mk
\marge In the \ttt{PREP\_MARS} directory, please compile
the programs \ttt{create\_readmeteo.exe} and \ttt{readmeteo.exe},
using the compiler mentioned in the name of the current
installation directory:
%
\begin{verbatim}
echo $PWD
cd PREP_MARS/
./compile [or] ./compile_g95
ls -lt create_readmeteo.exe readmeteo.exe
cd ..
\end{verbatim}

\mk
\marge In the \ttt{WPS} directory, please compile
the programs \ttt{geogrid.exe} and \ttt{metgrid.exe}:
\begin{verbatim}
cd WPS/
./configure ## select your compiler + 'NO GRIB2' option
./compile
ls -lt geogrid.exe metgrid.exe
\end{verbatim}

\mk
\marge Apart from the executables you just compiled,
the preprocessing utilities include \ttt{real.exe},
which was compiled by the \ttt{makemeso} script
along with the mesoscale model executable \ttt{wrf.exe}.
%
\ttt{real.exe} should be copied or linked into the
simulation directory (e.g. \ttt{TESTCASE} for the
Arsia Mons test case), at the same level as
\ttt{namelist.input}, as sketched below.
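\mk
\marge For the test case, assuming the executables were compiled in the
\ttt{g95\_32\_single} installation directory (adapt the directory and
executable names to your own installation), this amounts to:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/TESTCASE
ln -sf ../g95_32_single/real_x61_y61_z61_d1_t1_p1.exe real.exe
\end{verbatim}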

\begin{finger}
\item Even though the name of the executable reads
e.g. \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, this program
is not related to the specific \ttt{makemeso}
parameters -- contrary to the \ttt{wrf.exe} executable.
%
We simply found that renaming the (possibly identical,
if the model sources were not modified)
\ttt{real.exe} was a practical way to avoid confusion
between executables compiled at different times.
\end{finger}

\mk
\section{Running the preprocessing utilities}

\mk
When you run a simulation with \ttt{wrf.exe},
the program attempts to read the initial state
from the files
\ttt{wrfinput\_d01},
\ttt{wrfinput\_d02}, \ldots
(one file per domain)
and the parent domain boundary conditions
from \ttt{wrfbdy\_d01}.
%
The whole chain of data conversion and
interpolation needed to generate those
files is summarized in the diagram on the next
page.
%
Three distinct preprocessing steps are
necessary to generate the final files.
%
As described in the previous section,
some modifications in the \ttt{namelist.input} file
[e.g. start/end dates, labelled with \ttt{(p1)}]
require a complete reprocessing from step $1$ to step $3$
to successfully launch the simulation,
whereas other changes
[e.g. model top, labelled with \ttt{(p3)}]
only require a quick reprocessing at step $3$, keeping
the files generated at the end of step $2$
unchanged.

\mk
\subsection{Input data}

\mk
\subsubsection{Static data}

\mk
All the static data
(topography, thermal inertia, albedo)
needed to initialize the model
are included in the \ttt{\$LMDMOD/LMD\_MM\_MARS/WPS\_GEOG} directory.
%
By default, only coarse-resolution datasets\footnote{
%%%
Corresponding to the fields stored in the
file \ttt{surface.nc} known to LMD-MGCM users:
\url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc}
%%%
} are available, but the directory also contains sources and scripts
to install finer-resolution datasets:
\begin{citemize}
\item 32 and/or 64 pixel-per-degree (ppd) MOLA topography [\textit{Smith et al.}, 2001]\nocite{Smit:01mola};
\item 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo [\textit{Christensen et al.}, 2001]\nocite{Chri:01};
\item 20 ppd TES thermal inertia [\textit{Putzig and Mellon}, 2007]\nocite{Putz:07}.
\end{citemize}
\pagebreak
\includepdf[pages=1,offset=25mm -20mm]{diagramme.pdf}

\mk
\marge The role of the \ttt{build\_static} script is to
automatically download these datasets from the web
(namely PDS archives) and convert them to a format
acceptable to the
preprocessing utilities:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./build_static
\end{verbatim}
%
\begin{finger}
\item Please install the \ttt{octave}
free software\footnote{
%%%
Available at \url{http://www.gnu.org/software/octave}
%%%
} on your system to be able to use the
\ttt{build\_static} script.
%
Another solution is to browse each of the
directories contained in \ttt{WPS\_GEOG}, download the
data with the shell scripts and execute the \ttt{.m} scripts with either
\ttt{octave} or the commercial software \ttt{matlab}
(just replace \ttt{\#} by \ttt{\%}).
%
\item If you do not manage to execute the \ttt{build\_static} script,
converted ready-to-use datafiles are available upon request.
%
\item Building the MOLA 64~ppd topographical
database can be quite long, so this process is
not performed by default by the \ttt{build\_static} script.
If you would like to build this database,
please remove the \ttt{exit} command in the script, just above
the commands related to the MOLA 64~ppd data.
%
\item The resulting \ttt{WPS\_GEOG} directory can reach a size
of several hundred MB.
%
You might move this folder to a place
with more disk space available, but then make
sure to create in \ttt{\$LMDMOD/LMD\_MM\_MARS}
a link to the new location
of the directory.
\end{finger}

\mk
\subsubsection{Meteorological data}

\mk
The preprocessing tools generate initial and boundary conditions
from the \ttt{diagfi.nc} output files of LMD-MGCM simulations.
%
If you would like to run a mesoscale simulation at a given
season, you first need to run a GCM simulation and output
the meteorological fields at the considered season.
%
For optimal forcing at the boundaries, we advise you
to write the meteorological fields to the
\ttt{diagfi.nc} file at least every two hours.
%
Please also make sure that the following fields
are stored in the NETCDF \ttt{diagfi.nc} file:

\footnotesize
\codesource{contents_diagfi}

\normalsize
\begin{finger}
\item If the fields
\ttt{emis},
\ttt{co2ice},
\ttt{q01},
\ttt{q02},
\ttt{tsoil}
are missing from the \ttt{diagfi.nc} file,
they are replaced by the respective default
values $0.95$, $0$, $0$, $0$, and \ttt{tsurf}.
\end{finger}

\mk
\marge An example of input meteorological file
\ttt{diagfi.nc} can be downloaded
at \url{http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz}.
%
Please deflate the archive and copy the \ttt{diagfi.nc} file
into \ttt{\$LMDMOD/TMPDIR/GCMINI}.
%
Such a file can then be used to define the initial
and boundary conditions, and we will go
through the three preprocessing steps.

\mk
\subsection{Preprocessing steps}

\mk
\subsubsection{Step 1: Converting GCM data}

\mk
The programs in the \ttt{PREP\_MARS} directory
convert the data from the NETCDF \ttt{diagfi.nc}
file into separate binary datafiles for each
date contained in \ttt{diagfi.nc}, according to
the formatting needed by the
preprocessing programs at step 2.
%
These programs can be executed with the following
commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/PREP_MARS
echo 1 | ./create_readmeteo.exe # drop the "echo 1 |" if you want control
./readmeteo.exe < readmeteo.def
\end{verbatim}
%
\marge If everything went well with the conversion,
the directory \ttt{\$LMDMOD/TMPDIR/WPSFEED}
should contain files named \ttt{LMD:}.
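\mk
\marge A quick check (the file names include the corresponding dates,
hence the wildcard) is, for instance:
\begin{verbatim}
ls $LMDMOD/TMPDIR/WPSFEED/LMD:*
\end{verbatim}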

\mk
\subsubsection{Step 2: Interpolation on the regional domain}

\mk
In the \ttt{WPS} directory, the \ttt{geogrid.exe} program allows
you to define the mesoscale simulation domain,
to horizontally interpolate the topography,
thermal inertia and albedo fields at the domain
resolution, and to calculate useful fields
such as topographical slopes.%\pagebreak

\mk
\marge Please execute the commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
ln -sf ../../TESTCASE/namelist.wps . # test case
./geogrid.exe
\end{verbatim}
%
\marge The result of \ttt{geogrid.exe}
-- and thus the definition of the mesoscale
domain -- can be checked in the NETCDF
file \ttt{geo\_em.d01.nc}.
%
A quick check can be performed using the command line
\begin{verbatim}
ncview geo_em.d01.nc
\end{verbatim}
\marge if \ttt{ncview} is installed, or the \ttt{IDL}
script \ttt{out\_geo.pro}
\begin{verbatim}
idl
IDL> out_geo, field1='TOPO'
IDL> out_geo, field1='TI'
IDL> SPAWN, 'ghostview geo_em.d01_HGT_M.ps &'
IDL> SPAWN, 'ghostview geo_em.d01_THERMAL_INERTIA.ps &'
IDL> exit
\end{verbatim}
\marge if the demo version of \ttt{IDL} is installed.
%
Of course, if your favorite graphical tool supports
the NETCDF standard, you might use it to check the
domain definition in \ttt{geo\_em.d01.nc}.

\mk
\marge If you are unhappy with the results or
you want to change
the location of the mesoscale domain on the planet,
the horizontal resolution,
the number of grid points, \ldots,
please modify the parameter
file \ttt{namelist.wps} and execute \ttt{geogrid.exe} again.
%
Here are the contents of \ttt{namelist.wps}:
%
\codesource{namelist.wps_TEST}

\begin{finger}
%
\item No input meteorological data
are actually needed to execute \ttt{geogrid.exe}.
%
\item More details about the database and
more interpolation options can be
found in the file \ttt{geogrid/GEOGRID.TBL}.
%
\item Defining several domains yields
distinct files
\ttt{geo\_em.d01.nc},
\ttt{geo\_em.d02.nc},
\ttt{geo\_em.d03.nc}, \ldots
\end{finger}

\mk
\marge Once the \ttt{geo\_em} file(s) are generated,
the \ttt{metgrid.exe} program performs
a horizontal interpolation
of the meteorological fields to the mesoscale
domain similar to the one performed by \ttt{geogrid.exe}
for the surface data.
%
The program then writes the results in
\ttt{met\_em} files, and also collects
the static fields and domain parameters
included in the \ttt{geo\_em} file(s).
%
Please type the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
./metgrid.exe
\end{verbatim}
%
\marge If everything went well,
the directory \ttt{\$LMDMOD/TMPDIR/WRFFEED}
should contain the \ttt{met\_em.*} files.

\mk
\subsubsection{Step 3: Vertical interpolation on mesoscale levels}

\mk
\marge The last step is to execute \ttt{real.exe}
to perform the interpolation from the vertical
levels of the GCM to the vertical levels
defined in the mesoscale model.
%
This program also prepares the final initial
state for the simulation in files called
\ttt{wrfinput}, and the boundary conditions
in files called \ttt{wrfbdy}.

\mk
\marge To successfully execute \ttt{real.exe},
you need the \ttt{met\_em.*} files
and the \ttt{namelist.input} file
to be in the same directory as \ttt{real.exe}.
%
The parameters in \ttt{namelist.input}
controlling the behavior of the vertical interpolation
are those labelled with \ttt{(p3)} in the detailed
list introduced in the previous chapter.

\mk
\marge Please type the following commands
to prepare the files for the Arsia Mons test case
(or your personal test case if you changed
the parameters in \ttt{namelist.wps}):
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/TESTCASE
ln -sf $LMDMOD/TMPDIR/WRFFEED/met_em* .
./real.exe
\end{verbatim}

\mk
\marge The final message of \ttt{real.exe}
should claim the success of the process, and you
are now ready to launch the integrations
of the LMD Martian Mesoscale Model again
with the \ttt{wrf.exe} command, as in section
\ref{sc:arsia}.

\begin{finger}
\item When you modify either
\ttt{namelist.wps} or \ttt{namelist.input},
make sure that the common parameters
are exactly the same in both files
(especially when running nested simulations),
otherwise either the \ttt{real.exe} or the \ttt{wrf.exe}
command will exit with an error message.
A quick way to spot mismatches is sketched below.
\end{finger}
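\mk
\marge For instance (a minimal check, assuming both files are in the
current directory and share the usual grid parameters):
\begin{verbatim}
grep -E 'e_we|e_sn|dx|dy' namelist.wps namelist.input
\end{verbatim}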
%\pagebreak


\chapter{Starting simulations from scratch}

\mk
\section{Running your own GCM simulations}

\begin{remarque}
To be completed
\end{remarque}

\mk
\section{Complete simulations with \ttt{runmeso}}

\begin{remarque}
To be completed
\end{remarque}


\chapter{Outputs}

\mk
\section{Postprocessing utilities and graphics}

\begin{remarque}
To be completed. Do-it-all \ttt{idl} scripts
will be described here!
\end{remarque}

\mk
\section{Modifying the outputs}

\begin{remarque}
To be completed.
Though the method is different,
we kept all the convenient aspects of \ttt{writediagfi}.
\end{remarque}

\chapter{Frequently Asked Questions}


\begin{finger}
\item Which timestep should I choose to avoid crashes of the model?
\item In Martian simulations, why can't I define boundary conditions every 6 hours as on Earth?
\item Help! I get strange assembler errors or ILM errors while compiling!
\item Is it possible to run the model on a specific configuration that is not supported?
\item Why do I have to define four fewer rows in the parent domain
when performing nested runs?
\item I am kind of nostalgic for early/middle Mars. How could I run
mesoscale simulations at low/high obliquity?
\item Why does \ttt{real.exe} crash when the model top pressure is
lower than $2$~Pa?
\item Can I use two-way nesting?
\end{finger}

\begin{remarque}
To be completed.
\end{remarque}
---|