\chapter{Preprocessing utilities}\label{zepreproc}

\vk
In this chapter, we describe the installation and use of the preprocessing tools used to define the simulation domain, calculate an initial atmospheric state and prepare the boundary conditions for the chosen simulation season and time of day. This corresponds to steps 1, 2, 3 as defined in section~\ref{steps}. These operations will allow you to run your own simulations at the specific season and region you are interested in, with complete freedom to modify any of the parameters in \ttt{namelist.input}, including the ones labelled with~\ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)}.

\mk
\section{Installing the preprocessing utilities}

\sk
The compilation operations indicated here need to be done only once on a given system.

\sk
\subsection{Prerequisites}

\sk
First and foremost, since the preprocessing utilities generate (and read) files of significant size, it is necessary to define a directory where these files will be stored. Such a directory (e.g. \ttt{/bigdisk/user}) must be linked under the name \ttt{\$LMDMOD/TMPDIR} as follows. In addition, three directories \ttt{GCMINI}, \ttt{WPSFEED} and \ttt{WRFFEED} have to be created in \ttt{\$LMDMOD/TMPDIR} as indicated below.

\begin{verbatim}
ln -sf /bigdisk/user $LMDMOD/TMPDIR
mkdir $LMDMOD/TMPDIR/GCMINI
mkdir $LMDMOD/TMPDIR/WPSFEED
mkdir $LMDMOD/TMPDIR/WRFFEED
\end{verbatim}

\sk
A second prerequisite to the installation of the preprocessing tools is that the LMD Martian Mesoscale Model has been compiled at least once. If this is not the case, please compile the model with the \ttt{makemeso} command described in section~\ref{sc:makemeso}. The compilation process created an installation directory adapted to your particular choice of compiler$+$machine (what we named \ttt{DIRCOMP} for illustration in section~\ref{sc:makemeso}, which could be for instance \ttt{g95\_32\_single}). The preprocessing tools will also be installed in this directory. Please type the following commands:

\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/g95_32_single/   ## or any of your install directories
ln -sf ../SRC/SCRIPTS/prepare_ini .
./prepare_ini
\end{verbatim}

\sk
\subsection{Compiling the preprocessing utilities}

\sk
The script \ttt{prepare\_ini} plays the same role for the preprocessing tools as \ttt{copy\_model} does for the model sources: files are simply linked to their actual location in the \ttt{SRC} folder. Once you have executed \ttt{prepare\_ini}, please check that two folders were generated: \ttt{PREP\_MARS} and \ttt{WPS}. In the \ttt{PREP\_MARS} directory, please compile the programs \ttt{create\_readmeteo.exe} and \ttt{readmeteo.exe}, using the compiler mentioned in the name of the current installation directory. In the \ttt{WPS} directory, please compile the programs \ttt{geogrid.exe} and \ttt{metgrid.exe}.

\begin{verbatim}
echo $PWD
cd PREP_MARS/
./compile [or] ./compile_g95     ## the first script compiles with pgf90
                                 ## the second script compiles with g95
                                 ## scripts can be easily adapted to, e.g., ifort
ls -lt create_readmeteo.exe readmeteo.exe
cd ..
cd WPS/
./configure     ## select your compiler + 'NO GRIB2' option
./compile
ls -lt geogrid.exe metgrid.exe
\end{verbatim}

\sk
Apart from the executables just compiled, the preprocessing utilities include \ttt{real.exe}, which was compiled by the \ttt{makemeso} script along with the mesoscale model executable \ttt{wrf.exe}\footnote{Even though the name of the executable reads e.g. \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, this program is not related to the specific \ttt{makemeso} parameters -- contrary to the \ttt{wrf.exe} executable. We simply found that renaming the (possibly identical, if the model sources were not modified) \ttt{real.exe} was a practical way to avoid confusing executables compiled at different times.}. \ttt{real.exe} should be copied or linked into the simulation directory (e.g. \ttt{TESTCASE} for the Arsia Mons test case) so that it sits at the same level as \ttt{namelist.input}.
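
For instance, a link can be created in the simulation directory as follows (a sketch assuming the \ttt{g95\_32\_single} install directory and the example executable name quoted above; adapt both to your own setup):

\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/TESTCASE       ## or your own simulation directory
ln -sf ../g95_32_single/real_x61_y61_z61_d1_t1_p1.exe ./real.exe   ## adapt path and name
\end{verbatim}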

\sk
\subsection{Preparing input static data}

\sk
All the static data (topography, thermal inertia, albedo) needed to initialize the model are included in the \ttt{\$LMDMOD/LMD\_MM\_MARS/WPS\_GEOG} directory. By default, only coarse-resolution datasets\footnote{ Corresponding to the fields stored in the file \ttt{surface.nc} known by LMD-MGCM users: \url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc} } are available, but the directory also contains sources and scripts to install finer-resolution datasets: 32 and/or 64 pixel-per-degree (ppd) MOLA topography (\ttt{mola\_topo32} and \ttt{mola\_topo64}), 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo (\ttt{albedo\_TES}), and 20 ppd TES thermal inertia (\ttt{thermal\_TES}). The role of the \ttt{build\_static} script is to automatically download these datasets from the web (namely the PDS archives) and convert them to a format suitable for later use by the preprocessing utilities:

\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./build_static
\end{verbatim}

\sk
\begin{finger}
\item Please install the \ttt{octave} free software\footnote{ Available at \url{http://www.gnu.org/software/octave} } on your system to execute the \ttt{build\_static} script\footnote{ Another solution is to browse each of the directories within \ttt{WPS\_GEOG/res}, download the data with the shell scripts and execute the \ttt{.m} scripts with either \ttt{octave} or the commercial software \ttt{matlab} (just replace \ttt{\#} by \ttt{\%}). }.
\item Building the MOLA 64~ppd database can take quite a long time; hence it is not performed by default by the \ttt{build\_static} script. If you would like to build this database, please remove the \ttt{exit} command in the script, just above the commands related to the MOLA 64~ppd data.
\item If you do not manage to execute the \ttt{build\_static} script, ready-to-use datafiles can be found at \url{ftp://ftp.lmd.jussieu.fr/pub/aslmd} and must be extracted in \ttt{\$MMM/WPS\_GEOG}.
\item The resulting \ttt{WPS\_GEOG} directory can reach a size of several hundred megabytes. You might want to move this folder to a location with more disk space available and define a link named~\ttt{WPS\_GEOG} to it in \ttt{\$MMM} (see the example below these notes).
\end{finger}
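
For example, relocating \ttt{WPS\_GEOG} to a bigger disk could be done as follows (a sketch; \ttt{/bigdisk/user} is just an illustrative path):

\begin{verbatim}
mv $MMM/WPS_GEOG /bigdisk/user/WPS_GEOG
ln -sf /bigdisk/user/WPS_GEOG $MMM/WPS_GEOG
\end{verbatim}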

\sk
\subsection{Compiling the GCM for initial and boundary conditions}

\sk
The LMD Martian GCM is run to compute the meteorological fields that are provided, every one or two Martian hours, as initial and boundary conditions to the limited-area LMD Martian Mesoscale Model. Hence the LMD Martian GCM must be compiled on your system (see the LMD-MGCM user manual for further details: \url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}). If you did not get the model using the \ttt{svn} method, please ask us for an archive containing the LMD-MGCM named \ttt{LMDZ.MARS.meso.tar.gz}, which you have to extract in the \ttt{\$LMDMOD} directory. If you got the model using \ttt{svn}, you do not have to request this file. In the \ttt{\$LMDMOD/LMDZ.MARS} directory, a script named \ttt{compile} can be found and must be used \emph{on the system you plan to run the mesoscale model on} to compile the GCM. The \ttt{compile} script is actually just a wrapper around the \ttt{makegcm} script, which compiles the GCM for you; the default \ttt{makegcm} script only works with the Portland Group Fortran compiler \ttt{pgf90}, but scripts to compile the model with other Fortran compilers (including \ttt{g95} or \ttt{ifort}) are available upon request. The following commands must be used and should produce the two executables \ttt{newstart.e} and \ttt{gcm.e}:

\begin{verbatim}
cd $LMDMOD/LMDZ.MARS
./compile
\end{verbatim}

\sk
The other operation necessary to prepare the LMD-MGCM for step~1 is to store a set of initial states for the LMD-MGCM to start from -- based on previous typical LMD-MGCM runs having reached equilibrium after ten years of integration. A reference database can be found in the following large online archive:~\url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz}. This archive must be extracted somewhere on a disk accessible from the system you plan to run the mesoscale model on; the absolute path of the \ttt{STARTBASE\_64\_48\_32\_t2} directory on your disk must be set at the beginning of the script~\ttt{\$LMDMOD/LMDZ.MARS/myGCM/launch\_gcm} (variable \ttt{startbase}). If those operations went well, please try the command line~\ttt{echo 22 | launch\_gcm}, which should launch the GCM integrations on your system.
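
The whole sequence could look like the following sketch (the target disk is illustrative; adapt paths to your own system):

\begin{verbatim}
cd /bigdisk/user     ## any disk accessible from the system running the mesoscale model
wget ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz
tar xzvf STARTBASE_64_48_32_t2.tar.gz
## edit $LMDMOD/LMDZ.MARS/myGCM/launch_gcm and set, e.g.
##   startbase=/bigdisk/user/STARTBASE_64_48_32_t2
cd $LMDMOD/LMDZ.MARS/myGCM
echo 22 | ./launch_gcm
\end{verbatim}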

\mk
\section{Running the preprocessing utilities}

\sk
\subsection{General overview}

\sk
When you run a simulation with \ttt{wrf.exe} (e.g. section~\ref{sc:arsia}), the program attempts to read the initial state in \ttt{wrfinput\_d01} and the domain boundary conditions in \ttt{wrfbdy\_d01}. The whole chain of data conversion and interpolation needed to generate those files is summarized in the diagram in Figure~\ref{preproc}. Three distinct preprocessing steps are necessary to generate the final files (steps are numbered 1, 2, 3 as in section~\ref{steps}). Figure~\ref{preproc} helps to better understand the labels \ttt{(p1)}, \ttt{(p2)}, \ttt{(p3)} used to describe the \ttt{namelist.input} parameters in chapter~\ref{zeparam}. For instance:
\begin{finger}
\item Changing the season of simulation implies re-running the LMD Mars GCM for this specific season to prepare initial and boundary conditions for the mesoscale model. Hence e.g. \ttt{start\_month} is labelled with \ttt{(p1)} because changing it in \ttt{namelist.input} requires a complete reprocessing from step~$1$ to step~$3$ to successfully launch the simulation.
\item Changing the number of horizontal grid points for the mesoscale domain implies interpolating the static and GCM fields to the new domain, while no new computations on the GCM side are needed. Hence e.g. \ttt{e\_we} is labelled with \ttt{(p2)} because changing it in \ttt{namelist.input} requires a reprocessing from step~$2$ to step~$3$ to successfully launch the simulation (and, for this specific parameter, recompiling with \ttt{makemeso}).
\item Changing the position of the model top implies interpolating initial and boundary conditions to the new vertical levels, while no horizontal re-interpolations are needed. Hence e.g. \ttt{p\_top\_requested} is labelled with \ttt{(p3)} because changing it requires a reprocessing of step~$3$.
\item Changing the timestep for dynamical integration does not require any change in initial and boundary conditions. Hence e.g. \ttt{time\_step} is not labelled with \ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)}.
\end{finger}

\begin{center}
\begin{figure}[p]
\includegraphics[width=0.99\textwidth]{diagramme.pdf}
\caption{\label{preproc} The details of the preprocessing steps and their related software and inputs/outputs}
\end{figure}
\end{center}

\mk
\subsubsection{Meteorological data}

\mk
The preprocessing tools generate initial and boundary conditions
from the \ttt{diagfi.nc} outputs of LMD-MGCM simulations.
%
If you would like to run a mesoscale simulation at a given
season, you first need to run a GCM simulation (e.g. with the
\ttt{launch\_gcm} script mentioned above) and output
the meteorological fields at the considered season.
%
For optimal forcing at the boundaries, we advise you
to write the meteorological fields to the
\ttt{diagfi.nc} file at least every two hours.
%
Please also make sure that the following fields
are stored in the NETCDF \ttt{diagfi.nc} file:
\footnotesize
\codesource{contents_diagfi}

\normalsize
\begin{finger}
\item If the fields \ttt{emis}, \ttt{co2ice}, \ttt{q01}, \ttt{q02} and \ttt{tsoil}
are missing from the \ttt{diagfi.nc} file, they are replaced by the default values
$0.95$, $0$, $0$, $0$ and \ttt{tsurf}, respectively.
\end{finger}

\mk
\marge An example input meteorological file
\ttt{diagfi.nc} can be downloaded
from \url{http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz}.
%
Please extract the archive and copy the \ttt{diagfi.nc} file
into \ttt{\$LMDMOD/TMPDIR/GCMINI}.
%
Such a file can then be used to define the initial
and boundary conditions, and we will now go
through the three preprocessing steps.
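
For instance (a sketch, assuming \ttt{wget} is available on your system):

\begin{verbatim}
cd $LMDMOD/TMPDIR/GCMINI
wget http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz
tar xzvf diagfi.nc.tar.gz     ## should yield diagfi.nc in $LMDMOD/TMPDIR/GCMINI
\end{verbatim}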

\mk
\subsection{Preprocessing steps}

\mk
\subsubsection{Step 1: Converting GCM data}

\mk
The programs in the \ttt{PREP\_MARS} directory
convert the data from the NETCDF \ttt{diagfi.nc}
file into separate binary datafiles for each
date contained in \ttt{diagfi.nc}, according to
the format needed by the
preprocessing programs at step 2.
%
These programs can be executed with the following
commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/PREP_MARS
echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
./readmeteo.exe < readmeteo.def
\end{verbatim}
%
\marge If everything went well with the conversion,
the directory \ttt{\$LMDMOD/TMPDIR/WPSFEED}
should contain files whose names begin with \ttt{LMD:}.

\mk
\subsubsection{Step 2: Interpolation on the regional domain}

\mk
In the \ttt{WPS} directory, the \ttt{geogrid.exe} program allows
you to define the mesoscale simulation domain,
to horizontally interpolate the topography,
thermal inertia and albedo fields at the domain
resolution, and to calculate useful fields
such as topographical slopes.%\pagebreak

\mk
\marge Please execute the commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
ln -sf ../../TESTCASE/namelist.wps .   # test case
./geogrid.exe
\end{verbatim}
%
\marge The result of \ttt{geogrid.exe}
-- and thus the definition of the mesoscale
domain -- can be checked in the NETCDF
file \ttt{geo\_em.d01.nc}.
%
A quick check can be performed using the command line
\begin{verbatim}
ncview geo_em.d01.nc
\end{verbatim}
\marge if \ttt{ncview} is installed, or the \ttt{IDL}
script \ttt{out\_geo.pro}
\begin{verbatim}
idl
IDL> out_geo, field1='TOPO'
IDL> out_geo, field1='TI'
IDL> SPAWN, 'ghostview geo_em.d01_HGT_M.ps &'
IDL> SPAWN, 'ghostview geo_em.d01_THERMAL_INERTIA.ps &'
IDL> exit
\end{verbatim}
\marge if the demo version of \ttt{IDL} is installed.
%
Of course if your favorite graphical tool supports
the NETCDF standard, you might use it to check the
domain definition in \ttt{geo\_em.d01.nc}.
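
Another quick, graphics-free check is to dump the file header with the standard NetCDF utility \ttt{ncdump}, if it is installed on your system:

\begin{verbatim}
ncdump -h geo_em.d01.nc     ## lists the dimensions, variables and global attributes
\end{verbatim}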

\mk
\marge If you are unhappy with the results or
you want to change
the location of the mesoscale domain on the planet,
the horizontal resolution,
the number of grid points \ldots,
please modify the parameter
file \ttt{namelist.wps} and execute \ttt{geogrid.exe} again.
%
Here are the contents of \ttt{namelist.wps}:
%
\codesource{namelist.wps_TEST}

\begin{finger}
%
\item No input meteorological data
are actually needed to execute \ttt{geogrid.exe}.
%
\item More details about the database and
more interpolation options can be
found in the file \ttt{geogrid/GEOGRID.TBL}.
%
\item Defining several domains yields
distinct files
\ttt{geo\_em.d01.nc},
\ttt{geo\_em.d02.nc},
\ttt{geo\_em.d03.nc}\ldots
\end{finger}

\mk
\marge Once the \ttt{geo\_em} file(s) are generated,
the \ttt{metgrid.exe} program performs
the same kind of horizontal interpolation
of the meteorological fields to the mesoscale
domain as \ttt{geogrid.exe} does
for the surface data.
%
The program then writes the results to
\ttt{met\_em} files and also collects
the static fields and domain parameters
included in the \ttt{geo\_em} file(s).
%
Please type the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
./metgrid.exe
\end{verbatim}
%
\marge If everything went well,
the directory \ttt{\$LMDMOD/TMPDIR/WRFFEED}
should contain the \ttt{met\_em.*} files.
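
As for step 1, a quick way to verify the output is to list the newly created files:

\begin{verbatim}
ls -lt $LMDMOD/TMPDIR/WRFFEED/met_em*
\end{verbatim}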

\mk
\subsubsection{Step 3: Vertical interpolation on mesoscale levels}

\mk
\marge The last step is to execute \ttt{real.exe}
to perform the interpolation from the vertical
levels of the GCM to the vertical levels
defined in the mesoscale model.
%
This program also prepares the final initial
state for the simulation in files called
\ttt{wrfinput} and the boundary conditions
in files called \ttt{wrfbdy}.

\mk
\marge To successfully execute \ttt{real.exe},
you need the \ttt{met\_em.*} files
and the \ttt{namelist.input} file
to be in the same directory as \ttt{real.exe}.
%
Parameters in \ttt{namelist.input}
controlling the behavior of the vertical interpolation
are those labelled with \ttt{(p3)} in the detailed
list introduced in the previous chapter.

\mk
\marge Please type the following commands
to prepare files for the Arsia Mons test case
(or your personal test case if you changed
the parameters in \ttt{namelist.wps}):
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/TESTCASE
ln -sf $LMDMOD/TMPDIR/WRFFEED/met_em* .
./real.exe
\end{verbatim}

\mk
\marge The final message printed by \ttt{real.exe}
should confirm that the processing was successful;
you are then ready to launch the integrations
of the LMD Martian Mesoscale Model again
with the \ttt{wrf.exe} command, as in section
\ref{sc:arsia}.

\begin{finger}
\item When you modify either
\ttt{namelist.wps} or \ttt{namelist.input},
make sure that the parameters common to both files
are strictly identical
(especially when running nested simulations),
otherwise \ttt{real.exe} or \ttt{wrf.exe}
will exit with an error message.
Typical common parameters are listed below.
\end{finger}
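
The following is an indicative list of parameters that appear in both \ttt{namelist.wps} and \ttt{namelist.input} and whose values must match exactly (check your own files, since the exact set depends on your configuration):

\begin{verbatim}
max_dom                            ## number of domains (nested simulations)
e_we, e_sn                         ## number of grid points in each horizontal direction
dx, dy                             ## horizontal resolution
i_parent_start, j_parent_start     ## position of each nest within its parent domain
parent_grid_ratio                  ## nesting resolution ratio
\end{verbatim}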
%\pagebreak