\chapter{Preprocessing utilities}\label{zepreproc}

\vk
In this chapter, we describe the installation and use of the preprocessing tools used to define the simulation domain, calculate an initial atmospheric state and prepare the boundary conditions for the chosen simulation season and time of day. This corresponds to steps 1, 2 and 3 as defined in section~\ref{steps}. These operations allow you to run your own simulations at the specific season and region you are interested in, with full freedom to modify any of the parameters in \ttt{namelist.input}, including the ones labelled with~\ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)}.

\mk
\section{Installing the preprocessing utilities}

\sk
The compilation operations described here need to be performed only once on a given system with a given compiler.

\sk
\subsection{Prerequisites}

\sk
First and foremost, since the preprocessing utilities may involve files of significant size, it is necessary to define a directory where these files will be stored. Such a directory (e.g. \ttt{/bigdisk/user}) must be linked under the name \ttt{TMPDIR} as follows. In addition, three directories \ttt{GCMINI}, \ttt{WPSFEED} and \ttt{WRFFEED} have to be created in \ttt{\$MESO/TMPDIR}, as indicated below.

\begin{verbatim}
ln -sf /bigdisk/user $MESO/TMPDIR
mkdir $MESO/TMPDIR/GCMINI
mkdir $MESO/TMPDIR/WPSFEED
mkdir $MESO/TMPDIR/WRFFEED
\end{verbatim}

\sk
A second prerequisite to the installation of the preprocessing tools is that the LMD Martian Mesoscale Model has been compiled at least once. If this is not the case, please compile the model with the \ttt{makemeso} command described in section~\ref{sc:makemeso}. The compilation process created an installation directory adapted to your particular choice of compiler$+$machine (what we named \ttt{your\_compdir} in section~\ref{sc:makemeso}, which could be for instance \ttt{g95\_32\_single}). The preprocessing tools will also be installed in this directory. Please type the following commands:

\begin{verbatim}
cd $MMM/your_compdir
ln -sf ../SRC/SCRIPTS/prepare_ini .
./prepare_ini
\end{verbatim}

\sk
\subsection{Compiling preprocessing utilities}

\sk
The script \ttt{prepare\_ini} plays for the preprocessing tools a role similar to that of \ttt{copy\_model} for the model sources: files are simply linked to their actual location in the \ttt{SRC} folder. Once you have executed \ttt{prepare\_ini}, please check that two folders were generated: \ttt{PREP\_MARS} and \ttt{WPS}. In the \ttt{PREP\_MARS} directory, please compile the programs \ttt{create\_readmeteo.exe} and \ttt{readmeteo.exe} with the compiler mentioned in the name of the current installation directory. In the \ttt{WPS} directory, please compile the programs \ttt{geogrid.exe} and \ttt{metgrid.exe}. Here are the useful commands:

\begin{verbatim}
cd your_compdir/PREP_MARS/
./compile_pgf [or] ./compile_g95 [or] ./compile_ifort
ls -lt create_readmeteo.exe readmeteo.exe
cd ..
cd WPS/
./clean
./configure     ## select your compiler + 'NO GRIB2' option
./compile
ls -lt geogrid.exe metgrid.exe
\end{verbatim}

\sk
Apart from the executables just compiled, the preprocessing utilities include \ttt{real.exe}, which was compiled by the \ttt{makemeso} script along with the mesoscale model executable \ttt{wrf.exe}\footnote{Even though the name of the executable reads e.g. \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, this program does not depend on the specific \ttt{makemeso} parameters -- contrary to the \ttt{wrf.exe} executable. We simply found that renaming the (possibly identical, if the model sources were not modified) \ttt{real.exe} executable was a practical way to avoid confusion between executables compiled at different times.} (cf. chapter~\ref{compile}). \ttt{real.exe} should be copied or linked in the simulation directory (e.g. \ttt{TESTCASE} for the Arsia Mons test case) so that it lies at the same level as \ttt{namelist.input}.

\begin{verbatim}
cp your_compdir/real_*.exe your_simulation_directory/
cp your_compdir/wrf_*.exe your_simulation_directory/
\end{verbatim}

\sk
\subsection{Preparing input static data}\label{wpsgeog}

\sk
All the static data (topography, thermal inertia, albedo) needed to initialize the model are included in the \ttt{\$MMM/WPS\_GEOG} directory. By default, only coarse-resolution datasets\footnote{These coarse-resolution datasets correspond to the fields stored in the file \ttt{surface.nc} known by LMD-MGCM users: \url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc} } are available, but the directory also contains sources and scripts to install finer-resolution datasets: 32 and/or 64 pixel-per-degree (ppd) MOLA topography (\ttt{mola\_topo32} and \ttt{mola\_topo64}), 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo (\ttt{albedo\_TES}), and 20 ppd TES thermal inertia (\ttt{thermal\_TES}). The role of the \ttt{build\_static} script is to automatically download these datasets from the web (namely the PDS archives) and convert them to a format suitable for later use by the preprocessing utilities:

\begin{verbatim}
cd $MMM
ln -sf SRC/SCRIPTS/build_static .
./build_static
\end{verbatim}

\sk
\begin{finger}
\item Please install the \ttt{octave} free software\footnote{Available at \url{http://www.gnu.org/software/octave} } on your system to execute the \ttt{build\_static} script\footnote{ Another solution is to browse into each of the directories within \ttt{WPS\_GEOG/res}, download the data with the shell scripts and execute the \ttt{.m} scripts with either \ttt{octave} or the commercial software \ttt{matlab} (just replace \ttt{\#} by \ttt{\%}). }.
\item Building the MOLA 64ppd database can be quite long, hence it is not performed by default by the \ttt{build\_static} script. If you would like to build this database, please remove the \ttt{exit} command in the script, just above the commands related to the MOLA 64ppd.
\item If you do not manage to execute the \ttt{build\_static} script, ready-to-use datafiles are available at \url{ftp://ftp.lmd.jussieu.fr/pub/aslmd} and must be extracted in \ttt{\$MMM/WPS\_GEOG} (see the sketch after this list).
\item The resulting \ttt{WPS\_GEOG} directory can reach a size of several hundred megabytes. You may want to move this folder to a location with more disk space available and define a link named~\ttt{WPS\_GEOG} in \ttt{\$MMM} (see the sketch after this list).
\end{finger}
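
\sk
A minimal sketch for these last two points is given below; the archive name \ttt{name\_of\_archive.tar.gz} and the location \ttt{/bigdisk/user} are placeholders to be adapted to your system (browse the FTP address above for the actual archive names).

\begin{verbatim}
## ready-to-use datasets: download and extract in $MMM/WPS_GEOG
cd $MMM/WPS_GEOG
wget ftp://ftp.lmd.jussieu.fr/pub/aslmd/name_of_archive.tar.gz
tar xzvf name_of_archive.tar.gz

## relocate WPS_GEOG to a larger disk and keep a link in $MMM
mv $MMM/WPS_GEOG /bigdisk/user/WPS_GEOG
ln -sf /bigdisk/user/WPS_GEOG $MMM/WPS_GEOG
\end{verbatim}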

\sk
\subsection{Compiling the GCM for initial and boundary conditions}

\sk
The LMD Martian GCM needs to be run to compute the meteorological fields that will be used as initial conditions and as boundary conditions (every one or two Martian hours) by the limited-area LMD Martian Mesoscale Model. Hence the LMD Martian GCM must be compiled on your system (see the LMD-MGCM user manual for further details: \url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}). If you did not get the model using the \ttt{svn} method, please ask us for an archive containing the LMD-MGCM, named \ttt{LMDZ.MARS.meso.tar.gz}, to be extracted in the \ttt{\$MESO} directory. If you got the model using \ttt{svn}, you do not have to request this file. In the \ttt{\$MESO/LMDZ.MARS} directory, a script named \ttt{compile} can be found and must be used \emph{on the system you plan to run the mesoscale model on} to compile the GCM. The \ttt{compile} script is actually just a wrapper around the \ttt{makegcm} script, which compiles the GCM for you; the default \ttt{makegcm} script only works with the Portland Group Fortran compiler \ttt{pgf90}, but scripts to compile the model with other Fortran compilers (including \ttt{g95} or \ttt{ifort}) are also available. The following commands should yield the compilation of two executables, \ttt{newstart.e} and \ttt{gcm.e}:

\begin{verbatim}
cd $MESO/LMDZ.MARS
[edit $MESO/LMDZ.MARS/libf/phymars/datafile.h & set the absolute path to $MMM/WPS_GEOG]
[edit compile if needed]
./compile
\end{verbatim}

\sk
The other operation necessary to prepare the LMD-MGCM for step~1 is to store a set of initial states for the LMD-MGCM to start from, based on typical previous LMD-MGCM runs that have reached equilibrium after ten years of integration. A reference database\footnote{If another database is used, \ttt{compile} must be edited; the default corresponds to~$64 \times 48 \times 32$ GCM runs with~$2$ tracers.} can be found in the following online archive: \url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz}. This archive must be extracted somewhere on a disk accessible from the system you plan to run the mesoscale model on. A link named~\ttt{startbase} towards the \ttt{STARTBASE\_64\_48\_32\_t2} directory must then be created in the directory~\ttt{\$MESO/LMDZ.MARS/myGCM}.

\begin{verbatim}
cd $MESO/LMDZ.MARS/myGCM
ln -sf where_is_your_startbase/STARTBASE_64_48_32_t2 startbase
\end{verbatim}

\sk
It is important to check that the chosen reference database (1) spans the season desired for the mesoscale simulation; (2) includes the appropriate number of tracers and vertical extent; and (3) uses GCM parameterizations close to the ones employed in the subsequent mesoscale simulations.

\sk
GCM integrations can then be launched in~\ttt{\$MESO/LMDZ.MARS/myGCM} using the \ttt{launch\_gcm} script (see section~\ref{gcmini}).

\mk
\section{Running the preprocessing utilities}

\sk
\subsection{General overview}\label{changeparam}

\sk
When you run a simulation with \ttt{wrf.exe} (e.g. section~\ref{sc:arsia}), the program attempts to read the initial state in \ttt{wrfinput\_d01} and the domain boundary conditions in \ttt{wrfbdy\_d01}. The whole chain of data conversion and interpolation needed to generate those files is summarized in the diagram of Figure~\ref{preproc}. Three distinct preprocessing steps are necessary to generate the final files (steps are numbered 1, 2, 3 as in section~\ref{steps}). Figure~\ref{preproc} helps to better understand the labels \ttt{(p1)}, \ttt{(p2)}, \ttt{(p3)} used to describe \ttt{namelist.input} parameters in chapter~\ref{zeparam}. For instance (see also the illustrative fragment after this list):
\begin{finger}
\item changing the season of simulation requires re-running the LMD Mars GCM for this specific season to prepare initial and boundary conditions for the mesoscale model. Hence e.g. \ttt{start\_month} is labelled with \ttt{(p1)}, because changing this parameter in \ttt{namelist.input} requires a complete reprocessing from step~$1$ to step~$3$ to successfully launch the simulation.
\item changing the number of horizontal grid points of the mesoscale domain requires interpolating the static and GCM fields to the new domain, while no new computations on the GCM side are needed. Hence e.g. \ttt{e\_we} is labelled with \ttt{(p2)}, because changing this parameter in \ttt{namelist.input} requires a reprocessing from step~$2$ to step~$3$ to successfully launch the simulation (for this specific parameter, recompiling with \ttt{makemeso} is also needed).
\item changing the position of the model top requires interpolating the initial and boundary conditions to the new vertical levels, while no horizontal re-interpolations are needed. Hence e.g. \ttt{p\_top\_requested} is labelled with \ttt{(p3)}, because changing this parameter requires a reprocessing of step~$3$ only.
\item changing the timestep for dynamical integration does not require any change in the initial and boundary conditions. Hence e.g. \ttt{time\_step} is not labelled with \ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)}.
\end{finger}
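
To fix ideas, here is a hypothetical \ttt{namelist.input} fragment showing where such parameters typically appear, assuming the usual WRF-like namelist groups; the values are purely illustrative and the surrounding entries are omitted.

\begin{verbatim}
&time_control
 start_month     = 07,   ! (p1) new season: redo steps 1, 2, 3
/
&domains
 time_step       = 10,   ! no label: no reprocessing needed
 e_we            = 121,  ! (p2) new domain size: redo steps 2, 3 (+ makemeso)
 e_sn            = 121,  ! (p2)
 p_top_requested = 5,    ! (p3) new model top: redo step 3
/
\end{verbatim}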

\begin{figure}[p]
\begin{center}
\includegraphics[width=0.99\textwidth]{diagramme.pdf}
\end{center}
\caption{\label{preproc} The details of the preprocessing steps, with their related software and inputs/outputs.}
\end{figure}

%\sk
\subsection{Step 1: Running the GCM and converting data}\label{gcmini}

\sk
Here we assume that the user has chosen a given Martian sol or $L_s$ at which to start the mesoscale simulation. As already mentioned in section~\ref{namelist}, the file \ttt{\$MMM/SIMU/calendar}, reproduced in the appendix, can help with this choice (i.e. conversions between sol, $L_s$ and mesoscale date). In addition, the user has to find in the \ttt{calendar} file the last sol preceding the desired simulation start that has~$99$ in the first column: such sols are those for which an initial start file for the GCM is available. The number of GCM simulated days \ttt{nday} in \ttt{\$MESO/LMDZ.MARS/myGCM/run.def} must then be set accordingly: suppose you want to start a mesoscale simulation at sol~9 and run it for 4~sols; according to the \ttt{calendar} file, sol~8 is the closest sol before sol~9 available in the database, so \ttt{nday} must be at least~$5$. For optimal forcing at the boundaries, we advise you to write the meteorological fields to the \ttt{diagfi.nc} file at least every two hours, or ideally every hour\footnote{The parameter \ttt{interval\_seconds} in \ttt{namelist.wps} (see section~\ref{wps}) has to be set accordingly.}, i.e. \ttt{ecritphy} is respectively~$80$ or~$40$ in \ttt{\$MESO/LMDZ.MARS/myGCM/run.def}. Finally, the GCM run can be launched using the following commands and should produce a netCDF data file named \ttt{diagfi.nc}:

\begin{verbatim}
cd $MESO/LMDZ.MARS/myGCM
[edit run.def, in particular to modify nday]
./launch_gcm    ## answer: your desired starting sol for the simulations
\end{verbatim}
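
\sk
As a side note, the candidate starting sols (those flagged with~$99$ in the first column of the \ttt{calendar} file) can be listed quickly with a simple filter such as the one below, assuming whitespace-separated columns in the \ttt{calendar} file.

\begin{verbatim}
awk '$1 == 99' $MMM/SIMU/calendar
\end{verbatim}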

\sk
Once the GCM simulations are finished, the programs in the \ttt{PREP\_MARS} directory allow the user to convert the data from the netCDF \ttt{diagfi.nc} file into separate binary datafiles\footnote{If the fields \ttt{emis}, \ttt{co2ice}, \ttt{q01}, \ttt{q02}, \ttt{tsoil} are missing from the \ttt{diagfi.nc} file, they are replaced by the respective default values $0.95$, $0$, $0$, $0$ and \ttt{tsurf}.}, one for each date contained in \ttt{diagfi.nc}, formatted for the preprocessing programs of step 2. These programs can be executed with the following commands; if everything went well with the conversion, the directory \ttt{\$MESO/TMPDIR/WPSFEED} should contain files named \ttt{LMD:*}.

\begin{verbatim}
cd $MMM/your_install_dir/PREP_MARS
echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
./readmeteo.exe < readmeteo.def
\end{verbatim}
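
A quick way to check the outcome of the conversion is to list the contents of \ttt{WPSFEED}; the \ttt{LMD:*} files should correspond to the dates written in \ttt{diagfi.nc}.

\begin{verbatim}
ls -l $MESO/TMPDIR/WPSFEED/
\end{verbatim}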

\sk
\subsection{Step 2: Interpolation on the regional domain}\label{wps}

\sk
\paragraph{Step 2a} In the \ttt{WPS} directory, the \ttt{geogrid.exe} program allows you to define the mesoscale simulation domain, to horizontally interpolate the topography, thermal inertia and albedo fields to the domain resolution, and to calculate useful fields such as topographical slopes. Please execute the commands:

\begin{verbatim}
cd $MMM/your_install_dir/WPS
ln -sf $MMM/TESTCASE/namelist.wps .   # test case (or use your customized file)
./geogrid.exe
\end{verbatim}

The result of \ttt{geogrid.exe} -- and thus the definition of the mesoscale domain -- can be checked in the netCDF file \ttt{geo\_em.d01.nc}, e.g. through the topographical fields \ttt{HGT\_M}, \ttt{HGT\_U}, \ttt{HGT\_V} (using for instance \ttt{ncview}, your favorite graphical interface for netCDF files, or python-based scripts as in section~\ref{postproc}). If you are unhappy with the results, or want to change the location of the mesoscale domain on the planet, the horizontal resolution, the number of grid points, \ldots, please modify the parameter file \ttt{namelist.wps}, whose content is reproduced and commented on the next page\footnote{You may find the corresponding file in \ttt{\$MMM/SIMU/namelist.wps\_example}.}, and execute \ttt{geogrid.exe} again.
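
For instance, assuming the standard netCDF utilities and \ttt{ncview} are installed on your system, a quick inspection could read:

\begin{verbatim}
ncdump -h geo_em.d01.nc | grep -i hgt    # check that HGT_M, HGT_U, HGT_V are present
ncview geo_em.d01.nc                     # browse the interpolated fields interactively
\end{verbatim}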

\begin{finger}
\item No input meteorological data are actually needed to execute \ttt{geogrid.exe}, so this step~2a can be done e.g. before step~1. It is probably a good idea to prepare step~2 by choosing the mesoscale simulation domain while the GCM computations of step~1 are being performed.
\item More details about the database and more interpolation options can be found in the file \ttt{geogrid/GEOGRID.TBL} (for advanced users only).
\item Two examples of \ttt{namelist.wps} parameters are given in Figure~\ref{vallespolar}, along with the resulting domains.
\end{finger}

\footnotesize
\codesource{namelist.wps_example}
\normalsize

\begin{figure}[h!]
\begin{center}
\includegraphics[width=0.48\textwidth]{valles.png}
\includegraphics[width=0.48\textwidth]{LMD_MMM_d1_20km_domain_100.png}
\end{center}
\caption{\label{vallespolar} (Left plot) An example of a Mercator-projection domain in the Valles Marineris region, as simulated by \textit{Spiga and Forget} [2009, their section 3.3]: relevant parameters in \ttt{namelist.wps} are: \ttt{e\_we = 401}, \ttt{e\_sn = 121}, \ttt{dx = 12000}, \ttt{dy = 12000}, \ttt{map\_proj = 'mercator'}, \ttt{ref\_lat = -8}, \ttt{ref\_lon = -68}. (Right plot) An example of a north polar domain with stereographic projection: relevant parameters in \ttt{namelist.wps} are: \ttt{e\_we = 117}, \ttt{e\_sn = 117}, \ttt{dx = 20000}, \ttt{dy = 20000}, \ttt{map\_proj = 'polar'}, \ttt{ref\_lat = 90}, \ttt{ref\_lon = 0.1}, \ttt{truelat1 = 90}, \ttt{stand\_lon = 0.1}.}
\end{figure}

\sk
The input datasets for topography and soil properties can be set in \ttt{namelist.wps} through the keyword \ttt{geog\_data\_res} (see the illustrative fragment below). Possible choices are:
\begin{citemize}
\item \ttt{'gcm'}: coarse-resolution datasets;
\item \ttt{'32ppd'}: coarse-resolution datasets, but 32ppd MOLA topography;
\item \ttt{'64ppd'}: fine-resolution datasets: TES albedo \& thermal inertia, 64ppd MOLA topography;
\item \ttt{'64ppd\_noHRti'}: fine-resolution datasets, but coarse-resolution thermal inertia;
\item \ttt{'32ppd\_HRalb'}: fine-resolution albedo, coarse-resolution thermal inertia, 32ppd topography.
\end{citemize}
The corresponding dataset must have been previously built in the \ttt{WPS\_GEOG} folder (see section~\ref{wpsgeog}).
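
For instance, selecting the fine-resolution datasets would amount to a line such as the one below in the \ttt{\&geogrid} section of \ttt{namelist.wps} (a hypothetical fragment; the surrounding entries, reproduced in the commented \ttt{namelist.wps} above, are omitted):

\begin{verbatim}
&geogrid
 geog_data_res = '64ppd',
/
\end{verbatim}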

\sk
\paragraph{Step 2b} Once the \ttt{geo\_em} file(s) are generated, the \ttt{metgrid.exe} program performs a horizontal interpolation of the meteorological fields to the mesoscale domain, similar to the one performed by \ttt{geogrid.exe} for the surface data (interpolation options can be modified by advanced users in \ttt{metgrid/METGRID.TBL}). The program then writes the results in \ttt{met\_em} files, in which it also collects the static fields and domain parameters included in the \ttt{geo\_em} file(s). If everything went well with the commands below, the directory \ttt{\$MESO/TMPDIR/WRFFEED/current} should contain \ttt{met\_em.*} files.

\begin{verbatim}
cd $MMM/your_install_dir/WPS
mkdir -p $MESO/TMPDIR/WRFFEED/current
./metgrid.exe
\end{verbatim}

\sk
\subsection{Step 3: Vertical interpolation on mesoscale levels}\label{real.exe}

\sk
The last preprocessing step before running the mesoscale simulation at step~4 is to execute \ttt{real.exe}, which interpolates the meteorological fields from the vertical levels of the GCM to the vertical levels defined in the mesoscale model. This program also prepares the final initial state for the simulation, in files named \ttt{wrfinput}, and the boundary conditions, in files named \ttt{wrfbdy}. To successfully execute \ttt{real.exe}, you need the \ttt{met\_em.*} files and the \ttt{namelist.input} file to be in the same directory as \ttt{real.exe}. The parameters in \ttt{namelist.input} which control the behavior of the vertical interpolation are those labelled with \ttt{(p3)} in the detailed list introduced in chapter~\ref{zeparam}.

\begin{verbatim}
cd $MMM/TESTCASE   ## or anywhere you would like to run the simulation
ln -sf $MESO/TMPDIR/WRFFEED/current/met_em* .
./real.exe
\end{verbatim}

\sk
The final message of \ttt{real.exe} should confirm the success of the process; you are then ready to launch the integrations of the LMD Martian Mesoscale Model with the \ttt{wrf.exe} command, as in section~\ref{sc:arsia}.
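
A quick check that the expected files were indeed produced (here for a single-domain simulation) is:

\begin{verbatim}
ls -l wrfinput_d01 wrfbdy_d01
\end{verbatim}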

\sk
\begin{finger}
\item \textbf{When you modify either \ttt{namelist.wps} or \ttt{namelist.input}, make sure that the parameters common to both files are set to exactly the same values (especially when running nested simulations); otherwise either the \ttt{real.exe} or the \ttt{wrf.exe} command will exit with an error message. Obviously the dates sent to \ttt{launch\_gcm} and set in both \ttt{namelist.input} and \ttt{namelist.wps} should be consistent too.}
\end{finger}

\clearemptydoublepage
