Changeset 220 for trunk/MESOSCALE_DEV/MANUAL/SRC
- Timestamp: Jul 13, 2011, 10:33:31 PM
- Location: trunk/MESOSCALE_DEV/MANUAL/SRC
- Files: 1 added, 6 edited
Legend:
- Unmodified lines are shown with no prefix
- Added lines are prefixed with +
- Removed lines are prefixed with -
trunk/MESOSCALE_DEV/MANUAL/SRC/compile_exec.tex
r219 → r220

…
 \sk
-In this chapter, the general method to perform steps 0 and 4 is reviewed. Other steps are reviewed in the next chapter. Here the model will be compiled and run in a test case with precomputed sample files for preprocessing steps 1, 2, 3.
+In this chapter, the general method to perform steps 0 and 4 is reviewed. Other steps are reviewed in chapter~\ref{zepreproc}. Here the model will be compiled and run in a test case with precomputed sample files for preprocessing steps 1, 2, 3.
…
 Please take the time to check the contents of the \ttt{LMD\_MM\_MARS} directories\footnote{If you used method~$2$, you will probably notice that directories other than \ttt{LMD\_MM\_MARS} are present in \ttt{\$LMDMOD}, but those are not important at this stage.} and sub-directories through the following command lines:
 \begin{verbatim}
-ls $MMM
-ls $MMM/*
+ls $MMM ; ls $MMM/*
 \end{verbatim}
…
 \item \ttt{runmeso}: this is a \ttt{bash} script that can be used once the model and preprocessing systems are installed; it prepares and runs a mesoscale simulation by going from step~1 to~4.
 \item \ttt{RUN}: this is a directory containing various files and scripts useful for advanced simulations.
-\item \ttt{DEF}: this is a directory containing many examples for parameter files to be used in the LMD Martian Mesoscale Model simulations.
+\item \ttt{DEF}: this is a directory containing many examples of parameter files for simulations.
 \end{citemize}
…
 \item calculate the total number of horizontal grid points handled by the LMD physics;
 \item duplicate LMD physical sources if nesting is activated;
+\pagebreak
 \item compile the LMD physical packages with the appropriate \ttt{makegcm} command and collect the compiled objects in the library \ttt{liblmd.a};
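As a complement, a minimal sketch of how \ttt{runmeso} would be invoked once the model and preprocessing systems are installed (the invocation is an assumption on usage; the script may prompt interactively for simulation settings):

\begin{verbatim}
cd $MMM
./runmeso     # drives steps 1 to 4 for a mesoscale simulation
\end{verbatim}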
trunk/MESOSCALE_DEV/MANUAL/SRC/installation.tex
r218 → r220

…
 \sk
 \begin{finger}
-\item
-You might also find useful -- though not mandatory -- to install on your system the following software:
+\item If you want the environment variables to be persistent in your system, please copy the \ttt{declare} command lines spread through this user manual into your \ttt{.bashrc} or \ttt{.bash\_profile}.
+\item You might also find it useful -- though not mandatory -- to install on your system:
 \begin{citemize}
 \item \ttt{ncview}\footnote{ \url{http://meteora.ucsd.edu/~pierce/ncview\_home\_page.html} }: tool to visualize the contents of a netCDF file;
…
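A minimal sketch of the persistence advice in the first item above (the variable values are placeholders, and \ttt{MMM} pointing to \ttt{LMD\_MM\_MARS} is an assumption; adapt the paths to your installation):

\begin{verbatim}
# append the environment settings to your shell startup file
echo 'declare -x LMDMOD=/disk/user/LMDMOD' >> ~/.bashrc
echo 'declare -x MMM=$LMDMOD/LMD_MM_MARS'  >> ~/.bashrc
source ~/.bashrc   # reload so the variables are set in the current shell
\end{verbatim}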
trunk/MESOSCALE_DEV/MANUAL/SRC/keep
r219 → r220

+\footnote{And \ttt{wrfinput\_d02}, \ttt{wrfinput\_d03}, \ldots (one file per domain, see~\ref{nests})}
 
 %%%
trunk/MESOSCALE_DEV/MANUAL/SRC/parameters.tex
r219 → r220

-\chapter{Setting the simulation parameters}
+\chapter{Setting the simulation parameters}\label{zeparam}
 
 \vk
…
 \begin{citemize}
 \item \ttt{(r)} indicates parameters whose modification implies a new compilation\footnote{A full recompilation using the option \ttt{makemeso -f} is not needed here.} of the model using \ttt{makemeso} (step 0);
-\item \ttt{(p1)}, \ttt{(p2)}, \ttt{(p3)} mention parameters whose modification implies a new processing of initial and boundary conditions (see next chapter), corresponding respectively to steps~1, 2, 3; \ttt{(p1)} means the user has to carry out steps~1 to~3 again before being able to run the model at step~4; \ttt{(p2)} means the user has to carry out steps~2 to~3 again before the model run at step~4;
+\item \ttt{(p1)}, \ttt{(p2)}, \ttt{(p3)} mention parameters whose modification implies a new processing of initial and boundary conditions (see chapter~\ref{zepreproc}), corresponding respectively to steps~1, 2, 3; \ttt{(p1)} means the user has to carry out steps~1 to~3 again before being able to run the model at step~4; \ttt{(p2)} means the user has to carry out steps~2 to~3 again before the model run at step~4;
 \item no label means that once you have modified the parameter, you can simply start directly at step~4 (running the model);
 \item \ttt{(*d)} denotes dynamical parameters whose modification implies non-standard simulations -- please read \ttt{\$MMM/SRC/WRFV2/run/README.namelist} and use with caution, i.e. only if you know what you are doing; after modifying those parameters you can simply start at step~4.
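To make the labels concrete, here is a minimal sketch of the rerun sequence after changing a \ttt{(p3)}-labelled parameter (the \ttt{TESTCASE} directory is the example used elsewhere in this manual; adapt to your own simulation directory):

\begin{verbatim}
cd $LMDMOD/TESTCASE   # your simulation directory
./real.exe            # step 3: regenerate wrfinput* and wrfbdy* files
./wrf.exe             # step 4: run the model
\end{verbatim}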
trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual.tex
r219 → r220

…
 \include{compile_exec}
 \include{parameters}
+\include{preproc}
 %\include{user_manual_txt}
+%\end{document}
 
 \backmatter
trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual_txt.tex
r219 → r220

The ``Preprocessing utilities'' chapter is removed in its entirety from this file (presumably moving to the newly added \ttt{preproc.tex}, which \ttt{user\_manual.tex} now includes). The removed text reads:

\mk
\chapter{Preprocessing utilities}

\mk
In the previous chapter, we described the simulation settings in the \ttt{namelist.input} file.
%
We saw that any modification of the parameters labelled with \ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)} implies that the initial and boundary conditions and/or the domain definition have to be recomputed prior to running the model again.
%
As a result, you were probably unable to change many of the parameters of the Arsia Mons test case (proposed in section \ref{sc:arsia}), in which the initial and boundary conditions -- as well as the simulation domain -- were predefined.

\mk
\marge In this chapter, we describe the installation and use of the preprocessing tools to define the simulation domain, calculate an initial atmospheric state and prepare the boundary conditions for the chosen simulation time.
%
This necessary step will eventually allow you to run your own simulations at the specific season and region you are interested in, with complete freedom to modify any of the parameters in \ttt{namelist.input}.

\mk
\section{Installing the preprocessing utilities}

\mk
First and foremost, since the preprocessing utilities may generate (or involve) files of quite significant sizes, it is necessary to define a directory where these files will be stored.
%
Such a directory (e.g. \ttt{/bigdisk/user}) must be linked as follows:
%
\begin{verbatim}
ln -sf /bigdisk/user $LMDMOD/TMPDIR
\end{verbatim}

\mk
\marge A second prerequisite to the installation of the preprocessing tools is that the LMD Martian Mesoscale Model was compiled at least once.
%
If this is not the case, please compile the model with the \ttt{makemeso} command (see section \ref{sc:makemeso}).

\mk
\marge The compilation process created an installation directory adapted to your particular choice of compiler$+$machine.
%
The preprocessing tools will also be installed in this directory.
%
Please type the following commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/g95_32_single/ ## or any install directory
ln -sf ../prepare_ini .
./prepare_ini
\end{verbatim}

\mk
\marge The script \ttt{prepare\_ini} plays, for the preprocessing tools, a role equivalent to that of \ttt{copy\_model} for the model sources: files are simply linked to their actual location in the \ttt{SRC} folder.
%
Once you have executed \ttt{prepare\_ini}, please check that two folders were generated: \ttt{PREP\_MARS} and \ttt{WPS}.

\mk
\marge In the \ttt{PREP\_MARS} directory, please compile the programs \ttt{create\_readmeteo.exe} and \ttt{readmeteo.exe}, using the compiler mentioned in the name of the current installation directory:
%
\begin{verbatim}
echo $PWD
cd PREP_MARS/
./compile [or] ./compile_g95
ls -lt create_readmeteo.exe readmeteo.exe
cd ..
\end{verbatim}

\mk
\marge In the \ttt{WPS} directory, please compile the programs \ttt{geogrid.exe} and \ttt{metgrid.exe}:
\begin{verbatim}
cd WPS/
./configure ## select your compiler + 'NO GRIB2' option
./compile
ls -lt geogrid.exe metgrid.exe
\end{verbatim}

\mk
\marge Apart from the executables you just compiled, the preprocessing utilities include \ttt{real.exe}, which was compiled by the \ttt{makemeso} script along with the mesoscale model executable \ttt{wrf.exe}.
%
\ttt{real.exe} should be copied or linked into the simulation directory (e.g. \ttt{TESTCASE} for the Arsia Mons test case) so that it sits at the same level as \ttt{namelist.input}.
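A sketch of this linking step (the install directory and executable name are examples taken from this chapter; yours depend on your \ttt{makemeso} answers):

\begin{verbatim}
cd $LMDMOD/TESTCASE
ln -sf $LMDMOD/LMD_MM_MARS/g95_32_single/real_x61_y61_z61_d1_t1_p1.exe real.exe
\end{verbatim}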
\begin{finger}
\item Even though the name of the executable reads e.g. \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, this program is not related to the specific \ttt{makemeso} parameters -- contrary to the \ttt{wrf.exe} executable.
%
We simply found that renaming the (possibly identical, if the model sources were not modified) \ttt{real.exe} was a practical way to avoid confusion between executables compiled at different moments.
\end{finger}

\mk
\section{Running the preprocessing utilities}

\mk
When you run a simulation with \ttt{wrf.exe}, the program attempts to read the initial state from the files \ttt{wrfinput\_d01}, \ttt{wrfinput\_d02}, \ldots\ (one file per domain) and the parent domain boundary conditions from \ttt{wrfbdy\_d01}.
%
The whole chain of data conversion and interpolation needed to generate those files is summarized in the diagram on the next page.
%
Three distinct preprocessing steps are necessary to generate the final files.
%
As described in the previous section, some modifications in the \ttt{namelist.input} file [e.g. start/end dates, labelled with \ttt{(p1)}] require a complete reprocessing from step~$1$ to step~$3$ to successfully launch the simulation, whereas other changes [e.g. model top, labelled with \ttt{(p3)}] only require a quick reprocessing at step~$3$, keeping the files generated at the end of step~$2$ unchanged.

\mk
\subsection{Input data}

\mk
\subsubsection{Static data}

\mk
All the static data (topography, thermal inertia, albedo) needed to initialize the model are included in the \ttt{\$LMDMOD/LMD\_MM\_MARS/WPS\_GEOG} directory.
%
By default, only coarse-resolution datasets\footnote{Corresponding to the fields stored in the file \ttt{surface.nc} known to LMD-MGCM users: \url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc}} are available, but the directory also contains sources and scripts to install finer-resolution datasets:
\begin{citemize}
\item 32 and/or 64 pixel-per-degree (ppd) MOLA topography [\textit{Smith et al.}, 2001]\nocite{Smit:01mola},
\item 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo [\textit{Christensen et al.}, 2001]\nocite{Chri:01},
\item 20 ppd TES thermal inertia [\textit{Putzig and Mellon}, 2007]\nocite{Putz:07}
\end{citemize}
\pagebreak
\includepdf[pages=1,offset=25mm -20mm]{diagramme.pdf}

\mk
\marge The role of the \ttt{build\_static} script is to automatically download these datasets from the web (namely PDS archives) and convert them to a format suitable for later use by the preprocessing utilities:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./build_static
\end{verbatim}
%
\begin{finger}
\item Please install the free software \ttt{octave}\footnote{Available at \url{http://www.gnu.org/software/octave}} on your system to be able to use the \ttt{build\_static} script.
%
Another solution is to browse each of the directories contained within \ttt{WPS\_GEOG}, download the data with the shell scripts and execute the \ttt{.m} scripts with either \ttt{octave} or the commercial software \ttt{matlab} (just replace \ttt{\#} by \ttt{\%}).
%
\item If you do not manage to execute the \ttt{build\_static} script, converted ready-to-use datafiles are available upon request.
%
\item Building the MOLA 64~ppd topographical database can be quite long, so it is not performed by default by the \ttt{build\_static} script. If you would like to build this database, please remove the \ttt{exit} command in the script, just above the commands related to the MOLA 64~ppd.
%
\item The resulting \ttt{WPS\_GEOG} directory can reach a size of several hundred megabytes.
%
You may want to move this folder to a location with more available disk space, but then be sure to create, in \ttt{\$LMDMOD/LMD\_MM\_MARS}, a link to the new location of the directory (see the sketch after this list).
\end{finger}
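A sketch of the relocation described in the last item above (\ttt{/bigdisk/user} is the example disk used earlier in this chapter):

\begin{verbatim}
mv $LMDMOD/LMD_MM_MARS/WPS_GEOG /bigdisk/user/WPS_GEOG
ln -sf /bigdisk/user/WPS_GEOG $LMDMOD/LMD_MM_MARS/WPS_GEOG
\end{verbatim}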
\mk
\subsubsection{Meteorological data}

\mk
The preprocessing tools generate initial and boundary conditions from the \ttt{diagfi.nc} outputs of LMD-MGCM simulations.
%
If you would like to run a mesoscale simulation at a given season, you first need to run a GCM simulation and output the meteorological fields at the considered season.
%
For optimal forcing at the boundaries, we advise you to write the meteorological fields to the \ttt{diagfi.nc} file at least every two hours.
%
Please also make sure that the following fields are stored in the netCDF \ttt{diagfi.nc} file:

\footnotesize
\codesource{contents_diagfi}
\normalsize

\begin{finger}
\item If the fields \ttt{emis}, \ttt{co2ice}, \ttt{q01}, \ttt{q02}, \ttt{tsoil} are missing from the \ttt{diagfi.nc} file, they are replaced by the respective default values $0.95$, $0$, $0$, $0$, \ttt{tsurf}.
\end{finger}

\mk
\marge An example input meteorological file \ttt{diagfi.nc} can be downloaded at \url{http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz}.
%
Please deflate the archive and copy the \ttt{diagfi.nc} file into \ttt{\$LMDMOD/TMPDIR/GCMINI}.
%
Such a file can then be used to define the initial and boundary conditions, and we will go through the three preprocessing steps.
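The download and installation of the sample file can be sketched as follows (assuming \ttt{wget} is available on your system):

\begin{verbatim}
cd $LMDMOD/TMPDIR/GCMINI
wget http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz
tar xzf diagfi.nc.tar.gz   # deflates the archive and yields diagfi.nc
\end{verbatim}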
\mk
\subsection{Preprocessing steps}

\mk
\subsubsection{Step 1: Converting GCM data}

\mk
\section{Running your own GCM simulations}

\begin{remarque}
To be completed
\end{remarque}

\mk
The programs in the \ttt{PREP\_MARS} directory convert the data from the netCDF \ttt{diagfi.nc} file into separate binary datafiles for each date contained in \ttt{diagfi.nc}, according to the formatting needed by the preprocessing programs at step 2.
%
These programs can be executed by the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/PREP_MARS
echo 1 | ./create_readmeteo.exe # drop the "echo 1 |" if you want control
./readmeteo.exe < readmeteo.def
\end{verbatim}
%
\marge If everything went well with the conversion, the directory \ttt{\$LMDMOD/TMPDIR/WPSFEED} should contain files whose names begin with \ttt{LMD:}.

\mk
\subsubsection{Step 2: Interpolation on the regional domain}

\mk
In the \ttt{WPS} directory, the \ttt{geogrid.exe} program allows you to define the mesoscale simulation domain, to horizontally interpolate the topography, thermal inertia and albedo fields at the domain resolution, and to calculate useful fields such as topographical slopes.

\mk
\marge Please execute the commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
ln -sf ../../TESTCASE/namelist.wps . # test case
./geogrid.exe
\end{verbatim}
%
\marge The result of \ttt{geogrid.exe} -- and thus the definition of the mesoscale domain -- can be checked in the netCDF file \ttt{geo\_em.d01.nc}.
%
A quick check can be performed using the command line
\begin{verbatim}
ncview geo_em.d01.nc
\end{verbatim}
\marge if \ttt{ncview} is installed, or the \ttt{IDL} script \ttt{out\_geo.pro}
\begin{verbatim}
idl
IDL> out_geo, field1='TOPO'
IDL> out_geo, field1='TI'
IDL> SPAWN, 'ghostview geo_em.d01_HGT_M.ps &'
IDL> SPAWN, 'ghostview geo_em.d01_THERMAL_INERTIA.ps &'
IDL> exit
\end{verbatim}
\marge if the demo version of \ttt{IDL} is installed.
%
Of course, if your favorite graphical tool supports the netCDF standard, you may use it to check the domain definition in \ttt{geo\_em.d01.nc}.

\mk
\marge If you are unhappy with the results, or if you want to change the location of the mesoscale domain on the planet, the horizontal resolution, the number of grid points, \ldots, please modify the parameter file \ttt{namelist.wps} and execute \ttt{geogrid.exe} again.
%
Here are the contents of \ttt{namelist.wps}:
%
\codesource{namelist.wps_TEST}

\begin{finger}
%
\item No input meteorological data are actually needed to execute \ttt{geogrid.exe}.
%
\item More details about the database and more interpolation options can be found in the file \ttt{geogrid/GEOGRID.TBL}.
%
\item Defining several domains yields distinct files \ttt{geo\_em.d01.nc}, \ttt{geo\_em.d02.nc}, \ttt{geo\_em.d03.nc}, \ldots
\end{finger}

\mk
\marge Once the \ttt{geo\_em} file(s) are generated, the \ttt{metgrid.exe} program performs a horizontal interpolation of the meteorological fields to the mesoscale domain, similar to the one performed by \ttt{geogrid.exe} for the surface data.
%
The program then writes the results to \ttt{met\_em} files, also collecting the static fields and domain parameters included in the \ttt{geo\_em} file(s).
%
Please type the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
./metgrid.exe
\end{verbatim}
%
\marge If everything went well, the directory \ttt{\$LMDMOD/TMPDIR/WRFFEED} should contain the \ttt{met\_em.*} files.

\mk
\subsubsection{Step 3: Vertical interpolation on mesoscale levels}

\mk
\marge The last step is to execute \ttt{real.exe} to perform the interpolation from the vertical levels of the GCM to the vertical levels defined in the mesoscale model.
%
This program also prepares the final initial state for the simulation in files called \ttt{wrfinput} and the boundary conditions in files called \ttt{wrfbdy}.

\mk
\marge To successfully execute \ttt{real.exe}, you need the \ttt{met\_em.*} files and the \ttt{namelist.input} file to be in the same directory as \ttt{real.exe}.
%
The parameters in \ttt{namelist.input} controlling the behavior of the vertical interpolation are those labelled with \ttt{(p3)} in the detailed list introduced in the previous chapter.

\mk
\marge Please type the following commands to prepare files for the Arsia Mons test case (or your personal test case if you changed the parameters in \ttt{namelist.wps}):
\begin{verbatim}
cd $LMDMOD/TESTCASE
ln -sf $LMDMOD/TMPDIR/WRFFEED/met_em* .
./real.exe
\end{verbatim}
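Before launching the model, a quick way to check that \ttt{real.exe} wrote the expected files (a sketch; \ttt{ncdump} is part of the standard netCDF tools):

\begin{verbatim}
ls -lt wrfinput_d01 wrfbdy_d01   # files produced by real.exe
ncdump -h wrfinput_d01 | less    # header: dimensions and fields
\end{verbatim}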
\mk
\marge The final message of \ttt{real.exe} should claim the success of the process, and you are now ready to launch the integrations of the LMD Martian Mesoscale Model again with the \ttt{wrf.exe} command, as in section \ref{sc:arsia}.

\begin{finger}
\item When you modify either \ttt{namelist.wps} or \ttt{namelist.input}, make sure that the common parameters are exactly the same in both files (especially when running nested simulations); otherwise either \ttt{real.exe} or \ttt{wrf.exe} will exit with an error message.
\end{finger}
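A sketch for the consistency check recommended in the note above (the parameter names are examples of entries commonly shared by the two files; the exact list depends on your setup):

\begin{verbatim}
cd $LMDMOD/TESTCASE
grep -E 'e_we|e_sn|dx|dy|max_dom' namelist.wps namelist.input
\end{verbatim}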