Changeset 220


Timestamp:
Jul 13, 2011, 10:33:31 PM
Author:
aslmd
Message:

MESOSCALE: user manual. started chapter about preprocessing steps.

Location:
trunk/MESOSCALE_DEV/MANUAL/SRC
Files:
1 added
6 edited

  • trunk/MESOSCALE_DEV/MANUAL/SRC/compile_exec.tex

    r219 r220  
    2323
    2424\sk
    25 In this chapter, the general method to perform steps 0 and 4 is reviewed. Other steps are reviewed in the next chapter. Here the model will be compiled and run in a test case with precomputed sample files for preprocessing steps 1, 2, 3.
     25In this chapter, the general method to perform steps 0 and 4 is reviewed. Other steps are reviewed in chapter~\ref{zepreproc}. Here the model will be compiled and run in a test case with precomputed sample files for preprocessing steps 1, 2, 3.
    2626
    2727\sk
     
    3131Please take the time to check the contents of the \ttt{LMD\_MM\_MARS} directories\footnote{If you used method~$2$, you will probably notice that directories other than~\ttt{LMD\_MM\_MARS} are present in \ttt{\$LMDMOD}, but those are not important at this stage.} and sub-directories through the following command lines:
    3232\begin{verbatim}
    33 ls $MMM
    34 ls $MMM/*
     33ls $MMM ; ls $MMM/*
    3534\end{verbatim}
    3635
     
    6261\item \ttt{runmeso}: this is a \ttt{bash} script that can be used once the model and preprocessing systems are installed; it prepares and runs a mesoscale simulation by going through steps~1 to~4 (see the sketch below).
    6362\item \ttt{RUN}: this is a directory containing various files and scripts useful for advanced simulations.
    64 \item \ttt{DEF}: this is a directory containing many examples for parameter files to be used in the LMD Martian Mesoscale Model simulations.
     63\item \ttt{DEF}: this is a directory containing many examples of parameter files for simulations.
    6564\end{citemize}
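
A minimal sketch of a typical \ttt{runmeso} session (assuming the script sits at the root of \ttt{\$MMM}; the script itself then guides you through steps~1 to~4):
\begin{verbatim}
cd $MMM
./runmeso
\end{verbatim}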
    6665
     
    9493\item calculate the total number of horizontal grid points handled by the LMD physics;
    9594\item duplicate LMD physical sources if nesting is activated;
     95\pagebreak
    9696\item compile the LMD physical packages with the appropriate \ttt{makegcm} command
    9797and collect the compiled objects in the library \ttt{liblmd.a};
  • trunk/MESOSCALE_DEV/MANUAL/SRC/installation.tex

    r218 r220  
    3535\sk
    3636\begin{finger}
    37 \item
    38 You might also find useful -- though not mandatory -- to install on your system the following software:
     37\item If you want the environment variables to be persistent in your system, copy the \ttt{declare} command lines scattered throughout this user manual into your \ttt{.bashrc} or \ttt{.bash\_profile} (see the sketch below).
     38\item You might also find useful -- though not mandatory -- to install on your system:
    3939\begin{citemize}
    4040\item \ttt{ncview}\footnote{ \url{http://meteora.ucsd.edu/~pierce/ncview\_home\_page.html} }: tool to visualize the contents of a netCDF file;
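
A minimal sketch of such a persistent declaration in \ttt{.bashrc} (the path is a hypothetical placeholder to adapt to your own installation):
\begin{verbatim}
## hypothetical example -- adapt the path to your installation
declare -x LMDMOD=/home/user/MODELS
\end{verbatim}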
  • trunk/MESOSCALE_DEV/MANUAL/SRC/keep

    r219 r220  
     1
     2\footnote{And \ttt{wrfinput\_d02}, \ttt{wrfinput\_d03}, \ldots (one file per domain, see~\ref{nests})
     3
     4
     5}
     6
    17
    28%%%
  • trunk/MESOSCALE_DEV/MANUAL/SRC/parameters.tex

    r219 r220  
    1 \chapter{Setting the simulation parameters}
     1\chapter{Setting the simulation parameters}\label{zeparam}
    22
    33\vk
     
    3838\begin{citemize}
    3939\item \ttt{(r)} indicates parameters whose modification implies a new compilation\footnote{A full recompilation using the option \ttt{makemeso -f} is not needed here.} of the model using \ttt{makemeso} (step 0);
    40 \item \ttt{(p1)}, \ttt{(p2)}, \ttt{(p3)} mention parameters which modification implies a new processing of initial and boundary conditions (see next chapter), corresponding respectively to step~1, 2, 3; \ttt{(p1)} means the user has to carry out again steps~1 to 3 before being able to run the model at step~4; \ttt{(p2)} means the user has to carry out again steps~2 to~3 before model run at step~4;
     40\item \ttt{(p1)}, \ttt{(p2)}, \ttt{(p3)} denote parameters whose modification implies a new processing of the initial and boundary conditions (see chapter~\ref{zepreproc}), corresponding respectively to steps~1, 2, 3; \ttt{(p1)} means the user has to carry out steps~1 to~3 again before being able to run the model at step~4; \ttt{(p2)} means the user has to carry out steps~2 to~3 again before running the model at step~4;
    4141\item no label means that once you have modified the parameter, you can start directly at step~4 (running the model);
    4242\item \ttt{(*d)} denotes dynamical parameters whose modification implies non-standard simulations -- please read \ttt{\$MMM/SRC/WRFV2/run/README.namelist} and use with caution, i.e. only if you know what you are doing; after modifying those parameters you can start directly at step~4.
  • trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual.tex

    r219 r220  
    9191\include{compile_exec}
    9292\include{parameters}
     93\include{preproc}
    9394%\include{user_manual_txt}
     95%\end{document}
    9496
    9597\backmatter
  • trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual_txt.tex

    r219 r220  
    11
    22
    3 \mk
    4 \chapter{Preprocessing utilities}
    5 
    6 \mk
    7 In the previous chapter, we described the simulation settings
    8 in the \ttt{namelist.input} file.
    9 %
    10 We saw that any modification of the parameters
    11 labelled with \ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)}
    12 requires the initial and boundary conditions
    13 and/or the domain definition to be recomputed prior to running the model again.
    14 %
    15 As a result, you were probably unable to change many of the parameters
    16 of the Arsia Mons test case (proposed in section \ref{sc:arsia}) in which
    17 the initial and boundary conditions -- as well as the domain of
    18 simulation -- were predefined.
    19 
    20 \mk
    21 \marge In this chapter, we describe the installation and use of the preprocessing tools to
    22 define the domain of simulation, calculate an initial atmospheric state
    23 and prepare the boundary conditions for the chosen simulation time.
    24 %
    25 Completing this necessary step will allow you to run your own simulations at the specific season and region
    26 you are interested in, with complete freedom to modify any of the parameters in \ttt{namelist.input}.
    27 
    28 \mk
    29 \section{Installing the preprocessing utilities}
    30 
    31 \mk
    32 First and foremost, since the preprocessing utilities generate
    33 (or involve) files of significant size, it is necessary
    34 to define a directory where these files will be stored.
    35 %
    36 Such a directory (e.g. \ttt{/bigdisk/user}) must be linked as follows
    37 %
    38 \begin{verbatim}
    39 ln -sf /bigdisk/user $LMDMOD/TMPDIR
    40 \end{verbatim}
    41 
    42 \mk
    43 \marge A second prerequisite to the installation of the preprocessing tools is that the LMD Martian
    44 Mesoscale Model has been compiled at least once.
    45 %
    46 If this is not the case, please compile
    47 the model with the \ttt{makemeso} command
    48 (see section \ref{sc:makemeso}).
    49 
    50 \mk
    51 \marge The compilation process created an
    52 installation directory adapted to your
    53 particular choice of compiler$+$machine.
    54 %
    55 The preprocessing tools will also
    56 be installed in this directory.
    57 %
    58 Please type the following commands:
    59 %
    60 \begin{verbatim}
    61 cd $LMDMOD/LMD_MM_MARS/g95_32_single/   ## or any install directory
    62 ln -sf ../prepare_ini .
    63 ./prepare_ini
    64 \end{verbatim}
    65 
    66 \mk
    67 \marge The script \ttt{prepare\_ini} plays the same role for the preprocessing tools
    68 as \ttt{copy\_model} does for the model sources:
    69 files are simply linked to their actual location in the \ttt{SRC} folder.
    70 %
    71 Once you have executed \ttt{prepare\_ini}, please check that
    72 two folders were generated: \ttt{PREP\_MARS} and \ttt{WPS}.
    73 
    74 \mk
    75 \marge In the \ttt{PREP\_MARS} directory, please compile
    76 the programs \ttt{create\_readmeteo.exe} and \ttt{readmeteo.exe},
    77 using the compiler mentioned in the name of the current
    78 installation directory:
    79 %
    80 \begin{verbatim}
    81 echo $PWD
    82 cd PREP_MARS/
    83 ./compile [or] ./compile_g95
    84 ls -lt create_readmeteo.exe readmeteo.exe
    85 cd ..
    86 \end{verbatim}
    87 
    88 \mk
    89 \marge In the \ttt{WPS} directory, please compile
    90 the programs \ttt{geogrid.exe} and \ttt{metgrid.exe}:
    91 \begin{verbatim}
    92 cd WPS/   
    93 ./configure   ## select your compiler + 'NO GRIB2' option
    94 ./compile
    95 ls -lt geogrid.exe metgrid.exe
    96 \end{verbatim}
    97 
    98 \mk
    99 \marge Apart from the executables you just compiled,
    100 the preprocessing utilities include \ttt{real.exe},
    101 which was compiled by the \ttt{makemeso} script
    102 along with the mesoscale model executable \ttt{wrf.exe}.
    103 %
    104 \ttt{real.exe} should be copied or linked in the
    105 simulation directory (e.g. \ttt{TESTCASE} for the
    106 Arsia Mons test case) to be at the same level as
    107 \ttt{namelist.input}.
    108 
    109 \begin{finger}
    110 \item Even though the executable is named
    111 e.g. \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, this program
    112 does not depend on the specific \ttt{makemeso}
    113 parameters -- contrary to the \ttt{wrf.exe} executable.
    114 %
    115 We simply found that renaming \ttt{real.exe} (whose contents
    116 may be identical if the model sources were not modified)
    117 was a practical way to avoid confusing
    118 executables compiled at different times.
    119 \end{finger}
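
\mk
\marge For instance, a minimal sketch for the Arsia Mons test case (the decorated executable name and the \ttt{g95\_32\_single} installation directory are placeholders to adapt to your own setup):
\begin{verbatim}
cd $LMDMOD/TESTCASE
ln -sf $LMDMOD/LMD_MM_MARS/g95_32_single/real_x61_y61_z61_d1_t1_p1.exe real.exe
\end{verbatim}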
    120 
    121 \mk
    122 \section{Running the preprocessing utilities}
    123 
    124 \mk
    125 When you run a simulation with \ttt{wrf.exe},
    126 the program attempts to read the initial state
    127 in the files
    128 \ttt{wrfinput\_d01},
    129 \ttt{wrfinput\_d02}, \ldots
    130 (one file per domain)
    131 and the parent domain boundary conditions
    132 in \ttt{wrfbdy\_d01}.
    133 %
    134 The whole chain of data conversion and
    135 interpolation needed to generate those
    136 files is summarized in the diagram on the
    137 next page.
    138 %
    139 Three distinct preprocessing steps are
    140 necessary to generate the final files.
    141 %
    142 As described in the previous section,
    143 some modifications in the \ttt{namelist.input} file
    144 [e.g. start/end dates labelled with \ttt{(p1)}]
    145 require a complete reprocessing from step $1$ to step $3$
    146 before the simulation can be launched again,
    147 whereas other changes
    148 [e.g. model top labelled with \ttt{(p3)}]
    149 require only a quick reprocessing at step $3$, keeping
    150 the files generated at the end of step $2$
    151 unchanged.
    152  
    153 \mk
    154 \subsection{Input data}
    155 
    156 \mk
    157 \subsubsection{Static data}
    158 
    159 \mk
    160 All the static data
    161 (topography, thermal inertia, albedo)
    162 needed to initialize the model
    163 are included in the \ttt{\$LMDMOD/LMD\_MM\_MARS/WPS\_GEOG} directory.
    164 %
    165 By default, only coarse-resolution datasets\footnote{
    166 %%%
    167 Corresponding to the fields stored in the
    168 file \ttt{surface.nc} known by LMD-MGCM users:
    169 \url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc}
    170 %%%
    171 } are available, but the directory also contains sources and scripts
    172 to install finer-resolution datasets:
    173 \begin{citemize}
    174 \item 32 and/or 64 pixel-per-degree (ppd) MOLA topography [\textit{Smith et al.}, 2001]\nocite{Smit:01mola},
    175 \item 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo [\textit{Christensen et al.}, 2001]\nocite{Chri:01},
    176 \item 20 ppd TES thermal inertia [\textit{Putzig and Mellon}, 2007]\nocite{Putz:07}
    177 \end{citemize}
    178 \pagebreak
    179 \includepdf[pages=1,offset=25mm -20mm]{diagramme.pdf}
    180 
    181 \mk
    182 \marge The role of the \ttt{build\_static} script is to
    183 automatically download these datasets from the web
    184 (namely PDS archives) and convert them to an
    185 acceptable format for later use by the
    186 preprocessing utilities:
    187 %
    188 \begin{verbatim}
    189 cd $LMDMOD/LMD_MM_MARS
    190 ./build_static
    191 \end{verbatim}
    192 %
    193 \begin{finger}
    194 \item Please install the free software \ttt{octave}\footnote{
    196 %%%
    197 Available at \url{http://www.gnu.org/software/octave}
    198 %%%
    199 } on your system to be able to use the
    200 \ttt{build\_static} script.
    201 %
    202 Another solution is to browse into each of the
    203 directories contained within \ttt{WPS\_GEOG}, download the
    204 data with the shell scripts and execute the \ttt{.m} scripts with either
    205 \ttt{octave} or the commercial software \ttt{matlab}
    206 (just replace \ttt{\#} by \ttt{\%}).
    207 %
    208 \item If you do not manage to execute the \ttt{build\_static} script,
    209 converted ready-to-use datafiles are available upon request.
    210 %
    211 \item Building the MOLA 64~ppd topographical
    212 database can take quite a long time, so it is
    213 not performed by default by the \ttt{build\_static} script.
    214 If you would like to build this database,
    215 please remove the \ttt{exit} command in the script, just above
    216 the commands related to the MOLA 64~ppd data.
    217 %
    218 \item The resulting \ttt{WPS\_GEOG} directory can reach a size
    219 of several hundred megabytes.
    220 %
    221 You might want to move this folder to a place
    222 with more disk space available, but then be
    223 sure to create in \ttt{\$LMDMOD/LMD\_MM\_MARS}
    224 a link to the new location
    225 of the directory (see the sketch below).
    226 \end{finger}
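
\mk
\marge A minimal sketch of such a move (\ttt{/bigdisk/user} is a placeholder path, as above):
\begin{verbatim}
mv $LMDMOD/LMD_MM_MARS/WPS_GEOG /bigdisk/user/WPS_GEOG
ln -sf /bigdisk/user/WPS_GEOG $LMDMOD/LMD_MM_MARS/WPS_GEOG
\end{verbatim}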
    227 
    228 \mk
    229 \subsubsection{Meteorological data}
    230 
    231 \mk
    232 The preprocessing tools generate initial and boundary conditions
    233 from the \ttt{diagfi.nc} outputs of LMD-MGCM simulations.
    234 %
    235 If you would like to run a mesoscale simulation at a given
    236 season, you need to first run a GCM simulation and output
    237 the meteorological fields at the considered season.
    238 %
    239 For optimal forcing at the boundaries, we advise you
    240 to write the meteorological fields to the
    241 \ttt{diagfi.nc} file at least every two hours.
    242 %
    243 Please also make sure that the following fields
    244 are stored in the NETCDF \ttt{diagfi.nc} file:
    245 
    246 \footnotesize
    247 \codesource{contents_diagfi}
    248 
    249 \normalsize
    250 \begin{finger}
    251 \item If the fields
    252 \ttt{emis},
    253 \ttt{co2ice},
    254 \ttt{q01},
    255 \ttt{q02},
    256 \ttt{tsoil}
    257 are missing in the \ttt{diagfi.nc} file,
    258 they are replaced by the default
    259 values $0.95$, $0$, $0$, $0$ and \ttt{tsurf}, respectively.
    260 \end{finger}
    261 
    262 \mk
    263 \marge An example input meteorological file
    264 \ttt{diagfi.nc} can be downloaded
    265 at \url{http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz}.
    266 %
    267 Please extract the archive and copy the \ttt{diagfi.nc} file
    268 into \ttt{\$LMDMOD/TMPDIR/GCMINI}.
    269 %
    270 Such a file can then be used to define the initial
    271 and boundary conditions, and we will go
    272 through the three preprocessing steps.
    273 
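\mk
\marge A minimal sketch of the download and extraction (assuming \ttt{wget} is available on your system):
\begin{verbatim}
cd $LMDMOD/TMPDIR/GCMINI
wget http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz
tar xzvf diagfi.nc.tar.gz
\end{verbatim}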
    274 \mk
    275 \subsection{Preprocessing steps}
    276 
    277 \mk
    278 \subsubsection{Step 1: Converting GCM data}
    279 
    287 \mk
    288 The programs in the \ttt{PREP\_MARS} directory
    289 convert the data from the NETCDF \ttt{diagfi.nc}
    290 file into separate binary data files for each
    291 date contained in \ttt{diagfi.nc}, according to
    292 the formatting needed by the
    293 preprocessing programs at step 2.
    294 %
    295 These programs can be executed by the following
    296 commands:
    297 \begin{verbatim}
    298 cd $LMDMOD/LMD_MM_MARS/your_install_dir/PREP_MARS
    299 echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
    300 ./readmeteo.exe < readmeteo.def
    301 \end{verbatim}
    302 %
    303 \marge If everything went well with the conversion,
    304 the directory \ttt{\$LMDMOD/TMPDIR/WPSFEED}
    305 should contain files whose names begin with \ttt{LMD:}.
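
A quick sanity check (the exact file names depend on the dates contained in \ttt{diagfi.nc}):
\begin{verbatim}
ls $LMDMOD/TMPDIR/WPSFEED    ## should list files such as LMD:*
\end{verbatim}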
    306 
    307 \mk
    308 \subsubsection{Step 2: Interpolation on the regional domain}
    309 
    310 \mk
    311 In the \ttt{WPS} directory, the \ttt{geogrid.exe} program allows
    312 you to define the mesoscale simulation domain,
    313 to horizontally interpolate the topography,
    314 thermal inertia and albedo fields to the domain
    315 resolution, and to calculate useful fields
    316 such as topographical slopes.%\pagebreak
    317 
    318 \mk
    319 \marge Please execute the commands:
    320 %
    321 \begin{verbatim}
    322 cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
    323 ln -sf ../../TESTCASE/namelist.wps .   # test case
    324 ./geogrid.exe
    325 \end{verbatim}
    326 %
    327 \marge The result of \ttt{geogrid.exe}
    328 -- and thus the definition of the mesoscale
    329 domain -- can be checked in the NETCDF
    330 file \ttt{geo\_em.d01.nc}.
    331 %
    332 A quick check can be performed using the command line
    333 \begin{verbatim}
    334 ncview geo_em.d01.nc
    335 \end{verbatim}
    336 \marge if \ttt{ncview} is installed, or the \ttt{IDL}
    337 script \ttt{out\_geo.pro}
    338 \begin{verbatim}
    339 idl
    340 IDL> out_geo, field1='TOPO'
    341 IDL> out_geo, field1='TI'
    342 IDL> SPAWN, 'ghostview geo_em.d01_HGT_M.ps &'
    343 IDL> SPAWN, 'ghostview geo_em.d01_THERMAL_INERTIA.ps &'
    344 IDL> exit
    345 \end{verbatim}
    346 \marge if the demo version of \ttt{IDL} is installed.
    347 %
    348 Of course if your favorite graphical tool supports
    349 the NETCDF standard, you might use it to check the
    350 domain definition in \ttt{geo\_em.d01.nc}.
    351 
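\marge Another quick check, if the standard netCDF utilities happen to be installed on your system:
\begin{verbatim}
ncdump -h geo_em.d01.nc    ## prints dimensions and variables only
\end{verbatim}
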
    352 \mk
    353 \marge If you are unhappy with the results or
    354 you want to change
    355 the location of the mesoscale domain on the planet,
    356 the horizontal resolution,
    357 the number of grid points \ldots,
    358 please modify the parameter
    359 file \ttt{namelist.wps} and execute \ttt{geogrid.exe} again.
    360 %
    361 Here are the contents of \ttt{namelist.wps}:
    362 %
    363 \codesource{namelist.wps_TEST}
    364 
    365 \begin{finger}
    366 %
    367 \item No input meteorological data
    368 are actually needed to execute \ttt{geogrid.exe}.
    369 %
    370 \item More details about the database and
    371 more interpolation options can be
    372 found in the file \ttt{geogrid/GEOGRID.TBL}.
    373 %
    374 \item Defining several domains yields
    375 distinct files
    376 \ttt{geo\_em.d01.nc},
    377 \ttt{geo\_em.d02.nc},
    378 \ttt{geo\_em.d03.nc}\ldots
    379 \end{finger}
    380 
    381 \mk
    382 \marge Once the \ttt{geo\_em} file(s) are generated,
    383 the \ttt{metgrid.exe} program performs
    384 a similar horizontal interpolation
    385 of the meteorological fields to the mesoscale
    386 domain as the one performed by \ttt{geogrid.exe}
    387 for the surface data.
    388 %
    389 Then the program writes the results in
    390 \ttt{met\_em} files and also collects
    391 the static fields and domain parameters
    392 included in the \ttt{geo\_em} file(s).
    393 %
    394 Please type the following commands:
    395 \begin{verbatim}
    396 cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
    397 ./metgrid.exe
    398 \end{verbatim}
    399 %
    400 \marge If everything went well,
    401 the directory \ttt{\$LMDMOD/TMPDIR/WRFFEED}
    402 should contain the \ttt{met\_em.*} files.
    403 
    404 \mk
    405 \subsubsection{Step 3: Vertical interpolation on mesoscale levels}
    406 
    407 \mk
    408 \marge The last step is to execute \ttt{real.exe}
    409 to perform the interpolation from the vertical
    410 levels of the GCM to the vertical levels
    411 defined in the mesoscale model.
    412 %
    413 This program also prepares the final initial
    414 state for the simulation in files called
    415 \ttt{wrfinput} and the boundary conditions
    416 in files called \ttt{wrfbdy}.
    417 
    418 \mk
    419 \marge To successfully execute \ttt{real.exe},
    420 you need the \ttt{met\_em.*} files
    421 and the \ttt{namelist.input} file
    422 to be in the same directory as \ttt{real.exe}.
    423 %
    424 Parameters in \ttt{namelist.input}
    425 controlling the behavior of the vertical interpolation
    426 are those labelled with \ttt{(p3)} in the detailed
    427 list introduced in chapter~\ref{zeparam}.
    428 
    429 \mk
    430 \marge Please type the following commands
    431 to prepare files for the Arsia Mons test case
    432 (or your personal test case if you changed
    433 the parameters in \ttt{namelist.wps}):
    434 \begin{verbatim}
    435 cd $LMDMOD/TESTCASE
    436 ln -sf $LMDMOD/TMPDIR/WRFFEED/met_em* .
    437 ./real.exe
    438 \end{verbatim}
    439 
    440 \mk
    441 \marge The final message printed by \ttt{real.exe}
    442 should report a successful completion, and you
    443 are now ready to launch the integrations
    444 of the LMD Martian Mesoscale Model again
    445 with the \ttt{wrf.exe} command as in section
    446 \ref{sc:arsia}.
    447 
    448 \begin{finger}
    449 \item When you modify either
    450 \ttt{namelist.wps} or \ttt{namelist.input},
    451 make sure that the common parameters
    452 are exactly the same in both files
    453 (especially when running nested simulations);
    454 otherwise \ttt{real.exe} or \ttt{wrf.exe}
    455 will exit with an error message.
    456 \end{finger}
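
\marge A quick way to compare the common parameters (a sketch using standard shell tools; adapt the parameter list to your configuration):
\begin{verbatim}
grep -E 'e_we|e_sn|dx|dy' namelist.wps namelist.input   ## values must agree
\end{verbatim}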
    457 %\pagebreak
    280 \mk
    281 \section{Running your own GCM simulations}
    282 
    283 \begin{remarque}
    284 To be completed
    285 \end{remarque}
    4583
    4594