Changeset 258


Timestamp:
Aug 4, 2011, 3:59:44 AM
Author:
aslmd
Message:

MESOSCALE: user manual. finished nesting + tracers + LES. only postproc is missing.

Location:
trunk
Files:
12 added
2 deleted
6 edited

  • trunk/MESOSCALE/LMD_MM_MARS/SIMU/namelist.input_full

    r231 r258  
    3535 num_metgrid_levels = 26     !! (p1) number of vertical levels in GCM inputs (+1)
    3636 force_sfc_in_vinterp = 8    !! (p3) Number of levels hardwired in the PBL
    37                              !!         NB: decrease this parameter when low model top
     37                             !!         NB: decrease this parameter when p_top_requested is low
    3838 max_dz = 1500.              !! (p3) Maximal interval (m) between vertical levels
    3939 eta_levels = -1.            !! (p3)  Specify a list of e_vert eta levels
  • trunk/MESOSCALE_DEV/MANUAL/SRC/advance.tex

    r257 r258  
    77\section{Running nested simulations}\label{nests}
    88
    9 \paragraph{Parameter files} In case you run simulations with \ttt{max\_dom} nested domains, you have to set \ttt{max\_dom} parameters wherever there is a ``," in the \ttt{namelist.input} template in chapter~\ref{zeparam}. Below we reproduce an example of the resulting syntax of the \ttt{\&time\_control}, \ttt{\&domains} and \ttt{\&bdy\_control} categories in \ttt{namelist.input}. We recommend running hydrostatic nested simulations\footnote{Non-hydrostatic nested simulations are sometimes unstable at boundaries; this should be fixed in future versions of the model.} by setting \ttt{non\_hydrostatic = F} in \ttt{\&dynamics} in \ttt{namelist.input}. If you run a simulation with, say, $3$ domains, please ensure that you defined three files \ttt{callphys.def}, \ttt{callphys\_d2.def} and \ttt{callphys\_d3.def} (one per nest). Usually all settings in these files are similar except \ttt{iradia}.
     9\paragraph{Preparing namelist.input} For simulations with \ttt{max\_dom} nested domains, \ttt{max\_dom} parameters must be set wherever there is a ``,'' in the \ttt{namelist.input\_full} template in chapter~\ref{zeparam}. Parameters specific to nested simulations are labelled with \ttt{(n)} in this \ttt{namelist.input} template (see e.g. the categories \ttt{\&time\_control}, \ttt{\&domains} and \ttt{\&bdy\_control}). To help you fill the \ttt{namelist.input} file for a nested simulation, a commented example is given below.
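As a quick illustration of the comma-separated syntax before the full example, the nesting-specific entries of \ttt{\&domains} take one value per domain (a minimal sketch with illustrative values):
\begin{verbatim}
&domains
 max_dom           = 3,           !! three nested domains
 grid_id           = 1,  2,  3,   !! one value per domain
 parent_id         = 1,  1,  2,   !! domain 2 is nested in 1, domain 3 in 2
 parent_grid_ratio = 1,  3,  3,   !! horizontal resolution ratio to the parent
/
\end{verbatim}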
    1010
    1111\scriptsize
    12 \codesource{OMG_namelist.input}
     12\codesource{namelist.input_nests}
    1313\normalsize
    1414
    15 \paragraph{Preprocessing steps} The additional settings for nests in \ttt{namelist.input} must correspond to the ones in the \ttt{namelist.wps} file which define the domain settings for \ttt{geogrid.exe}. A typical \ttt{namelist.wps} file for nested simulations is given below. An automated generation of \ttt{namelist.wps} from \ttt{namelist.input} is provided in the \ttt{runmeso} script as for single-domain simulations. Defining several domains yield one output per domain: e.g. for three domains \ttt{geogrid.exe} yields \ttt{geo\_em.d01.nc}, \ttt{geo\_em.d02.nc}, \ttt{geo\_em.d03.nc}\ldots, \ttt{real.exe} yields \ttt{wrfinput\_d02}, \ttt{wrfinput\_d03}, \ldots
     15\paragraph{Preparing namelist.wps} As is the case for single-domain simulations, the parameters common to the two files \ttt{namelist.input} and~\ttt{namelist.wps} must match exactly, and an automated generation of \ttt{namelist.wps} from \ttt{namelist.input} is provided in the \ttt{runmeso} script. If you do not use \ttt{runmeso} to generate the \ttt{namelist.wps} file, please bear in mind that in this file, dates are different for the parent domain and the child domains, since boundary conditions are needed only for the parent domain while initial conditions are needed for all domains. The \ttt{namelist.wps} file associated with the previously described \ttt{namelist.input} file is given below and corresponds to a nested simulation in the Hellas Planitia region (Figure~\ref{nesteddomains}). Note that the map projection must be the same in all nests.
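For instance, the date entries of \ttt{\&share} might look as follows (purely illustrative placeholder dates; the point is that the child domains only need the initial date, while the parent covers the whole simulated period). The full file is reproduced below.
\begin{verbatim}
&share
 max_dom    = 3,
 start_date = '2001-01-01_00:00:00','2001-01-01_00:00:00','2001-01-01_00:00:00',
 end_date   = '2001-01-05_00:00:00','2001-01-01_00:00:00','2001-01-01_00:00:00',
/
\end{verbatim}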
    1616
     17\vskip -0.2cm
    1718\scriptsize
    18 \codesource{namelist.wps_NEST}
     19\codesource{namelist.wps_nests}
    1920\normalsize
     21
     22\begin{center}
     23\begin{figure}[h!]
     24\includegraphics[width=0.33\textwidth]{LMD_MMM_d1_63km_domain_100.png}
     25\includegraphics[width=0.33\textwidth]{LMD_MMM_d2_21km_domain_100.png}
     26\includegraphics[width=0.33\textwidth]{LMD_MMM_d3_7km_domain_100.png}
     27\caption{\label{nesteddomains} Domains for a nested mesoscale simulation in Hellas Planitia defined by \ttt{namelist.wps\_nests}. From left to right: the ``parent" domain, i.e. nest number~$1$ (horizontal resolution $63$~km); the ``child" domain, i.e. nest number~$2$ (horizontal resolution $21$~km); the ``grandchild" domain, i.e. nest number~$3$ (horizontal resolution $7$~km).}
     28\end{figure}
     29\end{center}
     30
     31\paragraph{Preparing callphys.def} If you run a simulation with, say, $3$ domains, please ensure that you have defined three files \ttt{callphys.def}, \ttt{callphys\_d2.def} and \ttt{callphys\_d3.def} (one per nest). If needed, different settings for physical parameterizations can be used in each nest; usually all settings in these files are the same, except \ttt{iradia} (so that the differences in dynamical timesteps between nests can be reflected in \ttt{callphys*.def}, in order to synchronize the radiative transfer calls between nests).
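Schematically, with a timestep ratio of~3 between parent and child (illustrative values; the exact syntax follows the \ttt{callphys.def} template):
\begin{verbatim}
callphys.def    (parent, time_step = 40 s)   : iradia = 30 -> call every 1200 s
callphys_d2.def (child,  time_step = 40/3 s) : iradia = 90 -> call every 1200 s
\end{verbatim}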
     32
     33\paragraph{Compiling} Use the command \ttt{makemeso} and specify the number of domains and the dimensions set in \ttt{namelist.input}. This is of course done automatically if you use \ttt{runmeso}, which reads this information from \ttt{namelist.input}.
     34
     35\paragraph{Running} If grid nesting and parallel computing are used, no more than~$4$ processors can be used. If the nested simulation is unstable, try a single-domain simulation with the parent domain and choose the best parameters for stability (e.g., \ttt{time\_step}), then add the first nested domain, start the stability investigations again, and so on.
     36
     37\paragraph{Inputs/outputs} Defining several domains yields one output file per domain: e.g. for three domains, \ttt{geogrid.exe} yields \ttt{geo\_em.d01.nc}, \ttt{geo\_em.d02.nc}, \ttt{geo\_em.d03.nc}\ldots; \ttt{real.exe} yields \ttt{wrfinput\_d01}, \ttt{wrfinput\_d02}, \ttt{wrfinput\_d03}, \ldots; \ttt{wrf.exe} yields \ttt{wrfout\_d01*}, \ttt{wrfout\_d02*}, \ttt{wrfout\_d03*}, \ldots
    2038
    2139\paragraph{Useful remarks} The model presently supports 3 nests, but more nests can be included by adapting \ttt{runmeso} and the following files:
     
    2846\end{verbatim}
    2947
    30 grid points - 1 is divided by nproc and by the dx ratio
    31 example: 4 domains and ratio 3: a multiple of 12 (+1) is required
    32 
    33 warning: only 4 processors
    34 
    35 nests >> start by testing nest by nest, adding nest n after each success at level n-1
    36 -- an instability in nest 2 may originate from nest 1. stabilize nest 1 first.
    37 
    38 decimal resolutions must be precise to at least two digits between WPS and WRF
    39 
    4048\mk
    4149\section{Running simulations with tracers}
    4250
    43 mars: number of corresponding tracers
     51\paragraph{Preparing namelist.input} The default behavior of the model is to include no tracer transported by the dynamics and influenced by the physical parameterizations. This corresponds to \ttt{mars=0} in \ttt{namelist.input} (or to the absence of the \ttt{mars} parameter from the user's namelist). To compute the water cycle in the LMD Martian Mesoscale Model, simply set \ttt{mars=1} in \ttt{namelist.input} (category \ttt{\&physics}). This adds one tracer for water vapor and one tracer for water ice to the model's computations and outputs. To compute a mesoscale simulation with one simple transported dust bin (with typical characteristics), set \ttt{mars=2} in \ttt{namelist.input}.
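Schematically, in the \ttt{\&physics} category of \ttt{namelist.input} (excerpt; option values as listed above):
\begin{verbatim}
&physics
 mars = 1    !! 0: no tracer / 1: water cycle / 2: one simple dust bin
/
\end{verbatim}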
    4452
     53\paragraph{GCM inputs} For water cycle simulations (\ttt{mars=1}), the GCM runs used to build the initial and boundary conditions for the mesoscale model must also include water tracers. This is the case by default with the parameter files in \ttt{\$MESO/LMDZ.MARS/myGCM}, the compiler wrapper \ttt{\$MESO/LMDZ.MARS/compile} and the database of start files \ttt{STARTBASE\_64\_48\_32\_t2}.
    4554
     55\paragraph{Preparing callphys.def} It is important to set \ttt{callphys.def} in accordance with the option chosen for the \ttt{mars} keyword in \ttt{namelist.input}. For instance, for water cycle simulations (\ttt{mars=1}), the following settings must be set to \ttt{T} in \ttt{callphys.def}: \ttt{tracer}, \ttt{sedimentation}, \ttt{iceparty}, \ttt{water}.
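In other words (schematic excerpt; the exact layout follows the \ttt{callphys.def} template):
\begin{verbatim}
tracer        = T
sedimentation = T
iceparty      = T
water         = T
\end{verbatim}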
    4656
    47 \mk
    48 \section{Running simulations with the new physics}
    49 different callphys.def
    50 the step datafile.h is not needed anymore ! use callphys.def.
    51 traceur.def
    52 run.def
    53 different callphys.def
    54 makemeso -p
     57\paragraph{Compiling} It is essential to recompile the LMD Martian Mesoscale Model with \ttt{makemeso} each time the number of transported tracers has changed, which is most often the case when you modify \ttt{mars} in \ttt{namelist.input}. The number of tracers corresponding to the chosen \ttt{mars} case must be specified when answering the questions asked by the \ttt{makemeso} script. This is of course done automatically if you use \ttt{runmeso}, which reads this information from \ttt{namelist.input}.
     58
     59\paragraph{Inputs/outputs} Additional fields corresponding to tracer mixing ratios (e.g. \ttt{QH2O} for water vapor) are automatically output in the \ttt{wrfout*} files if an option other than~\ttt{0} is used for the \ttt{mars} keyword. Note that when a large number of tracers is set, output files might grow very large soon after the mesoscale simulation is launched.
     60
     61\paragraph{Test case} A good test case consists in going back to the Arsia simulation described in section~\ref{sc:arsia} and activating the water cycle. Add \ttt{mars=1} to \ttt{namelist.input} and change \ttt{callphys.def} as described previously. Launch \ttt{runmeso} and choose step \ttt{3} (i.e. recompile the model, run \ttt{real.exe} so that the initial and boundary conditions for water are included, then run \ttt{wrf.exe}). Check for the tracer fields in the output files \ttt{wrfout*}.
    5562
    5663\mk
    5764\section{Running Large-Eddy Simulations}
    5865
    59 \mk
    60 \section{Controlling which fields to output}
    61 %\section{geogrid.tbl}
     66\paragraph{Prerequisites} Large-Eddy Simulations are very specific applications of the LMD Martian Meso\-scale Model which allow the user to simulate boundary layer turbulent convection in idealized conditions. We recommend reading section 3.4 of \textit{Spiga and Forget} [2009] and the first three sections of \textit{Spiga et al.} [2010].
     67
     68\paragraph{Preparing namelist.input} A typical parameter file \ttt{namelist.input\_les} is given in what follows (and can be found in \ttt{\$MMM/SIMU}). Settings specific to Large-Eddy Simulations are labelled with \ttt{LES}. The main differences with regular mesoscale simulations are the following:
     69\begin{citemize}
     70\item the duration of the simulation is specified in seconds,
     71\item the model top is specified as an altitude above the surface,
     72\item the dynamical timestep and the spatial resolutions are much smaller,
     73\item an additional \ttt{isfflx} keyword defines the surface forcings (\ttt{1} is recommended),
     74\item the albedo and thermal inertia have to be set to uniform user-defined values,
     75\item an idealized wind profile is often assumed,
     76\item the \ttt{\&dynamics} keywords are adapted to small-scale diffusion,
     77\item periodic boundary conditions are set on the horizontal grid.
     78\end{citemize}
    6279
    6380\scriptsize
    64 \codesource{Registry.EM.extract}
     81\codesource{namelist.input_les}
    6582\normalsize
    6683
    67 do not declare a 2D field as 3D and vice versa
    68 RUN registry.bash ONCE FINISHED
     84\vskip 0.4cm
    6985
    70 \mk
    71 \section{Interpolating outputs on altitude and pressure levels}\label{postproc}
    72 \ttt{api}
     86\paragraph{Preparing callphys.def} It is essential that \ttt{calldifv} is set to \ttt{T} and \ttt{calladj} is set to \ttt{F}. Generally \ttt{iaervar} is set to \ttt{1} so that the (uniform) dust opacity in the domain can be prescribed by creating a text file named \ttt{dustopacity.def} containing the chosen opacity value.
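For instance, with an (illustrative) dust opacity of~0.2:
\begin{verbatim}
echo 0.2 > dustopacity.def
\end{verbatim}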
    7387
    74 \mk
    75 \section{Generating maps for winds and meteorological fields}\label{plots}
     88\paragraph{Compiling} The dynamical core used for Martian Large-Eddy Simulations is different from the one used for regular mesoscale simulations; it is based on WRF v3 instead of WRF v2. The first time the model is compiled, the user has to install it by typing the following commands:
     89\begin{verbatim}
     90cd $MMM/SRC/LES
     91./LMD_LES_MARS_install
     92cd $MMM
     93\end{verbatim}
     94The compilation of the Large-Eddy Simulations model is carried out through the command:
     95\begin{verbatim}
     96makemeso -c les
     97\end{verbatim}
     98This creates a new compilation folder with the prefix \ttt{les} in which the executables can be found once the model is compiled. Answers to \ttt{makemeso} must be consistent with the settings in \ttt{namelist.input}.
    7699
    77 \ttt{python}
    78 \ttt{idl} ?
     100\paragraph{Inputs/outputs} Large-Eddy Simulations need four input files, \ttt{input\_coord}, \ttt{input\_sounding}, \ttt{input\_more} and \ttt{input\_therm}, which define the initial profiles of pressure, temperature, density and winds at the location/season for which the simulations are run, along with information about this location/season. Typical files are available upon request, or you might simply build your own profiles using the Mars Climate Database (see the sample \ttt{scilab} script \ttt{wrf\_sounding.sci} in \ttt{\$MMM/SIMU/RUN}). Examples of \ttt{input\_*} files are provided in \ttt{\$MMM/SIMU/DEF/LMD\_LES\_MARS\_def} and correspond to the cases run in the study by \textit{Spiga et al.} [2010].
    79101
     102\begin{citemize}
     103\item \ttt{input\_coord} contains longitude, latitude, $L_s$ and local time;
     104\item \ttt{input\_sounding} contains (first line) the near-surface pressure (mbar), the potential temperature and a dummy value; and (subsequent lines) the altitudes above the MOLA zero datum, potential temperatures, a dummy value, the zonal wind component and the meridional wind component (see the illustrative sample after this list);
     105\item \ttt{input\_more} contains, on a single line, the altimetry and the surface temperature;
     106\item \ttt{input\_therm} contains, on each line, the corresponding values for (from left to right)~$R$, $c_p$, pressure, density and temperature.
     107\end{citemize}
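A minimal illustrative \ttt{input\_sounding} is sketched below (assumed values, following the column layout described above):
\begin{verbatim}
  6.1  210.0  0.              <- surface pressure (mbar), potential temp., dummy
   0.  210.0  0.  10.   0.    <- altitude (m), potential temp. (K), dummy, u, v
1000.  212.0  0.  10.   0.
5000.  215.0  0.  10.   0.
\end{verbatim}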
     108
     109\paragraph{Running} Large-Eddy Simulations are not supported by \ttt{runmeso}. After compiling the model with the command \ttt{makemeso -c les}, please copy the executables \ttt{ideal.exe} and \ttt{wrf.exe} from the compilation directory \ttt{\$MMM/les*} to the simulation directory where the \ttt{input\_*} files are located. Running \ttt{ideal.exe} generates the initial state from the profiles provided in the \ttt{input\_*} files; running \ttt{wrf.exe} then launches the model's integrations.
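Schematically (the exact compilation directory name depends on your \ttt{makemeso} answers):
\begin{verbatim}
cp $MMM/les*/ideal.exe $MMM/les*/wrf.exe .   # assumed compilation folder
./ideal.exe    # builds the initial state from the input_* files
./wrf.exe      # launches the Large-Eddy Simulation
\end{verbatim}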
     110
     111
     112%ze_hill ???
     113
     114
     115%\mk
     116%\section{Running simulations with the new physics}
     117%different callphys.def
     118%the step datafile.h is not needed anymore ! use callphys.def.
     119%traceur.def
     120%run.def
     121%different callphys.def
     122%makemeso -p
     123%%             mars = 3   ---> cycle poussieres : dustq + dustn [NOUVELLE PHYS seulement]
     124%%             mars = 11  ---> cycle de l'eau + poussieres [1+3] [NOUVELLE PHYS seulement]
     125%% LES LES
  • trunk/MESOSCALE_DEV/MANUAL/SRC/compile_exec.tex

    r230 r258  
    5757Contents of~\ttt{LMD\_MM\_MARS/SIMU} subdirectory:
    5858\begin{citemize}
    59 \item \ttt{dustopacity.def}, \ttt{namelist.input\_full}, \ttt{namelist.input\_minim}, \ttt{run.def}, \ttt{namelist.wps}: these are useful files to guide you through setting up your own parameters for the LMD Martian Mesoscale Model simulations.
     59\item \ttt{dustopacity.def}, \ttt{namelist.input\_full}, \ttt{namelist.input\_minim}, \ttt{namelist.input\_nests}, \ttt{namelist.input\_les}, \ttt{run.def}, \ttt{namelist.wps}, \ttt{namelist.wps\_les}: these are useful template files to guide you through setting up your own parameters for the LMD Martian Mesoscale Model simulations.
    6060\item \ttt{calendar}: this is a text file containing time management information in the model.
    6161\item \ttt{runmeso}: this is a \ttt{bash} script that can be used once the model and preprocessing systems are installed; it prepares and runs a mesoscale simulation by going from step~1 to~4.
     
    124124\begin{asparaenum}[1.]%[\itshape Q1\upshape)]
    125125\item \textbf{choice of compiler}\footnote{We advise you to compile the model on the same kind of system (computer + operating system + librairies) as the one you plan to use to run the model.}
    126 \item[1.bis] (mpi-based compilation) number of processors to be used\footnote{If you use grid nesting, note that no more than $4$ processors can be used.}
     126\item[1.bis] (mpi-based compilation) number of processors to be used
    127127\item \textbf{number of grid points in longitude}\footnote{When you use parallel computations, please bear in mind that with $2$ (respectively $4$, $6$, $8$, $12$, $16$, $20$, $24$, $32$, $64$, $128$) processors the whole domain would be separated into $1$ (resp. $2$, $2$, $2$,  $3$,  $4$,  $4$,  $4$,  $4$,  $8$,   $8$) tile over the longitude direction and $2$ (resp. $2$, $3$, $4$,  $4$,  $4$,  $5$,  $6$,  $8$,  $8$,  $16$) tiles over the latitude direction. Thus make sure that the number of grid points minus $1$ in each direction can be divided by the aforementioned number of tiles over the considered direction. For instance a~$82 \times 109$ horizontal grid is compliant with the use of~$12$ processors.} [61]
    128128\item \textbf{number of grid points in latitude} [61]
  • trunk/MESOSCALE_DEV/MANUAL/SRC/parameters.tex

    r257 r258  
    1 \chapter{Setting the simulation parameters}\label{zeparam}
     1\chapter{Setting simulation parameters}\label{zeparam}
    22
    33\vk
     
    7373\mk
    7474\begin{finger}
    75 \item In the given example convective adjustment, gravity wave parameterization and NLTE schemes are turned off, as is usually the case in typical Martian tropospheric mesoscale simulations (see chapter~\ref{whatis}).
     75\item In the given example, convective adjustment (\ttt{calladj}), the gravity wave parameterization (\ttt{calllott}) and the NLTE schemes (\ttt{callnlte}) are turned off, as is usually the case in typical Martian tropospheric mesoscale simulations (see chapter~\ref{whatis}).
    7676\item \ttt{iradia} sets the frequency (in dynamical timesteps) at which the radiative computations are performed. To obtain the interval in seconds at which the radiative computations are performed, one simply has to multiply \ttt{iradia} by the value of \ttt{time\_step} in \ttt{namelist.input}.
    7777\end{finger}
  • trunk/MESOSCALE_DEV/MANUAL/SRC/these_spiga.sty

    r209 r258  
    5151%\usepackage[pdftex]{graphicx}
    5252
    53 
    54 \def\thechapter       {\Roman{chapter}}
     53%\Roman
     54\def\thechapter       {\arabic{chapter}}
    5555\def\thesection       {\thechapter.\arabic{section}}
    5656\def\thesubsection    {\thesection.\arabic{subsection}}
  • trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual.tex

    r257 r258  
    9494\include{guide}
    9595\include{advance}
     96%\include{postproc}
    9697\include{faq}
    9798