\chapter{Advanced simulations}\label{advance}

\vk
In this chapter, advice for performing more sophisticated simulations is provided to advanced users.

\mk
\section{Running nested simulations}\label{nests}

\paragraph{Preparing namelist.input} For simulations with \ttt{max\_dom} nested domains, \ttt{max\_dom} parameters must be set wherever there is a ``,'' in the \ttt{namelist.input\_full} template in chapter~\ref{zeparam}. Parameters specific to nested simulations are labelled with \ttt{(n)} in this \ttt{namelist.input} template (see e.g. categories \ttt{\&time\_control}, \ttt{\&domains} and \ttt{\&bdy\_control}). To help you fill the \ttt{namelist.input} file for a nested simulation, a commented example is given below.

\vskip -0.4cm
\scriptsize
\codesource{namelist.input_nests}
\normalsize

\paragraph{Preparing namelist.wps} As is the case for single-domain simulations, the parameters common to \ttt{namelist.input} and~\ttt{namelist.wps} must match exactly, and the \ttt{runmeso} script can generate \ttt{namelist.wps} automatically from \ttt{namelist.input}. If you do not use \ttt{runmeso} to generate the \ttt{namelist.wps} file, please bear in mind that in this file, dates are different for the parent domain and the child domains, since boundary conditions are needed only for the parent domain while initial conditions are needed for all domains. The \ttt{namelist.wps} file associated with the previously described \ttt{namelist.input} file is given below\footnote{You may find \ttt{namelist.input\_nests} and \ttt{namelist.wps\_nests} in \ttt{\$MMM/SIMU}.} and corresponds to a nested simulation in the Hellas Planitia region (Figure~\ref{nesteddomains}). Note that the map projection is the same in all nests.

\vskip -0.2cm
\scriptsize
\codesource{namelist.wps_nests}
\normalsize

\begin{center}
\begin{figure}[h!]
\includegraphics[width=0.33\textwidth]{LMD_MMM_d1_63km_domain_100.png}
\includegraphics[width=0.33\textwidth]{LMD_MMM_d2_21km_domain_100.png}
\includegraphics[width=0.33\textwidth]{LMD_MMM_d3_7km_domain_100.png}
\caption{\label{nesteddomains} Domains for a nested mesoscale simulation in Hellas Planitia defined by \ttt{namelist.wps\_nests}. From left to right: ``parent'' domain, i.e. nest number~$1$ (horizontal resolution $63$~km); ``child'' domain, i.e. nest number~$2$ (horizontal resolution $21$~km); ``grandchild'' domain, i.e. nest number~$3$ (horizontal resolution $7$~km).}
\end{figure}
\end{center}

\paragraph{Preparing callphys.def} If you run a simulation with, say, $3$ domains, please ensure that you have defined three files \ttt{callphys.def}, \ttt{callphys\_d2.def} and \ttt{callphys\_d3.def} (one per nest). If needed, different settings for physical parameterizations can be adopted in each nest; usually all settings in these files are identical, except \ttt{iradia}, which can be adjusted in each \ttt{callphys*.def} to account for the different dynamical timesteps of the nests and keep the radiative transfer calls synchronized.
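For illustration, here is a hedged sketch of such a synchronization, assuming \ttt{iradia} counts the number of timesteps between two calls to the radiative transfer (the timestep and \ttt{iradia} values are arbitrary, not taken from a reference simulation): with a parent timestep of $30$~s and a child timestep of $10$~s, one may set
\begin{verbatim}
iradia = 120     (in callphys.def,    parent: 120 x 30 s = 3600 s)
iradia = 360     (in callphys_d2.def, child:  360 x 10 s = 3600 s)
\end{verbatim}
so that both domains call the radiative transfer every $3600$~s of simulated time.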

\paragraph{Compiling} Use the command \ttt{makemeso} and specify the number of domains and the dimensions set in \ttt{namelist.input} (as far as the horizontal grid is concerned, answers to \ttt{makemeso} shall refer to the values of \ttt{e\_we} and \ttt{e\_sn} for the parent domain). This is of course done automatically if you use \ttt{runmeso}, which reads the information in \ttt{namelist.input}.

\paragraph{Running} If grid nesting and parallel computing are used, no more than~$4$ processors can be used. If the nested simulation is unstable, try a single-domain simulation with the parent domain and choose the best parameters for stability (e.g., \ttt{time\_step}), then add a first nested domain and repeat the stability tests, and so on with further nests.

\paragraph{Inputs/outputs} Defining several domains yields one output file per domain: e.g., for three domains, \ttt{geogrid.exe} yields \ttt{geo\_em.d01.nc}, \ttt{geo\_em.d02.nc}, \ttt{geo\_em.d03.nc}, \ldots; \ttt{real.exe} yields \ttt{wrfinput\_d01}, \ttt{wrfinput\_d02}, \ttt{wrfinput\_d03}, \ldots; \ttt{wrf.exe} yields \ttt{wrfout\_d01*}, \ttt{wrfout\_d02*}, \ttt{wrfout\_d03*}, \ldots

\paragraph{Useful remarks} The model presently supports 3 nests, but more nests can be included by adapting \ttt{runmeso} and the following files:
%\scriptsize
\begin{verbatim}
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_inifis3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_physiq3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/duplicate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/generate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/makegcm* ## search for 'nest'
\end{verbatim}
%\normalsize

\mk
\section{Running simulations with tracers}

\paragraph{Preparing namelist.input} The default behavior of the model is to include no tracers transported by the dynamics. This corresponds to \ttt{mars=0} in \ttt{namelist.input} (or the absence of the \ttt{mars} parameter from the user's namelist). To compute the water cycle in the LMD Martian Mesoscale Model, simply set \ttt{mars=1} in \ttt{namelist.input} (category \ttt{\&physics}). This adds one tracer for water vapor and one tracer for water ice to the model's computations and outputs. To compute a mesoscale simulation with one simple transported dust bin (with typical characteristics), set \ttt{mars=2} in \ttt{namelist.input}.
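As a minimal sketch, the corresponding excerpt of \ttt{namelist.input} for a water cycle simulation reads (the other \ttt{\&physics} entries of your existing file are left unchanged):
\begin{verbatim}
&physics
 mars = 1
/
\end{verbatim}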

\paragraph{GCM inputs} For water cycle simulations (\ttt{mars=1}), the GCM runs used to build initial and boundary conditions for the mesoscale model must also include water tracers. This is the default in the parameter files in \ttt{\$MESO/LMDZ.MARS/myGCM}, the compiler wrapper \ttt{\$MESO/LMDZ.MARS/compile} and the database of start files \ttt{STARTBASE\_64\_48\_32\_t2}.

\paragraph{Preparing callphys.def} It is important to set \ttt{callphys.def} in accordance with the option chosen for the keyword \ttt{mars} in \ttt{namelist.input}. For instance, for water cycle simulations (\ttt{mars=1}), the settings \ttt{tracer}, \ttt{sedimentation}, \ttt{iceparty} and \ttt{water} must all be set to \ttt{T} in \ttt{callphys.def}. An example file is \ttt{\$MMM/SIMU/DEF/REF\_ARTICLE/callphys.def.mars1}.
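The corresponding lines in \ttt{callphys.def} would thus read as follows (a sketch limited to the four keywords named above; check the exact layout against the example file just mentioned):
\begin{verbatim}
tracer        = T
sedimentation = T
iceparty      = T
water         = T
\end{verbatim}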

\paragraph{Compiling} It is key to recompile the LMD Martian Mesoscale Model with \ttt{makemeso} each time the number of transported tracers has changed, which is most often the case when you modify \ttt{mars} in \ttt{namelist.input}. The number of tracers corresponding to the chosen \ttt{mars} case must be specified when answering the questions asked by the \ttt{makemeso} script. This is of course done automatically if you use \ttt{runmeso}, which reads the information in \ttt{namelist.input}.

\paragraph{Inputs/outputs} Additional fields corresponding to tracer mixing ratios (e.g. \ttt{QH2O} for water vapor) are automatically output in the \ttt{wrfout*} files if an option other than~\ttt{0} is used for the \ttt{mars} keyword. Note that when a large number of tracers is set, output files might grow very large very quickly after the mesoscale simulation is launched.

\paragraph{Test case} A good test case consists in returning to the Arsia simulation described in section~\ref{sc:arsia} and activating the water cycle. Add \ttt{mars=1} to \ttt{namelist.input} and change \ttt{callphys.def} as described previously. Launch \ttt{runmeso} and choose \ttt{3} (i.e. recompile the model, run \ttt{real.exe} so that initial and boundary conditions for water are included, and finally run \ttt{wrf.exe}). Check for tracer fields in the output files \ttt{wrfout*}.

\mk
\section{Running Large-Eddy Simulations}

\paragraph{Prerequisites} Large-Eddy Simulations are very specific applications of the LMD Martian Meso\-scale Model which allow the user to simulate boundary layer turbulent convection in idealized conditions at fine spatial and temporal resolution. We recommend reading section 3.4 of \textit{Spiga and Forget} [2009] and the first three sections of \textit{Spiga et al.} [2010]\nocite{Spig:10bl}.

\paragraph{Preparing namelist.input} A typical parameter file \ttt{namelist.input\_les} is given in what follows (and can be found in \ttt{\$MMM/SIMU}). Settings specific to Large-Eddy Simulations are labelled with \ttt{LES}. The main differences with regular mesoscale simulations are the following:
\begin{citemize}
\item the duration of the simulation is specified in seconds,
\item the model top is specified as an altitude above the surface,
\item the dynamical timestep and the spatial resolutions are much smaller,
\item an additional \ttt{isfflx} keyword defines the surface forcings (\ttt{1} is recommended),
\item albedo and thermal inertia have to be set to uniform, user-defined values,
\item an idealized wind profile is assumed,
\item the \ttt{\&dynamics} keywords are adapted to small-scale diffusion,
\item periodic boundary conditions are set for the horizontal grid.
\end{citemize}

\scriptsize
\codesource{namelist.input_les}
\normalsize

%\vskip 0.4cm
\newpage

\paragraph{Preparing callphys.def} It is essential that \ttt{calldifv} is set to \ttt{T} and \ttt{calladj} is set to \ttt{F} for Large-Eddy Simulations. Generally \ttt{iaervar} is set to \ttt{1} so that the (uniform) dust opacity in the domain can be prescribed by adding a text file named \ttt{dustopacity.def} containing the chosen value for the opacity.
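For instance, prescribing a uniform dust opacity of $0.3$ (an arbitrary illustrative value) simply amounts to:
\begin{verbatim}
echo "0.3" > dustopacity.def
\end{verbatim}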

\paragraph{Compiling} The dynamical core used for Martian Large-Eddy Simulations is different from the one used for regular mesoscale simulations; it is based on WRF v3 instead of WRF v2. The first time the model is compiled, the user has to install it by typing the following commands:
\begin{verbatim}
cd $MMM/SRC/LES
./LMD_LES_MARS_install
cd $MMM
\end{verbatim}
The compilation of the Large-Eddy Simulations model is then carried out through the command:
\begin{verbatim}
makemeso -c les
\end{verbatim}
This creates a new compilation folder with the prefix \ttt{les}, in which the executables can be found once the model is compiled. Answers to \ttt{makemeso} must be compliant with the settings in \ttt{namelist.input}.

\paragraph{Inputs/outputs} Large-Eddy Simulations need four input files, \ttt{input\_coord}, \ttt{input\_sounding}, \ttt{input\_more} and \ttt{input\_therm}, which define the initial profiles of pressure, temperature, density and winds at the location/season for which the simulations are run, along with information about this location/season. Typical files are available upon request, or you may simply build your own profiles using the Mars Climate Database (see the sample \ttt{scilab} script \ttt{wrf\_sounding.sci} in \ttt{\$MMM/SIMU/RUN}). Examples of \ttt{input\_*} files are provided in \ttt{\$MMM/SRC/LES/modif\_mars/DEF} and correspond to the cases run in the study by \textit{Spiga et al.} [2010].

%% IMPORTANT IMPORTANT
%% now python inimeso.py in UTIL/PYTHON

\begin{citemize}
\item \ttt{input\_coord} contains the longitude, latitude, $L_s$ and local time;
\item \ttt{input\_sounding} contains (first line) the near-surface pressure (mbar), potential temperature, and a dummy value; then (subsequent lines) altitudes above the MOLA zero datum, potential temperatures, a dummy value, zonal wind components, meridional wind components;
\item \ttt{input\_more} contains, on a single line, the altimetry and the surface temperature;
\item \ttt{input\_therm} contains lines with the corresponding values of (from left column to right column)~$R$, $c_p$, pressure, density and temperature.
\end{citemize}
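For illustration, here is a hedged sketch of an \ttt{input\_sounding} file following the line structure described above (all numerical values are arbitrary placeholders, not a validated Martian profile): a near-surface pressure of $6.1$~mbar and a weakly stable profile with a uniform $10$~m~s$^{-1}$ zonal wind.
\begin{verbatim}
6.1     220.    0.
 100.   220.    0.    10.    0.
 500.   222.    0.    10.    0.
1000.   224.    0.    10.    0.
\end{verbatim}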

\paragraph{Running} Large-Eddy Simulations are not supported by \ttt{runmeso}. After compiling the model with the command \ttt{makemeso -c les}, please copy the executables \ttt{ideal.exe} and \ttt{wrf.exe} from the compilation directory \ttt{\$MMM/les*} to the simulation directory where the \ttt{input\_*} files are located. Running \ttt{ideal.exe} generates the initial state \ttt{wrfinput\_d01} from the profiles provided in the \ttt{input\_*} files; running \ttt{wrf.exe} then launches the model's integrations.
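In practice, the sequence of commands may look as follows (a sketch: \ttt{LESDIR} stands for your own simulation directory, and the exact name of the \ttt{les*} compilation folder depends on your answers to \ttt{makemeso}):
\begin{verbatim}
cd LESDIR                  # directory containing the input_* files
cp $MMM/les*/ideal.exe .
cp $MMM/les*/wrf.exe .
./ideal.exe                # builds the initial state from the input_* files
./wrf.exe                  # integrates the Large-Eddy Simulation
\end{verbatim}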


%%% the model actually supports 5 nests
%%% do not forget that mars must be duplicated in the namelist...


%\mk
%\section{Idealized test cases} [such as GW case]

%ze_hill ???
%version without physics ???


\mk
\section{Running simulations with the new physical parameterizations}

\sk
Using the most recent physical parameterizations means using a version of the LMD Martian Mesoscale Model that is still under development (and thus experimental). It is therefore recommended to contact the developers before running simulations in this mode. Reference setting files are located in \ttt{MESOSCALE/LMD\_MM\_MARS/SIMU/DEF/newphys\_THARSIS\_WATER}.

\sk
For advanced users who have learnt from the LMD team how to use the new physical parameterizations, here are a few differences with the physical parameterizations natively provided with the LMD Martian Mesoscale Model that must be kept in mind:
\begin{finger}
\item a folder \ttt{LMDZ\_MARS} containing the latest sources of the Mars LMD GCM must be located in the same directory as \ttt{MESOSCALE} (easy to do with SVN)
\item GCM runs used to produce initial and boundary conditions for the mesoscale model must be done in \ttt{MESOSCALE/LMDZ.MARS.new}
\item \ttt{makemeso} must be used with the option \ttt{-p}
\item the \ttt{callphys.def} file is different
\item modifying \ttt{datafile.h} is not necessary anymore; this can be done in \ttt{callphys.def}
\item an additional \ttt{run.def} file is needed
\item in \ttt{namelist.input}, the soil model must be set to 18 levels (see the illustrative excerpt after this list)
\item in \ttt{namelist.input}, the 6th-order small-scale diffusion must be switched off (i.e. \ttt{diff\_6th\_opt = 0}) if the horizontal resolution is fine ($<10$~km)
\item additional \ttt{mars} modes can be accessed (e.g. for interactive dust or the radiative effect of clouds)
%\item if \ttt{init\_TI} is modified, \ttt{real.exe} must be run again (because of subsurface modeling)
\item a varying map of surface roughness~$z_0$ can be used -- or a constant value can be set with \ttt{init\_Z0} in \ttt{namelist.input} (if there is a problem, the old reference value of $1$~cm is used)
\item (starting from version \ttt{r1038}) the model does not need to be recompiled if the number of tracers is changed
\item (starting from version \ttt{r1214}) the model does not need to be recompiled if the number of horizontal grid points or the number of processors is changed
\item (prior to version \ttt{r1247}) the number of scatterers must be given when compiling; a standard simulation uses 1 scatterer (2 are used for radiatively active water ice clouds)
\item (starting from version \ttt{r1247}) the model does not need to be recompiled if the number of scatterers is changed
\item (starting from version \ttt{r1272}) the model does not need to be recompiled if the number of vertical levels is changed
\end{finger}
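As announced in the list above, here is an illustrative \ttt{namelist.input} excerpt gathering these settings. The keyword chosen for the number of soil levels (\ttt{num\_soil\_layers}) and the placement of \ttt{init\_Z0} in \ttt{\&physics} are assumptions to be checked against the reference files in \ttt{newphys\_THARSIS\_WATER}:
\begin{verbatim}
&physics
 num_soil_layers = 18      ! 18-level soil model (keyword assumed)
 init_Z0         = 0.01    ! constant z0 in m, here the old 1 cm reference
/
&dynamics
 diff_6th_opt    = 0       ! no 6th-order diffusion at fine (<10 km) resolution
/
\end{verbatim}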

For nested runs, all versions posterior to \ttt{r1027} are broken. However, the interface between the WRF dynamical core and the LMD physical parameterizations has been significantly improved in \ttt{r1243}, which fixes nested runs and simplifies restart runs. Those improvements remain to be tested more extensively, but getting an operational model with nesting and restart runs should now only require minor adjustments that will be committed in subsequent revisions of the model.

%% r1199 MESOSCALE. possibility to experiment simulations with Wee et al. 2012 changes in initialization (more consistent handling of hypsometric equation)
%% run.def different if callphys different for nests

%%% Here is how to output near-surface diagnostics
%- Look for n_out in physiq.F; this will lead you to a few lines of code that allow you to output near-surface diagnostics
%- Change n_out to a value of 2
%- A few lines below, change z_out and make it equal to [1.6,0.5]
%Now this is OK for physics, but you have to make this available in the dynamical outputs of WRF
%Go to the mesoscale sources, then in Registry find Registry.EM
%then, following the example of this kind of line
%state real ALBBARE ij misc 1 - rhd "ALBBARE" "SOIL ALBEDO" "" #SAVEMARS2 albedodat
%add this
%state real TVIK ij misc 1 - rhd "TVIK" "temperature at 1.6m" "K" #SAVEMARS2 T_out1
%Then proceed through a full recompile of the mesoscale model from scratch

\sk
A fully functional modeling architecture with \ttt{LMD\_MM\_MARS} and the new physical parameterizations (as well as the \ttt{LMDZ\_MARS} Global Climate Model used to initialize the mesoscale simulations) can be downloaded and compiled using the \ttt{meso\_install.sh} script: \url{http://svn.lmd.jussieu.fr/Planeto/trunk/MESOSCALE/LMD_MM_MARS/SIMU/meso_install.sh}. The \ttt{NETCDF} environment variable must be set beforehand. The \ttt{meso\_install.sh} script only works on the IPSL \ttt{ciclad} computing cluster with the \ttt{ifort} compiler; it shall be used as a template for other environments (or for manually installing the architecture step-by-step). The options of the \ttt{meso\_install.sh} script can be displayed with the \ttt{-h} option.
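A minimal usage sketch (the NetCDF path below is an example; adapt it to your system):
\begin{verbatim}
declare -x NETCDF=/opt/netcdf3/ifort   # example path, adapt to your system
./meso_install.sh -h                   # display the available options
\end{verbatim}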

%I think the main point is to change CICLAD for a name representative of your local cluster, and then create an arch file analogous to what is done for arch_CICLADifort, but adapted to your local cluster

%declare -x WHERE_MPI=/usr/lib64/openmpi/1.6.5-ifort/bin/
%declare -x NETCDF=/opt/netcdf3/ifort
%declare -x NCDFLIB=$NETCDF/lib
%declare -x NCDFINC=$NETCDF/include
%### for fcm, new version of the GCM
%declare -x PATH=~millour/FCM_V1.2/bin/:$PATH
%## + add ./ to PATH

%r1613 (or slightly before) use of -p mars_lmd_new



\clearemptydoublepage