Timestamp:
Jan 8, 2012, 10:57:06 PM (13 years ago)
Author:
aslmd
Message:

MESOSCALE: final version of the user manual

Location:
trunk/MESOSCALE_DEV/MANUAL/SRC
Files:
1 added
1 deleted
11 edited

  • trunk/MESOSCALE_DEV/MANUAL/SRC/advance.tex

    r317 r493  
    77\section{Running nested simulations}\label{nests}
    88
    9 \paragraph{Preparing namelist.input} For simulations with \ttt{max\_dom} nested domains, \ttt{max\_dom} parameters must be set wherever there is a ``," in the \ttt{namelist.input\_full} template in chapter~\ref{zeparam}. Specific parameters for nested simulations are labelled with \ttt{(n)} in this \ttt{namelist.input} template (see e.g. categories \ttt{\&time\_control}, \ttt{\&domains} and \ttt{\&bdy\_control}). To help you with filling the \ttt{namelist.input} file for a nested simulation, a commented example is given below
      9\paragraph{Preparing namelist.input} For simulations with \ttt{max\_dom} nested domains, \ttt{max\_dom} parameters must be set wherever there is a ``,'' in the \ttt{namelist.input\_full} template in chapter~\ref{zeparam}. Specific parameters for nested simulations are labelled with \ttt{(n)} in this \ttt{namelist.input} template (see e.g. categories \ttt{\&time\_control}, \ttt{\&domains} and \ttt{\&bdy\_control}). To help you fill the \ttt{namelist.input} file for a nested simulation, a commented example is given below.
    1010
    1111\vskip -0.4cm
     
    1414\normalsize
    1515
    16 \paragraph{Preparing namelist.wps} As is the case for single-domain simulations, the common parameters in the two files \ttt{namelist.input} and~\ttt{namelist.wps} must be exactly similar. Similarly to single-domain simulations, an automated generation of \ttt{namelist.wps} from \ttt{namelist.input} is provided in the \ttt{runmeso} script. If you do not use \ttt{runmeso} to generate the \ttt{namelist.wps} file, please bear in mind that in this file, dates are different for the parent domain and the child domains, since boundary conditions are needed only for the parent domain while initial conditions are needed for all domains. The \ttt{namelist.wps} file associated to the previously described \ttt{namelist.input} file is given below and corresponds to a nested simulation in the Hellas Planitia region (Figure~\ref{nesteddomains}). Note that map projection is similar in all nests.
      16\paragraph{Preparing namelist.wps} As is the case for single-domain simulations, the common parameters in the two files \ttt{namelist.input} and~\ttt{namelist.wps} must be strictly identical. As for single-domain simulations, an automated generation of \ttt{namelist.wps} from \ttt{namelist.input} is provided in the \ttt{runmeso} script. If you do not use \ttt{runmeso} to generate the \ttt{namelist.wps} file, please bear in mind that in this file, dates are different for the parent domain and the child domains, since boundary conditions are needed only for the parent domain while initial conditions are needed for all domains. The \ttt{namelist.wps} file associated with the previously described \ttt{namelist.input} file is given below\footnote{You may find \ttt{namelist.input\_nests} and \ttt{namelist.wps\_nests} in \ttt{\$MMM/SIMU}.} and corresponds to a nested simulation in the Hellas Planitia region (Figure~\ref{nesteddomains}). Note that the map projection is the same in all nests.
    1717
    1818\vskip -0.2cm
     
    5656\paragraph{GCM inputs} For water cycle simulations (\ttt{mars=1}), the GCM runs used to build initial and boundary conditions for the mesoscale model must also include water tracers. This is done by default in parameter files in \ttt{\$MESO/LMDZ.MARS/myGCM}, compiler wrapper \ttt{\$MESO/LMDZ.MARS/compile} and the database of start files \ttt{STARTBASE\_64\_48\_32\_t2}.
    5757
    58 \paragraph{Preparing callphys.def} It is important to set \ttt{callphys.def} in accordance with the option chosen for the keyword \ttt{mars} in \ttt{namelist.input}. For instance, for water cycle simulations (\ttt{mars=1}), the following settings must be changed in \ttt{callphys.def}: \ttt{tracer}, \ttt{sedimentation}, \ttt{iceparty}, \ttt{water} shall be \ttt{T}.
      58\paragraph{Preparing callphys.def} It is important to set \ttt{callphys.def} in accordance with the option chosen for the keyword \ttt{mars} in \ttt{namelist.input}. For instance, for water cycle simulations (\ttt{mars=1}), the following settings must be changed in \ttt{callphys.def}: \ttt{tracer}, \ttt{sedimentation}, \ttt{iceparty} and \ttt{water} shall be set to \ttt{T}. An example file is \ttt{\$MMM/SIMU/DEF/REF\_ARTICLE/callphys.def.mars1}.
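As a minimal sketch, the relevant lines in \ttt{callphys.def} would then look as follows (the exact syntax should be checked against the example file mentioned above):
\begin{verbatim}
tracer        = T
sedimentation = T
iceparty      = T
water         = T
\end{verbatim}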
    5959
    60 \paragraph{Compiling} It is key to recompile the LMD Martian Mesoscale Model with \ttt{makemeso} each time the number of transported tracers has changed, which would most often be the case if you modify \ttt{mars} in \ttt{namelist.input}. The right number of tracers corresponding to the \ttt{mars} case you are setting must be specify when answering questions to the \ttt{makemeso} script. This is done automatically of course if you use \ttt{runmeso} which reads the information in \ttt{namelist.input}.
      60\paragraph{Compiling} It is key to recompile the LMD Martian Mesoscale Model with \ttt{makemeso} each time the number of transported tracers has changed, which would most often be the case if you modify \ttt{mars} in \ttt{namelist.input}. The right number of tracers corresponding to the \ttt{mars} case you are setting must be specified when answering the questions asked by the \ttt{makemeso} script. This is of course done automatically if you use \ttt{runmeso}, which reads the information in \ttt{namelist.input}.
    6161
    6262\paragraph{Inputs/outputs} Additional fields corresponding to tracer mixing ratios (e.g. \ttt{QH2O} for water vapor) are automatically output in \ttt{wrfout*} files if a different option than~\ttt{0} is used for the \ttt{mars} keyword. Note that when a large number of tracers is set, output files might grow very large quickly after the mesoscale simulation is launched.
     
    7676\item an additional \ttt{isfflx} keyword defines surface forcings (\ttt{1} is recommended),
    7777\item albedo and thermal inertia have to be set with uniform user-defined values,
    78 \item idealized wind profile is often assumed,
      78\item an idealized wind profile is assumed,
    7979\item \ttt{\&dynamics} keywords are adapted to small-scale diffusion,
    8080\item periodic boundary conditions are set for the horizontal grid.
     
    8585\normalsize
    8686
    87 \vskip 0.4cm
     87%\vskip 0.4cm
     88\newpage
    8889
    89 \paragraph{Preparing callphys.def} It is essential that \ttt{calldifv} is set to \ttt{T} and \ttt{calladj} is set to \ttt{F} for Large-Eddy Simulations. Generally \ttt{iaervar} is set to \ttt{1} so that the (uniform) opacity in the domain can be set by creating a text file named \ttt{dustopacity.def} with the chosen value for opacity in it.
     90\paragraph{Preparing callphys.def} It is essential that \ttt{calldifv} is set to \ttt{T} and \ttt{calladj} is set to \ttt{F} for Large-Eddy Simulations. Generally \ttt{iaervar} is set to \ttt{1} so that the (uniform) opacity in the domain can be set by adding a text file named \ttt{dustopacity.def} with the chosen value for opacity in it.
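For instance, a uniform dust opacity of (say)~$0.2$ can be prescribed by creating this file in the simulation directory (the value is only illustrative):
\begin{verbatim}
echo 0.2 > dustopacity.def
\end{verbatim}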
    9091
    9192\paragraph{Compiling} The dynamical core used for Martian Large-Eddy Simulations is different from the one used for usual mesoscale simulations; it is based on WRF v3 instead of WRF v2. The first time the model is compiled, the user has to install it by typing the following commands:
     
    101102This creates a new compilation folder with prefix \ttt{les} in which the executables can be found once the model is compiled. Answers to \ttt{makemeso} must be compliant with settings in \ttt{namelist.input}.
    102103
    103 \paragraph{Inputs/outputs} Large-Eddy Simulations need four input files \ttt{input\_coord}, \ttt{input\_sounding}, \ttt{input\_more}, \ttt{input\_therm} which define initial pressure, temperature, density, winds profiles at the location/season for which simulations are run, along with information about this location/season. Typical files are available upon request, or you might simply build your own profiles using the Mars Climate Database (see the sample \ttt{scilab} script \ttt{wrf\_sounding.sci} in \ttt{\$MMM/SIMU/RUN}). Examples for \ttt{input\_*} files are provided in \ttt{\$MMM/SIMU/DEF/LMD\_LES\_MARS\_def} and correspond to the cases run in the study by \textit{Spiga et al.} [2010].
      104\paragraph{Inputs/outputs} Large-Eddy Simulations need four input files \ttt{input\_coord}, \ttt{input\_sounding}, \ttt{input\_more}, \ttt{input\_therm} which define the initial pressure, temperature, density and wind profiles at the location/season for which simulations are run, along with information about this location/season. Typical files are available upon request, or you might simply build your own profiles using the Mars Climate Database (see the sample \ttt{scilab} script \ttt{wrf\_sounding.sci} in \ttt{\$MMM/SIMU/RUN}). Examples for \ttt{input\_*} files are provided in \ttt{\$MMM/SRC/LES/modif\_mars/DEF} and correspond to the cases run in the study by \textit{Spiga et al.} [2010].
    104105
    105106\begin{citemize}
     
    112113\paragraph{Running} Large-Eddy Simulations are not supported by \ttt{runmeso}. After compiling the model with the command \ttt{makemeso -c les}, please copy the executables \ttt{ideal.exe} and \ttt{wrf.exe} from the compilation directory \ttt{\$MMM/les*} to your simulation directory where the \ttt{input\_*} files are located. Running \ttt{ideal.exe} generates the initial state \ttt{wrfbdy\_d01} from the profiles provided in the \ttt{input\_*} files, then running \ttt{wrf.exe} launches the model's integrations.
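A possible sequence of commands is sketched below (the simulation directory name is illustrative, and \ttt{les*} refers to the compilation folder mentioned above):
\begin{verbatim}
cd /a_place/MY_LES                  # directory containing the input_* files
cp $MMM/les*/ideal.exe $MMM/les*/wrf.exe .
./ideal.exe                         # generates the initial state
./wrf.exe                           # launches the integrations
\end{verbatim}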
    113114
     115
     116%\mk
     117%\section{Idealized test cases} [such as GW case]
    114118
    115119%ze_hill ???
  • trunk/MESOSCALE_DEV/MANUAL/SRC/compile_exec.tex

    r262 r493  
    5151\item \ttt{WPS}: this is a directory containing sources for step~2.
    5252\item \ttt{POSTPROC}: this is a directory containing postprocessing sources.
    53 \item \ttt{PYTHON}: this is a directory containing \ttt{python}-based graphical scripts.
     53%\item \ttt{PYTHON}: this is a directory containing \ttt{python}-based graphical scripts.
    5454\item \ttt{LES} and \ttt{LESnophys\_}: these are directories containing sources for Large-Eddy Simulations.
    5555\end{citemize}
     
    5858Contents of~\ttt{LMD\_MM\_MARS/SIMU} subdirectory:
    5959\begin{citemize}
    60 \item \ttt{dustopacity.def}, \ttt{namelist.input\_full}, \ttt{namelist.input\_minim}, \ttt{namelist.input\_nests}, \ttt{namelist.input\_les}, \ttt{run.def}, \ttt{namelist.wps}, \ttt{namelist.wps\_les}: these are useful template files to guide you through setting up your own parameters for the LMD Martian Mesoscale Model simulations.
      60\item \ttt{callphys.def}, \ttt{dustopacity.def}, \ttt{run.def}, \ttt{namelist.input\_full}, \ttt{namelist.input\_minim}, \ttt{namelist.input\_nests}, \ttt{namelist.input\_les}, \ttt{namelist.wps\_example}, \ttt{namelist.wps\_nests}, \ttt{namelist.wps.template}: these are useful example and template files to guide you through setting up your own parameters for the LMD Martian Mesoscale Model simulations.
    6161\item \ttt{calendar}: this is a text file containing time management information in the model.
    6262\item \ttt{runmeso}: this is a \ttt{bash} script that can be used once the model and preprocessing systems are installed; it prepares and runs a mesoscale simulation by going from step~1 to~4.
     
    156156
    157157\sk
    158 To launch the test simulation, please type the following commands, replacing the \ttt{g95\_32\_single} directory with its corresponding value on your system. In the end, the model should run and output the computed meteorological fields in netCDF files named \ttt{wrfout*}. Feel free to browse those files with \ttt{ncview} or your favorite graphical tool to check if the simulated fields looks reasonable.
     158To launch the test simulation, please type the following commands, replacing the \ttt{g95\_32\_single} directory with its corresponding value on your system. In the end, the model should run and output the computed meteorological fields in netCDF files named \ttt{wrfout*}. Feel free to browse those files with \ttt{ncview} or your favorite graphical tool to check if the simulated fields look reasonable.
    159159%
    160160\begin{verbatim}
     
    184184
    185185\bk
    186 \scriptsize
    187 \begin{finger}
    188 \item If you compiled the model using MPICH2, the command to launch a simulation is slightly different:
     186%\scriptsize
     187\begin{finger}
     188\item If you compiled the model using MPI, the command to launch a simulation is slightly different:
    189189%
    190190\begin{verbatim}
    191 [simulation on 4 processors on 1 machine]
    192 mpd &      # first-time only (or after a reboot)
    193            # NB: may request the creation of a file .mpd.conf
    194 mpirun -np 4 wrf.exe < /dev/null &      # NB: mpirun is only a link to mpiexec 
    195 tail -20 rsl.out.000?     # to check the outputs
    196 \end{verbatim}
    197 \begin{verbatim}
    198 [simulation on 16 processors in 4 connected machines]
    199 echo barry.lmd.jussieu.fr > ~/mpd.hosts
    200 echo white.lmd.jussieu.fr >> ~/mpd.hosts
    201 echo loves.lmd.jussieu.fr >> ~/mpd.hosts
    202 echo tapas.lmd.jussieu.fr >> ~/mpd.hosts
    203 ssh barry.lmd.jussieu.fr   # make sure that ssh to other machines
    204                            # is possible without authentification
    205 mpdboot -f ~/mpd.hosts -n 4
    206 mpdtrace
    207 mpirun -l -np 16 wrf.exe < /dev/null &   # NB: mpirun is only a link to mpiexec
    208 tail -20 rsl.out.00??     # to check the outputs
    209 \end{verbatim}
    210 \end{finger}
    211 \normalsize
     191[if several connected machines, create a file mpd.hosts with machine names...]
      192[... and make sure that ssh between machines does not need authentication]
     193mpirun [-f mpd.hosts] -np number_of_processors wrf.exe < /dev/null &     
     194tail -20 rsl.out.000? # to check the outputs
     195\end{verbatim}
     196%%echo barry.lmd.jussieu.fr > ~/mpd.hosts
     197%%echo white.lmd.jussieu.fr >> ~/mpd.hosts
     198%%echo loves.lmd.jussieu.fr >> ~/mpd.hosts
     199%%echo tapas.lmd.jussieu.fr >> ~/mpd.hosts     
     200%%mpdboot  -n 4
     201%%mpdtrace
     202%%mpirun -l -np 16 wrf.exe < /dev/null &   # NB: mpirun is only a link to mpiexec
     203\end{finger}
     204%\normalsize
    212205
    213206\clearemptydoublepage
  • trunk/MESOSCALE_DEV/MANUAL/SRC/faq.tex

    r262 r493  
    4444\noindent \textbf{The model is no longer compiling, after I abruptly stopped the \ttt{makemeso} script because I realized that I made a mistake (e.g. I was compiling on the wrong machine).}
    4545\begin{finger}
    46 \item Recompile the model from scratch by using the option \ttt{-f} to \ttt{makemeso}.
      46\item Recompile the model from scratch by adding the option \ttt{-f} to \ttt{makemeso}, as in the example below.
    4747\end{finger}
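A possible command sequence (the answers to the \ttt{makemeso} questions are the same as for a regular compilation):
\begin{verbatim}
cd $MMM
makemeso -f
\end{verbatim}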
    4848
     
    5454
    5555\sk
    56 \noindent \textbf{I am afraid I explored a given compilation directory in \ttt{\$MMM} (say \ttt{g95\_32\_single} and broke something, e.g. deleted or break some links. The model does not compile anymore.}
      56\noindent \textbf{I am afraid I explored a given compilation directory in \ttt{\$MMM} (say \ttt{g95\_32\_single}) and broke something, e.g. deleted or broke some links. The model does not compile anymore.}
    5757\begin{finger}
    5858\item Delete the corresponding compilation directory. Since it is mostly filled with symbolic links, you will only lose the previously compiled executables and the (possibly modified) \ttt{Registry.EM} file. Save those files prior to deletion of the compilation directory if you would like to keep them. Then run \ttt{makemeso} again for the same combination of compiler/system and a new clean version of the compilation directory will reappear, while the model executables are recompiled from scratch.
     
    101101\noindent \textbf{\ttt{real.exe} is sometimes crashing with certain (low) values of \ttt{p\_top\_requested}.}
    102102\begin{finger}
    103 \item The program \ttt{real.exe} attempts to come up with nice equally-spaced-in-altitude vertical levels above the boundary layer up to the model top. This is done by an iterating algorithm integrating the hydrostatic equation which sometimes does not converge if the model top is too high (typically for values of \ttt{p\_top\_requested} below~$5$~Pa). Try to lower \ttt{force\_sfc\_in\_vinterp}, increase \ttt{max\_dz}, or modify \ttt{tiso} to help the algorithm to converge.
      103\item The program \ttt{real.exe} attempts to come up with nice equally-spaced-in-altitude vertical levels above the boundary layer up to the model top. This is done by an iterating algorithm integrating the hydrostatic equation, which sometimes does not converge if the model top is too high (typically for values of \ttt{p\_top\_requested} below~$5$~Pa). Try to lower \ttt{force\_sfc\_in\_vinterp}, increase \ttt{max\_dz}, or modify \ttt{tiso} to help the algorithm converge (see the sketch below). An alternative solution allowing for values of \ttt{p\_top\_requested} below~$5$~Pa is to prescribe your own vertical levels (see next point).
    104104\end{finger}
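As a sketch, \ttt{p\_top\_requested}, \ttt{force\_sfc\_in\_vinterp} and \ttt{max\_dz} belong to the \ttt{\&domains} category of \ttt{namelist.input} (illustrative values below; refer to the \ttt{namelist.input\_full} template in chapter~\ref{zeparam} for the exact syntax and defaults):
\begin{verbatim}
&domains
 p_top_requested      = 5,
 force_sfc_in_vinterp = 2,
 max_dz               = 1500.,
\end{verbatim}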
    105105
     
    119119
    120120\sk
    121 \noindent \textbf{I would like to know how much time my simulation will last.}
     121\noindent \textbf{I would like to know how long my simulation will last.}
    122122\begin{finger}
    123123\item Check the log information while \ttt{wrf.exe} is running. The time taken to perform each integration or output-writing step is indicated. Hence you can extrapolate and predict the total simulation time. If you use parallel computations, have a look in \ttt{rsl.error.0000} to get this information.
     
    131131
    132132\sk
    133 \noindent \textbf{Looks like in the model (cf. \ttt{namelist.input}, a Martian hour is~$3700$ seconds. The reality is closer to~$3699$ seconds.}
     133\noindent \textbf{Looks like in the model (cf. \ttt{namelist.input}), a Martian hour is~$3700$ seconds. The reality is closer to~$3699$ seconds.}
    134134\begin{finger}
    135 \item This is true, though obviously the \ttt{3700} figure is much more convenient and choosing this instead of~$3699$ has no impact whatsoever on simulations which last typically less than one month, and most often only a few days.
      135\item This is true, though obviously the~$3700$ figure is much more convenient and choosing this instead of~$3699$ has no impact whatsoever on simulations which typically last less than one month, and most often only a few days.
    136136\end{finger}
    137137
     
    145145\noindent \textbf{The executable \ttt{wrf.exe} crashes a few seconds after launching and I don't know why.}
    146146\begin{finger}
    147 \item Please check all outputs from \ttt{wrf.exe}: information log and \ttt{wrfout*} files. It is usually possible to find hints about the problem(s) which make the model become unstable or crash. Sometimes it is just one file that is missing. If \ttt{cfl} warnings are reported in information log, it is probably a good idea to lower the timestep, but this will not fix the problem all the time especially if there are wrong settings and subsequent physical inconsistencies. If everything looks fine in the information log, try to lower \ttt{history\_interval} to $1$ in \ttt{namelist.input} so that much frequent outputs can be obtained in the \ttt{wrfout*} files and the problem can be further diagnosed through analyzing simulated meteorological fields.
      147\item Please check all outputs from \ttt{wrf.exe}: \ttt{wrfout*} files and the information log (note that the model can be made more verbose by setting \ttt{debug\_level = 200} in \ttt{namelist.input}). It is usually possible to find hints about the problem(s) making the model unstable or causing the crash. Sometimes it is just one file that is missing. If \ttt{cfl} warnings are reported in the information log, it is probably a good idea to lower the timestep, but this will not always fix the problem, especially if there are wrong settings and subsequent physical inconsistencies. If everything looks fine in the information log, try to lower \ttt{history\_interval} to $1$ in \ttt{namelist.input} so that much more frequent outputs are obtained in the \ttt{wrfout*} files and the problem can be further diagnosed by analyzing the simulated meteorological fields (a sketch of those settings is given below).
    148148\end{finger}
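A minimal sketch of the corresponding \ttt{\&time\_control} settings in \ttt{namelist.input} (illustrative values):
\begin{verbatim}
&time_control
 history_interval = 1,     ! frequent outputs to diagnose the problem
 debug_level      = 200,   ! verbose information log
\end{verbatim}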
    149149
     
    151151\noindent \textbf{I don't know which timestep I should choose to prevent the model from crashing.}
    152152\begin{finger}
    153 \item The answer depends on the horizontal resolution according to the CFL condition -- and whether the dynamical core is used in hydrostatic or non-hydrostatic mode, plus other factors (e.g. slopes, temperature gradients, etc\ldots). Please refer to the table in \textit{Spiga and Forget} [2009] for guidelines about timestep. A rule-of-thumb to start with is to set \ttt{time\_step} to the value of \ttt{dx} in kilometers; this value can be sometimes raised to get faster integrations. If the \ttt{time\_step} parameter is too large for the horizontal resolution~\ttt{dx} and violates the CFL criterion, \ttt{wrf.exe} usually issues warnings about CFL violation in the first integration steps.
      153\item The answer depends on the horizontal resolution according to the CFL condition -- and whether the dynamical core is used in hydrostatic or non-hydrostatic mode, plus other factors (e.g. slopes, temperature gradients, etc.). Please refer to the table in \textit{Spiga and Forget} [2009] for guidelines about the timestep; or check examples in \ttt{\$MMM/SIMU/DEF}. A rule-of-thumb to start with is to set \ttt{time\_step} to the value of \ttt{dx} in kilometers (see the sketch below); this value can sometimes be raised to get faster integrations. If the \ttt{time\_step} parameter is too large for the horizontal resolution~\ttt{dx} and violates the CFL criterion, \ttt{wrf.exe} usually issues warnings about CFL violation in the first integration steps.
    154154\end{finger}
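For instance, with a~$10$~km horizontal resolution, the rule-of-thumb above gives the following \ttt{\&domains} settings in \ttt{namelist.input} (illustrative values only):
\begin{verbatim}
&domains
 dx        = 10000.,   ! horizontal resolution in meters
 dy        = 10000.,
 time_step = 10,       ! timestep in seconds
\end{verbatim}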
    155155
     
    160160\end{finger}
    161161
     162\sk
      163\noindent \textbf{I compiled the model with \ttt{ifort}. At runtime it stops after a few integration steps with a segmentation fault.}
     164\begin{finger}
     165\item The model uses a lot of memory, especially when large domains or nests are requested. Try the command \ttt{ulimit -s unlimited}. If this does not solve the problem, try other solutions listed in this chapter.
     166\end{finger}
     167
     168%\mk
     169%\section{Specific simulations}
     170%\sk
     171%\noindent \textbf{It seems difficult to me to find a number of horizontal grid points for parallel nested simulations that is compliant with all constraints mentioned in section~\ref{nests}.}
     172%\begin{finger}
     173%\item A tip to find a compliant domain size for nested simulations: for the parent domain, choose \ttt{e\_we} and \ttt{e\_sn} according to~$e_{we} = n_{proc} \times i + 1$ with~$n_{proc}$ being a multiple of~$4$ and~$2 \, i +1$ being a multiple of~$3$. For child domains, set \ttt{e\_we} and \ttt{e\_sn} according to \ttt{e\_we[child domain] = e_we[parent domains] + 4}.
     174%\end{finger}
      175%%%%% TOO SPECIFIC !!!
      176%%        -----------------------------------------------------------------------
      177%%        -- if possible, how to determine the size?
      178%%        nproc must divide e_we-1 (1st nest)
      179%%        grid_ratio must divide e_we-1 +4 (1st nest)
      180%%        let e_we=ye+1
      181%%        grid_ratio divides ye+4 and nproc divides ye
      182%%        take nproc=8, ye=8*i
      183%%        thus there exists j such that 8i + 4 = 3j, i.e. 4*[2i+1] = 3j
      184%%        satisfied for instance if 2i+1 is a multiple of 3
      185%%        hence it suffices to find an odd multiple of 3 and deduce i
      186%%        for instance 2i+1=33 >>>> i=16
      187%%        >>>> e_we = 129 for the 1st nest (and add 4 for the following ones)
      188%%        ------------------------------------------------------------------------
     189
     190
    162191
    163192%%% DIFFUSION FOR TRACERS
    164193%%% GRAVITY WAVE ABSORBING LAYER
    165 
     194%%% ILM files with PGF90 ?
      195%%% WPS PREP_MARS can be linked between e.g. pgf and mpi, or ifort and mpifort
    166196
    167197\clearemptydoublepage
  • trunk/MESOSCALE_DEV/MANUAL/SRC/foreword.tex

    r262 r493  
    22
    33\vk
    4 \paragraph{Welcome!} This manual describes how to use the Laboratoire de M\'et\'eorologie Dynamique (LMD) Martian Mesoscale Model. Many thanks for looking forward to using this model which development required countless hours of hard work! A significant part of the model development and validation have been funded by ESA and CNES which are acknowledged here.
      4\paragraph{Welcome!} This manual describes how to use the Laboratoire de M\'et\'eorologie Dynamique (LMD) Martian Mesoscale Model. Many thanks for looking forward to using this model, whose development required countless hours of hard work! A significant part of the model development and validation has been funded by ESA and CNES, which are acknowledged for their support.
    55
    66\paragraph{Contact} The main contact to reach at LMD to become a user of the model is Aymeric SPIGA (main developer, \href{mailto:aymeric.spiga@upmc.fr}{\nolinkurl{aymeric.spiga@upmc.fr}}). Alternative contacts at LMD for mesoscale modeling inquiries are Ehouarn MILLOUR~\url{ehouarn.millour@lmd.jussieu.fr} or Fran\c cois FORGET~\url{francois.forget@lmd.jussieu.fr}. We are open to questions and suggestions on new scientific collaborations, teaching/outreach actions or contractual proposals.
  • trunk/MESOSCALE_DEV/MANUAL/SRC/guide.tex

    r315 r493  
    1111
    1212\sk
    13 \paragraph{Prerequisites} Prepare parameter files (copy templates or pre-existing files); Edit those files; Use \ttt{\$MMM/SIMU/calendar} (or cf. appendix) to choose simulation dates and fill the namelists; Pay attention to correspondances between \ttt{namelist.input} and \ttt{namelist.wps}. See~\ref{zeparam} and~\ref{wps} for further details.
      13\paragraph{Prerequisites} Prepare parameter files (copy templates or pre-existing files); Edit those files; Use \ttt{\$MMM/SIMU/calendar} (or cf. appendix) to choose simulation dates and fill the namelists; Pay attention to correspondences between \ttt{namelist.input} and \ttt{namelist.wps}. \emph{See~\ref{zeparam} and~\ref{wps} for further details}.
    1414\begin{verbatim}
    1515cd /a_place/MY_SIMU
     
    2121
    2222\sk
    23 \paragraph{Step 0} Compile the model. See~\ref{sc:makemeso} for further details.
     23\paragraph{Step 0} Compile the model. \emph{See~\ref{sc:makemeso} for further details}.
    2424\begin{verbatim}
    2525cd $MMM
     
    3434
    3535\sk
    36 \paragraph{Step 1} Run the LMD Global Circulation Model (GCM) to provide initial and boundary conditions for the mesoscale model. See~\ref{gcmini} for further details.
     36\paragraph{Step 1} Run the LMD Global Circulation Model (GCM) to provide initial and boundary conditions for the mesoscale model. \emph{See~\ref{gcmini} for further details}.
    3737\begin{verbatim}
    3838cd $MESO/LMDZ.MARS/myGCM
     
    4848
    4949\sk
    50 \paragraph{Step 2} Create the mesoscale limited-area domain of simulation. Run preprocessing programs to horizontally interpolate GCM meteorological fields and static data (topography, soil properties) to the chosen simulation domain. See~\ref{wps} for further details.
     50\paragraph{Step 2} Create the mesoscale limited-area domain of simulation. Run preprocessing programs to horizontally interpolate GCM meteorological fields and static data (topography, soil properties) to the chosen simulation domain. \emph{See~\ref{wps} for further details}.
    5151\begin{verbatim}
    5252cd $MMM/your_compdir/WPS
     
    5959
    6060\sk
    61 \paragraph{Step 3} Run preprocessing programs to vertically interpolate GCM meteorological fields and generate the initial and boundary conditions directly used by the mesoscale model. See~\ref{real.exe} for further details.
     61\paragraph{Step 3} Run preprocessing programs to vertically interpolate GCM meteorological fields and generate the initial and boundary conditions directly used by the mesoscale model. \emph{See~\ref{real.exe} for further details}.
    6262\begin{verbatim}
    6363cd /a_place/MY_SIMU
    64 ln -sf $MMM/your\_compdir/WPS/WRFFEED/current/met_em* .
     64ln -sf $MMM/your_compdir/WPS/WRFFEED/current/met_em* .
    6565real.exe
    6666[check that wrfinput* wrfbdy* netCDF files are created]
     
    116116\item If \ttt{runmeso} went well through steps~$1$ and~$2$, but encountered an error in~$3$, once the error has been corrected \ttt{runmeso} is not required to perform steps~$1$ and~$2$ again and can be started directly at step~$3$ (by typing~$3$, see possible choices above).
    117117\item The \ttt{LMD:*} files created by a \ttt{runmeso} call which features step~$1$ are kept in \ttt{WPSFEED} (located in \ttt{\$MESO/TMPDIR}). Those files will be overwritten by subsequent calls to \ttt{runmeso} if you choose to re-run the GCM at similar dates.
    118 \item The \ttt{met\_em*} files created by a \ttt{runmeso} call which features step~$2$ are kept in a directory in \ttt{WRFFEED} (located in \ttt{\$MESO/TMPDIR}) which name refers to precise date and time, so that it will not be overwritten by subsequent calls to \ttt{runmeso} for other simulations. In the simulation directory \ttt{runmeso} creates a \ttt{met\_em} directory which contains links towards the \ttt{met\_em*} files.
      118\item The \ttt{met\_em*} files created by a \ttt{runmeso} call which features step~$2$ are kept in a directory in \ttt{WRFFEED} (located in \ttt{\$MESO/TMPDIR}) whose name refers to the precise date and time, so that they will not be overwritten by subsequent calls to \ttt{runmeso} for other simulations. In the simulation directory, \ttt{runmeso} creates a \ttt{met\_em} directory which contains links towards the \ttt{met\_em*} files.
    119119\item The contents of directories in \ttt{\$MESO/TMPDIR} (i.e. \ttt{GCMINI}, \ttt{WPSFEED}, \ttt{WRFFEED}) might grow large as you launch more and more simulations with \ttt{runmeso}. It is probably a good idea to clean up from time to time files referring to old obsolete simulations.   
    120120\end{finger}
  • trunk/MESOSCALE_DEV/MANUAL/SRC/installation.tex

    r262 r493  
    22
    33\vk
    4 This chapter is meant for first time users of the LMD Martian Mesoscale Model. We describe how to install the model on your system. Experience with either the terrestrial WRF mesoscale model or the LMD Martian GCM is not absolutely required,
    5 although it would help you getting more easily through the installation process.
      4This chapter is meant for first time users of the LMD Martian Mesoscale Model. We describe how to install the model on your system. Experience with either the terrestrial WRF mesoscale model or the LMD Martian GCM is not absolutely required, although it would help you get through the installation process more easily.
    65
    76\mk
     
    3534\sk
    3635\begin{finger}
    37 \item If you want the environment variables to be persistent in your system, please copy the \ttt{declare} command lines spread in this user manual in your \ttt{.bashrc} or \ttt{.bash\_profile}.
      36\item If you want the environment variables to be persistent in your system, copy the \ttt{declare} command lines spread throughout this user manual into your \ttt{.bashrc} or \ttt{.bash\_profile}.
    3837\item You might also find it useful -- though not mandatory -- to install on your system:
    3938\begin{citemize}
     
    108107
    109108\sk
    110 Parallel computations with the Message Passing Interface (MPI) standard are supported by the LMD Martian Mesoscale Model. If you want to use this capability, you would have to add the installation of MPICH2 as a additional prerequisite. Once the installation is completed, it is required to define the environment variable \ttt{\$WHERE\_MPI} to point in your MPICH \ttt{bin} directory, even if you added the \ttt{\$your\_software\_dir/MPI/mpich2-1.0.8/bin} directory to your \ttt{\$PATH} variable.
      109Parallel computations with the Message Passing Interface (MPI) standard are supported by the LMD Martian Mesoscale Model. If you want to use this capability, you would have to add the installation of \ttt{MPICH2} or \ttt{openMPI} as an additional prerequisite. Once the installation is completed, it is required to define the environment variable \ttt{\$WHERE\_MPI} to point to your MPI \ttt{bin} directory, even if you added this directory to your \ttt{\$PATH} variable.
    111110
    112111\begin{finger}
    113 \item \scriptsize Here is a brief ``how-to" to install MPICH2, although this surely does not replace reading carefully installation notes and choosing which installation suits best your system. Please download the current stable version of the sources (e.g. we choose here an old version \ttt{mpich2-1.0.8.tar.gz} for the sake of illustration) on the MPICH2 website \url{http://www.mcs.anl.gov/research/projects/mpich2} and install the MPICH2 utilities by the following commands:
     112\scriptsize
     113\item An installation script for openMPI is available upon request (and information can be easily retrieved from the web).
      114\item Here is a brief ``how-to'' to install MPICH2, although this surely does not replace reading the installation notes carefully and choosing the installation which best suits your system. Please download the current stable version of the sources (e.g. we choose here an old version \ttt{mpich2-1.0.8.tar.gz} for the sake of illustration) on the MPICH2 website \url{http://www.mcs.anl.gov/research/projects/mpich2} and install the MPICH2 utilities by the following commands:
    114115\begin{verbatim}
    115 mkdir $your_software_dir/MPI   
    116 mv mpich2-1.0.8.tar.gz $your_software_dir/MPI/
    117 cd $your_software_dir/MPI
    118 tar xzvf mpich2-1.0.8.tar.gz
    119 cd mpich2-1.0.8
     116mkdir $your_software_dir/MPI ; mv mpich2-1.0.8.tar.gz $your_software_dir/MPI/ ; cd $your_software_dir/MPI
     117tar xzvf mpich2-1.0.8.tar.gz ; cd mpich2-1.0.8
    120118./configure --prefix=$PWD --with-device=ch3:nemesis > conf.log 2> conferr.log &
    121 # please wait...
    122119make > mk.log 2> mkerr.log &
    123120declare -x WHERE_MPI=$your_software_dir/MPI/mpich2-1.0.8/bin
  • trunk/MESOSCALE_DEV/MANUAL/SRC/parameters.tex

    r262 r493  
    3939\item \ttt{(r)} indicates parameters whose modification implies a new compilation\footnote{A full recompilation using the option \ttt{makemeso -f} is not needed here.} of the model using \ttt{makemeso} (step 0);
    4040\item \ttt{(p1)}, \ttt{(p2)}, \ttt{(p3)} mention parameters whose modification implies a new processing of initial and boundary conditions (see chapter~\ref{zepreproc}), corresponding respectively to steps~1, 2, 3; \ttt{(p1)} means the user has to carry out steps~1 to~3 again before being able to run the model at step~4; \ttt{(p2)} means the user has to carry out steps~2 to~3 again before running the model at step~4;
    41 \item no label means once you have modified the parameter, you can simply start directly at step~4 (running the model);
     41\item no label means that once you have modified the parameter, you can simply start directly at step~4 (running the model);
    4242\item \ttt{(*d)} denotes dynamical parameters whose modification implies non-standard simulations -- please read \ttt{\$MMM/SRC/WRFV2/run/README.namelist} and use with caution, i.e. if you know what you are doing; after modifying those parameters you can simply start at step~4.
    4343\item \ttt{(*)} denotes parameters not to be modified;
     
    5151
    5252\sk
    53 \subsection{Advice on filling \ttt{namelist.input}}\label{namelist}
     53\subsection{Important advice on filling \ttt{namelist.input}}\label{namelist}
    5454
    5555\paragraph{Test case} An interesting exercise is to analyze comparatively the \ttt{TESTCASE/namelist.input} file (cf. section~\ref{sc:arsia}) with the reference \ttt{namelist.input\_full} given above, so that you could understand which settings are being made in the Arsia Mons test simulation. Then you could try to modify parameters in the \ttt{namelist.input} file and re-run the model to start getting familiar with the various settings. Given that the test case relies on pre-computed initial and boundary conditions, not all parameters can be changed in the \ttt{namelist.input} file at this stage.
     
    5757\paragraph{Syntax} Please pay attention to rigorous syntax while editing your personal \ttt{namelist.input} file to avoid reading errors. If the model complains about this at runtime, start again with the available template \ttt{\$MMM/SIMU/namelist.input\_full}.
    5858
    59 \paragraph{Time management} Usually the user would like to start/end the mesoscale simulation at a given solar aerocentric longitude~$L_s$ or a given sol in the Martian year\footnote{Information on Martian calendars: \url{http://www-mars.lmd.jussieu.fr/mars/time/solar_longitude.html}.}. In the \ttt{namelist.input} file, start/end time is set in the form year / month / day with each month corresponding to a ``slice" of~$30^{\circ}$~$L_s$. The file~\ttt{\$MMM/SIMU/calendar} (reproduced in appendix) is intended to help the user to perform the conversion prior to filling the \ttt{namelist.input} file. In the above example of \ttt{namelist.input\_minim}, the simulation with the LMD Martian Mesoscale Model takes place on month~7 and day~1, which corresponds, according to the \ttt{calendar} file, to~$L_s \sim 180^{\circ}$. In the Arsia Mons test case, the simulation with the LMD Martian Mesoscale Model takes place on month~1 and day~17, which corresponds, according to the \ttt{calendar} file, to~$L_s \sim 8^{\circ}$.
     59\paragraph{Time management} Usually the user would like to start/end the mesoscale simulation at a given solar aerocentric longitude~$L_s$ or a given sol in the Martian year\footnote{Information on Martian calendars: \url{http://www-mars.lmd.jussieu.fr/mars/time/solar_longitude.html}.}. In the \ttt{namelist.input} file, start/end time is set in the form year / month / day with each month corresponding to a ``slice" of~$30^{\circ}$~$L_s$. The file~\ttt{\$MMM/SIMU/calendar} (reproduced in appendix) is intended to help the user to perform the conversion prior to filling the \ttt{namelist.input} file. In the above example of \ttt{namelist.input\_minim}, the simulation with the LMD Martian Mesoscale Model takes place on month~7 and day~1, which corresponds to~$L_s \sim 180^{\circ}$ according to the \ttt{calendar} file. In the Arsia Mons test case, the simulation with the LMD Martian Mesoscale Model takes place on month~1 and day~17, which corresponds to~$L_s \sim 8^{\circ}$.
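As an illustration, the Arsia Mons test case mentioned above corresponds to the following \ttt{\&time\_control} settings in \ttt{namelist.input} (a sketch consistent with the dates used in that test case):
\begin{verbatim}
&time_control
 start_year  = 2024,
 start_month = 01,
 start_day   = 17,     ! Ls ~ 8 degrees according to the calendar file
\end{verbatim}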
    6060
    6161\mk
  • trunk/MESOSCALE_DEV/MANUAL/SRC/postproc.tex

    r315 r493  
    2020
    2121\sk
    22 It is also possible to output fields which are present only in the physical computations, i.e. appearing in \ttt{\$MMM/SRC/WRFV2/mars\_lmd/libf/phymars/physiq.F}. The method is simple. Assume you would like to output in the \ttt{wrfout*} files a 3D field named \ttt{zdtnirco2} and a 2D field named \ttt{qsurfice} in \ttt{physiq.F} with the new names \ttt{HR\_NIR} and \ttt{QSURFICE}. All you have to do is add the following lines to \ttt{Registry.EM} (see also examples around lines \ttt{75-120}. For 2D [3D] files the 4th column must be \ttt{ij} [\ttt{ikj}] and the 12th column \ttt{\#SAVEMARS2} [\ttt{\#SAVEMARS3}].
      22It is also possible to output fields which are present only in the physical computations, i.e. appearing in \ttt{\$MMM/SRC/WRFV2/mars\_lmd/libf/phymars/physiq.F}. The method is simple. Assume you would like to output in the \ttt{wrfout*} files a 3D field named \ttt{zdtnirco2} and a 2D field named \ttt{qsurfice} in \ttt{physiq.F} with the new names \ttt{HR\_NIR} and \ttt{QSURFICE}. All you have to do is add the following lines to \ttt{Registry.EM} (see also examples around lines \ttt{75-120}). For 2D [3D] fields the 4th column must be \ttt{ij} [\ttt{ikj}] and the 12th column \ttt{\#SAVEMARS2} [\ttt{\#SAVEMARS3}].
    2323
    2424\scriptsize
     
    3030
    3131\sk
    32 Each change in \ttt{Registry.EM} must be followed by a complete recompilation because the model variables have changed. Whether you use \ttt{makemeso} or \ttt{runmeso}, use the option \ttt{-r} to force recompiling with a new/updated list of variables.
     32Each change in \ttt{Registry.EM} must be followed by a complete recompilation because the model variables have changed. Whether you use \ttt{makemeso} or \ttt{runmeso}, use the option \ttt{-f} to force recompiling with a new/updated list of variables.
    3333%%%% now obsolete ! remove Registry and recompile, or use fresh start (-f).
    3434
     
    4242
    4343\sk
    44 The fields output in \ttt{wrfout*} files are given for each grid point and model level. A vertical interpolation has to be performed to get those fields either in altitude or pressure levels. In addition, perturbation potential temperature \ttt{T}, x-component wind \ttt{U} and y-component \ttt{V} are output instead of the more informative (meteorogically-speaking) temperature \ttt{tk}, zonal wind \ttt{Um} and meridional wind \ttt{Vm}. This is why we developed a program named \ttt{api} (Altitude and Pressure Interpolator) which performs the tasks to convert the netCDF \ttt{wrfout*} files into another netCDF file featuring more useful fields to make plots and analyze the Martian mesoscale meteorology.
     44The fields output in \ttt{wrfout*} files are given for each grid point and model level. A vertical interpolation has to be performed to get those fields either in altitude or pressure levels. In addition, perturbation potential temperature \ttt{T}, x-component wind \ttt{U} and y-component \ttt{V} are output instead of the more informative (meteorologically-speaking) temperature \ttt{tk}, zonal wind \ttt{Um} and meridional wind \ttt{Vm}. This is why we developed a program named \ttt{api} (Altitude and Pressure Interpolator) which performs the tasks to convert the netCDF \ttt{wrfout*} files into another netCDF file featuring more useful fields to make plots and analyze the Martian mesoscale meteorology.
    4545
    4646\sk
    47 The source files for \ttt{api} are located in \ttt{\$MMM/SRC/POSTPROC/}. The program \ttt{api.F90} has to be compiled with the \ttt{comp\_api} command (which must be edited first, to uncomment the line corresponding to the Fortran compiler you are used to). Then the user has to fill in the parameter file \ttt{namelist.api} before launching the interpolator through the command \ttt{api}. A commented template for \ttt{namelist.api} is given below (this examples and many others can be found in \ttt{\$MMM/SRC/POSTPROC/}). The calculations might be long if you are asking for many fields and many interpolation levels. In the example below, temperature, meteorological winds and vertical velocity are interpolated at~$50$~m above the local surface. The results are output in a netCDF file having the same name as the input \ttt{wrfout*} files, with an additional suffix which depends on the chosen interpolation method.
      47The source files for \ttt{api} are located in \ttt{\$MMM/SRC/POSTPROC/}. The program \ttt{api.F90} has to be compiled with the \ttt{comp\_api} command (which must be edited first, to uncomment the line corresponding to the Fortran compiler you are used to). Then the user has to fill in the parameter file \ttt{namelist.api} before launching the interpolator through the command \ttt{api}. A commented template for \ttt{namelist.api} is given below (this example and many others can be found in \ttt{\$MMM/SRC/POSTPROC/}). The calculations might be long if you are requesting many fields and many interpolation levels. In the example below, temperature, meteorological winds and vertical velocity are interpolated at~$50$~m above the local surface. The results are output in a netCDF file having the same name as the input \ttt{wrfout*} files, with an additional suffix which depends on the chosen interpolation method.
    4848
    4949\scriptsize
     
    6464
    6565\sk
    66 Powerful scripts based on \ttt{python+numpy+matplotlib} have been developed to obtain plots from the mesoscale model outputs. All figures in this user manual are based on the scripts \ttt{domain.py} and \ttt{winds.py} (more scripts will be available in the future). Those scripts can be found in \ttt{\$MMM/SRC/PYTHON}. It is required that \ttt{python} and numerical/graphical librairies (\ttt{numpy}, \ttt{scipy}, \ttt{matplotlib}, \ttt{basemap}, \ttt{netcdf} are installed on your system. Perhaps the simplest way to do so is to install the user-friendly complete python distribution \ttt{epd} (cf. link in chapter~\ref{install}). One of the advantages of an approach using \ttt{python}, apart from its open-source philosophy and the abundant online documentation, is that in a common framework it allows for scripting with various options, integrating Fortran routines, manipulating arrays, making plots with various map projections. This is exemplified by the \ttt{winds.py} script. It can both perform interpolation with \ttt{api} for the level requested by the user then generate a map, all that in one simple command line. For instance, Figures~\ref{arsia} in chapter~\ref{compile} has been generated by the following two commands:
      66Powerful scripts based on \ttt{python+numpy+matplotlib} have been developed to obtain plots from the mesoscale model outputs. All figures in this user manual are based on the script \ttt{pp.py}. This script can be obtained with the commands: \ttt{cd \$MESO ; cd .. ; svn update UTIL ; cd UTIL/PYTHON}. It is required that \ttt{python} and numerical/graphical libraries (\ttt{numpy}, \ttt{scipy}, \ttt{matplotlib}, \ttt{basemap}, \ttt{netcdf}) are installed on your system. Perhaps the simplest way to do so is to install the user-friendly complete python distribution \ttt{epd} (cf. link in chapter~\ref{install} and readme file \ttt{UTIL/PYTHON/README.INSTALL}).
    6767
    68 \scriptsize
     68\sk
      69One of the advantages of an approach using \ttt{python}, apart from its open-source philosophy and the abundant online documentation, is that it allows, in a common framework, for scripting with various options, integrating Fortran routines, manipulating arrays, and making plots with various map projections. This is exemplified by the \ttt{pp.py} script. It can both perform interpolation with \ttt{api} at the level requested by the user and generate a map, all in one simple command line. For instance, Figure~\ref{arsia} in chapter~\ref{compile} has been generated by the following two commands\footnote{The first plot can also be obtained by the command \ttt{domain.py -f name\_of\_file}}:
     70
    6971\begin{verbatim}
    70 domain.py -f wrfout_d01_2024-01-17_02:00:00
    71 winds.py -f wrfout_d01_2024-01-17_02:00:00 -i 4 -l 0.01 -v HGT -n -1 -m -1500. -M 20000. -s 2
     72pp.py -f wrfout_d01_2024-01-17_02:00:00 -p ortho -b vishires --title ""
     73pp.py -f wrfout_d01_2024-01-17_02:00:00 -i 4 -l 0.01 -v HGT -W -s 2 \
     74      --time 1 --axtime lt -m -1500. -M 20000. --div 20 -c nobar -z 25
    7275\end{verbatim}
    73 \normalsize
    7476
    75 Many options can be used in our \ttt{python} scripts. The example of command \ttt{winds.py} at the time of writing is listed below; this information can be obtained by typing \ttt{winds.py -h}. This script can also be easily edited to suit your needs if the option you need does not exist.
     77\sk
      78Many options are implemented in our \ttt{pp.py} script. Information on the existing options of~\ttt{pp.py} can be obtained by typing \ttt{pp.py -h} (cf. next page). Examples of how to use the \ttt{pp.py} script can be found in \ttt{UTIL/PYTHON/README.PP}. The script can also be edited to suit your needs if the desired option does not exist.
     79
     80\begin{finger}
      81\item Please ensure that you have the rights to execute \ttt{pp.py} (otherwise use the \ttt{chmod} command). It is also necessary to set the following environment variables to ensure that the command \ttt{pp.py} executes in any working directory:
     82\begin{verbatim}
     83PYTHONPATH=$MESO/../UTIL/PYTHON/
     84export PYTHONPATH
     85PATH=$PYTHONPATH:$PATH
     86\end{verbatim}
      87\item The option \ttt{-i} in \ttt{pp.py} makes use of the Fortran routines \ttt{api.F90} and \ttt{time.F}. These routines have to be converted to \ttt{python} modules using \ttt{f2py}. Please execute, amongst \ttt{api\_g95.sh}, \ttt{api\_ifort.sh} and \ttt{api\_pgf90.sh}, the script which corresponds to the Fortran compiler installed on your system (see the sketch below). Check for errors/warnings in the log files and ensure that the two files \ttt{api.so} and \ttt{timestuff.so} are generated.
     88\end{finger}
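A possible sequence of commands (assuming the \ttt{api\_*.sh} scripts sit alongside \ttt{pp.py} in \ttt{UTIL/PYTHON}; the choice of \ttt{api\_ifort.sh} is only an example):
\begin{verbatim}
cd $MESO/../UTIL/PYTHON
./api_ifort.sh              # or api_g95.sh / api_pgf90.sh, depending on your compiler
ls api.so timestuff.so      # both files should now be present
\end{verbatim}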
     89
     90\newpage
    7691
    7792\scriptsize
     
    7994\normalsize
    8095
    81 \begin{finger}
    82 \item Please ensure that you have the rights to execute \ttt{domain.py} and \ttt{winds.py} (otherwise use the \ttt{chmod} command). It is also necessary to set the following environment variables to ensure the commands \ttt{winds.py} or \ttt{domain.py} would execute in any working directory
    83 \begin{verbatim}
    84 PYTHONPATH=$MMM/SRC/PYTHON/
    85 export PYTHONPATH
    86 PATH=$PYTHONPATH:$PATH
    87 \end{verbatim}
    88 \item The option \ttt{-i} in \ttt{winds.py} make use of the Fortran routine \ttt{api.F90} (and the routine \ttt{time.F} is also needed). The routines have to be converted to \ttt{python} commands using \ttt{f2py}. Please execute the script amongst \ttt{api\_g95.sh}, \ttt{api\_ifort.sh}, \ttt{api\_pgf90.sh} which corresponds to the Fortran compiler installed on your system. Check for errors/warnings in the log files and ensure that the two files \ttt{api.so} and \ttt{timestuff.so} are generated.
    89 \end{finger}
    90 
    9196\clearemptydoublepage
  • trunk/MESOSCALE_DEV/MANUAL/SRC/preproc.tex

    r262 r493  
    88
    99\sk
    10 The compilation operations indicated here need to be done only once on a given system.
     10The compilation operations indicated here need to be done only once on a given system with a given compiler.
    1111
    1212\sk
     
    3030ln -sf ../SRC/SCRIPTS/prepare_ini .
    3131./prepare_ini
    32 echo $PWD
    33 \end{verbatim}
     32\end{verbatim}
     33%%echo $PWD
    3434
    3535\sk
     
    3737
    3838\sk
    39 The script \ttt{prepare\_ini} plays for the preprocessing tools a similar role as the \ttt{copy\_model} with the model sources: files are simply linked to their actual location in the \ttt{SRC} folder. Once you have executed \ttt{prepare\_ini}, please check that two folders were generated: \ttt{PREP\_MARS} and \ttt{WPS}. In the \ttt{PREP\_MARS} directory, please compile the programs \ttt{create\_readmeteo.exe} and \ttt{readmeteo.exe}, using the compiler mentioned in the name of the current installation directory. In the \ttt{WPS} directory, please compile the programs \ttt{geogrid.exe} and \ttt{metgrid.exe}. Here are the useful commands:
      39The script \ttt{prepare\_ini} plays a similar role for the preprocessing tools as the script \ttt{copy\_model} does for the model sources: files are simply linked to their actual location in the \ttt{SRC} folder. Once you have executed \ttt{prepare\_ini}, please check that two folders were generated: \ttt{PREP\_MARS} and \ttt{WPS}. In the \ttt{PREP\_MARS} directory, please compile the programs \ttt{create\_readmeteo.exe} and \ttt{readmeteo.exe}, using the compiler mentioned in the name of the current installation directory. In the \ttt{WPS} directory, please compile the programs \ttt{geogrid.exe} and \ttt{metgrid.exe}. Here are the useful commands:
    4040
    4141\begin{verbatim}
     
    5151
    5252\sk
    53 Apart from the executables just compiled, the preprocessing utilities include \ttt{real.exe}, which was compiled by the \ttt{makemeso} script along with the mesoscale model executable \ttt{wrf.exe}\footnote{Even though the name of the executable reads e.g. \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, such program is not related to the specific \ttt{makemeso} parameters -- contrary to the \ttt{wrf.exe} executable. We just found that renaming the (possibly similar if the model sources were not modified) \ttt{real.exe} executable was a practical way not to confuse between executables compiled at different moments.}. \ttt{real.exe} should be copied or linked in the simulation directory (e.g. \ttt{TESTCASE} for the Arsia Mons test case) to be at the same level than \ttt{namelist.input}.
      53Apart from the executables just compiled, the preprocessing utilities include \ttt{real.exe}, which was compiled by the \ttt{makemeso} script along with the mesoscale model executable \ttt{wrf.exe}\footnote{Even though the name of the executable reads e.g. \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, such program is not related to the specific \ttt{makemeso} parameters -- contrary to the \ttt{wrf.exe} executable. We just found that renaming the (possibly similar if the model sources were not modified) \ttt{real.exe} executable was a practical way to avoid confusion between executables compiled at different moments.} (cf. chapter~\ref{compile}). \ttt{real.exe} should be copied or linked in the simulation directory (e.g. \ttt{TESTCASE} for the Arsia Mons test case) to be at the same level as \ttt{namelist.input}.
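For instance, for the Arsia Mons test case (the executable name and compilation directory below are only examples; use those produced by your own \ttt{makemeso} run):
\begin{verbatim}
cd TESTCASE
ln -sf $MMM/your_compdir/real_x61_y61_z61_d1_t1_p1.exe real.exe
\end{verbatim}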
    5454
    5555\sk
     
    8686
    8787\sk
    88 The other necessary operation to prepare the LMD-MGCM for step~1 is to store a set of initial states for the LMD-MGCM to start with, based on previous typical LMD-MGCM runs having reached equilibrium after ten years of integration. A reference database can be found in the following online archive~\url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz}. This archive must be extracted somewhere on a disk that would be accessible to the system you plan to run the mesoscale model on. A link named~\ttt{startbase} towards the \ttt{STARTBASE\_64\_48\_32\_t2} directory must be created in the directory~\ttt{\$MESO/LMDZ.MARS/myGCM}. If those operations went well, please try the command line~\ttt{echo 22 | launch\_gcm} in this directory, which should launch the GCM integrations on your system.
     88The other necessary operation to prepare the LMD-MGCM for step~1 is to store a set of initial states for the LMD-MGCM to start with, based on previous typical LMD-MGCM runs having reached equilibrium after ten years of integration. A reference database\footnote{If another database is used, \ttt{compile} must be edited; default is~$64 \times 48 \times 32$ GCM runs with~$2$ tracers.} can be found in the following online archive~\url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz}. This archive must be extracted somewhere on a disk that would be accessible to the system you plan to run the mesoscale model on. A link named~\ttt{startbase} towards the \ttt{STARTBASE\_64\_48\_32\_t2} directory must be created in the directory~\ttt{\$MESO/LMDZ.MARS/myGCM}.
     89
     90\sk
     91GCM integrations can then be launched in~\ttt{\$MESO/LMDZ.MARS/myGCM} using~\ttt{launch\_gcm}.
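A minimal sketch of these operations (the storage path for the database is of course system-dependent):
\begin{verbatim}
cd /a_disk_with_space
wget ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz
tar xzvf STARTBASE_64_48_32_t2.tar.gz
cd $MESO/LMDZ.MARS/myGCM
ln -sf /a_disk_with_space/STARTBASE_64_48_32_t2 startbase
launch_gcm
\end{verbatim}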
    8992
    9093\mk
     
    98101\begin{finger}
    99102\item changing the season of simulation implies re-running the LMD Mars GCM for this specific season to prepare initial and boundary conditions for the mesoscale model. Hence e.g. \ttt{start\_month} is labelled with \ttt{(p1)} because changing this in \ttt{namelist.input} requires a complete reprocessing from step~$1$ to step~$3$ to successfully launch the simulation.
    100 \item changing the number of horizontal grid points for the mesoscale domain implies to interpolate the static and GCM fields to the new domain, while no new computations on the GCM side are needed. Hence e.g. \ttt{e\_we} is labelled with \ttt{(p2)} because changing this in \ttt{namelist.input} requires a reprocessing from step~$2$ to step~$3$ to successfully launch the simulation (and also for this specific parameter recompiling with \ttt{makemeso} is needed).
      103\item changing the number of horizontal grid points for the mesoscale domain implies interpolating the static and GCM fields to the new domain, while no new computations on the GCM side are needed. Hence e.g. \ttt{e\_we} is labelled with \ttt{(p2)} because changing this in \ttt{namelist.input} requires a reprocessing from step~$2$ to step~$3$ to successfully launch the simulation (for this specific parameter recompiling with \ttt{makemeso} is also needed).
     101104\item changing the position of the model top implies interpolating initial and boundary conditions onto the new vertical levels, while no horizontal re-interpolations are needed. Hence e.g. \ttt{p\_top\_requested} is labelled with \ttt{(p3)} because changing this requires a reprocessing of step~$3$.
    102105\item changing the timestep for dynamical integration does not require any change in initial and boundary conditions. Hence e.g. \ttt{time\_step} is not labelled with \ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)}.
     
    114117
    115118\sk
    116 Here we assume that the user has chosen a given Martian sol or $L_s$ on which to start the mesoscale simulation. As already mentionned in section~\ref{namelist}, the file \ttt{\$MMM/SIMU/calendar} reproduced in appendix can help with this choice (i.e. sol$\rightarrow$$L_s$$\rightarrow$mesoscale date and vice-versa). In addition, the user has to check in the \ttt{calendar} file which sol is before the one wanted for simulation start and has $99$ in the first column: such sols are the ones for which an initial starting file for the GCM is available. Then the number of GCM simulated days \ttt{nday} in \ttt{\$MESO/LMDZ.MARS/myGCM/run.def} must be set accordingly: suppose you want to start a mesoscale simulation at sol~9 during 4~sols, then according to the \ttt{calendar} file, sol~8 is the closest file before sol~9 to be in the database, so \ttt{nday} must be at least~$5$. For optimal forcing at the boundaries, we advise you to write the meteorological fields to the \ttt{diagfi.nc} file at least each two hours, or ideally each hour, i.e. \ttt{ecritphy} is respectively~$80$ or~$40$ in \ttt{\$MESO/LMDZ.MARS/myGCM/run.def}. Eventually the GCM run can be launched using the following commands and should produce a netCDF data file named \ttt{diagfi.nc}:
      119Here we assume that the user has chosen a given Martian sol or $L_s$ on which to start the mesoscale simulation. As already mentioned in section~\ref{namelist}, the file \ttt{\$MMM/SIMU/calendar} reproduced in appendix can help with this choice (i.e. sol$\rightarrow$$L_s$$\rightarrow$mesoscale date and vice-versa). In addition, the user has to find in the \ttt{calendar} file the last sol before the desired simulation start which has $99$ in the first column: such sols are the ones for which an initial starting file for the GCM is available. Then the number of GCM simulated days \ttt{nday} in \ttt{\$MESO/LMDZ.MARS/myGCM/run.def} must be set accordingly: suppose you want to start a mesoscale simulation at sol~9 and run it for 4~sols; according to the \ttt{calendar} file, sol~8 is the closest sol before sol~9 to be in the database, so \ttt{nday} must be at least~$5$. For optimal forcing at the boundaries, we advise you to write the meteorological fields to the \ttt{diagfi.nc} file at least every two hours, or ideally every hour\footnote{The parameter \ttt{interval\_seconds} in \ttt{namelist.wps} (see section~\ref{wps}) has to be set accordingly.}, i.e. \ttt{ecritphy} is respectively~$80$ or~$40$ in \ttt{\$MESO/LMDZ.MARS/myGCM/run.def}. Finally, the GCM run can be launched using the following commands and should produce a netCDF data file named \ttt{diagfi.nc}:
    117120
    118121\begin{verbatim}
     
    134137
    135138\sk
    136 Once the GCM simulations are finished, programs in the \ttt{PREP\_MARS} directory allow the user to convert the data\footnote{If the fields \ttt{emis}, \ttt{co2ice}, \ttt{q01}, \ttt{q02}, \ttt{tsoil} are missing in the \ttt{diagfi.nc} file, those are replaced by respective default values $0.95$, $0$, $0$, $0$, tsurf.} from the NETCDF \ttt{diagfi.nc} file into separated binary datafiles for each date contained in \ttt{diagfi.nc}, which follows the formatting needed by the preprocessing programs at step 2. These programs can be executed by the following commands; if everything went well with the conversion, the directory \ttt{\$MESO/TMPDIR/WPSFEED} should contain files named \ttt{LMD:*}.
    137 
    138 \begin{verbatim}
    139 cd $MMM/your_install_dir/PREP\_MARS
      139Once the GCM simulations are finished, programs in the \ttt{PREP\_MARS} directory allow the user to convert the data from the NETCDF \ttt{diagfi.nc} file into separate binary datafiles\footnote{If the fields \ttt{emis}, \ttt{co2ice}, \ttt{q01}, \ttt{q02}, \ttt{tsoil} are missing in the \ttt{diagfi.nc} file, they are replaced by the default values $0.95$, $0$, $0$, $0$ and \ttt{tsurf}, respectively.} for each date contained in \ttt{diagfi.nc} and formatted for the preprocessing programs at step 2. These programs can be executed by the following commands; if everything went well with the conversion, the directory \ttt{\$MESO/TMPDIR/WPSFEED} should contain files named \ttt{LMD:*}.
     140
     141\begin{verbatim}
     142cd $MMM/your_install_dir/PREP_MARS
    140143echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
    141144./readmeteo.exe < readmeteo.def
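# if the conversion went well, WPSFEED should now contain one LMD:* file per date
ls $MESO/TMPDIR/WPSFEED/LMD:*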
     
    154157\end{verbatim}
    155158
    156 The result of \ttt{geogrid.exe} -- and thus the definition of the mesoscale domain -- can be checked in the NETCDF file \ttt{geo\_em.d01.nc} (using for instance \ttt{ncview}, or your favorite graphical interface for netCDF files, or python-based scripts as in section~\ref{postproc}). If you are unhappy with the results or you want to change the location of the mesoscale domain on the planet, the horizontal resolution, the number of grid points \ldots, please modify the parameter file \ttt{namelist.wps}, content thereof is reproduced/commented on the next page, and execute again \ttt{geogrid.exe}.
    157 
    158 \begin{finger}
    159 \item No input meteorological data are actually needed to execute \ttt{geogrid.exe}. This step~2a can be achieved/prepared e.g. before step~1. It is probably a good idea to prepare step~2 by choosing the mesoscale simulation domain while GCM computations being performed done during step~1.
      159The result of \ttt{geogrid.exe} -- and thus the definition of the mesoscale domain -- can be checked in the NETCDF file \ttt{geo\_em.d01.nc} (using for instance \ttt{ncview}, or your favorite graphical interface for netCDF files, or python-based scripts as in section~\ref{postproc}). If you are unhappy with the results or you want to change the location of the mesoscale domain on the planet, the horizontal resolution, the number of grid points \ldots, please modify the parameter file \ttt{namelist.wps}, the content of which is reproduced and commented on the next page\footnote{You may find the corresponding file in \ttt{\$MMM/SIMU/namelist.wps\_example}.}, and execute \ttt{geogrid.exe} again.
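A quick look at the resulting domain can be obtained with standard netCDF tools, assuming those are installed on your system:
\begin{verbatim}
ncdump -h geo_em.d01.nc     ## list dimensions, attributes and static fields
ncview geo_em.d01.nc        ## browse the interpolated static fields
\end{verbatim}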
     160
     161\begin{finger}
      162\item No input meteorological data are actually needed to execute \ttt{geogrid.exe}. This step~2a can thus be done e.g. before step~1. It is probably a good idea to prepare step~2 by choosing the mesoscale simulation domain while the GCM computations of step~1 are being performed.
     160163\item More details about the database and more interpolation options can be found in the file \ttt{geogrid/GEOGRID.TBL} (for advanced users only).
     161164\item Two examples of \ttt{namelist.wps} parameters are given in Figure~\ref{vallespolar} along with the resulting domains.
     
    163166
    164167\footnotesize
    165 \codesource{namelist.wps_TEST}
     168\codesource{namelist.wps_example}
    166169\normalsize
    167170
     
    185188
    186189\sk
    187 \paragraph{Step 2b} Once the \ttt{geo\_em} file(s) are generated, the \ttt{metgrid.exe} program performs a similar horizontal interpolation of the meteorological fields to the mesoscale domain as the one performed by \ttt{geogrid.exe} for the surface data (interpolation options can be modified by advanced users in \ttt{metgrid/METGRID.TBL}). Then the program writes the results in \ttt{met\_em} files and also collects the static fields and domain parameters included in the \ttt{geo\_em} file(s). If everything went well with the commands below, the directory \ttt{\$MESO/TMPDIR/WRFFEED} should contain \ttt{met\_em.*} files.
      190\paragraph{Step 2b} Once the \ttt{geo\_em} file(s) are generated, the \ttt{metgrid.exe} program performs a horizontal interpolation of the meteorological fields onto the mesoscale domain, similar to the one performed by \ttt{geogrid.exe} for the surface data (interpolation options can be modified by advanced users in \ttt{metgrid/METGRID.TBL}). The program then writes the results in \ttt{met\_em} files and also collects the static fields and domain parameters included in the \ttt{geo\_em} file(s). If everything went well with the commands below, the directory \ttt{\$MESO/TMPDIR/WRFFEED/current} should contain \ttt{met\_em.*} files.
    188191
    189192\begin{verbatim}
     
    197200
    198201\sk
    199 The last preprocessing step before being able to run the mesoscale simulation at step~4 is to execute \ttt{real.exe} to perform the interpolation from the vertical levels of the GCM to the vertical levels defined in the mesoscale model. This program also prepares the final initial state for the simulation in files called \ttt{wrfinput} and the boundary conditions in files called \ttt{wrfbdy}. To successfully execute \ttt{real.exe}, you need the \ttt{met\_em.*} files and the \ttt{namelist.input} file to be in the same directory as \ttt{real.exe}. Parameters in \ttt{namelist.input} which controls the behavior of the vertical interpolation are those labelled with \ttt{(p3)} in the detailed list introduced in chapter~\ref{zeparam}.
      202The last preprocessing step before being able to run the mesoscale simulation at step~4 is to execute \ttt{real.exe} to perform the interpolation from the vertical levels of the GCM to the vertical levels defined in the mesoscale model. This program also prepares the final initial state for the simulation in files named \ttt{wrfinput} and the boundary conditions in files named \ttt{wrfbdy}. To successfully execute \ttt{real.exe}, you need the \ttt{met\_em.*} files and the \ttt{namelist.input} file to be in the same directory as \ttt{real.exe}. Parameters in \ttt{namelist.input} which control the behavior of the vertical interpolation are those labelled with \ttt{(p3)} in the detailed list introduced in chapter~\ref{zeparam}.
    200203
    201204\begin{verbatim}
    202205cd $MMM/TESTCASE   ## or anywhere you would like to run the simulation
    203 ln -sf $MESO/TMPDIR/WRFFEED/met_em* .
     206ln -sf $MESO/TMPDIR/WRFFEED/current/met_em* .
    204207./real.exe
    205208\end{verbatim}
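If \ttt{real.exe} completed successfully, the initial and boundary condition files should now be present in the simulation directory; for a single-domain simulation those are typically named \ttt{wrfinput\_d01} and \ttt{wrfbdy\_d01}:
\begin{verbatim}
ls -l wrfinput_d01 wrfbdy_d01
ncdump -h wrfinput_d01      ## optional: inspect the initial state
\end{verbatim}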
    206209
    207210\sk
    208 The final message of the \ttt{real.exe} should claim the success of the processes and you are now ready to launch the integrations of the LMD Martian Mesoscale Model again with the \ttt{wrf.exe} command as in section \ref{sc:arsia}.
    209 
    210 \sk
    211 \begin{finger}
    212 \item \textbf{ When you modify either \ttt{namelist.wps} or \ttt{namelist.input}, make sure that the common parameters are exactly similar in both files (especially when running nested simulations) otherwise either \ttt{real.exe} or \ttt{wrf.exe} command will exit with an error message. Also, obviously the dates sent to \ttt{launch\_gcm} and written in both \ttt{namelist.input} and \ttt{namelist.wps} should be consistent. }
      211The final message from \ttt{real.exe} should confirm that the processes completed successfully; you are now ready to launch the integrations of the LMD Martian Mesoscale Model with the \ttt{wrf.exe} command as in section \ref{sc:arsia}.
     212
     213\sk
     214\begin{finger}
      215\item \textbf{ When you modify either \ttt{namelist.wps} or \ttt{namelist.input}, make sure that the common parameters are exactly the same in both files (especially when running nested simulations), otherwise the \ttt{real.exe} or \ttt{wrf.exe} command will exit with an error message. Obviously the dates sent to \ttt{launch\_gcm} and set in both \ttt{namelist.input} and \ttt{namelist.wps} should be consistent too. } A quick way to compare the common parameters at a glance is sketched after this list.
    213216\end{finger}
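A rough way to compare at a glance the common parameters of the two files is to list them with \ttt{grep}; the list of parameters below is indicative, not exhaustive:
\begin{verbatim}
cd $MMM/TESTCASE        ## or your own simulation directory
grep -E "e_we|e_sn|dx|dy|start_" namelist.input namelist.wps
\end{verbatim}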
    214217
  • trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual.tex

    r262 r493  
    2424\begin{titlepage}
    2525\mbox{}\\
     26%\Ovalbox{
     27%\begin{minipage}{1\textwidth} \begin{center}
     28%\Huge{LMD Martian Mesoscale Model [LMD-MMM]}\\\huge{User Manual}
     29%\end{center} \end{minipage}
     30%}
    2631\Ovalbox{
    2732\begin{minipage}{1\textwidth} \begin{center}
    28 \Huge{LMD Martian Mesoscale Model}\\\huge{User Manual}
     33\Huge{LMD Martian Mesoscale Model [LMD-MMM]}
    2934\end{center} \end{minipage}
    3035}
     36\mbox{}\\
     37\begin{center}\Huge{User Manual}\end{center}
    3138\mbox{}\\
    3239\begin{center}
    33 \includegraphics[width=0.85\textwidth]{domain_100.png}
     40\includegraphics[width=0.8\textwidth]{domain_100.png}
    3441\end{center}
    3542\mbox{}\\
     
    5360%%%%%%%%%%%%%%%%%%%%%%
    5461\dominitoc
    55 %\ajusterletitrecourant{Sommaire}
     62\ajusterletitrecourant{User Manual for the LMD Martian Mesoscale Model}
    5663%\pagestyle{empty}
    5764\tableofcontents
  • trunk/MESOSCALE_DEV/MANUAL/SRC/winds.py.help

    r262 r493  
    1   -h, --help                    show this help message and exit
    2   -f NAMEFILE, --file=NAMEFILE  [NEEDED] name of WRF file (append)
    3   -l NVERT, --level=NVERT       level (def=0)(-i 2: p,mbar)(-i 3,4: z,km)
    4   -p PROJ, --proj=PROJ          projection
    5   -b BACK, --back=BACK          background image (def: None)
    6   -t TARGET, --target=TARGET    destination folder
    7   -s STRIDE, --stride=STRIDE    stride vectors (def=3)
    8   -v VAR, --var=VAR             variable color-shaded (append)
    9   -n NUMPLOT, --num=NUMPLOT     plot number (def=2)(<0: plot LT -*numplot*)
    10   -i INTERP, --interp=INTERP    interpolation (2: p, 3: z-amr, 4:z-als)
    11   -c COLORB, --color=COLORB     change colormap (nobar: no colorbar)
    12   -x, --no-vect                 no wind vectors
    13   -m VMIN, --min=VMIN           bounding minimum value (append)
    14   -M VMAX, --max=VMAX           bounding maximum value (append)
    15   -T, --tiled                   draw a tiled plot (no blank zone)
    16   -z ZOOM, --zoom=ZOOM          zoom factor in %
    17   -N, --no-api                  do not recreate api file
    18   -d, --display                 do not pop up created images
    19   -e ITSTEP, --itstep=ITSTEP    stride time (def=4)
    20   -H, --hole                    holes above max and below min
    21   -S SAVE, --save=SAVE          save mode (png,eps,svg,pdf or gui)(def=gui)
    22   -a, --anomaly                 compute and plot relative anomaly in %
    23   -w VAR2, --with=VAR2          variable contoured
    24   --div=NDIV                    number of divisions in colorbar (def: 10)
    25   -F FIRST, --first=FIRST       first subscript to plot (def: 1)
     1Usage: pp.py [options]
    262
     3Options:
     4  -h, --help            show this help message and exit
     5  -f FILE, --file=FILE  [NEEDED] filename. Append: different figures. Comma-
     6                        separated: same figure (+ possible --operation). Regex
     7                        OK: use -f "foo*" DONT FORGET QUOTES "" !!!!
     8  -t TGT, --target=TGT  destination folder
     9  -S SAVE, --save=SAVE  save mode (gui,png,eps,svg,pdf,txt,html,avi) [gui]
     10  -d, --display         do not pop up created images
     11  -O OUT, --output=OUT  output file name
     12  --rate=RATE           output is a movie along Time dimension [None]
     13  --quality             For movie mode: improves movie quality.(slower)
     14  -v VAR, --var=VAR     variable color-shaded
     15  -w VAR2, --with=VAR2  variable contoured
     16  -a, --anomaly         compute and plot relative anomaly in %
     17  --mult=MULT           a multiplicative factor to plotted field
     18  -m VMIN, --min=VMIN   bounding minimum value [min]
     19  -M VMAX, --max=VMAX   bounding maximum value [max]
     20  -H, --hole            holes above max and below min
     21  --nolow               do not plot low |values| [False]
     22  --redope=REDOPE       REDuce OPErators: mint,maxt for the moment [None]
     23  -l LVL, --level=LVL   level / start,stop,step (-i 2: p,mb)(-i 3,4: z,km) [0]
     24  -i ITP, --interp=ITP  interpolation (2: p, 3: z-amr, 4:z-als)
     25  --intas=INTAS         specify "mcs" or "tes" for gcm P interpolation grid
     26  -N, --no-api          do not recreate api file
     27  -c CLB, --color=CLB   change colormap (nobar: no colorbar)
     28  --div=NDIV            number of divisions in colorbar [10]
     29  --title=ZETITLE       customize the whole title
     30  -T, --tiled           draw a tiled plot (no blank zone)
     31  --res=RES             resolution for png outputs. --save png needed. [200.]
     32  --trans=TRANS         shaded plot transparency, 0 to 1 (=opaque) [1]
     33  --area=AREA           area on the map to be plot [None]
     34  -p PROJ, --proj=PROJ  projection
     35  -b BACK, --back=BACK  background image [None]
     36  -W, --winds           wind vectors [False]
     37  -s STE, --stride=STE  stride vectors [3]
     38  -z ZOOM, --zoom=ZOOM  zoom factor in %
     39  --blat=BLAT           reference lat (or bounding lat for stere) [computed]
     40  --blon=BLON           reference lon [computed]
     41  --lat=SLAT            slices along lat. 2 comma-separated values: averaging
     42  --lon=SLON            slices along lon. 2 comma-separated values: averaging
     43  --vert=SVERT          slices along vert. 2 comma-separated values: averaging
     44  --column              changes --vert z1,z2 from MEAN to INTEGRATE along z
     45  --time=STIME          slices along time. 2 comma-separated values: averaging
     46  --xmax=XMAX           max value for x-axis in contour-plots [max(xaxis)]
     47  --ymax=YMAX           max value for y-axis in contour-plots [max(yaxis)]
     48  --xmin=XMIN           min value for x-axis in contour-plots [min(xaxis)]
     49  --ymin=YMIN           min value for y-axis in contour-plots [min(yaxis)]
     50  --inverty             force decreasing values along y-axis (e.g. p-levels)
     51  --logy                set y-axis to logarithmic
     52  --axtime=AXTIME       choose "ls","sol","lt" for time ref (1D or --time)
     53  --operation=OPERAT    operation to perform on input files given through -f.
     54                        "+" or "-" acts on each input file by adding or
     55                        substracting the ref file specified through --fref.
     56                        "cat" acts on all input files in-a-row. "add_var"
     57                        "sub_var" "mul_var" "div_var" acts on two variables.
     58  --fref=FREF           reference namefile for the --operation option.
     59  --mope=VMINOPE        bounding minimum value for inter-file operation
     60  --Mope=VMAXOPE        bounding maximum value for inter-file operation
     61  --titleref=TITREF     title for the reference file. [title of fig (1)]
     62  --tsat                convert temperature field T in Tsat-T using pressure