Changeset 262 for trunk/MESOSCALE_DEV/MANUAL
- Timestamp: Aug 5, 2011, 4:48:44 AM
- Location: trunk/MESOSCALE_DEV/MANUAL/SRC
- Files: 12 edited
trunk/MESOSCALE_DEV/MANUAL/SRC/advance.tex
\vk
In this chapter, advice to perform more sophisticated simulations is provided to advanced users.

\mk
…
\paragraph{Preparing namelist.input} For simulations with \ttt{max\_dom} nested domains, \ttt{max\_dom} parameters must be set wherever there is a ``,'' in the \ttt{namelist.input\_full} template in chapter~\ref{zeparam}. Specific parameters for nested simulations are labelled with \ttt{(n)} in this \ttt{namelist.input} template (see e.g. categories \ttt{\&time\_control}, \ttt{\&domains} and \ttt{\&bdy\_control}). To help you fill the \ttt{namelist.input} file for a nested simulation, a commented example is given below.

\vskip -0.4cm
\scriptsize
\codesource{namelist.input_nests}
\normalsize

\paragraph{Preparing namelist.wps} As is the case for single-domain simulations, the common parameters in the two files \ttt{namelist.input} and~\ttt{namelist.wps} must be exactly similar.
Similarly to single-domain simulations, an automated generation of \ttt{namelist.wps} from \ttt{namelist.input} is provided in the \ttt{runmeso} script. If you do not use \ttt{runmeso} to generate the \ttt{namelist.wps} file, please bear in mind that in this file, dates are different for the parent domain and the child domains, since boundary conditions are needed only for the parent domain while initial conditions are needed for all domains. The \ttt{namelist.wps} file associated with the previously described \ttt{namelist.input} file is given below and corresponds to a nested simulation in the Hellas Planitia region (Figure~\ref{nesteddomains}). Note that the map projection must be similar in all nests.

\vskip -0.2cm
…
\paragraph{Preparing callphys.def} If you run a simulation with, say, $3$ domains, please ensure that you have defined three files \ttt{callphys.def}, \ttt{callphys\_d2.def} and \ttt{callphys\_d3.def} (one per nest). If needed, different settings for physical parameterizations can be made in each nest; usually all settings in these files are similar, except \ttt{iradia} (so that differences in dynamical timesteps between nests can be reflected in \ttt{callphys*.def} in order to synchronize the radiative transfer calls).

\paragraph{Compiling} Use the command \ttt{makemeso} and specify the number of domains and dimensions set in \ttt{namelist.input} (as far as the horizontal grid is concerned, answers to \ttt{makemeso} shall refer to the values of \ttt{e\_we} and \ttt{e\_sn} for the parent domain). This is done automatically of course if you use \ttt{runmeso}, which reads the information in \ttt{namelist.input}.
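To illustrate the ``one value per domain'' rule mentioned above, here is a minimal sketch of a few \ttt{\&domains} entries for a run with two nests; the parameter names are standard WRF ones, but the values are hypothetical and not taken from the actual \ttt{namelist.input\_nests} template:

\scriptsize
\begin{verbatim}
&domains
 max_dom           = 3,
 grid_id           = 1,  2,  3,
 parent_id         = 1,  1,  2,
 parent_grid_ratio = 1,  3,  3,
 e_we              = 61, 61, 61,
 e_sn              = 61, 61, 61,
/
\end{verbatim}
\normalsize

Each comma-separated column refers to one domain, the first column being the parent domain.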
\paragraph{Running} If grid nesting and parallel computing are used, no more than~$4$ processors can be used. If the nested simulation is unstable, try a single-domain simulation with the parent domain and choose the best parameters for stability (e.g., \ttt{time\_step}), then add a first nested domain, start stability tests and investigations again, etc.

\paragraph{Inputs/outputs} Defining several domains yields one output per domain: e.g. for three domains \ttt{geogrid.exe} yields \ttt{geo\_em.d01.nc}, \ttt{geo\_em.d02.nc}, \ttt{geo\_em.d03.nc}, \ldots; \ttt{real.exe} yields \ttt{wrfinput\_d01}, \ttt{wrfinput\_d02}, \ttt{wrfinput\_d03}, \ldots; \ttt{wrf.exe} yields \ttt{wrfout\_d01*}, \ttt{wrfout\_d02*}, \ttt{wrfout\_d03*}, \ldots

\paragraph{Useful remarks} The model presently supports 3 nests, but more nests can be included by adapting \ttt{runmeso} and the following files:
%\scriptsize
\begin{verbatim}
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_inifis3.inc
...
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/makegcm*   ## search for 'nest'
\end{verbatim}
%\normalsize

\mk
\section{Running simulations with tracers}

\paragraph{Preparing namelist.input} The default behavior of the model is to include no tracer transported by the dynamics. This corresponds to \ttt{mars=0} in \ttt{namelist.input} (or the absence of the parameter \ttt{mars} from the user's namelist). To compute the water cycle in the LMD Martian Mesoscale Model, simply set \ttt{mars=1} in \ttt{namelist.input} (category \ttt{\&physics}).
This will add one tracer for water vapor and one tracer for water ice in the model's computations and outputs. To compute a mesoscale simulation with one simple transported dust bin (with typical characteristics), set \ttt{mars=2} in \ttt{namelist.input}.

\paragraph{GCM inputs} For water cycle simulations (\ttt{mars=1}), the GCM runs used to build initial and boundary conditions for the mesoscale model must also include water tracers. This is done by default in the parameter files in \ttt{\$MESO/LMDZ.MARS/myGCM}, the compiler wrapper \ttt{\$MESO/LMDZ.MARS/compile} and the database of start files \ttt{STARTBASE\_64\_48\_32\_t2}.
…
\paragraph{Compiling} It is key to recompile the LMD Martian Mesoscale Model with \ttt{makemeso} each time the number of transported tracers has changed, which would most often be the case when you modify \ttt{mars} in \ttt{namelist.input}. The right number of tracers corresponding to the \ttt{mars} case you are setting must be specified when answering the questions of the \ttt{makemeso} script. This is done automatically of course if you use \ttt{runmeso}, which reads the information in \ttt{namelist.input}.
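To make this concrete, activating the water cycle amounts to a one-line setting in the \ttt{\&physics} category of \ttt{namelist.input} (a sketch; the other entries of the category are omitted here):

\begin{verbatim}
&physics
 mars = 1
/
\end{verbatim}

with \ttt{mars=0} (or no \ttt{mars} entry at all) meaning no tracer, and \ttt{mars=2} one simple dust bin, as described above.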
\paragraph{Inputs/outputs} Additional fields corresponding to tracer mixing ratios (e.g. \ttt{QH2O} for water vapor) are automatically output in \ttt{wrfout*} files if an option other than~\ttt{0} is used for the \ttt{mars} keyword. Note that when a large number of tracers is set, output files might grow very large quickly after the mesoscale simulation is launched.

\paragraph{Test case} A good test case consists in coming back to the Arsia simulation described in section~\ref{sc:arsia} and activating the water cycle. Add \ttt{mars=1} to \ttt{namelist.input} and change \ttt{callphys.def} as described previously. Launch \ttt{runmeso} and choose \ttt{3} (i.e. recompile the model, run \ttt{real.exe} so that initial and boundary conditions for water are included, then run \ttt{wrf.exe}). Check for tracer fields in the output files \ttt{wrfout*}.

\mk
\section{Running Large-Eddy Simulations}
\paragraph{Prerequisites} Large-Eddy Simulations are very specific applications of the LMD Martian Meso\-scale Model which allow the user to simulate boundary layer turbulent convection in idealized conditions at fine spatial and temporal resolution. We recommend reading section 3.4 of \textit{Spiga and Forget} [2009] and the first three sections of \textit{Spiga et al.} [2010]\nocite{Spig:10bl}.

\paragraph{Preparing namelist.input} A typical parameter file \ttt{namelist.input\_les} is given in what follows (and can be found in \ttt{\$MMM/SIMU}). Settings specific to Large-Eddy Simulations are referred to as \ttt{LES}. The main differences with regular mesoscale simulations are the following:
…
\item an idealized wind profile is often assumed,
\item \ttt{\&dynamics} keywords are adapted to small-scale diffusion,
\item periodic boundary conditions are set for the horizontal grid.
\end{citemize}
…
\vskip 0.4cm

\paragraph{Preparing callphys.def} It is essential that \ttt{calldifv} is set to \ttt{T} and \ttt{calladj} is set to \ttt{F} for Large-Eddy Simulations. Generally \ttt{iaervar} is set to \ttt{1} so that the (uniform) opacity in the domain can be set by creating a text file named \ttt{dustopacity.def} with the chosen value for opacity in it.
\paragraph{Compiling} The dynamical core used for Martian Large-Eddy Simulations is different from the one used for usual mesoscale simulations; it is based on WRF v3 instead of WRF v2. The first time the model is compiled, the user has to install it by typing the following commands:
…
makemeso -c les
\end{verbatim}
This creates a new compilation folder with prefix \ttt{les} in which the executables can be found once the model is compiled. Answers to \ttt{makemeso} must be compliant with settings in \ttt{namelist.input}.

\paragraph{Inputs/outputs} Large-Eddy Simulations need four input files \ttt{input\_coord}, \ttt{input\_sounding}, \ttt{input\_more}, \ttt{input\_therm} which define initial pressure, temperature, density and wind profiles at the location/season for which simulations are run, along with information about this location/season. Typical files are available upon request, or you might simply build your own profiles using the Mars Climate Database (see the sample \ttt{scilab} script \ttt{wrf\_sounding.sci} in \ttt{\$MMM/SIMU/RUN}). Examples for \ttt{input\_*} files are provided in \ttt{\$MMM/SIMU/DEF/LMD\_LES\_MARS\_def} and correspond to the cases run in the study by \textit{Spiga et al.} [2010].
…
\item \ttt{input\_sounding} contains (first line) near-surface pressure (mbar), potential temperature, a dummy value; and (subsequent lines) altitudes above the MOLA zero datum, potential temperatures, a dummy value, zonal wind component, meridional wind component;
\item \ttt{input\_more} contains on the same line altimetry and surface temperature;
\item \ttt{input\_therm} contains lines with corresponding values for (from left column to right column)~$R$, $c_p$, pressure, density, temperature.
\end{citemize}

\paragraph{Running} Large-Eddy Simulations are not supported by \ttt{runmeso}. After compiling the model with the command \ttt{makemeso -c les}, please copy the executables \ttt{ideal.exe} and \ttt{wrf.exe} from the compilation directory \ttt{\$MMM/les*} to your simulation directory where the \ttt{input\_*} files are located. Running \ttt{ideal.exe} generates the initial state \ttt{wrfinput\_d01} from the profiles provided in the \ttt{input\_*} files, then running \ttt{wrf.exe} launches the model's integrations.

%ze_hill ???
%version without physics ???
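For illustration, an \ttt{input\_sounding} file following the column layout described above could look like the sketch below; all numbers are hypothetical and do not correspond to a real Mars Climate Database profile:

\begin{verbatim}
6.1     210.   0.
 100.   211.   0.   10.   0.
 500.   213.   0.   10.   0.
1000.   216.   0.   12.   2.
\end{verbatim}

i.e. a first line with near-surface pressure (mbar), potential temperature and a dummy value, then one line per level with altitude, potential temperature, a dummy value and the two wind components.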
…
%% mars = 11 ---> water cycle + dust [1+3] [NEW PHYS only]
%% LES LES
%% [set to 18 for newphys]

\clearemptydoublepage
trunk/MESOSCALE_DEV/MANUAL/SRC/compile_exec.tex
\vk
This chapter is meant for first-time users of the LMD Martian Mesoscale Model. We describe how to compile the program and run a test case. We start with important basics about how the model works and how it is organized.

\mk
…
\sk
Any simulation that will be carried out with the LMD Martian Mesoscale Model comprises the following five steps. More details are given on these steps in the following chapters, but it is important at this stage to have this structure in mind.

\sk
\begin{itemize}
\item \textbf{Step 0} Compiling the model.
\item \textbf{Step 1} Running the LMD Global Circulation Model (GCM) to provide initial and boundary conditions for the mesoscale model.
\item \textbf{Step 2} Choosing the mesoscale limited-area domain of simulation. Running preprocessing programs to horizontally interpolate GCM meteorological fields and static data (topography, soil properties) to the chosen simulation domain.
\item \textbf{Step 3} Running preprocessing programs to vertically interpolate GCM meteorological fields and generate the initial and boundary conditions directly used by the mesoscale model.
\item \textbf{Step 4} Running the LMD Martian Mesoscale Model.
\end{itemize}

\sk
In this chapter, the general method to perform steps 0 and 4 is reviewed. Other steps are reviewed in chapter~\ref{zepreproc}; here the model is compiled and run for a test case with precomputed sample files for preprocessing steps 1, 2, 3.

\sk
…
\item \ttt{WPS}: this is a directory containing sources for step~2.
\item \ttt{POSTPROC}: this is a directory containing postprocessing sources.
\item \ttt{PYTHON}: this is a directory containing \ttt{python}-based graphical scripts.
\item \ttt{LES} and \ttt{LESnophys\_}: these are directories containing sources for Large-Eddy Simulations.
\end{citemize}
…
\item ask the user about compilation settings;
\item retrieve some additional information about the system;
\item create a directory \ttt{\$MESO/LMD\_MM\_MARS/your\_compdir} whose name depends\footnote{For example, a \ttt{your\_compdir} directory named \ttt{g95\_32\_single} is created if the user requested a \ttt{g95} compilation of the code for single-domain simulations on a 32-bit machine.} on the kind of compiler you are using, on whether your system is 32 or 64 bits, on whether sequential or parallel computations are planned, and on the kind of simulations (idealized or real-case);
\item generate with \ttt{copy\_model} a directory \ttt{your\_compdir/WRFV2} with links to the \ttt{SRC/WRFV2} sources\footnote{A note to developers: this method ensures that any change to the model sources is propagated to all the different \ttt{your\_compdir} installation folders.};
\item execute the WRF \ttt{configure} script with the correct options;
\item tweak the resulting \ttt{configure.wrf} file to include a link towards the Martian physics and various patches and specific compilation
options;
\item calculate the total number of horizontal grid points handled by the LMD physics;
…
\sk
To compile the model, change directory to \ttt{\$MMM} and execute the \ttt{makemeso} command:

\begin{verbatim}
…
\sk
You are asked a few questions by the \ttt{makemeso} script (see the list below), then it compiles the model for you. The script outputs a text file named \ttt{last} in which your answers to the questions are stored, which allows you to re-run the script without the ``questions to the user'' step through the \ttt{makemeso < last} command line. In what follows, the answers given in brackets are the ones you want to use so that you will be able to run the test case proposed in the next section.

\mk
…
\item \textbf{choice of compiler}\footnote{We advise you to compile the model on the same kind of system (computer + operating system + libraries) as the one you plan to use to run the model.}
\item[1.bis] (mpi-based compilation) number of processors to be used
\item \textbf{number of grid points in longitude}\footnote{When you use parallel computations, please bear in mind that with $2$ (respectively $4$, $6$, $8$, $12$, $16$, $20$, $24$, $32$, $64$, $128$) processors the whole domain would be separated into $1$ (resp. $2$, $2$, $2$, $3$, $4$, $4$, $4$, $4$, $8$, $8$) tiles over the longitude direction and $2$ (resp. $2$, $3$, $4$, $4$, $4$, $5$, $6$, $8$, $8$, $16$) tiles over the latitude direction. Thus make sure that the number of grid points minus $1$ in each direction can be divided by the aforementioned number of tiles over the considered direction. For instance a~$82 \times 109$ horizontal grid is compliant with the use of~$12$ processors.} [61]
\item \textbf{number of grid points in latitude} [61]
\item \textbf{number of vertical levels} [61]
\item \textbf{number of tracers} [1]
\item \textbf{number of domains} [1]
%\item[6.bis] (not the first time you use \ttt{makemeso}) a question for advanced users [press any key]
%\item[6.ter] (new LMD physics) number of different scatterers to be used
\end{asparaenum}
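The tiling constraint described in the footnote above can be checked with a few lines of code before launching a parallel compilation. The following is a standalone sketch (the \ttt{grid\_ok} helper is ours, for illustration only, and not part of the model's tools):

```python
# Tiles (longitude x latitude) used for each supported processor count,
# as listed in the makemeso footnote above.
TILES = {2: (1, 2), 4: (2, 2), 6: (2, 3), 8: (2, 4), 12: (3, 4),
         16: (4, 4), 20: (4, 5), 24: (4, 6), 32: (4, 8),
         64: (8, 8), 128: (8, 16)}

def grid_ok(e_we, e_sn, nproc):
    """True if (grid points - 1) in each direction divides evenly
    into the tile decomposition for nproc processors."""
    nx, ny = TILES[nproc]
    return (e_we - 1) % nx == 0 and (e_sn - 1) % ny == 0

print(grid_ok(82, 109, 12))  # the manual's example: True
```

For instance, the $82 \times 109$ grid of the footnote passes the check with $12$ processors ($81$ divides by $3$ tiles, $108$ by $4$ tiles), as does the $61 \times 61$ test-case grid with $4$ processors.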
Suppose you compiled a version of the model for a given set of parameters $1$ to $6$ to run a specific compilation. If you would like to run another simulation with at least one of parameters $1$ to $6$ subject to change, the model needs to be recompiled\footnote{This necessary recompilation each time the number of grid points, tracers and domains is modified is imposed by the LMD physics code. The WRF dynamical core alone is much more flexible.} with \ttt{makemeso}.138 139 \mk 140 Note that the \ttt{makemeso -h} command lists the various options that can be used in the \ttt{makemeso} script. Most options should be used only by advanced users and some of them will be described in the following chapters. At this stage, the only option of \ttt{makemeso} which can be useful to you is \ttt{-f} which forces the model to be recompiled from scratch. If you already compiled the model succesfully, but the model fails to compile a few days later for reasons unrelated to your operations on your system or on the model file, we recommend you to use the \ttt{-f} option in \ttt{makemeso} to try to p recompile the model\footnote{A more extreme solution if \ttt{makemeso -f} does not solve your problem is to remove the corresponding \ttt{your\_compdir} directory.}.138 A key question that often arises when using the LMD Martian Mesoscale Model is: when does the model has to be recompiled? The set of questions asked by~\ttt{makemeso} give some hints about this. Suppose you compiled a version of the model for a given set of parameters $1$ to $6$ to run a specific compilation. If you would like to run another simulation with at least one of parameters $1$ to $6$ subject to change, the model needs to be recompiled\footnote{This necessary recompilation each time the number of grid points, tracers and domains is modified is imposed by the LMD physics code. The WRF dynamical core alone is more flexible.} with \ttt{makemeso} (cf. also chapter~\ref{zeparam}). 
\mk
Note that the \ttt{makemeso -h} command lists the various options that can be used in the \ttt{makemeso} script. Most options should be used only by advanced users and some of them will be described in the following chapters. At this stage, the only option of \ttt{makemeso} which can be useful to you is \ttt{-f}, which forces the model to be recompiled from scratch. If you already compiled the model successfully, but the model fails to compile a few days later for reasons unrelated to your operations on your system or on the model files, we recommend using the \ttt{-f} option in \ttt{makemeso} to try to recompile the model\footnote{A more extreme solution if \ttt{makemeso -f} does not solve your problem is to remove the corresponding \ttt{your\_compdir} directory. See chapter~\ref{faq}.}.

\scriptsize
…
\sk
We assume here that you have successfully compiled the model with \ttt{makemeso} at the end of the previous section and based your answers to the \ttt{makemeso} script on the indications in brackets. You should then find in the \ttt{your\_compdir} directory the \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe} and~\ttt{wrf\_x61\_y61\_z61\_d1\_t1\_p1.exe} executables.

\sk
…
%
\begin{verbatim}
cp LMD_MM_MARS_TESTCASE.tar.gz $MMM
tar xzvf LMD_MM_MARS_TESTCASE.tar.gz
cd TESTCASE
trunk/MESOSCALE_DEV/MANUAL/SRC/faq.tex
\chapter{Frequently Asked Questions, Tips and Troubleshooting}\label{faq}

\vk
…
\item your operating system and machine are in good health.
\end{citemize}
You might also read this chapter out of curiosity: it might be useful for your experience as a user.

\mk
\section{General questions}

\sk
\noindent \textbf{I don't know anything about mesoscale meteorology. Does that prevent me from becoming a user of your model?}
\begin{finger}
\item Not really. It is the purpose of this user manual to help you with running simulations with the LMD Martian Mesoscale Model. You will probably not be able to interpret simulation results that easily, but we will then be happy to help you with our expertise on atmospheric science and to recommend good books so that you learn more about this topic.
\end{finger}

\sk
\noindent \textbf{I don't have time, or I feel overwhelmed by learning how to use the model.}
\begin{finger}
\item There are particular cases in which our team might be able to run the simulation for your study, or to help someone you would hire to do the work with learning how to use the model and to answer questions. We are open to discussion.
\end{finger}

\mk
…
\sk
\noindent \textbf{The model compiled yesterday. Now, with no apparent changes, it does not compile.}
\begin{finger}
\item This is one of the most frustrating situations. Remember though that there is a $99\%$ chance that the reason for the problem is either stupid or not your responsibility.
Please check that:
\begin{citemize}
\item Disk quota is not exceeded;
\item You are working on the same machine as the day before;
\item No source file has been accidentally modified; no links broken;
\item No updates have been performed on your system during the night;
\item Recompiling with \ttt{makemeso -f} does not solve the problem.
\end{citemize}
\end{finger}
…
\sk
\noindent \textbf{I am afraid I explored a given compilation directory in \ttt{\$MMM} (say \ttt{g95\_32\_single}) and broke something, e.g. deleted or broke some links. The model does not compile anymore.}
\begin{finger}
\item Delete the corresponding compilation directory. Since it is mostly filled with symbolic links, you will only lose the previously compiled executables and the (possibly modified) \ttt{Registry.EM} file. Save those files prior to deletion of the compilation directory if you would like to keep them. Then run \ttt{makemeso} again for the same combination of compiler/system and a new clean version of the compilation directory will reappear, while the model executables are recompiled from scratch.
\end{finger} 44 60 45 61 \sk 46 \noindent \textbf{I updated the model's sources through \ttt{svn update} and the compilation fails with the new version} 62 \noindent \textbf{I updated the model's sources through \ttt{svn update} and the compilation fails with the new version} 47 63 \begin{finger} 48 \item It rarely happens that we move, create or delete some files in \ttt{\$MMM/SRC} while developing new capabilities or bug fixes for the model -- and commit the changes to the reference version of the model. Please apply the solution proposed in the previous point and the model would be able to compile. The need to do so can be anticipated by having a look at the commit log through the command \ttt{svn log}. 64 \item It could happen (but this is not usual) that we move, create or delete some files in \ttt{\$MMM/SRC} while developing new capabilities or bug fixes for the model -- and commit the changes to the reference version of the model. Please apply the solution proposed in the previous point and the model can be compiled again (our rule is to commit only versions of the model which compile). Possible problems can be anticipated by having a look at the commit log through the command \ttt{svn log}. The vast majority of our commits, and subsequent reference model changes, are perfectly transparent for the user. 49 65 \end{finger} 50 66 … 65 81 66 82 \sk 83 \noindent \textbf{I would like to have smoother surface properties.} 84 \begin{finger} 85 \item Increase the smoothing parameter \ttt{smooth\_passes} in the file \ttt{WPS/geogrid/GEOGRID.TBL} for each field you would like to get smoother, then restart at step 2 (execution of \ttt{geogrid.exe}). 86 \end{finger} 87 88 \sk 89 \noindent \textbf{I would like to know more about customizing the calculations made by \ttt{geogrid.exe} and \ttt{metgrid.exe}.} 90 \begin{finger} 91 \item You probably want to know more about the various settings in \ttt{WPS/geogrid/GEOGRID.TBL} and \ttt{WPS/metgrid/METGRID.TBL}.
A detailed description can be found at \url{http://www.mmm.ucar.edu/wrf/users/docs/user_guide/users_guide_chap3.html} (some parameters are not relevant for Mars). 92 \end{finger} 93 94 \sk 67 95 \noindent \textbf{To speed up initializations, I would like to define GCM constraints at the domain boundaries every 6 Martian hours, instead of every one or two hours as is usually done (cf. \ttt{interval\_seconds = 3700}).} 68 96 \begin{finger} 69 97 \item It is not a good idea. Near-surface atmospheric fields undergo a strong daily cycle on Mars, which you will not be able to capture if \ttt{interval\_seconds} is higher than 7400 seconds (i.e. two Martian hours). 70 \end{finger} 71 72 \sk 73 \noindent \textbf{I would like to have smoother surface properties.} 74 \begin{finger} 75 \item Increase the smoothing parameter \ttt{smooth\_passes} in the file \ttt{WPS/geogrid/GEOGRID.TBL} for each field you would like to get smoother, then restart at step 2 (execution of \ttt{geogrid.exe}). 76 98 \end{finger} 77 99 … 83 105 84 106 \sk 85 \noindent \textbf{I would like to define my own vertical levels one by one.} 107 \noindent \textbf{I would like to define my own vertical levels.} 86 108 \begin{finger} 87 109 \item Create a file \ttt{levels} with all your mass-based model levels (see chapter~\ref{whatis}) in it, then add the optional setting in \ttt{\&domains} in \ttt{namelist.input} … 97 119 98 120 \sk 99 \noindent \textbf{I don't know which timestep I should choose to prevent the model from crashing.} 121 \noindent \textbf{I would like to know how much time my simulation will last.} 100 122 \begin{finger} 101 \item It depends on the horizontal resolution according to the CFL condition -- and whether the dynamical core is used in hydrostatic or non-hydrostatic mode. Please refer to the table in \textit{Spiga and Forget} [2009] for guidelines about the timestep. 123 \item Check the log information while \ttt{wrf.exe} is running.
The elapsed time for each integration or output step is indicated. Hence you can extrapolate and predict the total simulation time. If you use parallel computations, have a look in \ttt{rsl.error.0000} to get this information. 124 \end{finger} 125 126 \sk 127 \noindent \textbf{With default settings, I have one \ttt{wrfout*} file per simulated day, each one of those containing fields hour by hour. I want to change this.} 128 \begin{finger} 129 \item If you want to have an output frequency higher [lower] than one per hour, decrease [increase] the parameter \ttt{history\_interval} in \ttt{namelist.input} (remember that each unit of \ttt{history\_interval} is $100$~seconds). If you want to have more [less] data in each individual file, increase [decrease] the parameter \ttt{frames\_per\_outfile} in \ttt{namelist.input}. 130 \end{finger} 131 132 \sk 133 \noindent \textbf{Looks like in the model (cf. \ttt{namelist.input}), a Martian hour is~$3700$ seconds. The reality is closer to~$3699$ seconds.} 134 \begin{finger} 135 \item This is true, though obviously the \ttt{3700} figure is much more convenient, and choosing this instead of~$3699$ has no impact whatsoever on simulations which typically last less than one month, and most often only a few days. 136 \end{finger} 137 138 \sk 139 \noindent \textbf{I want to know the local time for a given model output.} 140 \begin{finger} 141 \item Time management in the model, which includes the way output files are named, relates to UTC time, i.e. local time at longitude~$0^{\circ}$. The time given in the name of each \ttt{wrfout*} file refers to the first frame written in the file -- using \ttt{history\_interval} allows you to infer universal time for all frames in the file. Another method is to look at the variable \ttt{Times} in \ttt{wrfout*}. Once you know the universal time, you can check the domain longitudes in \ttt{XLONG} to calculate local time at any location.
102 142 \end{finger} 103 143 … 108 148 \end{finger} 109 149 150 \sk 151 \noindent \textbf{I don't know which timestep I should choose to prevent the model from crashing.} 152 \begin{finger} 153 \item The answer depends on the horizontal resolution according to the CFL condition -- and whether the dynamical core is used in hydrostatic or non-hydrostatic mode, plus other factors (e.g. slopes, temperature gradients, etc\ldots). Please refer to the table in \textit{Spiga and Forget} [2009] for guidelines about the timestep. A rule-of-thumb to start with is to set \ttt{time\_step} to the value of \ttt{dx} in kilometers; this value can sometimes be raised to get faster integrations. If the \ttt{time\_step} parameter is too large for the horizontal resolution~\ttt{dx} and violates the CFL criterion, \ttt{wrf.exe} usually issues warnings about CFL violation in the first integration steps. 154 \end{finger} 155 156 \sk 157 \noindent \textbf{Looks like \ttt{wrf.exe} is crashing because there are dynamical instabilities on the lateral boundaries, apparently close to a topographical obstacle.} 158 \begin{finger} 159 \item Check that no steep slope (mountain, crater) is located at the domain boundaries. Otherwise, change the domain's center so that no major topographical gradient is located close to the domain boundaries (in the relaxation zone). This is also true for nested simulations at the boundary between parent and nested domains. 160 \end{finger} 161 162 163 %%% DIFFUSION FOR TRACERS 164 %%% GRAVITY WAVE ABSORBING LAYER 165 166 110 167 \clearemptydoublepage -
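The local-time rule quoted in the FAQ above (universal time plus east longitude divided by 15) can be sketched as a quick shell computation. This is only an illustration: the UTC hour and the longitude value are made-up inputs standing in for a \ttt{wrfout*} frame time and a value read from \ttt{XLONG}; the true Mars solar time additionally involves the equation of time, which is ignored here just as in the FAQ's rule.

```shell
# Hedged sketch: uniform local-time offset LT = UTC + lon_east/15, wrapped to [0,24).
# utc and lon are illustrative values, not taken from any actual simulation.
utc=14            # UTC hour of a wrfout* frame (illustrative)
lon=137.4         # east longitude from XLONG (illustrative)
echo "$utc $lon" | awk '{t = $1 + $2/15.0
                         while (t >= 24.0) t -= 24.0
                         while (t <  0.0)  t += 24.0
                         printf "local time = %.1f h\n", t}'
```

With these inputs the offset is $137.4/15 \simeq 9.2$~hours, wrapped past midnight.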
trunk/MESOSCALE_DEV/MANUAL/SRC/foreword.tex
r257 r262 6 6 \paragraph{Contact} The main contact to reach at LMD to become a user of the model is Aymeric SPIGA (main developer, \href{mailto:aymeric.spiga@upmc.fr}{\nolinkurl{aymeric.spiga@upmc.fr}}). Alternative contacts at LMD for mesoscale modeling inquiries are Ehouarn MILLOUR~\url{ehouarn.millour@lmd.jussieu.fr} or Fran\c cois FORGET~\url{francois.forget@lmd.jussieu.fr}. We are open to questions and suggestions on new scientific collaborations, teaching/outreach actions or contractual proposals. 7 7 8 \paragraph{Copyright (LMD)} The LMD Martian Mesoscale Model sources are made available on the condition that we make no representations or warranties regarding the reliability or validity of the model predictions nor the use to which such model predictions should be put, disclaim any and all responsibility for any errors or inaccuracies in the model predictions and bear no responsibility for any use made of these model predictions by any party. Scientific use of LMD Martian Mesoscale Model simulations is freely allowed provided that the reference paper \textit{Spiga and Forget} [2009]\nocite{Spig:09} is correctly quoted in all publications and that we are kept informed of usage and developments. If your paper makes use of specific simulations carried out with the LMD Martian Mesoscale Model, please consider including Aymeric SPIGA as a co-author of your work and asking, if needed, for help with writing the part related to mesoscale modeling. If your study requires additional work on a specific Martian physical parameterization, please consider including other members of the LMD team in addition to Aymeric SPIGA.
The LMD Martian Mesoscale Model may not be put to any commercial use without specific authorization. 8 \paragraph{Copyright (LMD)} The LMD Martian Mesoscale Model sources are made available on the condition that we make no representations or warranties regarding the reliability or validity of the model predictions nor the use to which such model predictions should be put, disclaim any and all responsibility for any errors or inaccuracies in the model predictions and bear no responsibility for any use made of these model predictions by any party. The scientific use of already published LMD Martian Mesoscale Model simulations is freely allowed provided that the reference paper \textit{Spiga and Forget} [2009]\nocite{Spig:09} is correctly quoted in all publications and that we are kept informed of usage and developments. If your study requires dedicated simulations with the LMD Martian Mesoscale Model, please consider including Aymeric SPIGA as a co-author of your work and asking, if needed, for help with writing the part related to mesoscale modeling. If your study requires additional work on a specific Martian physical parameterization, please consider including other members of the LMD team in addition to Aymeric SPIGA. The LMD Martian Mesoscale Model may not be put to any commercial use without specific authorization. 9 9 10 10 \paragraph{Copyright (WRF)} Part of the LMD Martian Mesoscale Model is based on the terrestrial model WRF, which is in the public domain. If you are a user of the LMD Martian Mesoscale Model, you are therefore a user of the WRF model. Please take a minute to fill in the WRF registration form so that the WRF development team knows about the people using their model: \url{http://www.mmm.ucar.edu/wrf/users/download/wrf-regist.php}. \noindent \scriptsize \emph{WRF was developed at the National Center for Atmospheric Research (NCAR) which is operated by the University Corporation for Atmospheric Research (UCAR).
NCAR and UCAR make no proprietary claims, either statutory or otherwise, to this version and release of WRF and consider WRF to be in the public domain for use by any person or entity for any purpose without any fee or charge. UCAR requests that any WRF user include this notice on any partial or full copies of WRF. WRF is provided on an "AS IS" basis and any warranties, either express or implied, including but not limited to implied warranties of non-infringement, originality, merchantability and fitness for a particular purpose, are disclaimed. In no event shall UCAR be liable for any damages, whatsoever, whether direct, indirect, consequential or special, that arise out of or in connection with the access, use or performance of WRF, including infringement actions. WRF is a registered trademark of the University Corporation for Atmospheric Research (UCAR).} \normalsize -
trunk/MESOSCALE_DEV/MANUAL/SRC/guide.tex
r261 r262 8 8 9 9 \sk 10 It is assumed here that the user is working in a directory named \ttt{/a\_place/MY\_SIMU} mounted on a disk with enough free space to host the \ttt{wrfout} output files. 10 It is assumed here that the user is working in a directory named \ttt{/a\_place/MY\_SIMU} mounted on a disk with enough free space to host the \ttt{wrfout*} output files. 11 11 12 12 \sk 13 \paragraph{Prerequisites} Prepare parameter files (copy templates or pre-existing files); Edit those files; Use \ttt{\$MMM/SIMU/calendar} (or see appendix) to choose simulation dates and fill the namelists; Pay attention to correspondences between \ttt{namelist.input} and \ttt{namelist.wps}; See~\ref{zeparam} and~\ref{wps} for further details. 13 \paragraph{Prerequisites} Prepare parameter files (copy templates or pre-existing files); Edit those files; Use \ttt{\$MMM/SIMU/calendar} (or cf. appendix) to choose simulation dates and fill the namelists; Pay attention to correspondences between \ttt{namelist.input} and \ttt{namelist.wps}. See~\ref{zeparam} and~\ref{wps} for further details. 14 14 \begin{verbatim} 15 15 cd /a_place/MY_SIMU … 42 42 cd $MMM/your_compdir/PREP_MARS 43 43 [check that the link input_diagfi.nc points toward the GCM output diagfi.nc] 44 compile_and_exec 44 echo 1 | create_readmeteo.exe 45 readmeteo.exe < readmeteo.def 45 46 [check that WPSFEED contains data files whose prefix is LMD:] 46 47 \end{verbatim} … 79 80 80 81 \sk 81 The series of commands detailed in section~\ref{zecommands} has to be repeated each time the user would like to run a new simulation with the LMD Martian Mesoscale Model. This is usually simple if the user simply wants to change, e.g., the integration timestep, because only the few commands detailed at step~$4$ have to be used. On the contrary, if the user wants to run a new simulation in which, e.g., both the simulated season and the number of grid points are changed, every step from~$0$ to~$4$ has to be repeated (see e.g.
section~\ref{changeparam}). Not only can it be tedious to type all commands again and again, but there is quite a high probability that the user (even the most experienced one) will face one or several of the following problems, which would prevent the simulation from running correctly, from running at all, from computing reasonable results, or would waste the user's time: 82 The series of commands detailed in section~\ref{zecommands} has to be repeated each time the user would like to run a new simulation with the LMD Martian Mesoscale Model. This is usually simple if the user simply wants to change, e.g., the integration timestep, because only the few commands detailed at step~$4$ have to be used. On the contrary, if the user wants to run a new simulation in which, e.g., both the simulated season and the number of grid points are changed, every step from~$0$ to~$4$ has to be repeated (see e.g. section~\ref{changeparam}). Not only can it be tedious to type all commands again and again, but there is quite a high probability that the user (even the most experienced one) will face one or several of the following problems, which would waste the user's time, or prevent the simulation from running correctly, from running at all, or from computing reasonable results: 82 83 \begin{citemize} 83 84 \item A parameter labelled \ttt{(r)} in \ttt{namelist.input} (see chapter~\ref{zeparam}) is changed, but the sources have not been recompiled accordingly; 84 85 \item The answers to \ttt{makemeso} are not compliant with information in \ttt{namelist.input}; 85 \item The common information in \ttt{namelist.input} and \ttt{namelist.wps} is different; 86 \item The common information in \ttt{namelist.input} and \ttt{namelist.wps} is inconsistent; 87 \item The input sol in
\ttt{launch\_gcm} does not correspond to the dates in \ttt{namelist.input} and \ttt{namelist.wps} (in accordance with the \ttt{calendar} table, cf. appendix); 87 88 \item One or several of the various files used as input/output in steps~$1$, $2$, $3$ are not correctly linked; 88 89 \item The wrong executable is used because the right model executables are not correctly linked; 89 \item Large domain simulations yield long computations at steps~$2$ and~$3$, so the user sometimes has to wait a long time between the commands to type. 90 \item Large domain simulations yield long computations at steps~$2$ and~$3$, so the user has to wait a long time between the commands to type. 90 91 \end{citemize} 91 92 92 93 \sk 93 In those circumstances, using the \ttt{bash} script \ttt{runmeso} located in \ttt{\$MMM/SIMU} is probably a good idea once the commands listed in section~\ref{zecommands} have been successfully followed \emph{at least once}. The purpose of the \ttt{runmeso} script is to perform all commands and tests about links, executables, etc... described in section~\ref{zecommands}.
To put it in a nutshell, after all the efforts made in the previous chapters to install, compile, test the LMD Martian Mesoscale Model and its initialization routines, the user can now rely on \ttt{runmeso} to easily launch a simulation with the LMD Martian Mesoscale Model! The series of commands listed in the previous section~\ref{zecommands} is replaced by a simple user-friendly method: 94 95 \begin{citemize} 95 96 \item set a simulation directory containing the parameter files \ttt{namelist.input} and \ttt{callphys.def}; … 100 101 \end{citemize} 101 102 102 When executing the \ttt{runmeso} script, useful information about the simulation and the system in which you plan to run it is displayed, before an invitation appears about the choice of step(s) to proceed with: 103 % 103 \sk 104 When executing the \ttt{runmeso} script, useful information about the simulation, and the system in which you plan to run it, is displayed before an invitation appears about the choice of step(s) to proceed with: 104 106 \footnotesize 105 107 \codesource{runmeso_output} … 110 112 \item A first test of \ttt{runmeso} can be carried out with the test case of section~\ref{sc:arsia}. Please create a directory (e.g. \ttt{test}) and copy the files \ttt{namelist.input}, \ttt{callphys.def} and \ttt{namelist.wps} referring to this Arsia Mons test case in this directory. Then run \ttt{runmeso} and make choice~$1$, i.e. going through all steps detailed in \ref{steps} and \ref{zecommands}. 111 113 \item The execution of \ttt{runmeso} stops if an error is encountered: e.g., the environment variable \ttt{MESO} is not defined, one of the two files~\ttt{namelist.input} or~\ttt{callphys.def} is not present in the working directory, etc...
\item If \ttt{namelist.wps} is not present in the simulation directory, the \ttt{runmeso} script will propose to create it and will prompt $4$~additional questions about map projection, data source, latitude for center of domain, longitude for center of domain. The remaining information to be set in \ttt{namelist.wps} (cf. section~\ref{wps}) is then copied from \ttt{namelist.input} to ensure all common parameters between the two files are the same. The program \ttt{geogrid.exe} is then run and, if \ttt{ncview} is installed on your system, this program is launched so that you can explore the \ttt{geo\_em.d01.nc} file to check the created domain. 114 \item If \ttt{namelist.wps} is not present in the simulation directory, the \ttt{runmeso} script will propose to create it and will prompt $4$~additional questions about map projection, data source, latitude for center of domain, longitude for center of domain. The remaining information to be set in \ttt{namelist.wps} (cf. section~\ref{wps}) is then copied from \ttt{namelist.input} to ensure all common parameters between the two files are the same. The program \ttt{geogrid.exe} is then run and, if \ttt{ncview} is installed on your system, this program is launched so that you can explore the \ttt{geo\_em.d01.nc} file to check the newly created domain. 113 115 \item An \ttt{xeyes} session is prompted when the \ttt{runmeso} script has finished processing the required steps. 114 116 \item If \ttt{runmeso} went well through steps~$1$ and~$2$, but encountered an error in~$3$, once the error has been corrected \ttt{runmeso} is not required to perform steps~$1$ and~$2$ again and can be started directly at step~$3$ (by typing~$3$, see possible choices above). 115 \item The \ttt{LMD:*} files created by a \ttt{runmeso} call which features step~$1$ are kept in \ttt{WPSFEED} (located in \ttt{\$MESO/TMPDIR}).
Those files will be overwritten by subsequent calls to \ttt{runmeso} if you choose to run the GCM at similar dates. 117 \item The \ttt{LMD:*} files created by a \ttt{runmeso} call which features step~$1$ are kept in \ttt{WPSFEED} (located in \ttt{\$MESO/TMPDIR}). Those files will be overwritten by subsequent calls to \ttt{runmeso} if you choose to re-run the GCM at similar dates. 116 118 \item The \ttt{met\_em*} files created by a \ttt{runmeso} call which features step~$2$ are kept in a directory in \ttt{WRFFEED} (located in \ttt{\$MESO/TMPDIR}) whose name refers to the precise date and time, so that it will not be overwritten by subsequent calls to \ttt{runmeso} for other simulations. In the simulation directory \ttt{runmeso} creates a \ttt{met\_em} directory which contains links to the \ttt{met\_em*} files. 117 \item The contents of directories in \ttt{\$MESO/TMPDIR} (\ttt{GCMINI}, \ttt{WPSFEED}, \ttt{WRFFEED}) might grow large as you launch more and more simulations with \ttt{runmeso}. It is probably a good idea to clean up files referring to old simulations from time to time. 119 \item The contents of directories in \ttt{\$MESO/TMPDIR} (i.e. \ttt{GCMINI}, \ttt{WPSFEED}, \ttt{WRFFEED}) might grow large as you launch more and more simulations with \ttt{runmeso}. It is probably a good idea to clean up files referring to old, obsolete simulations from time to time. 118 120 \end{finger} 119 121 -
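The cleanup advice for \ttt{\$MESO/TMPDIR} given above can be sketched with a standard \ttt{find} one-liner. This is only an illustration, not part of \ttt{runmeso}: the 30-day threshold is an arbitrary choice, and the command deliberately lists candidate files rather than deleting them, leaving the actual removal to the user.

```shell
# Hedged sketch: list (do not delete) files older than 30 days in the
# temporary directories mentioned above; review the list, then remove by hand.
for d in GCMINI WPSFEED WRFFEED; do
  find "$MESO/TMPDIR/$d" -type f -mtime +30 2>/dev/null
done
```

Piping the output through \ttt{xargs rm -i} afterwards is one cautious way to delete the listed files interactively.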
trunk/MESOSCALE_DEV/MANUAL/SRC/installation.tex
r261 r262 16 16 \item your computer is connected to the internet; 17 17 \item you have~\ttt{200 Mo} free disk space available; 18 \item your OS is Linux\footnote{The model was also successfully compiled on MacOSX; ``howto" information is available upon request but could have become obsolete on recent versions of Apple hardware and software. It is probably possible to compile the model on Windows using Cygwin but this has not been implemented nor tested. You are kindly advised to install a Linux distribution on your computer (e.g. Ubuntu, Debian, Fedora, ...).} with a decent set of basic commands (\ttt{sed}, \ttt{awk}, \ldots); 18 \item your OS is Linux\footnote{The model was also successfully compiled on MacOSX; ``howto" information is available upon request but could have become obsolete on recent versions of Apple hardware and software. It is probably possible to compile the model on Windows using Cygwin but this has not been implemented nor tested. This could work, but we recommend instead to install a Linux distribution on your computer (e.g. Ubuntu, Debian, Fedora, ...).} with a decent set of basic commands (\ttt{sed}, \ttt{awk}, \ldots); 19 \item \ttt{bash}, \ttt{m4} and \ttt{perl} are installed on your computer; 19 20 \item at least one of the following Fortran compilers is installed on your computer 20 21 \begin{itemize} … 24 25 \end{itemize} 25 26 \item your C compiler is \ttt{gcc} and C development libraries are included; 26 \item \ttt{bash}, \ttt{m4} and \ttt{perl} are installed on your computer; 27 \item \ttt{netCDF} libraries\footnote{The outputs from model computations are in netCDF format. This is a convenient self-describing file format widely used in atmospheric science and data analysis. Further information and downloads can be found in \url{http://www.unidata.ucar.edu/software/netcdf}.} have been compiled \emph{on your system with the Fortran compiler suite you aimed to use to compile the model}.
Three environment variables associated with the \ttt{NETCDF} libraries must be defined with the following commands\footnote{Note that all command lines proposed in this document are defined in \ttt{bash} script language}: 27 \item \ttt{netCDF} libraries\footnote{The outputs from model computations are in netCDF format. This is a convenient self-describing file format widely used in atmospheric science and data analysis. Further information and downloads can be found in \url{http://www.unidata.ucar.edu/software/netcdf}.} have been compiled \emph{on your system with the Fortran compiler suite you aim to use to compile the model}. Three environment variables associated with the \ttt{NETCDF} libraries must be defined with the following commands\footnote{All command lines proposed in this document are defined in \ttt{bash} script language}: 28 28 \begin{verbatim} 29 29 declare -x NETCDF=/disk/user/netcdf … 40 40 \item \ttt{ncview}\footnote{ \url{http://meteora.ucsd.edu/~pierce/ncview\_home\_page.html} }: tool to visualize the contents of a netCDF file; 41 41 \item \ttt{nco}\footnote{ \url{ http://nco.sourceforge.net } }: tools to manipulate and modify netCDF files; 42 \item \ttt{epd}\footnote{ \url{ http://www.enthought.com/products/getepd.php }. A complete version is available free of charge for students and employees at degree-granting institutions. A limited version with essential libraries is available free of charge for any user (but e.g.
cartography and \ttt{netCDF} python packages are not included in this free version). }: the python distribution suite packaged by Enthought, including many libraries for plotting, scientific computations, data analysis... 43 43 \end{citemize} 44 44 \end{finger} … 63 63 64 64 \sk 65 If the compilation is successful, the file \ttt{log\_error} should be empty or only report a few warnings. In the \ttt{main} folder two executables \ttt{ideal.exe} and \ttt{run.exe} should be found and allow you to run\footnote{If you compiled the model with \ttt{g95}, \ttt{ideal.exe} will probably complain about an error reading the namelist. Please move the line \ttt{non\_hydrostatic} below the line \ttt{v\_sca\_adv\_order} in the \ttt{namelist.input} file to solve the problem.} the test simulation: 65 If the compilation is successful, the file \ttt{log\_error} should be empty or only report a few warnings. In the \ttt{main} folder two executables \ttt{ideal.exe} and \ttt{run.exe} should be found, which allow you to run\footnote{If you compiled the model with \ttt{g95}, \ttt{ideal.exe} will possibly complain about an error reading the namelist. Please move the line \ttt{non\_hydrostatic} below the line \ttt{v\_sca\_adv\_order} in the \ttt{namelist.input} file to solve the problem.} the test simulation: 66 66 \begin{verbatim} 67 67 cd test/em_hill2d_x … 76 76 \section{Main installation of the model sources} 78 \paragraph{Method 1: You were given a \ttt{LMD\_MM\_MARS.tar.gz} archive} Please set the environment variable \ttt{\$MESO} to point at the directory where you will install the model and define the environment variable \ttt{\$MMM} as \ttt{\$MESO/LMD\_MM\_MARS}. Copy the \ttt{LMD\_MM\_MARS.tar.gz} file in the \ttt{\$MESO} directory and extract the files.
Then execute the \ttt{prepare} script that would do all installation tasks\footnote{Deflate the various compressed archives contained into \ttt{LMD\_MM\_MARS}, download the ARW-WRF sources from the web, apply a (significant) ``Martian patch" to these sources and build the structure of yours \ttt{LMD\_MM\_MARS} directory} for you:78 \paragraph{Method 1: You were given a \ttt{LMD\_MM\_MARS.tar.gz} archive} Please set the environment variable \ttt{\$MESO} to point at the directory where you will install the model, and set the environment variable \ttt{\$MMM} as \ttt{\$MESO/LMD\_MM\_MARS}. Copy the \ttt{LMD\_MM\_MARS.tar.gz} file in the \ttt{\$MESO} directory and extract the files. Then execute the \ttt{prepare} script that would perform all installation tasks\footnote{Deflate the various compressed archives contained into \ttt{LMD\_MM\_MARS}, download the ARW-WRF sources from the web, apply a (significant) ``Martian patch" to these sources and build the structure of your \ttt{LMD\_MM\_MARS} directory}: 79 79 % 80 80 \begin{verbatim} … … 85 85 tar xzvf LMD_MM_MARS.tar.gz 86 86 cd $MESO/LMD_MM_MARS 87 ./SRC/SCRIPTS/prepare ## or simply ./prepare if the script is in LMD_MM_MARS 87 ln -sf ./SRC/SCRIPTS/prepare . ## not needed if script already in LMD_MM_MARS 88 ./prepare 88 89 \end{verbatim} 89 90 90 \paragraph{Method 2: You were given a \ttt{svn} link \ttt{the\_link}} \emph{You must have Subversion (\ttt{svn}) installed on your system to follow this method}. Please use the name of our server repository combined to an \ttt{svn checkout} command to get the model sources\footnote{At this stage, it is essential to have registered to the WRF website (see foreword) because our server contains some part of the ARW-WRF sources.}. Please also set the environment variable \ttt{\$MESO} and \ttt{\$MMM}. The first download of the model sources could be a bit long. 
Compared to method~$1$, this method~$2$ using \ttt{svn} would allow you to easily get the latest updates and bug fixes done on the LMD Martian Mesoscale Model by the development team\footnote{If you are not interested by this feature, please replace the command line featuring \ttt{svn checkout} by thiscommand line \ttt{svn export the\_link/LMDZ.MARS the\_link/MESOSCALE} }.91 \paragraph{Method 2: You were given a \ttt{svn} link \ttt{the\_link}} \emph{You must have Subversion (\ttt{svn}) installed on your system to follow this method}. Please use the name of our server repository combined to an \ttt{svn checkout} command to get the model sources\footnote{At this stage, it is essential to have registered to the WRF website (see foreword) because our server contains some part of the ARW-WRF sources.}. Please also set the environment variables \ttt{\$MESO} and \ttt{\$MMM} as is detailed below. The first download of the model sources could be a bit long. Compared to method~$1$, this method~$2$ using \ttt{svn} would allow you to easily get the latest updates and bug fixes done on the LMD Martian Mesoscale Model by the development team\footnote{If you are not interested by this feature, please replace the command line featuring \ttt{svn checkout} by the command line \ttt{svn export the\_link/LMDZ.MARS the\_link/MESOSCALE} }. 91 92 92 93 \begin{verbatim} … … 97 98 declare -x MESO=$PWD ## put absolute link in your .bashrc 98 99 declare -x MMM=$MESO/LMD_MM_MARS 99 ## to get latest updates later on100 cd the_name_of_your_local_destination_folder101 svn update LMDZ.MARS MESOSCALE102 svn info| more100 ## to get latest updates later on 101 cd the_name_of_your_local_destination_folder 102 svn update LMDZ.MARS MESOSCALE 103 svn log | more 103 104 \end{verbatim} 104 105 … … 107 108 108 109 \sk 109 Parallel computations with the Message Passing Interface (MPI) standard are supported by the LMD Martian Mesoscale Model. 
If you want to use this capability, you would have to add the installation of MPICH2 as a additional prerequisite. Once the installation is completed, it is required to define the environment variable \ttt{\$WHERE\_MPI} to point in your \ttt{mpich}\ttt{bin} directory, even if you added the \ttt{\$your\_software\_dir/MPI/mpich2-1.0.8/bin} directory to your \ttt{\$PATH} variable.110 Parallel computations with the Message Passing Interface (MPI) standard are supported by the LMD Martian Mesoscale Model. If you want to use this capability, you have to add the installation of MPICH2 as an additional prerequisite. Once the installation is completed, it is required to define the environment variable \ttt{\$WHERE\_MPI} to point to your MPICH \ttt{bin} directory, even if you added the \ttt{\$your\_software\_dir/MPI/mpich2-1.0.8/bin} directory to your \ttt{\$PATH} variable. 110 111 111 112
\begin{finger} 112 \item \scriptsize Here is a brief ``how-to" to install MPICH2, although this surely does not replace reading carefully installation notes and choosing what installation suits best your system. Please download the current stable version of the sources (e.g. we choose here an old version \ttt{mpich2-1.0.8.tar.gz} to illustrate the commands) on the MPICH2 website \url{http://www.mcs.anl.gov/research/projects/mpich2} and install the MPICH2 utilities by the following commands:113 \item \scriptsize Here is a brief ``how-to" to install MPICH2, although this surely does not replace carefully reading the installation notes and choosing the installation that best suits your system. Please download the current stable version of the sources (e.g. we choose here an old version \ttt{mpich2-1.0.8.tar.gz} for the sake of illustration) on the MPICH2 website \url{http://www.mcs.anl.gov/research/projects/mpich2} and install the MPICH2 utilities by the following commands: 114
\begin{verbatim} 114 115
mkdir $your_software_dir/MPI -
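The \ttt{\$WHERE\_MPI} convention described above can be sketched as follows. This is only an illustration: a temporary directory stands in for \ttt{\$your\_software\_dir} after a completed MPICH2 installation, and no real MPICH2 build is performed here.

```shell
# Sketch only (not a real MPICH2 build): the temporary directory stands in
# for $your_software_dir once "configure / make / make install" has completed.
software_dir=$(mktemp -d)
mkdir -p "$software_dir/MPI/mpich2-1.0.8/bin"
# WHERE_MPI must point to the MPICH bin directory...
export WHERE_MPI=$software_dir/MPI/mpich2-1.0.8/bin
# ...even if the same directory is also appended to PATH.
export PATH=$WHERE_MPI:$PATH
echo "${WHERE_MPI##*/}"   # prints "bin"
```

The point of the sketch is that both settings coexist: adding the directory to \ttt{\$PATH} does not dispense you from defining \ttt{\$WHERE\_MPI} itself.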
trunk/MESOSCALE_DEV/MANUAL/SRC/parameters.tex
r258 r262 26 26 27 27 \sk %answer \ttt{y} to the last question. 28 Many parameters in the \ttt{namelist.input} file are optional in the Martian version\footnote{E.g., in the \ttt{namelist.input} fi eld associated to the Arsia Mons test case presented in the previous chapter, the parameter \ttt{non\_hydrostatic} is set to false to assume hydrostatic equilibrium, whereas standard simulations are non-hydrostatic. Compared to the file the ARW-WRF users are familiar with (see generic description in \ttt{\$MMM/SRC/WRFV2/run/README.namelist}, typical \ttt{namelist.input} files for LMD Martian Mesoscale Model simulations are much shorter.} and their default values are defined in the file \ttt{\$MMM/SRC/WRFV2/Registry/Registry.EM}\footnote{Changing default values, or even adding parameters, in \ttt{\$MMM/SRC/WRFV2/Registry/Registry.EM} should be avoided except if you are an advanced user. If you modify \ttt{Registry.EM}, recompile the model with \ttt{makemeso -f}}. The only mandatory parameters in \ttt{namelist.input} are within the~\ttt{time\_control} and~\ttt{domains} categories. The minimal version of the \ttt{namelist.input} file corresponds to standard simulations with the model\footnote{You may find the corresponding file in \ttt{\$MMM/SIMU/namelist.input\_minim}.}:28 Many parameters in the \ttt{namelist.input} file are optional in the Martian version\footnote{E.g., in the \ttt{namelist.input} file associated to the Arsia Mons test case presented in the previous chapter, the parameter \ttt{non\_hydrostatic} is set to false to assume hydrostatic equilibrium, whereas standard simulations are non-hydrostatic. 
Compared to the file the ARW-WRF users are familiar with (see generic description in \ttt{\$MMM/SRC/WRFV2/run/README.namelist}), typical \ttt{namelist.input} files for LMD Martian Mesoscale Model simulations are much shorter.} and their default values are defined in the file \ttt{\$MMM/SRC/WRFV2/Registry/Registry.EM}\footnote{Changing default values in \ttt{\$MMM/SRC/WRFV2/Registry/Registry.EM} should be avoided even if you are an advanced user.}. The only mandatory parameters in \ttt{namelist.input} are within the~\ttt{time\_control} and~\ttt{domains} categories. The minimal version of the \ttt{namelist.input} file corresponds to standard simulations with the model\footnote{You may find the corresponding file in \ttt{\$MMM/SIMU/namelist.input\_minim}.}: 29 29
% 30 30
\scriptsize … 42 42
\item \ttt{(*d)} denotes dynamical parameters whose modification implies non-standard simulations -- please read \ttt{\$MMM/SRC/WRFV2/run/README.namelist} and use with caution, i.e. if you know what you are doing; after modifying those parameters you can simply start at step~4. 43 43
\item \ttt{(*)} denotes parameters not to be modified; 44 \item \ttt{(n)} describes parameters involved when nested domains are defined (see chapter~\ref{nests});44 \item \ttt{(n)} describes parameters involved when nested domains are defined (see chapter~\ref{nests}). 45 45
\end{citemize} 46 46 … 53 53
\subsection{Advice on filling \ttt{namelist.input}}\label{namelist} 54 54 55 \paragraph{Test case} An interesting exercise is to analyze comparatively the \ttt{TESTCASE/namelist.input} file (cf. section~\ref{sc:arsia}) with the reference \ttt{namelist.input\_full} given above, so that you could understand which settings are being made in the Arsia Mons simulation. Then you could try to modify parameters in the \ttt{namelist.input} file and re-run the model to start getting familiar with the various settings.
Given that the test case relies on pre-computed initial and boundary conditions, not all parameters can be changed in the \ttt{namelist.input} file.55 \paragraph{Test case} An interesting exercise is to analyze comparatively the \ttt{TESTCASE/namelist.input} file (cf. section~\ref{sc:arsia}) with the reference \ttt{namelist.input\_full} given above, so that you could understand which settings are being made in the Arsia Mons test simulation. Then you could try to modify parameters in the \ttt{namelist.input} file and re-run the model to start getting familiar with the various settings. Given that the test case relies on pre-computed initial and boundary conditions, not all parameters can be changed in the \ttt{namelist.input} file at this stage. 56 56 57 57 \paragraph{Syntax} Please pay attention to rigorous syntax while editing your personal \ttt{namelist.input} file to avoid reading errors. If the model complains about this at runtime, start again with the available template \ttt{\$MMM/SIMU/namelist.input\_full}. 58 58 59 \paragraph{Time management} Usually a Martian user would like to start/end the mesoscale simulation at a given solar aerocentric longitude~$L_s$ or a given sol in the Martian year\footnote{Information on Martian calendars: \url{http://www-mars.lmd.jussieu.fr/mars/time/solar_longitude.html}.}. In the \ttt{namelist.input} file, the settings for starting/ending time must be done in the form year/month/day with each month corresponding to a ``slice" of~$30^{\circ}$~$L_s$. The file~\ttt{\$MMM/SIMU/calendar} (reproduced in appendix~\ref{calendar}) is here to help the user to perform the conversion prior to filling the \ttt{namelist.input} file. In the above example of \ttt{namelist.input\_minim}, the simulation with the LMD Martian Mesoscale Model takes place on month~7 and day~1, which corresponds, according to the \ttt{calendar} file, to~$L_s \sim 180^{\circ}$.
In the Arsia Mons test case, the simulation with the LMD Martian Mesoscale Model takes place on month~1 and day~4, which corresponds, according to the \ttt{calendar} file, to~$L_s \sim 2^{\circ}$.59 \paragraph{Time management} Usually the user would like to start/end the mesoscale simulation at a given solar aerocentric longitude~$L_s$ or a given sol in the Martian year\footnote{Information on Martian calendars: \url{http://www-mars.lmd.jussieu.fr/mars/time/solar_longitude.html}.}. In the \ttt{namelist.input} file, start/end time is set in the form year/month/day with each month corresponding to a ``slice" of~$30^{\circ}$~$L_s$. The file~\ttt{\$MMM/SIMU/calendar} (reproduced in appendix) is intended to help the user to perform the conversion prior to filling the \ttt{namelist.input} file. In the above example of \ttt{namelist.input\_minim}, the simulation with the LMD Martian Mesoscale Model takes place on month~7 and day~1, which corresponds, according to the \ttt{calendar} file, to~$L_s \sim 180^{\circ}$. In the Arsia Mons test case, the simulation with the LMD Martian Mesoscale Model takes place on month~1 and day~17, which corresponds, according to the \ttt{calendar} file, to~$L_s \sim 8^{\circ}$. 60 60 61 61
\mk 62 62
\section{Physical settings} 63 63
64 \sk 65 The file \ttt{callphys.def} controls the behavior of the physical parameterizations in the LMD Martian Mesoscale Model. Modifying \ttt{callphys.def} implies to recompile the model only if the number of tracers is different. This file is organized very similarly to the corresponding file in the LMD Martian GCM, which user manual can be found at \url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}. Here are the \ttt{callphys.def} contents with typical mesoscale settings:64 \mk 65 The file \ttt{callphys.def} controls the behavior of the physical parameterizations in the LMD Martian Mesoscale Model.
Modifying \ttt{callphys.def} requires recompiling the model only if the number of tracers has changed. This file is organized very similarly to the corresponding file in the LMD Martian GCM, whose user manual can be found at \url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}. Here are the \ttt{callphys.def} contents with typical mesoscale settings: 66 66
67 %\vskip -0.3cm 68 \sk 69 \footnotesize 67 \vskip -0.5cm 70 68
\codesource{callphys.def} 71 \normalsize 72 69 73 70
\mk 74 71
\begin{finger} 75 \item In the given example convective adjustment \ttt{calladj}, gravity wave parameterization \ttt{calllott} and NLTE schemes \ttt{callnlte} are turned off, as is usually the case in typical Martian tropospheric mesoscale simulations (see chapter~\ref{whatis}).72 \item In the provided example, convective adjustment \ttt{calladj}, gravity wave parameterization \ttt{calllott} and non-local thermodynamic equilibrium schemes \ttt{callnlte} are turned off, as is usually the case in typical Martian tropospheric mesoscale simulations (see chapter~\ref{whatis}). 76 73
\item \ttt{iradia} sets the frequency (in dynamical timesteps) at which the radiative computations are performed. To obtain the interval in seconds at which radiative computations are performed, one simply has to multiply \ttt{iradia} by the value of \ttt{time\_step} in \ttt{namelist.input}. 74 \item \ttt{iaervar=4} and~\ttt{iddist=3} define the standard ``Mars Global Surveyor" dust scenario (see chapter~\ref{whatis}). It is the recommended choice. 77 75
\end{finger} 78 76 -
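The \ttt{iradia} rule quoted above (interval in seconds = \ttt{iradia} multiplied by \ttt{time\_step}) can be illustrated with a minimal sketch; the two values below are hypothetical, not taken from an actual \ttt{namelist.input}:

```shell
# Hypothetical values: iradia (in dynamical timesteps, from callphys.def)
# and time_step (in seconds, from namelist.input).
iradia=10
time_step=50
# Interval in seconds between two radiative computations.
interval=$(( iradia * time_step ))
echo "$interval"   # prints 500
```

With these illustrative numbers, the radiative transfer would be called every 500 seconds of simulated time.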
trunk/MESOSCALE_DEV/MANUAL/SRC/postproc.tex
r261 r262 2 2 3 3 \vk 4 In this chapter, the user is introduced to the principles of choosing the outputs from the LMD Martian Mesoscale Model. Elements about post-processing are also proposed here, although it is obviously left to the user to choose and develop its own tools to analyzing the results of LMD Martian Mesoscale Model computations.4 In this chapter, the user is introduced to the principles of choosing the outputs of the LMD Martian Mesoscale Model. Elements about post-processing (interpolation, graphics) are also proposed here, although it is obviously left to the user to choose and develop their own tools to analyze the results of LMD Martian Mesoscale Model computations. 5 5 6 6 \mk … 9 9 \sk 10 10 All non-local variables communicated within subroutines and functions in the WRF dynamical core are declared in a text file named \ttt{Registry.EM} located in \ttt{\$MMM/SRC/WRFV2/Registry}. In this file, each useful variable is declared through a one-line instance organized as follows: 11 11
12 \scriptsize 12 13
\begin{verbatim} … 14 15
\end{verbatim} 15 16 \normalsize 17 18
\sk 16 19 The fields which appear in \ttt{wrfout*} output files feature an \ttt{h} (which stands for history) in the 8th column. If you do not want the field to appear in \ttt{wrfout*} files, simply remove the letter \ttt{h} from the group of letters in the 8th column. If you want the field to appear in \ttt{wrfout*} files, simply add the letter \ttt{h} in the group of letters in the 8th column. 17 20 18 21 \sk 19 22 It is also possible to output fields which are present only in the physical computations, i.e. appearing in \ttt{\$MMM/SRC/WRFV2/mars\_lmd/libf/phymars/physiq.F}. The method is simple. Assume you would like to output in the \ttt{wrfout*} files a 3D field named \ttt{zdtnirco2} and a 2D field named \ttt{qsurfice} in \ttt{physiq.F} with the new names \ttt{HR\_NIR} and \ttt{QSURFICE}.
All you have to do is add the following lines to \ttt{Registry.EM} (see also examples around lines \ttt{75-120}). For 2D [3D] fields the 4th column must be \ttt{ij} [\ttt{ikj}] and the 12th column \ttt{\#SAVEMARS2} [\ttt{\#SAVEMARS3}]. 23 20 24
\scriptsize 21 25
\begin{verbatim} … 26 30 27 31
\sk 28 Each change in \ttt{Registry.EM} must be followed by a complete recompilation because the model variables have changed. If you use \ttt{makemeso}, please answer \ttt{y} to the question \scriptsize \ttt{Did you modify anything in the Registry or clean ?} \normalsize. If you use \ttt{runmeso}, please use it with option \ttt{-r} to force recompiling with a new/updated list of variables.32 Each change in \ttt{Registry.EM} must be followed by a complete recompilation because the model variables have changed. Whether you use \ttt{makemeso} or \ttt{runmeso}, use the option \ttt{-r} to force recompiling with a new/updated list of variables. 29 33 30 34 \sk … 37 41 38 42
\sk 39 The fields output in \ttt{wrfout*} files are given for each grid point and model levels. A vertical interpolation has to be performed to get those fields either in altitude or pressure levels. In addition, perturbation potential temperature \ttt{T}, x-component wind \ttt{U} and y-component \ttt{V} are output instead of the more informative (meteorogically-speaking) temperature \ttt{tk}, zonal wind \ttt{Um} and meridional wind \ttt{Vm}. This is why we developed a program named \ttt{api} (Altitude and Pressure Interpolator) which performs the tasks to convert the netCDF \ttt{wrfout*} files into another netCDF file featuring useful fields for making plots and analyzing the Martian mesoscale meteorology more easily than raw \ttt{wrfout*} files.43 The fields output in \ttt{wrfout*} files are given for each grid point and model level. A vertical interpolation has to be performed to get those fields either in altitude or pressure levels.
In addition, perturbation potential temperature \ttt{T}, x-component wind \ttt{U} and y-component \ttt{V} are output instead of the more informative (meteorologically-speaking) temperature \ttt{tk}, zonal wind \ttt{Um} and meridional wind \ttt{Vm}. This is why we developed a program named \ttt{api} (Altitude and Pressure Interpolator) which performs the task of converting the netCDF \ttt{wrfout*} files into another netCDF file featuring more useful fields to make plots and analyze the Martian mesoscale meteorology. 40 44 41 45 42 The useful source files are located in \ttt{\$MMM/SRC/POSTPROC/}. The program \ttt{api.F90} has to be compiled with the \ttt{comp\_api} command (which must be edited first to uncomment the line corresponding to the Fortran compiler you are used to). Then the user has to fill in the parameter file \ttt{namelist.api} (a commented example is given below, but many examples can be found in \ttt{\$MMM/SRC/POSTPROC/}) before launching the interpolator through the command \ttt{api}. The calculations might be long if you are asking for many fields and many interpolation levels. In the example below, temperature, meteorological winds and vertical velocity will be interpolated at~$50$~m above the local surface. The results are output in a netCDF file having the same name as the input \ttt{wrfout*} files, with an additional suffix which depends on the chosen interpolation method.46 The source files for \ttt{api} are located in \ttt{\$MMM/SRC/POSTPROC/}. The program \ttt{api.F90} has to be compiled with the \ttt{comp\_api} command (which must be edited first, to uncomment the line corresponding to the Fortran compiler you are used to). Then the user has to fill in the parameter file \ttt{namelist.api} before launching the interpolator through the command \ttt{api}. A commented template for \ttt{namelist.api} is given below (this example and many others can be found in \ttt{\$MMM/SRC/POSTPROC/}).
The calculations might be long if you are asking for many fields and many interpolation levels. In the example below, temperature, meteorological winds and vertical velocity are interpolated at~$50$~m above the local surface. The results are output in a netCDF file having the same name as the input \ttt{wrfout*} files, with an additional suffix which depends on the chosen interpolation method. 43 47 44 48 \scriptsize … … 47 51 48 52 \mk 49 \section{Generating maps for winds and meteorological fields }\label{plots}53 \section{Generating maps for winds and meteorological fields simulated by the model}\label{plots} 50 54 51 55 \sk … … 53 57 54 58 \sk 55 This section does not replace the need for you to develop your own plotting tools to suit your needs, which should be not too difficult. The model outputs, as well as the results of \ttt{api} interpolations, are written using the netCDF format which can be read by most software w hich features graphical capabilities. For a quick inspection of model results (especially for checking model outputs while the model is running), we recommend using \ttt{ncview}; for simple manipulations of netCDF files (e.g. concatenation, difference, extraction, \ldots), we recommend using commands from the \ttt{nco} package (see chapter~\ref{install} for website links). Graphical routines based on \ttt{idl} (most graphics in the published papers to date about the LMD Martian Mesoscale Model makes use of this software), \ttt{ferret} and \ttt{grads} can be made available upon request (as is, i.e. undocumented yet commented scripts). Successful reading/plotting of the LMD Martian Mesoscale Model outputs on \ttt{matlab}, \ttt{octave}, \ttt{idv} have also been reported to us. It is also possible to import the model's outputs to Geographical Information System (GIS) such as \ttt{arcgis}. 
Note that \ttt{idl}, \ttt{matlab} and \ttt{arcgis} are commercial applications.59 This section does not replace the need for you to develop your own plotting tools to suit your needs, which should not be too difficult. The model outputs, as well as the results of \ttt{api} interpolations, are written using the netCDF format which can be read by most software with graphical capabilities. For a quick inspection of model results (especially for checking model outputs while the model is running), we recommend using \ttt{ncview}; for simple manipulations of netCDF files (e.g. concatenation, difference, extraction, \ldots), we recommend using commands from the \ttt{nco} package (see chapter~\ref{install} for website links). Graphical routines based on \ttt{idl}\footnote{Most graphics in the published papers to date about the LMD Martian Mesoscale Model were made with this software}, \ttt{ferret} and \ttt{grads} can be made available upon request (as is, i.e. undocumented yet commented scripts). Successful reading/plotting of the LMD Martian Mesoscale Model outputs on \ttt{matlab}, \ttt{octave}, \ttt{idv} has also been reported. It is possible to import the model's outputs into a Geographical Information System (GIS) such as \ttt{arcgis}\footnote{\ttt{idl}, \ttt{matlab} and \ttt{arcgis} are neither open-source nor free.}. 56 60 … 59 63 60 64
Powerful scripts based on \ttt{python+numpy+matplotlib} have been recently developed to obtain plots from the mesoscale model outputs. All figures in this user manual are based on these scripts. Those scripts can be found in \ttt{\$MMM/SRC/PYTHON}: \ttt{domain.py} and \ttt{winds.py} (more scripts will be added in the future). It is required that \ttt{python} and numerical+graphical librairies (\ttt{numpy}, \ttt{scipy}, \ttt{matplotlib}, \ttt{basemap}, \ttt{netcdf} are installed on your system. Perhaps the simplest way to do so is to install the user-friendly \ttt{epd} complete python distribution (cf.
link in chapter~\ref{install}). One of the advantages of an approach using \ttt{python}, apart from its open-source philosophy and the abundant online documentation, is that in a common framework it allows for scripting with various options, integrating Fortran routines, manipulating arrays, making plots with various map projections. This is exemplified by the \ttt{winds.py} script we developed: it can perform interpolation with \ttt{api} for the level requested by the user, then generate a map, all that in one simple command line. For instance, Figures~\ref{arsia} in chapter~\ref{compile} has been generated by the following two commands:65 Powerful scripts based on \ttt{python+numpy+matplotlib} have been developed to obtain plots from the mesoscale model outputs. All figures in this user manual are based on the scripts \ttt{domain.py} and \ttt{winds.py} (more scripts will be available in the future). Those scripts can be found in \ttt{\$MMM/SRC/PYTHON}. It is required that \ttt{python} and numerical/graphical libraries (\ttt{numpy}, \ttt{scipy}, \ttt{matplotlib}, \ttt{basemap}, \ttt{netcdf}) are installed on your system. Perhaps the simplest way to do so is to install the user-friendly complete python distribution \ttt{epd} (cf. link in chapter~\ref{install}). One of the advantages of an approach using \ttt{python}, apart from its open-source philosophy and the abundant online documentation, is that in a common framework it allows for scripting with various options, integrating Fortran routines, manipulating arrays, making plots with various map projections. This is exemplified by the \ttt{winds.py} script. It can perform interpolation with \ttt{api} for the level requested by the user and then generate a map, all that in one simple command line. For instance, Figure~\ref{arsia} in chapter~\ref{compile} was generated by the following two commands: 62 66 63 67
\scriptsize … 68 72 \normalsize 69 73 70 74
Many options can be used in our \ttt{python} scripts.
The example of command \ttt{winds.py} at the time of writing is listed below; this information can be obtained by typing \ttt{winds.py -h}. This script can also be easily edited to suit your needs if the option you wouldneed does not exist.74 Many options can be used in our \ttt{python} scripts. Example usage of the \ttt{winds.py} command at the time of writing is listed below; this information can be obtained by typing \ttt{winds.py -h}. This script can also be easily edited to suit your needs if the option you need does not exist. 71 75 72 76
\scriptsize … 81 85
PATH=$PYTHONPATH:$PATH 82 86
\end{verbatim} 83 \item The option \ttt{-i} in \ttt{winds.py} make use of the Fortran routine \ttt{api.F90} (and the routine \ttt{time.F} is also needed). Th is routine has to be converted to a \ttt{python} command using \ttt{f2py}. Please execute the script amongst \ttt{api\_g95.sh}, \ttt{api\_ifort.sh}, \ttt{api\_pgf90.sh} which corresponds to the Fortran compiler installed on your system. Check for errors/warnings in the log files and ensure that the two files \ttt{api.so} and \ttt{timestuff.so} are generated.87 \item The option \ttt{-i} in \ttt{winds.py} makes use of the Fortran routine \ttt{api.F90} (and the routine \ttt{time.F} is also needed). The routines have to be converted to \ttt{python} commands using \ttt{f2py}. Please execute whichever of the scripts \ttt{api\_g95.sh}, \ttt{api\_ifort.sh}, \ttt{api\_pgf90.sh} corresponds to the Fortran compiler installed on your system. Check for errors/warnings in the log files and ensure that the two files \ttt{api.so} and \ttt{timestuff.so} are generated. 84 88
\end{finger} 85 89 -
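The final check above (both \ttt{api.so} and \ttt{timestuff.so} must exist after running the \ttt{api\_*.sh} script) can itself be scripted. In the following sketch, empty stand-in files are created in a scratch directory, since producing the real modules requires \ttt{f2py} and a Fortran compiler:

```shell
# Scratch directory with stand-in files in place of the real f2py outputs.
workdir=$(mktemp -d)
cd "$workdir"
touch api.so timestuff.so   # stand-ins for the modules f2py should produce
# Report any missing module; silence means both were generated.
for f in api.so timestuff.so; do
  [ -f "$f" ] || echo "$f is missing"
done
echo "check done"
```

In a real session you would run this check in \ttt{\$MMM/SRC/PYTHON} right after executing the \ttt{api\_*.sh} script, without the \ttt{touch} line.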
trunk/MESOSCALE_DEV/MANUAL/SRC/preproc.tex
r257 r262 14 14 15 15 \sk 16 First and foremost, since the preprocessing utilities could involve files of quite significant sizes, it is necessary to define a directory where these files would be stored. Such a directory (e.g. \ttt{/bigdisk/user}) must be linked with the name \ttt{ \$TMPDIR} as follows. In addition, three directories \ttt{GCMINI}, \ttt{WPSFEED}, \ttt{WRFFEED} have to be defined in \ttt{\$MESO/TMPDIR} as indicated below.16 First and foremost, since the preprocessing utilities could involve files of quite significant sizes, it is necessary to define a directory where these files would be stored. Such a directory (e.g. \ttt{/bigdisk/user}) must be linked with the name \ttt{TMPDIR} as follows. In addition, three directories \ttt{GCMINI}, \ttt{WPSFEED}, \ttt{WRFFEED} have to be created in \ttt{\$MESO/TMPDIR} as indicated below. 17 17 18 18 \begin{verbatim} … … 24 24 25 25 \sk 26 A second prerequisite to the installation of the preprocessing tools is that the LMD Martian Mesoscale Model was compiled at least once. If this is not the case, please compile the model with the \ttt{makemeso} command described in section~\ref{sc:makemeso}. The compilation process created an installation directory adapted to your particular choice of compiler$+$machine (what we named \ttt{your\_compdir} for illustrationin section~\ref{sc:makemeso}, which could be for instance \ttt{g95\_32\_single}). The preprocessing tools will also be installed in this directory. Please type the following commands:27 28 \begin{verbatim} 29 cd $MMM/ g95_32_single/ ## or any of your install directory26 A second prerequisite to the installation of the preprocessing tools is that the LMD Martian Mesoscale Model was compiled at least once. If this is not the case, please compile the model with the \ttt{makemeso} command described in section~\ref{sc:makemeso}. 
The compilation process created an installation directory adapted to your particular choice of compiler$+$machine (what we named \ttt{your\_compdir} in section~\ref{sc:makemeso}, which could be for instance \ttt{g95\_32\_single}). The preprocessing tools will also be installed in this directory. Please type the following commands: 27 28 \begin{verbatim} 29 cd $MMM/your_compdir 30 30 ln -sf ../SRC/SCRIPTS/prepare_ini . 31 31 ./prepare_ini 32 echo $PWD 32 33 \end{verbatim} 33 34 … … 36 37 37 38 \sk 38 The script \ttt{prepare\_ini} plays for the preprocessing tools a similar role as the \ttt{copy\_model} with the model sources: files are simply linked to their actual location in the \ttt{SRC} folder. Once you have executed \ttt{prepare\_ini}, please check that two folders were generated: \ttt{PREP\_MARS} and \ttt{WPS}. In the \ttt{PREP\_MARS} directory, please compile the programs \ttt{create\_readmeteo.exe} and \ttt{readmeteo.exe}, using the compiler mentionned in the name of the current installation directory. In the \ttt{WPS} directory, please compile the programs \ttt{geogrid.exe} and \ttt{metgrid.exe}. 39 40 \begin{verbatim} 41 echo $PWD 42 cd PREP_MARS/ 39 The script \ttt{prepare\_ini} plays for the preprocessing tools a similar role as the \ttt{copy\_model} with the model sources: files are simply linked to their actual location in the \ttt{SRC} folder. Once you have executed \ttt{prepare\_ini}, please check that two folders were generated: \ttt{PREP\_MARS} and \ttt{WPS}. In the \ttt{PREP\_MARS} directory, please compile the programs \ttt{create\_readmeteo.exe} and \ttt{readmeteo.exe}, using the compiler mentioned in the name of the current installation directory. In the \ttt{WPS} directory, please compile the programs \ttt{geogrid.exe} and \ttt{metgrid.exe}. 
Here are the useful commands: 40 41 \begin{verbatim} 42 cd your_compdir/PREP_MARS/ 43 43 ./compile_pgf [or] ./compile_g95 [or] ./compile_ifort 44 44 ls -lt create_readmeteo.exe readmeteo.exe … … 51 51 52 52 \sk 53 Apart from the executables just compiled, the preprocessing utilities include \ttt{real.exe}, which was compiled by the \ttt{makemeso} script along with the mesoscale model executable \ttt{wrf.exe}\footnote{Even though the name of the executable reads e.g. \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, such program is not related to the specific \ttt{makemeso} parameters -- contrary to the \ttt{wrf.exe} executable. We just found that renaming the (possibly similar if the model sources were not modified) \ttt{real.exe} was a practical way not to confuse between executables compiled at different moments.}. \ttt{real.exe} should be copied or linked in the simulation directory (e.g. \ttt{TESTCASE} for the Arsia Mons test case) to be at the same level than \ttt{namelist.input}.53 Apart from the executables just compiled, the preprocessing utilities include \ttt{real.exe}, which was compiled by the \ttt{makemeso} script along with the mesoscale model executable \ttt{wrf.exe}\footnote{Even though the name of the executable reads e.g. \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, such program is not related to the specific \ttt{makemeso} parameters -- contrary to the \ttt{wrf.exe} executable. We just found that renaming the (possibly similar if the model sources were not modified) \ttt{real.exe} executable was a practical way not to confuse between executables compiled at different moments.}. \ttt{real.exe} should be copied or linked in the simulation directory (e.g. \ttt{TESTCASE} for the Arsia Mons test case) to be at the same level than \ttt{namelist.input}. 54 54 55 55 \sk … … 57 57 58 58 \sk 59 All the static data (topography, thermal inertia, albedo) needed to initialize the model are included in the \ttt{\$MMM/WPS\_GEOG} directory. 
By default, only coarse-resolution datasets\footnote{ Correspondingto the fields stored in the file \ttt{surface.nc} known by LMD-MGCM users: \url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc} } are available, but the directory also contains sources and scripts to install finer resolution datasets: 32 and/or 64 pixel-per-degree (ppd) MOLA topography (\ttt{mola\_topo32} and \ttt{mola\_topo64}), 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo (\ttt{albedo\_TES}), 20 ppd TES thermal inertia (\ttt{thermal\_TES}). The role of the \ttt{build\_static} script is to automatically download these datasets from the web (namely PDS archives) and convert them to an acceptable format for a future use by the preprocessing utilities:59 All the static data (topography, thermal inertia, albedo) needed to initialize the model are included in the \ttt{\$MMM/WPS\_GEOG} directory. By default, only coarse-resolution datasets\footnote{These coarse-resolution datasets correspond to the fields stored in the file \ttt{surface.nc} known by LMD-MGCM users: \url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc} } are available, but the directory also contains sources and scripts to install finer resolution datasets: 32 and/or 64 pixel-per-degree (ppd) MOLA topography (\ttt{mola\_topo32} and \ttt{mola\_topo64}), 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo (\ttt{albedo\_TES}), 20 ppd TES thermal inertia (\ttt{thermal\_TES}). 
The role of the \ttt{build\_static} script is to automatically download these datasets from the web (namely PDS archives) and convert them to an acceptable format for a future use by the preprocessing utilities: 60 60 61 61 \begin{verbatim} 62 62 cd $MMM 63 ./SRC/SCRIPTS/build_static 64 \end{verbatim} 65 66 \sk 67 \begin{finger} 68 \item Please install the \ttt{octave} free software\footnote{ Available at \url{http://www.gnu.org/software/octave} } on your system to execute the \ttt{build\_static} script\footnote{ Another solution is to browse into each of the directories within \ttt{WPS\_GEOG/res}, download the data with the shell scripts and execute the \ttt{.m} scripts with either \ttt{octave} or the commercial software \ttt{matlab} (just replace \ttt{\#} by \ttt{\%}). }. 63 ln -sf SRC/SCRIPTS/build_static . 64 ./build_static 65 \end{verbatim} 66 67 \sk 68 \begin{finger} 69 \item Please install the \ttt{octave} free software\footnote{Available at \url{http://www.gnu.org/software/octave} } on your system to execute the \ttt{build\_static} script\footnote{ Another solution is to browse into each of the directories within \ttt{WPS\_GEOG/res}, download the data with the shell scripts and execute the \ttt{.m} scripts with either \ttt{octave} or the commercial software \ttt{matlab} (just replace \ttt{\#} by \ttt{\%}). }. 69 70 \item Building the MOLA 64ppd database can be quite long; hence this is not performed by default by the \ttt{build\_static} script. If you would like to build this database, please remove the \ttt{exit} command in the script, just above the commands related to the MOLA 64ppd. 70 71 \item If you do not manage to execute the \ttt{build\_static} script, ready-to-use datafiles can be found in the link \url{ftp://ftp.lmd.jussieu.fr/pub/aslmd} and must be extracted in \ttt{\$MMM/WPS\_GEOG}. 71 \item The resulting \ttt{WPS\_GEOG} can reach a size of several hundreds of Mo. 
You might move such a folder in a place with more disk space available and define a link named~\ttt{WPS\_GEOG} in \ttt{\$MMM}.
72 \item The resulting \ttt{WPS\_GEOG} directory can reach a size of several hundreds of Mo. You might move such a folder to a place with more disk space available and define a link~\ttt{WPS\_GEOG} in \ttt{\$MMM}. 72 73 \end{finger} 73 74 … 76 77 77 78 \sk
78 The LMD Martian GCM is supposed to be run to compute meteorological fields that will be used as initial and boundary conditions each one or two Martian hours to the limited-area LMD Martian Mesoscale Model. Hence the LMD Martian GCM must be compiled in your system (see the LMD-MGCM user manual for further details \url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}). If you did not get the model using the \ttt{svn} method, please request us to send you an archive containing the LMD-MGCM named \ttt{LMDZ.MARS.meso.tar.gz}, which you have to extract in the \ttt{\$MESO} directory. If you got the model using \ttt{svn}, you do not have to request this file. In the \ttt{\$MESO/LMDZ.MARS} directory, a script named \ttt{compile} can be found and must be used \emph{on the system you plan to run the mesoscale model on} to compile the GCM. The \ttt{compile} script is actually just a wrapper for the \ttt{makegcm} script which compile the GCM for you; the default \ttt{makegcm} script only works with Portland Group Fortran compiler \ttt{pgf90} but scripts to compile the model using other Fortran compilers (including \ttt{g95} or \ttt{ifort}) are also available. The following commands should yield the compilation of two executables \ttt{newstart.e} and \ttt{gcm.e}:
79 The LMD Martian GCM needs to be run to compute meteorological fields that will be used as initial and boundary conditions each one or two Martian hours by the limited-area LMD Martian Mesoscale Model.
Hence the LMD Martian GCM must be compiled on your system (see the LMD-MGCM user manual for further details: \url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}). If you did not get the model using the \ttt{svn} method, please ask us to send you an archive containing the LMD-MGCM named \ttt{LMDZ.MARS.meso.tar.gz}, to be extracted in the \ttt{\$MESO} directory. If you got the model using \ttt{svn}, you do not have to request this file. In the \ttt{\$MESO/LMDZ.MARS} directory, a script named \ttt{compile} can be found and must be used \emph{on the system you plan to run the mesoscale model on} to compile the GCM. The \ttt{compile} script is actually just a wrapper for the \ttt{makegcm} script which compiles the GCM for you; the default \ttt{makegcm} script only works with the Portland Group Fortran compiler \ttt{pgf90}, but scripts to compile the model using other Fortran compilers (including \ttt{g95} or \ttt{ifort}) are also available. The following commands should yield the compilation of the two executables \ttt{newstart.e} and \ttt{gcm.e}: 79 80 80 81 \begin{verbatim} … 85 86 86 87 \sk
87 The other necessary operation to prepare the LMD-MGCM for step~1 is to store a set of initial states for the LMD-MGCM to start with -- based on previous typical LMD-MGCM runs having reached equilibrium after ten years of integration. A reference database can be found in the following online big archive~\url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz}. This archive must be extracted somewhere on a disk that would be accessible by the system you plan to run the mesoscale model on. A link named~\ttt{startbase} towards the \ttt{STARTBASE\_64\_48\_32\_t2} directory must be created in the directory~\ttt{\$MESO/LMDZ.MARS/myGCM}.
If those operations went well, please try the command line~\ttt{echo 22 | launch\_gcm} which should launch the GCM integrations on your system.
88 The other necessary operation to prepare the LMD-MGCM for step~1 is to store a set of initial states for the LMD-MGCM to start with, based on previous typical LMD-MGCM runs having reached equilibrium after ten years of integration. A reference database can be found in the following online archive~\url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz}. This archive must be extracted somewhere on a disk that would be accessible to the system you plan to run the mesoscale model on. A link named~\ttt{startbase} towards the \ttt{STARTBASE\_64\_48\_32\_t2} directory must be created in the directory~\ttt{\$MESO/LMDZ.MARS/myGCM}. If those operations went well, please try the command line~\ttt{echo 22 | launch\_gcm} in this directory, which should launch the GCM integrations on your system. 88 89 89 90 \mk … 97 98 \begin{finger}
98 99 \item changing the season of simulation implies re-running the LMD Mars GCM for this specific season to prepare initial and boundary conditions for the mesoscale model. Hence e.g. \ttt{start\_month} is labelled with \ttt{(p1)} because changing this in \ttt{namelist.input} requires a complete reprocessing from step~$1$ to step~$3$ to successfully launch the simulation.
99 \item changing the number of horizontal grid points for the mesoscale domain implies to interpolate the static and GCM fields to the new domain, while no new computations on the GCM side are needed. Hence e.g.
\ttt{e\_we} is labelled with \ttt{(p2)} because changing this in \ttt{namelist.input} requires a reprocessing from step~$2$ to step~$3$ to successfully launch the simulation (and also for this specific parameter recompiling with \ttt{makemeso} ).
100 \item changing the number of horizontal grid points for the mesoscale domain implies interpolating the static and GCM fields to the new domain, while no new computations on the GCM side are needed. Hence e.g. \ttt{e\_we} is labelled with \ttt{(p2)} because changing this in \ttt{namelist.input} requires a reprocessing from step~$2$ to step~$3$ to successfully launch the simulation (and for this specific parameter, recompiling with \ttt{makemeso} is also needed). 100 101 \item changing the position of model top implies interpolating initial and boundary conditions to the new vertical levels, while no horizontal re-interpolations are needed. Hence e.g. \ttt{p\_top\_requested} is labelled with \ttt{(p3)} because changing this requires a reprocessing of step~$3$.
101 \item changing the timestep for dynamical integration does not require any change in initial and bouondary conditions. Hence e.g. \ttt{time\_step} is not labelled with \ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)}.
102 \item changing the timestep for dynamical integration does not require any change in initial and boundary conditions. Hence e.g. \ttt{time\_step} is not labelled with \ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)}. 102 103 \end{finger} 103 104 … 113 114 114 115 \sk
115 Here we assume that the user has chosen a given Martian sol or $L_s$ on which to start the mesoscale simulation. As already mentionned in section~\ref{namelist}, the file \ttt{\$MMM/SIMU/calendar} (or see appendix~\ref{calendar}) can help with this choice (sol$\rightarrow$$L_s$$\rightarrow$mesoscale date and vice-versa).
In addition, the user has to check which sol is before the one wanted for simulation start and has $99$ in the first column: such sols are the ones for which an initial starting file for the GCM is available. Then please set the number of GCM simulated days \ttt{nday} in \ttt{\$MESO/LMDZ.MARS/myGCM/run.def} accordingly: suppose the user you want to start a mesoscale simulation at sol~9 during 4~sols, then according to the \ttt{calendar} file, sol~8 is the closest file before sol~9 to be in the database, so \ttt{nday} must be at least~$5$. For optimal forcing at the boundaries, we advise you to write the meteorological fields to the \ttt{diagfi.nc} file at least each two hours, i.e. \ttt{ecritphy} is $40$ or~$80$ in \ttt{\$MESO/LMDZ.MARS/myGCM/run.def}. Eventually the GCM run can be launched using the following commands and should produce a netCDF data file named \ttt{diagfi.nc}\footnote{If the fields \ttt{emis}, \ttt{co2ice}, \ttt{q01}, \ttt{q02}, \ttt{tsoil} are missing in the \ttt{diagfi.nc} file, they are replaced by respective default values $0.95$, $0$, $0$, $0$, tsurf in the end of preprocessing step 1.}:
116 Here we assume that the user has chosen a given Martian sol or $L_s$ on which to start the mesoscale simulation. As already mentioned in section~\ref{namelist}, the file \ttt{\$MMM/SIMU/calendar} reproduced in appendix can help with this choice (i.e. sol$\rightarrow$$L_s$$\rightarrow$mesoscale date and vice-versa). In addition, the user has to check in the \ttt{calendar} file which sol is before the one wanted for simulation start and has $99$ in the first column: such sols are the ones for which an initial starting file for the GCM is available.
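The sol-selection rule just described (take the latest sol at or before the desired start sol that carries the $99$ flag) can be sketched as follows; the row format used here is purely illustrative, the real \ttt{\$MMM/SIMU/calendar} file has its own layout:

```python
# Illustrative sketch of the GCM start-sol selection rule. Rows are
# hypothetical (flag, sol) pairs; a flag of 99 marks sols for which an
# initial start file exists in the database.
calendar = [(99, 0), (0, 4), (99, 8), (0, 9), (99, 12)]

def gcm_start_sol(desired_sol):
    """Latest sol <= desired_sol flagged 99 (i.e. with a GCM start file)."""
    return max(sol for flag, sol in calendar if flag == 99 and sol <= desired_sol)

print(gcm_start_sol(9))  # starting a mesoscale run at sol 9 picks the sol-8 file
```

With these sample rows, a mesoscale start at sol~9 falls back on the sol~8 start file, matching the \ttt{nday} example given in the text.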
Then the number of GCM simulated days \ttt{nday} in \ttt{\$MESO/LMDZ.MARS/myGCM/run.def} must be set accordingly: suppose you want to start a mesoscale simulation at sol~9 during 4~sols, then according to the \ttt{calendar} file, sol~8 is the closest file before sol~9 to be in the database, so \ttt{nday} must be at least~$5$. For optimal forcing at the boundaries, we advise you to write the meteorological fields to the \ttt{diagfi.nc} file at least every two hours, or ideally every hour, i.e. \ttt{ecritphy} is respectively~$80$ or~$40$ in \ttt{\$MESO/LMDZ.MARS/myGCM/run.def}. Finally the GCM run can be launched using the following commands and should produce a netCDF data file named \ttt{diagfi.nc}: 116 117 117 118 \begin{verbatim} … 133 134 134 135 \sk
135 Once the GCM simulations is finished, the programs in the \ttt{PREP\_MARS} directory allow the user to convert the data from the NETCDF \ttt{diagfi.nc} file into separated binary datafiles for each date contained in \ttt{diagfi.nc}, according to the formatting needed by the preprocessing programs at step 2. These programs can be executed by the following commands; if every went well with the conversion,
136 the directory \ttt{\$MESO/TMPDIR/WPSFEED} should contain files named \ttt{LMD:}.
137 137 138 138 \begin{verbatim} … 157 157 158 158 \begin{finger}
159 \item No input meteorological data are actually needed to execute \ttt{geogrid.exe}. This step~2a can be achieved/prepared e.g. before step~1. It is probably a good idea to prepare step~2 by choosing the mesoscale simulation domain , while GCM computations are done in step~1.
159 \item No input meteorological data are actually needed to execute \ttt{geogrid.exe}. This step~2a can be achieved/prepared e.g. before step~1. It is probably a good idea to prepare step~2 by choosing the mesoscale simulation domain while GCM computations are being performed during step~1. 160 160 \item More details about the database and more options of interpolation can be found in the file \ttt{geogrid/GEOGRID.TBL} (for advanced users only). 161 161 \item Two examples of \ttt{namelist.wps} parameters are given in Figure~\ref{vallespolar} with resulting domains. … 171 171 \includegraphics[width=0.48\textwidth]{LMD_MMM_d1_20km_domain_100.png} 172 172 \end{center}
173 \caption{\label{vallespolar} (Left plot) An example of mercator domain in the Valles Marineris region as simulated by \textit{Spiga and Forget} [2009, their section 3.3]: relevant parameters in \ttt{namelist.wps} are: \ttt{e\_we = 401}, \ttt{e\_we = 121}, \ttt{dx = 12000}, \ttt{dy = 12000}, \ttt{map\_proj ='mercator'}, \ttt{ref\_lat = -8}, \ttt{ref\_lon = -68}.
(Right plot) An example of north polar domain with stereographical projection: relevant parameters in \ttt{namelist.wps} are: \ttt{e\_we = 117}, \ttt{e\_we = 117}, \ttt{dx = 20000}, \ttt{dy = 20000}, \ttt{map\_proj='polar'}, \ttt{ref\_lat = 90}, \ttt{ref\_lon = 0.1}, \ttt{truelat1 = 90}, \ttt{stand\_lon = 0.1}.}
173 \caption{\label{vallespolar} (Left plot) An example of mercator domain in the Valles Marineris region as simulated by \textit{Spiga and Forget} [2009, their section 3.3]: relevant parameters in \ttt{namelist.wps} are: \ttt{e\_we = 401}, \ttt{e\_sn = 121}, \ttt{dx = 12000}, \ttt{dy = 12000}, \ttt{map\_proj = 'mercator'}, \ttt{ref\_lat = -8}, \ttt{ref\_lon = -68}. (Right plot) An example of north polar domain with stereographical projection: relevant parameters in \ttt{namelist.wps} are: \ttt{e\_we = 117}, \ttt{e\_sn = 117}, \ttt{dx = 20000}, \ttt{dy = 20000}, \ttt{map\_proj = 'polar'}, \ttt{ref\_lat = 90}, \ttt{ref\_lon = 0.1}, \ttt{truelat1 = 90}, \ttt{stand\_lon = 0.1}.} 174 174 \end{figure} 175 175 176 176 \sk
177 The input datasets for topography and soil properties can be set in \ttt{namelist.wps} through the keyword \ttt{geog\_data\_res}. Possible choices are 177 178 The input datasets for topography and soil properties can be set in \ttt{namelist.wps} through the keyword \ttt{geog\_data\_res}. Possible choices are: 178 178 \begin{citemize} 179 179 \item \ttt{'gcm'}: coarse-resolution datasets; … 197 197 198 198 \sk
199 The last preprocessing step before being able to run the mesoscale simulation at step~4 is to execute \ttt{real.exe} to perform the interpolation from the vertical levels of the GCM to the vertical levels defined in the mesoscale model. This program also prepares the final initial state for the simulation in files called \ttt{wrfinput} and the boundary conditions in files called \ttt{wrfbdy}.
To successfully execute \ttt{real.exe}, you need the \ttt{met\_em.*} files and the \ttt{namelist.input} file to be in the same directory as \ttt{real.exe}. Parameters in \ttt{namelist.input} controlling the behavior of the vertical interpolation are those labelled with \ttt{(p3)} in the detailed list introduced in chapter~\ref{zeparam}.
200 201 \begin{verbatim}
202 cd $MESO/TESTCASE ## or anywhere you would like to run the simulation
199 The last preprocessing step before being able to run the mesoscale simulation at step~4 is to execute \ttt{real.exe} to perform the interpolation from the vertical levels of the GCM to the vertical levels defined in the mesoscale model. This program also prepares the final initial state for the simulation in files called \ttt{wrfinput} and the boundary conditions in files called \ttt{wrfbdy}. To successfully execute \ttt{real.exe}, you need the \ttt{met\_em.*} files and the \ttt{namelist.input} file to be in the same directory as \ttt{real.exe}. Parameters in \ttt{namelist.input} which control the behavior of the vertical interpolation are those labelled with \ttt{(p3)} in the detailed list introduced in chapter~\ref{zeparam}.
200 201 \begin{verbatim}
202 cd $MMM/TESTCASE ## or anywhere you would like to run the simulation
203 203 ln -sf $MESO/TMPDIR/WRFFEED/met_em* .
204 204 ./real.exe
… 210 210 \sk 211 211 \begin{finger}
212 \item \textbf{ When you modify either \ttt{namelist.wps} or \ttt{namelist.input}, make sure that the common parameters are exactly similar in both files (especially when running nested simulations) otherwise either \ttt{real.exe} or \ttt{wrf.exe} command will exit with an error message. Also, obviously the dates sent to \ttt{launch\_gcm} and written in both \ttt{namelist.input} and \ttt{namelist.wps} should be all the same.
}
212 \item \textbf{ When you modify either \ttt{namelist.wps} or \ttt{namelist.input}, make sure that the common parameters are exactly similar in both files (especially when running nested simulations), otherwise either the \ttt{real.exe} or the \ttt{wrf.exe} command will exit with an error message. Also, obviously, the dates sent to \ttt{launch\_gcm} and written in both \ttt{namelist.input} and \ttt{namelist.wps} should be consistent. } 213 213 \end{finger} 214 214
-
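The consistency requirement on parameters shared by \ttt{namelist.input} and \ttt{namelist.wps} can be machine-checked before launching \ttt{real.exe}. The sketch below is a deliberately simplified illustration (plain \ttt{key = value} parsing; real WRF/WPS namelists are organised in \ttt{\&sections} with richer syntax):

```python
# Simplified sketch: flag parameters present in both namelist files whose
# values differ. Real namelists have &sections and multi-value entries;
# this only handles flat "key = value," lines for illustration.
def parse_namelist(text):
    params = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            params[key.strip()] = value.strip().rstrip(",")
    return params

def common_mismatches(input_text, wps_text):
    """Return keys shared by both files whose values differ."""
    a, b = parse_namelist(input_text), parse_namelist(wps_text)
    return sorted(k for k in a.keys() & b.keys() if a[k] != b[k])

namelist_input = "e_we = 117,\ndx = 20000,\nstart_month = 7,"
namelist_wps = "e_we = 117,\ndx = 12000,"
print(common_mismatches(namelist_input, namelist_wps))  # dx differs here
```

Any key reported by such a check would make \ttt{real.exe} or \ttt{wrf.exe} exit with an error, so it is worth running before a long simulation.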
trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual.tex
r261 r262 102 102 103 103 \appendix 104 \chapter{Martian calendars} \label{calendar}104 \chapter{Martian calendars} 105 105 106 106 \sk -
trunk/MESOSCALE_DEV/MANUAL/SRC/whatis.tex
r257 r262 2 2 3 3 \vk 4 This chapter comprises excerpts from \textit{Spiga and Forget} [2009]\nocite{Spig:09} which are dedicated to a general scientific and technical description of the LMD Martian Mesoscale Model, of its design and capabilities. Further details can be found in the reference \textit{Spiga and Forget} [2009]\nocite{Spig:09} paper and subsequent papers about mesoscale applications: e.g., \textit{Spiga and Lewis} [2010]\nocite{Spig:10dust} and \textit{Spiga et al.} [2011]\nocite{Spig:11ti}. An introduction to Large-Eddy Simulations can be found in \textit{Spiga et al.} [2010]\nocite{Spig:10bl}. Figure~\ref{modelstructure} summarizes the main points detailed in this introduction.4 This chapter comprises slightly edited excerpts from \textit{Spiga and Forget} [2009]\nocite{Spig:09}, dedicated to a general scientific and technical description of the LMD Martian Mesoscale Model, of its design and capabilities. Further details can be found in the reference \textit{Spiga and Forget} [2009]\nocite{Spig:09} paper and subsequent papers about mesoscale applications: e.g., \textit{Spiga and Lewis} [2010]\nocite{Spig:10dust} and \textit{Spiga et al.} [2011]\nocite{Spig:11ti}. Figure~\ref{modelstructure} summarizes the main points detailed in this introduction. This chapter is intended both for beginners and advanced users of the LMD Martian Mesoscale Model. 5 5 6 6 \begin{center} … … 18 18 19 19 \sk 20 The ARW-WRF mesoscale model integrates the fully compressible non-hydrostatic Navier-Stokes equations in a specific area of interest on the planet. 
Since the mesoscale models can be employed to resolve meteorological motions less than few kilometers, a scale at which the vertical wind acceleration might become comparable to the acceleration of gravity, hydrostatic balance cannot be assumed, as is usually done in GCMs.
20 The ARW-WRF mesoscale model integrates the fully compressible non-hydrostatic Navier-Stokes equations in a specific area of interest on the planet. Since the mesoscale models can be employed to resolve meteorological motions of less than a few kilometers, a scale at which the vertical wind acceleration might become comparable to the acceleration of gravity, hydrostatic balance cannot be assumed, as is usually done in General Circulation Models (GCMs). 21 21 22 22 \sk … 123 123 124 124 \sk
125 The LMD Martian Mesoscale Model has the complete ability to simulate the dust cycle (lifting, sedimentation, transport). However, the high sensivity of the results to the assumptions made on threshold wind stress and injection rate [\textit{Basu et al.}, 2004]\nocite{Basu:04} leads us to postpone these issues to future studies. Instead, similarly to the reference LMD-MGCM simulations, dust opacities are prescribed in the mesoscale model from 1999-2001 TES measurements, thought to be representative of Martian atmospheric conditions outside of planet-encircling dust storm events [\textit{Montabone et al.}, 2006]\nocite{Mont:06luca}. In the vertical dimension, as described in \textit{Forget et al.} [1999], and in accordance with the general consensus of well-mixed dust in equilibrium with sedimentation and mixing processes [\textit{Conrath}, 1975]\nocite{Conr:75}, dust mixing ratio is kept constant from the surface up to a given elevation $z_{\textrm{\tiny{max}}}$ above which it rapidly declines.
Both in the nominal GCM and mesoscale simulations, $z_{\textrm{\tiny{max}}}$ as a function of areocentric longitude and latitude is calculated from the ``MGS scenario" [\textit{Forget et al.}, 2003]\nocite{Forg:03}.
125 The LMD Martian Mesoscale Model has the complete ability to simulate the dust cycle (lifting, sedimentation, transport). However, the high sensitivity of the results to the assumptions made on threshold wind stress and injection rate [\textit{Basu et al.}, 2004]\nocite{Basu:04} leads us to postpone these issues to future studies. Instead, similarly to the reference LMD-MGCM simulations, dust opacities are prescribed in the mesoscale model from 1999-2001 TES measurements, thought to be representative of Martian atmospheric conditions outside of planet-encircling dust storm events [\textit{Montabone et al.}, 2006]\nocite{Mont:06luca}. In the vertical dimension, as described in \textit{Forget et al.} [1999], and in accordance with the general consensus of well-mixed dust in equilibrium with sedimentation and mixing processes [\textit{Conrath}, 1975]\nocite{Conr:75}, dust mixing ratio is kept constant from the surface up to a given elevation $z_{\textrm{\tiny{max}}}$ above which it rapidly declines. Both in the nominal GCM and mesoscale simulations, $z_{\textrm{\tiny{max}}}$ as a function of areocentric longitude and latitude is calculated from the ``MGS scenario" [\textit{Forget et al.}, 2003]\nocite{Forg:03}. 126 126 127 127 \mk
-
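The dust vertical distribution described in the paragraph above can be summarized schematically; the piecewise form below is purely illustrative (with a hypothetical decay scale $H_d$), not the exact Conrath-type formulation used in the model:

```latex
% Schematic dust mixing ratio: constant up to z_max, rapid decline above.
q(z) =
\begin{cases}
  q_0, & z \le z_{\textrm{\tiny{max}}} \\
  q_0 \, \exp\!\left[ -\left( z - z_{\textrm{\tiny{max}}} \right) / H_d \right], & z > z_{\textrm{\tiny{max}}}
\end{cases}
```

Here $q_0$ is the well-mixed near-surface mixing ratio and $z_{\textrm{\tiny{max}}}$ is taken from the ``MGS scenario" as stated in the text.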
trunk/MESOSCALE_DEV/MANUAL/SRC/winds.py.help
r261 r262
1 Usage: winds.py [options]
2
3 Options:
4 1 -h, --help show this help message and exit
5 2 -f NAMEFILE, --file=NAMEFILE [NEEDED] name of WRF file (append)
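The help text above follows the conventions of Python's \ttt{optparse} module. A hypothetical reconstruction of how such a parser could be declared (only the two options shown in the excerpt are known; the rest of \ttt{winds.py} is not assumed):

```python
# Hypothetical sketch of the winds.py option parser, inferred only from the
# help excerpt above; the actual script may declare it differently.
from optparse import OptionParser

parser = OptionParser(usage="Usage: winds.py [options]")
parser.add_option("-f", "--file", dest="namefile", metavar="NAMEFILE",
                  action="append", help="[NEEDED] name of WRF file (append)")

# action="append" lets -f be repeated, accumulating file names in a list,
# which matches the "(append)" note in the help text.
opts, args = parser.parse_args(["-f", "wrfout_d01", "-f", "wrfout_d02"])
print(opts.namefile)
```

Repeating \ttt{-f} therefore yields a list of WRF output files to process.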