Changeset 218 for trunk/MESOSCALE_DEV/MANUAL/SRC
- Timestamp: Jul 12, 2011, 7:41:55 PM
- Location: trunk/MESOSCALE_DEV/MANUAL/SRC
- Files: 5 added, 5 edited
trunk/MESOSCALE_DEV/MANUAL/SRC/installation.tex (r214 → r218)

Changed (line 112, within the installation commands):

 tar xzvf LMD_MM_MARS.tar.gz
 cd $LMDMOD/LMD_MM_MARS
-./prepare
+./SRC/SCRIPTS/prepare   ## or simply ./prepare if the script is in LMD_MM_MARS
 \end{verbatim}

Removed (lines 139-191, the section "Structure of the \ttt{LMD\_MM\_MARS} directory"):

\mk
\section{Structure of the \ttt{LMD\_MM\_MARS} directory}

\sk
Please check the contents of the \ttt{LMD\_MM\_MARS} and \ttt{LMD\_MM\_MARS/SRC} directories with the following command lines:
\begin{verbatim}
ls $MMM
ls $MMM/SRC
\end{verbatim}

\sk
Contents of the \ttt{LMD\_MM\_MARS} directory:
\begin{citemize}
\item \ttt{makemeso}: the \ttt{bash} script used to compile the model.
\item \ttt{SRC}: a directory containing the model sources.
\item \ttt{SIMU}: a directory containing scripts and files for advanced use.
\item \ttt{WPS\_GEOG}: a directory containing static data for the model (topography, soil properties, etc.).
\end{citemize}

\sk
Contents of the \ttt{LMD\_MM\_MARS/SRC} directory:
\begin{citemize}
\item \ttt{WRFV2}: a directory containing the main model sources (modified WRF dynamics + LMD physics in \ttt{mars\_lmd*}).
\item \ttt{PREP\_MARS}: a directory containing sources for the first preprocessing step.
\item \ttt{WPS}: a directory containing sources for the second preprocessing step.
\item \ttt{POSTPROC}: a directory containing postprocessing sources.
\item \ttt{LES} and \ttt{LESnophys\_}: directories containing sources for Large-Eddy Simulations.
\end{citemize}

\sk
Contents of the \ttt{LMD\_MM\_MARS/WPS\_GEOG} directory:
\begin{citemize}
\item four directories \ttt{albedo\_GCM}, \ttt{mola\_GCM}, \ttt{thermal\_GCM}, \ttt{res}
\item one data file \ttt{dust\_tes.nc}
\end{citemize}

\sk
The directory~\ttt{LMD\_MM\_MARS/SIMU} contains many files and directories that are not important at this stage. If you used method~$2$, you will probably notice that directories other than~\ttt{LMD\_MM\_MARS} are present in \ttt{\$LMDMOD}, but those are not important at this stage either.

 \clearemptydoublepage
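The directory layout described in the removed section can be sanity-checked after extracting the archive. The snippet below is a hypothetical helper, not part of the model distribution; it assumes the $MMM variable points at the LMD_MM_MARS directory, as in the manual's own `ls $MMM` commands.

```shell
# Hypothetical sanity check (not part of the model distribution):
# verify the LMD_MM_MARS layout described above. Assumes $MMM points
# to the LMD_MM_MARS directory; defaults to ./LMD_MM_MARS otherwise.
MMM="${MMM:-./LMD_MM_MARS}"
missing=0
for d in SRC SIMU WPS_GEOG SRC/WRFV2 SRC/PREP_MARS SRC/WPS SRC/POSTPROC; do
  if [ -d "$MMM/$d" ]; then
    echo "found   $MMM/$d"
  else
    echo "MISSING $MMM/$d"
    missing=$((missing + 1))
  fi
done
[ -x "$MMM/makemeso" ] || echo "MISSING $MMM/makemeso (or not executable)"
echo "$missing directory(ies) missing"
```

Any `MISSING` line indicates an incomplete extraction or a wrong $MMM setting.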
trunk/MESOSCALE_DEV/MANUAL/SRC/keep (r209 → r218)

Removed:

\sk
\marge Now a few environment variables are needed for the model to compile and run correctly. You also need the environment variable \ttt{\$LMDMOD} to point to the directory where you will install the model.

\begin{verbatim}
declare -x LMDMOD=/disk/user/MODELS
\end{verbatim}

\sk
If everything went well up to this stage, you are now ready to install, compile and run the LMD Martian Mesoscale Model.

Added:

\begin{finger}
\item The model presently supports 3 nests, but more nests
can be included by adapting the following files:
\begin{verbatim}
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_inifis3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_physiq3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/duplicate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/generate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/makegcm*   ## search for 'nest'
\end{verbatim}%\pagebreak
\end{finger}
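The removed passage relies on the $LMDMOD environment variable. A minimal sketch of setting it up, assuming a bash shell; the path /disk/user/MODELS is the placeholder from the manual and should be adapted, and the $MMM shorthand (used by the `ls $MMM` commands in installation.tex) is defined here for convenience. The manual's `declare -x` is bash's equivalent of `export`; the portable form is used below. Add the same two lines to ~/.bashrc to make them persistent across sessions.

```shell
# A minimal sketch, assuming a bash-like shell. /disk/user/MODELS is the
# placeholder path from the manual -- adapt it to your system.
export LMDMOD=/disk/user/MODELS
export MMM=$LMDMOD/LMD_MM_MARS   # shorthand used in installation.tex
echo "LMDMOD=$LMDMOD"
echo "MMM=$MMM"
```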
trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual.tex (r209 → r218)

Changed (lines 88-92):

 %\newpage

-%\include{foreword}
-%\include{whatis}
+\include{foreword}
+\include{whatis}
 \include{installation}
-\include{user_manual_txt}
+\include{compile_exec}
+%\include{user_manual_txt}

 \backmatter
trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual_txt.tex (r209 → r218)

Removed (lines 1-227, the chapter "Compiling the model and running a test case"):

\chapter{Compiling the model and running a test case}

\vk
This chapter is also meant for first-time users of the LMD Martian Mesoscale Model. We describe how to compile the program and run a test case.

\mk
\subsection{Main compilation step}
\label{sc:makemeso}

\mk
In order to compile the model, execute the \ttt{makemeso} compilation script
in the \ttt{LMD\_MM\_MARS}\linebreak directory
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./makemeso
\end{verbatim}
%
\marge and answer the questions about
\begin{asparaenum}[1.]%[\itshape Q1\upshape)]
\item compiler choice (and number of processors if using MPI)
\item number of grid points in longitude [61]
\item number of grid points in latitude [61]
\item number of vertical levels [61]
\item number of tracers [1]
\item number of domains [1]
\end{asparaenum}

%\mk
\begin{finger}
\item The first time you compile the model, you will probably wonder what to reply
to questions $2$ to $6$ \ldots type the answers given in brackets to compile an executable suitable
for the test case given below.
\item Suppose you compiled a version of the model for a given set of parameters $1$ to $6$
to run a specific simulation.
If you would like to run another simulation
with at least one of parameters $1$ to $6$
subject to change, the model needs to be recompiled\footnote{This
necessary recompilation each time the number of grid points,
tracers or domains is modified is imposed by the LMD physics code.
The WRF dynamical core alone is much more flexible.} with \ttt{makemeso}.
\item When you use parallel computations, please bear in mind that with
$2$ (resp. $4$, $6$, $8$, $16$) processors the whole domain is separated
into $2$ (resp. $2$, $3$, $4$, $4$) tiles over
the latitude direction and $1$ (resp. $2$, $2$, $2$, $4$) tile over the longitude direction.
Thus make sure that the number of grid points minus $1$ in each direction
can be divided by the aforementioned number of tiles over the considered
direction.
\item If you use grid nesting, note that no more than $4$ processors can be used.
\end{finger}

\mk
\marge The \ttt{makemeso} script is automated and performs
the following series of tasks:
\begin{citemize}
\item determine whether the machine is 32 or 64 bits;
\item ask the user about the compilation settings;
\item create a corresponding directory \ttt{\$LMDMOD/LMD\_MM\_MARS/DIRCOMP};
\begin{finger}
\item For example, a \ttt{DIRCOMP} directory named \ttt{g95\_32\_single}
is created if the user requested
a \ttt{g95} compilation of the code for single-domain simulations
on a 32-bit machine.
\end{finger}
\item generate with \ttt{copy\_model} a directory \ttt{DIRCOMP/WRFV2} containing links to the \ttt{SRC/WRFV2} sources;
\begin{finger}
\item This method ensures that any change to the model sources is
propagated to all the different \ttt{DIRCOMP} installation folders.
\end{finger}
\item execute the WRF \ttt{configure} script with the correct option;
\item tweak the resulting \ttt{configure.wrf} file to include a link towards the Martian physics;
\item calculate the total number of horizontal grid points handled by the LMD physics;
\item duplicate the LMD physical sources if nesting is activated;
\begin{finger}
\item The model presently supports 3 nests, but more nests
can be included by adapting the following files:
\begin{verbatim}
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_inifis3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_physiq3.inc
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/duplicate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/generate3
$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/makegcm*   ## search for 'nest'
\end{verbatim}%\pagebreak
\end{finger}
\item compile the LMD physical packages with the appropriate \ttt{makegcm} command
and collect the compiled objects in the library \ttt{liblmd.a};
\begin{finger}
\item During this step, which can be rather long,
especially if you defined more than one domain,
the \ttt{makemeso} script provides you with the full path towards
the text file \ttt{log\_compile\_phys}, in which you can check the
compilation progress and possible errors.
%
At the end of the process, you will find an
error message associated with the generation of the
final executable.
%
Please do not pay attention to this, as the compilation of the LMD
sources is meant to generate a library of
compiled objects called \ttt{liblmd.a} instead of a program.
\end{finger}
\item compile the modified Martian ARW-WRF solver, including
the \ttt{liblmd.a} library;
\begin{finger}
\item The first time the model is compiled, this
step can be quite long.
%
The \ttt{makemeso} script provides you with a \ttt{log\_compile}
text file where the progress of the compilation can be checked
and a \ttt{log\_error} text file listing errors and warnings
during compilation.
%
A list of warnings related to \ttt{grib}
utilities (not used in the Martian model)
may appear and has no impact on the
final executables.
\item The compilation with \ttt{g95} might be unsuccessful
due to some problems with files related to terrestrial microphysics.
%
Please type the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/SRC
tar xzvf g95.tar.gz
cp -f g95/WRFV2_g95_fix/* WRFV2/phys/
cd $LMDMOD/LMD_MM_MARS
\end{verbatim}
\marge then recompile the model with the \ttt{makemeso} command.
\end{finger}
\item change the names of the executables in agreement with the
settings provided by the user.
\begin{finger}
\item If you chose to answer the \ttt{makemeso} questions using the
aforementioned parameters in brackets, you should have in the
\ttt{DIRCOMP} directory two executables:
\begin{verbatim}
real_x61_y61_z61_d1_t1_p1.exe
wrf_x61_y61_z61_d1_t1_p1.exe
\end{verbatim}
%
The directory also contains a text file
in which the answers to the questions are stored, which
allows you to re-run the script without the
``questions to the user" step:
\begin{verbatim}
./makemeso < makemeso_x61_y61_z61_d1_t1_p1
\end{verbatim}
\end{finger}
\end{citemize}

\mk
\section{Running a simple test case}
\label{sc:arsia}

\mk
We assume that you have successfully compiled
the model at the end of the previous section
and that you used the answers in brackets
to the \ttt{makemeso} questions.

\mk
\marge In order to test the compiled executables,
a ready-to-use test case
(with pre-generated initial and boundary
conditions) is proposed
in the \ttt{LMD\_MM\_MARS\_TESTCASE.tar.gz}
archive you can download at
\url{http://www.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/LMD_MM_MARS_TESTCASE.tar.gz}.
%
This test case simulates the hydrostatic
atmospheric flow around Arsia Mons during half a sol
with constant thermal inertia, albedo
and dust opacity.

\begin{finger}
\item Though the simulation reproduces some reasonable
features of the mesoscale circulation around Arsia
Mons (e.g. slope winds), it should not be used
for scientific purposes, for the number of grid points
is insufficient for a single-domain simulation
and the integration time is below the necessary spin-up time.
\end{finger}
%\pagebreak

\marge To launch the test simulation, please type
the following commands, replacing the
\ttt{g95\_32\_single} directory with its corresponding
value on your system:
%
\begin{verbatim}
cp LMD_MM_MARS_TESTCASE.tar.gz $LMDMOD/LMD_MM_MARS/
tar xzvf LMD_MM_MARS_TESTCASE.tar.gz
cd TESTCASE
ln -sf ../g95_32_single/wrf_x61_y61_z61_d1_t1_p1.exe wrf.exe
nohup wrf.exe > log_wrf &
\end{verbatim}

%tar xzvf wrfinput.tar.gz

\begin{finger}
\item If you compiled the model using MPICH2,
the commands to launch a simulation are slightly different:
%
\begin{verbatim}
[simulation on 2 processors on 1 machine]
mpd &                       # first-time only (or after a reboot)
                            # NB: may request the creation of a file .mpd.conf
mpirun -np 2 wrf.exe < /dev/null &   # NB: mpirun is only a link to mpiexec
tail -20 rsl.out.000?       # to check the outputs
\end{verbatim}
\begin{verbatim}
[simulation on 16 processors on 4 connected machines]
echo barry.lmd.jussieu.fr > ~/mpd.hosts
echo white.lmd.jussieu.fr >> ~/mpd.hosts
echo loves.lmd.jussieu.fr >> ~/mpd.hosts
echo tapas.lmd.jussieu.fr >> ~/mpd.hosts
ssh barry.lmd.jussieu.fr    # make sure that ssh to the other machines
                            # is possible without authentication
mpdboot -f ~/mpd.hosts -n 4
mpdtrace
mpirun -l -np 16 wrf.exe < /dev/null &   # NB: mpirun is only a link to mpiexec
tail -20 rsl.out.00??       # to check the outputs
\end{verbatim}
\end{finger}

Changed (line 831):

 \mk
-\section{Postprocessing utilities and graphics}
+\section{Postprocessing utilities and graphics}\label{postproc}

 \begin{remarque}
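The MPI tiling rule quoted in the removed chapter (grid points minus 1 must be divisible by the number of tiles in each direction) can be checked before compiling. The helper below is hypothetical, not part of makemeso; it hard-codes the processor-to-tile mapping stated in the chapter.

```shell
# Hypothetical helper (not part of makemeso): check the MPI tiling rule
# quoted above, i.e. (grid points - 1) in each direction must be
# divisible by the number of tiles in that direction.
check_tiling() {
  nx=$1; ny=$2; nproc=$3
  case $nproc in
    2)  tlat=2; tlon=1 ;;
    4)  tlat=2; tlon=2 ;;
    6)  tlat=3; tlon=2 ;;
    8)  tlat=4; tlon=2 ;;
    16) tlat=4; tlon=4 ;;
    *)  echo "no tiling rule quoted for $nproc processors"; return 2 ;;
  esac
  if [ $(( (nx - 1) % tlon )) -eq 0 ] && [ $(( (ny - 1) % tlat )) -eq 0 ]; then
    echo "OK: ${nx}x${ny} grid fits ${tlon}x${tlat} tiles on $nproc processors"
  else
    echo "KO: ${nx}x${ny} grid does not divide into ${tlon}x${tlat} tiles"
    return 1
  fi
}

check_tiling 62 61 8 || true   # 61 not divisible by 2 tiles in longitude -> KO
check_tiling 61 61 8           # 60 divisible by 2 and by 4 -> OK
```

With the bracketed defaults (61x61 points), any of the quoted processor counts works, since 60 is divisible by 1, 2, 3 and 4.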
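The executable names produced by makemeso encode the compilation settings, as the removed chapter shows (`wrf_x61_y61_z61_d1_t1_p1.exe`). A small illustration of that naming scheme, assuming the x/y/z/d/t/p ordering given above:

```shell
# Illustration of the executable naming scheme quoted above:
# wrf_x<nx>_y<ny>_z<nz>_d<domains>_t<tracers>_p<processors>.exe
nx=61; ny=61; nz=61; ndom=1; ntrac=1; nproc=1
exe="wrf_x${nx}_y${ny}_z${nz}_d${ndom}_t${ntrac}_p${nproc}.exe"
echo "$exe"    # wrf_x61_y61_z61_d1_t1_p1.exe
```

This is why a change to any of parameters 2 to 6 requires recompiling: each parameter set yields a distinct pair of `real_*`/`wrf_*` executables.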
trunk/MESOSCALE_DEV/MANUAL/SRC/whatis.tex (r209 → r218)

Changed (line 4): the chapter introduction now references the summary figure by label:

-... An introduction to Large-Eddy Simulations can be found in \textit{Spiga et al.} [2010]\nocite{Spig:10bl}. The figure at the end of this chapter summarizes the main points detailed in this introduction.
+... An introduction to Large-Eddy Simulations can be found in \textit{Spiga et al.} [2010]\nocite{Spig:10bl}. Figure~\ref{modelstructure} summarizes the main points detailed in this introduction.

Added (after line 4):

\begin{center}
\begin{figure}[p]
\includegraphics[width=0.99\textwidth]{meso.pdf}
\caption{\label{modelstructure} An illustration of the LMD Martian Mesoscale Model design and capabilities.}
\end{figure}
\end{center}

Changed (lines 135-136): the full-page PDF inclusion is commented out in favor of the figure environment added above:

-%\pagebreak
-\includepdf[pages=1,offset=25mm -20mm]{meso.pdf}
+%\includepdf[pages=1,offset=25mm -20mm]{meso.pdf}
 \clearemptydoublepage