Changeset 1182 for trunk/MESOSCALE


Timestamp:
Feb 18, 2014, 11:28:02 AM (11 years ago)
Author:
aslmd
Message:

MESOSCALE MANUAL. mistake in previous commit, erased latest version, fixed now

Location:
trunk/MESOSCALE/MANUAL/SRC
Files:
7 edited

  • trunk/MESOSCALE/MANUAL/SRC/advance.tex

    r653 r1182  
    104104\paragraph{Inputs/outputs} Large-Eddy Simulations need four input files \ttt{input\_coord}, \ttt{input\_sounding}, \ttt{input\_more}, \ttt{input\_therm}, which define the initial pressure, temperature, density and wind profiles at the location/season for which simulations are run, along with information about this location/season. Typical files are available upon request, or you might simply build your own profiles using the Mars Climate Database (see the sample \ttt{scilab} script \ttt{wrf\_sounding.sci} in \ttt{\$MMM/SIMU/RUN}). Examples of \ttt{input\_*} files are provided in \ttt{\$MMM/SRC/LES/modif\_mars/DEF} and correspond to the cases run in the study by \textit{Spiga et al.} [2010].
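As a minimal check before launching a Large-Eddy Simulation (a sketch only; the run directory name is yours, and it is assumed that the profile files are gathered in the directory where the LES is run), make sure the four input files are present:
\begin{verbatim}
cd your_LES_simulation_directory
ls input_coord input_sounding input_more input_therm    ## the four files must be present
\end{verbatim}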
    105105
     106%% IMPORTANT IMPORTANT
    106107%% now python inimeso.py in UTIL/PYTHON
    107108
     
    116117
    117118
     119%%% the model actually supports 5 nests
     120%%% do not forget that mars must be duplicated in the namelist...
     121
     122
    118123%\mk
    119124%\section{Idealized test cases} [such as GW case]
     
    123128
    124129
    125 %\mk
    126 %\section{Running simulations with the new physics}
    127 %different callphys.def
    128 %the step datafile.h is not needed anymore ! use callphys.def.
    129 %traceur.def
    130 %run.def
    131 %different callphys.def
    132 %makemeso -p
    133 %%             mars = 3   ---> cycle poussieres : dustq + dustn [NOUVELLE PHYS seulement]
    134 %%             mars = 11  ---> cycle de l'eau + poussieres [1+3] [NOUVELLE PHYS seulement]
    135 %% LES LES
    136 %%[set to 18 for newphys]
     130\mk
     131\section{Running simulations with the new physical parameterizations}
    137132
    138 %% attention init_TI n'est plus separable... il faut run real.exe a cause du modele sous surface
    139 %% il faut ajouter init_Z0 ! ou il n'y a pas besoin de re executer real.exe
     133\sk
     134Using the most recent physical parameterizations means using a version of the LMD Martian Mesoscale Model that is still under development (hence experimental). It is therefore recommended to contact the developers in order to run simulations in this mode.
    140135
    141 %% si z0 n est pas dans le wrfinput on prend la valeur par defaut
     136\sk
     137For advanced users who have learned from the LMD team how to use the new physical parameterizations, here are a few differences from the physical parameterizations natively provided with the LMD Martian Mesoscale Model that must be kept in mind (an illustrative sketch is given after the list):
     138\begin{finger}
     139\item a folder \ttt{LMDZ\_MARS} containing the latest sources of the Mars LMD GCM must be located in the same repository which contains \ttt{MESOSCALE} (easy to do with SVN)
     140\item GCM runs used to produce initial and boundary conditions for the mesoscale model must be done in \ttt{MESOSCALE/LMDZ.MARS.new}
     141\item \ttt{makemeso} must be used with option \ttt{-p}
     142\item the \ttt{callphys.def} file is different
     143\item modifying \ttt{datafile.h} is not necessary anymore; the corresponding settings are now made in \ttt{callphys.def}
     144\item an additional \ttt{run.def} file is needed
     145\item the number of scatterers must be given at compile time; a standard simulation uses 1 scatterer (2 are used for radiatively active water ice clouds)
     146\item in \ttt{namelist.input}, the soil model must be set to 18 levels
     147\item additional \ttt{mars} modes can be accessed (e.g. for interactive dust or the radiative effect of clouds)
     148\item if \ttt{init\_TI} is modified, \ttt{real.exe} must be run again (because of subsurface modeling)
     149\item a varying map of surface roughness~$z_0$ can be used -- or a constant value can be set with \ttt{init\_Z0} in \ttt{namelist.input} (if neither is available, the old reference value of 1~cm is used)
     150\item (versions later than late 2013) the model does not need to be recompiled if the number of tracers is changed
     151\item (experimental) cases with tracers and water cycle should use \ttt{diff\_6th\_opt = 0} for better stability
     152\end{finger}
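As an illustrative sketch of the points above (assuming, as for the native physics, that the definition files are gathered in the simulation directory), compiling and checking a new-physics setup could look like:
\begin{verbatim}
./makemeso -p                              ## compile with the new physical parameterizations
ls callphys.def run.def namelist.input     ## definition files expected in the simulation directory
\end{verbatim}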
     153
     154
     155
     156
    142157
    143158%% cas test THARSIS
    144159
    145 %% parler de diff_6th_opt = 0 obligatoire pour les cas avec traceurs
     160
     161
    146162
    147163\clearemptydoublepage
  • trunk/MESOSCALE/MANUAL/SRC/compile_exec.tex

    r493 r1182  
    120120
    121121\sk
    122 You are asked a few questions by the \ttt{makemeso} script (see the list below) then it compiles the model for you. The script outputs a text file named \ttt{last} in which your answers to the questions are stored, which allows you to re-run the script without the ``questions to the user" step through the \ttt{makemeso < last} command line. In what follows, the answers given in brackets are the ones you want to use so that you will be able to run the test case proposed in the next section.
     122You are asked a few questions by the \ttt{makemeso} script (see the list below), and then it compiles the model for you. The script writes a text file named \ttt{last} in which your answers to the questions are stored; this allows you to re-run the script without the ``questions to the user'' step through the \ttt{makemeso < last} command line.
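In practice (a minimal sketch, assuming \ttt{makemeso} is launched from the directory where it resides):
\begin{verbatim}
./makemeso           ## answer the questions; the answers are stored in the file `last'
./makemeso < last    ## recompile with the same answers, without the questions step
\end{verbatim}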
    123123
    124124\mk
     
    129129\item \textbf{number of grid points in latitude} [61]
    130130\item \textbf{number of vertical levels} [61]
    131 \item \textbf{number of tracers} [1]
     131\item \textbf{number of tracers}\footnote{The minimum number of tracers is 1, not 0. Setting it to 0 will actually make the script set the number of tracers to 1.} [1]
    132132\item \textbf{number of domains} [1]
    133133%\item[6.bis] (not the first time you use \ttt{makemeso}) a question for advanced users [press any key]
     
    135135\end{asparaenum}
    136136
     137\sk
     138The answers given in brackets above are the ones to use so that you will be able to run the test case proposed in the next section. Otherwise, before proceeding with \ttt{makemeso}, it is good practice to gather the following information:
     139\begin{itemize}
     140\item On which machine do you want to run the model? (a good practice is to compile on the same machine as the one used to run the model).
     141\item What is the horizontal resolution you want for your simulation? How many domains? How many tracers?
     142\item If parallel computations are to be employed: do you have parallel libraries installed on the machine you chose? How many processors do you want to use?
     143\end{itemize}
     144
    137145\mk
    138146A key question that often arises when using the LMD Martian Mesoscale Model is: when does the model have to be recompiled? The set of questions asked by~\ttt{makemeso} gives some hints about this. Suppose you compiled a version of the model for a given set of parameters $1$ to $6$ to run a specific simulation. If you would like to run another simulation with at least one of parameters $1$ to $6$ changed, the model needs to be recompiled\footnote{This necessary recompilation each time the number of grid points, tracers and domains is modified is imposed by the LMD physics code. The WRF dynamical core alone is more flexible.} with \ttt{makemeso} (cf. also chapter~\ref{zeparam}).
    139147
    140148\mk
    141 Note that the \ttt{makemeso -h} command lists the various options that can be used in the \ttt{makemeso} script. Most options should be used only by advanced users and some of them will be described in the following chapters. At this stage, the only option of \ttt{makemeso} which can be useful to you is \ttt{-f} which forces the model to be recompiled from scratch. If you already compiled the model succesfully, but the model fails to compile a few days later for reasons unrelated to your operations on your system or on the model file, we recommend you to use the \ttt{-f} option in \ttt{makemeso} to try to recompile the model\footnote{A more extreme solution if \ttt{makemeso -f} does not solve your problem is to remove the corresponding \ttt{your\_compdir} directory. See chapter~\ref{faq}}.
     149Note that the \ttt{makemeso -h} command lists the various options that can be used with the \ttt{makemeso} script. Most options should be used only by advanced users and some of them will be described in the following chapters. At this stage, the only option of \ttt{makemeso} which may be useful to you is \ttt{-f}, which forces the model to be recompiled from scratch (this is for instance very useful if a previous compilation ran into problems or was interrupted by the user). If you already compiled the model successfully, but the model fails to compile a few days later for reasons unrelated to your operations on your system or on the model files, we recommend using the \ttt{-f} option of \ttt{makemeso} to try to recompile the model\footnote{A more extreme solution if \ttt{makemeso -f} does not solve your problem is to remove the corresponding \ttt{your\_compdir} directory. See chapter~\ref{faq}.}.
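For instance (a sketch):
\begin{verbatim}
./makemeso -h        ## list the available options of the makemeso script
./makemeso -f        ## force the model to be recompiled from scratch
\end{verbatim}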
    142150
    143151\scriptsize
  • trunk/MESOSCALE/MANUAL/SRC/faq.tex

    r665 r1182  
    7272
    7373\sk
     74\noindent \textbf{WPS does not compile with my favorite compiler (the one I have to use for model integrations) but seems to work with another one.}
     75\begin{finger}
     76\item Go to the folder corresponding to your favorite compiler. Remove the \ttt{WPS} folder and link there the \ttt{WPS} folder obtained with the alternate compiler, as sketched below. The \ttt{runmeso} workflow will then work just fine if you select your favorite compiler.
     77\end{finger}
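A minimal sketch of this workaround (the directory names and the relative path are placeholders for your \ttt{your\_compdir} folders and should be adapted):
\begin{verbatim}
cd your_compdir_favorite_compiler
rm -rf WPS                                       ## remove the WPS folder that does not compile
ln -s ../your_compdir_other_compiler/WPS WPS     ## link the WPS folder built with the alternate compiler
\end{verbatim}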
     78
     79\sk
    7480\noindent \textbf{I think I found a bug in the model.}
    7581\begin{finger}
     
    166172\end{finger}
    167173
    168 %\mk
    169 %\section{Specific simulations}
    170 %\sk
    171 %\noindent \textbf{It seems difficult to me to find a number of horizontal grid points for parallel nested simulations that is compliant with all constraints mentioned in section~\ref{nests}.}
    172 %\begin{finger}
    173 %\item A tip to find a compliant domain size for nested simulations: for the parent domain, choose \ttt{e\_we} and \ttt{e\_sn} according to~$e_{we} = n_{proc} \times i + 1$ with~$n_{proc}$ being a multiple of~$4$ and~$2 \, i +1$ being a multiple of~$3$. For child domains, set \ttt{e\_we} and \ttt{e\_sn} according to \ttt{e\_we[child domain] = e_we[parent domains] + 4}.
    174 %\end{finger}
     174\sk
     175\noindent \textbf{The model does not seem to be able to produce outputs, although the log files indicate that the files have been written. This happens especially when I increase the number of grid points.}
     176\begin{finger}
     177\item Set the environment variable \ttt{WRFIO\_NCD\_LARGE\_FILE\_SUPPORT} to 1
     178\begin{verbatim}
     179declare -x WRFIO_NCD_LARGE_FILE_SUPPORT=1
     180\end{verbatim}
     181and recompile the model from scratch. The model will then be able to produce very large files (especially restart files).
     182\end{finger}
     183
     184\mk
     185\section{Specific simulations}
     186
     187\sk
     188\noindent \textbf{I find it difficult to choose a number of horizontal grid points for parallel nested simulations that complies with all the constraints mentioned in section~\ref{nests}.}
     189\begin{finger}
     190\item A tip to find a compliant domain size for nested simulations: for the parent domain, choose \ttt{e\_we} and \ttt{e\_sn} according to~$e_{we} = n_{proc} \times i + 1$, with~$n_{proc}$ being a multiple of~$4$ and~$2 \, i +1$ being a multiple of~$3$. For child domains, set \ttt{e\_we} and \ttt{e\_sn} according to \ttt{e\_we[child domain] = e\_we[parent domain] + 4} (a worked example is given below).
     191\end{finger}
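Worked example (derived from the rule above): with~$n_{proc} = 8$ processors and~$i = 4$ (so that~$2\,i+1 = 9$ is a multiple of~$3$), the parent domain uses \ttt{e\_we = e\_sn = 33}, and a child domain then uses \ttt{e\_we = e\_sn = 37}.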
    175192%%%%% TROP SPECIFIQUE !!!
    176193%%        -----------------------------------------------------------------------
     
    195212%%% WPS PREP_MARS can be linked between e.g. pgf and mpi, or ifort and mpifort
    196213
    197 %%% BIG FILES declare -x WRFIO_NCD_LARGE_FILE_SUPPORT=1
     214%%% RESTART: see SVN
     215
     216%%% LMD with old physics does not compile with ifort
     217
     218%%%         rel_path=               32ppd:thermal_TES/
     219%%%  -- this should be put in GEOGRID.TBL
     220
     221
     222
    198223
    199224\clearemptydoublepage
  • trunk/MESOSCALE/MANUAL/SRC/guide.tex

    r493 r1182  
    100100\item make a choice about which step to start with.
    101101\end{citemize}
     102Note that only one instance of \ttt{runmeso} must be run at a time, otherwise conflicting versions of the initial conditions (and of the simulation outputs) will be obtained. If several versions of the model need to be run, it is recommended to duplicate the \ttt{runmeso} script for each version and to modify each copy so that it points towards the correct model folder (see the sketch below).
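A minimal sketch of this duplication (the name of the copy is arbitrary; the way the model folder is referenced inside \ttt{runmeso} depends on your installation):
\begin{verbatim}
cp runmeso runmeso_otherversion
[edit runmeso_otherversion so that it points towards the folder of the other model version]
\end{verbatim}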
    102103
    103104\sk
  • trunk/MESOSCALE/MANUAL/SRC/installation.tex

    r493 r1182  
    8888\end{verbatim}
    8989
     90\begin{finger}
     91\item If you would like to use several versions of the model in separate folders, remember to change the \ttt{\$MESO} and \ttt{\$MMM} environment variables accordingly (see the sketch below).
     92\end{finger}
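For instance (a sketch; the path is a placeholder, and it is assumed here that \ttt{\$MMM} points to the \ttt{LMD\_MM\_MARS} subdirectory of \ttt{\$MESO}, as set when the model was first installed -- adapt if your definition differs):
\begin{verbatim}
declare -x MESO=/disk/user/OTHER_MESOSCALE_VERSION
declare -x MMM=$MESO/LMD_MM_MARS
\end{verbatim}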
     93
    9094\paragraph{Method 2: You were given an \ttt{svn} link \ttt{the\_link}} \emph{You must have Subversion (\ttt{svn}) installed on your system to follow this method}. Please use the name of our server repository combined with an \ttt{svn checkout} command to get the model sources\footnote{At this stage, it is essential to have registered to the WRF website (see foreword) because our server contains some part of the ARW-WRF sources.}. Please also set the environment variables \ttt{\$MESO} and \ttt{\$MMM} as detailed below. The first download of the model sources may take a while. Compared to method~$1$, this method~$2$ using \ttt{svn} allows you to easily get the latest updates and bug fixes of the LMD Martian Mesoscale Model from the development team\footnote{If you are not interested in this feature, please replace the command line featuring \ttt{svn checkout} by the command line \ttt{svn export the\_link/LMDZ.MARS the\_link/MESOSCALE} }.
    9195
     
    109113Parallel computations with the Message Passing Interface (MPI) standard are supported by the LMD Martian Mesoscale Model. If you want to use this capability, you have to add the installation of \ttt{MPICH2} or \ttt{openMPI} as an additional prerequisite. Once the installation is completed, it is required to define the environment variable \ttt{\$WHERE\_MPI} to point to your MPI \ttt{bin} directory, even if you added this directory to your \ttt{\$PATH} variable.
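For example (a sketch; replace the path with the actual \ttt{bin} directory of your MPI installation):
\begin{verbatim}
declare -x WHERE_MPI=/your_software_dir/MPI/openmpi/bin
\end{verbatim}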
    110114
    111 \begin{finger}
    112 \scriptsize
    113 \item An installation script for openMPI is available upon request (and information can be easily retrieved from the web).
    114 \item Here is a brief ``how-to" to install MPICH2, although this surely does not replace reading carefully installation notes and choosing which installation suits best your system. Please download the current stable version of the sources (e.g. we choose here an old version \ttt{mpich2-1.0.8.tar.gz} for the sake of illustration) on the MPICH2 website \url{http://www.mcs.anl.gov/research/projects/mpich2} and install the MPICH2 utilities by the following commands:
    115 \begin{verbatim}
    116 mkdir $your_software_dir/MPI ; mv mpich2-1.0.8.tar.gz $your_software_dir/MPI/ ; cd $your_software_dir/MPI
    117 tar xzvf mpich2-1.0.8.tar.gz ; cd mpich2-1.0.8
    118 ./configure --prefix=$PWD --with-device=ch3:nemesis > conf.log 2> conferr.log &
    119 make > mk.log 2> mkerr.log &
    120 declare -x WHERE_MPI=$your_software_dir/MPI/mpich2-1.0.8/bin
    121 \end{verbatim}
    122 \normalsize
    123 \end{finger}
     115%\begin{finger}
     116%\scriptsize
     117%\item An installation script for openMPI is available upon request (and information can be easily retrieved from the web).
     118%\item Here is a brief ``how-to" to install MPICH2, although this surely does not replace reading carefully installation notes and choosing which installation suits best your system. Please download the current stable version of the sources (e.g. we choose here an old version \ttt{mpich2-1.0.8.tar.gz} for the sake of illustration) on the MPICH2 website \url{http://www.mcs.anl.gov/research/projects/mpich2} and install the MPICH2 utilities by the following commands:
     119%\begin{verbatim}
     120%mkdir $your_software_dir/MPI ; mv mpich2-1.0.8.tar.gz $your_software_dir/MPI/ ; cd $your_software_dir/MPI
     121%tar xzvf mpich2-1.0.8.tar.gz ; cd mpich2-1.0.8
     122%./configure --prefix=$PWD --with-device=ch3:nemesis > conf.log 2> conferr.log &
     123%make > mk.log 2> mkerr.log &
     124%declare -x WHERE_MPI=$your_software_dir/MPI/mpich2-1.0.8/bin
     125%\end{verbatim}
     126%\normalsize
     127%\end{finger}
    124128
    125129\clearemptydoublepage
  • trunk/MESOSCALE/MANUAL/SRC/postproc.tex

    r493 r1182  
    5555
    5656\sk
    57 \subsection{General remarks}
     57%This section does not replace the need for you to develop your own plotting tools to suit your needs, which should be not too difficult.
     58The model outputs, as well as the results of \ttt{api} interpolations, are written in the netCDF format, which can be read by most software with graphical capabilities. For a quick inspection of model results (especially for checking model outputs while the model is running), we recommend using \ttt{ncview}; for simple manipulations of netCDF files (e.g. concatenation, difference, extraction, \ldots), we recommend using commands from the \ttt{nco} package (see chapter~\ref{install} for website links). Graphical routines based on \ttt{idl}, \ttt{ferret} and \ttt{grads} can be made available upon request (as is, i.e. undocumented yet commented scripts). Successful reading/plotting of the LMD Martian Mesoscale Model outputs with \ttt{matlab}, \ttt{octave} and \ttt{idv} has also been reported. It is possible to import the model's outputs into a Geographical Information System (GIS) such as \ttt{arcgis}\footnote{\ttt{idl}, \ttt{matlab} and \ttt{arcgis} are neither open-source nor free.}. Since 2012, we have developed our own \ttt{Python}-based tool named \ttt{PLANETOPLOT}. More information can be found here: \url{http://www.lmd.jussieu.fr/~aslmd/planetoplot}.
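For example (a sketch; the file names are those of typical mesoscale outputs and should be adapted to your own run):
\begin{verbatim}
ncview wrfout_d01_2024-01-17_02:00:00      ## quick inspection of one output file
ncrcat wrfout_d01_* merged_outputs.nc      ## concatenate a series of output files with nco
\end{verbatim}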
    5859
    59 \sk
    60 This section does not replace the need for you to develop your own plotting tools to suit your needs, which should be not too difficult. The model outputs, as well as the results of \ttt{api} interpolations, are written using the netCDF format which can be read by most software with graphical capabilities. For a quick inspection of model results (especially for checking model outputs while the model is running), we recommend using \ttt{ncview}; for simple manipulations of netCDF files (e.g. concatenation, difference, extraction, \ldots), we recommend using commands from the \ttt{nco} package (see chapter~\ref{install} for website links). Graphical routines based on \ttt{idl}\footnote{Most graphics in the published papers to date about the LMD Martian Mesoscale Model were made with this software}, \ttt{ferret} and \ttt{grads} can be made available upon request (as is, i.e. undocumented yet commented scripts). Successful reading/plotting of the LMD Martian Mesoscale Model outputs on \ttt{matlab}, \ttt{octave}, \ttt{idv} are also reported. It is possible to import the model's outputs to Geographical Information System (GIS) such as \ttt{arcgis}\footnote{\ttt{idl}, \ttt{matlab} and \ttt{arcgis} are neither open-source nor free.}.
    61 
    62 \sk
    63 \subsection{Python scripts}
    64 
    65 \sk
    66 Powerful scripts based on \ttt{python+numpy+matplotlib} have been developed to obtain plots from the mesoscale model outputs. All figures in this user manual are based on the script \ttt{pp.py}. This script can be obtained with the commands: \ttt{cd \$MESO ; cd .. ; svn update UTIL ; cd UTIL/PYTHON}. It is required that \ttt{python} and numerical/graphical librairies (\ttt{numpy}, \ttt{scipy}, \ttt{matplotlib}, \ttt{basemap}, \ttt{netcdf}) are installed on your system. Perhaps the simplest way to do so is to install the user-friendly complete python distribution \ttt{epd} (cf. link in chapter~\ref{install} and readme file \ttt{UTIL/PYTHON/README.INSTALL}).
    67 
    68 \sk
    69 One of the advantages of an approach using \ttt{python}, apart from its open-source philosophy and the abundant online documentation, is that it allows, in a common framework, for scripting with various options, integrating Fortran routines, manipulating arrays, making plots with various map projections. This is exemplified by the \ttt{pp.py} script. It can both perform interpolation with \ttt{api} for the level requested by the user then generate a map, all that in one simple command line. For instance, Figures~\ref{arsia} in chapter~\ref{compile} has been generated by the following two commands\footnote{The first plot can also be obtained by the command \ttt{domain.py -f name\_of\_file}}:
    70 
    71 \begin{verbatim}
    72 pp.py -f wrfout_d01_2024-01-17_02:00:00 -p ortho -b vishires --title ""
    73 pp.py -f wrfout_d01_2024-01-17_02:00:00 -i 4 -l 0.01 -v HGT -W -s 2 \
    74       --time 1 --axtime lt -m -1500. -M 20000. --div 20 -c nobar -z 25
    75 \end{verbatim}
    76 
    77 \sk
    78 Many options are implemented in our \ttt{pp.py} script. The information on the existing options to~\ttt{pp.py} can be obtained by typing \ttt{pp.py -h} (cf. next page). Examples on how to use the \ttt{pp.py} script can be found in \ttt{UTIL/PYTHON/README.PP}. The script can also be edited to suit your needs if the desired option does not exist.
    79 
    80 \begin{finger}
    81 \item Please ensure that you have the rights to execute \ttt{pp.py} (otherwise use the \ttt{chmod} command). It is also necessary to set the following environment variables to ensure the command \ttt{pp.py} would execute in any working directory
    82 \begin{verbatim}
    83 PYTHONPATH=$MESO/../UTIL/PYTHON/
    84 export PYTHONPATH
    85 PATH=$PYTHONPATH:$PATH
    86 \end{verbatim}
    87 \item The option \ttt{-i} in \ttt{pp.py} make use of the Fortran routines \ttt{api.F90} and \ttt{time.F}. The routines have to be converted to \ttt{python} commands using \ttt{f2py}. Please execute the script amongst \ttt{api\_g95.sh}, \ttt{api\_ifort.sh}, \ttt{api\_pgf90.sh} which corresponds to the Fortran compiler installed on your system. Check for errors/warnings in the log files and ensure that the two files \ttt{api.so} and \ttt{timestuff.so} are generated.
    88 \end{finger}
    89 
    90 \newpage
    91 
    92 \scriptsize
    93 \codesource{winds.py.help}
    94 \normalsize
     60%\sk
     61%\subsection{Python scripts}
     62%
     63%\sk
     64%Powerful scripts based on \ttt{python+numpy+matplotlib} have been developed to obtain plots from the mesoscale model outputs. All figures in this user manual are based on the script \ttt{pp.py}. This script can be obtained with the commands: \ttt{cd \$MESO ; cd .. ; svn update UTIL ; cd UTIL/PYTHON}. It is required that \ttt{python} and numerical/graphical librairies (\ttt{numpy}, \ttt{scipy}, \ttt{matplotlib}, \ttt{basemap}, \ttt{netcdf}) are installed on your system. Perhaps the simplest way to do so is to install the user-friendly complete python distribution \ttt{epd} (cf. link in chapter~\ref{install} and readme file \ttt{UTIL/PYTHON/README.INSTALL}).
     65%
     66%\sk
     67%One of the advantages of an approach using \ttt{python}, apart from its open-source philosophy and the abundant online documentation, is that it allows, in a common framework, for scripting with various options, integrating Fortran routines, manipulating arrays, making plots with various map projections. This is exemplified by the \ttt{pp.py} script. It can both perform interpolation with \ttt{api} for the level requested by the user then generate a map, all that in one simple command line. For instance, Figures~\ref{arsia} in chapter~\ref{compile} has been generated by the following two commands\footnote{The first plot can also be obtained by the command \ttt{domain.py -f name\_of\_file}}:
     68%
     69%\begin{verbatim}
     70%pp.py -f wrfout_d01_2024-01-17_02:00:00 -p ortho -b vishires --title ""
     71%pp.py -f wrfout_d01_2024-01-17_02:00:00 -i 4 -l 0.01 -v HGT -W -s 2 \
     72%      --time 1 --axtime lt -m -1500. -M 20000. --div 20 -c nobar -z 25
     73%\end{verbatim}
     74%
     75%\sk
     76%Many options are implemented in our \ttt{pp.py} script. The information on the existing options to~\ttt{pp.py} can be obtained by typing \ttt{pp.py -h} (cf. next page). Examples on how to use the \ttt{pp.py} script can be found in \ttt{UTIL/PYTHON/README.PP}. The script can also be edited to suit your needs if the desired option does not exist.
     77%
     78%\begin{finger}
     79%\item Please ensure that you have the rights to execute \ttt{pp.py} (otherwise use the \ttt{chmod} command). It is also necessary to set the following environment variables to ensure the command \ttt{pp.py} would execute in any working directory
     80%\begin{verbatim}
     81%PYTHONPATH=$MESO/../UTIL/PYTHON/
     82%export PYTHONPATH
     83%PATH=$PYTHONPATH:$PATH
     84%\end{verbatim}
     85%\item The option \ttt{-i} in \ttt{pp.py} make use of the Fortran routines \ttt{api.F90} and \ttt{time.F}. The routines have to be converted to \ttt{python} commands using \ttt{f2py}. Please execute the script amongst \ttt{api\_g95.sh}, \ttt{api\_ifort.sh}, \ttt{api\_pgf90.sh} which corresponds to the Fortran compiler installed on your system. Check for errors/warnings in the log files and ensure that the two files \ttt{api.so} and \ttt{timestuff.so} are generated.
     86%\end{finger}
     87%
     88%\newpage
     89%
     90%\scriptsize
     91%\codesource{winds.py.help}
     92%\normalsize
     93%
     94%%% IMPORTANT IMPORTANT
     95%%% now python inimeso.py in UTIL/PYTHON
     96%%% mettre a jour le help
    9597
    9698\clearemptydoublepage
  • trunk/MESOSCALE/MANUAL/SRC/preproc.tex

    r493 r1182  
    4444ls -lt create_readmeteo.exe readmeteo.exe
    4545cd ..
    46 cd WPS/   
     46cd WPS/
     47clean
    4748./configure     ## select your compiler + 'NO GRIB2' option
    4849./compile
     
    5354Apart from the executables just compiled, the preprocessing utilities include \ttt{real.exe}, which was compiled by the \ttt{makemeso} script along with the mesoscale model executable \ttt{wrf.exe}\footnote{Even though the name of the executable reads e.g. \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, such a program is not related to the specific \ttt{makemeso} parameters -- contrary to the \ttt{wrf.exe} executable. We just found that renaming the (possibly similar if the model sources were not modified) \ttt{real.exe} executable was a practical way to avoid confusion between executables compiled at different moments.} (cf. chapter~\ref{compile}). \ttt{real.exe} should be copied or linked in the simulation directory (e.g. \ttt{TESTCASE} for the Arsia Mons test case) so that it is at the same level as \ttt{namelist.input}.
    5455
    55 \sk
    56 \subsection{Preparing input static data}
     56\begin{verbatim}
     57cp your_compdir/real_*.exe your_simulation_directory/
     58cp your_compdir/wrf_*.exe your_simulation_directory/
     59\end{verbatim}
     60
     61\sk
     62\subsection{Preparing input static data}\label{wpsgeog}
    5763
    5864\sk
     
    8288cd $MESO/LMDZ.MARS
    8389[edit $MESO/LMDZ.MARS/libf/phymars/datafile.h & fill in the absolute path to $MMM/WPS_GEOG]
     90[edit compile if needed]
    8491./compile
    8592\end{verbatim}
     
    8794\sk
    8895The other necessary operation to prepare the LMD-MGCM for step~1 is to store a set of initial states for the LMD-MGCM to start from, based on previous typical LMD-MGCM runs having reached equilibrium after ten years of integration. A reference database\footnote{If another database is used, \ttt{compile} must be edited; the default is~$64 \times 48 \times 32$ GCM runs with~$2$ tracers.} can be found in the following online archive~\url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz}. This archive must be extracted somewhere on a disk accessible from the system on which you plan to run the mesoscale model. A link named~\ttt{startbase} towards the \ttt{STARTBASE\_64\_48\_32\_t2} directory must then be created in the directory~\ttt{\$MESO/LMDZ.MARS/myGCM}.
     96
     97\begin{verbatim}
     98ln -sf where_is_your_startbase/STARTBASE_64_48_32_t2 startbase
     99\end{verbatim}
     100
     101\sk
     102It is important to check that the chosen reference database 1. spans the season desired for the mesoscale simulation; 2. includes the right number of tracers and vertical extent; and 3. uses GCM parameterizations that are close to the ones employed in the subsequent mesoscale simulations.
    89103
    90104\sk
     
    121135\begin{verbatim}
    122136cd $MESO/LMDZ.MARS/myGCM
     137[edit run.def, in particular to modify nday]
    123138./launch_gcm    ## answer: your desired starting sol for the simulations
    124139\end{verbatim}
     
    157172\end{verbatim}
    158173
    159 The result of \ttt{geogrid.exe} -- and thus the definition of the mesoscale domain -- can be checked in the NETCDF file \ttt{geo\_em.d01.nc} (using for instance \ttt{ncview}, or your favorite graphical interface for netCDF files, or python-based scripts as in section~\ref{postproc}). If you are unhappy with the results or you want to change the location of the mesoscale domain on the planet, the horizontal resolution, the number of grid points \ldots, please modify the parameter file \ttt{namelist.wps}, content thereof is reproduced/commented on the next page\footnote{You may find the corresponding file in \ttt{\$MMM/SIMU/namelist.wps\_example}.}, and execute again \ttt{geogrid.exe}.
     174The result of \ttt{geogrid.exe} -- and thus the definition of the mesoscale domain -- can be checked in the netCDF file \ttt{geo\_em.d01.nc}, e.g. through the topographical fields \ttt{HGT\_M}, \ttt{HGT\_U}, \ttt{HGT\_V} (using for instance \ttt{ncview}, your favorite graphical interface for netCDF files, or the python-based scripts of section~\ref{postproc}). If you are unhappy with the results or want to change the location of the mesoscale domain on the planet, the horizontal resolution, the number of grid points \ldots, please modify the parameter file \ttt{namelist.wps}, the content of which is reproduced and commented on the next page\footnote{You may find the corresponding file in \ttt{\$MMM/SIMU/namelist.wps\_example}.}, and execute \ttt{geogrid.exe} again.
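For instance, a quick check of the domain can be done with \ttt{ncview} (a sketch):
\begin{verbatim}
ncview geo_em.d01.nc     ## browse e.g. HGT_M to check the domain topography
\end{verbatim}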
    160175
    161176\begin{finger}
     
    186201\item \ttt{'32ppd\_HRalb'}: fine-resolution albedo, coarse-resolution thermal inertia, 32ppd topography.
    187202\end{citemize}
     203The corresponding dataset must have been built in the \ttt{WPS\_GEOG} folder previously (see section~\ref{wpsgeog}).
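For instance, picking this dataset amounts to setting \ttt{geog\_data\_res = '32ppd\_HRalb'} in the \ttt{\&geogrid} section of \ttt{namelist.wps} (assuming the standard WPS convention is kept in the Martian version of the preprocessor; check against the \ttt{namelist.wps\_example} file mentioned above), then running \ttt{geogrid.exe} again.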
    188204
    189205\sk