Timestamp:
Jul 14, 2011, 3:37:47 AM
Author:
aslmd
Message:

MESOSCALE: user manual. finished a much corrected version for preprocessing chapter.

Location:
trunk/MESOSCALE_DEV/MANUAL/SRC
Files:
4 added
8 edited

  • trunk/MESOSCALE_DEV/MANUAL/SRC/compile_exec.tex

    r220 r223  
    1111
    1212\sk
    13 Any simulation that will be carried out with the LMD Martian Mesoscale Model comprises of the five following steps. More details will be given on the various steps when needed, but it is important at this stage to have this structure in mind.
     13Any simulation carried out with the LMD Martian Mesoscale Model comprises the following five steps. More details are given on each step in the following chapters, but it is important at this stage to have this structure in mind.
    1414
    1515\sk
     
    2929
    3030\sk
    31 Please take the time to check the contents of the \ttt{LMD\_MM\_MARS} directories\footnote{If you used method~$2$, you will probably notice that other directories than~\ttt{LMD\_MM\_MARS} are present in \ttt{\$LMDMOD}, but those are not important at this stage.} and sub-directories through the following command lines:
     31Please take the time to check the contents of the \ttt{LMD\_MM\_MARS} directories\footnote{If you used method~$2$, you will probably notice that directories other than~\ttt{LMD\_MM\_MARS} are present in \ttt{\$MOD}, but those are not important at this stage.} and sub-directories through the following command lines:
    3232\begin{verbatim}
    3333ls $MMM ; ls $MMM/*
     
    8787\item ask the user about compilation settings;
    8888\item retrieve some additional information about the system;
    89 \item create a directory \ttt{\$LMDMOD/LMD\_MM\_MARS/DIRCOMP} which name depends\footnote{For example, a \ttt{DIRCOMP} directory named \ttt{g95\_32\_single} is created if the user requested a \ttt{g95} compilation of the code for single-domain simulations on a 32bits machine.} on the kind of compiler you are using, on whether your system is 32 or 64 bits, on whether sequential or parallel computations are planned and on the kind of simulations (idealized or real-case);
    90 \item generate with \ttt{copy\_model} a directory \ttt{DIRCOMP/WRFV2} containing links to \ttt{SRC/WRFV2} sources\footnote{A note to developers: this method ensures that any change to the model sources would be propagated to all the different \ttt{DIRCOMP} installation folders.};
     89\item create a directory \ttt{\$MOD/LMD\_MM\_MARS/your\_compdir} whose name depends\footnote{For example, a \ttt{your\_compdir} directory named \ttt{g95\_32\_single} is created if the user requested a \ttt{g95} compilation of the code for single-domain simulations on a 32-bit machine.} on the kind of compiler you are using, on whether your system is 32 or 64 bits, on whether sequential or parallel computations are planned, and on the kind of simulations (idealized or real-case);
     90\item generate with \ttt{copy\_model} a directory \ttt{your\_compdir/WRFV2} containing links to the \ttt{SRC/WRFV2} sources\footnote{A note to developers: this method ensures that any change to the model sources is propagated to all the different \ttt{your\_compdir} installation folders.};
    9191\item execute the WRF \ttt{configure} script with the correct option;
    9292\item tweak the resulting \ttt{configure.wrf} file to include a link towards the Martian physics and various patches and specific compilation options;
     
    138138
    139139\mk
    140 Note that the \ttt{makemeso -h} command lists the various options that can be used in the \ttt{makemeso} script. Most options should be used only by advanced users and some of them will be described in the following chapters. At this stage, the only option of \ttt{makemeso} which can be useful to you is \ttt{-f} which forces the model to be recompiled from scratch. If you already compiled the model succesfully, but the model fails to compile a few days later for reasons unrelated to your operations on your system or on the model file, we recommend you to use the \ttt{-f} option in \ttt{makemeso} to try top recompile the model\footnote{A more extreme solution if \ttt{makemeso -f} does not solve your problem is to remove the corresponding \ttt{DIRCOMP} directory.}.
     140Note that the \ttt{makemeso -h} command lists the various options that can be used in the \ttt{makemeso} script. Most options should be used only by advanced users and some of them will be described in the following chapters. At this stage, the only option of \ttt{makemeso} which can be useful to you is \ttt{-f}, which forces the model to be recompiled from scratch. If you already compiled the model successfully, but the model fails to compile a few days later for reasons unrelated to your operations on your system or on the model files, we recommend using the \ttt{-f} option in \ttt{makemeso} to try to recompile the model\footnote{A more extreme solution if \ttt{makemeso -f} does not solve your problem is to remove the corresponding \ttt{your\_compdir} directory.}.
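For instance, to force a full recompilation from scratch (a minimal sketch; run it from the directory where \ttt{makemeso} resides and answer the same questions as during the first compilation):
\begin{verbatim}
./makemeso -f
\end{verbatim}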
    141141
    142142\scriptsize
     
    149149
    150150\sk
    151 We assume here that you had successfully compiled the model with \ttt{makemeso} at the end of the previous section and you had based your answers to the \ttt{makemeso} script on the indications in brackets. You should then find in the \ttt{DIRCOMP} directory one \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe} executable and one \ttt{wrf\_x61\_y61\_z61\_d1\_t1\_p1.exe} executable.
     151We assume here that you have successfully compiled the model with \ttt{makemeso} at the end of the previous section and that you based your answers to the \ttt{makemeso} script on the indications in brackets. You should then find in the \ttt{your\_compdir} directory one \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe} executable and one \ttt{wrf\_x61\_y61\_z61\_d1\_t1\_p1.exe} executable.
    152152
    153153\sk
     
    159159%
    160160\begin{verbatim}
    161 cp LMD_MM_MARS_TESTCASE.tar.gz $LMDMOD/LMD_MM_MARS/
     161cp LMD_MM_MARS_TESTCASE.tar.gz $MOD/LMD_MM_MARS/
    162162tar xzvf LMD_MM_MARS_TESTCASE.tar.gz
    163163cd TESTCASE
  • trunk/MESOSCALE_DEV/MANUAL/SRC/foreword.tex

    r209 r223  
    1 \chapter{Foreword}
     1\chapter*{Foreword}
    22
    33\vk
    4 Welcome! This manual describes how to use the Laboratoire de M\'et\'eorologie Dynamique (LMD) Martian Mesoscale Model. Many thanks for looking forward to using this model. Developping the LMD Martian Mesoscale Model required countless hours of hard work! A significant part of the model development and validation have been funded by ESA and CNES which are acknowledged here.
     4\paragraph{Welcome!} This manual describes how to use the Laboratoire de M\'et\'eorologie Dynamique (LMD) Martian Mesoscale Model. Many thanks for your interest in using this model, whose development required countless hours of hard work! A significant part of the model development and validation has been funded by ESA and CNES, which are acknowledged here.
    55
    6 \mk
    7 The main contact to reach at LMD to become an user of the model is Aymeric SPIGA (main developper, \href{mailto:aymeric.spiga@upmc.fr}{\nolinkurl{aymeric.spiga@upmc.fr}}). Alternative contacts at LMD for mesoscale modeling inquiries are Ehouarn MILLOUR~\url{ehouarn.millour@lmd.jussieu.fr} or Fran\c cois FORGET~\url{francois.forget@lmd.jussieu.fr}. The model is distributed freely to academic partners in the frame of scientific collaborations, but not to industrial and commercial partners. At any event, we are open both to new scientific collaboration projects and contractual proposals.
     6\paragraph{Contact} The main contact to reach at LMD to become a user of the model is Aymeric SPIGA (main developer, \href{mailto:aymeric.spiga@upmc.fr}{\nolinkurl{aymeric.spiga@upmc.fr}}). Alternative contacts at LMD for mesoscale modeling inquiries are Ehouarn MILLOUR~\url{ehouarn.millour@lmd.jussieu.fr} or Fran\c cois FORGET~\url{francois.forget@lmd.jussieu.fr}. We are open to questions and suggestions on new scientific collaborations, teaching/outreach actions or contractual proposals.
    87
    9 \mk
    10 [To our academic partners] Please cite the reference paper \textit{Spiga and Forget} [2009]\nocite{Spig:09} if you'd like to refer to the LMD Martian Mesoscale Model in one of your publication. If your paper makes use of specific simulations carried out with the LMD Martian Mesoscale Model, please consider including A. Spiga as a co-author of your work and asking for help with writing the part related to mesoscale modeling. If you have any idea of specific simulations and wonder if it is ever possible to perform those with the LMD Martian Mesoscale Model, please do not hesitate to ask! If your study requires a significant work on a peculiar Martian physical parameterization, please do not hesitate to tell us about it and we would determine additional participants in the LMD team.
     8\paragraph{Copyright (LMD)} The LMD Martian Mesoscale Model sources are made available on the condition that we make no representations or warranties regarding the reliability or validity of the model predictions nor the use to which such model predictions should be put, disclaim any and all responsibility for any errors or inaccuracies in the model predictions and bear no responsibility for any use made of these model predictions by any party. Scientific use of LMD Martian Mesoscale Model simulations is freely allowed provided that the reference paper \textit{Spiga and Forget} [2009]\nocite{Spig:09} is correctly quoted in all publications and that we are kept informed of usage and developments. If your paper makes use of specific simulations carried out with the LMD Martian Mesoscale Model, please consider including Aymeric SPIGA as a co-author of your work and asking, if needed, for help with writing the part related to mesoscale modeling. If your study requires additional work on a specific Martian physical parameterization, please consider including other members of the LMD team in addition to Aymeric SPIGA. The LMD Martian Mesoscale Model may not be put to any commercial use without specific authorization.
    119
    12 \mk
    13 Part of the LMD Martian Mesoscale Model is based on the terrestrial model WRF which is in the public domain. If you are an user of the LMD Martian Mesoscale Model, you are therefore an user of the WRF model. Please take a minute to fill in the WRF registration form so that the WRF development team knows about the people using their model: \url{http://www.mmm.ucar.edu/wrf/users/download/wrf-regist.php}. \noindent \scriptsize \emph{WRF was developed at the National Center for Atmospheric Research (NCAR) which is operated by the University Corporation for Atmospheric Research (UCAR). NCAR and UCAR make no proprietary claims, either statutory or otherwise, to this version and release of WRF and consider WRF to be in the public domain for use by any person or entity for any purpose without any fee or charge. UCAR requests that any WRF user include this notice on any partial or full copies of WRF. WRF is provided on an "AS IS" basis and any warranties, either express or implied, including but not limited to implied warranties of non-infringement, originality, merchantability and fitness for a particular purpose, are disclaimed. In no event shall UCAR be liable for any damages, whatsoever, whether direct, indirect, consequential or special, that arise out of or in connection with the access, use or performance of WRF, including infringement actions. WRF is a registered trademark of the University Corporation for Atmospheric Research (UCAR).} \normalsize
     10\paragraph{Copyright (WRF)} Part of the LMD Martian Mesoscale Model is based on the terrestrial model WRF which is in the public domain. If you are a user of the LMD Martian Mesoscale Model, you are therefore a user of the WRF model. Please take a minute to fill in the WRF registration form so that the WRF development team knows about the people using their model: \url{http://www.mmm.ucar.edu/wrf/users/download/wrf-regist.php}. \noindent \scriptsize \emph{WRF was developed at the National Center for Atmospheric Research (NCAR) which is operated by the University Corporation for Atmospheric Research (UCAR). NCAR and UCAR make no proprietary claims, either statutory or otherwise, to this version and release of WRF and consider WRF to be in the public domain for use by any person or entity for any purpose without any fee or charge. UCAR requests that any WRF user include this notice on any partial or full copies of WRF. WRF is provided on an "AS IS" basis and any warranties, either express or implied, including but not limited to implied warranties of non-infringement, originality, merchantability and fitness for a particular purpose, are disclaimed. In no event shall UCAR be liable for any damages, whatsoever, whether direct, indirect, consequential or special, that arise out of or in connection with the access, use or performance of WRF, including infringement actions. WRF is a registered trademark of the University Corporation for Atmospheric Research (UCAR).} \normalsize
    1411
    1512\clearemptydoublepage
  • trunk/MESOSCALE_DEV/MANUAL/SRC/installation.tex

    r220 r223  
    1616\item your computer is connected to the internet;
    1717\item you have~\ttt{200 Mo} free disk space available;
    18 \item your OS is Linux\footnote{The model was also successfully compiled on MacOSX; ``howto" information is available upon request but could have become obsolete on recent versions of Apple hardware and software. It is probably possible to compile the model on Windows using Cygwin but this has not implemented nor tested. You are kindly advised to install a Linux distribution on your computer (e.g. Ubuntu, Debian, Fedora, ...).} with a decent set of basic commmands (\ttt{sed}, \ttt{awk}, \ldots);
     18\item your OS is Linux\footnote{The model was also successfully compiled on MacOSX; ``howto" information is available upon request but could have become obsolete on recent versions of Apple hardware and software. It is probably possible to compile the model on Windows using Cygwin but this has not been implemented nor tested. You are kindly advised to install a Linux distribution on your computer (e.g. Ubuntu, Debian, Fedora, ...).} with a decent set of basic commands (\ttt{sed}, \ttt{awk}, \ldots);
    1919\item at least one of the following Fortran compilers is installed on your computer
    2020\begin{itemize}
     
    4545
    4646\sk
    47 Parallel computations with the Message Passing Interface (MPI) standard are supported by the LMD Martian Mesoscale Model. If you want to use this capability, you would have to add the installation of MPICH2 as a additional prerequisite. Once the installation is completed, it is required to define the environment variable \ttt{\$WHERE\_MPI} to point in your \ttt{mpich} \ttt{bin} directory, even if you added the \ttt{\$LMDMOD/MPI/mpich2-1.0.8/bin} directory to your \ttt{\$PATH} variable.
    48 
    49 \begin{finger}
    50 \item \scriptsize Here is a brief ``how-to" to install MPICH2, although this surely does not replace reading carefully installation notes and choosing what installation suits best your system. Please download the current stable version of the sources (e.g. we choose here an old version \ttt{mpich2-1.0.8.tar.gz} to illustrate the commands) on the MPICH2 website \url{http://www.mcs.anl.gov/research/projects/mpich2} and install the MPICH2 utilities by the following commands:
    51 \begin{verbatim}
    52 mkdir $LMDMOD/MPI
    53 mv mpich2-1.0.8.tar.gz $LMDMOD/MPI
    54 cd $LMDMOD/MPI
    55 tar xzvf mpich2-1.0.8.tar.gz
    56 cd mpich2-1.0.8
    57 ./configure --prefix=$PWD --with-device=ch3:nemesis > conf.log 2> conferr.log &
    58 # please wait...
    59 make > mk.log 2> mkerr.log &
    60 declare -x WHERE_MPI=$LMDMOD/MPI/mpich2-1.0.8/bin
    61 \end{verbatim}
    62 \normalsize
    63 \end{finger}
    64 
    65 \sk
    66 \subsection{Compiling the terrestrial WRF model}
     47\subsection{Compiling the terrestrial WRF model}\label{terrestrial}
    6748
    6849\sk The LMD Martian Mesoscale Model is based on the terrestrial NCEP/NCAR ARW-WRF Mesoscale Model. As a first step towards the compilation of the Martian version, we advise you to check that the terrestrial model compiles on your computer with either \ttt{g95} or \ttt{pgf90} or \ttt{ifort}. On the ARW-WRF website \url{http://www.mmm.ucar.edu/wrf/users/download/get\_source.html}, you will be allowed to freely download the model after a quick registration process (click on ``New users"). Make sure to download the version 2.2 of the WRF model and copy the \ttt{WRFV2.2.TAR.gz} archive to your current working directory. Then please extract the model sources and configure the compilation process:
     
    8263
    8364\sk
    84 If the compilation is successful, the file \ttt{log\_error} should be empty or only reporting few warnings). In the \ttt{main} folder two executables \ttt{ideal.exe} and \ttt{run.exe} should be found and allow you to run the test simulation:
     65If the compilation is successful, the file \ttt{log\_error} should be empty or should only report a few warnings. In the \ttt{main} folder two executables \ttt{ideal.exe} and \ttt{run.exe} should be found and allow you to run\footnote{If you compiled the model with \ttt{g95}, \ttt{ideal.exe} will probably complain about an error reading the namelist. Please move the line \ttt{non\_hydrostatic} below the line \ttt{v\_sca\_adv\_order} in the \ttt{namelist.input} file to solve the problem.} the test simulation:
    8566\begin{verbatim}
    8667cd test/em_hill2d_x
     
    9071
    9172\sk
    92 During the simulation, the time taken by the computer to perform integrations at each dynamical timestep  is displayed in the standard output. The simulation should end with a message \ttt{SUCCESS COMPLETE WRF}. The model results are stored in a \ttt{wrfout} netCDF data file you might like to browse with a \ttt{NETCDF}-compliant software such as \ttt{ncview}, or read with your favorite graphical software.
    93 %
    94 \begin{finger} \item If you compiled the model with \ttt{g95}, \ttt{ideal.exe} will probably complain about an error reading the namelist. Please move the line \ttt{non\_hydrostatic} below the line \ttt{v\_sca\_adv\_order} in the \ttt{namelist.input} file to solve the problem. \end{finger}
     73During the simulation, the time taken by the computer to perform integrations at each dynamical timestep is displayed in the standard output. The simulation should end with a message \ttt{SUCCESS COMPLETE WRF}. The model results are stored in a \ttt{wrfout} netCDF data file you might like to browse with \ttt{NETCDF}-compliant software such as \ttt{ncview}, or read with your favorite graphical software. Once you have checked that the terrestrial WRF model compiles and runs well on your system, you can delete all files related to the operations done in this section~\ref{terrestrial}.
    9574
    9675\mk
    9776\section{Main installation of the model sources}
    9877
    99 \sk
    100 \subsection{Method 1: You were given a \ttt{LMD\_MM\_MARS.tar.gz} archive}
    101 
    102 \sk
    103 Please set the environment variable \ttt{\$LMDMOD} to point at the directory where you will install the model and define the environment variable \ttt{\$MMM} as \ttt{\$LMDMOD/LMD\_MM\_MARS}. Copy the \ttt{LMD\_MM\_MARS.tar.gz} file in the \ttt{\$LMDMOD} directory and extract the files. Then execute the \ttt{prepare} script that would do some necessary installation tasks\footnote{Deflate the various compressed archives contained into \ttt{LMD\_MM\_MARS}, download the ARW-WRF sources from the web, apply a (significant) ``Martian patch" to these sources and build the structure of your \ttt{LMD\_MM\_MARS} directory} for you:
     78\paragraph{Method 1: You were given a \ttt{LMD\_MM\_MARS.tar.gz} archive} Please set the environment variable \ttt{\$MOD} to point at the directory where you will install the model and define the environment variable \ttt{\$MMM} as \ttt{\$MOD/LMD\_MM\_MARS}. Copy the \ttt{LMD\_MM\_MARS.tar.gz} file into the \ttt{\$MOD} directory and extract the files. Then execute the \ttt{prepare} script, which performs all installation tasks\footnote{It deflates the various compressed archives contained in \ttt{LMD\_MM\_MARS}, downloads the ARW-WRF sources from the web, applies a (significant) ``Martian patch" to these sources and builds the structure of your \ttt{LMD\_MM\_MARS} directory.} for you:
    10479%
    10580\begin{verbatim}       
    106 declare -x LMDMOD=/disk/user/MODELS
    107 declare -x MMM=$LMDMOD/LMD_MM_MARS
    108 cp LMD_MM_MARS.tar.gz $LMDMOD
    109 cd $LMDMOD
     81declare -x MOD=/disk/user/MODELS
     82declare -x MMM=$MOD/LMD_MM_MARS
     83cp LMD_MM_MARS.tar.gz $MOD
     84cd $MOD
    11085tar xzvf LMD_MM_MARS.tar.gz
    111 cd $LMDMOD/LMD_MM_MARS
     86cd $MOD/LMD_MM_MARS
    11287./SRC/SCRIPTS/prepare  ## or simply ./prepare if the script is in LMD_MM_MARS
    11388\end{verbatim}
    11489
    115 \sk
    116 \subsection{Method 2: You were given a \ttt{svn} link \ttt{the\_link}}
    117 
    118 \sk
    119 \emph{You must have Subversion (\ttt{svn}) installed on your system to follow this method}. Please use the name of our server repository combined to an \ttt{svn checkout} command to get the model sources\footnote{At this stage, it is essential to have registered to the WRF website (see foreword) because our server contains some part of the ARW-WRF sources.}. Please also set the environment variable \ttt{\$LMDMOD} and \ttt{\$MMM}:
     90\paragraph{Method 2: You were given a \ttt{svn} link \ttt{the\_link}} \emph{You must have Subversion (\ttt{svn}) installed on your system to follow this method}. Please use the name of our server repository combined with an \ttt{svn checkout} command to get the model sources\footnote{At this stage, it is essential to have registered on the WRF website (see foreword) because our server contains some parts of the ARW-WRF sources.}. Please also set the environment variables \ttt{\$MOD} and \ttt{\$MMM}. The first download of the model sources can be somewhat long. Compared to method~$1$, this method~$2$ using \ttt{svn} allows you to easily get the latest updates and bug fixes made to the LMD Martian Mesoscale Model by the development team\footnote{If you are not interested in this feature, please replace the command line featuring \ttt{svn checkout} by the command line \ttt{svn export the\_link/LMDZ.MARS the\_link/MESOSCALE}.}.
    12091
    12192\begin{verbatim}
     
    12495svn update LMDZ.MARS MESOSCALE
    12596cd MESOSCALE
    126 declare -x LMDMOD=$PWD
    127 declare -x MMM=$LMDMOD/LMD_MM_MARS
    128 \end{verbatim}
    129 
    130 \sk
    131 The first download of the model sources could be a bit long. Compared to method~$1$, this method~$2$ using \ttt{svn} would allow you to easily get the latest updates and bug fixes done on the LMD Martian Mesoscale Model by the development team\footnote{If you are not interested by this feature, please replace the command line featuring \ttt{svn checkout} by this command line \ttt{svn export the\_link/LMDZ.MARS the\_link/MESOSCALE} }:
    132 
    133 \begin{verbatim}
     97declare -x MOD=$PWD
     98declare -x MMM=$MOD/LMD_MM_MARS
     99## to get latest updates later on
    134100cd the_name_of_your_local_destination_folder
    135101svn update LMDZ.MARS MESOSCALE
     
    137103\end{verbatim}
    138104
     105\mk
     106\section{Parallel computations (optional)}
     107
     108\sk
     109Parallel computations with the Message Passing Interface (MPI) standard are supported by the LMD Martian Mesoscale Model. If you want to use this capability, you have to add the installation of MPICH2 as an additional prerequisite. Once the installation is completed, it is required to define the environment variable \ttt{\$WHERE\_MPI} to point to your \ttt{mpich} \ttt{bin} directory, even if you added the \ttt{\$MOD/MPI/mpich2-1.0.8/bin} directory to your \ttt{\$PATH} variable.
     110
     111\begin{finger}
     112\item \scriptsize Here is a brief ``how-to" to install MPICH2, although this surely does not replace reading the installation notes carefully and choosing the installation that best suits your system. Please download the current stable version of the sources (e.g. we choose here an old version \ttt{mpich2-1.0.8.tar.gz} to illustrate the commands) on the MPICH2 website \url{http://www.mcs.anl.gov/research/projects/mpich2} and install the MPICH2 utilities with the following commands:
     113\begin{verbatim}
     114mkdir $MOD/MPI
     115mv mpich2-1.0.8.tar.gz $MOD/MPI
     116cd $MOD/MPI
     117tar xzvf mpich2-1.0.8.tar.gz
     118cd mpich2-1.0.8
     119./configure --prefix=$PWD --with-device=ch3:nemesis > conf.log 2> conferr.log &
     120# please wait...
     121make > mk.log 2> mkerr.log &
     122declare -x WHERE_MPI=$MOD/MPI/mpich2-1.0.8/bin
     123\end{verbatim}
     124\normalsize
     125\end{finger}
     126
    139127\clearemptydoublepage
  • trunk/MESOSCALE_DEV/MANUAL/SRC/keep

    r220 r223  
    2121
    2222
     23\item Defining several domains yields
     24distinct files
     25\ttt{geo\_em.d01.nc},
     26\ttt{geo\_em.d02.nc},
     27\ttt{geo\_em.d03.nc}\ldots
    2328
    2429\mk
  • trunk/MESOSCALE_DEV/MANUAL/SRC/parameters.tex

    r220 r223  
    5151
    5252\sk
    53 \subsection{Advice on filling \ttt{namelist.input}}
     53\subsection{Advice on filling \ttt{namelist.input}}\label{namelist}
    5454
    5555\paragraph{Test case} An interesting exercise is to analyze comparatively the \ttt{TESTCASE/namelist.input} file (cf. section~\ref{sc:arsia}) with the reference \ttt{namelist.input\_full} given above, so that you could understand which settings are being made in the Arsia Mons simulation. Then you could try to modify parameters in the \ttt{namelist.input} file and re-run the model to start getting familiar with the various settings. Given that the test case relies on pre-computed initial and boundary conditions, not all parameters can be changed in the \ttt{namelist.input} file.
     
    5757\paragraph{Syntax} Please pay attention to rigorous syntax while editing your personal \ttt{namelist.input} file to avoid reading errors. If the model complains about this at runtime, start again with the available template \ttt{\$MMM/SIMU/namelist.input\_full}.
    5858
    59 \paragraph{Time management} Usually a Martian user would like to start/end the mesoscale simulation at a given solar aerocentric longitude~$L_s$ or a given sol in the Martian year. In the \ttt{namelist.input} file, the settings for starting/ending time must be done in the form year/month/day with each month corresponding to a ``slice" of~$30^{\circ}$~$L_s$\footnote{Information on Martian calendars: \url{http://www-mars.lmd.jussieu.fr/mars/time/solar_longitude.html}.}. The file~\ttt{\$MMM/SIMU/calendar} (reproduced in appendix~\ref{calendar}) is here to help the user to perform the conversion prior to filling the \ttt{namelist.input} file. In the above example of \ttt{namelist.input\_minim}, the simulation with the LMD Martian Mesoscale Model takes place on month~7 and day~1, which corresponds, according to the \ttt{calendar} file, to~$L_s \sim 180^{\circ}$. In the Arsia Mons test case, the simulation with the LMD Martian Mesoscale Model takes place on month~1 and day~4, which corresponds, according to the \ttt{calendar} file, to~$L_s \sim 2^{\circ}$.
     59\paragraph{Time management} Usually a Martian user would like to start/end the mesoscale simulation at a given solar aerocentric longitude~$L_s$ or a given sol in the Martian year\footnote{Information on Martian calendars: \url{http://www-mars.lmd.jussieu.fr/mars/time/solar_longitude.html}.}. In the \ttt{namelist.input} file, the settings for starting/ending time must be given in the form year/month/day, with each month corresponding to a ``slice" of~$30^{\circ}$~$L_s$. The file~\ttt{\$MMM/SIMU/calendar} (reproduced in appendix~\ref{calendar}) helps the user perform this conversion prior to filling the \ttt{namelist.input} file. In the above example of \ttt{namelist.input\_minim}, the simulation with the LMD Martian Mesoscale Model takes place on month~7 and day~1, which corresponds, according to the \ttt{calendar} file, to~$L_s \sim 180^{\circ}$. In the Arsia Mons test case, the simulation with the LMD Martian Mesoscale Model takes place on month~1 and day~4, which corresponds, according to the \ttt{calendar} file, to~$L_s \sim 2^{\circ}$.
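For instance, a simulation starting at $L_s \sim 180^{\circ}$ (month~7, day~1) would be requested through entries of the following kind (a sketch assuming the usual WRF \ttt{\&time\_control} naming is retained in the Martian version; please rely on \ttt{\$MMM/SIMU/namelist.input\_full} for the exact list and syntax of the time entries):
\begin{verbatim}
start_month = 07,     end_month = 07,
start_day   = 01,     end_day   = 03,
\end{verbatim}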
    6060
    6161\mk
  • trunk/MESOSCALE_DEV/MANUAL/SRC/preproc.tex

    r221 r223  
    1414
    1515\sk
    16 First and foremost, since the preprocessing utilities could generate (or involve) files of quite significant sizes, it is necessary to define a directory where these files would be stored. Such a directory (e.g. \ttt{/bigdisk/user}) must be linked with the name \ttt{\$TMPDIR} as follows. In addition, three directories \ttt{GCMINI}, \ttt{WPSFEED}, \ttt{WRFFEED} have to be defined in \ttt{\$LMDMOD/TMPDIR} as indicated below.
    17 
    18 \begin{verbatim}
    19 ln -sf /bigdisk/user $LMDMOD/TMPDIR
    20 mkdir $LMDMOD/TMPDIR/GCMINI
    21 mkdir $LMDMOD/TMPDIR/WPSFEED
    22 mkdir $LMDMOD/TMPDIR/WRFFEED
    23 \end{verbatim}
    24 
    25 \sk
    26 A second prerequisite to the installation of the preprocessing tools is that the LMD Martian Mesoscale Model was compiled at least once. If this is not the case, please compile the model with the \ttt{makemeso} command described in section~\ref{sc:makemeso}. The compilation process created an installation directory adapted to your particular choice of compiler$+$machine (what we named \ttt{DIRCOMP} for illustration in section~\ref{sc:makemeso}, which could be for instance \ttt{g95\_32\_single}). The preprocessing tools will also be installed in this directory. Please type the following commands:
    27 
    28 \begin{verbatim}
    29 cd $LMDMOD/LMD_MM_MARS/g95_32_single/   ## or any of your install directory
     16First and foremost, since the preprocessing utilities can generate (or involve) quite large files, it is necessary to define a directory where these files will be stored. Such a directory (e.g. \ttt{/bigdisk/user}) must be linked as \ttt{\$MOD/TMPDIR} as follows. In addition, three directories \ttt{GCMINI}, \ttt{WPSFEED}, \ttt{WRFFEED} have to be created in \ttt{\$MOD/TMPDIR} as indicated below.
     17
     18\begin{verbatim}
     19ln -sf /bigdisk/user $MOD/TMPDIR
     20mkdir $MOD/TMPDIR/GCMINI
     21mkdir $MOD/TMPDIR/WPSFEED
     22mkdir $MOD/TMPDIR/WRFFEED
     23\end{verbatim}
     24
     25\sk
     26A second prerequisite to the installation of the preprocessing tools is that the LMD Martian Mesoscale Model has been compiled at least once. If this is not the case, please compile the model with the \ttt{makemeso} command described in section~\ref{sc:makemeso}. The compilation process creates an installation directory adapted to your particular choice of compiler$+$machine (what we named \ttt{your\_compdir} for illustration in section~\ref{sc:makemeso}, which could be for instance \ttt{g95\_32\_single}). The preprocessing tools will also be installed in this directory. Please type the following commands:
     27
     28\begin{verbatim}
     29cd $MOD/LMD_MM_MARS/g95_32_single/   ## or any of your install directory
    3030ln -sf ../SRC/SCRIPTS/prepare_ini .
    3131./prepare_ini
     
    5959
    6060\sk
     61All the static data (topography, thermal inertia, albedo) needed to initialize the model are included in the \ttt{\$MOD/LMD\_MM\_MARS/WPS\_GEOG} directory. By default, only coarse-resolution datasets\footnote{ Corresponding to the fields stored in the file \ttt{surface.nc} known by LMD-MGCM users: \url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc} } are available, but the directory also contains sources and scripts to install finer-resolution datasets: 32 and/or 64 pixel-per-degree (ppd) MOLA topography (\ttt{mola\_topo32} and \ttt{mola\_topo64}), 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo (\ttt{albedo\_TES}), 20 ppd TES thermal inertia (\ttt{thermal\_TES}). The role of the \ttt{build\_static} script is to automatically download these datasets from the web (namely PDS archives) and convert them to an acceptable format for future use by the preprocessing utilities:
    62 
    63 \begin{verbatim}
    64 cd $LMDMOD/LMD_MM_MARS
     61All the static data (topography, thermal inertia, albedo) needed to initialize the model are included in the \ttt{\$MOD/LMD\_MM\_MARS/WPS\_GEOG} directory. By default, only coarse-resolution datasets\footnote{ Corresponding to the fields stored in the file \ttt{surface.nc} known by LMD-MGCM users: \url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc} } are available, but the directory also contains sources and scripts to install finer resolution datasets: 32 and/or 64 pixel-per-degree (ppd) MOLA topography (\ttt{mola\_topo32} and \ttt{mola\_topo64}), 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo (\ttt{albedo\_TES}), 20 ppd TES thermal inertia (\ttt{thermal\_TES}). The role of the \ttt{build\_static} script is to automatically download these datasets from the web (namely PDS archives) and convert them to an acceptable format for a future use by the preprocessing utilities:
     62
     63\begin{verbatim}
     64cd $MOD/LMD_MM_MARS
    6565./build_static
    6666\end{verbatim}
     
    7878
    7979\sk
    80 The LMD Martian GCM is supposed to be run to compute meteorological fields that will be used as initial and boundary conditions each one or two Martian hours to the limited-area LMD Martian Mesoscale Model. Hence the LMD Martian GCM must be compiled in your system (see the LMD-MGCM user manual for further details \url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}). If you did not get the model using the \ttt{svn} method, please request us to send you an archive containing the LMD-MGCM named \ttt{LMDZ.MARS.meso.tar.gz}, which you have to extract in the \ttt{\$LMDMOD} directory. If you got the model using \ttt{svn}, you do not have to request this file. In the \ttt{\$LMDMOD/LMDZ.MARS} directory, a script named \ttt{compile} can be found and must be used \emph{on the system you plan to run the mesoscale model on} to compile the GCM. The \ttt{compile} script is actually just a wrapper for the \ttt{makegcm} script which compile the GCM for you; the default \ttt{makegcm} script only works with Portland Group Fortran compiler \ttt{pgf90} but scripts allowing to compile the model using other Fortran compilers (including \ttt{g95} or \ttt{ifort}) are available upon request. The following commands must be used and should yield the compilation of two executables \ttt{newstart.e} and \ttt{gcm.e}:
    81 
    82 \begin{verbatim}
    83 cd $LMDMOD/LMDZ.MARS
     80The LMD Martian GCM is run to compute the meteorological fields that will be used, every one or two Martian hours, as initial and boundary conditions for the limited-area LMD Martian Mesoscale Model. Hence the LMD Martian GCM must be compiled on your system (see the LMD-MGCM user manual for further details \url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}). If you did not get the model using the \ttt{svn} method, please ask us to send you an archive containing the LMD-MGCM named \ttt{LMDZ.MARS.meso.tar.gz}, which you have to extract in the \ttt{\$MOD} directory. If you got the model using \ttt{svn}, you do not have to request this file. In the \ttt{\$MOD/LMDZ.MARS} directory, a script named \ttt{compile} can be found and must be used \emph{on the system you plan to run the mesoscale model on} to compile the GCM. The \ttt{compile} script is actually just a wrapper for the \ttt{makegcm} script, which compiles the GCM for you; the default \ttt{makegcm} script only works with the Portland Group Fortran compiler \ttt{pgf90}, but scripts to compile the model with other Fortran compilers (including \ttt{g95} or \ttt{ifort}) are available upon request. The following commands should yield the compilation of two executables \ttt{newstart.e} and \ttt{gcm.e}:
     81
     82\begin{verbatim}
     83cd $MOD/LMDZ.MARS
    8484./compile
    8585\end{verbatim}
    8686
    8787\sk
    88 The other necessary operation to prepare the LMD-MGCM for step~1 is to store a set of initial states for the LMD-MGCM to start with -- based on previous typical LMD-MGCM runs having reached equilibrium after ten years of integration. A reference database can be found in the following online big archive~\url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz}. This archive must be extracted somewhere on a disk that would be accessible by the system you plan to run the mesoscale model on; the absolute link of the \ttt{STARTBASE\_64\_48\_32\_t2} directory on your disk must be reported in the beginning of the script~\ttt{\$LMDMOD/LMDZ.MARS/myGCM/launch\_gcm} (variable \ttt{startbase}). If those operations went well, please try the command line~\ttt{echo 22 | launch\_gcm} which should launch the GCM integrations on your system.
     88The other necessary operation to prepare the LMD-MGCM for step~1 is to store a set of initial states for the LMD-MGCM to start with -- based on previous typical LMD-MGCM runs having reached equilibrium after ten years of integration. A reference database can be found in the following large online archive~\url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz}. This archive must be extracted somewhere on a disk that is accessible from the system you plan to run the mesoscale model on; the absolute path of the \ttt{STARTBASE\_64\_48\_32\_t2} directory on your disk must be set at the beginning of the script~\ttt{\$MOD/LMDZ.MARS/myGCM/launch\_gcm} (variable \ttt{startbase}). If those operations went well, please try the command line~\ttt{echo 22 | launch\_gcm}, which should launch the GCM integrations on your system.
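As an illustration, and assuming \ttt{launch\_gcm} is a shell script whose \ttt{startbase} variable is set near its top, the corresponding line would look like the following sketch (the path is of course specific to your system):
\begin{verbatim}
## near the beginning of $MOD/LMDZ.MARS/myGCM/launch_gcm
startbase=/bigdisk/user/STARTBASE_64_48_32_t2
\end{verbatim}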
    8989
    9090\mk
     
    110110\end{center}
    111111
     112\sk
     113\subsection{Step 1: Running the GCM and converting data}
     114
     115\sk
     116Here we assume that the user has chosen a given Martian sol or $L_s$ on which to start the mesoscale simulation. As already mentioned in section~\ref{namelist}, the file \ttt{\$MMM/SIMU/calendar} (or see appendix~\ref{calendar}) can help with this choice (sol$\rightarrow$$L_s$$\rightarrow$mesoscale date and vice-versa). In addition, the user has to check which sol before the one wanted for simulation start has $99$ in the first column: such sols are the ones for which an initial starting file for the GCM is available. Then please set the number of GCM simulated days \ttt{nday} in \ttt{\$MOD/LMDZ.MARS/myGCM/run.def} accordingly: suppose you want to start a mesoscale simulation at sol~9 and run it for 4~sols; then, according to the \ttt{calendar} file, sol~8 is the closest sol before sol~9 available in the database, so \ttt{nday} must be at least~$5$. For optimal forcing at the boundaries, we advise you to write the meteorological fields to the \ttt{diagfi.nc} file at least every two hours, i.e. \ttt{ecritphy} is $40$ or~$80$ in \ttt{\$MOD/LMDZ.MARS/myGCM/run.def}. The GCM run can then be launched using the following commands and should produce a netCDF data file named \ttt{diagfi.nc}\footnote{If the fields \ttt{emis}, \ttt{co2ice}, \ttt{q01}, \ttt{q02}, \ttt{tsoil} are missing in the \ttt{diagfi.nc} file, they are replaced by respective default values $0.95$, $0$, $0$, $0$, tsurf at the end of preprocessing step 1.}:
     117
     118\begin{verbatim}
     119cd $MOD/LMDZ.MARS/myGCM
     120./launch_gcm    ## answer: your desired starting sol for the simulations
     121\end{verbatim}
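To fix ideas with the sol~9 example above, the two \ttt{run.def} entries to check would read as in the sketch below (the layout of the \ttt{run.def} file distributed with the GCM may differ; only the values of \ttt{nday} and \ttt{ecritphy} need to be adapted, all other entries being left as provided):
\begin{verbatim}
nday=5
ecritphy=80
\end{verbatim}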
     122
     123        %\mk
     124        %\marge An example of input meteorological file 
     125        %\ttt{diagfi.nc} file can be downloaded
     126        %at \url{http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz}.
     127        %%
     128        %Please deflate the archive and copy the \ttt{diagfi.nc} file
     129        %in \ttt{\$MOD/TMPDIR/GCMINI}.
     130        %%
     131        %Such a file can then be used to define the initial
     132        %and boundary conditions, and we will go
     133        %through the three preprocessing steps.
     134
     135\sk
     136Once the GCM simulation is finished, the programs in the \ttt{PREP\_MARS} directory allow the user to convert the data from the NETCDF \ttt{diagfi.nc} file into separate binary datafiles for each date contained in \ttt{diagfi.nc}, according to the formatting needed by the preprocessing programs at step 2. These programs can be executed with the following commands; if everything went well with the conversion,
     137the directory \ttt{\$MOD/TMPDIR/WPSFEED} should contain files named \ttt{LMD:}.
     138
     139\begin{verbatim}
     140cd $MOD/LMD_MM_MARS/your_install_dir/PREP_MARS
     141echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
     142./readmeteo.exe < readmeteo.def
     143\end{verbatim}
     144
     145\sk
     146\subsection{Step 2: Interpolation on the regional domain}
     147
     148\sk
     149\paragraph{Step 2a} In the \ttt{WPS} directory, the \ttt{geogrid.exe} program allows you to define the mesoscale simulation domain, to horizontally interpolate the topography, thermal inertia and albedo fields at the domain resolution and to calculate useful fields such as topographical slopes. Please execute the commands:
     150
     151\begin{verbatim}
     152cd $MMM/your_install_dir/WPS
     153ln -sf $MMM/TESTCASE/namelist.wps .   # test case (or use your customized file)
     154./geogrid.exe
     155\end{verbatim}
     156
     157The result of \ttt{geogrid.exe} -- and thus the definition of the mesoscale domain -- can be checked in the NETCDF file \ttt{geo\_em.d01.nc} (using for instance \ttt{ncview}, your favorite graphical interface for netCDF files, or python-based scripts as in section~\ref{postproc}). If you are unhappy with the results, or if you want to change the location of the mesoscale domain on the planet, the horizontal resolution, the number of grid points \ldots, please modify the parameter file \ttt{namelist.wps}, the contents of which are reproduced and commented on the next page, and execute \ttt{geogrid.exe} again.
     158
     159\begin{finger}
     160\item No input meteorological data are actually needed to execute \ttt{geogrid.exe}, so this step~2a can be prepared or even completed before step~1. It is probably a good idea to prepare step~2 by choosing the mesoscale simulation domain while GCM computations are underway in step~1.
     161\item More details about the database and more options of interpolation could be found in the file \ttt{geogrid/GEOGRID.TBL} (for advanced users only).
     162\item Two examples are given in Figure~\ref{vallespolar}.
     163\end{finger}
     164
     165\footnotesize
     166\codesource{namelist.wps_TEST}
     167\normalsize
     168
     169\begin{figure}[h!]
     170\begin{center}
     171\includegraphics[width=0.48\textwidth]{valles.png}
     172\includegraphics[width=0.48\textwidth]{LMD_MMM_d1_20km_domain_100.png}
     173\end{center}
     174\caption{\label{vallespolar} (Left plot) An example of mercator domain in the Valles Marineris region as simulated by \textit{Spiga and Forget} [2009, their section 3.3]: relevant parameters in \ttt{namelist.wps} are: \ttt{e\_we = 401}, \ttt{e\_sn = 121}, \ttt{dx = 12000}, \ttt{dy = 12000}, \ttt{map\_proj='mercator'}, \ttt{ref\_lat = -8}, \ttt{ref\_lon = -68}. (Right plot) An example of north polar domain with stereographical projection: relevant parameters in \ttt{namelist.wps} are: \ttt{e\_we = 117}, \ttt{e\_sn = 117}, \ttt{dx = 20000}, \ttt{dy = 20000}, \ttt{map\_proj='polar'}, \ttt{ref\_lat = 90}, \ttt{ref\_lon = 0.1}, \ttt{truelat1  =  90}, \ttt{stand\_lon =  0.1}.}
     175\end{figure}
     176
     177
     178
     179\sk
     180\paragraph{Step 2b} Once the \ttt{geo\_em} file(s) are generated, the \ttt{metgrid.exe} program performs the same kind of horizontal interpolation of the meteorological fields onto the mesoscale domain as \ttt{geogrid.exe} does for the surface data (interpolation options can be modified by advanced users in \ttt{metgrid/METGRID.TBL}). The program then writes the results in \ttt{met\_em} files and also collects the static fields and domain parameters included in the \ttt{geo\_em} file(s). If everything went well with the commands below, the directory \ttt{\$MOD/TMPDIR/WRFFEED} should contain \ttt{met\_em.*} files.
     181
     182\begin{verbatim}
     183cd $MMM/your_install_dir/WPS
     184mkdir WRFFEED/current
     185./metgrid.exe
     186\end{verbatim}
     187
     188\sk
     189\subsection{Step 3: Vertical interpolation on mesoscale levels}
     190
     191\sk
     192The last preprocessing step before being able to run the mesoscale simulation at step~4 is to execute \ttt{real.exe} to perform the interpolation from the vertical levels of the GCM to the vertical levels defined in the mesoscale model. This program also prepares the final initial state for the simulation in files called \ttt{wrfinput} and the boundary conditions in files called \ttt{wrfbdy}. To successfully execute \ttt{real.exe}, you need the \ttt{met\_em.*} files and the \ttt{namelist.input} file to be in the same directory as \ttt{real.exe}. Parameters in \ttt{namelist.input} controlling the behavior of the vertical interpolation are those labelled with \ttt{(p3)} in the detailed list introduced in chapter~\ref{zeparam}.
     193
     194\begin{verbatim}
     195cd $MOD/TESTCASE   ## or anywhere you would like to run the simulation
     196ln -sf $MOD/TMPDIR/WRFFEED/met_em* .
     197./real.exe
     198\end{verbatim}
     199
     200\sk
     201The final message from \ttt{real.exe} should report that the process completed successfully; you are then ready to launch the integrations of the LMD Martian Mesoscale Model again with the \ttt{wrf.exe} command, as in section \ref{sc:arsia}.
     202
     203\begin{finger}
     204\item \textbf{ When you modify either \ttt{namelist.wps} or \ttt{namelist.input}, make sure that the common parameters are exactly the same in both files (especially when running nested simulations), otherwise either the \ttt{real.exe} or the \ttt{wrf.exe} command will exit with an error message. Also, the dates passed to \ttt{launch\_gcm} and written in both \ttt{namelist.input} and \ttt{namelist.wps} should obviously all be the same. }
     205\end{finger}
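As a purely illustrative sketch (names follow the terrestrial WRF conventions used throughout this chapter), the entries below appear in both \ttt{namelist.wps} and \ttt{namelist.input} and must carry strictly identical values in the two files, together with the start/end dates:
\begin{verbatim}
e_we = 117,   e_sn = 117,    ! number of grid points (west-east, south-north)
dx = 20000,   dy = 20000,    ! horizontal resolution in meters
\end{verbatim}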
    112206
    113207\clearemptydoublepage
  • trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual_txt.tex

    r221 r223  
    1 
    2 
    3 \mk
    4 \subsubsection{Meteorological data}
    5 
    6 \ttt{launch\_gcm}
    7 
    8 \mk
    9 The preprocessing tools generate initial and boundary conditions
    10 from the \ttt{diagfi.nc} outputs of LMD-MGCM simulations.
    11 %
    12 If you would like to run a mesoscale simulation at a given
    13 season, you need to first run a GCM simulation and output
    14 the meteorological fields at the considered season.
    15 %
    16 For optimal forcing at the boundaries, we advise you
    17 to write the meteorological fields to the
    18 \ttt{diagfi.nc} file at least each two hours.
    19 %
    20 Please also make sure that the following fields
    21 are stored in the NETCDF \ttt{diagfi.nc} file:
    22 
    23 \footnotesize
    24 \codesource{contents_diagfi}
    25 
    26 \normalsize
    27 \begin{finger}
    28 \item If the fields
    29 \ttt{emis},
    30 \ttt{co2ice},
    31 \ttt{q01},
    32 \ttt{q02},
    33 \ttt{tsoil}
    34 are missing in the \ttt{diagfi.nc} file,
    35 they are replaced by respective default
    36 values $0.95$, $0$, $0$, $0$, tsurf.
    37 \end{finger}
    38 
    39 \mk
    40 \marge An example of input meteorological file
    41 \ttt{diagfi.nc} file can be downloaded
    42 at \url{http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz}.
    43 %
    44 Please deflate the archive and copy the \ttt{diagfi.nc} file
    45 in \ttt{\$LMDMOD/TMPDIR/GCMINI}.
    46 %
    47 Such a file can then be used to define the initial
    48 and boundary conditions, and we will go
    49 through the three preprocessing steps.
    50 
    51 \mk
    52 \subsection{Preprocessing steps}
    53 
    54 \mk
    55 \subsubsection{Step 1: Converting GCM data}
    56 
    57 \mk
    58 The programs in the \ttt{PREP\_MARS} directory
    59 convert the data from the NETCDF \ttt{diagfi.nc}
    60 file into separated binary datafiles for each
    61 date contained in \ttt{diagfi.nc}, according to
    62 the formatting needed by the
    63 preprocessing programs at step 2.
    64 %
    65 These programs can be executed by the following
    66 commands:
    67 \begin{verbatim}
    68 cd $LMDMOD/LMD_MM_MARS/your_install_dir/PREP\_MARS
    69 echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
    70 ./readmeteo.exe < readmeteo.def
    71 \end{verbatim}
    72 %
    73 \marge If every went well with the conversion,
    74 the directory \ttt{\$LMDMOD/TMPDIR/WPSFEED}
    75 should contain files named \ttt{LMD:}.
    76 
    77 \mk
    78 \subsubsection{2: Interpolation on the regional domain}
    79 
    80 \mk
    81 In the \ttt{WPS} directory, the \ttt{geogrid.exe} program allows
    82 you to define the mesoscale simulation domain
    83 to horizontally interpolate the topography,
    84 thermal inertia and albedo fields at the domain
    85 resolution and to calculate useful fields
    86 such as topographical slopes.%\pagebreak
    87 
    88 \mk
    89 \marge Please execute the commands:
    90 %
    91 \begin{verbatim}
    92 cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
    93 ln -sf ../../TESTCASE/namelist.wps .   # test case
    94 ./geogrid.exe
    95 \end{verbatim}
    96 %
    97 \marge The result of \ttt{geogrid.exe}
    98 -- and thus the definition of the mesoscale
    99 domain -- can be checked in the NETCDF
    100 file \ttt{geo\_em.d01.nc}.
    101 %
    102 A quick check can be performed using the command line
    103 \begin{verbatim}
    104 ncview geo_em.d01.nc
    105 \end{verbatim}
    106 \marge if \ttt{ncview} is installed, or the \ttt{IDL}
    107 script \ttt{out\_geo.pro}
    108 \begin{verbatim}
    109 idl
    110 IDL> out_geo, field1='TOPO'
    111 IDL> out_geo, field1='TI'
    112 IDL> SPAWN, 'ghostview geo_em.d01_HGT_M.ps &'
    113 IDL> SPAWN, 'ghostview geo_em.d01_THERMAL_INERTIA.ps &'
    114 IDL> exit
    115 \end{verbatim}
    116 \marge if the demo version of \ttt{IDL} is installed.
    117 %
    118 Of course if your favorite graphical tool supports
    119 the NETCDF standard, you might use it to check the
    120 domain definition in \ttt{geo\_em.d01.nc}.
    121 
    122 \mk
    123 \marge If you are unhappy with the results or
    124 you want to change
    125 the location of the mesoscale domain on the planet,
    126 the horizontal resolution,
    127 the number of grid points \ldots,
    128 please modify the parameter
    129 file \ttt{namelist.wps} and execute again \ttt{geogrid.exe}.
    130 %
    131 Here are the contents of \ttt{namelist.wps}:
    132 %
    133 \codesource{namelist.wps_TEST}
    134 
    135 \begin{finger}
    136 %
    137 \item No input meteorological data
    138 are actually needed to execute \ttt{geogrid.exe}.
    139 %
    140 \item More details about the database and
    141 more options of interpolation could be
    142 found in the file \ttt{geogrid/GEOGRID.TBL}.
    143 %
    144 \item Defining several domains yields
    145 distinct files
    146 \ttt{geo\_em.d01.nc},
    147 \ttt{geo\_em.d02.nc},
    148 \ttt{geo\_em.d03.nc}\ldots
    149 \end{finger}
    150 
    151 \mk
    152 \marge Once the \ttt{geo\_em} file(s) are generated,
    153 the \ttt{metgrid.exe} program performs
    154 a similar horizontal interpolation
    155 of the meteorological fields to the mesoscale
    156 domain as the one performed by \ttt{geogrid.exe}
    157 for the surface data.
    158 %
    159 Then the program writes the results in
    160 \ttt{met\_em} files and also collects
    161 the static fields and domain parameters
    162 included in the \ttt{geo\_em} file(s)
    163 %
    164 Please type the following commands:
    165 \begin{verbatim}
    166 cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
    167 ./metgrid.exe
    168 \end{verbatim}
    169 %
    170 \marge If every went well,
    171 the directory \ttt{\$LMDMOD/TMPDIR/WRFFEED}
    172 should contain the \ttt{met\_em.*} files.
    173 
    174 \mk
    175 \subsubsection{Step 3: Vertical interpolation on mesoscale levels}
    176 
    177 \mk
    178 \marge The last step is to execute \ttt{real.exe}
    179 to perform the interpolation from the vertical
    180 levels of the GCM to the vertical levels
    181 defined in the mesoscale model.
    182 %
    183 This program also prepares the final initial
    184 state for the simulation in files called
    185 \ttt{wrfinput} and the boundary conditions
    186 in files called \ttt{wrfbdy}.
    187 
    188 \mk
    189 \marge To successfully execute \ttt{real.exe},
    190 you need the \ttt{met\_em.*} files
    191 and the \ttt{namelist.input} file
    192 to be in the same directory as \ttt{real.exe}.
    193 %
    194 Parameters in \ttt{namelist.input}
    195 controlling the behavior of the vertical interpolation
    196 are those labelled with \ttt{(p3)} in the detailed
    197 list introduced in the previous chapter.
    198 
    199 \mk
    200 \marge Please type the following commands
    201 to prepare files for the Arsia Mons test case
    202 (or your personal test case if you changed
    203 the parameters in \ttt{namelist.wps}):
    204 \begin{verbatim}
    205 cd $LMDMOD/TESTCASE
    206 ln -sf $LMDMOD/WRFFEED/met_em* .
    207 ./real.exe
    208 \end{verbatim}
    209 
    210 \mk
    211 \marge The final message of the \ttt{real.exe}
    212 should claim the success of the processes and you
    213 are now ready to launch the integrations
    214 of the LMD Martian Mesoscale Model again
    215 with the \ttt{wrf.exe} command as in section
    216 \ref{sc:arsia}.
    217 
    218 \begin{finger}
    219 \item When you modify either
    220 \ttt{namelist.wps} or \ttt{namelist.input},
    221 make sure that the common parameters
    222 are exactly similar in both files
    223 (especially when running nested simulations)
    224 otherwise either \ttt{real.exe} or \ttt{wrf.exe}
    225 command will exit with an error message.
    226 \end{finger}
    227 %\pagebreak
    228 
    2291
    2302
     
    24820
    24921\section{Grid nesting}\label{nests}
     22
     23\codesource{namelist.wps_NEST}
    25024
    25125\section{Tracers}
  • trunk/MESOSCALE_DEV/MANUAL/SRC/whatis.tex

    r218 r223  
    22
    33\vk
    4 This chapter comprises the excerpts from \textit{Spiga and Forget} [2009]\nocite{Spig:09} dedicated to a general scientific and technical description of the LMD Martian Mesoscale Model, of its design and capabilities. Further details can be found in the reference paper \textit{Spiga and Forget} [2009]\nocite{Spig:09} and subsequent papers about mesoscale applications: e.g., \textit{Spiga and Lewis} [2010]\nocite{Spig:10dust} and \textit{Spiga et al.} [2011]\nocite{Spig:11ti}. An introduction to Large-Eddy Simulations can be found in \textit{Spiga et al.} [2010]\nocite{Spig:10bl}. Figure~\ref{modelstructure} summarizes the main points detailed in this introduction.
     4This chapter comprises excerpts from \textit{Spiga and Forget} [2009]\nocite{Spig:09} dedicated to a general scientific and technical description of the LMD Martian Mesoscale Model, its design and its capabilities. Further details can be found in the reference paper \textit{Spiga and Forget} [2009]\nocite{Spig:09} and in subsequent papers about mesoscale applications: e.g., \textit{Spiga and Lewis} [2010]\nocite{Spig:10dust} and \textit{Spiga et al.} [2011]\nocite{Spig:11ti}. An introduction to Large-Eddy Simulations can be found in \textit{Spiga et al.} [2010]\nocite{Spig:10bl}. Figure~\ref{modelstructure} summarizes the main points detailed in this introduction.
    55
    66\begin{center}