Changeset 230 for trunk/MESOSCALE_DEV


Timestamp: Jul 15, 2011, 8:45:44 PM
Author: aslmd
Message:

MESOSCALE: user manual. some updates: reference namelist.input, env variables in the text, new figures for the new TEST CASE and the FAST CASE for newphys.

Location: trunk/MESOSCALE_DEV/MANUAL/SRC
Files: 2 added, 2 deleted, 4 edited

  • trunk/MESOSCALE_DEV/MANUAL/SRC/compile_exec.tex

    r223 r230  
    29 29   
    30 30   \sk
    31      - Please take the time to check the contents of the \ttt{LMD\_MM\_MARS} directories\footnote{If you used method~$2$, you will probably notice that other directories than~\ttt{LMD\_MM\_MARS} are present in \ttt{\$MOD}, but those are not important at this stage.} and sub-directories through the following command lines:
       31   + Please take the time to check the contents of the \ttt{LMD\_MM\_MARS} directories\footnote{If you used method~$2$, you will probably notice that directories other than~\ttt{LMD\_MM\_MARS} are present in \ttt{\$MESO}, but those are not important at this stage.} and sub-directories through the following command lines:
    32 32   \begin{verbatim}
    33 33   ls $MMM ; ls $MMM/*
     
    87 87   \item ask the user about compilation settings;
    88 88   \item retrieve some additional information about the system;
    89      - \item create a directory \ttt{\$MOD/LMD\_MM\_MARS/your\_compdir} which name depends\footnote{For example, a \ttt{your\_compdir} directory named \ttt{g95\_32\_single} is created if the user requested a \ttt{g95} compilation of the code for single-domain simulations on a 32bits machine.} on the kind of compiler you are using, on whether your system is 32 or 64 bits, on whether sequential or parallel computations are planned and on the kind of simulations (idealized or real-case);
       89   + \item create a directory \ttt{\$MESO/LMD\_MM\_MARS/your\_compdir} whose name depends\footnote{For example, a \ttt{your\_compdir} directory named \ttt{g95\_32\_single} is created if the user requested a \ttt{g95} compilation of the code for single-domain simulations on a 32-bit machine.} on the kind of compiler you are using, on whether your system is 32 or 64 bits, on whether sequential or parallel computations are planned, and on the kind of simulations (idealized or real-case);
    90 90   \item generate with \ttt{copy\_model} a directory \ttt{your\_compdir/WRFV2} containing links to \ttt{SRC/WRFV2} sources\footnote{A note to developers: this method ensures that any change to the model sources is propagated to all the different \ttt{your\_compdir} installation folders.};
    91 91   \item execute the WRF \ttt{configure} script with the correct option;
     
    152 152  
    153 153  \sk
    154      - In order to test the compiled executables, a ready-to-use test case (with pre-generated initial and boundary conditions) is proposed in the \ttt{LMD\_MM\_MARS\_TESTCASE.tar.gz} archive that you can download in the following FTP site \url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/LMD_MM_MARS_TESTCASE.tar.gz}. This test case simulates the hydrostatic atmospheric flow around Arsia Mons (Figure~\ref{arsia}) during half a sol with constant thermal inertia, albedo and dust opacity\footnote{Though the simulation reproduces some reasonable features of the mesoscale circulation around Arsia Mons (e.g. slope winds), it should not be used for scientific purpose, for the number of grid points is unsufficient for single-domain simulation and the integration time is below the necessary spin-up time.}.
        154  + In order to test the compiled executables, a ready-to-use test case (with pre-generated initial and boundary conditions) is proposed in the \ttt{LMD\_MM\_MARS\_TESTCASE.tar.gz} archive, which you can download from the following FTP site: \url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/LMD_MM_MARS_TESTCASE.tar.gz}. This test case simulates the hydrostatic atmospheric flow around Arsia Mons (Figure~\ref{arsia}) during half a sol in springtime with constant thermal inertia, albedo and dust opacity\footnote{Though the simulation reproduces some reasonable features of the mesoscale circulation around Arsia Mons (e.g. slope winds), it should not be used for scientific purposes, because the number of grid points is insufficient for a single-domain simulation and the integration time is shorter than the necessary spin-up time.}.
    156 155  
    157 156  \sk
     
    159 158  %
    160 159  \begin{verbatim}
    161      - cp LMD_MM_MARS_TESTCASE.tar.gz $MOD/LMD_MM_MARS/
        160  + cp LMD_MM_MARS_TESTCASE.tar.gz $MESO/LMD_MM_MARS/
    162 161  tar xzvf LMD_MM_MARS_TESTCASE.tar.gz
    163 162  cd TESTCASE
     
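Once these files are in place, running the test simulation amounts to executing wrf.exe from inside TESTCASE. A minimal sketch, assuming the compiled executable is linked in from your compilation directory (the exact path depends on your_compdir, so the link below is only an assumption):

    cd $MESO/LMD_MM_MARS/TESTCASE
    ln -sf $MMM/your_compdir/wrf.exe .        ## assumed location of the executable built at compilation time
    nohup ./wrf.exe > log_wrf 2>&1 &          ## follow log_wrf to monitor the integration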
    167 166  
    168 167  \sk
    169      - The files contained in \ttt{TESTCASE} prior to launching the simulations with the \ttt{wrf.exe} command illustrate which files are needed to perform step 4, i.e. running a LMD Martian Mesoscale Model simulation\footnote{For the test case presented here, a file named \ttt{dustopacity.def} is needed because for the sake of simplicity of this test case, we set idealized uniform dust opacity. The files \ttt{real.exe} and \ttt{namelist.wps} are included in the \ttt{TESTCASE} folder for further reference but not needed at this stage.}.
       168  + The files contained in \ttt{TESTCASE} prior to launching the simulations with the \ttt{wrf.exe} command illustrate which files are needed to perform step 4, i.e. running an LMD Martian Mesoscale Model simulation\footnote{For the test case presented here, a file named \ttt{dustopacity.def} is needed because, for the sake of simplicity, this test case uses an idealized uniform dust opacity. The file \ttt{namelist.wps} is included in the \ttt{TESTCASE} folder for further reference but is not needed at this stage.}.
    170 169  \begin{itemize}
    171 170  \item \ttt{namelist.input}: text file containing parameters for the dynamical core
     
    178 177  \begin{figure}[h!]
    179 178  \includegraphics[width=0.5\textwidth]{arsiadomain.png}
    180      - \includegraphics[width=0.5\textwidth]{LMD_MMM_d1_20km_UV_HGT_Ls2_LT1_100.png}
       179  + \includegraphics[width=0.5\textwidth]{LMD_MMM_d1_20km_HGT_UV_10m-ALS_Ls8_LT1_100.png}
    181 180  \caption{\label{arsia} [Left plot] Simulation domain defined in the test case proposed as a demonstrator for running the LMD Martian Mesoscale Model. [Right plot] Nighttime winds predicted by the model~$10$~m above the surface. Both plots have been generated by command-line scripts written in~\ttt{python + numpy + matplotlib} (see chapter~\ref{postproc}).}
    182 181  \end{figure}
  • trunk/MESOSCALE_DEV/MANUAL/SRC/installation.tex

    r223 r230  
    76 76   \section{Main installation of the model sources}
    77 77   
    78      - \paragraph{Method 1: You were given a \ttt{LMD\_MM\_MARS.tar.gz} archive} Please set the environment variable \ttt{\$MOD} to point at the directory where you will install the model and define the environment variable \ttt{\$MMM} as \ttt{\$MOD/LMD\_MM\_MARS}. Copy the \ttt{LMD\_MM\_MARS.tar.gz} file in the \ttt{\$MOD} directory and extract the files. Then execute the \ttt{prepare} script that would do all installation tasks\footnote{Deflate the various compressed archives contained into \ttt{LMD\_MM\_MARS}, download the ARW-WRF sources from the web, apply a (significant) ``Martian patch" to these sources and build the structure of your \ttt{LMD\_MM\_MARS} directory} for you:
       78   + \paragraph{Method 1: You were given a \ttt{LMD\_MM\_MARS.tar.gz} archive} Please set the environment variable \ttt{\$MESO} to point at the directory where you will install the model and define the environment variable \ttt{\$MMM} as \ttt{\$MESO/LMD\_MM\_MARS}. Copy the \ttt{LMD\_MM\_MARS.tar.gz} file into the \ttt{\$MESO} directory and extract the files. Then execute the \ttt{prepare} script, which performs all installation tasks for you\footnote{It deflates the various compressed archives contained in \ttt{LMD\_MM\_MARS}, downloads the ARW-WRF sources from the web, applies a (significant) ``Martian patch'' to these sources and builds the structure of your \ttt{LMD\_MM\_MARS} directory.}:
    79 79   %
    80 80   \begin{verbatim}
    81      - declare -x MOD=/disk/user/MODELS
    82      - declare -x MMM=$MOD/LMD_MM_MARS
    83      - cp LMD_MM_MARS.tar.gz $MOD
    84      - cd $MOD
       81   + declare -x MESO=/disk/user/MODELS
       82   + declare -x MMM=$MESO/LMD_MM_MARS
       83   + cp LMD_MM_MARS.tar.gz $MESO
       84   + cd $MESO
    85 85   tar xzvf LMD_MM_MARS.tar.gz
    86      - cd $MOD/LMD_MM_MARS
       86   + cd $MESO/LMD_MM_MARS
    87 87   ./SRC/SCRIPTS/prepare  ## or simply ./prepare if the script is in LMD_MM_MARS
    88 88   \end{verbatim}
    89 89   
    90      - \paragraph{Method 2: You were given a \ttt{svn} link \ttt{the\_link}} \emph{You must have Subversion (\ttt{svn}) installed on your system to follow this method}. Please use the name of our server repository combined to an \ttt{svn checkout} command to get the model sources\footnote{At this stage, it is essential to have registered to the WRF website (see foreword) because our server contains some part of the ARW-WRF sources.}. Please also set the environment variable \ttt{\$MOD} and \ttt{\$MMM}. The first download of the model sources could be a bit long. Compared to method~$1$, this method~$2$ using \ttt{svn} would allow you to easily get the latest updates and bug fixes done on the LMD Martian Mesoscale Model by the development team\footnote{If you are not interested by this feature, please replace the command line featuring \ttt{svn checkout} by this command line \ttt{svn export the\_link/LMDZ.MARS the\_link/MESOSCALE} }.
       90   + \paragraph{Method 2: You were given an \ttt{svn} link \ttt{the\_link}} \emph{You must have Subversion (\ttt{svn}) installed on your system to follow this method}. Please use the name of our server repository combined with an \ttt{svn checkout} command to get the model sources\footnote{At this stage, it is essential to have registered on the WRF website (see foreword) because our server contains some part of the ARW-WRF sources.}. Please also set the environment variables \ttt{\$MESO} and \ttt{\$MMM}. The first download of the model sources can take a while. Compared to method~$1$, this method~$2$ using \ttt{svn} allows you to easily get the latest updates and bug fixes made to the LMD Martian Mesoscale Model by the development team\footnote{If you are not interested in this feature, please replace the command line featuring \ttt{svn checkout} with the command line \ttt{svn export the\_link/LMDZ.MARS the\_link/MESOSCALE}.}.
    91 91   
    92 92   \begin{verbatim}
     
    95 95   svn update LMDZ.MARS MESOSCALE
    96 96   cd MESOSCALE
    97      - declare -x MOD=$PWD
    98      - declare -x MMM=$MOD/LMD_MM_MARS
       97   + declare -x MESO=$PWD  ## put the absolute path in your .bashrc
       98   + declare -x MMM=$MESO/LMD_MM_MARS
    99 99   ## to get latest updates later on
    100 100  cd the_name_of_your_local_destination_folder
     
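For reference, the first download preceding the update commands above would look like the following sketch, where the_link stands for the svn URL you were given and the destination folder name is your choice:

    svn checkout the_link the_name_of_your_local_destination_folder
    cd the_name_of_your_local_destination_folder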
    107 107  
    108 108  \sk
    109      - Parallel computations with the Message Passing Interface (MPI) standard are supported by the LMD Martian Mesoscale Model. If you want to use this capability, you would have to add the installation of MPICH2 as a additional prerequisite. Once the installation is completed, it is required to define the environment variable \ttt{\$WHERE\_MPI} to point in your \ttt{mpich} \ttt{bin} directory, even if you added the \ttt{\$MOD/MPI/mpich2-1.0.8/bin} directory to your \ttt{\$PATH} variable.
       109  + Parallel computations with the Message Passing Interface (MPI) standard are supported by the LMD Martian Mesoscale Model. If you want to use this capability, you have to add the installation of MPICH2 as an additional prerequisite. Once the installation is completed, you must define the environment variable \ttt{\$WHERE\_MPI} to point to your \ttt{mpich} \ttt{bin} directory, even if you added the \ttt{\$your\_software\_dir/MPI/mpich2-1.0.8/bin} directory to your \ttt{\$PATH} variable.
    110 110  
    111 111  \begin{finger}
    112 112  \item \scriptsize Here is a brief ``how-to'' to install MPICH2, although it certainly does not replace reading the installation notes carefully and choosing the installation that best suits your system. Please download the current stable version of the sources (e.g. we choose here an old version \ttt{mpich2-1.0.8.tar.gz} to illustrate the commands) from the MPICH2 website \url{http://www.mcs.anl.gov/research/projects/mpich2} and install the MPICH2 utilities with the following commands:
    113 113  \begin{verbatim}
    114      - mkdir $MOD/MPI
    115      - mv mpich2-1.0.8.tar.gz $MOD/MPI
    116      - cd $MOD/MPI
       114  + mkdir $your_software_dir/MPI
       115  + mv mpich2-1.0.8.tar.gz $your_software_dir/MPI/
       116  + cd $your_software_dir/MPI
    117 117  tar xzvf mpich2-1.0.8.tar.gz
    118 118  cd mpich2-1.0.8

    120 120  # please wait...
    121 121  make > mk.log 2> mkerr.log &
    122      - declare -x WHERE_MPI=$MOD/MPI/mpich2-1.0.8/bin
       122  + declare -x WHERE_MPI=$your_software_dir/MPI/mpich2-1.0.8/bin
    123 123  \end{verbatim}
    124 124  \normalsize
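Since $WHERE_MPI must be defined in every session used to compile or run the model, a possible addition to your shell startup file is sketched below (paths are only examples; adapt $your_software_dir):

    ## in ~/.bashrc
    declare -x WHERE_MPI=$your_software_dir/MPI/mpich2-1.0.8/bin
    declare -x PATH=$WHERE_MPI:$PATH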
  • trunk/MESOSCALE_DEV/MANUAL/SRC/preproc.tex

    r223 r230  
    14 14   
    15 15   \sk
    16      - First and foremost, since the preprocessing utilities could generate (or involve) files of quite significant sizes, it is necessary to define a directory where these files would be stored. Such a directory (e.g. \ttt{/bigdisk/user}) must be linked with the name \ttt{\$TMPDIR} as follows. In addition, three directories \ttt{GCMINI}, \ttt{WPSFEED}, \ttt{WRFFEED} have to be defined in \ttt{\$MOD/TMPDIR} as indicated below.
    17      - 
    18      - \begin{verbatim}
    19      - ln -sf /bigdisk/user $MOD/TMPDIR
    20      - mkdir $MOD/TMPDIR/GCMINI
    21      - mkdir $MOD/TMPDIR/WPSFEED
    22      - mkdir $MOD/TMPDIR/WRFFEED
       16   + First and foremost, since the preprocessing utilities can generate (or involve) files of quite significant size, it is necessary to define a directory where these files will be stored. Such a directory (e.g. \ttt{/bigdisk/user}) must be linked as \ttt{\$MESO/TMPDIR} as follows. In addition, three directories \ttt{GCMINI}, \ttt{WPSFEED}, \ttt{WRFFEED} have to be created in \ttt{\$MESO/TMPDIR} as indicated below.
       17   + 
       18   + \begin{verbatim}
       19   + ln -sf /bigdisk/user $MESO/TMPDIR
       20   + mkdir $MESO/TMPDIR/GCMINI
       21   + mkdir $MESO/TMPDIR/WPSFEED
       22   + mkdir $MESO/TMPDIR/WRFFEED
    23 23   \end{verbatim}
    24 24   
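A quick way to check the setup described above (directory names as listed in the text):

    ls -ld $MESO/TMPDIR      ## should be a link pointing to the large disk, e.g. /bigdisk/user
    ls $MESO/TMPDIR          ## should list GCMINI  WPSFEED  WRFFEED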
     
    27 27   
    28 28   \begin{verbatim}
    29      - cd $MOD/LMD_MM_MARS/g95_32_single/   ## or any of your install directory
       29   + cd $MMM/g95_32_single/   ## or any of your install directories
    30 30   ln -sf ../SRC/SCRIPTS/prepare_ini .
    31 31   ./prepare_ini
     
    59 59   
    60 60   \sk
    61      - All the static data (topography, thermal inertia, albedo) needed to initialize the model are included in the \ttt{\$MOD/LMD\_MM\_MARS/WPS\_GEOG} directory. By default, only coarse-resolution datasets\footnote{ Corresponding to the fields stored in the file \ttt{surface.nc} known by LMD-MGCM users: \url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc} } are available, but the directory also contains sources and scripts to install finer resolution datasets: 32 and/or 64 pixel-per-degree (ppd) MOLA topography (\ttt{mola\_topo32} and \ttt{mola\_topo64}), 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo (\ttt{albedo\_TES}), 20 ppd TES thermal inertia (\ttt{thermal\_TES}). The role of the \ttt{build\_static} script is to automatically download these datasets from the web (namely PDS archives) and convert them to an acceptable format for a future use by the preprocessing utilities:
    62      - 
    63      - \begin{verbatim}
    64      - cd $MOD/LMD_MM_MARS
    65      - ./build_static
       61   + All the static data (topography, thermal inertia, albedo) needed to initialize the model are included in the \ttt{\$MMM/WPS\_GEOG} directory. By default, only coarse-resolution datasets\footnote{Corresponding to the fields stored in the file \ttt{surface.nc} known to LMD-MGCM users: \url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc}} are available, but the directory also contains sources and scripts to install finer-resolution datasets: 32 and/or 64 pixel-per-degree (ppd) MOLA topography (\ttt{mola\_topo32} and \ttt{mola\_topo64}), 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo (\ttt{albedo\_TES}), and 20 ppd TES thermal inertia (\ttt{thermal\_TES}). The role of the \ttt{build\_static} script is to automatically download these datasets from the web (namely PDS archives) and convert them to a format suitable for later use by the preprocessing utilities:
       62   + 
       63   + \begin{verbatim}
       64   + cd $MMM
       65   + ./SRC/SCRIPTS/build_static
    66 66   \end{verbatim}
    67 67   
     
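After build_static completes, the finer-resolution datasets should appear alongside the default coarse data; a sketch of a quick check (directory names taken from the paragraph above, their exact layout may differ):

    ls $MMM/WPS_GEOG         ## should include albedo_TES, mola_topo32, mola_topo64, thermal_TES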
    78 78   
    79 79   \sk
    80      - The LMD Martian GCM is supposed to be run to compute meteorological fields that will be used as initial and boundary conditions each one or two Martian hours to the limited-area LMD Martian Mesoscale Model. Hence the LMD Martian GCM must be compiled in your system (see the LMD-MGCM user manual for further details \url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}). If you did not get the model using the \ttt{svn} method, please request us to send you an archive containing the LMD-MGCM named \ttt{LMDZ.MARS.meso.tar.gz}, which you have to extract in the \ttt{\$MOD} directory. If you got the model using \ttt{svn}, you do not have to request this file. In the \ttt{\$MOD/LMDZ.MARS} directory, a script named \ttt{compile} can be found and must be used \emph{on the system you plan to run the mesoscale model on} to compile the GCM. The \ttt{compile} script is actually just a wrapper for the \ttt{makegcm} script which compile the GCM for you; the default \ttt{makegcm} script only works with Portland Group Fortran compiler \ttt{pgf90} but scripts allowing to compile the model using other Fortran compilers (including \ttt{g95} or \ttt{ifort}) are available upon request. The following commands should yield the compilation of two executables \ttt{newstart.e} and \ttt{gcm.e}:
    81      - 
    82      - \begin{verbatim}
    83      - cd $MOD/LMDZ.MARS
       80   + The LMD Martian GCM is run to compute the meteorological fields that are used, every one or two Martian hours, as initial and boundary conditions for the limited-area LMD Martian Mesoscale Model. Hence the LMD Martian GCM must be compiled on your system (see the LMD-MGCM user manual for further details: \url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}). If you did not get the model using the \ttt{svn} method, please ask us to send you an archive containing the LMD-MGCM, named \ttt{LMDZ.MARS.meso.tar.gz}, which you have to extract in the \ttt{\$MESO} directory. If you got the model using \ttt{svn}, you do not have to request this file. In the \ttt{\$MESO/LMDZ.MARS} directory, a script named \ttt{compile} can be found and must be used \emph{on the system you plan to run the mesoscale model on} to compile the GCM. The \ttt{compile} script is actually just a wrapper for the \ttt{makegcm} script, which compiles the GCM for you; the default \ttt{makegcm} script only works with the Portland Group Fortran compiler \ttt{pgf90}, but scripts to compile the model with other Fortran compilers (including \ttt{g95} or \ttt{ifort}) are available upon request. The following commands should yield the compilation of two executables, \ttt{newstart.e} and \ttt{gcm.e}:
       81   + 
       82   + \begin{verbatim}
       83   + cd $MESO/LMDZ.MARS
    84 84   ./compile
    85 85   \end{verbatim}
    86 86   
    87 87   \sk
    88      - The other necessary operation to prepare the LMD-MGCM for step~1 is to store a set of initial states for the LMD-MGCM to start with -- based on previous typical LMD-MGCM runs having reached equilibrium after ten years of integration. A reference database can be found in the following online big archive~\url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz}. This archive must be extracted somewhere on a disk that would be accessible by the system you plan to run the mesoscale model on; the absolute link of the \ttt{STARTBASE\_64\_48\_32\_t2} directory on your disk must be reported in the beginning of the script~\ttt{\$MOD/LMDZ.MARS/myGCM/launch\_gcm} (variable \ttt{startbase}). If those operations went well, please try the command line~\ttt{echo 22 | launch\_gcm} which should launch the GCM integrations on your system.
       88   + The other necessary operation to prepare the LMD-MGCM for step~1 is to store a set of initial states for the LMD-MGCM to start with, based on previous typical LMD-MGCM runs having reached equilibrium after ten years of integration. A reference database can be found in the following large online archive: \url{ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz}. This archive must be extracted somewhere on a disk accessible from the system you plan to run the mesoscale model on; the absolute path of the \ttt{STARTBASE\_64\_48\_32\_t2} directory on your disk must be set at the beginning of the script~\ttt{\$MESO/LMDZ.MARS/myGCM/launch\_gcm} (variable \ttt{startbase}). If those operations went well, please try the command line~\ttt{echo 22 | launch\_gcm}, which should launch the GCM integrations on your system.
    89 89   
    90 90   \mk
     
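As an illustration of the two operations just described, assuming the database is extracted under /bigdisk/user (adapt the path to your own large disk):

    cd /bigdisk/user
    tar xzvf STARTBASE_64_48_32_t2.tar.gz
    ## then edit $MESO/LMDZ.MARS/myGCM/launch_gcm and set: startbase=/bigdisk/user/STARTBASE_64_48_32_t2
    cd $MESO/LMDZ.MARS/myGCM
    echo 22 | ./launch_gcm   ## should launch GCM integrations starting from the sol-22 start file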
    114 114  
    115 115  \sk
    116      - Here we assume that the user has chosen a given Martian sol or $L_s$ on which to start the mesoscale simulation. As already mentionned in section~\ref{namelist}, the file \ttt{\$MMM/SIMU/calendar} (or see appendix~\ref{calendar}) can help with this choice (sol$\rightarrow$$L_s$$\rightarrow$mesoscale date and vice-versa). In addition, the user has to check which sol is before the one wanted for simulation start and has $99$ in the first column: such sols are the ones for which an initial starting file for the GCM is available. Then please set the number of GCM simulated days \ttt{nday} in \ttt{\$MOD/LMDZ.MARS/myGCM/run.def} accordingly: suppose the user you want to start a mesoscale simulation at sol~9 during 4~sols, then according to the \ttt{calendar} file, sol~8 is the closest file before sol~9 to be in the database, so \ttt{nday} must be at least~$5$. For optimal forcing at the boundaries, we advise you to write the meteorological fields to the \ttt{diagfi.nc} file at least each two hours, i.e. \ttt{ecritphy} is $40$ or~$80$ in \ttt{\$MOD/LMDZ.MARS/myGCM/run.def}. Eventually the GCM run can be launched using the following commands and should produce a netCDF data file named \ttt{diagfi.nc}\footnote{If the fields \ttt{emis}, \ttt{co2ice}, \ttt{q01}, \ttt{q02}, \ttt{tsoil} are missing in the \ttt{diagfi.nc} file, they are replaced by respective default values $0.95$, $0$, $0$, $0$, tsurf in the end of preprocessing step 1.}:
    117      - 
    118      - \begin{verbatim}
    119      - cd $MOD/LMDZ.MARS/myGCM
       116  + Here we assume that the user has chosen a given Martian sol or $L_s$ on which to start the mesoscale simulation. As already mentioned in section~\ref{namelist}, the file \ttt{\$MMM/SIMU/calendar} (or see appendix~\ref{calendar}) can help with this choice (sol$\rightarrow$$L_s$$\rightarrow$mesoscale date and vice-versa). In addition, the user has to identify the closest sol before the desired starting sol that has $99$ in the first column: such sols are the ones for which an initial starting file for the GCM is available. Then please set the number of GCM simulated days \ttt{nday} in \ttt{\$MESO/LMDZ.MARS/myGCM/run.def} accordingly: suppose you want to start a mesoscale simulation at sol~9 and run it for 4~sols; according to the \ttt{calendar} file, sol~8 is the closest sol before sol~9 available in the database, so \ttt{nday} must be at least~$5$. For optimal forcing at the boundaries, we advise you to write the meteorological fields to the \ttt{diagfi.nc} file at least every two hours, i.e. \ttt{ecritphy} is $40$ or~$80$ in \ttt{\$MESO/LMDZ.MARS/myGCM/run.def}. The GCM run can then be launched with the following commands and should produce a netCDF data file named \ttt{diagfi.nc}\footnote{If the fields \ttt{emis}, \ttt{co2ice}, \ttt{q01}, \ttt{q02}, \ttt{tsoil} are missing in the \ttt{diagfi.nc} file, they are replaced by the respective default values $0.95$, $0$, $0$, $0$ and \ttt{tsurf} at the end of preprocessing step 1.}:
       117  + 
       118  + \begin{verbatim}
       119  + cd $MESO/LMDZ.MARS/myGCM
    120 120  ./launch_gcm    ## answer: your desired starting sol for the simulations
    121 121  \end{verbatim}
     
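For the example discussed above (mesoscale run starting at sol 9 and lasting 4 sols, with a GCM start file available at sol 8), the corresponding run.def settings would be along these lines; the key=value layout is assumed, so please check against the syntax of your own run.def:

    ## in $MESO/LMDZ.MARS/myGCM/run.def (sketch)
    nday=5         ## at least 5, so that the GCM covers sols 8 to 13
    ecritphy=40    ## or 80, as advised above, so diagfi.nc is written at least every two hours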
    127 127          %%
    128 128          %Please deflate the archive and copy the \ttt{diagfi.nc} file
    129      -         %in \ttt{\$MOD/TMPDIR/GCMINI}.
       129  +         %in \ttt{\$MESO/TMPDIR/GCMINI}.
    130 130          %%
    131 131          %Such a file can then be used to define the initial
     
    135 135  \sk
    136 136  Once the GCM simulation is finished, the programs in the \ttt{PREP\_MARS} directory allow the user to convert the data from the netCDF \ttt{diagfi.nc} file into separate binary data files for each date contained in \ttt{diagfi.nc}, in the format needed by the preprocessing programs at step 2. These programs can be executed with the following commands; if everything went well with the conversion,
    137      - the directory \ttt{\$MOD/TMPDIR/WPSFEED} should contain files named \ttt{LMD:}.
    138      - 
    139      - \begin{verbatim}
    140      - cd $MOD/LMD_MM_MARS/your_install_dir/PREP\_MARS
       137  + the directory \ttt{\$MESO/TMPDIR/WPSFEED} should contain files named \ttt{LMD:}.
       138  + 
       139  + \begin{verbatim}
       140  + cd $MMM/your_install_dir/PREP_MARS
    141 141  echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
    142 142  ./readmeteo.exe < readmeteo.def
    143 143  \end{verbatim}
       144  + 
       145  + %%% compile_and_exec ????
    144 146  
    145 147  \sk
     
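If the conversion succeeded, a quick check of the output directory should show the intermediate files mentioned above:

    ls $MESO/TMPDIR/WPSFEED  ## should now contain files named LMD:*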
    178 180  
    179 181  \sk
    180      - \paragraph{Step 2b} Once the \ttt{geo\_em} file(s) are generated, the \ttt{metgrid.exe} program performs a similar horizontal interpolation of the meteorological fields to the mesoscale domain as the one performed by \ttt{geogrid.exe} for the surface data (interpolation options can be modified by advanced users in \ttt{metgrid/METGRID.TBL}). Then the program writes the results in \ttt{met\_em} files and also collects the static fields and domain parameters included in the \ttt{geo\_em} file(s). If everything went well with the commands below, the directory \ttt{\$MOD/TMPDIR/WRFFEED} should contain \ttt{met\_em.*} files.
       182  + \paragraph{Step 2b} Once the \ttt{geo\_em} file(s) are generated, the \ttt{metgrid.exe} program performs, for the meteorological fields, a horizontal interpolation to the mesoscale domain similar to the one performed by \ttt{geogrid.exe} for the surface data (interpolation options can be modified by advanced users in \ttt{metgrid/METGRID.TBL}). The program then writes the results to \ttt{met\_em} files and also collects the static fields and domain parameters included in the \ttt{geo\_em} file(s). If everything went well with the commands below, the directory \ttt{\$MESO/TMPDIR/WRFFEED} should contain \ttt{met\_em.*} files.
    181 183  
    182 184  \begin{verbatim}
     
    193 195  
    194 196  \begin{verbatim}
    195      - cd $MOD/TESTCASE   ## or anywhere you would like to run the simulation
    196      - ln -sf $MOD/TMPDIR/WRFFEED/met_em* .
       197  + cd $MESO/TESTCASE   ## or anywhere you would like to run the simulation
       198  + ln -sf $MESO/TMPDIR/WRFFEED/met_em* .
    197 199  ./real.exe
    198 200  \end{verbatim}

    201 203  The final message of \ttt{real.exe} should report that the process completed successfully; you are now ready to launch the integrations of the LMD Martian Mesoscale Model again with the \ttt{wrf.exe} command, as in section~\ref{sc:arsia}.
    202 204  
       205  + \sk
    203 206  \begin{finger}
    204 207  \item \textbf{When you modify either \ttt{namelist.wps} or \ttt{namelist.input}, make sure that the common parameters are exactly the same in both files (especially when running nested simulations); otherwise the \ttt{real.exe} or \ttt{wrf.exe} command will exit with an error message. Also, the dates sent to \ttt{launch\_gcm} and written in both \ttt{namelist.input} and \ttt{namelist.wps} must obviously all be the same.}
  • trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual_txt.tex

    r223 r230  
    5 5    
    6 6    \chapter{Starting simulations from scratch: a summary}
      7    + 
      8    + A quick guide to full execution
      9    + (a list of commands)
      10   + 
    7 11   
    8 12   \mk
     
    15 19   To be completed
    16 20   \end{remarque}
       21   + 
       22   + talk about xeyes
    17 23   
    18 24   