Changeset 230 for trunk/MESOSCALE_DEV
- Timestamp: Jul 15, 2011, 8:45:44 PM
- Location: trunk/MESOSCALE_DEV/MANUAL/SRC
- Files: 2 added, 2 deleted, 4 edited
Legend: in the file excerpts below, lines prefixed with - were removed in r230, lines prefixed with + were added; unmodified context is shown without a prefix or elided with [...].
trunk/MESOSCALE_DEV/MANUAL/SRC/compile_exec.tex
Changes from r223 to r230 (line numbers are those of compile_exec.tex):

line 31 (footnote on the extra directories seen with installation method 2):
  - [...] other directories than~\ttt{LMD\_MM\_MARS} are present in \ttt{\$MOD}, but those are not important at this stage.
  + [...] other directories than~\ttt{LMD\_MM\_MARS} are present in \ttt{\$MESO}, but those are not important at this stage.

line 89 (itemized description of what the compilation script does):
  - \item create a directory \ttt{\$MOD/LMD\_MM\_MARS/your\_compdir} which name depends [...] on the kind of compiler you are using [...]
  + \item create a directory \ttt{\$MESO/LMD\_MM\_MARS/your\_compdir} which name depends [...] on the kind of compiler you are using [...]

lines 154-155 (description of the Arsia Mons test case): the paragraph is re-wrapped over two source lines and the simulated period is now qualified as springtime:
  - This test case simulates the hydrostatic atmospheric flow around Arsia Mons (Figure~\ref{arsia}) during half a sol with constant thermal inertia, albedo and dust opacity [...]
  + This test case simulates the hydrostatic atmospheric flow around Arsia Mons (Figure~\ref{arsia}) during half a sol in springtime with constant thermal inertia, albedo and dust opacity [...]

line 161 -> 160 (copying the test-case archive):
  - cp LMD_MM_MARS_TESTCASE.tar.gz $MOD/LMD_MM_MARS/
  + cp LMD_MM_MARS_TESTCASE.tar.gz $MESO/LMD_MM_MARS/

line 169 -> 168 (footnote on the files shipped in \ttt{TESTCASE}): \ttt{real.exe} is no longer listed among the reference files:
  - The files \ttt{real.exe} and \ttt{namelist.wps} are included in the \ttt{TESTCASE} folder for further reference but not needed at this stage.
  + The file \ttt{namelist.wps} is included in the \ttt{TESTCASE} folder for further reference but not needed at this stage.

line 180 -> 179 (right-hand panel of Figure~\ref{arsia}):
  - \includegraphics[width=0.5\textwidth]{LMD_MMM_d1_20km_UV_HGT_Ls2_LT1_100.png}
  + \includegraphics[width=0.5\textwidth]{LMD_MMM_d1_20km_HGT_UV_10m-ALS_Ls8_LT1_100.png}
  The caption is unchanged: left plot, simulation domain of the demonstration case; right plot, nighttime winds predicted by the model 10 m above the surface; both generated with python + numpy + matplotlib command-line scripts (see chapter~\ref{postproc}).
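For a reader following the updated instructions end to end, the test-case sequence with the r230 variable names boils down to the bash sketch below. The /disk/user/MODELS install root is the hypothetical example used in the installation chapter; the remaining commands are the ones quoted above, assuming a sequential (non-MPI) build:

  declare -x MESO=/disk/user/MODELS          # hypothetical install root (named $MOD before r230)
  declare -x MMM=$MESO/LMD_MM_MARS
  cp LMD_MM_MARS_TESTCASE.tar.gz $MESO/LMD_MM_MARS/
  cd $MESO/LMD_MM_MARS
  tar xzvf LMD_MM_MARS_TESTCASE.tar.gz
  cd TESTCASE
  ./wrf.exe                                  # runs the half-sol Arsia Mons demonstration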
trunk/MESOSCALE_DEV/MANUAL/SRC/installation.tex
Changes from r223 to r230 (line numbers are those of installation.tex):

line 78 (paragraph "Method 1: You were given a \ttt{LMD\_MM\_MARS.tar.gz} archive"): the environment variable formerly called \ttt{\$MOD} is renamed \ttt{\$MESO}; the model root is therefore \ttt{\$MESO}, \ttt{\$MMM} is defined as \ttt{\$MESO/LMD\_MM\_MARS}, and the archive is copied into \ttt{\$MESO}. The rest of the paragraph (the \ttt{prepare} script and its footnote) is unchanged.

lines 81-86 (verbatim example for method 1):
  - declare -x MOD=/disk/user/MODELS
  - declare -x MMM=$MOD/LMD_MM_MARS
  - cp LMD_MM_MARS.tar.gz $MOD
  - cd $MOD
  + declare -x MESO=/disk/user/MODELS
  + declare -x MMM=$MESO/LMD_MM_MARS
  + cp LMD_MM_MARS.tar.gz $MESO
  + cd $MESO
    tar xzvf LMD_MM_MARS.tar.gz
  - cd $MOD/LMD_MM_MARS
  + cd $MESO/LMD_MM_MARS
    ./SRC/SCRIPTS/prepare ## or simply ./prepare if the script is in LMD_MM_MARS

line 90 (paragraph "Method 2: You were given a \ttt{svn} link \ttt{the\_link}"): "Please also set the environment variable \ttt{\$MOD} and \ttt{\$MMM}" becomes "... \ttt{\$MESO} and \ttt{\$MMM}", and the sentence starting "Compared to method~$1$..." is moved to its own source line; the footnote on using \ttt{svn export} instead of \ttt{svn checkout} is unchanged.

lines 97-98 (verbatim example for method 2):
  - declare -x MOD=$PWD
  - declare -x MMM=$MOD/LMD_MM_MARS
  + declare -x MESO=$PWD ## put absolute link in your .bashrc
  + declare -x MMM=$MESO/LMD_MM_MARS

line 109 (MPI paragraph): the example path for MPICH2 is no longer inside the model tree; \ttt{\$MOD/MPI/mpich2-1.0.8/bin} becomes \ttt{\$your\_software\_dir/MPI/mpich2-1.0.8/bin} (the requirement to define \ttt{\$WHERE\_MPI} is unchanged).

lines 114-122 (MPICH2 installation how-to):
  - mkdir $MOD/MPI
  - mv mpich2-1.0.8.tar.gz $MOD/MPI
  - cd $MOD/MPI
  + mkdir $your_software_dir/MPI
  + mv mpich2-1.0.8.tar.gz $your_software_dir/MPI/
  + cd $your_software_dir/MPI
    tar xzvf mpich2-1.0.8.tar.gz
    cd mpich2-1.0.8
    [...]
    # please wait...
    make > mk.log 2> mkerr.log &
  - declare -x WHERE_MPI=$MOD/MPI/mpich2-1.0.8/bin
  + declare -x WHERE_MPI=$your_software_dir/MPI/mpich2-1.0.8/bin
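Since the net effect of these edits is a rename of the model root variable and a relocation of the MPI example path, existing users mostly need to update their shell startup file. A minimal sketch of the corresponding ~/.bashrc lines, assuming hypothetical /disk/user/MODELS and /disk/user/SOFTWARE locations (the variable names are the ones introduced in r230):

  declare -x MESO=/disk/user/MODELS                              # model root, formerly $MOD
  declare -x MMM=$MESO/LMD_MM_MARS
  declare -x WHERE_MPI=/disk/user/SOFTWARE/MPI/mpich2-1.0.8/bin  # MPICH2 now lives outside the model tree
  declare -x PATH=$WHERE_MPI:$PATH                               # optional, as noted in the manual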
trunk/MESOSCALE_DEV/MANUAL/SRC/preproc.tex
Changes from r223 to r230 (line numbers are those of preproc.tex):

lines 16-22 (definition of the big-disk working directory): \ttt{\$MOD/TMPDIR} becomes \ttt{\$MESO/TMPDIR} in the paragraph and in the commands that create the \ttt{GCMINI}, \ttt{WPSFEED} and \ttt{WRFFEED} sub-directories:
  - ln -sf /bigdisk/user $MOD/TMPDIR
  - mkdir $MOD/TMPDIR/GCMINI
  - mkdir $MOD/TMPDIR/WPSFEED
  - mkdir $MOD/TMPDIR/WRFFEED
  + ln -sf /bigdisk/user $MESO/TMPDIR
  + mkdir $MESO/TMPDIR/GCMINI
  + mkdir $MESO/TMPDIR/WPSFEED
  + mkdir $MESO/TMPDIR/WRFFEED

line 29 (first command of the \ttt{prepare\_ini} step):
  - cd $MOD/LMD_MM_MARS/g95_32_single/ ## or any of your install directory
  + cd $MMM/g95_32_single/ ## or any of your install directory
  (the following lines, ln -sf ../SRC/SCRIPTS/prepare_ini . and ./prepare_ini, are unchanged)
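Because the preprocessing files can be large, it may help to create the three sub-directories in one go and check the free space on the target disk. A small bash sketch assuming the /bigdisk/user example path from the manual (the mkdir -p and df calls are additions, not part of the changeset):

  ln -sf /bigdisk/user $MESO/TMPDIR
  mkdir -p $MESO/TMPDIR/GCMINI $MESO/TMPDIR/WPSFEED $MESO/TMPDIR/WRFFEED
  df -h /bigdisk/user        # GCM output and met_em files can fill a small disk quickly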
lines 61-65 (static data and \ttt{build\_static}): the static datasets (topography, thermal inertia, albedo) are now documented as living in \ttt{\$MMM/WPS\_GEOG} instead of \ttt{\$MOD/LMD\_MM\_MARS/WPS\_GEOG}, the long paragraph is re-wrapped over two source lines, and the script is invoked through its repository path:
  - cd $MOD/LMD_MM_MARS
  - ./build_static
  + cd $MMM
  + ./SRC/SCRIPTS/build_static

lines 80-84 (compiling the LMD-MGCM): \ttt{\$MOD} becomes \ttt{\$MESO} for the directory in which the \ttt{LMDZ.MARS.meso.tar.gz} archive is extracted and in which the \ttt{compile} wrapper is found:
  - cd $MOD/LMDZ.MARS
  + cd $MESO/LMDZ.MARS
    ./compile

line 88 (start-files database): the paragraph on the \ttt{STARTBASE\_64\_48\_32\_t2} archive is merged back onto a single source line and the launch script is now referred to as \ttt{\$MESO/LMDZ.MARS/myGCM/launch\_gcm} (variable \ttt{startbase}); the advice to test with "echo 22 | launch_gcm" is unchanged.
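Collecting the r230 paths, the GCM-side preparation could be scripted roughly as follows. This is only a sketch: the /bigdisk/user location for the start-file database is a hypothetical choice, and fetching the archive with wget is an assumption; the compile, tar and echo 22 | launch_gcm steps are the ones quoted above:

  cd $MESO/LMDZ.MARS && ./compile                    # should produce newstart.e and gcm.e
  cd /bigdisk/user
  wget ftp://ftp.lmd.jussieu.fr/pub/aslmd/STARTBASE_64_48_32_t2.tar.gz
  tar xzvf STARTBASE_64_48_32_t2.tar.gz              # then set 'startbase' in $MESO/LMDZ.MARS/myGCM/launch_gcm
  cd $MESO/LMDZ.MARS/myGCM
  echo 22 | ./launch_gcm                             # quick check that the GCM chain runs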
line 116 (choice of the starting sol, step 1): the long paragraph is split over two source lines (new lines 116-117) and \ttt{\$MOD/LMDZ.MARS/myGCM/run.def} becomes \ttt{\$MESO/LMDZ.MARS/myGCM/run.def} (for both the \ttt{nday} and \ttt{ecritphy} settings); the sol-9 example, the advice to set \ttt{ecritphy} to $40$ or~$80$, and the footnote on default values for missing \ttt{diagfi.nc} fields are unchanged.

line 119 (launching the GCM run):
  - cd $MOD/LMDZ.MARS/myGCM
  + cd $MESO/LMDZ.MARS/myGCM
    ./launch_gcm ## answer: your desired starting sol for the simulations

line 129 (commented-out paragraph): the commented path %in \ttt{\$MOD/TMPDIR/GCMINI} becomes %in \ttt{\$MESO/TMPDIR/GCMINI}.

lines 137-140 (conversion with \ttt{PREP\_MARS}): the output directory that should contain the files named \ttt{LMD:} becomes \ttt{\$MESO/TMPDIR/WPSFEED}, and the conversion is started from the install directory via \ttt{\$MMM}:
  - cd $MOD/LMD_MM_MARS/your_install_dir/PREP\_MARS
  + cd $MMM/your_install_dir/PREP\_MARS
    echo 1 | ./create_readmeteo.exe # drop the "echo 1 |" if you want control
    ./readmeteo.exe < readmeteo.def

line 145 (new): a developer reminder is added after the verbatim block:
  + %%% compile_and_exec ????

line 180 -> 182 (paragraph "Step 2b"): the directory that should contain the \ttt{met\_em.*} files written by \ttt{metgrid.exe} becomes \ttt{\$MESO/TMPDIR/WRFFEED} (was \ttt{\$MOD/TMPDIR/WRFFEED}); the rest of the paragraph (horizontal interpolation analogous to \ttt{geogrid.exe}, options in \ttt{metgrid/METGRID.TBL}, collection of the static fields from the \ttt{geo\_em} file(s)) is unchanged.
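Put together, step 1 with the r230 paths amounts to the short bash sequence below. This is a sketch only: your_install_dir stands for the compilation directory created earlier (e.g. g95_32_single), and the final ls is a sanity check added here, not part of the manual:

  cd $MESO/LMDZ.MARS/myGCM
  ./launch_gcm                               # answer: the desired starting sol
  cd $MMM/your_install_dir/PREP_MARS
  echo 1 | ./create_readmeteo.exe            # drop "echo 1 |" to keep interactive control
  ./readmeteo.exe < readmeteo.def
  ls $MESO/TMPDIR/WPSFEED                    # should now contain the LMD:* files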
lines 195-196 -> 197-198 (running \ttt{real.exe}):
  - cd $MOD/TESTCASE ## or anywhere you would like to run the simulation
  - ln -sf $MOD/TMPDIR/WRFFEED/met_em* .
  + cd $MESO/TESTCASE ## or anywhere you would like to run the simulation
  + ln -sf $MESO/TMPDIR/WRFFEED/met_em* .
    ./real.exe
  The following sentence is unchanged: the final message of \ttt{real.exe} should claim the success of the processes, after which the mesoscale integrations can be launched again with \ttt{wrf.exe} as in section~\ref{sc:arsia}.

line 205 (new): a \sk spacing command is inserted before the closing \begin{finger} warning, which itself is unchanged (keep the common parameters of \ttt{namelist.wps} and \ttt{namelist.input} exactly identical, especially for nested simulations, and use the same dates for \ttt{launch\_gcm} and both namelists, otherwise \ttt{real.exe} or \ttt{wrf.exe} will exit with an error message).
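The last stage shown above, with the r230 variable names, reduces to the following bash sketch (the TESTCASE run directory is the example used in the manual; as the comment in the diff notes, any run directory works):

  cd $MESO/TESTCASE                          # or anywhere you would like to run the simulation
  ln -sf $MESO/TMPDIR/WRFFEED/met_em* .      # link the met_em.* files produced by metgrid.exe
  ./real.exe                                 # produces the initial and boundary conditions
  ./wrf.exe                                  # then launch the mesoscale integrations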
trunk/MESOSCALE_DEV/MANUAL/SRC/user_manual_txt.tex
Changes from r223 to r230 (line numbers are those of user_manual_txt.tex): two placeholder notes are added to the otherwise empty chapter "Starting simulations from scratch: a summary". After the \chapter line, the new lines 8-9 read "A quick guide to full execution" and "(a list of commands)"; after the "To be completed" remark, a new line 22 adds the developer reminder "parler de xeyes" (French for "talk about xeyes").