1\chapter{What is the LMD Martian Mesoscale Model?}
2
3\mk
\paragraph{Welcome!} The purpose of this introduction is to describe the Martian mesoscale model developed at the Laboratoire de M\'et\'eorologie Dynamique (LMD). This chapter comprises excerpts from \textit{Spiga and Forget} [2009]\nocite{Spig:09} dedicated to the technical description of the LMD Martian Mesoscale Model, and serves as an introduction to the model, its design and its capabilities. Further details can be found in the reference paper \textit{Spiga and Forget} [2009]\nocite{Spig:09} and in subsequent papers about mesoscale applications: e.g., \textit{Spiga and Lewis} [2010]\nocite{Spig:10dust} and \textit{Spiga et al.} [2011]\nocite{Spig:11ti}. An introduction to Large-Eddy Simulations can be found in \textit{Spiga et al.} [2010]\nocite{Spig:10bl}.
5
\paragraph{Important} Please cite the reference paper \textit{Spiga and Forget} [2009]\nocite{Spig:09} if you'd like to refer to the LMD Martian Mesoscale Model in one of your publications. If your paper makes use of simulations carried out with the LMD Martian Mesoscale Model, please consider including A. Spiga as a co-author of your work (and asking for help with writing the part related to mesoscale modeling). If you have specific simulations in mind and wonder whether they can be performed with the LMD Martian Mesoscale Model, please do not hesitate to ask.
7
8\mk
9\section{Dynamical core}
10
11\sk
The numerical integration of the atmospheric fluid dynamic equations is performed in meteorological models by the dynamical core. The LMD Martian Mesoscale Model dynamical core is based on the stable, carefully tested and fully parallelized Advanced Research Weather Research and Forecasting model (hereinafter referred to as ARW-WRF) [\textit{Skamarock et al.}, 2005, 2008\nocite{Skam:08}\nocite{Skam:05}], developed for terrestrial applications at NCEP/NCAR (version 2.2.1 - November 2007).
13
14\sk
The ARW-WRF mesoscale model integrates the fully compressible non-hydrostatic Navier-Stokes equations in a specific area of interest on the planet. Since mesoscale models can be employed to resolve meteorological motions at scales of less than a few kilometers, at which the vertical wind acceleration might become comparable to the acceleration of gravity, hydrostatic balance cannot be assumed, as is usually done in GCMs.
16
17\sk
18Mass, momentum, entropy, and tracer conservation are ensured by an explicitly conservative flux-form formulation of the fundamental equations, based on mass-coupled meteorological variables (winds, potential temperature, tracers). Alternatively, these variables are recast into a reference profile plus a perturbation to reduce truncation errors [\textit{Skamarock et al.}, 2008]\nocite{Skam:08}. Tracer transport can be computed by an additional forward-in-time scheme based on the Piecewise Parabolic Method [\textit{Carpenter et al.}, 1990]\nocite{Carp:90}, with positive definite and monotonic properties
19[\textit{Skamarock et al.}, 2006]\nocite{Skam:06}.
20
21\sk
22In the vertical dimension, the equations are projected, as suggested by \textit{Laprise} [1992]\nocite{Lapr:92}, on terrain-following mass-based coordinates (``eta levels"): $\eta = (\pi-\pi_t) / (\pi_s-\pi_t)$ where $\pi$ is the hydrostatic component of the pressure, $\pi_s$ the value at the surface and $\pi_t$ the (constant) upper boundary value. As shown in \textit{Laprise} [1992]\nocite{Lapr:92} and \textit{Janjic et al.} [2001]\nocite{Janj:01}, the choice of such vertical coordinates enables the integration of the ARW-WRF equations either in full non-hydrostatic mode or under the hydrostatic assumption. At the top of the domain, a free relaxation condition to zero vertical velocity is imposed (gravity wave absorbing layers can be defined as well).
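\sk
As a simple illustration of this vertical coordinate (the values below are typical orders of magnitude, chosen here only for illustration and not taken from the reference paper): with a surface pressure $\pi_s = 610$~Pa and a model top at $\pi_t = 2$~Pa, a level where the hydrostatic pressure is $\pi = 300$~Pa corresponds to $\eta = (300-2)/(610-2) \simeq 0.49$, so that $\eta$ decreases monotonically from $1$ at the surface to $0$ at the model top.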
23
24\sk
In the horizontal dimension, the dynamical solver is available with three possible projections on the planetary sphere: Mercator (suitable for equatorial regions), Lambert Conformal (for mid-latitudes), and Polar Stereographic (for high latitudes). Projections are defined by map scale factors, ensuring a regular computational grid whatever the map projection. Polar simulations are therefore devoid of any pole singularity, a usual drawback of GCMs that requires additional filtering. The spatial discretization is an Arakawa C-grid, where normal velocities are staggered one-half grid length from the thermodynamic variables [\textit{Arakawa}, 1966]\nocite{Arak:66}.
26
27\sk
In the temporal dimension, a third-order Runge-Kutta integration scheme is employed for improved numerical accuracy and stability: the maximum stable Courant-Friedrichs-Lewy (CFL) numbers for advection are increased by a factor of two compared to the regular leapfrog integration scheme [\textit{Skamarock et al.}, 2008]. A time-splitting integration technique is implemented to prevent the meteorologically insignificant acoustic motions from triggering numerical instabilities [\textit{Klemp et al.}, 2007]\nocite{Klem:07}. Additional filters for external and internal acoustic modes damp residual instabilities possibly arising in the acoustic step integration.
29
30\sk
31In the ARW-WRF Runge-Kutta time-integration scheme, while pressure gradient and divergence terms are simply second order and centered, spatial discretizations of the advection terms for momentum, scalars and geopotential are 2nd through 6th order accurate [\textit{Wicker and Skamarock}, 2002]\nocite{Wick:02}. Martian simulations are performed with a 5th order discretized advection. One peculiarity of the odd-order advection discretization is the inherent inclusion of a dissipation term [\textit{Hundsdorfer et al.}, 1995]\nocite{Hund:95} with a coefficient proportional to the Courant number.
32
33\sk
However, as was pointed out by \textit{Knievel et al.} [2007]\nocite{Knie:07}, this odd-ordered implicit scheme is not diffusive enough in low-wind or neutral/unstable stratification conditions, and numerical noise in the wind fields might reach amplitudes comparable to the simulated winds. Such noise was found to be significant in the Martian case under near-surface afternoon superadiabatic conditions. The standard Martian simulations thus include the additional 6th order diffusion scheme developed by \textit{Knievel et al.}, with a removal parameter set for Martian applications to $20\%$ of the $2\,\Delta x$ noise in one timestep. While reducing the numerical noise near the surface to almost indiscernible amplitudes, the additional Knievel diffusion has little effect on the simulated meteorological fields.
35
36\sk
Particular adaptations were required to use the ARW-WRF dynamical solver in the Martian environment. Physical constants, such as the acceleration of gravity and the planetary rotation rate, were converted to their Martian values. Vegetation and ocean-related variables were not used, and were replaced with variables more suitable for Martian applications (e.g., thermal inertia). Martian dates are given by the areocentric solar longitude $L_s$, which indicates the position of Mars with respect to the Sun (0, 90, 180, 270 degrees are, respectively, the beginning of the northern hemisphere spring, summer, fall and winter). The terrestrial calendar was thus replaced with the LMD-GCM Martian calendar built on 669 Martian sols split into 12 ``areocentric longitude"-based months (each of them is $L_s=30^{\circ}$ long, and thus encloses an irregular number of Martian sols due to the high eccentricity of the orbit), and one hour was defined as $1/24$ sol.
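\sk
To give the corresponding orders of magnitude (based on the standard duration of the Martian sol, about 24~h 39~min 35~s, not on a model-specific value): one sol lasts $\simeq 88\,775$~s, so one Martian ``hour", defined as $1/24$ sol, lasts $\simeq 3\,699$~s, i.e. slightly longer than a terrestrial hour.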
38
39\mk
40\section{Martian physics}
41
42\sk
43In any meteorological model, the 3D dynamical core is coupled with parameterization schemes (most often 1D) to compute at each grid point of the simulation domain the particular physics of the considered planetary environment: diabatic forcing of the atmospheric circulation (radiative transfer, soil thermal diffusion); sub-grid scale dynamical parameterizations (Planetary Boundary Layer [PBL] diffusion and mixing, convective adjustment); tracer sources and sinks (microphysical processes, chemistry, dust sedimentation and lifting). The LMD-MGCM complete physical parameterizations are interfaced with the adapted ARW-WRF dynamical core, described in the previous section, by a new ``driver" that is built on the same principles as the ARW-WRF terrestrial parameterization schemes, which are all switched off for the Martian applications. Thus, the LMD Martian Mesoscale Model shares the same comprehensive physical parameterizations as the LMD-MGCM, in order to simulate the Martian dust, CO$_2$, H$_2$O and photochemistry cycles [\textit{Forget et al.}, 1999; \textit{Montmessin et al.}, 2004; \textit{Lefevre et al.}, 2004].
44
45\sk
46\subsection{Physical parameterizations}
47
48\sk
49The radiative transfer in the model accounts for CO$_2$ gas infrared absorption/emission [\textit{Hourdin et al.}, 1992]\nocite{Hour:92} and visible and infrared dust absorption, emission and diffusion [\textit{Forget et al.}, 1998, 1999]\nocite{Forg:98grl}. Description of the CO$_2$ condensation processes in the model can be found in \textit{Forget et al.} [1998b]\nocite{Forg:98}. Thermal conduction in the soil is simulated by the 11-layer soil model developed by \textit{Hourdin et al.} [1993]\nocite{Hour:93} for Mars (soil density and soil specific heat capacity are set as constants). Turbulent closure is based on turbulent viscosity with coefficients calculated from the ``$2.5$-order" scheme by \textit{Mellor and Yamada} [1982]\nocite{Mell:82}, improved by \textit{Galperin et al.} [1988]\nocite{Galp:88}. In the case where vertical mixing is handled in the independent 1D physical packages, the native vertical mixing schemes in the ARW-WRF dynamical core are switched off, and the most appropriate choice for explicit horizontal diffusion is the built-in ARW-WRF scheme based on horizontal deformation [\textit{Smagorinsky}, 1963]\nocite{Smag:63}.
50
51\sk
Recent improvements on the radiative transfer computations [\textit{Dufresne et al.}, 2005]\nocite{Dufr:05}, on the slope irradiance estimations [\textit{Spiga and Forget}, 2008]\nocite{Spig:08grl}, on the dust lifting and sedimentation [\textit{Forget et al.}, 1999b\nocite{Forg:99icm5}; \textit{Newman et al.}, 2002]\nocite{Newm:02a}, on the water cycle and water ice clouds [\textit{Montmessin et al.}, 2004]\nocite{Mont:04}, and on the photochemical species [\textit{Lefevre et al.}, 2004]\nocite{Lefe:04}, particularly ozone [\textit{Lefevre et al.}, 2008]\nocite{Lefe:08}, are also natively included in the LMD Martian Mesoscale Model. The non-local thermodynamic equilibrium (NLTE) parameterizations for thermosphere applications [\textit{Gonz\'alez-Galindo et al.}, 2005\nocite{Gonz:05}], as well as estimations of the atmospheric exchanges with the Martian regolith [\textit{B\"ottger et al.}, 2005]\nocite{Bott:05}, are also available in the model.
53
54%\sk
55%Upcoming improvements of the LMD-MGCM physics [\textit{Forget et al.}, 2007]\nocite{Forg:07emsec}, following the recent measurements by instruments onboard Mars Express (MEx) and MRO, will be included in the LMD Martian Mesoscale Model too. Examples of future parameterizations that will be added in both models are the radiative effects of water ice clouds, which could significantly modify the atmospheric temperatures [\textit{Wilson et al.}, 2007]\nocite{Wils:07}, and the new dust radiative properties derived from recent measurements by the OMEGA instrument onboard MEx [\textit{M\"a\"att\"anen et al.}, 2008]\nocite{Maat:08} and the CRISM instrument onboard MRO [\textit{M.~J. Wolff and M. Vincendon}, personal communication, 2008].
56
57\sk
58Two physical parameterizations of the LMD-MGCM, specifically designed for synoptic-scale meteorological applications, are not used in the mesoscale applications.
59
60\sk
61Firstly, in the mesoscale domain, the topographical field is described with horizontal resolutions from tens of kilometers to hundreds of meters. The \textit{Lott and Miller} [1997]\nocite{Lott:97} subgrid-scale topographical drag parameterization and the \textit{Miller et al.} [1989]\nocite{Mill:89} gravity-wave drag scheme can thus be switched off, as the topographical influence on the atmospheric flow is computed by the dynamical core at the chosen mesoscale resolutions.
62
63\sk
Secondly, in order to ensure numerical stability, and to account for subgrid-scale mixing processes insufficiently handled in the PBL scheme, it is usually necessary to modify any unstable layer with negative potential temperature gradients (a usual near-surface situation during Martian afternoons) into a neutral equivalent [\textit{Hourdin et al.}, 1993]. As pointed out by \textit{Rafkin} [2003b]\nocite{Rafk:03adj}, the use of such an artificial convective adjustment scheme might be questionable in Martian atmospheric models, be they GCMs or mesoscale models. Since numerical stability is ensured in the LMD Martian Mesoscale Model by choosing the appropriate dynamical timestep with respect to the CFL condition, and by using the aforementioned ARW-WRF nominal filters and diffusion schemes, the convective adjustment scheme used in the LMD-MGCM can be switched off in the LMD Martian Mesoscale Model.
65
66\mk
67\subsection{Physical timestep}
68
69\sk
Invoking the physical packages frequently relative to the dynamical computations was found to be necessary to accurately account for near-surface friction effects where the wind acceleration is particularly high, typically in regions of strong Martian topographically-driven circulation. In such areas, if the ratio between the physical timestep and the dynamical timestep is above $\sim 5$, the model predicts winds that spuriously increase with the chosen ratio and vary with the horizontal resolution. Conversely, if this ratio is less than $\sim 5$, the simulated winds vary significantly neither with the chosen ratio nor with the horizontal resolution.
71
72\sk
A ratio equal to 1 is chosen in the standard LMD Martian Mesoscale Model simulations. This choice conforms to the strategy adopted in the terrestrial ARW-WRF model. Besides, computing the physical parameterizations at the same frequency as the dynamical integration benefits some of them, such as the formation of clouds (which is sensitive to rapid temperature changes). Note that radiative transfer computations are usually carried out less often to save computational time.
74
75\sk
When the ratio between the physical timestep and the dynamical timestep is greater than 1, two distinct strategies could be adopted. Interestingly, we found that splitting the physical tendency in equal parts and blending it with the dynamical tendency at each dynamical timestep computation is slightly more stable (i.e., it allows for larger dynamical timesteps) than applying the whole physical tendency when the physical parameterizations are computed, and letting the dynamical core naturally evolve until the next physics call. However, an analysis of the simulated meteorological fields in both cases does not reveal significant differences.
77
78\mk
79\section{Initial and boundary conditions}
80\label{ssc:inibdy}
81
82\mk
83\subsection{Starting state and horizontal boundaries}
84
85\sk
86Mesoscale simulations can be performed in a limited domain anywhere on the planet. Thus, boundary conditions for the main meteorological fields (horizontal winds, temperature, tracers) have to be provided during the simulations, in addition to an atmospheric starting state. Idealized simulations usually require the use of periodic, symmetric or open boundary conditions, whereas real-case simulations need specified climatologies at the boundaries.
87
88\sk
89The specified boundary conditions and the atmospheric starting state are derived from previously performed $64\times48\times25$ (i.e., horizontal resolution of $5.625^{\circ}$ in longitude and $3.75^{\circ}$ in latitude, model top $\sim$~80~km~altitude) LMD-MGCM simulations which have reached equilibrium, typically after $\sim 10$ simulated years. GCM results are often used every Martian hour to constrain the mesoscale model at the domain boundaries. Temporal interpolations to each mesoscale timestep and spatial interpolations on the mesoscale domain are performed from the LMD-MGCM inputs. A relaxation zone of a given width (user-defined, usually 5 grid points) is implemented at the boundaries of the ARW-WRF domain to enable both the influence of the large-scale fields on the limited area, and the development of the specific mesoscale circulation inside the domain. The interpolations and the use of a relaxation zone prevent the prescribed meteorological fields at the lateral boundaries from having sharp gradients and from triggering spurious waves or numerical instabilities (the situation where the relaxation zone crosses steep topographical gradients should however be avoided).
90
91\mk
92\subsection{Nesting or single-domain strategy ?}
93\label{ssc:nestingvalid}
94
95\sk
The model includes one-way and two-way (or ``feedback") nesting capabilities. The nested simulations feature two kinds of domains where the meteorological fields are computed: the ``parent" domain, with a large geographical extent, a coarse grid resolution, and specified boundary conditions, and the ``nested" domains, centered on a particular zone of interest, with a finer grid resolution, and boundary conditions provided by their parent domain.
97
98\sk
99The nesting capabilities can be used only if deemed necessary, and single-domain simulations may be the primary type of run performed.
100
101\sk
Firstly, employing the same physical parameterizations in the mesoscale model computations and in the GCM simulations defining the boundary and initial conditions ensures a very consistent meteorological forcing at the boundaries of the mesoscale domain. This was confirmed by further examination of the performed simulations: mesoscale predictions do not depart unrealistically from the LMD-MGCM fields prescribed at the boundaries, and the mesoscale influence naturally adds to the synoptic (large-scale) tendency communicated at the boundaries.
103
104\sk
Secondly, the single-domain approach is appropriate as long as the variations of near-surface winds, pressure and temperature induced by ``passing" thermal tides through the east-west boundaries are not unrealistic. This criterion is specific to Martian mesoscale modeling and was described by \textit{Tyler et al.} [2002]. In the various simulations performed with the LMD Martian Mesoscale Model, a likely spurious influence of the passing thermal tides was only detected in the near-surface meteorological fields calculated at the $\sim 5$ grid points closest to the boundaries. The amplitudes of the departures were negligible ($\delta T \apprle 3$~K; $\delta u, \delta v \apprle 5\%$) and did not require the use of domains nested inside one semi-hemispheric parent domain [\textit{Tyler et al.}, 2002]. However, the analysis of the simulated fields at the near-boundary grid points should be carried out with caution when choosing the single-domain approach. A practical solution to this drawback is to define a large domain, centered on the chosen area of interest, with a sufficient number of grid points ($75 \times 75$ being a minimal requirement).
106
107\sk
108Thirdly, \textit{Dimitrijevic and Laprise} [2005]\nocite{Dimi:05} showed, by the so-called ``Big Brother" approach, that the single-domain approach yields unbiased results when the boundary forcing involves a minimum of $\sim 8-10$ GCM grid points. Thus, given the resolution of the GCM fields used to constrain the LMD Martian Mesoscale Model, single-domain simulations with, for instance, a horizontal resolution of $20$~km shall be performed on at least $133 \times 88$ grid points. \textit{Antic et al.} [2006]\nocite{Anti:06} found that the ``$8-10$ grid points" limit can be lowered in situations of complex topography, because the dynamical influence of these mesoscale features is responsible for the larger part of the mesoscale circulation in the domain. Such situations are rather common on Mars, and the aforementioned ``minimal" grid can be of slightly smaller horizontal extent in areas such as Olympus Mons or Valles Marineris.
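\sk
The ``$133 \times 88$" figure quoted above can be recovered with a back-of-the-envelope estimate (the numbers below are approximate equatorial values, given only for illustration): one degree of longitude or latitude corresponds to roughly $59$~km on Mars, so the $5.625^{\circ} \times 3.75^{\circ}$ GCM cells are about $333 \times 222$~km wide, and forcing over at least $8$ GCM points at a $20$~km mesoscale resolution requires about $8 \times 333 / 20 \simeq 133$ grid points in longitude and $8 \times 222 / 20 \simeq 89$ grid points in latitude, consistent with the figures quoted above.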
109
110\sk
111Thus the sizes of the simulation grids have to be chosen in order to ensure the applicability of the single-domain approach. The nesting technique is used only when defining a single domain with sufficient geographical extent would have required too many grid points to handle the computations within reasonable CPU time. For instance, with ``$64 \times 48$" GCM simulations as boundary conditions, the use of the single-domain strategy to model the Arsia Mons circulation at $5$ km resolution imposes a simulation grid of at least $531 \times 354$ points. The nesting technique is more suitable for this kind of simulation.
112
113\mk
114\subsection{Surface fields}
115
116\sk
Surface static data intended for the mesoscale domain are extracted from maps derived from recent spacecraft measurements: 64 pixel-per-degree (ppd) MOLA topography [\textit{Smith et al.}, 2001]\nocite{Smit:01mola}, 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo [\textit{Christensen et al.}, 2001]\nocite{Chri:01}, 20 ppd TES thermal inertia [\textit{Putzig and Mellon}, 2007]\nocite{Putz:07}. A smoother composite thermal inertia map derived from \textit{Palluconi and Kieffer} [1981]\nocite{Pall:81}, \textit{Mellon et al.} [2000]\nocite{Mell:00} and \textit{Vasavada et al.} [2000]\nocite{Vasa:00} can alternatively be used for better continuity with LMD-MGCM simulations. Except for CO$_2$ ice covered areas, emissivity is set to $0.95$. The roughness length $z_0$ is set to the constant value of $1$~cm, but future versions of the model will use spatially-varying $z_0$ [\textit{H\'ebrard et al.}, 2007]\nocite{Hebr:07}. Initial values for time-varying surface data, such as CO$_2$ and H$_2$O ice on the surface and soil temperatures, are derived from the GCM simulations. The latter initialization reduces the spin-up time for surface temperature to roughly one simulated sol.
118
119\sk
The LMD Martian Mesoscale Model has the complete ability to simulate the dust cycle (lifting, sedimentation, transport). However, the high sensitivity of the results to the assumptions made on threshold wind stress and injection rate [\textit{Basu et al.}, 2004]\nocite{Basu:04} leads us to postpone these issues to future studies. Instead, similarly to the reference LMD-MGCM simulations, dust opacities are prescribed in the mesoscale model from 1999-2001 TES measurements, thought to be representative of Martian atmospheric conditions outside of planet-encircling dust storm events [\textit{Montabone et al.}, 2006]\nocite{Mont:06luca}. In the vertical dimension, as described in \textit{Forget et al.} [1999], and in accordance with the general consensus of well-mixed dust in equilibrium with sedimentation and mixing processes [\textit{Conrath}, 1975]\nocite{Conr:75}, the dust mixing ratio is kept constant from the surface up to a given elevation $z_{\textrm{\tiny{max}}}$ above which it rapidly declines. Both in the nominal GCM and mesoscale simulations, $z_{\textrm{\tiny{max}}}$ as a function of areocentric longitude and latitude is calculated from the ``MGS scenario" [\textit{Forget et al.}, 2003]\nocite{Forg:03}.
121
122\mk
123\subsection{Vertical interpolation}
124
125\sk
In the process of initialization and definition of boundary conditions, the vertical interpolation of GCM meteorological fields to the terrain-following mesoscale levels must be treated with caution. When deriving the near-surface meteorological fields from GCM inputs, one has to address the problem of topographical structures that appear at fine mesoscale horizontal resolution, e.g., a deep crater that is not resolved in the coarse GCM grid.
127
128\sk
129A crude extrapolation of the near-surface GCM fields to the mesoscale levels is usually acceptable for terrestrial applications. On Mars, owing to the low density and heat capacity of the Martian atmosphere, the surface temperature is to first order controlled by radiative equilibrium, and thus it is left relatively unaffected by variations of topography [e.g. \textit{Nayvelt et al.}, 1997]\nocite{Nayv:97}. A practical consequence, which renders an extrapolation strategy particularly wrong on Mars, is that the near-surface temperature and wind fields vary much more with the distance from the surface than with the absolute altitude above the areoid (or equivalently with the pressure level). Initial tests carried out with the extrapolation strategy showed that differences between temperatures at the boundaries and temperatures computed within the mesoscale domain close to these boundaries often reach $20-30$~K near the surface. An interpolation based only on terrain-following principles solves this problem near the surface but was found to lead to numerical instabilities at higher altitudes during the mesoscale integrations.
130
131\sk
132Therefore, input meteorological data need to be recast on intermediate pressure levels $P'$ with a low level smooth transition from terrain-following levels (for the near-surface environment) to constant pressure levels (for the free atmosphere at higher altitude). We thus have $P'(x,y)=\alpha + \beta \, P_s(x,y)$, $P_s$ being the surface pressure at the resolution of the GCM simulations. To ensure a realistic low-level transition, the technique described in \textit{Millour et al.} [2008]\nocite{Mill:08ddd}, based on high-resolution GCM results, is employed to calculate the $P'$ levels. The mesoscale surface pressure field $p_s$ is an input parameter of the method, since the near-surface adiabatic cooling over mountains and warming within craters are taken into account. Note that $p_s(x,y)$ is calculated from $P_s(x,y)$ on the basis of the high-resolution topography of the mesoscale domain $z(x,y)$ by $$p_s(x,y) = P_s(x,y) \, e^{ \frac{g \, [Z(x,y)-z(x,y)]}{R \, T(x,y)} }$$ \noindent where $Z(x,y)$ is the topography at the resolution of the GCM simulations, $R$ the gas law constant, $g$ the acceleration of gravity, and $T(x,y)$ the temperature predicted by the GCM $1$~km above the surface (see \textit{Spiga et al.} [2007]\nocite{Spig:07omeg}). Without reinterpolating the data, the intermediate pressure $P'$ levels are then simply converted into their mesoscale counterparts $p'$ by substituting $p_s$ for $P_s$ in the formula $P'(x,y)=\alpha + \beta \, P_s(x,y)$. Finally, the built-in ARW-WRF vertical interpolation onto the final mesoscale terrain-following levels can be performed, as the problem of extrapolation is solved by the use of the intermediate pressure levels $p'$.
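\sk
As an illustrative order of magnitude (the numbers below are chosen for illustration only and are not taken from the reference paper): for a crater floor lying $4$~km below the GCM topography ($Z - z = 4$~km), with $g \simeq 3.72$~m~s$^{-2}$, $R \simeq 192$~J~kg$^{-1}$~K$^{-1}$ and $T \simeq 200$~K, the correction factor is $e^{\,g\,[Z-z]/(R\,T)} \simeq e^{0.39} \simeq 1.5$, i.e. the mesoscale surface pressure at the bottom of the crater is about $50\%$ larger than the GCM surface pressure interpolated at that location.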
133
134\sk
The initial atmospheric state obtained through this ``hybrid" method ensures low-amplitude adjustments of the meteorological fields by the mesoscale model at the beginning of the performed simulations (i.e., in the first thousands of seconds). Furthermore, the continuity between the large-scale forcing and the mesoscale computations near the limits of the domain, as well as the numerical stability of the simulations, appear significantly improved compared to methods based either on extrapolation (especially in areas of uneven terrain) or on terrain-following interpolation.
136
137%\pagebreak
138\includepdf[pages=1,offset=25mm -20mm]{meso.pdf}
139\clearemptydoublepage
140
141\chapter{First steps toward running the model}
142
143\mk
This chapter is meant for first-time users of the LMD Martian Mesoscale Model.
145%
146We describe how to install the model on your system, compile the program and run a test case.
147%
148Experience with either the terrestrial WRF mesoscale model or the LMD Martian GCM is not absolutely required,
although it will help you get through the installation process more easily.
150
151\mk
152\section{Prerequisites}
153
154\mk
155\subsection{General requirements}
156
157\mk
158In order to install the LMD Martian Mesoscale Model, please ensure that:
159\begin{citemize}
160\item your computer is connected to the internet;
161\item your OS is Linux\footnote{
162%%%%%%%%%%%%%%
163The model was also successfully compiled on MacOSX;
164``howto" information is available upon request.
165%%%%%%%%%%%%%%
} with a decent set of basic commands (\ttt{sed}, \ttt{awk}, \ldots);
167\item your Fortran compiler is the PGI commercial compiler \ttt{pgf90} or the GNU
168free compiler\footnote{
169%%%%%%%%%%%%%%
170Sources and binaries available on \url{http://www.g95.org}
171%%%%%%%%%%%%%%
172} \ttt{g95};
173\item your C compiler is \ttt{gcc} and C development libraries are included;
174\item \ttt{bash}, \ttt{m4} and \ttt{perl} are installed on your computer;
175\item \ttt{NETCDF} libraries have been compiled \emph{on your system}.
176\end{citemize} 
177%
178\begin{finger}
179\item You might also find useful -- though not mandatory -- to install on your system:
180\begin{citemize}
181\item the \ttt{ncview} utility\footnote{
182%%%%%%
183\url{http://meteora.ucsd.edu/~pierce/ncview\_home\_page.html}
184%%%%%%
185}, which is a nice tool to visualize the contents of a NETCDF file;
186\item the \ttt{IDL} demo version\footnote{
187%%%%%%
188\url{http://www.ittvis.com/ProductServices/IDL.aspx}
189%%%%%%
190}, which is used by the plot utilities provided with the model.
191\end{citemize} 
192\end{finger}
193
194\mk
195\marge Three environment variables associated with the \ttt{NETCDF} libraries must be defined:
196\begin{verbatim}
197declare -x NETCDF=/disk/user/netcdf 
198declare -x NCDFLIB=$NETCDF/lib       
199declare -x NCDFINC=$NETCDF/inc       
200\end{verbatim}
201
202\begin{finger}
203\item All command lines in the document are proposed in \ttt{bash}.
204\end{finger}
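\mk
\marge A quick way to check that these variables point to a working installation is sketched below (the file names assume a classical \ttt{NETCDF} build with the Fortran interface; adapt them if your installation differs):
\begin{verbatim}
ls $NCDFLIB/libnetcdf.a     # the compiled NETCDF library
ls $NCDFINC/netcdf.inc      # the Fortran include file
\end{verbatim}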
205
206%%[csh] setenv NETCDF /disk/user/netcdf
207%%[csh] setenv NCDFLIB $NETCDF/lib
208%%[csh] setenv NCDFINC $NETCDF/inc
209
210\mk
211\marge You also need the environment variable \ttt{\$LMDMOD} to point
212at the directory where you will install the model (e.g. \ttt{/disk/user/MODELS}):
213\begin{verbatim}
214declare -x LMDMOD=/disk/user/MODELS 
215\end{verbatim}
216%[csh] setenv LMDMOD /disk/user/MODELS
217%
218\begin{finger}
\item Please check that $\sim 200$~MB of free disk space is available in \ttt{/disk}.
220\end{finger}
221
222\mk
223\subsection{Parallel computations}
224
225\mk
226\marge Parallel computations with the Message Passing Interface (MPI) standard are supported by
227the ARW-WRF mesoscale model.
228%
229If you want to use this capability in the LMD Martian Mesoscale Model,
you will need to install MPICH2 as an additional prerequisite.
231
232\mk
\marge Please download the current stable version of the sources
(e.g. \ttt{mpich2-1.0.8.tar.gz}) from the MPICH2 website
\url{http://www.mcs.anl.gov/research/projects/mpich2}
and install the MPICH2 utilities with the following commands:
237%
238\begin{verbatim}
239mkdir $LMDMOD/MPI
240mv mpich2-1.0.8.tar.gz $LMDMOD/MPI
241cd $LMDMOD/MPI
242tar xzvf mpich2-1.0.8.tar.gz
243cd mpich2-1.0.8
244./configure --prefix=$PWD --with-device=ch3:nemesis > conf.log 2> conferr.log &
245# please wait...
246make > mk.log 2> mkerr.log &
247declare -x WHERE_MPI=$LMDMOD/MPI/mpich2-1.0.8/bin
248\end{verbatim}
249%
250\begin{finger}
251\item Even if you add the \ttt{\$LMDMOD/MPI/mpich2-1.0.8/bin}
252directory to your \ttt{\$PATH} variable, defining the environment
253variable \ttt{\$WHERE\_MPI} is still required
254to ensure a successful compilation of the model.
255\end{finger}
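\mk
\marge Once the build has finished, a quick sanity check is to make sure that the MPI compiler wrapper and launcher are present in the \ttt{bin} directory (if it is empty at this stage, the executables may still need to be copied to the \ttt{--prefix} location with \ttt{make install}):
\begin{verbatim}
ls $WHERE_MPI/mpif90 $WHERE_MPI/mpirun
$WHERE_MPI/mpif90 -show     # displays the underlying compiler and flags
\end{verbatim}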
256
257\mk
258\subsection{Compiling the terrestrial WRF model}
259
260\mk
261The LMD Martian Mesoscale Model is based on the terrestrial NCEP/NCAR ARW-WRF Mesoscale Model.
262%
263As a first step towards the compilation of the Martian version, we advise you to check that the terrestrial
264model compiles on your computer with either \ttt{g95} or \ttt{pgf90}.
265
266\mk
267\marge On the ARW-WRF website \url{http://www.mmm.ucar.edu/wrf/users/download/get\_source.html}, you will be allowed
268to freely download the model after a quick registration process (click on ``New users").
269%
270Make sure to download the version 2.2 of the WRF model and copy the
271\ttt{WRFV2.2.TAR.gz} archive to the \ttt{\$LMDMOD} folder.
272
273\mk
274\marge Then please extract the model sources and configure the compilation process:
275\begin{verbatim}
276cd $LMDMOD
277tar xzvf WRFV2.2.TAR.gz
278cd WRFV2
279./configure
280\end{verbatim}
281
282\mk
\marge The \ttt{configure} script analyzes your architecture
and proposes several possible compilation options.
%
Make sure to choose the ``single-threaded, no nesting"
option related to either \ttt{g95} (should be option $13$ on a $32$-bit Linux PC)
or \ttt{pgf90} (should be option $1$ on a $32$-bit Linux PC).
289
290\mk
291\marge The next step is to compile the WRF model by choosing the kind of
292simulations you would like to run.
293%
A simple and direct test consists in compiling
the idealized case of a 2D flow impinging on a small hill:
296\begin{verbatim}
297./compile em_hill2d_x > log_compile 2> log_error &
298\end{verbatim}
299%
300\begin{finger}
301\item In case you encounter problems compiling the ARW-WRF model,
302please read documentation on the website
303\url{http://www.mmm.ucar.edu/wrf/users},
304contact the WRF helpdesk or search the web for your error message.
305\end{finger}%\pagebreak
306
307\mk
\marge If the compilation was successful
(the file \ttt{log\_error} should be empty
or only report a few warnings), you should find
in the \ttt{main} folder two executables,
\ttt{ideal.exe} and \ttt{wrf.exe},
that allow you to run the test
simulation:
315\begin{verbatim}
316cd test/em_hill2d_x
317./ideal.exe
318./wrf.exe
319\end{verbatim}
320%
321During the simulation, the time taken by the computer
322to perform integrations at each dynamical timestep
323is displayed in the standard output.
324%
325The simulation should end with a message \ttt{SUCCESS COMPLETE WRF}.
326%
327The model results are stored in a \ttt{wrfout} data file
328you might like to browse with a \ttt{NETCDF}-compliant software
329such as \ttt{ncview}.
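\mk
\marge If you installed \ttt{ncview}, a quick look at the simulated fields can be obtained as follows (the exact name of the \ttt{wrfout} file depends on the start date defined in \ttt{namelist.input}):
\begin{verbatim}
ls wrfout_d01*
ncview wrfout_d01*
\end{verbatim}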
330%
331\begin{finger}
332\item If you compiled the model with \ttt{g95}, \ttt{ideal.exe} will
333probably complain about an error reading the namelist.
334%
335Please move the line \ttt{non\_hydrostatic} below the line \ttt{v\_sca\_adv\_order}
336in the \ttt{namelist.input} file to solve the problem.
337\end{finger}
338
339\mk
340\section{Compiling the Martian model}
341
342\mk
343\subsection{Extracting and preparing the sources}
344
345\mk
346To start the installation of the Martian mesoscale model,
347download the archive \ttt{LMD\_MM\_MARS.tar.gz}
348(click on \url{http://www.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/LMD_MM_MARS.tar.gz}
349or use the \ttt{wget} command).
350%
351Copy the sources in the \ttt{\$LMDMOD} directory and extract the files:
352\begin{verbatim}
353cp LMD_MM_MARS.tar.gz $LMDMOD
354cd $LMDMOD
355tar xzvf LMD_MM_MARS.tar.gz
356\end{verbatim}
357
358\mk
\marge Execute the \ttt{prepare} script,
which carries out the necessary preparatory tasks for you:
it deflates the various compressed archives contained in \ttt{LMD\_MM\_MARS},
downloads the ARW-WRF sources from the web,
applies a (quite significant) ``Martian patch" to these sources
and builds the final structure of your \ttt{LMD\_MM\_MARS} directory:
365\begin{verbatim}
366cd $LMDMOD/LMD_MM_MARS
367./prepare
368\end{verbatim}
369
370\mk
371\marge Please check the contents of the \ttt{LMD\_MM\_MARS} directory:
372\begin{citemize}
373\item seven \ttt{bash} scripts:
374\ttt{build\_static},
375\ttt{copy\_model},
376\ttt{makemeso},
377\ttt{prepare},
378\ttt{prepare\_ini},\linebreak 
379\ttt{prepare\_post},
380\ttt{save\_all};
381\item the sources directory \ttt{SRC};
382\item the static data directory \ttt{WPS\_GEOG};
383\item the simulation utilities directory \ttt{SIMU}.
384\end{citemize}
385%
386\marge and check that the \ttt{LMD\_MM\_MARS/SRC} directory contains:
387\begin{citemize}
388\item the model main sources in \ttt{WRFV2},
389\item the preprocessing sources in \ttt{WPS} and \ttt{PREP\_MARS},
390\item the postprocessing sources in \ttt{ARWpost},
391\item three \ttt{tar.gz} archives and two information text files. %\ttt{saved} and \ttt{datesave}.
392\end{citemize}
393
394\mk
395\subsection{Main compilation step}
396\label{sc:makemeso}
397
398\mk
399In order to compile the model, execute the \ttt{makemeso} compilation script
400in the \ttt{LMD\_MM\_MARS}\linebreak directory
401%
402\begin{verbatim}
403cd $LMDMOD/LMD_MM_MARS
404./makemeso
405\end{verbatim}
406%
\marge and answer the questions about
408\begin{asparaenum}[1.]%[\itshape Q1\upshape)]
409\item compiler choice (and number of processors if using MPI)
410\item number of grid points in longitude [61]
411\item number of grid points in latitude [61]
412\item number of vertical levels [61]
413\item number of tracers [1]
414\item number of domains [1]
415\end{asparaenum}
416
417%\mk
418\begin{finger}
\item The first time you compile the model, you will probably wonder what to reply
to questions $2$ to $6$ \ldots type the answers given in brackets to compile an executable suitable
for the test case given below.
\item Suppose you compiled a version of the model for a given set of parameters $1$ to $6$.
If you would like to run another simulation
with at least one of these parameters changed,
the model needs to be recompiled\footnote{This
necessary recompilation each time the number of grid points,
tracers and domains is modified is imposed by the LMD physics code.
The WRF dynamical core alone is much more flexible.} with \ttt{makemeso}.
\item When you use parallel computations, please bear in mind that with
$2$ (resp. $4$, $6$, $8$, $16$) processors the whole domain is split
into $2$ (resp. $2$, $3$, $4$, $4$) tiles over
the latitude direction and $1$ (resp. $2$, $2$, $2$, $4$) tile over the longitude direction.
Thus make sure that the number of grid points minus $1$ in each direction
can be divided by the aforementioned number of tiles over the considered
direction.
437\item If you use grid nesting, note that no more than $4$ processors can be used.
438\end{finger}
439
440\mk
\marge The \ttt{makemeso} script is automated and performs
the following series of tasks:
443%It is useful to detail and comment the  performed by the \ttt{makemeso} script:
444\begin{citemize}
445\item determine if the machine is 32 or 64 bits;
446\item ask the user about the compilation settings;
447\item create a corresponding directory \ttt{\$LMDMOD/LMD\_MM\_MARS/DIRCOMP};
448\begin{finger}
449\item For example, a \ttt{DIRCOMP} directory named \ttt{g95\_32\_single} 
450is created if the user requested
451a \ttt{g95} compilation of the code for single-domain simulations
on a 32-bit machine.
453\end{finger}
454\item generate with \ttt{copy\_model} a directory \ttt{DIRCOMP/WRFV2} containing links to \ttt{SRC/WRFV2} sources;
455\begin{finger}
\item This method ensures that any change to the model sources will
be propagated to all the different \ttt{DIRCOMP} installation folders.
458\end{finger}
459\item execute the WRF \ttt{configure} script with the correct option;
460\item tweak the resulting \ttt{configure.wrf} file to include a link towards the Martian physics;
461\item calculate the total number of horizontal grid points handled by the LMD physics;
462\item duplicate LMD physical sources if nesting is activated;
463\begin{finger}
464\item The model presently supports 3 nests, but more nests
can be included by adapting the following files:
466\begin{verbatim}
467$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_inifis3.inc
468$LMDMOD/LMD_MM_MARS/SRC/WRFV2/call_meso_physiq3.inc
469$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/duplicate3
470$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/libf/generate3
471$LMDMOD/LMD_MM_MARS/SRC/WRFV2/mars_lmd/makegcm*  ## search for 'nest'
472\end{verbatim}%\pagebreak
473\end{finger}
474\item compile the LMD physical packages with the appropriate \ttt{makegcm} command
475and collect the compiled objects in the library \ttt{liblmd.a};
476\begin{finger}
477\item During this step that could be a bit long,
478especially if you defined more than one domain,
479the \ttt{makemeso} script provides you with the full path towards
480the text file \ttt{log\_compile\_phys} in which you can check for
481compilation progress and possible errors.
482%
483In the end of the process, you will find an
484error message associated to the generation of the
485final executable.
486%
487Please do not pay attention to this, as the compilation of the LMD
488sources is meant to generate a library of
489compiled objects called \ttt{liblmd.a} instead of a program.
490\end{finger}
491\item compile the modified Martian ARW-WRF solver, including
492the \ttt{liblmd.a} library;
493\begin{finger}
\item The first time the model is compiled, this
step can be quite long.
496%
497The \ttt{makemeso} script provides you with a \ttt{log\_compile}
498text file where the progress of the compilation can be checked
499and a \ttt{log\_error} text file listing errors and warnings
500during compilation.
501%
502A list of warnings related to \ttt{grib}
503utilities (not used in the Martian model) 
504may appear and have no impact on the
505final executables.
506\item The compilation with \ttt{g95} might be unsuccessful
507due to some problems with files related to terrestrial microphysics.
508%
509Please type the following commands:
510\begin{verbatim}
511cd $LMDMOD/LMD_MM_MARS/SRC
512tar xzvf g95.tar.gz
513cp -f g95/WRFV2_g95_fix/* WRFV2/phys/
514cd $LMDMOD/LMD_MM_MARS
515\end{verbatim}
516\marge then recompile the model with the \ttt{makemeso} command.
517\end{finger}
\item change the name of the executables in agreement with the
settings provided by the user.
520\begin{finger}
\item If you answered the \ttt{makemeso} questions using the
aforementioned bracketed parameters, you should find in the
\ttt{DIRCOMP} directory two executables:
524\begin{verbatim}
525real_x61_y61_z61_d1_t1_p1.exe
526wrf_x61_y61_z61_d1_t1_p1.exe
527\end{verbatim}
528%
529The directory also contains a text file
530in which the answers to the questions are stored, which
531allows you to re-run the script without the
532``questions to the user" step:
533\begin{verbatim}
534./makemeso < makemeso_x61_y61_z61_d1_t1_p1
535\end{verbatim}
536\end{finger}
537\end{citemize}
538
539\mk
540\section{Running a simple test case}
541\label{sc:arsia}
542
543\mk
We assume that you successfully compiled
the model as described in the previous section
and that you used the bracketed answers
to the \ttt{makemeso} questions.
548
549\mk
550\marge In order to test the compiled executables,
551a ready-to-use test case
552(with pre-generated initial and boundary
553conditions) is proposed
554in the \ttt{LMD\_MM\_MARS\_TESTCASE.tar.gz}
555archive you can download at
556\url{http://www.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/LMD_MM_MARS_TESTCASE.tar.gz}.
557%
558This test case simulates the hydrostatic
559atmospheric flow around Arsia Mons during half a sol
560with constant thermal inertia, albedo
561and dust opacity.
562
563\begin{finger}
\item Though the simulation reproduces some reasonable
features of the mesoscale circulation around Arsia
Mons (e.g. slope winds), it should not be used
for scientific purposes, since the number of grid points
is insufficient for a single-domain simulation
and the integration time is shorter than the necessary spin-up time.
570\end{finger}
571%\pagebreak
572
573\marge To launch the test simulation, please type
574the following commands, replacing the
575\ttt{g95\_32\_single} directory with its corresponding
576value on your system:
577%
578\begin{verbatim}
579cp LMD_MM_MARS_TESTCASE.tar.gz $LMDMOD/LMD_MM_MARS/
580tar xzvf LMD_MM_MARS_TESTCASE.tar.gz
581cd TESTCASE
ln -sf ../g95_32_single/wrf_x61_y61_z61_d1_t1_p1.exe wrf.exe 
583nohup wrf.exe > log_wrf &
584\end{verbatim}
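\mk
\marge As for the terrestrial test case, the simulation should end with a \ttt{SUCCESS COMPLETE WRF} message; a minimal check of the run is sketched below (the exact name of the \ttt{wrfout} file depends on the start date of the simulation):
\begin{verbatim}
tail log_wrf              # should end with SUCCESS COMPLETE WRF
ls -lt wrfout_d01*        # output file(s) to browse with e.g. ncview
\end{verbatim}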
585
586%tar xzvf wrfinput.tar.gz
587
588\begin{finger}
589\item If you compiled the model using MPICH2,
590the command to launch a simulation is slightly different:
591%
592\begin{verbatim}
593[simulation on 2 processors on 1 machine]
594mpd &      # first-time only (or after a reboot)
595           # NB: may request the creation of a file .mpd.conf
mpirun -np 2 wrf.exe < /dev/null &      # NB: mpirun is only a link to mpiexec 
597tail -20 rsl.out.000?     # to check the outputs
598\end{verbatim}
599\begin{verbatim}
600[simulation on 16 processors in 4 connected machines]
601echo barry.lmd.jussieu.fr > ~/mpd.hosts
602echo white.lmd.jussieu.fr >> ~/mpd.hosts
603echo loves.lmd.jussieu.fr >> ~/mpd.hosts
604echo tapas.lmd.jussieu.fr >> ~/mpd.hosts
ssh barry.lmd.jussieu.fr   # make sure that ssh to the other machines
                           # is possible without password authentication
607mpdboot -f ~/mpd.hosts -n 4
608mpdtrace
609mpirun -l -np 16 wrf.exe < /dev/null &   # NB: mpirun is only a link to mpiexec
610tail -20 rsl.out.00??     # to check the outputs
611\end{verbatim}
612\end{finger}
613
614
615\mk
616\chapter{Setting the simulation parameters}
617
618\mk
619In this chapter, we describe how to set the various parameters
620defining a given simulation.
621%
622As could be inferred from the content of the \ttt{TESTCASE} directory,
623two parameter files are needed to run the model:
624\begin{enumerate}
625\item The parameters related to the dynamical part of the model can be set
626in the file \ttt{namelist.input} according to the ARW-WRF namelist formatting.
627\item The parameters related to the physical part of the model can be set
628in the file \ttt{callphys.def} according to the LMD-MGCM formatting.
629\end{enumerate}
630
631\mk
632\section{Dynamical settings}
633
634\mk
635\ttt{namelist.input} controls the behavior of the dynamical core
636in the LMD Martian Mesoscale Model.
637%
Compared to the file ARW-WRF users are familiar with\footnote{
639%%%
640A description of this file can be found in \ttt{SRC/WRFV2/run/README.namelist}.
641%%%
642}, the \ttt{namelist.input} in the LMD Martian Mesoscale Model
643is much shorter.
644%
645The only mandatory parameters in this file
646are information on time control\footnote{
647%%%
648More information on the adopted Martian calendar:
649\url{http://www-mars.lmd.jussieu.fr/mars/time/solar_longitude.html}
650%%%
651} and domain definition.
652
653\mk
654\marge The minimal version of the \ttt{namelist.input}
655file corresponds to standard simulations with the model.
656%
657It is however possible to modify optional parameters
658if needed, as is the case in the \ttt{namelist.input} 
associated with the Arsia Mons test case
660(e.g. the parameter \ttt{non\_hydrostatic} is set to false
661to assume hydrostatic equilibrium, whereas standard
662simulations are non-hydrostatic).
663
664\mk
665\marge A detailed description of the \ttt{namelist.input} file is given below\footnote{
666%%%
667You may find the corresponding file in \ttt{SIMU/namelist.input\_full}.
668%%%
669}.
670%
671Comments on each of the parameters are provided,
672with the following labels:
\begin{citemize}
\item \ttt{(*)} denotes parameters not to be modified,
\item \ttt{(r)} indicates parameters whose modification requires recompiling the model,
\item \ttt{(n)} describes parameters involved when nested domains are defined,
\item \ttt{(p1)}, \ttt{(p2)}, \ttt{(p3)} mention parameters whose modification requires reprocessing
the initial and boundary conditions (see next chapter),
\item \ttt{(*d)} denotes dynamical parameters whose modification implies
non-standard simulations -- please read \ttt{SRC/WRFV2/run/README.namelist} 
and use with caution.
\end{citemize}
683%
If omitted, the optional parameters are set to the default
values indicated below.\pagebreak
686
687\centers{\ttt{-- file: namelist.input\_full --}}\codesource{namelist.input_full}\centers{\ttt{-- end file: namelist.input\_full --}}
688
689\begin{finger}
\item Please pay attention to rigorous syntax while
editing your personal \ttt{namelist.input} file,
to avoid reading errors.
\item To modify the default values (or even add
personal parameters) in the \ttt{namelist.input} file,
edit the \ttt{SRC/WRFV2/Registry/Registry.EM} file.
%
You will then have to recompile the model with \ttt{makemeso};
answer \ttt{y} to the last question.
699\end{finger}
700
701\mk
702\marge In case you run simulations with \ttt{max\_dom}
703nested domains, you have to set \ttt{max\_dom} parameters
704wherever there is a ``," in the above list.
705%
706Here is an example of the resulting syntax of the
707\ttt{time\_control}, \ttt{domains} and \ttt{bdy\_control}
708categories in \ttt{namelist.input}:
709%
710\codesource{OMG_namelist.input}
711
712\section{Physical settings}
713
714\mk
715\ttt{callphys.def} controls the behavior of the physical parameterizations
716in the LMD Martian\linebreak Mesoscale Model.
717%
The organization of this file is exactly the same as
that of the corresponding file in the LMD Martian GCM, whose
user manual can be found at
721\url{http://web.lmd.jussieu.fr/~forget/datagcm/user_manual.pdf}.
722
723\mk
724\marge Please find in what follows the contents of \ttt{callphys.def}:
725%
726\centers{\ttt{-- file: callphys.def --}}\codesource{callphys.def}\centers{\ttt{-- end file: callphys.def --}}
727
728\mk
729\begin{finger}
730\item Note that in the given example
731the convective adjustment,
732the gravity wave parameterization,
733and the NLTE schemes are turned off, as is
734usually the case in typical Martian tropospheric
735mesoscale simulations.
736\item \ttt{iradia} sets the frequency
737(in dynamical timesteps) at which
738the radiative computations are performed.
\item Modifying \ttt{callphys.def} requires recompiling
the model only if the number of tracers is changed.
\item If you run a simulation with, say, $3$ domains,
please ensure that you defined three files
\ttt{callphys.def}, \ttt{callphys\_d2.def} and \ttt{callphys\_d3.def}
(a simple way to create them is shown below).
744\end{finger}
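\mk
\marge For instance, assuming the physical settings are identical in all domains (edit the copies afterwards if they should differ), the per-domain files can simply be created as copies of \ttt{callphys.def}:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/TESTCASE    ## or your own simulation directory
cp callphys.def callphys_d2.def
cp callphys.def callphys_d3.def
\end{verbatim}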
745
746\mk
747\chapter{Preprocessing utilities}
748
749\mk
In the previous chapter, we described the simulation settings
in the \ttt{namelist.input} file.
%
We saw that any modification of the parameters
labelled with \ttt{(p1)}, \ttt{(p2)} or \ttt{(p3)} 
requires the initial and boundary conditions
and/or the domain definition to be recomputed prior to running the model again.
757%
758As a result, you were probably unable to change many of the parameters
759of the Arsia Mons test case (proposed in section \ref{sc:arsia}) in which
760the initial and boundary conditions -- as well as the domain of
761simulation -- were predefined.
762
763\mk
764\marge In this chapter, we describe the installation and use of the preprocessing tools to
765define the domain of simulation, calculate an initial atmospheric state
766and prepare the boundary conditions for the chosen simulation time.
767%
This necessary step will eventually allow you to run your own simulations for the specific season and region
you are interested in, with complete freedom to modify any of the parameters in \ttt{namelist.input}.
770
771\mk
772\section{Installing the preprocessing utilities}
773
774\mk
First and foremost, since the preprocessing utilities can generate
(or involve) files of significant size, it is necessary
to define a directory where these files will be stored.
%
Such a directory (e.g. \ttt{/bigdisk/user}) must be linked as follows:
780%
781\begin{verbatim}
782ln -sf /bigdisk/user $LMDMOD/TMPDIR
783\end{verbatim}
784
785\mk
786\marge A second prerequisite to the installation of the preprocessing tools is that the LMD Martian
787Mesoscale Model was compiled at least once.
788%
789If this is not the case, please compile
790the model with the \ttt{makemeso} command
791(see section \ref{sc:makemeso}).
792
793\mk
794\marge The compilation process created an
795installation directory adapted to your
796particular choice of compiler$+$machine.
797%
798The preprocessing tools will also
799be installed in this directory.
800%
801Please type the following commands:
802%
803\begin{verbatim}
804cd $LMDMOD/LMD_MM_MARS/g95_32_single/   ## or any install directory
805ln -sf ../prepare_ini .
806./prepare_ini
807\end{verbatim}
808
809\mk
\marge The script \ttt{prepare\_ini} plays the same role for the preprocessing tools
as \ttt{copy\_model} does for the model sources:
files are simply linked to their actual location in the \ttt{SRC} folder.
813%
814Once you have executed \ttt{prepare\_ini}, please check that
815two folders were generated: \ttt{PREP\_MARS} and \ttt{WPS}.
816
817\mk
818\marge In the \ttt{PREP\_MARS} directory, please compile
819the programs \ttt{create\_readmeteo.exe} and \ttt{readmeteo.exe},
using the compiler mentioned in the name of the current
821installation directory:
822%
823\begin{verbatim}
824echo $PWD
825cd PREP_MARS/
826./compile [or] ./compile_g95
827ls -lt create_readmeteo.exe readmeteo.exe
828cd ..
829\end{verbatim}
830
831\mk
832\marge In the \ttt{WPS} directory, please compile
833the programs \ttt{geogrid.exe} and \ttt{metgrid.exe}:
834\begin{verbatim}
835cd WPS/   
836./configure   ## select your compiler + 'NO GRIB2' option
837./compile
838ls -lt geogrid.exe metgrid.exe
839\end{verbatim}
840
841\mk
842\marge Apart from the executables you just compiled,
843the preprocessing utilities include \ttt{real.exe},
844which was compiled by the \ttt{makemeso} script
845along with the mesoscale model executable \ttt{wrf.exe}.
846%
847\ttt{real.exe} should be copied or linked in the
848simulation directory (e.g. \ttt{TESTCASE} for the
Arsia Mons test case) to be at the same level as
\ttt{namelist.input}.
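\mk
\marge For the Arsia Mons test case, this can be done as follows (replace \ttt{g95\_32\_single} and the name of the executable with the values corresponding to your own compilation):
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/TESTCASE
ln -sf ../g95_32_single/real_x61_y61_z61_d1_t1_p1.exe real.exe
\end{verbatim}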
851
852\begin{finger}
\item Even though the executable is named
e.g. \ttt{real\_x61\_y61\_z61\_d1\_t1\_p1.exe}, this program
does not depend on the specific \ttt{makemeso}
parameters -- contrary to the \ttt{wrf.exe} executable.
%
Renaming \ttt{real.exe} (whose contents may be identical
from one compilation to the next if the model sources were not modified)
is simply a practical way to avoid confusing
executables compiled at different times.
862\end{finger}
863
864\mk
865\section{Running the preprocessing utilities}
866
867\mk
868When you run a simulation with \ttt{wrf.exe},
869the program attempts to read the initial state
870in the files
871\ttt{wrfinput\_d01},
872\ttt{wrfinput\_d02}, \ldots 
873(one file per domain)
874and the parent domain boundary conditions
875in \ttt{wrfbdy\_d01}.
876%
877The whole chain of data conversion and
878interpolation needed to generate those
879files is summarized in the diagram next
880page.
881%
882Three distinct preprocessing steps are
883necessary to generate the final files.
884%
As described previously,
some modifications in the \ttt{namelist.input} file
[e.g. start/end dates labelled with \ttt{(p1)}]
require a complete reprocessing from step $1$ to step $3$
to successfully launch the simulation,
whereas other changes
[e.g. model top labelled with \ttt{(p3)}] 
only require a quick reprocessing at step $3$, keeping
the files generated at the end of step $2$
the same.

\mk
\subsection{Input data}

\mk
\subsubsection{Static data}

\mk
All the static data
(topography, thermal inertia, albedo)
needed to initialize the model
are included in the \ttt{\$LMDMOD/LMD\_MM\_MARS/WPS\_GEOG} directory.
%
By default, only coarse-resolution datasets\footnote{
%%%
Corresponding to the fields stored in the
file \ttt{surface.nc} known by LMD-MGCM users:
\url{http://web.lmd.jussieu.fr/~forget/datagcm/datafile/surface.nc}
%%%
} are available, but the directory also contains sources and scripts
to install finer resolution datasets:
\begin{citemize}
\item 32 and/or 64 pixel-per-degree (ppd) MOLA topography [\textit{Smith et al.}, 2001]\nocite{Smit:01mola},
\item 8 ppd MGS/Thermal Emission Spectrometer (TES) albedo [\textit{Christensen et al.}, 2001]\nocite{Chri:01},
\item 20 ppd TES thermal inertia [\textit{Putzig and Mellon}, 2007]\nocite{Putz:07}.
\end{citemize}
\pagebreak
\includepdf[pages=1,offset=25mm -20mm]{diagramme.pdf}

\mk
\marge The role of the \ttt{build\_static} script is to
automatically download these datasets from the web
(namely PDS archives) and convert them to a format
suitable for later use by the
preprocessing utilities:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS
./build_static
\end{verbatim}
%
\begin{finger}
\item Please install the free software \ttt{octave}\footnote{
%%%
Available at \url{http://www.gnu.org/software/octave}
%%%
} on your system to be able to use the
\ttt{build\_static} script.
%
Another solution is to browse into each of the
directories contained within \ttt{WPS\_GEOG}, download the
data with the shell scripts and execute the \ttt{.m} scripts with either
\ttt{octave} or the commercial software \ttt{matlab}
(just replace \ttt{\#} by \ttt{\%}).
%
\item If you do not manage to execute the \ttt{build\_static} script,
converted ready-to-use datafiles are available upon request.
%
\item Building the MOLA 64ppd topographical
database can take quite a long time, so this process is
not performed by default by the \ttt{build\_static} script.
If you would like to build this database,
please remove the \ttt{exit} command in the script, just above
the commands related to the MOLA 64ppd data.
%
\item The resulting \ttt{WPS\_GEOG} directory can reach a size
of several hundred megabytes (Mo).
%
You might want to move this folder to a location
with more disk space available, but then be
sure to create in \ttt{\$LMDMOD/LMD\_MM\_MARS}
a link to the new location
of the directory (see the example below this list).
\end{finger}
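
\marge A minimal sketch of such a relocation, assuming a hypothetical
destination directory \ttt{/bigdisk} (adapt the path to your system):
\begin{verbatim}
mv $LMDMOD/LMD_MM_MARS/WPS_GEOG /bigdisk/WPS_GEOG
ln -sf /bigdisk/WPS_GEOG $LMDMOD/LMD_MM_MARS/WPS_GEOG
\end{verbatim}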

\mk
\subsubsection{Meteorological data}

\mk
The preprocessing tools generate initial and boundary conditions
from the \ttt{diagfi.nc} outputs of LMD-MGCM simulations.
%
If you would like to run a mesoscale simulation at a given
season, you first need to run a GCM simulation and output
the meteorological fields at the considered season.
%
For optimal forcing at the boundaries, we advise you
to write the meteorological fields to the
\ttt{diagfi.nc} file at least every two hours.
%
Please also make sure that the following fields
are stored in the NETCDF \ttt{diagfi.nc} file:

\footnotesize
\codesource{contents_diagfi}

\normalsize
\begin{finger}
\item If the fields
\ttt{emis},
\ttt{co2ice},
\ttt{q01},
\ttt{q02},
\ttt{tsoil}
are missing from the \ttt{diagfi.nc} file,
they are replaced by the respective default
values $0.95$, $0$, $0$, $0$, and \ttt{tsurf}.
\end{finger}
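
\marge A quick way to verify that these fields are present, assuming
the standard \ttt{ncdump} utility is installed, is to run the following
command in the directory containing your \ttt{diagfi.nc}:
\begin{verbatim}
ncdump -h diagfi.nc | grep -E 'emis|co2ice|tsoil|tsurf|q01|q02'
\end{verbatim}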

\mk
\marge An example input meteorological file
\ttt{diagfi.nc} can be downloaded
at \url{http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz}.
%
Please extract the archive and copy the \ttt{diagfi.nc} file
into \ttt{\$LMDMOD/TMPDIR/GCMINI}.
%
Such a file can then be used to define the initial
and boundary conditions, and we will go
through the three preprocessing steps.
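
\marge For instance, assuming \ttt{wget} is available on your system,
the download and extraction can be carried out as follows:
\begin{verbatim}
cd $LMDMOD/TMPDIR/GCMINI
wget http://web.lmd.jussieu.fr/~aslmd/LMD_MM_MARS/diagfi.nc.tar.gz
tar xzvf diagfi.nc.tar.gz
\end{verbatim}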

\mk
\subsection{Preprocessing steps}

\mk
\subsubsection{Step 1: Converting GCM data}

\mk
The programs in the \ttt{PREP\_MARS} directory
convert the data from the NETCDF \ttt{diagfi.nc}
file into separate binary datafiles for each
date contained in \ttt{diagfi.nc}, according to
the formatting needed by the
preprocessing programs at step 2.
%
These programs can be executed by the following
commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/PREP_MARS
echo 1 | ./create_readmeteo.exe     # drop the "echo 1 |" if you want control
./readmeteo.exe < readmeteo.def
\end{verbatim}
%
\marge If everything went well with the conversion,
the directory \ttt{\$LMDMOD/TMPDIR/WPSFEED}
should contain files with names beginning with \ttt{LMD:}.
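
\marge For instance, a quick optional check of the conversion output:
\begin{verbatim}
ls -lt $LMDMOD/TMPDIR/WPSFEED/LMD*
\end{verbatim}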

\mk
\subsubsection{Step 2: Interpolation on the regional domain}

\mk
In the \ttt{WPS} directory, the \ttt{geogrid.exe} program allows
you to define the mesoscale simulation domain,
to horizontally interpolate the topography,
thermal inertia and albedo fields at the domain
resolution, and to calculate useful fields
such as topographical slopes.%\pagebreak

\mk
\marge Please execute the commands:
%
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
ln -sf ../../TESTCASE/namelist.wps .   # test case
./geogrid.exe
\end{verbatim}
%
\marge The result of \ttt{geogrid.exe}
-- and thus the definition of the mesoscale
domain -- can be checked in the NETCDF
file \ttt{geo\_em.d01.nc}.
%
A quick check can be performed using the command line
\begin{verbatim}
ncview geo_em.d01.nc
\end{verbatim}
\marge if \ttt{ncview} is installed, or the \ttt{IDL}
script \ttt{out\_geo.pro}
\begin{verbatim}
idl
IDL> out_geo, field1='TOPO'
IDL> out_geo, field1='TI'
IDL> SPAWN, 'ghostview geo_em.d01_HGT_M.ps &'
IDL> SPAWN, 'ghostview geo_em.d01_THERMAL_INERTIA.ps &'
IDL> exit
\end{verbatim}
\marge if the demo version of \ttt{IDL} is installed.
%
Of course if your favorite graphical tool supports
the NETCDF standard, you might use it to check the
domain definition in \ttt{geo\_em.d01.nc}.

\mk
\marge If you are unhappy with the results or
you want to change
the location of the mesoscale domain on the planet,
the horizontal resolution,
the number of grid points \ldots,
please modify the parameter
file \ttt{namelist.wps} and execute \ttt{geogrid.exe} again.
%
Here are the contents of \ttt{namelist.wps}:
%
\codesource{namelist.wps_TEST}

\begin{finger}
%
\item No input meteorological data
are actually needed to execute \ttt{geogrid.exe}.
%
\item More details about the database and
more interpolation options can be
found in the file \ttt{geogrid/GEOGRID.TBL}.
%
\item Defining several domains yields
distinct files
\ttt{geo\_em.d01.nc},
\ttt{geo\_em.d02.nc},
\ttt{geo\_em.d03.nc}\ldots
\end{finger}

\mk
\marge Once the \ttt{geo\_em} file(s) are generated,
the \ttt{metgrid.exe} program performs
a horizontal interpolation of the meteorological
fields to the mesoscale domain, similar to
the one performed by \ttt{geogrid.exe}
for the surface data.
%
Then the program writes the results in
\ttt{met\_em} files and also collects
the static fields and domain parameters
included in the \ttt{geo\_em} file(s).
%
Please type the following commands:
\begin{verbatim}
cd $LMDMOD/LMD_MM_MARS/your_install_dir/WPS
./metgrid.exe
\end{verbatim}
%
\marge If everything went well,
the directory \ttt{\$LMDMOD/TMPDIR/WRFFEED}
should contain the \ttt{met\_em.*} files.

\mk
\subsubsection{Step 3: Vertical interpolation on mesoscale levels}

\mk
\marge The last step is to execute \ttt{real.exe}
to perform the interpolation from the vertical
levels of the GCM to the vertical levels
defined in the mesoscale model.
%
This program also prepares the final initial
state for the simulation in files called
\ttt{wrfinput} and the boundary conditions
in files called \ttt{wrfbdy}.

\mk
\marge To successfully execute \ttt{real.exe},
you need the \ttt{met\_em.*} files
and the \ttt{namelist.input} file
to be in the same directory as \ttt{real.exe}.
%
Parameters in \ttt{namelist.input}
controlling the behavior of the vertical interpolation
are those labelled with \ttt{(p3)} in the detailed
list introduced in the previous chapter.

\mk
\marge Please type the following commands
to prepare files for the Arsia Mons test case
(or your personal test case if you changed
the parameters in \ttt{namelist.wps}):
\begin{verbatim}
cd $LMDMOD/TESTCASE
ln -sf $LMDMOD/TMPDIR/WRFFEED/met_em* .
./real.exe
\end{verbatim}

\mk
\marge The final message of \ttt{real.exe}
should report the success of the process; you
are then ready to launch the integrations
of the LMD Martian Mesoscale Model again
with the \ttt{wrf.exe} command, as in section
\ref{sc:arsia}.
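
\marge Before launching \ttt{wrf.exe}, a simple optional check is to
verify that the initial and boundary condition files were indeed
created in the simulation directory:
\begin{verbatim}
ls -lt wrfinput_d01 wrfbdy_d01
\end{verbatim}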

\begin{finger}
\item When you modify either
\ttt{namelist.wps} or \ttt{namelist.input},
make sure that the common parameters
are strictly identical in both files
(especially when running nested simulations),
otherwise either \ttt{real.exe} or \ttt{wrf.exe}
will exit with an error message; a quick way to
compare the two files is sketched below.
\end{finger}
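
\marge For instance, assuming both files are present in the simulation
directory, a side-by-side look at parameters that typically appear in
both files (domain dimensions, resolution, nesting settings; adapt the
list to your own configuration) can be obtained with:
\begin{verbatim}
cd $LMDMOD/TESTCASE
grep -E 'max_dom|e_we|e_sn|dx|dy|parent' namelist.wps namelist.input
\end{verbatim}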
%\pagebreak


\chapter{Starting simulations from scratch}

\mk
\section{Running your own GCM simulations}

\begin{remarque}
To be completed.
\end{remarque}

\mk
\section{Complete simulations with \ttt{runmeso}}

\begin{remarque}
To be completed.
\end{remarque}

\chapter{Outputs}

\mk
\section{Postprocessing utilities and graphics}

\begin{remarque}
To be completed. Do-it-all \ttt{idl} scripts
would be described here !
\end{remarque}

\mk
\section{Modify the outputs}

\begin{remarque}
To be completed.
Though the method is different,
we kept all the convenient aspects of \ttt{writediagfi}.
\end{remarque}

\chapter{Frequently Asked Questions}


\begin{finger}
\item Which timestep should I choose to avoid crashes of the model ?
\item In the Martian simulations, why can't I define boundaries every 6 hours as on Earth ?
\item Help ! I get strange assembler errors or ILM errors while compiling !
\item Is it possible to run the model on a specific configuration that is not supported ?
\item Why do I have to define four fewer rows in the parent domain
when performing nested runs ?
\item I am kind of nostalgic for early/middle Mars. How could I run
mesoscale simulations at low/high obliquity ?
\item Why does \ttt{real.exe} crash when the model top pressure is
lower than $2$~Pa ?
\item Can I use two-way nesting ?
\end{finger}

\begin{remarque}
To be completed.
\end{remarque}