\chapter{Frequently Asked Questions, Tips and Troubleshooting}\label{faq}

\vk
Browse this chapter if you encounter problems or issues while using the LMD Martian Mesoscale Model. Before reading what follows, please ensure that:
\begin{citemize}
\item you made no errors in using the model;
\item your problem is not addressed in the previous chapters;
\item your operating system and machine are in good health.
\end{citemize}
You might also read this chapter out of curiosity: it might be useful for your experience as a user.

\mk
\section{General questions}

\sk
\noindent \textbf{I don't know anything about mesoscale meteorology. Does that prevent me from becoming a user of your model?}
\begin{finger}
\item Not really. The purpose of this user manual is precisely to help you run simulations with the LMD Martian Mesoscale Model. You will probably not find it easy to interpret the simulation results at first, but we will be happy to help you with our expertise in atmospheric science and to recommend good books so that you can learn more about this topic.
\end{finger}

\sk
\noindent \textbf{I don't have time to learn how to use the model, or I feel overwhelmed by the task.}
\begin{finger}
\item In particular cases, our team might be able to run the simulations for your study, or to help someone you would hire to do the work learn how to use the model, answering their questions along the way. We are open to discussion.
\end{finger}

\mk
\section{Compilation}

\sk
\noindent \textbf{The model compiled yesterday. Now, with no apparent changes, it does not compile.}
\begin{finger}
\item This is one of the most frustrating situations. Remember, though, that there is a $99\%$ chance that the reason for the problem is either trivial or not your responsibility. Please check that:
\begin{citemize}
\item your disk quota is not exceeded;
\item you are working on the same machine as the day before;
\item no source file has been accidentally modified and no links are broken;
\item no updates have been performed on your system during the night;
\item recompiling with \ttt{makemeso -f} does not solve the problem.
\end{citemize}
\end{finger}

\sk
\noindent \textbf{The model no longer compiles after I abruptly stopped the \ttt{makemeso} script because I realized that I had made a mistake (e.g., I was compiling on the wrong machine).}
\begin{finger}
\item Recompile the model from scratch by adding the option \ttt{-f} to \ttt{makemeso}.
\end{finger}

\sk
\noindent \textbf{I am compiling the model on a huge grid (e.g., over $200 \times 200 \times 100$ for a single-processor run). The compilation fails with ``relocation truncated to fit'' errors.}
\begin{finger}
\item Try lowering the number of grid points (either horizontal or vertical), or consider using parallel computations, in which the computations over the model grid are split over several processors.
\end{finger}

\sk
\noindent \textbf{I am afraid I poked around in a given compilation directory in \ttt{\$MMM} (say \ttt{g95\_32\_single}) and broke something, e.g., deleted or broke some links. The model does not compile anymore.}
\begin{finger}
\item Delete the corresponding compilation directory. Since it is mostly filled with symbolic links, you will only lose the previously compiled executables and the (possibly modified) \ttt{Registry.EM} file; save those files prior to deleting the directory if you would like to keep them. Then run \ttt{makemeso} again for the same compiler/system combination: a clean version of the compilation directory will reappear, and the model executables will be recompiled from scratch.
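A minimal shell sketch of this procedure, for the \ttt{g95\_32\_single} example (the exact location of \ttt{Registry.EM} inside the compilation directory may differ in your version; adapt names accordingly):
\begin{verbatim}
cd $MMM
# save any file you want to keep, e.g. a modified Registry.EM
cp g95_32_single/Registry.EM ~/Registry.EM.bak
# delete the compilation directory (mostly symbolic links)
rm -r g95_32_single
# recompile from scratch, selecting the same compiler/system
makemeso
\end{verbatim}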
\end{finger}

\sk
\noindent \textbf{I updated the model's sources through \ttt{svn update} and the compilation fails with the new version.}
\begin{finger}
\item It can happen (though this is not usual) that we move, create or delete files in \ttt{\$MMM/SRC} while developing new capabilities or bug fixes for the model -- and commit the changes to the reference version of the model. Apply the solution proposed in the previous point and the model will compile again (our rule is to commit only versions of the model that compile). Possible problems can be anticipated by having a look at the commit log through the command \ttt{svn log}. The vast majority of our commits, and the subsequent changes to the reference model, are perfectly transparent to the user.
\end{finger}

\sk
\noindent \textbf{I would like to learn more about the interface between the WRF dynamical core and the LMD Martian physical parameterizations.}
\begin{finger}
\item The source file responsible for the interface between the dynamical core and the physical parameterizations is \ttt{module\_lmd\_driver.F} in \ttt{\$MMM/SRC/WRFV2/phys/}.
\end{finger}

\sk
\noindent \textbf{WPS does not compile with my favorite compiler (the one I have to use for model integrations) but seems to work with another one.}
\begin{finger}
\item Go to the folder corresponding to your favorite compiler. Remove the \ttt{WPS} folder and link in its place the \ttt{WPS} folder obtained with the alternate compiler. The \ttt{runmeso} workflow will then work just fine when you select your favorite compiler.
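A minimal sketch of this workaround, assuming for illustration a favorite-compiler directory \ttt{ifort\_64} and the alternate directory \ttt{g95\_32\_single} in \ttt{\$MMM} (adapt names to your own setup):
\begin{verbatim}
cd $MMM/ifort_64
# replace the broken WPS folder with a link to the working one
rm -r WPS
ln -s ../g95_32_single/WPS WPS
\end{verbatim}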
\end{finger}

\sk
\noindent \textbf{I think I found a bug in the model.}
\begin{finger}
\item This is not impossible! Please double-check, then contact us.
\end{finger}

\mk
\section{Preprocessing steps}

\sk
\noindent \textbf{I would like to have smoother surface properties.}
\begin{finger}
\item Increase the smoothing parameter \ttt{smooth\_passes} in the file \ttt{WPS/geogrid/GEOGRID.TBL} for each field you would like to smooth, then restart at step 2 (execution of \ttt{geogrid.exe}).
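For illustration, a field entry in \ttt{GEOGRID.TBL} schematically looks as follows; only \ttt{smooth\_passes} needs to be raised (the field name and the other values here are mere placeholders, to be matched to the actual entries of your file):
\begin{verbatim}
name = HGT_M
  smooth_option = smth-desmth
  smooth_passes = 3
\end{verbatim}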
\end{finger}

\sk
\noindent \textbf{I would like to know more about customizing the calculations made by \ttt{geogrid.exe} and \ttt{metgrid.exe}.}
\begin{finger}
\item You probably want to know more about the various settings in \ttt{WPS/geogrid/GEOGRID.TBL} and \ttt{WPS/metgrid/METGRID.TBL}. A detailed description can be found at \url{http://www.mmm.ucar.edu/wrf/users/docs/user_guide/users_guide_chap3.html} (some parameters are not relevant for Mars).
\end{finger}

\sk
\noindent \textbf{To speed up initializations, I would like to define GCM constraints at the domain boundaries every 6 Martian hours, instead of every one or two hours as is usually done (cf. \ttt{interval\_seconds = 3700}).}
\begin{finger}
\item This is not a good idea. Near-surface atmospheric fields undergo a strong daily cycle on Mars, which you will not be able to capture if \ttt{interval\_seconds} is larger than $7400$ seconds (i.e., two Martian hours).
\end{finger}

\sk
\noindent \textbf{\ttt{real.exe} sometimes crashes with certain (low) values of \ttt{p\_top\_requested}.}
\begin{finger}
\item The program \ttt{real.exe} attempts to come up with vertical levels that are nicely equally spaced in altitude above the boundary layer, up to the model top. This is done by an iterative algorithm integrating the hydrostatic equation, which sometimes does not converge if the model top is too high (typically for values of \ttt{p\_top\_requested} below~$5$~Pa). Try lowering \ttt{force\_sfc\_in\_vinterp}, increasing \ttt{max\_dz}, or modifying \ttt{tiso} to help the algorithm converge. An alternative, if you need \ttt{p\_top\_requested} below~$5$~Pa, is to prescribe your own vertical levels (see next point).
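For reference, \ttt{p\_top\_requested}, \ttt{force\_sfc\_in\_vinterp} and \ttt{max\_dz} are set in the \ttt{\&domains} section of \ttt{namelist.input}; a schematic excerpt is given below (values are purely illustrative and must be tuned to your case):
\begin{verbatim}
&domains
  p_top_requested      = 10.
  force_sfc_in_vinterp = 2
  max_dz               = 1500.
/
\end{verbatim}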
\end{finger}

\sk
\noindent \textbf{I would like to define my own vertical levels.}
\begin{finger}
\item Create a file \ttt{levels} containing all your mass-based model levels (see chapter~\ref{whatis}), then add the optional setting in \ttt{\&domains} in \ttt{namelist.input}
\begin{verbatim}
eta_levels = 1.000000,
             ...
             0.000000
\end{verbatim}
You might also want to use \ttt{eta\_levels} to prescribe the list of your custom model levels directly in \ttt{namelist.input}. Please ensure that the lowermost model level is $1$, the uppermost is $0$, and that the vertical resolution is refined in the boundary layer ($\sim 8$ vertical levels above the surface).
\end{finger}

\mk
\section{Runtime}

\sk
\noindent \textbf{I would like to know how long my simulation will last.}
\begin{finger}
\item Check the log information while \ttt{wrf.exe} is running: the effective time taken by each integration or writing step is indicated, so you can extrapolate and predict the total simulation time. If you use parallel computations, look in \ttt{rsl.error.0000} for this information.
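For parallel runs, a quick way to gather these timings is sketched below (the exact wording of the log lines may vary with the model version):
\begin{verbatim}
# last few per-step timings reported by the master process
grep "Timing for main" rsl.error.0000 | tail -5
\end{verbatim}
Multiplying the elapsed seconds per step by the total number of integration steps (simulation length divided by \ttt{time\_step}) yields an estimate of the total wall-clock time.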
\end{finger}

\sk
\noindent \textbf{With default settings, I get one \ttt{wrfout*} file per simulated day, each of which contains fields hour by hour. I want to change this.}
\begin{finger}
\item If you want an output frequency higher [lower] than one per hour, decrease [increase] the parameter \ttt{history\_interval} in \ttt{namelist.input} (remember that each unit of \ttt{history\_interval} is $100$~seconds). If you want more [less] data in each individual file, increase [decrease] the parameter \ttt{frames\_per\_outfile} in \ttt{namelist.input}.
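For example, since a Martian hour is $3700$~seconds in the model (i.e., $37$ units of \ttt{history\_interval}), the following illustrative \ttt{\&time\_control} excerpt would write one frame every two Martian hours and gather twelve such frames, i.e. one sol, per file:
\begin{verbatim}
&time_control
  history_interval   = 74
  frames_per_outfile = 12
/
\end{verbatim}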
\end{finger}

\sk
\noindent \textbf{It looks like a Martian hour is~$3700$ seconds in the model (cf. \ttt{namelist.input}), while the reality is closer to~$3699$ seconds.}
\begin{finger}
\item This is true, though the~$3700$ figure is obviously much more convenient, and choosing it instead of~$3699$ has no impact whatsoever on simulations that typically last less than one month, and most often only a few days.
\end{finger}

\sk
\noindent \textbf{I want to know the local time for a given model output.}
\begin{finger}
\item Time management in the model, including the way output files are named, relates to UTC time, i.e., local time at longitude~$0^{\circ}$. The time given in the name of each \ttt{wrfout*} file refers to the first frame written in the file; using \ttt{history\_interval}, you can then infer the universal time of all frames in the file. Another method is to look at the variable \ttt{Times} in \ttt{wrfout*}. Once you know the universal time, you can check the domain longitudes in \ttt{XLONG} to calculate local time at any location.
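As a reminder, at a grid point of east longitude~$\lambda$ (in degrees), local time is simply offset from universal time by $t_{\mathrm{local}} = t_{\mathrm{UTC}} + \lambda / 15$ (in Martian hours), since one Martian hour corresponds to $15^{\circ}$ of longitude.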
\end{finger}

\sk
\noindent \textbf{The executable \ttt{wrf.exe} crashes a few seconds after launching and I don't know why.}
\begin{finger}
\item Please check all outputs from \ttt{wrf.exe}: the \ttt{wrfout*} files and the information log (note that the model can be made more verbose by setting \ttt{debug\_level = 200} in \ttt{namelist.input}). It is usually possible to find hints about the problem(s) making the model unstable or crash. Sometimes it is just one file that is missing. If \ttt{cfl} warnings are reported in the information log, it is probably a good idea to lower the timestep, although this will not always fix the problem, especially if there are wrong settings and subsequent physical inconsistencies. If everything looks fine in the information log, try lowering \ttt{history\_interval} to $1$ in \ttt{namelist.input}, so that much more frequent outputs are obtained in the \ttt{wrfout*} files and the problem can be further diagnosed by analyzing the simulated meteorological fields.
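For such a debugging session, both settings belong to the \ttt{\&time\_control} section of \ttt{namelist.input}: \ttt{history\_interval = 1} yields one output frame every $100$~seconds and \ttt{debug\_level = 200} makes the information log verbose (remember to revert both once the problem is solved):
\begin{verbatim}
&time_control
  history_interval = 1
  debug_level      = 200
/
\end{verbatim}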
\end{finger}

\sk
\noindent \textbf{I don't know which timestep I should choose to prevent the model from crashing.}
\begin{finger}
\item The answer depends on the horizontal resolution, according to the CFL condition, on whether the dynamical core is used in hydrostatic or non-hydrostatic mode, and on other factors (e.g., slopes, temperature gradients, etc.). Please refer to the table in \textit{Spiga and Forget} [2009] for guidelines about the timestep, or check the examples in \ttt{\$MMM/SIMU/DEF}. A rule of thumb to start with is to set \ttt{time\_step} to the value of \ttt{dx} in kilometers; this value can sometimes be raised to get faster integrations. If the \ttt{time\_step} parameter is too large for the horizontal resolution~\ttt{dx} and violates the CFL criterion, \ttt{wrf.exe} usually issues warnings about CFL violations in the first integration steps.
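For instance, following this rule of thumb for a domain with a $20$-km horizontal resolution (illustrative \ttt{\&domains} excerpt; note that \ttt{dx} is expressed in meters in \ttt{namelist.input}):
\begin{verbatim}
&domains
  dx        = 20000
  time_step = 20
/
\end{verbatim}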
\end{finger}

\sk
\noindent \textbf{It looks like \ttt{wrf.exe} is crashing because of dynamical instabilities at the lateral boundaries, apparently close to a topographical obstacle.}
\begin{finger}
\item Check that no steep slope (mountain, crater) is located at the domain boundaries. If one is, change the domain's center so that no major topographical gradient is located close to the domain boundaries (in the relaxation zone). This also holds for nested simulations at the boundary between the parent and nested domains.
\end{finger}

\sk
\noindent \textbf{I compiled the model with \ttt{ifort}. At runtime, it stops after a few integration steps because of a segmentation fault.}
\begin{finger}
\item The model uses a lot of memory, especially when large domains or nests are requested. Try the command \ttt{ulimit -s unlimited}. If this does not solve the problem, try the other solutions listed in this chapter.
\end{finger}

\sk
\noindent \textbf{The model does not seem able to produce outputs, although the log files indicate that the files have been written. This happens especially when I increase the number of grid points.}
\begin{finger}
\item Set the environment variable \ttt{WRFIO\_NCD\_LARGE\_FILE\_SUPPORT} to 1
\begin{verbatim}
declare -x WRFIO_NCD_LARGE_FILE_SUPPORT=1
\end{verbatim}
and recompile the model from scratch. The model will then be able to produce very large files (especially restart files).
\end{finger}

\mk
\section{Specific simulations}

\sk
\noindent \textbf{I find it difficult to choose a number of horizontal grid points for parallel nested simulations that is compliant with all the constraints mentioned in section~\ref{nests}.}
\begin{finger}
\item A tip to find a compliant domain size for nested simulations: for the parent domain, choose \ttt{e\_we} and \ttt{e\_sn} according to~$e_{we} = n_{proc} \times i + 1$, with~$n_{proc}$ being a multiple of~$4$ and~$2 \, i + 1$ being a multiple of~$3$. For child domains, set \ttt{e\_we} and \ttt{e\_sn} according to \ttt{e\_we[child domain] = e\_we[parent domain] + 4}.
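For example, with $n_{proc} = 8$: choose $2 \, i + 1 = 33$ (an odd multiple of $3$), so that $i = 16$ and $e_{we} = 8 \times 16 + 1 = 129$ for the parent domain, hence $e_{we} = 133$ for the first child domain.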
\end{finger}
%%%%% TOO SPECIFIC !!!
%% -----------------------------------------------------------------------
%% -- if possible, how to determine the size?
%% nproc must divide e_we-1 (1st nest)
%% grid_ratio must divide e_we-1+4 (1st nest)
%% let e_we = ye+1
%% grid_ratio divides ye+4 and nproc divides ye
%% take nproc=8, ye=8*i
%% then there exists j such that 8i+4 = 3j, i.e. 4*(2i+1) = 3j
%% satisfied for instance if 2i+1 is a multiple of 3
%% so it suffices to find an odd multiple of 3 and deduce i
%% for example 2i+1=33 >>>> i=16
%% >>>> e_we = 129 for the 1st nest (and add 4 for the following ones)
%% ------------------------------------------------------------------------


%%% DIFFUSION FOR TRACERS
%%% GRAVITY WAVE ABSORBING LAYER
%%% ILM files with PGF90 ?
%%% WPS PREP_MARS can be linked between e.g. pgf and mpi, or ifort and mpifort

%%% RESTART: see SVN

%%% LMD with old physics does not compile with ifort

%%% rel_path= 32ppd:thermal_TES/
%%% -- this should be put in GEOGRID.TBL




\clearemptydoublepage