Preprint, 18th Conference on Severe Local Storms, Amer. Meteor. Soc.,
San Francisco, CA, February 1996
REALTIME NUMERICAL PREDICTION OF STORM-SCALE WEATHER
DURING VORTEX '95: GOALS AND METHODOLOGY
Kelvin K. Droegemeier+,1,2, Ming Xue1, Adwait Sathye1,
Keith Brewster1,
Gene Bassett1,2, Jian Zhang1, Yuhe Liu1,
Min Zou1, Andrew Crook3, Vince Wong1,
Richard Carpenter4, and Craig Mattocks5
1Center for Analysis and Prediction of
Storms* and 2School of Meteorology
University of Oklahoma
Norman, OK 73019
3National Center for Atmospheric Research*, Boulder, CO 80307
4Center for Computational Geosciences
University of Oklahoma
Norman, OK 73019
5NOAA Atlantic Oceanographic and Meteorological Laboratory Miami,
FL 33149
1. INTRODUCTION
The Center for Analysis and Prediction of Storms was established at the
University of Oklahoma in 1988 as one of the National Science Foundation's
first 11 Science and Technology (S&T) Centers. Its mission is to demonstrate
the practicability of storm-scale numerical weather prediction and to develop,
test, and validate a regional forecast system appropriate for operational,
commercial, and research applications. Its ultimate vision is to make available
a fully functioning, multi-season storm-scale NWP system around the turn
of the century.
Numerical prediction models were first used in operational forecasting about
four decades ago, with spatial resolutions of order 200 km the norm. Since
that time, such models have grown in sophistication and complexity, and
advances in computer technology have led to spatial resolutions as fine
as 30 km over domains several thousand kilometers on a side. Despite this
move toward representing smaller scale weather, operational models continue
to lack the spatial resolution and input data required to capture highly
energetic, sometimes destructive, short-duration events such as thunderstorms,
snow bands, and downslope windstorms.
In contrast to the trend at operational centers toward increasingly higher
spatial resolution, CAPS is attacking the NWP problem from the other direction,
i.e., beginning with the explicit representation of storm-scale events using
model resolutions of order 1 km and input data provided primarily by scanning
Doppler radars. This philosophy requires that the somewhat larger-scale
or background environment also be properly represented, and thus CAPS' work
extends slightly into the mesoscale regime as well.
As a step toward meeting its objectives in storm-scale NWP, and to draw
the operational community more closely into its development efforts, CAPS
began in 1993 a series of realtime operational tests (CRAFT, or Cooperative
Regional Assimilation and Forecast Test) of its prediction system (ARPS,
or Advanced Regional Prediction System;
Xue et al. 1995) in collaboration with the Norman NWSFO and Experimental
Forecast Facility (EFF). During the spring 1993 and 1994 severe weather
seasons, these consisted of 4-hour predictions in which the 1 km resolution
ARPS was initialized using a single forecast sounding to determine the basic
mode of convective development over specified regions of Oklahoma and north
Texas (Janish et al. 1994). These tests, patterned after Project
STORMTIPE (Brooks et al. 1993), were extremely valuable because they
introduced CAPS to the constraints and challenges of an operational environment,
provided a direct mechanism for bringing forecaster input into CAPS, and
brought to light problems with the ARPS that had, until that time, been
unrecognized in a controlled research setting. However, they were still
simple relative to the storm-scale prediction concept envisioned by CAPS
(i.e., horizontally inhomogeneous model initial conditions, boundary conditions
provided by larger-scale models, and initial conditions based on Doppler
radar data).
A more realistic operational experiment was conducted by CAPS as part of
the VORTEX '95 field program (Rasmussen et al. 1994). Specifically,
from late April through early June, the ARPS was run on a nearly daily basis
with fairly complete physics at two horizontal resolutions: an outer mesoscale
domain of 15 km spacing and a nested, storm-scale domain of 3 km spacing.
The coarse grid prediction was initialized from the NMC Rapid Update Cycle
(RUC) forecast valid at 18Z, while the fine mesh was initialized using the
RUC and OLAPS (see section 3). Six hour forecasts were produced for both
domains.
We describe herein the goals and methodology of the operational tests, including
the porting of ARPS to and its performance on the Cray T3D parallel supercomputer.
A companion paper in this volume (Xue et al. 1996) presents the operations
summary and results from selected cases.
2. GOALS
The spring 1995 operational tests, the first conducted by CAPS in a true
NWP mode, were designed to achieve the following goals, listed in no
particular order: a) to provide experience to, and obtain feedback from,
operational forecasters using a nonhydrostatic storm-scale model; b) to
evaluate model skill and develop tools for doing so given the spatial and
temporal intermittency of storm-scale weather; c) to develop forecast products
appropriate for the storm-scale; d) to gain practical experience dealing
with the logistics of operational NWP, including data acquisition, formatting
and communication, high-performance parallel computing, and product generation;
and e) to solicit information about ARPS' forecasts from local, national,
and international scientists and students in the government, private, and
educational sectors by making the model output available on the World Wide
Web.
3. OPERATIONAL CONFIGURATION
Two 6-hour forecasts were made with the ARPS each operational day from 26
April through 8 June 1995. The first utilized 15 km horizontal grid resolution
over a 1200 x 1200 km area centered over western Oklahoma (Fig.
1). The vertical grid resolution varied over 35 levels from 100 m near the
ground to 900 m at the top of the domain.

Figure 1. Configuration of the ARPS prediction grids used in the VORTEX '95
operational tests. The 3 km resolution inner domain, nested one-way within
the 15 km resolution domain, was repositioned daily based on the anticipated
location of severe weather.

Initial and boundary conditions were provided by NMC's 60 km resolution Rapid
Update Cycle (RUC) forecast valid at 18Z the same day. The initial fields were
interpolated in space directly onto the ARPS grid (which used the RUC terrain),
while the boundary conditions were interpolated linearly in time using 3-hourly
RUC data. This version of the ARPS used the Kuo cumulus parameterization
scheme, a surface energy budget and 2-layer soil model package, a 1.5 order
TKE turbulence parameterization, and stability-dependent surface momentum,
heat and moisture fluxes.
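The linear time interpolation of the boundary conditions is simple enough to sketch. The fragment below is not the ARPS code itself; it is only an illustration, with purely hypothetical array shapes and names, of how a boundary field valid at an intermediate forecast hour can be formed from the two bracketing 3-hourly RUC dumps.

```python
import numpy as np

def interp_boundary_in_time(field_t0, field_t1, t0, t1, t):
    """Linearly interpolate a boundary field between two RUC dump times.

    field_t0, field_t1 : arrays of the same shape, valid at times t0 and t1 (s)
    t                  : target model time (s), with t0 <= t <= t1
    """
    w = (t - t0) / (t1 - t0)              # interpolation weight in [0, 1]
    return (1.0 - w) * field_t0 + w * field_t1

# Example: a boundary u-wind slab at forecast hour 4, bracketed by the
# 3-hour and 6-hour RUC dumps (values and shapes are placeholders).
u_bc_3h = np.full((35, 83), 10.0)
u_bc_6h = np.full((35, 83), 16.0)
u_bc_4h = interp_boundary_in_time(u_bc_3h, u_bc_6h, 3 * 3600, 6 * 3600, 4 * 3600)
```

The same weight applies to every boundary variable at a given model time, so the interpolation reduces to one scalar weight per time step.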
The second forecast utilized 3 km horizontal grid spacing over a 336 x 336
km area, the location of which was based on the daily severe weather target
area as determined by VORTEX forecasters (Fig. 1). This domain was nested
one-way within the 15 km resolution domain described above, and used the
same physics and vertical grid except with the Kuo scheme replaced by the
Kessler warm-rain microphysical parameterization. For most days during the
latter half of the experiment, the initial conditions for the storm-scale
domain were provided by the 10 km resolution Oklahoma Local Analysis and
Prediction System (OLAPS; Brewster et al. 1994; McGinley 1995; Fig.
1), which included data from several sources, including the Oklahoma Mesonet.
[See Albers 1995 for a description of the LAPS wind analysis.]
The detailed operational flowchart for the VORTEX '95 predictions is shown
in Fig. 2. Note that the 15 km resolution ARPS was initialized from a 6-hour
RUC forecast that was valid at 18Z but already available at 16Z, around
the time the ARPS model was started! Hourly data dumps from this 15 km grid
forecast were used as boundary conditions for the inner 3 km resolution
run. WSR-88D data were handled through the remapper described by Brewster
et al. (1995), and all other observations were interfaced to the prediction
system through the OLAPS. Conversion of data to the ARPS coordinate system
was controlled by a general piece of software called EXT2ARPS (i.e., convert
external file to ARPS format), which also performed a 3-D mass continuity
adjustment on the interpolated wind fields.
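The EXT2ARPS adjustment itself is not reproduced here. As a rough illustration of a 3-D mass continuity adjustment, the sketch below diagnoses a vertical velocity consistent with simple incompressible continuity by integrating the horizontal divergence of the interpolated winds upward from w = 0 at the surface. The uniform grid, flat terrain, and all names are illustrative; the actual EXT2ARPS formulation may differ (e.g., density-weighted continuity in terrain-following coordinates).

```python
import numpy as np

def diagnose_w_from_continuity(u, v, dx, dy, dz):
    """Diagnose w from incompressible continuity,
       dw/dz = -(du/dx + dv/dy), integrated upward from w = 0 at the surface.

    u, v : arrays shaped (nz, ny, nx) on a uniform grid (illustrative layout).
    """
    dudx = np.gradient(u, dx, axis=2)
    dvdy = np.gradient(v, dy, axis=1)
    div_h = dudx + dvdy                         # horizontal divergence, each level
    w = np.zeros_like(u)
    for k in range(1, u.shape[0]):              # simple upward trapezoidal integration
        w[k] = w[k - 1] - 0.5 * dz * (div_h[k] + div_h[k - 1])
    return w

# Illustrative use on random winds over a (35, 83, 83) grid with dz = 500 m
u = np.random.randn(35, 83, 83)
v = np.random.randn(35, 83, 83)
w = diagnose_w_from_continuity(u, v, dx=15000.0, dy=15000.0, dz=500.0)
```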
Figure 2. Data flow chart for the realtime ARPS predictions
made during VORTEX '95. Solid (dashed) arrows indicate data flow for initial
(boundary) conditions. Adapted from Brewster et al. (1995).
The model execution and product generation for the entire forecast cycle
were automated using UNIX shell scripts and crontab entries. Each day, a
CAPS "duty scientist", often a graduate or undergraduate student, monitored
the forecast process, which from start to finish took about 3 hours.
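The scripts themselves are not shown in this paper. Purely as a schematic of the kind of driver a cron entry might launch, the sketch below strings the main steps of the cycle together; every program name and argument is hypothetical.

```python
#!/usr/bin/env python
"""Schematic forecast-cycle driver (all program names and paths hypothetical)."""
import subprocess
import sys
from datetime import datetime, timezone

STEPS = [
    ["get_ruc_data", "--valid", "18Z"],        # fetch RUC initial/boundary data
    ["ext2arps", "ruc.grb", "arps_init.dat"],  # convert to the ARPS grid/format
    ["run_arps", "arps.input"],                # launch the model run
    ["make_products", "history/"],             # generate graphics for the Web
]

def main():
    start = datetime.now(timezone.utc)
    for cmd in STEPS:
        print(f"[{datetime.now(timezone.utc):%H:%MZ}] running: {' '.join(cmd)}")
        result = subprocess.run(cmd)
        if result.returncode != 0:             # stop and let the duty scientist intervene
            sys.exit(f"step failed: {' '.join(cmd)}")
    print(f"cycle finished in {datetime.now(timezone.utc) - start}")

if __name__ == "__main__":
    main()
```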
4. EVALUATION AND ARCHIVAL
The ARPS forecasts were evaluated in a variety of ways by scientists and
students around the world. Locally, the ARPS gridded output was converted
to GEMPAK format and shipped at hourly intervals to the NOAA Storm Prediction
Center (which, in 1995, was located at the National Severe Storms Laboratory)
and the co-located VORTEX Operations Center. There, forecasters could use
tools such as NTRANS to create selected graphical products for evaluation.
Several 4-panel images of selected fields for both the 15 and 3 km resolution
runs were also automatically created and shipped to the CAPS World Wide
Web Home Page (http://wwwcaps.uoknor.edu/Forecasts). This allowed local
students to easily compare the forecasts with other data (e.g., OLAPS) in
realtime, and was also a convenient mechanism for obtaining input from interested
scientists around the world.
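The actual graphics pipeline is not described here; the sketch below simply illustrates how such 4-panel images can be generated automatically with standard plotting tools. The fields, titles, and file name are placeholders, not the products actually produced.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")                      # render off-screen, as an automated job would
import matplotlib.pyplot as plt

# Placeholder 2-D fields standing in for ARPS history-dump variables
ny, nx = 112, 112
fields = {
    "Surface temperature": 300 + 5 * np.random.rand(ny, nx),
    "Dewpoint":            290 + 5 * np.random.rand(ny, nx),
    "Vertical velocity":   np.random.randn(ny, nx),
    "Rainwater":           np.abs(np.random.randn(ny, nx)),
}

fig, axes = plt.subplots(2, 2, figsize=(10, 8))
for ax, (title, data) in zip(axes.flat, fields.items()):
    cs = ax.contourf(data)
    ax.set_title(title)
    fig.colorbar(cs, ax=ax)
fig.suptitle("ARPS 3 km forecast, hour 4 (illustrative)")
fig.savefig("arps_4panel_hr04.png", dpi=100)
```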
All data collected during VORTEX, including the OLAPS analyses, are being
archived by the NCAR Office of Field Programs Support (OFPS). During each
ARPS forecast, history dumps of all raw model fields were produced at hourly
intervals, and these data, along with all initial and boundary conditions,
source and object code, and the model executable, were saved on the mass
storage system at the Pittsburgh Supercomputing Center. At this time, the
forecasts have not been archived at the OFPS.
5. COMPUTATIONAL STRATEGY AND MODEL PERFORMANCE
Two different computers were used for the operational tests, both located
at the Pittsburgh Supercomputing Center. The first, a 16-processor Cray
C90, was used for the 15 km resolution outer domain predictions. Two processors
of this machine were dedicated to the operational tests each day, and a
6 hour forecast took from 17 to 20 minutes of wallclock time to complete.
This code required 22 million words of central memory (domain of 83 x 83
x 35 points), and executed at around 400 megaflops per processor, or approximately
40% of peak machine performance. Due to the many changes made to the ARPS
just prior to the tests, a complete optimization was not performed. Subsequent
benchmark timings showed that the ARPS can execute at 450 to 500 megaflops
per processor on the C90.
The 3 km storm-scale domain was executed on a 256 node dedicated partition
of the massively parallel Cray T3D supercomputer. This machine has physically
distributed memory that is globally addressable, with nodes connected in a
3-D torus network; to make effective use of its power, the ARPS was converted to a distributed
memory model using the Parallel Virtual Machine (PVM) message passing library
(see Droegemeier et al. 1995 and Johnson et al. 1994 for details
on data decomposition).
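The decomposition details are given in the cited references. As a rough sketch of the idea, each processing element (PE) owns a rectangular patch of the horizontal grid and exchanges a one-point halo with its neighbors. The helper below computes illustrative patch bounds only; it is not the ARPS code and assumes the grid divides evenly among the PEs.

```python
def local_patch(nx, ny, npx, npy, rank):
    """Return the (inclusive) global index range owned by `rank` in an
    npx-by-npy processor grid covering an nx-by-ny horizontal domain."""
    px, py = rank % npx, rank // npx          # processor coordinates
    bx, by = nx // npx, ny // npy             # base patch size (assumes even division)
    i0, i1 = px * bx, (px + 1) * bx - 1
    j0, j1 = py * by, (py + 1) * by - 1
    return (i0, i1), (j0, j1)

# Example: a 112 x 112 storm-scale grid on a 16 x 16 processor array (256 PEs)
for rank in (0, 1, 255):
    (i0, i1), (j0, j1) = local_patch(112, 112, 16, 16, rank)
    print(f"PE {rank:3d}: owns i={i0}..{i1}, j={j0}..{j1} (+1-point halo each side)")
```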
In order to avoid maintaining more than one version of the ARPS, the message
passing calls were embedded in the original vector/sequential code as standard
comment statements, i.e., readable directives (e.g., "cMP bc 2d real").
A translator developed at CAPS by scientific programmers Norman Lin and
Adwait Sathye was then used to automatically convert these comments into
appropriate PVM or MPI (message passing interface) syntax. The translator,
written as a general tool, bases its coding decisions on the execution platform
(specified by the user), and thus all of the local modifications (or "hooks")
are provided by the translator (e.g., for switching between the SPMD and
master-slave paradigms if desired). Because most parallel platforms lacked
adequate parallel I/O support, each process created by the PVM library read
and wrote its own individual files, and tools were created at CAPS to
automatically split and merge these files as needed.
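The CAPS translator is a substantial tool and is not reproduced here; the toy sketch below conveys only the general idea of expanding a comment directive such as "cMP bc 2d real" into message-passing source text. The directive table and the generated call are made up for illustration and do not reflect the translator's actual output.

```python
# Toy illustration of a comment-directive translator (not the CAPS tool).
# It scans source lines for "cMP ..." comments and appends generated code.

TEMPLATES = {
    # directive keyword -> illustrative code to emit (purely hypothetical)
    "bc": "      call exchange_boundary_{dims}_{dtype}()",
}

def translate(lines):
    out = []
    for line in lines:
        out.append(line)
        if line.lower().startswith("cmp "):
            _, keyword, dims, dtype = line.split()
            template = TEMPLATES.get(keyword.lower())
            if template is not None:
                out.append(template.format(dims=dims, dtype=dtype))
    return out

source = [
    "      subroutine solve",
    "cMP bc 2d real",
    "      return",
    "      end",
]
print("\n".join(translate(source)))
```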
Although code synchronization calls, calls for global operations (e.g.,
sum, max, min), and initialization calls (e.g., to obtain processor IDs)
were coded by hand, we estimate that 80% of the ARPS conversion was handled
automatically by the translator developed at CAPS.
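These hand-coded pieces amount to a handful of standard message-passing calls. A minimal sketch of the equivalent operations, using MPI through the mpi4py package rather than the PVM interface actually used in 1995, is shown below.

```python
from mpi4py import MPI      # assumes mpi4py and an MPI library are installed

comm = MPI.COMM_WORLD
rank = comm.Get_rank()      # "initialization call": obtain this processor's ID
size = comm.Get_size()

local_max_w = 12.5 + rank   # stand-in for each processor's local maximum w
global_max_w = comm.allreduce(local_max_w, op=MPI.MAX)   # global reduction

if rank == 0:
    print(f"{size} processes, global max w = {global_max_w}")
```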
The T3D code required 36 million words of memory (112 x 112 x 35 points),
and executed during the operational tests at a speed of 11 megaflops per
processor, which is about 7% of peak machine performance for the Alpha EV4
chip used. This translated into about 75 minutes of wallclock time to generate
a 6 hour forecast. We attribute this rather low code performance (most tuned
fluid dynamics codes run at about 15 to 20 megaflops per processor on the
T3D) to the relatively small cache (1 kiloword) available on the Alpha chip,
the lack of any optimization in support of the operational tests, and the
relatively slow speed of PVM (improvements have since been made). Recent
tests with an MPI version of the ARPS show much better performance on several
platforms, and we anticipate correspondingly improved statistics in our
subsequent field evaluations.
The memory requirements for these tests were well within the memory available
on the Cray C90. However, we chose to run on the T3D in 1995 in order to
prepare ourselves for the much larger problem sizes that will only be accommodated
on such machines in the future.
6. THE FUTURE
Overall, CAPS views this operational experiment as highly successful, both
scientifically and technologically. The model performed surprisingly well
on a number of cases (see Xue et al. 1996 for examples), even though no
convection was present in the initial state provided by the RUC and no single-Doppler
velocity retrievals or data assimilation were employed. The former situation,
which resulted in a time lag between the model forecast and the real atmosphere,
will be remedied in spring 1996 when a diabatic initialization scheme that
uses radar reflectivity to diagnose latent heating is added. Additionally
in 1996, as shown in Fig. 3, CAPS hopes to begin using WSR-88D wideband
data in realtime, at least from the KTLX radar, and bring the resolution
of the two domains down to 10 and 2 km. The ARPS initial state will be generated
through a sequence of iterations between the ARPS and a new analysis system
written specifically for its coordinate framework. This new system, components
of which will be taken from LAPS, will provide a higher-resolution and more
data-rich set of fields than is available from the RUC forecast alone.
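The diabatic initialization scheme itself is beyond the scope of this paper. Purely to illustrate the idea of tying latent heating to observed reflectivity, the sketch below converts reflectivity to rain rate with the classical Marshall-Palmer relation (Z = 200 R^1.6) and then to an implied column-integrated condensation heating; both the relation and the column estimate are generic illustrations, not the CAPS formulation.

```python
import numpy as np

LV = 2.5e6          # latent heat of vaporization, J kg^-1
RHO_WATER = 1000.0  # density of liquid water, kg m^-3

def column_heating_from_dbz(dbz):
    """Illustrative column-integrated latent heating (W m^-2) implied by
    reflectivity, via the Marshall-Palmer relation Z = 200 R^1.6."""
    z_lin = 10.0 ** (dbz / 10.0)                 # reflectivity factor, mm^6 m^-3
    rain_mm_per_h = (z_lin / 200.0) ** (1.0 / 1.6)
    rain_m_per_s = rain_mm_per_h * 1.0e-3 / 3600.0
    return LV * RHO_WATER * rain_m_per_s         # W m^-2 released in the column

for dbz in (20.0, 40.0, 55.0):
    print(f"{dbz:4.1f} dBZ -> {column_heating_from_dbz(dbz):8.1f} W m^-2")
```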
Figure 3. Projected data flow schematic for the realtime
ARPS predictions during 1996.
CAPS will continue its series of operational tests beyond 1996, with increasing
emphasis on wintertime weather. Instead of attempting to run continuously
for long periods of time, a number of "operational periods" will
be identified during each season, allowing sufficient spin-up of personnel
and computing resources while at the same time limiting the commitment of
both. The ARPS will also be run during quiescent periods to evaluate the
prediction of basic parameters such as surface temperature.
Finally, CAPS has embarked on a 3-year research project with American Airlines
to evaluate the feasibility of small-scale NWP in airline operations, with
specific emphasis on hub airports and selected high traffic routes. This
project is affectionately known as "Hub-CAPS!"
7. ACKNOWLEDGMENTS
This research was supported by the Center for Analysis and Prediction of
Storms (CAPS) at the University of Oklahoma. CAPS is funded by Grant ATM91-20009
from the National Science Foundation, and by a supplemental grant through
the NSF from the Federal Aviation Administration. Computer resources were
provided by the Pittsburgh Supercomputing Center, which is also sponsored
by the NSF. The authors gratefully acknowledge Sue Weygandt for drafting
the figures, and express sincere appreciation to their colleagues at the
Norman National Weather Service Forecast Office, the NOAA National Severe
Storms Laboratory, and the NOAA Storm Prediction Center.
8. REFERENCES
Albers, S., 1995: The LAPS wind analysis. Wea. Forecasting, 10, 342-352.
Brewster, K., F. Carr, N. Lin, J. Straka, and J. Krause, 1994: A local analysis
and prediction system for initializing realtime convective-scale models.
Preprints, 10th Conf. on Num. Wea. Pred., 18-22 July, Amer. Meteor.
Soc., Portland, OR, 596-598.
Brewster, K., S. Albers, J. Carr, and M. Xue, 1995: Initializing a nonhydrostatic
forecast model using WSR-88D data and OLAPS. Preprints, 27th Conf. on
Radar Meteor., 9-13 October, Amer. Meteor. Soc., Vail, CO.
Brooks, H.E., C.A. Doswell III, and L.J. Wicker, 1993: STORMTIPE: A forecasting
experiment using a three-dimensional cloud model. Wea. Forecasting,
8, 352-362.
Droegemeier, K.K., M. Xue, K. Johnson, M. O'Keefe, A. Sawdey, G. Sabot,
S. Wholey, N.T. Lin, and K. Mills, 1995: Weather prediction: A scalable
storm-scale model. Chapter 3 (p. 45-92) in High Performance Computing,
G. Sabot (Ed.), Addison-Wesley, Reading, Massachusetts, 246pp.
Janish, P.R., K.K. Droegemeier, M. Xue, K. Brewster, and J. Levit, 1994:
Evaluation of the Advanced Regional Prediction System (ARPS) for storm-scale
operational forecasting. Preprints, 14th Conf. on Wea. Analysis and Forecasting,
15-20 January, Amer. Meteor. Soc., Dallas, TX, 224-229.
Johnson, K.W., J. Bauer, G.A. Riccardi, K.K. Droegemeier, and M. Xue, 1994:
Distributed processing of a regional prediction model. Mon. Wea. Rev.,
122, 2558-2572.
McGinley, J.A., 1995: Opportunities for high resolution data analysis, prediction,
and product dissemination within the local weather office. Preprints, 14th
Conf. on Weather Analysis and Forecasting, 15-20 January, Dallas, TX,
Amer. Meteor. Soc., 478-485.
Rasmussen, E.N., J.M. Straka, R. Davies-Jones, C.A. Doswell III, F.H. Carr,
M.D. Eilts, and D.R. MacGorman, 1994: Verification of the Origins of Rotation
in Tornadoes Experiment (VORTEX). Bull. Amer. Meteor. Soc., 75,
995-1005.
Xue, M., K. K. Droegemeier, V. Wong,
A. Shapiro, and K. Brewster, 1995: ARPS Version 4.0 User's Guide.
Center for Analysis and Prediction of Storms, Univ. of Oklahoma, 380pp.
[Available from CAPS, 100 East Boyd, Room 1110, Norman, OK, 73019.]
Xue, M., K. Brewster, F. Carr, K. Droegemeier,
V. Wong, Y. Liu, A. Sathye, G. Bassett, P. Janish, J. Levit and P. Bothwell,
1996: Realtime numerical prediction of storm-scale weather during VORTEX
'95, Part II: Operations summary and example predictions. Preprints, 18th
Conf. on Severe Local Storms, 19-23 Feb., Amer. Meteor. Soc., San Francisco,
CA. This volume.