NSSL Laboratory Review 2015
Books and peer-reviewed publications authored by NSSL and NSSL/CIMMS employees, FY 2010–2014
FY 2014 — 114 publications
2014: Projections of heat waves with high impact on human health in Europe. Global and Planetary Change, 119, 71–84, doi:10.1016/j.gloplacha.2014.05.006.
Climate change will result in more intense, more frequent, and longer-lasting heat waves. The most hazardous conditions emerge when extreme daytime temperatures combine with warm night-time temperatures, high humidities, and light winds for several consecutive days. Here, we assess present and future heat wave impacts on human health in Europe. Present daily physiologically equivalent temperatures (PET) are derived from the ERA-Interim reanalysis. PET makes it possible to focus specifically on heat-related risks to humans. For projections, a suite of high-resolution regional climate models, run under the SRES A1B scenario, has been used. A quantile–quantile adjustment is applied to the daily simulated PET to correct biases in individual model climatologies, and a multimodel ensemble strategy is adopted to encompass model errors. Two types of heat waves that affect human health differently, strong and extreme stress, are defined according to specified thresholds of thermal stress and duration. Heat wave number, frequency, duration, and amplitude are derived for each type. Results reveal relatively strong correlations between the spatial distribution of strong and extreme heat wave amplitudes and excess mortality for the 2003 European summer. Projections suggest a steady increase and a northward extension of heat wave attributes in Europe. Strong-stress heat wave frequency could increase by more than 40 days, and duration by more than 20 days, by 2075–2094. Amplitudes might increase by up to 7 °C per heat wave day. Important increases in extreme-stress heat wave attributes are also expected: up to 40 days in frequency, 30 days in duration, and 4 °C in amplitude. With this information at hand, policy makers and stakeholders responsible for populations vulnerable to heat stress can respond more effectively to the future challenges imposed by climate warming.
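The quantile–quantile adjustment mentioned above is, in its simplest form, empirical quantile mapping. A minimal sketch follows; the data and function names are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def qq_adjust(model_hist, obs_hist, model_proj, n_q=101):
    """Empirical quantile mapping: correct model values by matching the
    quantiles of the model's historical climatology to the observed ones."""
    q = np.linspace(0.0, 1.0, n_q)
    mq = np.quantile(model_hist, q)  # model climatology quantiles
    oq = np.quantile(obs_hist, q)    # reference (e.g., reanalysis PET) quantiles
    # Each projected value is mapped through its model quantile to the
    # corresponding observed quantile (linear interpolation between knots).
    return np.interp(model_proj, mq, oq)

rng = np.random.default_rng(0)
obs = rng.normal(20.0, 5.0, 5000)    # reference daily PET in degC (toy data)
hist = rng.normal(23.0, 7.0, 5000)   # biased model climatology
proj = hist + 4.0                    # projection carrying the same bias

corrected = qq_adjust(hist, obs, proj)
# The historical bias is removed while the projected warming signal is kept.
```

Applying the mapping to the historical period itself recovers the reference climatology, which is the usual sanity check for this kind of bias correction.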
2013: Theory and observations of controls on lightning flash size spectra. Journal of the Atmospheric Sciences, 70, 4012–4029, doi:10.1175/JAS-D-12-0289.1.
Previous analyses of very high frequency (VHF) Lightning Mapping Array (LMA) observations relative to the location of deep convective updrafts have noted a systematic pattern in flash characteristics. In and near strong updrafts, flashes tend to be smaller and more frequent, while flashes far from strong vertical drafts exhibit the opposite tendency. This study quantitatively tests these past anecdotal observations using LMA data for two supercell storms that occurred in Oklahoma in 2004. The data support a prediction from electrostatics that frequent breakdown and large flash extents are opposed. An energetic scaling that combines flash rate and flash area exhibits a 5/3 power-law scaling regime on scales of a few kilometers and a maximum in flash energy at about 10 km. The spectral shape is surprisingly consistent across a range of moderate to large flash rates. The shape of this lightning flash energy spectrum is similar to that expected of turbulent kinetic energy spectra in thunderstorms. In line with the hypothesized role of convective motions as the generator of thunderstorm electrical energy, the correspondence between kinematic and electrical energy spectra suggests that advection of charge-bearing precipitation by the storm’s flow, including in turbulent eddies, couples the electrical and kinematic properties of a thunderstorm.
2014: Continuous variability in thunderstorm primary electrification and an evaluation of inverted-polarity terminology. Atmospheric Research, 135-136, 274–284, doi:10.1016/j.atmosres.2012.10.009.
Several field campaigns since the year 2000 have focused on anomalously electrified or “inverted polarity” thunderstorms. This study synthesizes these recent results and considers how variability in the non-inductive relative-growth-rate electrification mechanism might clarify the meaning of “inverted polarity”. Instead of falling into two polarity classes, electrification and charge structure in strong updrafts vary continuously, as expected if depletion of supercooled water is a primary control on electrification. Two- or three-dimensional storm flows or other electrification mechanisms are required to combine one or more of these electrification regimes into “inverted” or otherwise complicated local charge sequences. Cloud flashes that result from these local charge sequences should be termed “positive” and “negative” instead of “normal” and “inverted” because cloud flashes of either polarity can occur at any altitude in thunderstorms.
2014: Forecaster Use and Evaluation of Real-Time 3DVAR Analyses during Severe Thunderstorm and Tornado Warning Operations in the Hazardous Weather Testbed. Weather and Forecasting, 29, 601–613, doi:10.1175/WAF-D-13-00107.1.
A weather-adaptive three-dimensional variational data assimilation (3DVAR) system was included in the NOAA Hazardous Weather Testbed as a first step toward introducing warn-on-forecast initiatives into operations. NWS forecasters were asked to incorporate the data in conjunction with single-radar and multisensor products in the Advanced Weather Interactive Processing System (AWIPS) as part of their warning-decision process for real-time events across the United States. During the 2011 and 2012 experiments, forecasters examined more than 36 events, including tornadic supercells, severe squall lines, and multicell storms. Products from the 3DVAR analyses were available to forecasters at 1-km horizontal resolution every 5 min, with a 4–6-min latency, incorporating data from the national Weather Surveillance Radar-1988 Doppler (WSR-88D) network and the North American Mesoscale model. Forecasters found the updraft, vertical vorticity, and storm-top divergence products the most useful for storm interrogation and for quickly visualizing storm trends, often using these tools to increase confidence in a warning decision and/or to issue the warning slightly earlier. The 3DVAR analyses were most consistent and reliable when the storm of interest was in close proximity to one of the assimilated WSR-88D radars, or when data from multiple radars were incorporated into the analysis. The latter was extremely useful to forecasters in blending data rather than having to analyze multiple radars separately, especially when range folding obscured the data from one or more radars. The largest hurdle for the real-time use of 3DVAR or similar data assimilation products by forecasters is the data latency, as even 4–6 min reduces the utility of the products when new radar scans are available.
2014: Enhancing quantitative precipitation estimation over the continental United States using a ground-space multi-sensor integration approach. IEEE Geoscience and Remote Sensing Letters, 11, 1305–1309, doi:10.1109/LGRS.2013.2295768.
Quantitative precipitation estimation (QPE) based on ground weather radar could be considerably affected by the broadening, ascent, and blockage of the radar beam. These problems are particularly prevalent in mountainous regions. The current study proposes a multi-sensor approach to improve the ground-radar QPE in complex terrain. The proposed method, namely the Vertical Profile of Reflectivity (VPR) Identification and Enhancement (VPR-IE), integrates NOAA's National Mosaic QPE (NMQ) system and NASA's Tropical Rainfall Measuring Mission (TRMM) precipitation radar (PR) measurements. This study demonstrates promising performance of VPR-IE in the Mountainous West Region of the U.S. The potential error sources of this approach and its real-time implementation over the Continental United States are addressed as well.
2013: Performance evaluation of radar and satellite rainfalls for Typhoon Morakot over Taiwan: Are remote-sensing products ready for gauge denial scenario of extreme events? Journal of Hydrology, 506, 4–13, doi:10.1016/j.jhydrol.2012.12.026.
This study evaluated rainfall estimates from a ground radar network and four satellite algorithms against a relatively dense rain gauge network over Taiwan Island for the 2009 extreme Typhoon Morakot at various spatiotemporal scales (from 0.04° to 0.25° and from hourly to event-total accumulation). The results show that all the remote-sensing products underestimate the rainfall compared to the rain gauge measurements, in the order of radar (−18%), 3B42RT (−19%), PERSIANN-CCS (28%), 3B42V6 (−36%), and CMORPH (−61%). The ground radar estimates are also the most correlated with gauge measurements, with a correlation coefficient (CC) of 0.81 (0.82) at 0.04° (0.25°) spatial resolution. Among the satellite products, CMORPH has the best spatial correlation (0.70) but largely underestimates the total rainfall accumulation. Compared to the algorithms that ingest microwave data, the IR-dominant algorithms provide a better estimate of the total rainfall accumulation but poorly resolve the temporal evolution of the warm-cloud typhoon, most notably a large overestimation at the early storm stage. This study suggests that the best performance comes from the ground radar estimates, which could be used as an alternative in the case of gauge denial. However, the current satellite rainfall products still have limitations in resolution and accuracy, especially for this type of extreme typhoon.
2013: Evaluation of spatial errors of precipitation rates and types from TRMM spaceborne radar over the southern CONUS. Journal of Hydrometeorology, 14, 1884–1896, doi:10.1175/JHM-D-13-027.1.
In this paper, the authors estimate the uncertainty of the rainfall products from NASA and Japan Aerospace Exploration Agency's (JAXA) Tropical Rainfall Measurement Mission (TRMM) Precipitation Radar (PR) so that they may be used in a quantitative manner for applications like hydrologic modeling or merging with other rainfall products. The spatial error structure of TRMM PR surface rain rates and types was systematically studied by comparing them with NOAA/National Severe Storms Laboratory's (NSSL) next generation, high-resolution (1 km/5 min) National Mosaic and Multi-Sensor Quantitative Precipitation Estimation (QPE; NMQ/Q2) over the TRMM-covered continental United States (CONUS). Data pairs are first matched at the PR footprint scale (5 km/instantaneous) and then grouped into 0.25° grid cells to yield spatially distributed error maps and statistics using data from December 2009 through November 2010. Careful quality control steps (including bias correction with rain gauges and quality filtering) are applied to the ground radar measurements prior to considering them as reference data. The results show that PR captures well the spatial pattern of total rainfall amounts with a high correlation coefficient (CC; 0.91) with Q2, but this decreases to 0.56 for instantaneous rain rates. In terms of precipitation types, Q2 and PR convective echoes are spatially correlated with a CC of 0.63. Despite this correlation, PR's total annual precipitation from convection is 48.82% less than that by Q2, which points to potential issues in the PR algorithm's attenuation correction, nonuniform beam filling, and/or reflectivity-to-rainfall relation. Finally, the spatial analysis identifies regime-dependent errors, in particular in the mountainous west. It is likely that the surface reference technique is triggered over complex terrain, resulting in high-amplitude biases.
2013: Evaluation of the successive V6 and V7 TRMM multisatellite precipitation analysis over the Continental United States. Water Resources Research, 49, 8174–8186, doi:10.1002/2012WR012795.
The spatial error structure of surface precipitation derived from successive versions of the TRMM Multisatellite Precipitation Analysis (TMPA) algorithms is systematically studied through comparison with the Climate Prediction Center Unified Gauge daily precipitation Analysis (CPCUGA) over the Continental United States (CONUS) for 3 years, from June 2008 to May 2011. The TMPA products include the version-6 (V6) and version-7 (V7) real-time products 3B42RT (3B42RTV6 and 3B42RTV7) and research products 3B42 (3B42V6 and 3B42V7). The evaluation shows that 3B42V7 improves upon 3B42V6 over the CONUS in terms of 3-year mean daily precipitation: the correlation coefficient (CC) increases from 0.85 in 3B42V6 to 0.92 in 3B42V7; the relative bias (RB) decreases from −22.95% in 3B42V6 to −2.37% in 3B42V7; and the root-mean-square error (RMSE) decreases from 0.80 mm in 3B42V6 to 0.48 mm in 3B42V7. The improvement is most distinct in the mountainous West, especially along the coastal northwest mountainous areas, where 3B42V6 (as well as 3B42RTV6 and 3B42RTV7) largely underestimates: the CC increases from 0.86 in 3B42V6 to 0.89 in 3B42V7, and the RB decreases from −44.17% in 3B42V6 to −25.88% in 3B42V7. Over the CONUS, 3B42RTV7 gains only modest improvement over 3B42RTV6, with the RB varying from −4.06% in 3B42RTV6 to 0.22% in 3B42RTV7, but it overestimates more over the central (eastern) US, where the RB increases from 8.18% to 14.92% (0.16% to 3.22%).
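The three scores used throughout these evaluations (CC, RB, RMSE) are standard and easy to reproduce. A minimal sketch with toy arrays, not data from the study:

```python
import numpy as np

def evaluate(est, ref):
    """Correlation coefficient, relative bias (%), and RMSE of an estimated
    precipitation series against a reference (e.g., a gauge analysis)."""
    cc = np.corrcoef(est, ref)[0, 1]
    rb = 100.0 * (np.sum(est) - np.sum(ref)) / np.sum(ref)
    rmse = np.sqrt(np.mean((est - ref) ** 2))
    return cc, rb, rmse

gauge = np.array([1.0, 2.0, 3.0, 4.0])  # mm/day, illustrative
sat = 0.9 * gauge                       # a product with a uniform -10% bias
cc, rb, rmse = evaluate(sat, gauge)
print(round(cc, 2), round(rb, 2), round(rmse, 3))  # 1.0 -10.0 0.274
```

A uniformly biased product illustrates why all three scores are needed: the CC is perfect while the RB and RMSE still flag the systematic error.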
2013: Similarity and difference of the two successive V6 and V7 TRMM multisatellite precipitation analysis performance over China. Journal of Geophysical Research - Atmospheres, 118, 13060–13074, doi:10.1002/2013JD019964.
Similarities and differences of spatial error structures of surface precipitation estimated with the successive version 6 (V6) and version 7 (V7) Tropical Rainfall Measuring Mission (TRMM) Multisatellite Precipitation Analysis (TMPA) algorithms are systematically analyzed through comparison with the China Meteorological Administration's national daily precipitation analysis from June 2008 to May 2011. The TMPA products include the V6 and V7 real-time products 3B42RTV6 and 3B42RTV7 and the research products 3B42V6 and 3B42V7. Both versions of the research products outperform their respective real-time counterparts. 3B42V7 clearly improves upon 3B42V6 over China in terms of daily mean precipitation: the correlation coefficient (CC) increases from 0.89 to 0.93, the relative bias (RB) improves from −4.91% to −0.05%, and the root-mean-square error (RMSE) improves from 0.69 mm to 0.54 mm. When considering 3-year mean precipitation, 3B42V7 shows similar spatial patterns and statistical performance to 3B42V6. Both 3B42RTV7 and 3B42RTV6 show similar bias patterns in most regions of China, with overestimation by 20% in arid regions (i.e., the north and west of China) and slight underestimation in humid regions (e.g., −5.82% in southern China). However, 3B42RTV7 overestimates precipitation more than 3B42RTV6 over the cold Qinghai-Tibetan Plateau, resulting in a much higher RB of 139.95% (128.69%, 136.09%, and 121.11%) in terms of 3-year annual (spring, summer, and autumn) daily mean precipitation and an even worse performance during winter. In this region, 3B42RTV7 shows an overall slightly degraded performance compared with 3B42RTV6, with the CC decreasing from 0.81 to 0.73 and the RB (RMSE) increasing from 21.22% (0.95 mm) to 35.84% (1.27 mm) in terms of daily precipitation.
2014: Application of Object-Based Time-Domain Diagnostics for Tracking Precipitation Systems in Convection-Allowing Models. Weather and Forecasting, 29, 517–542.
Meaningful verification and evaluation of convection-allowing models requires approaches that do not rely on point-to-point matches of forecast and observed fields. In this study, one such approach—a beta version of the Method for Object-Based Diagnostic Evaluation (MODE) that incorporates the time dimension [known as MODE time-domain (MODE-TD)]—was applied to 30-h precipitation forecasts from four 4-km grid-spacing members of the 2010 Storm-Scale Ensemble Forecast system with different microphysics parameterizations. Including time in MODE-TD provides information on rainfall system evolution like lifetime, timing of initiation and dissipation, and translation.
The simulations depicted the spatial distribution of time-domain precipitation objects across the United States quite well. However, all simulations overpredicted the number of objects, with the Thompson microphysics scheme overpredicting the most and the Morrison scheme the least. For the smallest smoothing radius and rainfall threshold used to define objects [8 km and 0.10 in. (1 in. = 2.54 cm), respectively], the most common object duration was 3 h in both the simulations and observations. With an increased smoothing radius and rainfall threshold, the most common duration became shorter. The simulations depicted the diurnal cycle of object frequencies well but overpredicted object frequencies uniformly across all forecast hours. The simulations had a spurious maximum in initiating objects at the beginning of the forecast and a corresponding spurious maximum in dissipating objects slightly later. Examining average object velocities, a slow bias was found in the simulations, which was most pronounced in the Thompson member. These findings should aid users and developers of convection-allowing models and motivate future work utilizing time-domain methods for verifying high-resolution forecasts.
2014: CONUS-wide evaluation of National Weather Service flash flood guidance products. Weather and Forecasting, 29, 377–392, doi:10.1175/WAF-D-12-00124.1.
This study quantifies the skill of the National Weather Service’s (NWS) flash flood guidance (FFG) product. FFG is generated by River Forecast Centers (RFCs) across the United States; local NWS Weather Forecast Offices compare estimated and forecast rainfall to FFG to monitor and assess flash flooding potential. A national flash flood observation database consisting of reports in the NWS publication Storm Data and U.S. Geological Survey (USGS) stream gauge measurements is used to determine the skill of FFG over a 4-yr period. FFG skill is calculated at several different precipitation-to-FFG ratios for both observation datasets. Although a ratio of 1.0 nominally indicates a potential flash flooding event, this study finds that FFG can be more skillful when ratios other than 1.0 are considered. When the entire continental United States is considered, the highest observed critical success index (CSI) with 1-h FFG is 0.20 for the USGS dataset, which should be considered a benchmark for future research that seeks to improve, modify, or replace the current FFG system. Regional benchmarks of FFG skill are also determined on an RFC-by-RFC basis. When evaluated against Storm Data reports, the regional skill of FFG ranges from 0.00 to 0.19. When evaluated against USGS stream gauge measurements, the regional skill of FFG ranges from 0.00 to 0.44.
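The ratio-threshold evaluation can be sketched with the standard CSI formula, CSI = hits / (hits + misses + false alarms). The event table below is entirely hypothetical, purely to illustrate why a ratio other than 1.0 can score higher:

```python
import numpy as np

# Hypothetical events: maximum rainfall-to-FFG ratio and whether flash
# flooding was observed (illustrative numbers, not Storm Data or USGS data).
ratios = np.array([0.6, 0.9, 1.1, 1.4, 0.8, 1.6, 1.0, 2.0])
flooded = np.array([0, 0, 1, 1, 1, 1, 0, 1], dtype=bool)

def csi_at(threshold):
    """Critical success index when warning on ratio >= threshold."""
    warned = ratios >= threshold
    hits = np.sum(warned & flooded)
    misses = np.sum(~warned & flooded)
    false_alarms = np.sum(warned & ~flooded)
    return hits / (hits + misses + false_alarms)

for t in (0.8, 1.0, 1.2):
    print(t, round(csi_at(t), 2))
# In this toy sample the 0.8 threshold outscores the nominal 1.0 threshold.
```

Sweeping the threshold and taking the maximum CSI is the same procedure the study uses to find the most skillful precipitation-to-FFG ratio per region.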
2014: Cloud Microphysical Properties Retrieved from Downwelling Infrared Radiance Measurements Made at Eureka, Nunavut, Canada (2006–09). Journal of Applied Meteorology and Climatology, 53, 772–791, doi:10.1175/JAMC-D-13-0113.1.
The radiative properties of clouds are related to cloud microphysical and optical properties, including water path, optical depth, particle size, and thermodynamic phase. Ground-based observations from remote sensors provide high-quality, long-term, continuous measurements that can be used to obtain these properties. In the Arctic, a more comprehensive understanding of cloud microphysics is important because of the sensitivity of the Arctic climate to changes in radiation. Eureka, Nunavut (80°N, 86°W, 10 m), Canada, is a research station located on Ellesmere Island. A large suite of ground-based remote sensors at Eureka provides the opportunity to make measurements of cloud microphysics using multiple instruments and methodologies. In this paper, cloud microphysical properties are presented using a retrieval method that utilizes infrared radiances obtained from an infrared spectrometer at Eureka between March 2006 and April 2009. These retrievals provide a characterization of the microphysics of ice and liquid in clouds with visible optical depths between 0.25 and 6, which are a class of clouds whose radiative properties depend greatly on their microphysical properties. The results are compared with other studies that use different methodologies at Eureka, providing context for multimethod perspectives. The authors’ findings are supportive of previous studies, including seasonal cycles in phase and liquid particle size, weak temperature–phase dependencies, and frequent occurrences of supercooled water. Differences in microphysics are found between mixed-phase and single-phase clouds for both ice and liquid. The Eureka results are compared with those obtained using a similar retrieval technique during the Surface Heat Budget of the Arctic Ocean (SHEBA) experiment.
2014: Estimation of Near Surface Wind Speeds in Strongly Rotating Flows. Applied Mathematics and Computation, 235C, 201–211, doi:10.1016/j.amc.2014.01.010.
Modeling studies consistently demonstrate that the most violent winds in tornadic vortices occur in the lowest tens of meters above the surface. These velocities are unobservable by radar platforms due to line of sight considerations. In this work, a methodology is developed which utilizes parametric tangential velocity models derived from Doppler radar measurements, together with a tangential momentum and mass continuity constraint, to estimate the radial and vertical velocities in a steady axisymmetric frame. The main result is that information from observations aloft can be extrapolated into the surface layer of the vortex. The impact of the amount of information available to the retrieval is demonstrated through some numerical tests with pseudo-data.
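The specific parametric tangential-velocity models are not given in this summary; a modified-Rankine vortex is one common choice and gives a flavor of the parametric approach. All parameter values here are assumptions for illustration:

```python
import numpy as np

def tangential_wind(r, v_max=60.0, r_max=150.0, alpha=0.6):
    """Modified-Rankine profile: solid-body rotation inside the radius of
    maximum wind r_max (m), power-law decay with exponent alpha outside."""
    r = np.asarray(r, dtype=float)
    outer = v_max * (r_max / np.maximum(r, 1e-9)) ** alpha
    return np.where(r <= r_max, v_max * r / r_max, outer)

radii = np.array([50.0, 150.0, 300.0])  # distance from vortex center, m
winds = tangential_wind(radii)          # peaks at v_max = 60 m/s at r_max
```

In the retrieval described above, profiles of this kind fitted to Doppler observations aloft supply the tangential field, and the momentum and continuity constraints then yield the radial and vertical components near the surface.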
2013: Real-time measurement of the range correlation for range oversampling processing. Journal of Atmospheric and Oceanic Technology, 30, 2885–2895, doi:10.1175/JTECH-D-13-00090.1.
As range-oversampling processing has become more practical for weather radars, implementation issues have become important to ensure the best possible performance. For example, all of the linear transformations that have been utilized for range-oversampling processing directly depend on the normalized range correlation matrix. Hence, accurately measuring the correlation in range time is essential to avoid reflectivity biases and to ensure the expected variance reduction. Although the range correlation should be relatively stable over time, hardware changes and drift due to changing environmental conditions can have measurable effects on the modified pulse. To reliably track changes in the range correlation, an automated real-time method is needed that does not interfere with normal data collection. A method is proposed that uses range-oversampled data from operational radar scans and that works with radar returns from both weather and ground clutter. In this paper, the method is described, tested using simulations, and validated with time series data.
2014: Adaptive Range Oversampling to Improve Estimates of Polarimetric Variables on Weather Radars. Journal of Atmospheric and Oceanic Technology, 31, 1853–1866, doi:10.1175/JTECH-D-13-00216.1.
One way to reduce the variance of meteorological-variable estimates on weather radars without increasing dwell times is by using range oversampling techniques. Such techniques could significantly improve the estimation of polarimetric variables, which typically require longer dwell times to achieve the desired data quality compared to the single-polarization spectral moments. In this paper, an efficient implementation of adaptive pseudowhitening that was developed for single-polarization radars is extended for dual polarization. Adaptive pseudowhitening maintains the performance of pure whitening at high signal-to-noise ratios and equals or outperforms the digital matched filter at low signal-to-noise ratios. This approach results in improvements for polarimetric-variable estimates that are consistent with the improvements for spectral-moment estimates described in previous work. The performance of the proposed technique is quantified using simulations that show that the variance of polarimetric-variable estimates can be reduced without modifying the scanning strategies. The proposed technique is applied to real weather data to validate the expected improvements that can be realized operationally.
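At the pure-whitening end of adaptive pseudowhitening, the oversampled voltages are decorrelated with W = C^(-1/2), where C is the normalized range correlation matrix. A minimal single-polarization sketch with an assumed triangular correlation (the operational matrix is measured on the radar, as described in the preceding paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed normalized range correlation for L = 4 oversampled gates
# (triangular form, as for a rectangular modified pulse; illustration only).
L = 4
C = np.fromfunction(lambda i, j: np.maximum(0.0, 1.0 - np.abs(i - j) / L), (L, L))

# Whitening transform W = C^(-1/2) via eigendecomposition
vals, vecs = np.linalg.eigh(C)
W = vecs @ np.diag(vals ** -0.5) @ vecs.T

# Correlated unit-power complex Gaussian voltages for M independent dwells
A = np.linalg.cholesky(C)
M = 200000
n = (rng.standard_normal((L, M)) + 1j * rng.standard_normal((L, M))) / np.sqrt(2)
v = A @ n                 # E[v v^H] = C
x = W @ v                 # whitened: E[x x^H] = I

p_plain = np.mean(np.abs(v) ** 2, axis=0)  # power from correlated samples
p_white = np.mean(np.abs(x) ** 2, axis=0)  # power from decorrelated samples

# Both estimators are unbiased, but whitening cuts the variance (toward 1/L).
print(np.var(p_white) < np.var(p_plain))   # True
```

The variance reduction is what lets polarimetric variables reach the desired quality without longer dwell times; the adaptive scheme in the paper blends this transform with a matched filter depending on the signal-to-noise ratio.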
2014: The roles of ambient and storm-generated vorticity in the development of near-ground rotation in a simulated supercell. Journal of the Atmospheric Sciences, 71, 3027–3051, doi:10.1175/JAS-D-13-0123.1.
The authors use a high-resolution supercell simulation to investigate the source of near-ground vertical vorticity by decomposing the vorticity vector into barotropic and nonbarotropic parts. This way, the roles of ambient and storm-generated vorticity can be isolated. A new Lagrangian technique is employed in which material fluid volume elements are tracked to analyze the rearrangement of ambient vortex-line segments. This contribution is interpreted as barotropic vorticity. The storm-generated vorticity is treated as the residual between the known total vorticity and the barotropic vorticity.
In the simulation the development of near-ground vertical vorticity is an outflow phenomenon. There are distinct “rivers” of cyclonic shear vorticity originating from the base of downdrafts that feed into the developing near-ground vortex. The origin of these rivers of vertical vorticity is primarily horizontal baroclinic production, which is maximized in the lowest few hundred meters AGL. Subsequently, this horizontal vorticity is tilted upward while the parcels are still descending. The barotropic vorticity remains mostly streamwise along the analyzed trajectories and does not acquire a large vertical component as the parcels reach the ground. Thus, the ambient vorticity that is imported into the storm contributes only a small fraction of the total near-ground vertical vorticity.
2013: Low-Level Polarimetric Radar Signatures in EnKF Analyses and Forecasts of the May 8, 2003 Oklahoma City Tornadic Supercell: Impact of Multimoment Microphysics and Comparisons with Observation. Advances in Meteorology, 2013, 1–13, doi:10.1155/2013/818394.
The impact of increasing the number of predicted moments in a multimoment bulk microphysics scheme is investigated using ensemble Kalman filter analyses and forecasts of the May 8, 2003 Oklahoma City tornadic supercell storm and the analyses are validated using dual-polarization radar observations. The triple-moment version of the microphysics scheme exhibits the best performance, relative to the single- and double-moment versions, in reproducing the low-ZDR hail core and high-ZDR arc, as well as an improved probabilistic track forecast of the mesocyclone. A comparison of the impact of the improved microphysical scheme on probabilistic forecasts of the mesocyclone track with the observed tornado track is also discussed.
2014: Low-Level ZDR Signatures in Supercell Forward Flanks: The Role of Size Sorting and Melting of Hail. Journal of the Atmospheric Sciences, 71, 276–299, doi:10.1175/JAS-D-13-0118.1.
The low levels of supercell forward flanks commonly exhibit distinct differential reflectivity (ZDR) signatures, including the low-ZDR hail signature and the high-ZDR "arc." The ZDR arc has been previously associated with size sorting of raindrops in the presence of vertical wind shear; here this model is extended to include size sorting of hail. Idealized simulations of a supercell storm observed by the Norman, Oklahoma (KOUN), polarimetric radar on 1 June 2008 are performed using a multimoment bulk microphysics scheme, in which size sorting is allowed or disallowed for hydrometeor species. Several velocity diameter relationships for the hail fall speed are considered, as well as fixed or variable bulk densities that span the graupel-to-hail spectrum. A T-matrix-based emulator is used to derive polarimetric fields from the hydrometeor state variables.
Size sorting of hail is found to have a dominant impact on ZDR and can result in a ZDR arc from melting hail even when size sorting is disallowed in the rain field. The low-ZDR hail core only appears when size sorting is allowed for hail. The mean storm-relative wind in a deep layer is found to align closely with the gradient in mean mass diameter of both rain and hail, with a slight shift toward the storm-relative mean wind below the melting level in the case of rain. The best comparison with the observed 1 June 2008 supercell is obtained when both rain and hail are allowed to sort, and the bulk density and associated fall-speed curve for hail are predicted by the model microphysics.
2013: Skill assessment of a real-time forecast system utilizing a coupled hydrologic and coastal hydrodynamic model during Hurricane Irene (2011). Continental Shelf Research, 71, 78–94, doi:10.1016/j.csr.2013.10.007.
Due to the devastating effects of recent hurricanes in the Gulf of Mexico (e.g., Katrina, Rita, Ike, and Gustav), the development of a high-resolution, real-time, total water level prototype system has been accelerated. The fully coupled model system that includes hydrology is an extension of the ADCIRC Surge Guidance System (ASGS) and will henceforth be referred to as ASGS-STORM (Scalable, Terrestrial, Ocean, River, Meteorological) to emphasize the major processes represented by the system. ASGS-STORM incorporates tides, waves, winds, rivers, and surge to produce a total water level, which provides a holistic representation of coastal flooding. ASGS-STORM was rigorously tested during Hurricane Irene, which made landfall in late August 2011 in North Carolina. All results from ASGS-STORM for the advisories were produced in real time, forced by forecast wind and pressure fields computed using a parametric tropical cyclone model, and made available via the web. Herein, a skill assessment analyzing wind speed and direction, significant wave heights, and total water levels is used to evaluate ASGS-STORM's performance during Irene for three advisories and the best track from the National Hurricane Center (NHC). ASGS-STORM showed slight over-prediction for two advisories (Advisories 23 and 25) due to the over-estimation of the storm intensity. However, ASGS-STORM shows notable skill in capturing total water levels, wind speed and direction, and significant wave heights in North Carolina when utilizing Advisory 28, which had a slight shift in the track but provided a more accurate estimate of the storm intensity, along with the best track from the NHC. Results from ASGS-STORM have shown that as the forecast in the advisories improves, so does the accuracy of the model guidance; accurate input from the weather forecast is therefore a necessary, but not sufficient, condition to ensure the accuracy of the guidance provided by the system.
While Irene provided a real-time test of the viability of a total water level system, the relatively insignificant freshwater discharges preclude definitive conclusions about the role of freshwater discharge in total water levels in estuarine zones. Now that the system has been developed, ongoing work will examine storms (e.g., Floyd) for which freshwater discharge played a more meaningful role.
2014: HyMeX-SOP1: The field campaign dedicated to heavy precipitation and flash flooding in the northwestern Mediterranean. Bulletin of the American Meteorological Society, 95, 1083–1100, doi:10.1175/BAMS-D-12-00244.1.
The Mediterranean region is frequently affected by heavy precipitation events associated with flash floods, landslides, and mudslides that cause hundreds of millions of euros in damages per year and, often, casualties. A major field campaign was devoted to heavy precipitation and flash floods from 5 September to 6 November 2012 within the framework of the 10-yr international Hydrological Cycle in the Mediterranean Experiment (HyMeX) dedicated to the hydrological cycle and related high-impact events. The 2-month field campaign took place over the northwestern Mediterranean Sea and its surrounding coastal regions in France, Italy, and Spain. The observation strategy of the field experiment was devised to improve knowledge of the following key components leading to heavy precipitation and flash flooding in the region: 1) the marine atmospheric flows that transport moist and conditionally unstable air toward the coasts, 2) the Mediterranean Sea acting as a moisture and energy source, 3) the dynamics and microphysics of the convective systems producing heavy precipitation, and 4) the hydrological processes during flash floods. This article provides the rationale for developing this first HyMeX field experiment and an overview of its design and execution. Highlights of some intensive observation periods illustrate the potential of the unique datasets collected for process understanding, model improvement, and data assimilation.
2014: mPING: Crowd-Sourcing Weather Reports for Research. Bulletin of the American Meteorological Society, 95, 1335–1342, doi:10.1175/BAMS-D-13-00014.1.
The Weather Surveillance Radar-1988 Doppler (WSR-88D) network within the United States has recently been upgraded to include dual-polarization capability. Among the expectations that have resulted from the upgrade is the ability to discriminate between different precipitation types in winter precipitation events. To know how well any such algorithm performs and whether new algorithms are an improvement, observations of winter precipitation type are needed. Unfortunately, the automated observing systems cannot discriminate between some of the more important types, so human observers are needed. Deploying dedicated human observers is impractical; fortunately, the knowledge needed to identify the various precipitation types is common among the public. The most efficient way to gather such observations is therefore to engage the public as citizen scientists through a very simple, convenient, nonintrusive method. To achieve this, a simple “app” called mobile Precipitation Identification Near the Ground (mPING) was developed to run on “smart” phones or, more generically, web-enabled devices with GPS location capabilities. Using mPING, anyone with a smartphone can pass observations to researchers at no additional cost to their phone service or to the research project. Deployed in mid-December 2012, mPING has proven to be not only very popular, but also capable of providing consistent, accurate observational data.
2014: Evaluation of a Cloud-Scale Lightning Data Assimilation Technique and a 3DVAR Method for the Analysis and Short-Term Forecast of the 29 June 2012 Derecho Event. Monthly Weather Review, 142, 183–202, doi:10.1175/MWR-D-13-00142.1.
This work evaluates the short-term forecast (≤6 h) of the 29–30 June 2012 derecho event from the Advanced Research core of the Weather Research and Forecasting Model (WRF-ARW) when using two distinct data assimilation techniques at cloud-resolving scales (3-km horizontal grid). The first technique assimilates total lightning data using a smooth nudging function. The second method is a three-dimensional variational technique (3DVAR) that assimilates radar reflectivity and radial velocity data. A suite of sensitivity experiments revealed that the lightning assimilation was better able to capture the placement and intensity of the derecho through the first 6 h of the forecast. All the simulations employing 3DVAR, however, best represented the storm’s radar reflectivity structure at the analysis time. Detailed analysis revealed that a small feature in the velocity field from one of the six selected radars in the original 3DVAR experiment led to the development of spurious convection ahead of the parent mesoscale convective system, which significantly degraded the forecast. Thus, the relatively simple nudging scheme using lightning data complements the more complex variational technique. The much lower computational cost of the lightning scheme may permit its use alongside variational techniques in improving severe weather forecasts on days favorable for the development of outflow-dominated mesoscale convective systems.
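For readers unfamiliar with the nudging idea, a heavily simplified sketch follows. The functional form and every coefficient below are invented for illustration (the published scheme, its coefficients, and its exact form differ); it only shows the general pattern of smoothly moistening toward saturation where lightning is observed but simulated graupel is absent.

```python
import numpy as np

def nudge_qv(qv, qv_sat, flash_rate, qg,
             a=0.81, b=0.2, c=0.01, d=0.25, alpha=2.2):
    """Smoothly nudge water-vapor mixing ratio qv toward a fraction of
    saturation qv_sat wherever lightning is observed (flash_rate > 0)
    but the model lacks graupel qg. All coefficient values here are
    illustrative placeholders, not the published ones."""
    target = qv_sat * (a + b * np.tanh(c * flash_rate)) \
             * (1.0 - np.tanh(d * qg ** alpha))
    # Only moisten: apply the nudge where lightning is present and the
    # target exceeds the current moisture field.
    mask = (flash_rate > 0) & (target > qv)
    return np.where(mask, target, qv)
```

Because the adjustment is a smooth function of flash rate, the increment ramps up gradually with lightning activity instead of switching on abruptly.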
2014: Relationships between southeast Australian temperature anomalies and large-scale climate drivers. Journal of Climate, 27, 1395–1412, doi:10.1175/JCLI-D-13-00229.1.
Over the past century, particularly after the 1960s, observations of mean maximum temperatures reveal an increasing trend over the southeastern quadrant of the Australian continent.
Correlation analysis of seasonally averaged mean maximum temperature anomaly data for the period 1958–2012 is carried out for a representative group of 10 stations in southeast Australia (SEAUS). For the warm season (November–April) there are positive relationships with El Niño–Southern Oscillation (ENSO) and the Pacific Decadal Oscillation (PDO), and an inverse relationship with the Antarctic Oscillation (AAO), for most stations. For the cool season (May–October), most stations exhibit similar relationships with the AAO, positive correlations with the Dipole Mode Index (DMI), and marginal inverse relationships with the Southern Oscillation Index (SOI) and the PDO. However, for both seasons, the Blocking Index (BI, as defined by Pook and Gibson 1999) in the Tasman Sea (160°E) is clearly the dominant climate mode affecting maximum temperature variability in SEAUS, with negative correlations in the range r = −0.30 to −0.65. These strong negative correlations arise from the usual definition of BI, which is positive when blocking high pressure systems occur over the Tasman Sea (near 45°S, 160°E), favoring the advection of modified cooler, higher-latitude maritime air over SEAUS.
A point-by-point correlation with global sea surface temperatures (SSTs), principal component analysis, and wavelet power spectra support the relationships with ENSO and DMI. Notably, the analysis reveals that the maximum temperature variability of one group of stations is explained primarily by local factors (warmer near-coastal SSTs), rather than teleconnections with large-scale drivers.
2014: Relationships between California rainfall variability and large-scale climate drivers. International Journal of Climatology, 34, 3626–3640, doi:10.1002/joc.4112.
Bootstrapped correlation statistics of seasonally averaged monthly rainfall anomalies for the period 1950–2012 were evaluated between a comprehensive set of Atlantic- and Pacific-based large-scale climate drivers and a group of representative stations over the state of California, United States. In line with past seminal works, the analysis showed a dominant influence of the El Niño–Southern Oscillation (Pacific Decadal Oscillation) during the wetter (drier) period, particularly to the south (north) of the state. The largest correlation magnitudes were obtained for the Southern Oscillation Index (SOI) and the sea surface temperature (SST)-based El Niño–Southern Oscillation (ENSO) indices Niños 3 and 3.4. A point-by-point correlation analysis with global SSTs, principal component analysis on the Pacific SST fields and wavelet spectral decomposition all assessed the robustness of the aforementioned relationships. The point-by-point correlation analysis with the SSTs further revealed an overall lack of relationships between rainfall and Atlantic-based climate drivers and suggested that Indian Ocean SSTs are weakly correlated with California rainfall. This analysis also revealed that the relationship of northern California rainfall with the Pacific Decadal Oscillation (PDO) could be attributed to warm SST anomalies confined along the North American coast during the warm phase of the PDO. Notably, the wavelet analysis applied to extended rainfall records (1900–2012) revealed that the peaks in the rainfall power spectrum for southern California during the drier period are coincident with calendar years associated with the nearby passage or landfall of tropical systems in the southern California/North Baja region rather than teleconnections with large-scale climate drivers.
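The bootstrapped correlation statistics referred to above can be illustrated with a minimal sketch: resample the paired anomaly/index series with replacement and take a percentile interval on the resampled correlations. The resample count and 95% percentile interval here are illustrative choices, not those of the paper.

```python
import numpy as np

def bootstrap_corr(x, y, n_boot=2000, seed=0):
    """Pearson correlation between two anomaly series with a bootstrap
    percentile confidence interval (a minimal sketch)."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    r = np.corrcoef(x, y)[0, 1]
    n = x.size
    reps = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)        # resample pairs with replacement
        reps[i] = np.corrcoef(x[idx], y[idx])[0, 1]
    lo, hi = np.percentile(reps, [2.5, 97.5])
    return r, (lo, hi)                     # point estimate and 95% CI
```

A correlation whose bootstrap interval excludes zero can then be flagged as robust, which is the practical point of bootstrapping seasonal-mean series.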
2013: The development of a hybrid EnKF-3DVAR algorithm for storm-scale data assimilation. Advances in Meteorology, 2013, 1–12, doi:10.1155/2013/512656.
A hybrid 3DVAR-EnKF data assimilation algorithm is developed based on 3DVAR and ensemble Kalman filter (EnKF) programs within the Advanced Regional Prediction System (ARPS). The hybrid algorithm uses the extended alpha control variable approach to combine the static and ensemble-derived flow-dependent forecast error covariances. The hybrid variational analysis is performed using an equal weighting of static and flow-dependent error covariance as derived from ensemble forecasts. The method is first applied to the assimilation of simulated radar data for a supercell storm. Results obtained using 3DVAR (with static covariance entirely), hybrid 3DVAR-EnKF, and the EnKF are compared. When data from a single radar are used, the EnKF method provides the best results for the model dynamic variables, while the hybrid method provides the best results for hydrometeor-related variables in terms of rms errors. Although storm structures can be established reasonably well using 3DVAR, the rms errors are generally worse than seen from the other two methods. With two radars, the results from 3DVAR are closer to those from EnKF. Our tests indicate that the hybrid scheme can reduce the storm spin-up time because it fits the observations, especially the reflectivity observations, better than the EnKF and the 3DVAR at the beginning of the assimilation cycles.
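The hybrid idea of blending static and ensemble-derived covariances can be shown in a toy, matrix-explicit form. This sketch uses a direct best-linear-unbiased update rather than the extended alpha control variable formulation of the actual algorithm; the 50/50 weight mirrors the equal weighting described above, and all variable names are illustrative.

```python
import numpy as np

def hybrid_analysis(xb, ens, B_static, H, R, y, w=0.5):
    """One hybrid update step: blend a static background-error covariance
    with a flow-dependent covariance estimated from ensemble perturbations,
    then apply a standard Kalman-type update. Illustrative stand-in for
    the variational solve used in the actual hybrid 3DVAR-EnKF."""
    X = ens - ens.mean(axis=1, keepdims=True)      # perturbation matrix
    P_ens = X @ X.T / (ens.shape[1] - 1)           # sample covariance
    B = (1.0 - w) * B_static + w * P_ens           # hybrid covariance
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain
    return xb + K @ (y - H @ xb)                   # analysis state
```

Even in this toy form, the ensemble term lets an observation of one variable increment correlated unobserved variables, which is exactly what the static covariance alone cannot do for storm-scale flows.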
2014: Some Observing System Simulation Experiments with a Hybrid 3DEnVAR System for Stormscale Radar Data Assimilation. Monthly Weather Review, 142, 3326–3346, doi:10.1175/MWR-D-14-00025.1.
A hybrid three-dimensional ensemble–variational data assimilation (3DEnVAR) algorithm is developed based on the 3D variational data assimilation (3DVAR) and ensemble Kalman filter (EnKF) programs with the Advanced Regional Prediction System (ARPS). The method uses the extended control variable approach to combine the static and ensemble-derived flow-dependent forecast error covariances. The method is applied to the assimilation of simulated data from two radars for a supercell storm. Some sensitivity experiments are performed to answer questions about how flow-dependent covariance estimated from the forecast ensemble can be best used in the hybrid 3DEnVAR scheme. When the ensemble size is relatively small (with 5 or 10 ensemble members), it is found that experiments with a weaker weighting value for the ensemble covariance lead to better analysis results. Even when severe sampling errors exist, introducing ensemble-estimated covariances into the variational method still benefits the analysis. For reasonably large ensemble sizes (50–100 members), a stronger relative weighting (>0.8) for the ensemble covariance leads to better analyses from the hybrid 3DEnVAR. In addition, the sensitivity experiments also indicate that the best results are obtained when the number of the augmented control variables is a function of three spatial dimensions and ensemble members, and is the same for all analysis variables.
2013: Sensitivity of Convective Initiation Prediction to Near-Surface Moisture When Assimilating Radar Refractivity: Impact Tests Using OSSEs. Journal of Atmospheric and Oceanic Technology, 30, 2281–2302, doi:10.1175/JTECH-D-12-00038.1.
The Advanced Regional Prediction System (ARPS) three-dimensional variational (3DVAR) system is enhanced to include the analysis of radar-derived refractivity measurements. These refractivity data are most sensitive to atmospheric moisture content and provide high-resolution information on near-surface moisture that is important to convective initiation (CI) and precipitation forecasting. Observing system simulation experiments (OSSEs) are performed using simulated refractivity data. The impacts of refractivity on CI and subsequent forecasts are investigated in the presence of varying observation error, radar location, data coverage, and different uncertainties in the background field. Cycled refractivity assimilation and forecasts are performed and the results compared to the truth. In addition to the perfect model experiments, imperfect model experiments are performed where the forecasts use the Weather Research and Forecasting (WRF) model instead of the ARPS. A simulation for the 19 May 2010 central plains convection case is used for the OSSEs. It involves a large storm system, large convective available potential energy, and little convective inhibition, allowing for CI along a warm front in northern Oklahoma and ahead of a dryline later to the southwest. Emphasis is placed on the quality of moisture analyses and the subsequent forecasts of CI. Results show the ability of refractivity assimilation to correct low-level moisture errors, leading to improved CI forecasts. Equitable threat scores for reflectivity are generally higher when refractivity data are assimilated. Tests show small sensitivity to increased observational error or ground clutter coverage, and greater sensitivity to the limited data coverage of a single radar.
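The equitable threat score used for the reflectivity verification above has a standard contingency-table definition, which can be stated compactly:

```python
def equitable_threat_score(hits, misses, false_alarms, correct_negatives):
    """Equitable threat score (Gilbert skill score) from a 2x2 forecast
    contingency table. ETS discounts the hits expected by random chance:
    1 is a perfect forecast, 0 indicates no skill over chance."""
    total = hits + misses + false_alarms + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / total
    denom = hits + misses + false_alarms - hits_random
    return (hits - hits_random) / denom if denom else 0.0
```

In verification of gridded reflectivity forecasts, the counts come from thresholding forecast and observed fields (e.g., at a fixed dBZ level) grid point by grid point.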
2013: Impact of a diagnostic pressure equation constraint on tornadic supercell thunderstorms forecasts initialized using 3DVAR radar data assimilation. Advances in Meteorology, 2013, 1–12, doi:10.1155/2013/947874.
A diagnostic pressure equation constraint has been incorporated into a storm-scale three-dimensional variational (3DVAR) data assimilation system. This diagnostic pressure equation constraint (DPEC) aims to improve dynamic consistency among different model variables so as to produce better data assimilation results and improve the subsequent forecasts. Ge et al. (2012) described the development of DPEC and testing of it with idealized experiments. DPEC was also applied to a real supercell case, but only radial velocity was assimilated. In this paper, DPEC is further applied to two real tornadic supercell thunderstorm cases, where both radial velocity and radar reflectivity data are assimilated. The impact of DPEC on radar data assimilation is examined mainly based on the storm forecasts. It is found that the experiments using DPEC generally predict higher low-level vertical vorticity than the experiments not using DPEC near the time of observed tornadoes. Therefore, it is concluded that the use of DPEC improves the forecast of mesocyclone rotation within supercell thunderstorms. The experiments using different weighting coefficients generate similar results. This suggests that DPEC is not very sensitive to the weighting coefficients.
2014: Severe-Thunderstorm Reanalysis Environments and Collocated Radiosonde Observations. Journal of Applied Meteorology and Climatology, 53, 742–751, doi:10.1175/JAMC-D-13-0263.1.
This research compares reanalysis-derived proxy soundings from the North American Regional Reanalysis (NARR) with collocated observed radiosonde data across the central and eastern United States during the period 2000–11: 23 important parameters used for forecasting severe convection are examined. Kinematic variables such as 0–6-km bulk wind shear are best represented by this reanalysis, whereas thermodynamic variables such as convective available potential energy exhibit regional biases and are generally overestimated by the reanalysis. For thermodynamic parameters, parcel-ascent choice is an important consideration because of large differences in reanalysis low-level moisture fields versus observed ones. Results herein provide researchers with potential strengths and limitations of using NARR data for the purposes of depicting climatological information for hazardous convective weather and initializing model simulations. Similar studies should be considered for other reanalysis datasets.
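As an aside on the kinematic parameter highlighted above, 0–6-km bulk wind shear is conventionally the magnitude of the vector wind difference across the lowest 6 km of a sounding. A minimal sketch (linear interpolation in height is an implementation choice, not prescribed by the paper):

```python
import numpy as np

def bulk_shear(u, v, z, depth=6000.0):
    """0-6-km bulk wind shear from a sounding: magnitude of the vector
    wind difference between the lowest level and the level `depth`
    metres above it. u, v in m/s; z in metres, sorted ascending."""
    u_top = np.interp(z[0] + depth, z, u)   # wind components at 6 km AGL,
    v_top = np.interp(z[0] + depth, z, v)   # interpolated linearly in height
    return np.hypot(u_top - u[0], v_top - v[0])
```

Computing the same parameter from both the reanalysis proxy sounding and the collocated radiosonde is the kind of pairwise comparison the study performs for each of its 23 parameters.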
2014: Evaluation of past, present, and future tools for radar-based flash flood prediction. Hydrological Sciences Journal, 59, 1377–1389, doi:10.1080/02626667.2014.919391.
The societal impacts of flash floods are more significant than those of any other weather-related hazard. They often take the form of damage to infrastructure, flooding of roadways and bridges that creates deadly hazards to motorists, and inundation of crops and pasture. Some of these hazards can be anticipated and thus mitigated given effective warning systems. This study describes the tools proposed over recent decades in the USA to predict flash flooding and evaluates them using a common observational data set. Design recommendations for flash-flood forecasting systems are provided, taking into account today's availability of high-resolution rainfall data at scales commensurate with flash flooding, their archives, spatial data sets that describe physiographic properties, and ever-increasing computational resources.
2014: Automated Identification of Enhanced Rainfall Rates Using the Near-Storm Environment for Radar Precipitation Estimates. Journal of Hydrometeorology, 15, 1238–1254, doi:10.1175/JHM-D-13-042.1.
Reliable and timely flash flood warnings are critically dependent on the accuracy of real-time rainfall estimates. Precipitation is not only the most vital input for basin-scale accumulation algorithms such as the Flash Flood Monitoring and Prediction (FFMP) program used operationally by the U.S. National Weather Service, but it is the primary forcing for hydrologic models at all scales. Quantitative precipitation estimates (QPE) from radar are widely used for such a purpose because of their high spatial and temporal resolution. However, converting the native radar variables into an instantaneous rain rate is fraught with challenges and uncertainties. This study addresses the challenge of identifying environments conducive for tropical rain rates, or rain rates that are enhanced by highly productive warm rain growth processes. Model analysis fields of various thermodynamic and moisture parameters were used as predictors in a decision tree–based ensemble to generate probabilities of warm rain–dominated drop growth. Variable importance analysis from the ensemble training showed that the probability accuracy was most dependent on two parameters in particular: freezing-level height and lapse rates of temperature. The probabilities were used to assign a tropical rain rate for hourly QPE and were evaluated against existing Z–R–based QPE products available to forecasters. The probability-based delineations showed improvement in QPE over the existing methods, but the two predictands tested had varying levels of performance for the storm types evaluated and require further study.
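The decision-tree-ensemble idea can be caricatured with hand-rolled stumps voting on the two most important predictors named above (freezing-level height and temperature lapse rate). Every threshold and vote below is invented for illustration and bears no relation to the trained model in the study.

```python
def warm_rain_probability(freezing_level_m, lapse_rate_k_km, stumps=None):
    """Toy probability of warm-rain-dominated (tropical) rain rates from
    near-storm-environment predictors, via an ensemble of decision
    stumps standing in for the trained tree ensemble. Thresholds are
    illustrative placeholders only."""
    if stumps is None:
        # (predictor index, threshold, vote if predictor > threshold)
        stumps = [(0, 4200.0, 1), (0, 4600.0, 1), (0, 5000.0, 1),
                  (1, 6.5, 0), (1, 5.8, 0)]   # steep lapse rates vote "no"
    x = (freezing_level_m, lapse_rate_k_km)
    votes = [vote if x[i] > thr else 1 - vote for i, thr, vote in stumps]
    return sum(votes) / len(votes)   # fraction of "trees" voting warm rain
```

In the real product, a probability like this would then gate which Z–R relationship (tropical or standard) is applied when converting reflectivity to a rain rate.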
2014: An Electrical and Polarimetric Analysis of the Overland Reintensification of Tropical Storm Erin (2007). Monthly Weather Review, 142, 2321–2344, doi:10.1175/MWR-D-13-00360.1.
While passing over central Oklahoma on 18–19 August 2007, the remnants of Tropical Storm Erin unexpectedly reintensified and developed an eyelike feature that was clearly discernable in Weather Surveillance Radar-1988 Doppler (WSR-88D) imagery. During this brief reintensification period, Erin traversed a region of dense surface and remote sensing observation networks that provided abundant data of high spatial and temporal resolution. This study analyzes data from the polarimetric KOUN S-band radar, total lightning data from the Oklahoma Lightning Mapping Array, and ground-flash lightning data from the National Lightning Detection Network.
Erin’s reintensification was atypical since it occurred well inland and was accompanied by stronger maximum sustained winds and gusts (25 and 37 m/s, respectively) and lower minimum sea level pressure (1001.3 hPa) than while over water. Radar observations reveal several similarities to those documented in mature tropical cyclones over open water, including outward-sloping eyewall convection, near 0-dBZ reflectivities within the eye, and relatively large updraft velocities in the eyewall as inferred from single-Doppler winds and ZDR columns.
Deep, electrified convection near the center of circulation preceded the formation of Erin’s eye, with maximum lightning activity occurring prior to and during reintensification. The results show that inner-core convection may have played a role in the reinvigoration of the storm.
2014: Kinematic and Precipitation Characteristics of Convective Systems Observed by Airborne Doppler Radar during the Life Cycle of a Madden–Julian Oscillation in the Indian Ocean. Monthly Weather Review, 142, 1385–1402, doi:10.1175/MWR-D-13-00252.1.
This study presents characteristics of convective systems observed during the Dynamics of the Madden–Julian oscillation (DYNAMO) experiment by the instrumented NOAA WP-3D aircraft. Nine separate missions, with a focus on observing mesoscale convective systems (MCSs), were executed to obtain data in the active and inactive phase of a Madden–Julian oscillation (MJO) in the Indian Ocean. Doppler radar and in situ thermodynamic data are used to contrast the convective system characteristics during the evolution of the MJO. Isolated convection was prominent during the inactive phases of the MJO, with deepening convection during the onset of the MJO. During the MJO peak, convection and stratiform precipitation became more widespread. A larger population of deep convective elements led to a larger area of stratiform precipitation. As the MJO decayed, convective system top heights increased, though the number of convective systems decreased, eventually transitioning back to isolated convection. A distinct shift of echo top heights and contoured frequency-by-altitude diagram distributions of radar reflectivity and vertical wind speed indicated that some mesoscale characteristics were coupled to the MJO phase. Convective characteristics in the climatological initiation region (Indian Ocean) were also apparent. Comparison to results from the Tropical Ocean and Global Atmosphere Coupled Ocean–Atmosphere Response Experiment (TOGA COARE) in the western Pacific indicated that DYNAMO MCSs were organized more nearly parallel to the low-level shear and lacked the strong cold pools of TOGA COARE MCSs. Three-dimensional MCS airflow also showed a different dynamical structure, lacking the descending rear inflow present in the shear-perpendicular TOGA COARE MCSs. Weaker, but deeper, updrafts were observed in DYNAMO.
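A contoured frequency-by-altitude diagram (CFAD), used above to compare MJO phases, is essentially a per-level normalized histogram of a radar quantity. A minimal sketch, with the normalization choice (each level sums to 1) as one common convention:

```python
import numpy as np

def cfad(field, value_bins):
    """Contoured frequency-by-altitude diagram: at each altitude level,
    the frequency distribution of a quantity (e.g. reflectivity),
    normalized so each level sums to 1.

    field : 2-D array (n_levels, n_samples) of observations per level
    value_bins : histogram bin edges for the quantity
    """
    counts = np.stack([np.histogram(row[np.isfinite(row)], bins=value_bins)[0]
                       for row in field]).astype(float)
    sums = counts.sum(axis=1, keepdims=True)
    # Guard against levels with no valid samples.
    return np.divide(counts, sums, out=np.zeros_like(counts), where=sums > 0)
```

Contouring the resulting (level × bin) array against altitude produces the familiar CFAD plot; shifts of the contours between MJO phases are the signal discussed in the abstract.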
2013: Spatial and Temporal Characteristics of Heavy Hourly Rainfall in the United States. Monthly Weather Review, 141, 4564–4575, doi:10.1175/MWR-D-12-00297.1.
The climatology of heavy rain events from hourly precipitation observations by Brooks and Stensrud is revisited in this study using two high-resolution precipitation datasets that incorporate both gauge observations and radar estimates. Analyses show a seasonal cycle of heavy rain events originating along the Gulf Coast and expanding across the eastern two-thirds of the United States by the summer, comparing well to previous findings. The frequency of extreme events is estimated, and may provide improvements over prior results due to both the increased spatial resolution of these data and improved techniques used in the estimation. The diurnal cycle of heavy rainfall is also examined, showing distinct differences in the strength of the cycle between seasons.
2013: Radial-Based Noise Power Estimation for Weather Radars. Journal of Atmospheric and Oceanic Technology, 30, 2737–2753, doi:10.1175/JTECH-D-13-00008.1.
A radar antenna intercepts thermal radiation from various sources including the ground, the sun, the sky, precipitation, and man-made radiators. In the radar receiver, this external radiation produces noise that constructively adds to the receiver internal noise and results in the overall system noise. Consequently, the system noise power is dependent on the antenna position and needs to be estimated accurately. Inaccurate noise power measurements may lead to reduction of coverage if the noise power is overestimated or to radar data images cluttered by noise speckles if the noise power is underestimated. Moreover, when an erroneous noise power is used at low-to-moderate signal-to-noise ratios, estimators can produce biased meteorological variables. Therefore, to obtain the best quality of radar products, it is desirable to compute meteorological variables using the noise power measured at each antenna position. In this paper, an effective method is proposed to estimate the noise power in real time from measured powers at each radial. The technique uses a set of criteria to detect radar range resolution volumes that do not contain weather signals and uses those to estimate the noise power. The algorithm is evaluated using both simulated and real time-series data; results show that the proposed technique accurately produces estimates of the system noise power. An operational implementation of this technique is expected to significantly improve the quality of weather radar products with a relatively small computational burden.
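The core idea, locating signal-free range volumes along a radial and averaging their powers, can be sketched very roughly. The operational technique applies a more elaborate set of detection criteria; the flatness test, segment length, and tolerance below are illustrative stand-ins.

```python
import numpy as np

def radial_noise_power(powers, flat_len=16, rel_span=0.1):
    """Estimate system noise power from one radial of range-gate powers
    by finding runs of gates whose power varies little (assumed free of
    weather signal) and averaging them. Simplified sketch only."""
    powers = np.asarray(powers, float)
    candidates = []
    for i in range(0, powers.size - flat_len + 1, flat_len):
        seg = powers[i:i + flat_len]
        # A segment is deemed "signal-free" if its spread is small
        # relative to its mean power.
        if (seg.max() - seg.min()) < rel_span * seg.mean():
            candidates.append(seg.mean())
    # Take the weakest flat segment; fall back to the minimum gate power.
    return min(candidates) if candidates else powers.min()
```

Running such an estimate on every radial is what lets the noise power track the antenna position, rather than relying on a single calibration value for the whole scan.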
2014: On the Use of a Radial-Based Noise Power Estimation Technique to Improve Estimates of the Correlation Coefficient on Dual-Polarization Weather Radars. Journal of Atmospheric and Oceanic Technology, 31, 1867–1880, doi:10.1175/JTECH-D-14-00052.1.
A weather surveillance radar antenna intercepts thermal radiation from various sources, including the ground, the sun, the sky, and precipitation. In the radar receiver, this external radiation produces noise that adds to the receiver internal noise and results in the system noise power varying with the antenna position. If these variations are not captured, they translate into erroneous signal powers because these are computed via subtraction of noise power measurements from the overall power estimates. This may lead to biased meteorological variables at low to moderate signal-to-noise ratios if those are computed using signal power estimates. In dual-polarization radars, this problem is even more pronounced, particularly for correlation coefficient estimates that use noise power measurements from both the horizontal and vertical channels. An alternative is to use estimators that eliminate the need for noise corrections but require sufficient correlation of signals in sample time, which limits their applicability. Therefore, when the use of the latter is inappropriate, the quality of correlation coefficient estimates can be improved by computing them using sufficiently accurate noise powers measured at each antenna position. An effective technique that estimates the noise powers in real time at each scan direction and in parallel with weather data collection has been proposed. Herein, the impacts of such a technique on the estimation of the correlation coefficient are investigated. The results indicate that the use of more accurate noise power estimates can significantly reduce the bias of correlation coefficient estimates, thus visibly improving the correlation coefficient fields. This is expected because the correlation coefficient is computed using noise power measurements from both the horizontal and vertical channels.
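The sensitivity described above comes from the conventional correlation coefficient estimator, in which measured noise powers are subtracted from both channel lag-0 powers before normalization; errors in either noise term bias the result. A minimal sketch of that estimator:

```python
import numpy as np

def rhohv(vh, vv, noise_h, noise_v):
    """Copolar correlation coefficient from complex voltage time series
    of the horizontal (vh) and vertical (vv) channels, with measured
    noise powers subtracted from the channel powers. This is the
    conventional noise-corrected form whose dependence on accurate
    noise powers motivates radial-based noise estimation."""
    p_h = np.mean(np.abs(vh) ** 2) - noise_h   # signal power, H channel
    p_v = np.mean(np.abs(vv) ** 2) - noise_v   # signal power, V channel
    r_hv = np.mean(vh * np.conj(vv))           # lag-0 cross-correlation
    return np.abs(r_hv) / np.sqrt(p_h * p_v)
```

If noise_h or noise_v is overestimated, the denominator shrinks and the estimate is biased high (it can even exceed 1), which is why per-radial noise powers visibly clean up correlation coefficient fields.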
2014: Assessment of Censoring Using Coherency-Based Detectors on Dual-Polarized Weather Radar. Journal of Atmospheric and Oceanic Technology, 31, 1694–1703, doi:10.1175/JTECH-D-13-00074.1.
In Doppler weather radars, signals may exhibit coherency in sample time, whereas noise does not. Additionally, in dual-polarized radars, samples of precipitation echo obtained in the two orthogonally polarized channels are substantially more correlated than samples of noise. Therefore, estimates of auto- and cross correlations can be used individually, collectively, and/or with power measurements to enhance detection of precipitation signals, compared to the approach that uses only power estimates from one channel. A possible advantage of using only estimates of coherency for signal detection is that the detector’s performance is less sensitive to errors in noise power measurements. Hence, censoring is more likely to produce desired false alarm rates even if nonnegligible uncertainties are present in the noise power estimates. In this work these aspects are considered using real data from weather radars. Three novel censoring approaches are evaluated and compared to the censoring approach that uses only estimates of signal and noise powers. The first approach uses only cross-correlation measurements, and the second approach combines these with the lag-1 autocorrelation estimates. The third approach utilizes all estimates as in the previous two approaches in combination with power measurements from the horizontal and the vertical channels. Herein, it is shown that, when more accurate measurements of noise powers are available, the third approach produces the highest detection rates followed by the second and the first approaches. Also, it is corroborated that the first and the second approaches exhibit less sensitivity to inaccurate system noise power measurements than the third one.
2014: Multiscale characteristics and evolution of perturbations for warm season convection allowing precipitation forecasts: Dependence on background flow and method of perturbation. Monthly Weather Review, 142, 1053–1073, doi:10.1175/MWR-D-13-00204.1.
Multiscale convection-allowing precipitation forecast perturbations are examined for two forecasts and systematically over 34 forecasts out to 30-h lead time using Haar Wavelet decomposition. Two small-scale initial condition (IC) perturbation methods are compared to the larger-scale IC and physics perturbations in an experimental convection-allowing ensemble. For a precipitation forecast driven primarily by a synoptic-scale baroclinic disturbance, small-scale IC perturbations resulted in little precipitation forecast perturbation energy on medium and large scales, compared to larger-scale IC and physics (LGPH) perturbations after the first few forecast hours. However, for a case where forecast convection at the initial time grew upscale into a mesoscale convective system (MCS), small-scale IC and LGPH perturbations resulted in similar forecast perturbation energy on all scales after about 12 h. Small-scale IC perturbations added to LGPH increased total forecast perturbation energy for this case. Averaged over 34 forecasts, the small-scale IC perturbations had little impact on large forecast scales while LGPH accounted for about half of the error energy on such scales. The impact of small-scale IC perturbations was also less than, but comparable to, the impact of LGPH perturbations on medium scales. On small scales, the impact of small-scale IC perturbations was at least as large as the LGPH perturbations. The spatial structure of small-scale IC perturbations affected the evolution of forecast perturbations, especially at medium scales. There was little systematic impact of the small-scale IC perturbations when added to LGPH. These results motivate further studies on properly sampling multiscale IC errors.
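The Haar wavelet decomposition used above to separate perturbation energy by scale can be illustrated in one dimension: an orthonormal transform partitions total energy exactly across scales. This is a 1-D sketch of the idea, not the 2-D decomposition the study actually applies.

```python
import numpy as np

def haar_scale_energy(f):
    """Partition the energy of a 1-D field of dyadic length across
    spatial scales with the orthonormal Haar transform. Returns detail
    energies from the finest scale to the coarsest, with the mean
    (scaling) component's energy last; the entries sum to sum(f**2)."""
    f = np.asarray(f, float)
    energies = []
    while f.size > 1:
        pairs = f.reshape(-1, 2)
        detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)  # Haar detail
        f = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)       # Haar smooth
        energies.append(float(np.sum(detail ** 2)))
    energies.append(float(f[0] ** 2))                        # mean energy
    return energies
```

Applied to a difference field between two precipitation forecasts, the per-scale energies quantify how much of the forecast perturbation lives on small, medium, and large scales, which is the diagnostic compared across perturbation methods above.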
2014: Forecast evaluation of an Observing System Simulation Experiment assimilating both radar and satellite data. Monthly Weather Review, 142, 107–124.
In Part 1 of this study, Jones et al. (2013a) compared the relative skill of assimilating simulated radar reflectivity and radial velocity observations and satellite 6.95-μm brightness temperatures (TB) and found that both improved analyses of water vapor and cloud hydrometeor variables for a cool season, high-impact weather event across the central U.S. In this study, we examine the impact of the observations on 1–3-h forecasts and provide additional analysis of the relationship between simulated satellite and radar observations and various water vapor and cloud hydrometeor variables. Correlation statistics showed that the radar and satellite observations are sensitive to different variables. Assimilating 6.95-μm TB primarily improved the atmospheric water vapor and frozen cloud hydrometeor variables such as ice and snow. Radar reflectivity proved more effective in both the lower and mid-troposphere, with the best results observed for rain water, graupel, and snow. The impacts of assimilating both data sets decrease rapidly as a function of forecast time. By 1 h, the effects of satellite data on forecast cloud hydrometeor values become small, though the data remain useful for atmospheric water vapor. The impacts of radar data last somewhat longer, sometimes up to 3 h, but also display a large decrease in effectiveness by 1 h. Generally, assimilating both satellite and radar data simultaneously generates the best analysis and forecast for most cloud hydrometeor variables.
2014: Comparison of disdrometer and X-band mobile radar observations in convective storms. Monthly Weather Review, 141, 2414–2435.
Microphysical data from thunderstorms are sparse, yet they are essential to validate microphysical schemes in numerical models. Mobile, dual-polarization X-band radars are capable of providing a wealth of data that include radar reflectivity, drop shape, and hydrometeor type. However, X-band radars suffer from beam attenuation in heavy rainfall and hail, which can be partially corrected with attenuation correction schemes. In this research, we compare surface disdrometer observations to results from a differential phase-based attenuation correction scheme. This scheme is applied to data recorded by the National Oceanic and Atmospheric Administration (NOAA) X-band dual-Polarized (NOXP) mobile radar, which was deployed during the second Verification of the Origins of Rotation in Tornadoes EXperiment (VORTEX2). Results are presented from five supercell thunderstorms and one squall line (183 minutes of data). The median disagreement (radar-disdrometer) in attenuation-corrected reflectivity (Z) and differential reflectivity (ZDR) is just 1.0 dB and 0.19 dB, respectively. However, two data subsets reveal much larger discrepancies in Z (ZDR): 5.8 dB (1.6 dB) in a hailstorm and -13 dB (-0.61 dB) when the radar signal quality index (SQI) is less than 0.8. The discrepancies are much smaller when disdrometer and S-band WSR-88D Z are compared, with differences of -1.5 dB (hailstorm) and -0.66 dB (NOXP SQI < 0.8). A comparison of the hydrometeor type retrieved from disdrometer and NOXP radar data is also presented, in which the same class is assigned 63% of the time.
2013: Subtropical–polar jet interactions in Southern Plains dust storms. Journal of Geophysical Research, 118, 12893–12914, doi:10.1002/2013JD020345.
The origin of two separate Southern High Plains dust storms, which occurred over a 2 day period in February 2007, is traced to an interaction between the subtropical jet (STJ) and the polar jet (PJ). A large-scale thermal wind imbalance resulting from the confluence of these two jets led to a series of mesoscale circulations that ultimately produced the dust storms. Understanding the connectivity between the dust storms with differing geometries is central to the present investigation. The study rests on the interpretation of analyses from upper air and surface observations complemented by imagery from satellites, the 32 km gridded data set from the North American Regional Reanalysis, and a fine-resolution (6 km grid) simulation from the Weather Research and Forecasting model. Principal assertions from the present study are (1) scale interaction is fundamental to the creation of an environment conducive to dust storm development, (2) low to middle tropospheric mass adjustment is the primary response to a large-scale imbalance, (3) the mesoscale mass adjustment is associated with circulations about a highly accelerative jet streak resulting from the merger of the PJ and STJ, (4) the structure of the jet streak resulting from this merger governs the evolution of the geometry of the dust plumes, with plumes that initially had a straight-line orientation developing a semicircular geometry, and (5) improvements in dust storm prediction will depend on an augmentation of the upper air network in concert with a flow-dependent data assimilation strategy.
2014: Multi-sensor imaging and space-ground cross-validation for 2010 flood along Indus River, Pakistan. Remote Sensing, 6, 2392–2407, doi:10.3390/rs6032393.
Flood monitoring was conducted using multi-sensor data from space-borne optical and microwave sensors, with cross-validation by ground-based rain gauges and streamflow stations along the Indus River, Pakistan. First, the optical imagery from the Moderate Resolution Imaging Spectroradiometer (MODIS) was processed to delineate the extent of the 2010 flood along the Indus River. Moreover, the all-weather, all-time capability of higher-resolution imagery from the Advanced Synthetic Aperture Radar (ASAR) was used to monitor flooding in the lower Indus river basin. Then, a proxy for river discharge from the Advanced Microwave Scanning Radiometer (AMSR-E) aboard NASA’s Aqua satellite and rainfall estimates from the Tropical Rainfall Measuring Mission (TRMM) were used to study streamflow time series and precipitation patterns. The AMSR-E-detected water surface signal was cross-validated with ground-based river discharge observations at multiple streamflow stations along the main Indus River. A high correlation was found, as indicated by a Pearson correlation coefficient above 0.8 for the discharge gauge stations located in the southwest of the Indus River basin. It is concluded that remote-sensing data integrated from multispectral and microwave sensors could be used to supplement stream gauges in sparsely gauged large basins to monitor and detect floods.
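The cross-validation step above reduces to computing a Pearson correlation coefficient between the satellite water-surface proxy and gauged discharge. A minimal sketch follows; the sample series are invented for demonstration, not data from the study.

```python
# Pearson correlation between a satellite discharge proxy and gauge discharge.
# All sample values below are invented for illustration.
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

amsre_signal = [3.1, 3.4, 4.0, 5.2, 6.8, 6.1, 4.9]     # proxy (arbitrary units)
gauge_discharge = [210, 230, 300, 420, 580, 520, 380]  # m^3/s (invented)
r = pearson_r(amsre_signal, gauge_discharge)
print(round(r, 3))
```

A value of r above 0.8, as reported for the southwestern gauge stations, indicates that the satellite proxy tracks the gauged discharge closely.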
2014: Evaluation of three high-resolution satellite precipitation estimates: Potential for monsoon monitoring over Pakistan. Advances in Space Research, 54, 670–684, doi:10.1016/j.asr.2014.04.017.
Multi-sensor precipitation datasets, including two products from the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) and estimates from the Climate Prediction Center Morphing Technique (CMORPH), were quantitatively evaluated to study the monsoon variability over Pakistan. Several statistical and graphical techniques are applied to illustrate the nonconformity of the three satellite products with the gauge observations. During the monsoon season (JAS), the three satellite precipitation products capture intense precipitation well, all showing high correlation for high rain rates (>30 mm/day). The spatial and temporal satellite rainfall error variability shows a significant geo-topography-dependent distribution, as all three products overestimate over the mountain ranges in the north and the coastal region in the south of the Indus basin. The TMPA-RT product tends to overestimate light rain rates (approximately 100%), and its bias is low for high rain rates (about ±20%). In general, daily comparisons from 2005 to 2010 show the best agreement between the TMPA-V7 research product and gauge observations, with correlation coefficient values ranging from moderate (0.4) to high (0.8) over the spatial domain of Pakistan. The seasonal variation of rainfall frequency has large biases (100–140%) over high latitudes (36°N) with complex terrain for daily, monsoon, and pre-monsoon comparisons. Relatively low uncertainties and errors (bias ±25% and MAE 1–10 mm) were associated with the TMPA-RT product over the monsoon-dominated region (32°–35°N), demonstrating the potential for an operational hydrological application of the satellite-based near-real-time products in Pakistan for flood monitoring.
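Two of the evaluation metrics quoted above, relative bias (%) and mean absolute error (mm), are simple to state precisely. The sketch below assumes paired daily totals; the sample values are invented.

```python
# Relative bias (%) and mean absolute error (mm) for a satellite rainfall
# product against gauge observations. Daily totals below are invented.

def relative_bias_pct(sat, gauge):
    """Percentage over- or underestimate of total rainfall."""
    return 100.0 * (sum(sat) - sum(gauge)) / sum(gauge)

def mean_abs_error(sat, gauge):
    """Mean absolute difference of paired daily totals."""
    return sum(abs(s - g) for s, g in zip(sat, gauge)) / len(sat)

gauge = [0.0, 5.0, 12.0, 30.0, 2.0]   # mm/day (invented)
sat = [1.0, 6.5, 10.0, 34.0, 2.5]     # mm/day (invented)
print(relative_bias_pct(sat, gauge), mean_abs_error(sat, gauge))
```

With these definitions, "bias ±25% and MAE 1–10 mm" bounds both the aggregate over/underestimate and the typical day-to-day error of the product.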
2014: Absorption Properties of Supercooled Liquid Water between 31 and 225 GHz: Evaluation of Absorption Models Using Ground-Based Observations. Journal of Applied Meteorology and Climatology, 53, 1028–1045, doi:10.1175/JAMC-D-13-0214.1.
Microwave radiometers (MWR) are commonly used to quantify the amount of supercooled liquid water (SLW) in clouds; however, the accuracy of SLW retrievals is limited by poor knowledge of the SLW dielectric properties at microwave frequencies. Six liquid water permittivity models were compared with ground-based MWR observations between 31 and 225 GHz from sites in Greenland, the German Alps, and a low-mountain site; average cloud temperatures of the observed thin cloud layers range from 0°C to −33°C. A recently published method to derive ratios of liquid water opacity at different frequencies was employed in this analysis. These ratios are independent of liquid water path and equal to the ratio of the liquid absorption coefficient α_liquid at those frequencies, and so can be directly compared with the permittivity model predictions. The observed opacity ratios from all sites show highly consistent results that are generally within the range of model predictions; however, none of the models approximate the observations over the entire frequency and temperature range. Findings from earlier published studies were used to select one specific model as a reference for α_liquid at 90 GHz; together with the observed opacity ratios, the temperature dependence of α_liquid at 31.4, 52.28, 150, and 225 GHz was derived. The results reveal that two models fit the opacity-ratio data better than the other four, with one of the two fitting better below 90 GHz and the other at higher frequencies. These findings are relevant for SLW retrievals and radiative transfer in the 31–225 GHz frequency region.
2014: Marine Fog: A Review. Atmospheric Research, 143, 142–175, doi:10.1016/j.atmosres.2013.12.012.
The objective of this review is to discuss physical processes over a wide range of spatial scales that govern the formation, evolution, and dissipation of marine fog. We consider marine fog as the collective combination of fog over the open sea along with coastal sea fog and coastal land fog. The review includes a history of sea fog research, field programs, forecasting methods, and detection of sea fog via satellite observations where similarity in radiative properties of fog top and the underlying sea induce further complexity. The main thrust of the study is to provide insight into causality of fog including its initiation, maintenance, and destruction. The interplay between the various physical processes behind the several stages of marine fog is among the most challenging aspects of the problem. An effort is made to identify this interplay between processes that include the microphysics of fog formation and maintenance, the influence of large-scale circulations and precipitation/clouds, radiation, turbulence (air–sea interaction), and advection. The environmental impact of marine fog is also addressed. The study concludes with an assessment of our current knowledge of the phenomenon, our principal areas of ignorance, and future lines of research that hold promise for advances in our understanding.
2014: Lessons from the 2011 Joplin tornado. Natural Hazards Observer, XXXIX, 10–15.
Days after the tornado, the National Institute of Standards and Technology, in cooperation with the National Oceanic and Atmospheric Administration, dispatched a team of researchers with expertise in structural, fire, and wind engineering, disaster sociology (human behavior and emergency communication and response), meteorology, and severe storm predictions and warnings to conduct a technical investigation of the event under the National Construction Safety Team Act. One of the objectives of the NIST investigation was to understand how the public responded to the National Weather Service’s and the city of Joplin’s emergency warnings and communications during the event. This study provided an approximation of the environmental conditions that existed during the storm (via wind speed estimates), an evaluation of the performance of buildings in the affected area, and an understanding of the consequences of the tornado for the people who were in its path. As with any NCST investigation, the ultimate goal was to develop findings and recommendations for improvements to codes, standards, and practices for buildings and emergency communication procedures that will lead to improved safety in tornadoes.
2013: A dual-polarization radar signature of hydrometeor refreezing in winter storms. Journal of Applied Meteorology and Climatology, 52, 2549–2566, doi:10.1175/JAMC-D-12-0311.1.
Polarimetric radar measurements in winter storms that produce ice pellets have revealed a unique signature that is indicative of ongoing hydrometeor refreezing. This refreezing signature is observed within the low-level subfreezing air as an enhancement of differential reflectivity ZDR and specific differential phase KDP and a decrease of radar reflectivity factor at horizontal polarization ZH and copolar correlation coefficient ρhv. It is distinct from the overlying melting-layer “brightband” signature and suggests that unique microphysical processes are occurring within the layer of hydrometeor refreezing. The signature is analyzed for four ice-pellet cases in central Oklahoma as observed by two polarimetric radars. A statistical analysis is performed on the characteristics of the refreezing signature for a case of particularly long duration. Several hypotheses are presented to explain the appearance of the signature, along with a summary of the pros and cons for each. It is suggested that preferential freezing of small drops and local ice generation are plausible mechanisms for the appearance of the ZDR and KDP enhancements. Polarimetric measurements and scattering calculations are used to retrieve microphysical information to explore the validity of the hypotheses. The persistence and repetitiveness of the signature suggest its potential use in operational settings to diagnose the transition between freezing rain and ice pellets.
2014: The anatomy and physics of Zdr columns: Investigating a polarimetric radar signature with a spectral bin microphysical model. Journal of Applied Meteorology and Climatology, 53, 1820–1843.
Polarimetric radar observations of deep convective storms frequently reveal columnar enhancements of differential reflectivity Zdr. Such "Zdr columns" can extend upward more than 3 km above the environmental 0 deg level, indicative of supercooled liquid drops being lofted by the updraft. Previous observational and modeling studies of Zdr columns are reviewed. To address remaining questions, the Hebrew University Cloud Model, an advanced spectral bin microphysical model, is coupled with a polarimetric radar operator to simulate the formation and life cycles of Zdr columns in a deep convective continental storm. In doing so, the mechanisms by which Zdr columns are produced are clarified, including the formation of large drops in the updraft by recirculation of smaller raindrops formed aloft back into the updraft at low levels. The internal hydrometeor structure of Zdr columns is quantified, revealing the transition from supercooled liquid drops to freezing drops to hail with height in the Zdr column. The life cycle of Zdr columns from early formation, through growth to maturity, to demise is described, showing how hail falling out through the weakening or ascending updraft bubble dominates the reflectivity factor Zh, causing the death of the Zdr column and leaving behind its "ghost" of supercooled drops. In addition, the practical applications of Zdr columns and their evolution are explored. The height of the Zdr column is correlated with updraft strength, and the evolution of the Zdr column height is correlated with increases in Zh and hail mass content at the ground after a lag of 10–15 min.
2014: Edward Epstein's stochastic–dynamic approach to ensemble weather prediction. Bulletin of the American Meteorological Society, 95, 99–116, doi:10.1175/BAMS-D-13-00036.2.
In the late 1960s, well before the availability of computer power to produce ensemble weather forecasts, Edward Epstein (1931–2008) developed a stochastic–dynamic prediction (SDP) method for calculating the temporal evolution of the mean value, variance, and covariance of the model variables—the statistical moments of a time-varying probability density function that define an ensemble forecast. This statistical–dynamical approach to ensemble forecasting is an alternative to the Monte Carlo formulation that is currently used in operations. The stages of Epstein’s career that led to his development of this methodology are presented with the benefit of his oral history and supporting documentation that describes the retreat of strict deterministic weather forecasting. The important follow-on research by two of Epstein’s protégés, Rex Fleming and Eric Pitcher, is also presented. A low-order nonlinear dynamical system is used to explore the rudiments of SDP and to compare and contrast deterministic forecasting with statistical–dynamical forecasts from SDP and Monte Carlo. An exact probability density function for the problem has been found through solution of Liouville’s equation. The paper ends with a discussion of SDP’s strengths and weaknesses and its possible future as an operational and research tool in probabilistic-dynamic weather prediction.
2014: Deterministic/Probabilistic prediction of nonlinear advection: Part II Online Supplement. Bulletin of the American Meteorological Society, 95, ES1–ES4, doi:10.1175/BAMS-D-13-00036.2.
The exact solution to the probabilistic-deterministic constraint of nonlinear advection is found through solution of Liouville’s equation and compared with the approximate solutions via the Monte Carlo and Epstein SDP methods. This contribution is an online supplement to the paper “Edward Epstein’s stochastic–dynamic approach to ensemble weather prediction”.
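The Monte Carlo side of the comparison above can be illustrated with a toy calculation: integrate an ensemble through a simple nonlinear model and track the evolving ensemble mean and variance (the statistical moments that SDP predicts directly). The model dx/dt = -x² is chosen purely for illustration and is not the advection problem of the paper.

```python
# Toy Monte Carlo moment evolution: an ensemble is Euler-integrated through
# a simple nonlinear model (dx/dt = -x^2, illustrative only) and its mean
# and variance are computed at the final time.
import random, statistics

def integrate_ensemble(members, dt=0.01, steps=100):
    """Euler-integrate each member; return (mean, variance) at the end."""
    x = list(members)
    for _ in range(steps):
        x = [xi - dt * xi * xi for xi in x]
    return statistics.mean(x), statistics.variance(x)

random.seed(1)
ens0 = [1.0 + random.gauss(0.0, 0.1) for _ in range(500)]
m, v = integrate_ensemble(ens0)
print(m, v)
```

For this model the nonlinearity contracts the ensemble spread as it evolves, which is exactly the kind of moment behavior that SDP predicts without sampling and that Liouville's equation describes exactly.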
2014: Ground Clutter Detection using the Statistical Properties of Signals Received with a Polarimetric Radar. IEEE Transactions on Signal Processing, 62, 597–606.
Polarimetric weather radars provide additional measurements that allow better characterization of the targeted medium. Because ground clutter has different polarimetric characteristics from weather echoes, dual-polarization measurements can be used to distinguish one from the other. Ground clutter and weather signals also have different statistical properties which can be utilized to distinguish one from the other. A test statistic, obtained from the generalized likelihood ratio test (GLRT), and a simple Bayesian classifier (SBC), with inputs from the mean and covariance of the received signals, are developed to detect ground clutter in the presence of weather signals. It is found that the test statistic produces false detections caused by narrow-band zero-velocity weather signals while the SBC can effectively neutralize them. This work is aimed at detecting ground clutter based solely on data from each resolution volume. The performances of the test statistic and SBC are shown by applying them to radar data collected with the University of Oklahoma-Polarimetric Radar for Innovation in Meteorology and Engineering.
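The classifier idea can be sketched schematically: model each class (clutter vs. weather) with a Gaussian likelihood on a feature and pick the class with the larger posterior. This is only a toy analog; the paper's SBC operates on the mean and covariance of the received dual-polarization signals, and the feature and per-class statistics below are invented.

```python
# Schematic Bayesian classifier: Gaussian class likelihoods on a single
# scalar feature (here, a notional Doppler velocity in m/s). All class
# parameters are invented for illustration.
import math

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def classify(feature, priors, params):
    """Return the class with the maximum posterior; params maps class -> (mu, sigma)."""
    post = {c: priors[c] * gaussian_pdf(feature, *params[c]) for c in priors}
    return max(post, key=post.get)

priors = {"clutter": 0.5, "weather": 0.5}
params = {"clutter": (0.0, 0.5),   # clutter clusters near zero velocity
          "weather": (5.0, 3.0)}   # weather spans a broad velocity range
print(classify(0.2, priors, params))   # near-zero velocity echo
print(classify(7.0, priors, params))   # clearly moving echo
```

The hard case the abstract highlights, narrow-band zero-velocity weather, is exactly where a single-feature rule fails, which is why the real SBC draws on richer signal statistics.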
2014: Characterizing Spatiotemporal Variations of Hourly Rainfall by Gauge and Radar in the Mountainous Three Gorges Region. Journal of Applied Meteorology and Climatology, 53, 873–889, doi:10.1175/JAMC-D-13-0277.1.
Understanding spatiotemporal rainfall patterns in mountainous areas is of great importance for prevention of natural disasters such as flash floods and landslides. There is little knowledge about rainfall variability over historically underobserved complex terrains, however, and especially about the variations of hourly rainfall. In this study, the spatiotemporal variations of hourly rainfall in the Three Gorges region (TGR) of China are investigated with gauge and newly available radar data. The spatial pattern of hourly rainfall has been examined by a number of statistics, and they all show that the rainfall variations are time-scale and location dependent. In general, the northern TGR receives more-intense and longer-duration rainfall than do other parts of the TGR, and short-duration storms could occur in most of the TGR. For temporal variations, the summer diurnal cycle shifts from a morning peak in the west to a late-afternoon peak in the east while a mixed pattern of two peaks exists in the middle. In statistical terms, empirical model–based estimation indicates that the correlation scale of hourly rainfall is about 40 km. Further investigation shows that the correlation distance varies with season, from 30 km in the warm season to 60 km in the cold season. In addition, summer rainstorms extracted from radar rainfall data are characterized by short duration (6–8 h) and highly localized patterns (5–17 and 13–36 km in the minor and major directions, respectively). Overall, this research provides quantitative information about the rainfall regime in the TGR and shows that the combination of gauge and radar data is useful for characterizing the spatiotemporal pattern of storm rainfall over complex terrain.
2014: Radar-based quantitative precipitation estimate in the Three Gorges region of Yangtze River. Journal of Hydroelectric Engineering, 33, 29–35.
Weather radar can detect the spatiotemporal variability of precipitation at high resolution, but it suffers from several sources of error. In this study, over the Three Gorges region (TGR) of the Yangtze River, we have developed a first version of single-radar-based quantitative precipitation estimation (QPE) algorithms including various critical procedures, such as beam blockage analysis, ground clutter filtering, rain type identification, and multiple Z-R relations. The current radar-based QPE shows clearly range-dependent performance, which suggests it can reliably retrieve rainfall only within a range of 100 km. Further, comparison among three datasets processed with different algorithms reveals that beam blockage is one of the major error sources; within a range of 50 km, false alarms caused by ground clutter are another typical error. Multiple Z-R relations effectively reduced bias at various ranges. The overall performance indicates that radar QPE generally underestimates (overestimates) typical stratiform (convective) storm rainfall events and, as a whole, tends to slightly underestimate hourly rainfall during the 2010 summer.
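The "multiple Z-R relations" step amounts to inverting a power law Z = a·R^b with coefficients chosen by rain type. The sketch below uses common textbook coefficient choices (Marshall-Palmer for stratiform rain), which are not necessarily those of the study.

```python
# Rain-rate retrieval by inverting a Z-R power law, Z = a * R^b, with
# different coefficients per rain type. Coefficients are textbook values,
# not necessarily those used in the study.

def rain_rate(dbz, a, b):
    """Invert Z = a * R^b; dbz in dBZ, returns R in mm/h."""
    z = 10.0 ** (dbz / 10.0)          # reflectivity factor in mm^6 m^-3
    return (z / a) ** (1.0 / b)

stratiform = dict(a=200.0, b=1.6)     # Marshall-Palmer
convective = dict(a=300.0, b=1.4)
print(round(rain_rate(30.0, **stratiform), 2))   # moderate stratiform echo
print(round(rain_rate(50.0, **convective), 2))   # strong convective echo
```

Applying a single relation to both regimes biases one of them, which is one reason rain-type identification precedes the Z-R step in the algorithm chain described above.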
2013: Workshop on Weather Ready Nation: Science Imperatives for Severe Thunderstorm Research. Bulletin of the American Meteorological Society, 94, ES171–ES174, doi:10.1175/BAMS-D-12-00238.1.
2013: Research Needs for Severe Weather Research: An Introduction. International Journal of Mass Emergencies and Disasters, 31, 335–339.
2013: An Integrated Agenda for Research on Severe Storms. International Journal of Mass Emergencies and Disasters, 13, 428–454.
This article summarizes the major issues identified in the Birmingham workshop, which are categorized below in terms of the eight white paper topics. A number of these issues crossed disciplinary lines and, thus, were discussed in more than one of the disciplinary groups. Consequently, the allocation of issues to groups is somewhat arbitrary. The following section summarizes the organizational recommendations and research issues identified during the workshop. It is important to note that these discussions focused mostly on issues at the interface of different disciplines, especially the relationship between the meteorological and social/behavioral sciences. The last section of this report provides relatively detailed descriptions of 12 research projects that were proposed by the workshop participants and, where necessary, supplemented by the editors.
2014: Storm-Scale Ensemble Kalman Filter Assimilation of Total Lightning Flash-Extent Data. Monthly Weather Review, 142, 3683–3695, doi:10.1175/MWR-D-14-00061.1.
A set of observing system simulation experiments (OSSEs) demonstrates the potential benefit from ensemble Kalman filter (EnKF) assimilation of total lightning flash mapping data. Synthetic lightning data were generated to mimic the Geostationary Lightning Mapper (GLM) instrument that is planned for the Geostationary Operational Environmental Satellite R-series (GOES-R) platform. The truth simulation was conducted using multimoment bulk microphysics, explicit electrification mechanisms, and a branched lightning parameterization to produce 2-min-averaged synthetic pseudo-GLM observations at 8-km GLM resolution and at a hypothetical 1-km resolution.
The OSSEs use either perfect (two-moment bulk) or imperfect (single-moment, graupel only) microphysics. One OSSE with perfect microphysics included the same electrification physics as the truth simulation to generate lightning flash rates and flash-extent densities (FED). The other OSSEs used linear relationships between flash rate and graupel echo volume as the observation operator. The assimilation of FED at 8-km horizontal resolution can effectively modulate the convection simulated at 1-km horizontal resolution by sharpening the location of reflectivity echoes and the spatial location probability of convective updrafts. Tests with zero flash rates show that the lightning assimilation can help to limit spurious deep convection, as well. Pseudo-GLM observations at 1 km further sharpen the analyses of location (updraft and reflectivity) of the relatively simple storm structure.
2013: Investigation of ground-based microwave radiometer calibration techniques at 530 hPa. Atmos. Meas. Tech., 6, 2641–2658, doi:10.5194/amt-6-2641-2013.
Ground-based microwave radiometers (MWR) are becoming more and more common for remotely sensing the atmospheric temperature and humidity profile as well as path-integrated cloud liquid water content. The calibration accuracy of the state-of-the-art MWR HATPRO-G2 (Humidity And Temperature Profiler - Generation 2) was investigated during the second phase of the Radiative Heating in Underexplored Bands Campaign (RHUBC-II) in northern Chile (5320 m above mean sea level, 530 hPa), conducted by the Atmospheric Radiation Measurement (ARM) program between August and October 2009. This study assesses the quality of the two frequently used liquid nitrogen and tipping curve calibrations by performing a detailed error propagation study, which exploits the unique atmospheric conditions of RHUBC-II. Both methods are known to have open issues concerning systematic offsets and calibration repeatability. For the tipping curve calibration an uncertainty of ±0.1 to ±0.2 K (K-band) and ±0.6 to ±0.7 K (V-band) is found. The uncertainty in the tipping curve calibration is mainly due to atmospheric inhomogeneities and the assumed air mass correction for the Earth curvature. For the liquid nitrogen calibration the estimated uncertainty of ±0.3 to ±1.6 K is dominated by the uncertainty of the reflectivity of the liquid nitrogen target. A direct comparison between the two calibration techniques shows that for six of the nine channels that can be calibrated with both methods, they agree within the assessed uncertainties. For the other three channels the unexplained discrepancy is below 0.5 K. Systematic offsets, which may cause the disagreement of both methods within their estimated uncertainties, are discussed.
2014: The physical processes of current cutoff in lightning leaders. Journal of Geophysical Research, 119, 1–15, doi:10.1002/2013JD020494.
Current cutoff in lightning channels, which takes place in the development of both intracloud and cloud-to-ground flashes, is still poorly understood. A new evaluation of the conditions leading to current cutoff, and also of the two existing hypotheses of the cutoff mechanism, is the main objective of the paper. We reviewed the literature with results of measurements and modeling of free-burning arcs in a laboratory (the closest analogs of lightning leaders) focusing on the relationship between the internal electric field and current. This relationship is an essential factor governing the leader’s behavior in the current cutoff. Our analysis of the physical processes that lead to current cutoff in lightning channels is based on the accepted requirement that the electrical parameters of the leader satisfy the physical principles for plasma-channel formation. In our analysis of the mechanisms leading to current cutoff, we identify the two types of current cutoff in lightning channels: the first is the current cutoff in a single, unbranched leader channel, which occurs as the result of reaching the threshold for conditions for leader propagation; the second type of current cutoff occurs in branched leaders, when screening by the leader branches alters the ambient electrical environment, thus diminishing the leader current and causing cutoff at a branching point, or at the base of the straight channel that preceded branching. From observations of lightning in nature we provide new evidence of the screening effect of branching on current cutoff, and we advance the electrostatic model of this mechanism, introduced by Mazur and Ruhnke. We also critically evaluate the concept of lightning-channel instability, proposed by Heckman, as a mechanism leading to current cutoff, and show that the fundamentals of this concept, and, therefore, the concept in its entirety, are invalid.
2014: Enhancing understanding and improving prediction of severe weather through spatiotemporal relational learning. Machine Learning, 95, 27–50, doi:10.1007/s10994-013-5343-x.
Severe weather, including tornadoes, thunderstorms, wind, and hail, annually causes significant loss of life and property. We are developing spatiotemporal machine learning techniques that will enable meteorologists to improve the prediction of these events by improving their understanding of the fundamental causes of the phenomena and by building skillful empirical predictive models. In this paper, we present significant enhancements of our Spatiotemporal Relational Probability Trees that enable autonomous discovery of spatiotemporal relationships as well as learning with arbitrary shapes. We focus our evaluation on two real-world case studies using our technique: predicting tornadoes in Oklahoma and predicting aircraft turbulence in the United States. We also discuss how to evaluate success for a machine learning algorithm in the severe weather domain (which will enable new methods such as ours to transfer from research to operations), provide a set of lessons learned for embedded machine learning applications, and discuss how to field our technique.
2014: Doppler velocities at orthogonal polarizations in radar echoes from insects and birds. IEEE Geoscience and Remote Sensing Letters, 11, 592–596, doi:10.1109/LGRS.2013.2272011.
The difference of Doppler velocities (DDV) measured with weather radar at horizontal and vertical polarizations in echoes from insects and birds is considered. In weather echoes, DDV is usually less than 0.5 m s-1, whereas in echoes from flying birds and insects it can reach 5–7 m s-1. Such a large difference can be used as an additional parameter in distinguishing between weather and non-meteorological echoes. It is shown that large values of DDV pertain to multi-peaked Doppler spectra with different spectral differential reflectivities in the peaks.
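The discrimination rule implied above is a simple threshold on |DDV|. A minimal sketch follows; the 0.5 m/s threshold comes from the abstract, while the velocity samples are invented.

```python
# Flagging echoes as likely biological (birds/insects) when the magnitude
# of the Doppler-velocity difference between polarizations exceeds a
# threshold. Velocity samples are invented; threshold from the abstract.

def likely_biological(v_h, v_v, threshold=0.5):
    """True if |v_H - v_V| (m/s) exceeds the weather-echo threshold."""
    return abs(v_h - v_v) > threshold

print(likely_biological(6.2, 1.1))   # large DDV: biological scatterers
print(likely_biological(4.0, 3.8))   # small DDV: consistent with weather
```

In practice such a flag would be one input among several (e.g., alongside polarimetric variables) rather than a standalone classifier.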
2014: Parameterization of ice fall speeds in midlatitude cirrus: Results from SPartICus. Journal of Geophysical Research, 119, 3857–3876, doi:10.1002/2013JD020602.
The climate sensitivity predicted in general circulation models can be sensitive to the treatment of the ice particle fall velocity. In this study, the mass-weighted ice fall speed (Vm) and the number concentration ice fall speed (Vn) in midlatitude cirrus clouds are computed from in situ measurements of ice particle area and number concentration made by the two-dimensional stereo probe during the Small Particles In Cirrus field campaign. For single-moment ice microphysical schemes, Vm and the ice particle size distribution effective diameter (De) were parameterized in terms of cloud temperature (T) and ice water content (IWC). For two-moment schemes, Vm and Vn were related to De and the mean maximum dimension (meanD), respectively. For single-moment schemes, although the correlations of Vm and De with T were higher than the correlations of Vm and De with IWC, it is demonstrated that Vm and De are better predicted by using both T and IWC. The parameterization relating Vm to T and IWC is compared with another scheme relating Vm to T and IWC, with the latter based on millimeter cloud radar measurements. Regarding two-moment ice microphysical schemes, a strong correlation was found between De and Vm and between meanD and Vn owing to their similar weightings by ice particle mass and number concentration, respectively. Estimating Vm from De makes Vm a function of IWC and projected area, realistically coupling Vm with both the cloud microphysics and radiative properties.
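The definitions behind Vm and Vn, the mass-weighted and number-concentration-weighted fall speeds of a particle size distribution, can be written out directly. The sketch below uses a three-bin toy distribution with invented masses, fall speeds, and concentrations.

```python
# Mass-weighted (Vm) and number-weighted (Vn) fall speeds of a discretized
# ice particle size distribution. All bin values are invented.

def weighted_fall_speeds(mass, speed, number):
    """Return (Vm, Vn) for per-bin particle mass, fall speed, and concentration."""
    vm = sum(m * n * v for m, v, n in zip(mass, speed, number)) / \
         sum(m * n for m, n in zip(mass, number))
    vn = sum(n * v for v, n in zip(speed, number)) / sum(number)
    return vm, vn

mass = [1e-9, 5e-9, 2e-8]       # kg per particle (invented)
speed = [0.2, 0.5, 1.0]         # m/s (invented)
number = [1e5, 1e4, 1e3]        # particles per m^3 (invented)
vm, vn = weighted_fall_speeds(mass, speed, number)
print(vm, vn)
```

Because mass weighting emphasizes the large, fast-falling particles, Vm exceeds Vn for realistic distributions, which is why the two speeds correlate with different distribution moments (De and the mean maximum dimension, respectively) in the parameterizations described above.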
2013: Multi-Doppler radar analysis and forecast of a tornadic thunderstorm using a 3D variational data assimilation technique and ARPS model. Advances in Meteorology, 2013, 1–18, doi:10.1155/2013/281695.
A three-dimensional variational (3DVAR) assimilation technique developed for a convective-scale NWP model—the Advanced Regional Prediction System (ARPS)—is used to analyze the 8 May 2003 Moore/Midwest City, Oklahoma, tornadic supercell thunderstorm. Previous studies of this case used only one or two radars located very close to the storm. However, three other radars observed the upper-level part of the storm; because they are located far from the targeted storm, they were overlooked by previous studies. High-frequency intermittent 3DVAR analyses are performed using data from all five radars, which together provide a more complete picture of this storm. The analyses capture a well-defined mesocyclone in the midlevels and the wind circulation associated with a hook-shaped echo. The analyses produced through this technique are used as initial conditions for a 40-minute storm-scale forecast. The impact of multiple radars on a short-term NWP forecast is most evident when compared to forecasts using data from only one and two radars. The use of all radars provides the best forecast, in which a strong low-level mesocyclone develops and tracks in close proximity to the actual tornado damage path.
2014: Evaluation of Single-Doppler Radar Wind Retrievals in Flat and Complex Terrain. Journal of Applied Meteorology and Climatology, 53, 1920–1931, doi:10.1175/JAMC-D-13-0297.1.
The accuracy of winds derived from Next Generation Weather Radar (NEXRAD) level-II data is assessed by comparison with independent observations from 915-MHz radar wind profilers. The evaluation is carried out at two locations with very different terrain characteristics. One site is located in an area of complex terrain within the State Line Wind Energy Center in northeastern Oregon. The other site is located in an area of flat terrain on the east-central Florida coast. The National Severe Storms Laboratory's two-dimensional variational data assimilation (2DVar) algorithm is used to retrieve wind fields from the KPDT (Pendleton, Oregon) and KMLB (Melbourne, Florida) NEXRAD radars. Wind speed correlations at most observation height levels fell in the range from 0.7 to 0.8, indicating that the retrieved winds followed temporal fluctuations in the profiler-observed winds reasonably well. The retrieved winds, however, consistently exhibited slow biases in the range of 1–2 m s−1. Wind speed difference distributions were broad, with standard deviations in the range from 3 to 4 m s−1. Results from the Florida site showed little change in the wind speed correlations and difference standard deviations with altitude between about 300 and 1400 m AGL. Over this same height range, results from the Oregon site showed a monotonic increase in the wind speed correlation and a monotonic decrease in the wind speed difference standard deviation with increasing altitude. The poorest overall agreement occurred at the lowest observable level (~300 m AGL) at the Oregon site, where the effects of the complex terrain were greatest.
2013: Processing and Calibration of Submillimeter Fourier Transform Radiometer Spectra From the RHUBC-II Campaign. IEEE Transactions on Geoscience and Remote Sensing, 51, 5187–5198, doi:10.1109/TGRS.2012.2231869.
The Radiative Heating in Underexplored Bands Campaign-II, conducted in 2009 from a high-altitude site in northern Chile, combined ground-based radiometry with radiosonde measurements of atmospheric state for the purpose of testing atmospheric radiation models under conditions strongly influenced by water vapor in the middle to upper troposphere. A suite of broadband Fourier transform spectrometers (FTSs) measured the entire terrestrial thermal radiance spectrum from 1000 to 3.3 μm in wavelength. The submillimeter portion of the spectrum, from 1000 to 85 μm (300–3500 GHz), was covered by a polarizing FTS referred to as the Smithsonian Astrophysical Observatory (SAO) FTS. Here, we describe data processing and radiometric calibration algorithms for this instrument. These include correction of interferograms for periodic sampled lag error, development of a temperature-dependent instrument calibration model, and principal component analysis of the complete set of spectra acquired during the campaign.
2014: Understanding Thermal Drift in Liquid Nitrogen Loads Used for Radiometric Calibration in the Field. Journal of Atmospheric and Oceanic Technology, 31, 647–655, doi:10.1175/JTECH-D-13-00171.1.
An absorbing load in a liquid nitrogen bath is commonly used as a radiance standard for calibrating radiometers operating at microwave to infrared wavelengths. It is generally assumed that the physical temperature of the load is stable and equal to the boiling point temperature of pure N2 at the ambient atmospheric pressure. However, this assumption will fail to hold when air movement, as encountered in outdoor environments, allows O2 gas to condense into the bath. Under typical conditions, initial boiling point drift rates of order 25 mK/min can occur, and the boiling point of a bath maintained by repeated refilling with pure N2 can eventually shift by approximately 2 K. Laboratory bench tests of a liquid nitrogen bath under simulated wind conditions are presented together with an example of an outdoor radiometer calibration that demonstrates the effect, and the physical processes involved are explained in detail. A key finding is that in windy conditions, changes in O2 volume fraction are related accurately to fractional changes in bath volume due to boiloff, independent of wind speed. This relation can be exploited to ensure that calibration errors due to O2 contamination remain within predictable bounds.
2014: A Fully Reconfigurable Polarimetric Phased Array Antenna Testbed. International Journal of Antennas and Propagation, 2014, 1–14, doi:10.1155/2014/439606.
The configurable phased array demonstrator (CPAD) is a low-cost, reconfigurable, small-scale testbed for dual-polarized array antenna and radar prototyping. It is based on the concept that individual transmit and receive (TR) modules and radiating elements can be configured in different ways to study the impact of various array manifolds on radiation pattern performance. For example, CPAD is configured as (a) a 4 × 4 planar array, (b) a planar array with mirror configuration, and (c) a circular array to support the multifunctional phased array radar (MPAR) system risk reduction studies. The system is described in detail, and measurement results are presented and analyzed.
2013: Correcting fast-mode pressure errors in storm-scale ensemble Kalman filter analyses. Advances in Meteorology, 2013, 1–14, doi:10.1155/2013/624931.
A typical storm-scale ensemble Kalman filter (EnKF) analysis/forecast system is shown to introduce imbalances into the ensemble posteriors that generate acoustic waves in subsequent integrations. When the EnKF is used to research storm-scale dynamics, the resulting spurious pressure oscillations are large enough to impact investigation of processes driven by nonhydrostatic pressure gradient forces. Fortunately, thermodynamic retrieval techniques traditionally applied to dual-Doppler wind analyses can be adapted to diagnose the balanced portion of an EnKF pressure analysis, thereby eliminating the fast-mode pressure oscillations. The efficacy of this approach is demonstrated using a high-resolution supercell thunderstorm simulation as well as EnKF analyses of a simulated and a real supercell.
2013: Correction of Radar QPE Errors Associated with Low and Partially Observed Brightband Layers. Journal of Hydrometeorology, 14, 1933–1943.
The melting of aggregated snow/crystals often results in an enhancement of the reflectivity observed by weather radars, commonly referred to as the bright band (BB). The locally high reflectivity often causes overestimation in radar quantitative precipitation estimates (QPE) if no appropriate correction is applied. When the melting layer is high, a complete BB layer profile (including top, peak, and bottom) can be observed by the ground radar, and a vertical profile of reflectivity (VPR) correction can be made to reduce the BB impact. When a melting layer is near the ground and the bottom part of the bright band cannot be observed by the ground radar, a VPR correction cannot be made directly from the Weather Surveillance Radar-1988 Doppler (WSR-88D) observations. This paper presents a new VPR correction method for this situation. From high-resolution precipitation profiler data, an empirical relationship between the BB peak and the BB bottom is developed. This relationship is combined with the apparent BB peak observed by volume-scan radars to estimate the BB bottom. Radar QPEs are then corrected based on the estimated BB bottom. The new method was tested on 13 radars during seven low-brightband events over different areas of the United States. It is shown to be effective in reducing radar QPE overestimation under low-brightband situations.
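The correction step can be sketched as follows. The linear peak-to-bottom regression coefficients (A_FIT, B_FIT), the assumed 1.5-dB mean bright-band enhancement, and the Marshall-Palmer Z-R relation are all placeholders standing in for the empirical profiler-derived relationships in the paper.

```python
# Hypothetical coefficients -- NOT the paper's empirical fit
A_FIT, B_FIT = 0.85, -0.25        # BB bottom (km) from BB peak height (km)
BB_ENHANCEMENT_DB = 1.5           # assumed mean BB reflectivity excess (dB)

def estimate_bb_bottom(peak_height_km):
    """Estimate the unobserved BB bottom from the observed BB peak height."""
    return A_FIT * peak_height_km + B_FIT

def corrected_rain_rate(z_dbz, beam_height_km, bb_peak_km):
    """Remove the assumed BB enhancement, then apply Marshall-Palmer R(Z)."""
    if estimate_bb_bottom(bb_peak_km) <= beam_height_km <= bb_peak_km:
        z_dbz -= BB_ENHANCEMENT_DB          # beam samples inside the BB
    z_lin = 10.0 ** (z_dbz / 10.0)
    return (z_lin / 200.0) ** (1.0 / 1.6)   # Marshall-Palmer, mm/h
```

A beam sampling inside the bright band is deflated before the Z-R conversion, so it yields a lower rain rate than the same reflectivity observed below the BB bottom.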
2013: Correction of Radar QPE Errors for Nonuniform VPRs in Mesoscale Convective Systems Using TRMM Observations. Journal of Hydrometeorology, 14, 1672–1682.
Mesoscale convective systems (MCSs) contain both regions of convective and stratiform precipitation, and a bright band (BB) is often found in the stratiform region. Inflated reflectivity intensities in the BB often cause positive biases in radar quantitative precipitation estimation (QPE). A vertical profile of reflectivity (VPR) correction is necessary to reduce such biases. However, existing VPR correction methods for ground-based radars often perform poorly for MCSs owing to their coarse resolution and poor coverage in the vertical direction, especially at far ranges. Spaceborne radars such as the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR), on the other hand, can provide high resolution VPRs. The current study explores a new approach of incorporating the TRMM VPRs into the VPR correction for the Weather Surveillance Radar-1988 Doppler (WSR-88D) radar QPE. High-resolution VPRs derived from the Ku-band TRMM PR data are converted into equivalent S-band VPRs using an empirical technique. The equivalent S-band TRMM VPRs are resampled according to the WSR-88D beam resolution, and the resampled (apparent) VPRs are then used to correct for BB effects in the WSR-88D QPE when the ground radar VPR cannot accurately capture the BB bottom. The new scheme was tested on six MCSs from different regions in the United States and it was shown to provide effective mitigation of the radar QPE errors due to BB contamination.
2013: A real-time automated convective and stratiform precipitation segregation algorithm in native radar coordinates. Quarterly Journal of the Royal Meteorological Society, 139, 2233–2240.
A new convective/stratiform (C/S) precipitation segregation algorithm was developed for application to single-radar volume scan data in their native (spherical) coordinates. The new algorithm consists of two parts: the first finds convective rainfall cores based on physical characteristics of different rainfall types, and the second delineates the full convective area through seeded region growing. The new scheme takes into account radar sampling characteristics and a variety of precipitation scenarios in which C/S delineation is relatively challenging. The new C/S delineation scheme has two impacts on radar quantitative precipitation estimation (QPE): (i) it correctly separates convective and stratiform regions so that appropriate Ze–R relationships can be applied; and (ii) it correctly defines the stratiform area so that a vertical profile of reflectivity (VPR) correction can be applied. The VPR correction is very important for reducing overestimation errors in radar QPEs associated with the bright band. The new algorithm was tested on many events and showed improved performance over previous schemes, especially when handling strong bright bands and melting graupel in stratiform precipitation. The new scheme was also tested in radar QPE, where it consistently reduced the root-mean-square error and mean absolute bias relative to gauges.
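The two-part idea can be illustrated with a minimal sketch: mark convective "seed" cores by a reflectivity threshold, then grow the convective region outward to a lower threshold with a breadth-first flood fill. The thresholds and the 2-D Cartesian grid are assumptions for illustration; the published scheme operates in spherical radar coordinates and uses richer physical criteria for the cores.

```python
from collections import deque

def segregate(refl, core_dbz=40.0, grow_dbz=30.0):
    """Return a boolean mask of convective cells via seeded region growing."""
    ny, nx = len(refl), len(refl[0])
    conv = [[False] * nx for _ in range(ny)]
    # Step 1: seed with convective cores above the core threshold
    queue = deque((i, j) for i in range(ny) for j in range(nx)
                  if refl[i][j] >= core_dbz)
    for i, j in queue:
        conv[i][j] = True
    # Step 2: grow outward to 4-neighbors above the lower threshold
    while queue:
        i, j = queue.popleft()
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < ny and 0 <= nj < nx and not conv[ni][nj] \
                    and refl[ni][nj] >= grow_dbz:
                conv[ni][nj] = True
                queue.append((ni, nj))
    return conv

grid = [[20, 32, 45, 33, 20],
        [18, 31, 38, 31, 25],
        [15, 22, 28, 35, 31]]
mask = segregate(grid)
```

Cells above 30 dBZ that are not connected to a 40-dBZ core remain classified as stratiform, which is the property that distinguishes region growing from a simple threshold.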
2014: Improving WSR-88D Radar QPE for Orographic Precipitation Using Profiler Observations. Journal of Hydrometeorology, 15, 1135–1151, doi:10.1175/JHM-D-13-0131.1.
Quantitative precipitation estimation (QPE) in the West Coast region of the United States has been a major challenge for the Weather Surveillance Radar-1988 Doppler (WSR-88D) because of severe blockages caused by the complex terrain. The majority of the heavy precipitation in the West Coast region is associated with strong moisture flux from the Pacific that interacts with the coastal mountains. Such orographic enhancement of precipitation occurs at low levels and cannot be observed well by WSR-88D because of severe blockages. Specifically, the radar beam either samples too high above the ground, missing the orographic enhancement at lower levels, or broadens with range and cannot adequately resolve vertical variations of the reflectivity structure. The current study developed an algorithm that uses S-band Precipitation Profiler (S-PROF) radar observations in northern California to improve WSR-88D QPEs in the area. The profiler data are used to calculate two sets of reference vertical profiles of reflectivity (RVPRs), one for the coastal mountains and another for the Sierra Nevada. The RVPRs are then used to correct the WSR-88D QPEs in the corresponding areas. The S-PROF–based VPR correction methodology (S-PROF-VPR) takes into account orographic processes and radar beam broadening with range. It is tested using three heavy rain events and is found to provide significant improvements over the operational radar QPE.
2014: Performance assessment of the successive Version 6 and Version 7 TMPA products over the climate-transitional zone in the southern Great Plains, USA. Journal of Hydrology, 513, 446–456, doi:10.1016/j.jhydrol.2014.03.040.
This study assesses the latest version, Version 7 (V7), Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) rainfall estimates by comparison with the previous version, Version 6 (V6), for both the near-real-time product (3B42RT) and the post-real-time research product (3B42) over the climate-transitional zone in the southern Great Plains, USA. Two basins, the Verdigris River Basin (VRB) in the east and the Upper Washita Basin (UWB) in the west, with distinctive precipitation but similar vegetation and elevation, were selected to evaluate the TMPA products using rain gauge-blended products with WSR-88D NEXRAD Stage IV. This study provides important insights into the detailed spatiotemporal precipitation errors, and also reveals algorithm performance during extreme events over the two low-relief basins within a high precipitation gradient zone. Based on nine years of measurements (2002–2010), this study shows that: (1) 3B42V7 corrects the widespread rainfall underestimation from research product 3B42V6, especially for the drier UWB, with relative bias (RB) improvement from −23.24% to 2.24%. (2) 3B42RTV7 reduces the widespread, notable overestimation from the real-time product 3B42RTV6, with minor overestimation in the wet VRB and underestimation in the dry UWB. (3) For both versions of TMPA products, larger root mean square error (RMSE) but higher correlation coefficients (CCs) tend to appear for the wet VRB, while lower RMSE and CC mostly occur in the dry UWB. 3B42RTV7 shows a drawback in that the CC declines significantly, especially in the dry region, where it drops below 0.5. (4) Seasonally, autumn rainfall estimations in both versions and basins have the least bias. The 3B42RTV6 overestimation and 3B42V6 underestimation of spring and summer rainfall, which dominate the annual total bias, are significantly reduced for both basins in the V7 products.
Winter precipitation estimation improvement is also noticeable with significant RB and RMSE reductions. However, considerable overestimation in summer rainfall still exists for the wet basin. (5) Although V7 has the overall best performance, it still shows deficiency in detecting extreme rainfall events in low-relief regions, tending to underestimate peak rainfall intensity and to misrepresent timing and locations. Results from this study can be used for reference in the algorithm development of the next generation of Integrated Multi-Satellite Retrievals for Global Precipitation Measurement (GPM) scheduled to launch in 2014.
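The evaluation statistics named in this abstract (RB, RMSE, CC) have standard definitions that can be sketched compactly; the sample numbers below are made up for illustration.

```python
import math

def rb(est, ref):
    """Relative bias in percent: total estimate vs. total reference."""
    return 100.0 * (sum(est) - sum(ref)) / sum(ref)

def rmse(est, ref):
    """Root-mean-square error of paired estimates."""
    return math.sqrt(sum((e - r) ** 2 for e, r in zip(est, ref)) / len(ref))

def cc(est, ref):
    """Pearson correlation coefficient."""
    n = len(ref)
    me, mr = sum(est) / n, sum(ref) / n
    cov = sum((e - me) * (r - mr) for e, r in zip(est, ref))
    ve = math.sqrt(sum((e - me) ** 2 for e in est))
    vr = math.sqrt(sum((r - mr) ** 2 for r in ref))
    return cov / (ve * vr)

ref = [1.0, 4.0, 2.5, 8.0, 0.5]   # reference rainfall (e.g., Stage IV), mm
est = [0.8, 4.5, 2.0, 7.0, 0.9]   # satellite estimate, mm -- illustrative
```

Note the complementary behavior these metrics capture: RB measures the long-term water budget error, RMSE the magnitude of event-scale errors, and CC only the co-variability, which is why a product can improve in RB while degrading in CC, as reported for 3B42RTV7.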
2013: Retrieving 3-D wind field from phased-array radar rapid scans. Advances in Meteorology, 2013, 441–456, doi:10.1155/2013/792631.
The previous two-dimensional simple adjoint method for retrieving the horizontal wind field from a time sequence of single-Doppler scans of reflectivity and/or radial velocity is further developed into a new method that retrieves both horizontal and vertical winds at high temporal and spatial resolutions. The new method proceeds in two steps. First, the horizontal wind field is retrieved on the conical surface at each tilt (elevation angle) of the radar scan. Second, the vertical velocity field is retrieved in a vertical cross-section along the radar beam, with the horizontal velocity given from the first step. The method is applied to phased array radar (PAR) rapid scans of the storm winds and reflectivity in a strong microburst event and is shown to be able to retrieve the three-dimensional wind field around a targeted downdraft within the storm that subsequently produced a damaging microburst. The method is computationally very efficient and can be used for real-time applications with PAR rapid scans.
2014: Monitoring surface dryness using geostationary thermal observations. Remote Sensing Letters, 5, 10–18, doi:10.1080/2150704X.2013.862601.
This study introduces an operational land dryness index (DI) developed by the National Oceanic and Atmospheric Administration (NOAA)/National Severe Storms Laboratory (NSSL) to assist mainly in wildfire risk assessment and forecasting. The index is developed based on observations of the daytime rise of surface radiative temperature from the geostationary operational environmental satellites (GOES). Thermal measurements of heating rates are normalized using solar radiation, also from GOES, to account for spatial changes in solar time. Maps of the DI are produced systematically over the continental US on a daily basis from cloud-free pixels. In addition, anomalies of the DI are evaluated with respect to a 5-year mean to further classify the extent of dryness. The DI is assessed using (1) the microwave-based soil moisture product from the National Snow and Ice Data Center (NSIDC), (2) estimates of soil moisture from the North American Land Data Assimilation System (NLDAS), which is based on a North American land surface model, and (3) vegetation cover estimated from the normalized difference vegetation index (NDVI). An overall agreement is found between the DI and microwave-based estimates of soil moisture. The peak absolute correlation, which reached around 0.6, is found in late summer over scrubland. The correlation between the products also shows a seasonal pattern that needs to be corroborated with further observations. The consistency of the developed product with other independent measures implies its reliability and its potential in wildfire forecasting.
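The index construction described above reduces to a normalization and an anomaly. This is a schematic sketch only: the variable names, units, and sample climatology are invented for illustration and do not reproduce the operational NSSL formulation.

```python
def dryness_index(t_rise_k, insolation_wm2):
    """Daytime surface temperature rise normalized by solar forcing."""
    return t_rise_k / insolation_wm2

def di_anomaly(di_today, di_climatology):
    """Departure of today's DI from the multi-year mean for the same day."""
    mean = sum(di_climatology) / len(di_climatology)
    return di_today - mean

clim = [0.010, 0.012, 0.011, 0.013, 0.009]   # 5-year sample values (made up)
today = dryness_index(9.0, 600.0)            # 9 K rise under 600 W m^-2
anom = di_anomaly(today, clim)               # positive anomaly -> drier surface
```

The normalization is what lets a single heating-rate threshold be compared across pixels at different solar times and insolation levels.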
2014: Sources of uncertainty in precipitation type forecasting. Weather and Forecasting, 29, 936–953, doi:10.1175/WAF-D-14-00007.1.
Five implicit precipitation type algorithms are assessed using observed and model-forecast sounding data in order to measure their accuracy and to gauge the effects of model uncertainty on algorithm performance. When applied to observed soundings, all algorithms provide very reliable guidance on snow and rain (SN and RA). However, their skills for ice pellets and freezing rain (IP and FZRA) are comparatively low. Most misclassifications of IP are as FZRA and vice versa. Deeper investigation reveals that no method used in any of the algorithms to differentiate between IP and FZRA allows for clear discrimination between the two forms. The effects of model uncertainty are also considered. For SN and RA, these effects are minimal and each algorithm performs reliably. Conversely, IP and FZRA are strongly impacted. When the range of uncertainty is fully accounted for, the resulting wet-bulb temperature profiles are nearly indistinguishable, leading to very poor skill for all algorithms. Although currently available data do not allow for a thorough investigation, comparison of the statistics from only those soundings associated with long-duration, horizontally uniform regions of FZRA shows significant differences between these profiles and those from more transient, highly variable environments. Hence, a five-category (SN, RA, IP, FZRA, and IP/FZRA mix) approach is advocated to differentiate sustained regions of horizontally uniform FZRA (or IP) from more mixed environments.
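A deliberately simplified explicit classifier illustrates the style of profile logic such algorithms use, without reproducing any of the five assessed schemes: scan a wet-bulb temperature profile from the surface upward for a melting layer aloft above a subfreezing surface layer. The 800-m cold-layer depth separating IP from FZRA is an assumed threshold for illustration; the abstract's point is precisely that no simple discriminator of this kind separates the two reliably.

```python
def ptype(tw_c, z_m, deep_cold_m=800.0):
    """Classify precipitation type from a surface-up wet-bulb profile.

    tw_c: wet-bulb temperatures (deg C), z_m: heights (m AGL), surface first.
    """
    if max(tw_c) <= 0.0:
        return "SN"      # entirely subfreezing profile: snow survives
    if tw_c[0] > 0.0:
        return "RA"      # warm surface layer: melted hydrometeors stay liquid
    # Warm layer aloft over a subfreezing surface layer: a deep cold layer
    # refreezes drops into ice pellets; a shallow one yields freezing rain.
    cold_top_m = next(z for t, z in zip(tw_c, z_m) if t > 0.0)
    return "IP" if cold_top_m >= deep_cold_m else "FZRA"
```

Because the IP/FZRA decision hinges on small differences in cold-layer depth and temperature, small model errors flip the answer, which is the mechanism behind the poor IP/FZRA skill reported above.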
2014: Next-Generation Severe Weather Forecasting and Communication. EOS (AGU), 95, 325–326, doi:10.1002/2014EO360001.
Despite advances in predicting hazardous weather, the underlying methodologies used to generate severe weather watches and warnings have changed little over the past several decades. Now, researchers are proposing a new concept called Forecasting a Continuum of Environmental Threats (FACETs), which aims to enhance weather forecasting with high-resolution probabilistic hazard information.
2013: Polarimetric radar characteristics of melting hail. Part I: Theoretical simulations using spectral microphysical modeling. Journal of Applied Meteorology and Climatology, 52, 2849–2870.
Spectral (bin) microphysics models are used to simulate polarimetric radar variables in melting hail. Most computations are performed in a framework of a steady-state, one-dimensional column model. Vertical profiles of Z, ZDR, KDP, Ah, and ADP are modeled at S, C, and X bands for a variety of size distributions of ice particles aloft. The impact of temperature lapse rate, humidity, vertical air velocities, and ice particle density on the vertical profiles of the radar variables is also investigated. Polarimetric radar signatures of melting hail depend on the degree of melting or the height of the radar resolution volume with respect to the freezing level, which determines the relative fractions of partially and completely melted hail (i.e., rain). Simulated vertical profiles of radar variables are very sensitive to radar wavelength and the slope of the size distribution of hail aloft, which is well correlated with maximal hail size. Analysis of the relative contributions of different parts of the hail/rain size spectrum to the radar variables explains a number of experimentally observed features, such as large differences in Z of hail at the three radar wavelengths, unusually high values of ZDR at C band, and relative insensitivity of the measurements at C and X bands to the presence of large hail exceeding 2.5 cm in diameter. Modeling results are consistent with S- and C-band polarimetric radar observations and are utilized in Part II for devising practical algorithms for hail detection and determination of its size as well as attenuation correction and rainfall estimation in the presence of hail.
2013: Polarimetric radar characteristics of melting hail. Part II: Practical implications. Journal of Applied Meteorology and Climatology, 52, 2871–2886.
The results of theoretical modeling in Part I of this paper series are utilized to develop practical recommendations for algorithms for hail detection, hail size determination, attenuation correction, and rainfall estimation in the presence of hail. A new algorithm for discrimination between small hail (maximal size less than 2.5 cm), large hail (diameters between 2.5 and 5.0 cm), and giant hail (size exceeding 5.0 cm) is proposed and implemented for applications with the S-band dual-polarization WSR-88D radars. The fuzzy-logic algorithm is based on the combined use of radar reflectivity Z, differential reflectivity ZDR, and cross-correlation coefficient ρhv. The parameters of the membership functions depend on the height of the radar resolution volume with respect to the freezing level, exploiting the size-dependent melting characteristics of hailstones. The attenuation effects in melting hail are quantified in this study, and a novel technique for polarimetric attenuation correction in the presence of hail is suggested. The use of a rainfall estimator based on specific differential phase KDP is justified by the results of theoretical simulations and by comparison of actual S-band radar retrievals with gauge measurements for storms containing large hail with diameters exceeding 2.5 cm.
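A toy fuzzy-logic sketch in the spirit of the algorithm described above: trapezoidal membership functions for Z, ZDR, and ρhv are aggregated for each size class, and the class with the highest aggregate score wins. All membership breakpoints below are invented placeholders, not the operational parameters (which, as noted, also vary with the height of the resolution volume relative to the freezing level).

```python
def trap(x, a, b, c, d):
    """Trapezoidal membership: 0 outside (a, d), ramps to 1 on [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Breakpoints per class for (Z dBZ, ZDR dB, rho_hv) -- illustrative only
RULES = {
    "small hail": [(45, 50, 60, 65), (0.5, 1.0, 3.0, 4.0), (0.90, 0.93, 0.99, 1.00)],
    "large hail": [(55, 60, 70, 75), (-1.0, -0.5, 1.0, 2.0), (0.85, 0.90, 0.96, 0.98)],
    "giant hail": [(60, 65, 80, 85), (-2.0, -1.0, 0.5, 1.0), (0.75, 0.80, 0.88, 0.90)],
}

def classify(z, zdr, rhohv):
    """Minimum (fuzzy AND) over variables, maximum over classes."""
    scores = {k: min(trap(z, *p[0]), trap(zdr, *p[1]), trap(rhohv, *p[2]))
              for k, p in RULES.items()}
    return max(scores, key=scores.get)
```

The qualitative trend encoded here follows the abstract's physics: larger hail tends toward higher Z, lower (even negative) ZDR, and depressed ρhv.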
2014: Potential utilization of specific attenuation for rainfall estimation, mitigation of partial beam blockage, and radar networking. Journal of Atmospheric and Oceanic Technology, 31, 599–619.
The potential utilization of specific attenuation A for rainfall estimation, mitigation of partial beam blockage, and radar networking is investigated. The R(A) relation is less susceptible to the variability of drop size distributions than traditional rainfall algorithms based on radar reflectivity Z, differential reflectivity ZDR, and specific differential phase KDP in a wide range of rain intensity. Specific attenuation is estimated from the radial profile of the measured Z and the total span of the differential phase using the ZPHI method. Since the estimated A is immune to reflectivity biases caused by radar miscalibration, attenuation, partial beam blockage, and wet radomes, rain retrieval from R(A) is also immune to the listed factors. The R(A) method was tested at X band using data collected by closely located radars in Germany and at S band for polarimetrically upgraded WSR-88D radars in the United States. It is demonstrated that two adjacent X-band radars, one miscalibrated and the other affected by partial beam blockage, produce almost indistinguishable fields of rain rate. It is also shown that the R(A) method yields robust estimates of rain rates and rain totals at S band, where specific attenuation is vanishingly small. The X-band and S-band estimates of rainfall obtained from R(A) are in good agreement with gauges.
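The ZPHI retrieval along a single ray can be sketched compactly. The standard form computes A(r) from the measured reflectivity profile and the total differential phase span ΔΦ; the constants b, α, and the R(A) power-law coefficients below are typical textbook-style values used purely for illustration, not the ones adopted in the paper.

```python
import math

def zphi_specific_attenuation(z_dbz, dr_km, delta_phi_deg,
                              b=0.62, alpha=0.015):
    """Specific attenuation A(r) [dB/km] along one ray via the ZPHI method.

    z_dbz: measured reflectivities per gate; dr_km: gate spacing;
    delta_phi_deg: total differential-phase span across the ray;
    alpha: assumed A/KDP ratio [dB/deg]; b: assumed A-Z exponent.
    """
    za_b = [10.0 ** (0.1 * b * z) for z in z_dbz]          # [Za]^b, linear
    c = math.exp(0.23 * b * alpha * delta_phi_deg) - 1.0   # C(b, alpha*dPhi)
    # tail[i] ~ I(r_i, r2) = 0.46 b * integral of [Za]^b to the ray end
    tail = [0.0] * (len(za_b) + 1)
    for i in range(len(za_b) - 1, -1, -1):
        tail[i] = tail[i + 1] + 0.46 * b * za_b[i] * dr_km
    i_full = tail[0]                                       # I(r1, r2)
    return [za_b[i] * c / (i_full + c * tail[i]) for i in range(len(za_b))]

ray = [30.0] * 50                                # uniform 30-dBZ ray, 50 gates
a = zphi_specific_attenuation(ray, dr_km=0.25, delta_phi_deg=5.0)
pia = sum(ai * 0.25 for ai in a)                 # one-way path-integrated A
rain = [4120.0 * ai ** 1.03 for ai in a]         # an S-band-style R(A) law
```

The key property behind the method's immunity to reflectivity bias is visible in the construction: the path integral of A is pinned to α·ΔΦ/2 by the differential phase, so a constant calibration offset in Z cancels between numerator and denominator.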
2013: Geostationary Operational Environmental Satellite (GOES)-14 super rapid scan operations to prepare for GOES-R. Journal of Applied Remote Sensing, 7, 1–20, doi:10.1117/1.JRS.7.073462.
The Geostationary Operational Environmental Satellite (GOES)-14 imager was operated by the National Oceanic and Atmospheric Administration (NOAA) in an experimental rapid-scan 1-min mode that emulates the high-temporal-resolution sampling of the Advanced Baseline Imager (ABI) on the next-generation GOES-R series. Imagery of many phenomena was acquired at a 1-min refresh rate, including clouds, convection, fires, smoke, and hurricanes, among them 6 days of Hurricane Sandy through landfall. NOAA had never before operated a GOES in a nearly continuous 1-min mode for such an extended period of time, making these unique datasets for exploring the future capabilities possible with GOES-R. The next-generation GOES-R imager will be able to routinely take mesoscale (1000 km × 1000 km) images every 30 s (or of two separate locations every minute). These images can be acquired even while scanning continental United States and full-disk images. These high-time-resolution images from the GOES-14 imager are being used to prepare for the GOES-R era and its advanced imager, including both the imagery and quantitative derived products such as cloud-top cooling. Several animations are included to showcase the rapid change of the many phenomena observed during super rapid scan operations for GOES-R (SRSOR).
2014: Incorporating surface soil moisture information in error modeling of TRMM passive microwave rainfall. IEEE Transactions on Geoscience and Remote Sensing, 52, 6226–6240, doi:10.1109/TGRS.2013.2295795.
This study assesses the significance of conditioning the error modeling of the National Aeronautics and Space Administration (NASA) Tropical Rainfall Measuring Mission Microwave Imager rainfall algorithm (2A12) on near-surface soil moisture data derived from a land surface model. Here, "conditioning" means making the model parameters depend on soil wetness. The Oklahoma (OK) region is used as the study area due to its relatively low vegetation and smooth terrain and the availability of high-quality in situ hydrometeorological data from the Mesonet network. The study period includes two warm seasons (March to October) from 2009 and 2010. The National Oceanic and Atmospheric Administration/National Severe Storms Laboratory ground radar-based National Mosaic and Quantitative Precipitation Estimation system (NMQ/Q2) is used as high-resolution (5-min/1-km) reference rainfall. The surface wetness conditions (wet, dry, and normal) were determined from surface soil moisture fields simulated by the NASA Catchment Land Surface Model forced with Q2 rainfall fields. A 2-D satellite rainfall error model, SREM2D, is used to provide the ensemble error representation of 2A12 rainfall using two different parameter calibration approaches: conditioning the SREM2D parameters on the surface soil wetness categories versus not. The statistical analysis of model-generated ensembles and associated error metrics shows better performance when surface wetness information is used in SREM2D. In terms of quantification, the ensemble rainfall from the conditional SREM2D parameter calibration shows better reference rainfall encapsulation. The conditioning of SREM2D on soil wetness can apply to rainfall rate estimates from other microwave sensors on board low Earth orbiting satellites and is valuable for the forthcoming missions on precipitation (Global Precipitation Measurement) and soil moisture (Soil Moisture Active Passive).
2014: VORTEX2 observations of a low-level mesocyclone with multiple internal rear-flank downdraft momentum surges in the 18 May 2010 Dumas, Texas, supercell. Monthly Weather Review, 142, 2935–2960.
Observations collected in the second Verification of the Origins of Rotation in Tornadoes Experiment during a 15-min period of a supercell occurring on 18 May 2010 near Dumas, Texas, are presented. The primary data collection platforms include two Ka-band mobile Doppler radars, which collected a near-surface, short-baseline dual-Doppler dataset within the rear-flank outflow of the Dumas supercell; an X-band, phased-array mobile Doppler radar, which collected volumetric single-Doppler data with high temporal resolution; and in situ thermodynamic and wind observations of a six-probe mobile mesonet.
Rapid evolution of the Dumas supercell was observed, including the development and decay of a low-level mesocyclone and four internal rear-flank downdraft (RFD) momentum surges. Intensification and upward growth of the low-level mesocyclone were observed during periods when the midlevel mesocyclone was minimally displaced from the low-level circulation, suggesting an upward-directed perturbation pressure gradient force aided in the intensification of low-level rotation. The final three internal RFD momentum surges evolved in a manner consistent with the expected behavior of a dynamically forced occlusion downdraft, developing at the periphery of the low-level mesocyclone during periods when values of low-level cyclonic azimuthal wind shear exceeded values higher aloft. Failure of the low-level mesocyclone to acquire significant vertical depth suggests that dynamic forcing above internal RFD momentum surge gust fronts was insufficient to lift the negatively buoyant air parcels comprising the RFD surges to significant heights. As a result, vertical acceleration and the stretching of vertical vorticity in surge parcels were limited, which likely contributed to tornadogenesis failure.
2014: Examination of a Real-Time 3DVAR Analysis System in the Hazardous Weather Testbed. Weather and Forecasting, 29, 63–77, doi:10.1175/WAF-D-13-00044.1.
Forecasters and research meteorologists tested a real-time three-dimensional variational data assimilation (3DVAR) system in the Hazardous Weather Testbed during the springs of 2010–2012 to determine its capabilities to assist in the warning process for severe storms. This storm-scale system updates a dynamically consistent three-dimensional wind field every five minutes, with a horizontal and average vertical grid spacing of 1 km and 400 m, respectively. The system analyzed the life cycles of 218 supercell thunderstorms on 27 event days during these experiments, producing multiple products such as vertical velocity, vertical vorticity, and updraft helicity. These data are compared to multi-radar/multi-sensor data from the Warning Decision Support System–Integrated Information to document the performance characteristics of the system, such as how vertical vorticity values compare to azimuthal shear fields calculated directly from Doppler radial velocity. Data are stratified by range from the nearest radar as well as by the number of radars entering into the analysis of a particular storm. The 3DVAR system shows physically realistic trends of updraft speed and vertical vorticity for a majority of cases. Improvements are needed to better estimate the near-surface winds when no radar is nearby and to improve the timeliness of the input data. However, the 3DVAR wind field information provides an integrated look at storm structure that may be of more use to forecasters than traditional radar-based proxies used to infer severe weather potential.
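Among the products named above, updraft helicity is a layer integral of vertical velocity times vertical vorticity. A minimal single-column sketch of that diagnostic (the 2–5-km layer and the simple trapezoidal integration are illustrative assumptions, not the 3DVAR system's actual implementation):

```python
import numpy as np

def updraft_helicity(w, zeta, heights, z_bot=2000.0, z_top=5000.0):
    """Updraft helicity for one model column: the vertical integral of
    vertical velocity (w, m/s) times vertical vorticity (zeta, 1/s)
    over the z_bot..z_top layer (m AGL)."""
    w = np.asarray(w, dtype=float)
    zeta = np.asarray(zeta, dtype=float)
    heights = np.asarray(heights, dtype=float)
    mask = (heights >= z_bot) & (heights <= z_top)
    # Trapezoidal integration over the selected layer
    return np.trapz(w[mask] * zeta[mask], heights[mask])
```

Large positive values of this integral flag rotating updrafts, which is why it serves as a supercell proxy in convection-allowing analyses.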
2013: The Impact of Covariance Localization for Radar Data on EnKF Analyses of a Developing MCS: Observing System Simulation Experiments. Monthly Weather Review, 141, 3691–3709, doi:10.1175/MWR-D-12-00203.1.
Several observing system simulation experiments (OSSEs) were performed to assess the impact of covariance localization of radar data on ensemble Kalman filter (EnKF) analyses of a developing convective system. Simulated Weather Surveillance Radar-1988 Doppler (WSR-88D) observations were extracted from a truth simulation and assimilated into experiments with localization cutoff choices of 6, 12, and 18 km in the horizontal and 3, 6, and 12 km in the vertical. Overall, increasing the horizontal localization and decreasing the vertical localization produced analyses with the smallest RMSE for most of the state variables. The convective mode of the analyzed system had an impact on the localization results. During cell mergers, larger horizontal localization improved the results. Prior state correlations between the observations and state variables were used to construct reverse cumulative density functions (RCDFs) to identify the correlation length scales for various observation-state pairs. The OSSE with the smallest RMSE employed localization cutoff values that were similar to the horizontal and vertical length scales of the prior state correlations, especially for observation-state correlations above 0.6. Vertical correlations were restricted to state points closer to the observations than in the horizontal, as determined by the RCDFs. Further, the microphysical state variables were correlated with the reflectivity observations on smaller scales than the three-dimensional wind field and radial velocity observations. The ramifications of these findings on localization choices in convective-scale EnKF experiments that assimilate radar data are discussed.
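The localization cutoffs varied in these OSSEs are typically applied through a compactly supported correlation function; a common choice is the Gaspari–Cohn fifth-order taper, sketched below. Note that the mapping between a system's "cutoff" and the length scale c is a convention that varies between implementations; here correlations reach exactly zero at distance 2c.

```python
import numpy as np

def gaspari_cohn(dist, c):
    """Gaspari-Cohn fifth-order piecewise-rational covariance taper.

    dist : distance between observation and state point
    c    : length scale; the taper is 1 at dist=0 and 0 for dist >= 2c
    """
    r = np.atleast_1d(np.abs(dist) / c).astype(float)
    taper = np.zeros_like(r)
    inner = r <= 1.0
    outer = (r > 1.0) & (r < 2.0)
    z = r[inner]
    taper[inner] = -0.25 * z**5 + 0.5 * z**4 + 0.625 * z**3 - (5.0 / 3.0) * z**2 + 1.0
    z = r[outer]
    taper[outer] = ((1.0 / 12.0) * z**5 - 0.5 * z**4 + 0.625 * z**3
                    + (5.0 / 3.0) * z**2 - 5.0 * z + 4.0 - 2.0 / (3.0 * z))
    return taper if np.ndim(dist) else float(taper[0])
```

In an EnKF, each ensemble-estimated covariance between an observation and a state point is multiplied by this taper before the update, suppressing spurious long-range correlations.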
2014: Analysis of flash flood parameters and human impacts in the US from 2006 to 2012. Journal of Hydrology, 519, 863–870, doi:10.1016/j.jhydrol.2014.07.004.
Several factors external to the natural hazard of flash flooding can contribute to the type and magnitude of the resulting damages. Human exposure, vulnerability, and fatality and injury rates can be minimized by identifying and then mitigating the causative factors for human impacts. A database of flash flooding was used for statistical analysis of human impacts across the United States: 21,549 flash flood events from October 2006 through 2012 were analyzed. Based on the information available in the database, physical parameters were introduced and then correlated to the reported human impacts. Probability density functions (PDFs) of the frequency of flash flood events, and PDFs of occurrences weighted by the number of injuries and fatalities, were used to describe the influence of each parameter.
The factors that emerged as the most influential on human impacts are short flood durations, small catchment sizes in rural areas, vehicles, and nocturnal events with low visibility. Analyzing and correlating a diverse range of parameters to human impacts gives important insight into what contributes to fatalities and injuries and raises further questions about how to manage them.
2014: Total Lightning Observations and Tools for the 20 May 2013 Moore, Oklahoma, Tornadic Supercell. Journal of Operational Meteorology, 2, 71–88, doi:10.15191/nwajom.2014.0207.
On 20 May 2013, a supercell thunderstorm developed west-southwest of Newcastle, Oklahoma, and eventually produced an EF-5 tornado that struck Moore, Oklahoma. This article describes how total lightning observations associated with this rotating storm could benefit warning operations. This effort focuses on (i) the Geostationary Operational Environmental Satellite-R pseudo-geostationary lightning mapper product, (ii) the National Aeronautics and Space Administration’s Short-term Prediction Research and Transition Center/Meteorological Development Laboratory’s total lightning tracking tool, and (iii) a real-time lightning jump algorithm currently under development. Use of these three tools revealed distinct increases, or “jumps,” in the storm’s lightning flash rates prior to reported severe weather. The first lightning jump occurred 19 min prior to severe hail and coincided with the storm’s initial growth, while the second lightning jump occurred 26 min prior to tornado touchdown. This second jump accompanied an increase in rotational depth and strength. These rapid increases in total lightning activity can provide improved situational awareness to forecasters, as lightning jumps relate to the rapid strengthening of a storm’s updraft and serve as a precursor to the stretching of the storm vortex prior to severe weather events. Although lightning jumps alone do not always indicate imminent severe weather, they (i) have the potential to help reduce false alarms and (ii) can guide forecasters to issue warnings earlier than they would have with radar data alone.
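The lightning jump algorithm referenced above is described only qualitatively here; a widely cited variant in the literature is the "2σ" rule, which flags a jump when the rate of change of total flash rate exceeds twice the standard deviation of recent changes. A hedged sketch of that rule (the window length, time step, and activity floor `min_rate` are illustrative assumptions, not the algorithm under development):

```python
def detect_lightning_jumps(flash_rates, dt_min=2.0, sigma_mult=2.0, min_rate=10.0):
    """Flag lightning jumps in a time series of total flash rates.

    flash_rates : flash rates (flashes/min) at consecutive dt_min-minute steps
    Returns indices where the rate of change (DFRDT) exceeds
    sigma_mult times the std. dev. of the five preceding changes.
    """
    jumps = []
    for i in range(6, len(flash_rates)):
        dfrdt = (flash_rates[i] - flash_rates[i - 1]) / dt_min
        recent = [(flash_rates[k] - flash_rates[k - 1]) / dt_min
                  for k in range(i - 5, i)]
        mean = sum(recent) / len(recent)
        sigma = (sum((x - mean) ** 2 for x in recent) / len(recent)) ** 0.5
        # Require meaningful storm activity and a genuinely anomalous increase
        if flash_rates[i] >= min_rate and sigma > 0 and dfrdt > sigma_mult * sigma:
            jumps.append(i)
    return jumps
```

The activity floor guards against flagging jumps in weak, noisy storms where a small absolute increase can still exceed 2σ.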
2013: Backscatter differential phase – estimation and variability. Journal of Applied Meteorology and Climatology, 52, 2529–2548, doi:10.1175/JAMC-D-13-0124.1.
On the basis of simulations and observations made with polarimetric radars operating at X, C, and S bands, the backscatter differential phase δ has been explored; δ has been identified as an important polarimetric variable that should not be ignored in precipitation estimations that are based on specific differential phase KDP, especially at shorter radar wavelengths. Moreover, δ bears important information about the dominant size of raindrops and wet snowflakes in the melting layer. New methods for estimating δ in rain and in the melting layer are suggested. The method for estimating δ in rain is based on a modified version of the “ZPHI” algorithm and provides reasonably robust estimates of δ and KDP in pure rain except in regions where the total measured differential phase ΦDP behaves erratically, such as areas affected by nonuniform beam filling or low signal-to-noise ratio. The method for estimating δ in the melting layer results in reliable estimates of δ in stratiform precipitation and requires azimuthal averaging of radial profiles of ΦDP at high antenna elevations. Comparisons with large disdrometer datasets collected in Oklahoma and Germany confirm a strong interdependence between δ and differential reflectivity ZDR. Because δ is immune to attenuation, partial beam blockage, and radar miscalibration, the strong correlation between ZDR and δ is of interest for quantitative precipitation estimation: δ and ZDR are differently affected by the particle size distribution (PSD) and thus may complement each other for PSD moment estimation. Furthermore, the magnitude of δ can be utilized as an important calibration parameter for improving microphysical models of the melting layer.
2014: Using microwave backhaul links to optimize the performance of algorithms for rainfall estimation and attenuation correction. Journal of Atmospheric and Oceanic Technology, 31, 1748–1760.
The variability of raindrop size distributions and attenuation effects are the two major sources of uncertainty in radar-based quantitative precipitation estimation (QPE), even when dual-polarization radars are used. New methods are introduced to exploit measurements by commercial microwave radio links to reduce the uncertainties in both attenuation correction and rainfall estimation. The ratio α of specific attenuation A and specific differential phase KDP is the key parameter used in attenuation correction schemes and in the recently introduced R(A) algorithm. It is demonstrated that the factor α can be optimized using microwave links at Ku band oriented along radar radials with an accuracy of about 20–30%. Microwave links with arbitrary orientation can be utilized to optimize the intercepts in the R(KDP) and R(A) relations with an accuracy of about 25%. The performance of the suggested methods is tested using the polarimetric C-band radar operated by the German Weather Service on Mount Hohenpeissenberg in southern Germany and two radially oriented Ku-band microwave links from Ericsson GmbH.
2014: Investigations of backscatter differential phase in the melting layer. Journal of Applied Meteorology and Climatology, 53, 2344–2359.
Backscatter differential phase within the melting layer has been identified as a reliably measurable but still underutilized polarimetric variable. Polarimetric radar observations at X band in Germany and S band in the United States are presented, which show maximum observed δ of 8.5° at X band but up to 70° at S band. Dual-frequency observations at X and C bands in Germany and at C and S bands in the United States are compared in order to explore the regional frequency dependencies of the δ signature. Theoretical simulations based on usual assumptions about the microphysical composition of the melting layer cannot reproduce the observed large values of δ at the lower-frequency bands and also underestimate the enhancements in differential reflectivity ZDR and reductions in the cross-correlation coefficient ρhv. Simulations using a two-layer T-matrix code and a simple model for the representation of accretion can, however, explain the pronounced δ signatures at S and C bands in conjunction with small δ at X band. We conclude that the δ signature bears information about microphysical accretion and aggregation processes in the melting layer and the degree of riming of the snowflakes aloft.
2014: Information Content and Uncertainties in Thermodynamic Profiles and Liquid Cloud Properties Retrieved from the Ground-Based Atmospheric Emitted Radiance Interferometer (AERI). Journal of Applied Meteorology and Climatology, 53, 752–771, doi:10.1175/JAMC-D-13-0126.1.
The Atmospheric Emitted Radiance Interferometer (AERI) observes spectrally resolved downwelling radiance emitted by the atmosphere in the infrared portion of the electromagnetic spectrum. Profiles of temperature and water vapor, and cloud liquid water path and effective radius for a single liquid cloud layer, are retrieved using an optimal estimation–based physical retrieval algorithm from AERI-observed radiance data. This algorithm provides a full error covariance matrix for the solution, and both the degrees of freedom for signal and the Shannon information content. The algorithm is evaluated with both synthetic and real AERI observations. The AERI is shown to have approximately 85% and 70% of its information in the lowest 2 km of the atmosphere for temperature and water vapor profiles, respectively. In clear-sky situations, the mean bias errors with respect to the radiosonde profiles are less than 0.2 K and 0.3 g/kg for heights below 2 km for temperature and water vapor mixing ratio, respectively; the maximum root-mean-square errors are less than 1 K and 0.8 g/kg. The errors in the retrieved profiles in cloudy situations are larger, due in part to the scattering contribution to the downwelling radiance that was not accounted for in the forward model. Scattering is largest in one of the spectral regions used in the retrieval, however, and removing this spectral region results in a slight reduction of the information content but a considerable improvement in the accuracy of the retrieved thermodynamic profiles.
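The degrees of freedom for signal and Shannon information content reported by the retrieval follow from the averaging kernel of a linear optimal-estimation step. A generic sketch in the standard Rodgers formulation (not the paper's actual code; `K`, `Sa`, and `Se` denote the Jacobian, prior covariance, and observation-error covariance):

```python
import numpy as np

def dfs_and_shannon(K, Sa, Se):
    """Degrees of freedom for signal (DFS) and Shannon information
    content for a linear optimal-estimation retrieval."""
    Sa_inv = np.linalg.inv(Sa)
    Se_inv = np.linalg.inv(Se)
    # Gain matrix of the maximum a posteriori solution
    G = np.linalg.inv(K.T @ Se_inv @ K + Sa_inv) @ K.T @ Se_inv
    A = G @ K                      # averaging kernel
    dfs = np.trace(A)              # independent pieces of information
    # Shannon information content: H = -0.5 ln|I - A|
    H = 0.5 * np.log(np.linalg.det(np.linalg.inv(np.eye(len(Sa)) - A)))
    return dfs, H
```

The trace of the averaging kernel is how statements like "85% of the information in the lowest 2 km" are quantified: the diagonal of A distributes the DFS over retrieval levels.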
2014: Aircraft Evaluation of Ground-Based Raman Lidar Water Vapor Turbulence Profiles in Convective Mixed Layers. Journal of Atmospheric and Oceanic Technology, 31, 1078–1088, doi:10.1175/JTECH-D-13-00075.1.
High temporal and vertical resolution water vapor measurements by Raman and differential absorption lidar systems have been used to characterize the turbulent fluctuations in the water vapor mixing ratio field in convective mixed layers. Since daytime Raman lidar measurements are inherently noisy (due to solar background and weak signal strengths), the analysis approach needs to quantify and remove the contribution of the instrument noise in order to derive the desired atmospheric water vapor mixing ratio variance and skewness profiles. This is done using the approach outlined by Lenschow et al.; however, an intercomparison with in situ observations was not performed.
Water vapor measurements were made by a diode laser hygrometer flown on a Twin Otter aircraft during the Routine Atmospheric Radiation Measurement (ARM) Program Aerial Facility Clouds with Low Optical Water Depths Optical Radiative Observations (RACORO) field campaign over the ARM Southern Great Plains (SGP) site in 2009. Two days with Twin Otter flights were identified where the convective mixed layer was quasi stationary, and hence the 10-s, 75-m data from the SGP Raman lidar could be analyzed to provide profiles of water vapor mixing ratio variance and skewness. Airborne water vapor observations measured during level flight legs were compared to the Raman lidar data, demonstrating good agreement in both variance and skewness. The results also illustrate the challenges of comparing a point sensor making measurements over time to a moving platform making similar measurements horizontally.
2014: An improved algorithm for polar cloud-base detection by ceilometer over ice sheets. Atmospheric Measurement Techniques, 7, 1153–1167, doi:10.5194/amt-7-1153-2014.
Optically thin ice and mixed-phase clouds play an important role in polar regions due to their effect on cloud radiative impact and precipitation. Cloud-base heights can be detected by ceilometers, low-power backscatter lidars that run continuously and therefore have the potential to provide basic cloud statistics including cloud frequency, base height and vertical structure. The standard cloud-base detection algorithms of ceilometers are designed to detect optically thick liquid-containing clouds, while the detection of thin ice clouds requires an alternative approach. This paper presents the polar threshold (PT) algorithm that was developed to be sensitive to optically thin hydrometeor layers (minimum optical depth τ ≥ 0.01). The PT algorithm detects the first hydrometeor layer in a vertical attenuated backscatter profile exceeding a predefined threshold in combination with noise reduction and averaging procedures. The optimal backscatter threshold of 3 × 10⁻⁴ km⁻¹ sr⁻¹ for cloud-base detection near the surface was derived based on a sensitivity analysis using data from Princess Elisabeth, Antarctica, and Summit, Greenland. At higher altitudes where the average noise level is higher than the backscatter threshold, the PT algorithm becomes signal-to-noise ratio driven. The algorithm defines cloudy conditions as any atmospheric profile containing a hydrometeor layer at least 90 m thick. A comparison with relative humidity measurements from radiosondes at Summit illustrates the algorithm’s ability to significantly discriminate between clear-sky and cloudy conditions. Analysis of the cloud statistics derived from the PT algorithm indicates a year-round monthly mean cloud cover fraction of 72% (±10%) at Summit without a seasonal cycle. The occurrence of optically thick layers, indicating the presence of supercooled liquid water droplets, shows a seasonal cycle at Summit with a monthly mean summer peak of 40% (±4%).
The monthly mean cloud occurrence frequency in summer at Princess Elisabeth is 46% (±5%), which reduces to 12% (±2.5%) for supercooled liquid cloud layers. Our analyses furthermore illustrate the importance of optically thin hydrometeor layers located near the surface for both sites, with 87% of all detections below 500 m for Summit and 80% below 2 km for Princess Elisabeth. These results have implications for the use of satellite-based remotely sensed cloud observations, such as those from CloudSat, which may be insensitive to hydrometeors near the surface. The decrease of sensitivity with height, which is an inherent limitation of the ceilometer, does not have a significant impact on our results. This study highlights the potential of the PT algorithm to extract information in polar regions from various hydrometeor layers using measurements by the robust and relatively low-cost ceilometer instrument.
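The core thresholding logic of the PT algorithm as described above can be sketched as follows (a simplification: the published algorithm also includes noise reduction, profile averaging, and an SNR-driven regime at higher altitudes, none of which appear here):

```python
def cloud_base_pt(backscatter, heights, threshold=3e-4, min_depth=90.0):
    """Simplified polar-threshold-style cloud-base detection.

    backscatter : attenuated backscatter profile (km^-1 sr^-1),
                  assumed already noise-reduced and averaged
    heights     : gate heights (m), ascending
    Returns the base height of the first layer exceeding `threshold`
    continuously over at least `min_depth` metres, or None.
    """
    i, n = 0, len(heights)
    while i < n:
        if backscatter[i] >= threshold:
            j = i
            # Extend the candidate layer upward while above threshold
            while j + 1 < n and backscatter[j + 1] >= threshold:
                j += 1
            if heights[j] - heights[i] >= min_depth:
                return heights[i]     # layer thick enough: report its base
            i = j + 1                 # too thin: skip past it
        else:
            i += 1
    return None
```

The depth requirement is what rejects isolated noisy gates while still admitting the optically thin layers the algorithm targets.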
2014: Comparison of Next-Day Convection-Allowing Forecasts of Storm Motion on 1- and 4-km Grids. Weather and Forecasting, 29, 878–893, doi:10.1175/WAF-D-14-00011.1.
This study compares next-day forecasts of storm motion from convection-allowing models with 1- and 4-km grid spacing. A tracking algorithm is used to determine the motion of discrete storms in both the model forecasts and an analysis of radar observations. The distributions of both the raw storm motions and the deviations of these motions from the environmental flow are examined to determine the overall biases of the 1- and 4-km forecasts and how they compare to the observed storm motions. The mean storm speeds for the 1-km forecasts are significantly closer to the observed mean than those for the 4-km forecasts when viewed relative to the environmental flow/shear, but mostly for the shorter-lived storms. For storm directions, the 1-km forecast storms move similarly to the 4-km forecast storms on average. However, for the raw storm motions and those relative to the 0–6-km shear, results suggest that the 1-km forecasts may alleviate some of a clockwise (rightward) bias of the 4-km forecasts, particularly for those that do not deviate strongly from the 0–6-km shear vector. This improvement in a clockwise bias also is seen for the longer-lived storms, but is not seen when viewing the storm motions relative to the 850–300-hPa mean wind or Bunkers motion vector. These results suggest that a reduction from 4- to 1-km grid spacing can potentially improve forecasts of storm motion, but further analysis of closer storm analogs is needed to confirm these results and to explore specific hypotheses for their differences.
2014: Effects of resolution of satellite-based rainfall estimates on hydrologic modeling skill at different scales. Journal of Hydrometeorology, 15, 593–613, doi:10.1175/JHM-D-12-0113.1.
Uncertainty due to resolution of current satellite-based rainfall products is believed to be an important source of error in applications of hydrologic modeling and forecasting systems. A method to account for the input’s resolution and to accurately evaluate the hydrologic utility of satellite rainfall estimates is devised and analyzed herein. A radar-based Multisensor Precipitation Estimator (MPE) rainfall product (4 km, 1 h) was utilized to assess the impact of resolution of precipitation products on the estimation of rainfall and subsequent simulation of streamflow on a cascade of basins ranging from approximately 500 to 5000 km2. MPE data were resampled to match the Tropical Rainfall Measuring Mission’s (TRMM) 3B42RT satellite rainfall product resolution (25 km, 3 h) and compared with its native resolution data to estimate errors in rainfall fields. It was found that resolution degradation considerably modifies the spatial structure of rainfall fields. Additionally, a sensitivity analysis was designed to effectively isolate the error on hydrologic simulations due to rainfall resolution using a distributed hydrologic model. These analyses revealed that resolution degradation introduces a significant amount of error in rainfall fields, which propagated to the streamflow simulations as magnified bias and dampened aggregated error (RMSEs). Furthermore, the scale dependency of errors due to resolution degradation was found to intensify with increasing streamflow magnitudes. The hydrologic model was calibrated with satellite- and original-resolution MPE using a multiscale approach. The resulting simulations had virtually the same skill, suggesting that the effects of rainfall resolution can be accounted for during calibration of hydrologic models, which was further demonstrated with 3B42RT.
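The resampling step, degrading a fine-resolution field to a coarser grid by block averaging, can be illustrated as below. An integer block factor is assumed for simplicity; the actual 4-km-to-25-km MPE-to-3B42RT resampling is not an integer ratio and the study's method is more involved.

```python
import numpy as np

def degrade_resolution(field, factor):
    """Block-average a fine-resolution rain field to a coarser grid,
    then tile the coarse values back onto the fine grid, so the
    degraded field can be compared cell-by-cell with the original.

    field  : 2-D array whose dimensions are divisible by `factor`
    factor : integer coarsening ratio (e.g., 6 for ~4 km -> ~24 km)
    """
    ny, nx = field.shape
    coarse = field.reshape(ny // factor, factor,
                           nx // factor, factor).mean(axis=(1, 3))
    # Repeat each coarse cell so the output matches the input shape
    return np.repeat(np.repeat(coarse, factor, axis=0), factor, axis=1)
```

Block averaging conserves the domain-mean rainfall while smoothing spatial structure, which is exactly the resolution effect whose propagation into streamflow error the study isolates.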
2014: Monitoring and Understanding Changes in Extremes: Extratropical Storms, Winds, and Waves. Bulletin of the American Meteorological Society, 95, 377–386, doi:10.1175/BAMS-D-12-00162.1.
This scientific assessment examines changes in three climate extremes—extratropical storms, winds, and waves—with an emphasis on U.S. coastal regions during the cold season. There is moderate evidence of an increase in both extratropical storm frequency and intensity during the cold season in the Northern Hemisphere since 1950, with suggestive evidence of geographic shifts resulting in slight upward trends in offshore/coastal regions. There is also suggestive evidence of an increase in extreme winds (at least annually) over parts of the ocean since the early to mid-1980s, but the evidence over the U.S. land surface is inconclusive. Finally, there is moderate evidence of an increase in extreme waves in winter along the Pacific coast since the 1950s, but along other U.S. shorelines any tendencies are of modest magnitude compared with historical variability. The data for extratropical cyclones are considered to be of relatively high quality for trend detection, whereas the data for extreme winds and waves are judged to be of intermediate quality. In terms of physical causes leading to multidecadal changes, the level of understanding for both extratropical storms and extreme winds is considered to be relatively low, while that for extreme waves is judged to be intermediate. Since the ability to measure these changes with some confidence is relatively recent, understanding is expected to improve in the future for a variety of reasons, including increased periods of record and the development of “climate reanalysis” projects.
2014: Airborne Rain-Rate Measurement with a Wide-Swath Radar Altimeter. Journal of Atmospheric and Oceanic Technology, 31, 860–875, doi:10.1175/JTECH-D-13-00111.1.
The NOAA Wide-Swath Radar Altimeter (WSRA) uses 80 narrow beams spread over ±30° in the cross-track direction to generate raster lines of sea surface topography at a 10-Hz rate from which sea surface directional wave spectra are produced. A ±14° subset of the backscattered power data associated with the topography measurements is used to produce independent measurements of rain rate and sea surface mean square slope at 10-s intervals. Theoretical calculations of rain attenuation at the WSRA 16.15-GHz operating frequency using measured drop size distributions for both mostly convective and mostly stratiform rainfall demonstrate that the WSRA absorption technique for rain determination is relatively insensitive to both ambient temperature and the characteristics of the drop size distribution, in contrast to reflectivity techniques. The variation of the sea surface radar reflectivity in the vicinity of a hurricane is reviewed. Fluctuations in the sea surface scattering characteristics caused by changes in wind speed or the rain impinging on the surface cannot contaminate the rain measurement because they are calibrated out using the WSRA measurement of mean square slope. WSRA rain measurements from a NOAA WP-3D hurricane research aircraft off the North Carolina coast in Hurricane Irene on 26 August 2011 are compared with those from the stepped frequency microwave radiometer (SFMR) on the aircraft and the Next Generation Weather Radar (NEXRAD) National Mosaic and Multi-Sensor Quantitative Precipitation Estimation (QPE) system.
2014: A cloud-based global flood disaster community cyber-infrastructure: Development and demonstration. Environmental Modelling and Software, 58, 86–94, doi:10.1016/j.envsoft.2014.04.007.
Flood disasters have significant impacts on the development of communities globally. This study describes a public cloud-based flood cyber-infrastructure (CyberFlood) that collects, organizes, visualizes, and manages several global flood databases for authorities and the public in real time, providing location-based eventful visualization as well as statistical analysis and graphing capabilities. In order to expand and update the existing flood inventory, a crowdsourcing data collection methodology is employed that allows the public to report new flood events with smartphones or via the Internet, which is also intended to engage citizen-scientists so that they may become motivated and educated about the latest developments in satellite remote sensing and hydrologic modeling technologies. Our shared vision is to better serve the global water community with comprehensive flood information, aided by state-of-the-art cloud computing and crowdsourcing technology. The CyberFlood presents an opportunity to eventually modernize the existing paradigm used to collect, manage, analyze, and visualize water-related disasters.
2014: The Autocorrelation Spectral Density for Doppler-Weather-Radar Signal Analysis. IEEE Transactions on Geoscience and Remote Sensing, 52, 508–518, doi:10.1109/TGRS.2013.2241775.
Time-domain autocovariance processing is widely accepted as a computationally efficient method to estimate the first three spectral moments of Doppler weather radar signals (i.e., mean signal power, mean Doppler velocity, and spectrum width). However, when signals with different frequency content (e.g., ground clutter) contaminate the weather signal, spectral processing using the periodogram estimator of the power spectral density (PSD) is the preferred tool of analysis. After spectral processing (i.e., filtering), a PSD-based autocorrelation estimator is typically employed to produce unbiased estimates of the weather-signal spectral moments. However, the PSD does not convey explicit phase information, which has the potential to aid in the spectral analysis of radar signals. In this paper, the autocorrelation spectral density (ASD) is introduced for spectral analysis of weather-radar signals as a generalization of the classical PSD, and an ASD-based autocorrelation estimator is proposed to produce unbiased estimates of the weather-signal spectral moments. A significant advantage of the ASD over the PSD is that it provides explicit phase information that can be exploited to identify and remove certain types of contaminant signals. Thus, the ASD provides an alternative means for spectral analysis, which can lead to improved quality of meteorological data from weather radars.
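The time-domain autocovariance ("pulse pair") processing mentioned above estimates the first three spectral moments from the lag-0 and lag-1 autocorrelations; a textbook sketch is below. The velocity sign convention and the spectrum-width estimator are one common choice, not the only one, and no ground-clutter handling is included.

```python
import numpy as np

def pulse_pair_moments(iq, wavelength, prt):
    """Classical pulse-pair (autocovariance) spectral-moment estimators.

    iq         : complex I/Q samples from one range gate
    wavelength : radar wavelength (m)
    prt        : pulse repetition time (s)
    Returns (mean power, mean Doppler velocity, spectrum width).
    """
    iq = np.asarray(iq)
    r0 = np.mean(np.abs(iq) ** 2)              # lag-0: mean signal power
    r1 = np.mean(iq[1:] * np.conj(iq[:-1]))    # lag-1 autocorrelation
    va = wavelength / (4.0 * prt)              # Nyquist (aliasing) velocity
    vel = -va / np.pi * np.angle(r1)           # velocity from lag-1 phase
    # Width from the R0/|R1| ratio (narrow-spectrum approximation)
    width = (np.sqrt(2.0) * va / np.pi) * np.sqrt(abs(np.log(r0 / np.abs(r1))))
    return r0, vel, width
```

Because only the magnitude and phase of two autocorrelation lags are needed, this is far cheaper than forming a periodogram, which is why it is the default in operational processors; the ASD discussed in the paper recovers the phase information that the periodogram-based PSD discards.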
2014: Ensemble Kalman filter analyses and forecasts of a severe mesoscale convective system using different choices of microphysics schemes. Monthly Weather Review, 142, 3243–3263, doi:10.1175/MWR-D-13-00260.1.
A WRF-based ensemble data assimilation system is used to produce storm-scale analyses and forecasts of the 4-5 July 2003 severe mesoscale convective system (MCS) over Indiana and Ohio, which produced numerous high wind reports across the two states. Single-Doppler observations are assimilated into a 36-member, storm-scale ensemble during the developing stage of the MCS with the ensemble Kalman filter (EnKF) approach encoded in the Data Assimilation Research Testbed (DART). The storm-scale ensemble is constructed from mesoscale EnKF analyses produced from the assimilation of routinely available observations from land and marine stations, rawinsondes, and aircraft, in an attempt to better represent the complex mesoscale environment for this event.
Three EnKF simulations were performed using the NSSL 1- and 2-moment and Thompson microphysical schemes. All three experiments produce a linear convective segment at the final analysis time, similar to the observed system at 2300 UTC 4 July 2003. The higher-order schemes—in particular, the Thompson scheme—are better able to produce short-range forecasts of both the convective and stratiform components of the observed bowing MCS, and produce the smallest temperature errors when comparing surface observations and dropsonde data to corresponding model data. Only the higher-order microphysical schemes produce any appreciable rear-to-front flow in the stratiform precipitation region that trailed the simulated systems. Forecast performance by the three microphysics schemes is discussed in context of differences in microphysical composition produced in the stratiform precipitation regions of the rearward expanding MCSs.
2013: Prognostic equation for radar radial velocity derived by considering atmospheric refraction and earth curvature. Journal of the Atmospheric Sciences, 70, 3328–3338.
The prognostic equation for the radial velocity field observed with a Doppler radar is derived to include the effects of atmospheric refraction and earth curvature on radar beam height and slope angle. The derived equation, called the radial-velocity equation, contains a high-order small term that can be truncated. The truncated radial-velocity equation is shown to be much more accurate than its counterpart derived without considering the effects of atmospheric refraction and earth curvature. The truncated equation has the same concise form as its counterpart but remains sufficiently accurate as a useful dynamic constraint for radar wind analysis and assimilation (in normal situations) even out to the maximum 300-km radial range of operational WSR-88D radar scans, where its counterpart becomes erroneous.
2013: A two-step variational method for analyzing severely aliased radar velocity observations with small Nyquist velocities. Quarterly Journal of the Royal Meteorological Society, 139, 1904–1911, doi:10.1002/qj.2075.
By formulating the effect of radar velocity aliasing and the resulting zigzag discontinuities into the cost function for the velocity azimuth display (VAD) analysis, the previously developed alias-robust VAD analysis, called AR-VAD, can estimate the horizontal mean wind by directly fitting the VAD uniform-wind model to raw aliased radial-velocity observations. In this article, the AR-VAD analysis is further developed into a two-step alias-robust variational analysis, called AR-Var, to estimate the radial-velocity field beyond the VAD uniform-wind model from raw aliased radial-velocity observations on each range circle. In the first step, the original AR-VAD analysis is modified to fit the raw aliased radial-velocity observations around each of the two zero radial-velocity points on the selected range circle. The two analyzed radial-velocity fields are then combined into a single radial-velocity field not rigidly constrained by the VAD uniform-wind assumption. This combined radial-velocity field represents an improved fit to the observations over the entire range circle and thus can be used as the first-guess background to refine and perform the AR-Var analysis in the second step. The two-step AR-Var analysis can provide a reliable reference radial-velocity field for the reference check in radar velocity de-aliasing even when the Nyquist velocity is reduced below 12 m/s, and this is illustrated by both idealized and real examples.
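For reference, the underlying VAD uniform-wind model fits a sinusoid in azimuth to the radial velocities on one range circle; a minimal least-squares sketch for unaliased data follows. The AR-VAD and AR-Var analyses modify the cost function to handle aliasing, which this sketch deliberately does not attempt.

```python
import numpy as np

def vad_uniform_wind(azimuths_deg, vr, elev_deg=0.5):
    """Least-squares VAD fit of a uniform horizontal wind.

    Fits vr ~ cos(el) * (u*sin(az) + v*cos(az)) on one range circle
    and returns (u, v). Assumes unaliased radial velocities and
    neglects the vertical-velocity/fall-speed contribution.
    """
    az = np.radians(azimuths_deg)
    ce = np.cos(np.radians(elev_deg))
    # Design matrix: one column per wind component
    A = np.column_stack([ce * np.sin(az), ce * np.cos(az)])
    (u, v), *_ = np.linalg.lstsq(A, np.asarray(vr, dtype=float), rcond=None)
    return u, v
```

With aliased data this direct fit fails because the observed velocities fold at the Nyquist velocity, producing the zigzag discontinuities that the alias-robust cost function is built to absorb.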
2014: Fitting parametric vortex to aliased Doppler velocities scanned from hurricane. Monthly Weather Review, 142, 94–106.
An alias-robust least squares method that produces smaller errors than established methods is developed to produce reference radial velocities for automatically correcting raw aliased Doppler velocities scanned from hurricanes. This method estimates the maximum tangential velocity VM and its radial distance RM from the hurricane vortex center by fitting a parametric vortex model directly to raw aliased velocities at and around each selected vertical level. In this method, aliasing-caused zigzag discontinuities in the relationship between the observed and true radial velocities are formulated into the cost function by applying an alias operator to the entire analysis-minus-observation term, ensuring that the cost function is smooth and concave around the global minimum. Simulated radar velocity observations are used to examine the cost function geometry around the global minimum in the space of control parameters (VM, RM). The results show that the global minimum point can estimate the true (VM, RM) approximately if the hurricane vortex center location is approximately known and the hurricane core and vicinity areas are adequately covered by the radar scans, and the global minimum can be found accurately by an efficient descent algorithm as long as the initial guess is in the concave vicinity of the global minimum. The method is applied, with additional refinements, to automated dealiasing, and this utility is highlighted by an example applied to severely aliased radial velocities scanned from a hurricane.
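The (VM, RM) fit described above can be illustrated with a toy example. The parametric vortex here is an assumed Rankine-like profile (the paper's actual model is not specified in this summary), the radar geometry is reduced to a per-gate projection factor, and the global minimum is located by a coarse grid search rather than a descent algorithm.

```python
import numpy as np

def rankine(r, v_max, r_max):
    """Illustrative Rankine vortex: solid-body core, 1/r decay outside."""
    r = np.asarray(r, dtype=float)
    return np.where(r <= r_max, v_max * r / r_max,
                    v_max * r_max / np.maximum(r, 1e-6))

def alias(v, vn):
    """Fold velocities into the Nyquist interval [-vn, vn)."""
    return np.mod(v + vn, 2.0 * vn) - vn

def cost(v_max, r_max, r, proj, v_obs, vn):
    """Alias operator applied to the whole analysis-minus-observation
    term, as in the paper, so the cost stays smooth near the minimum."""
    return np.sum(alias(rankine(r, v_max, r_max) * proj - v_obs, vn) ** 2)

# Synthetic scan: a 60 m/s vortex with a 20 km core radius, observed
# with a small Nyquist velocity so most gates are aliased.
rng = np.random.default_rng(0)
r = rng.uniform(1.0, 100.0, 500)                 # gate distance from center (km)
proj = np.sin(rng.uniform(0.0, 2.0 * np.pi, 500))  # beam/tangent projection
v_obs = alias(rankine(r, 60.0, 20.0) * proj, 16.0)

# Coarse grid search over the control parameters (VM, RM).
grid = [(vm, rm) for vm in range(30, 91, 5) for rm in range(5, 41, 5)]
best = min(grid, key=lambda p: cost(p[0], p[1], r, proj, v_obs, 16.0))
```

On this noise-free example the grid search recovers the true parameters; with real data the concavity only holds near the global minimum, which is why the paper requires a good initial guess for its descent algorithm.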
2013: Prediction of convective storms at convection-resolving 1-km resolution over continental United States with radar data assimilation: An example case of 26 May 2008. Advances in Meteorology, 2013, 1–9, doi:10.1155/2013/259052.
For the first time ever, convection-resolving forecasts at 1-km grid spacing were produced in real time in spring 2009 by the Center for Analysis and Prediction of Storms (CAPS) at the University of Oklahoma. The forecasts assimilated both radial velocity and reflectivity data from all operational WSR-88D radars within a domain covering most of the continental United States. In preparation for the real-time forecasts, 1-km forecast tests were carried out using a case from spring 2008, and the forecasts with and without assimilated radar data are compared with corresponding 4-km forecasts produced in real time. Significant positive impact of radar data assimilation is found to last at least 24 hours. The 1-km grid produced a more accurate forecast of organized convection, especially in structure and intensity details. It successfully predicted an isolated severe-weather-producing storm nearly 24 hours into the forecast, which all ten members of the 4-km real-time ensemble forecasts failed to predict. This case, together with all available forecasts from the 2009 CAPS real-time forecasts, provides evidence of the value of both a convection-resolving 1-km grid and radar data assimilation for severe weather prediction for up to 24 hours.
2013: Intercomparison of the Version-6 and Version-7 TMPA precipitation products over high and low latitudes basins with independent gauge networks: Is the newer version better in both real-time and post-real-time analysis for water resources and hydrologic extremes?. Journal of Hydrology, 508, 77–87, doi:10.1016/j.jhydrol.2013.10.050.
The TRMM Multi-satellite Precipitation Analysis (TMPA) system underwent an important upgrade in early 2013, when the newest Version-7 TMPA products were formally released. In this study, the successive TMPA versions, the original Version-6 and the current Version-7, were evaluated and intercompared using independent gauge observation networks over a 7-yr (2003–09) period for two representative basins in China in different latitude bands. The TMPA products studied are the Version-6 and Version-7 real-time 3B42RT estimates (RTV6 and RTV7) and post-real-time 3B42 estimates (V6 and V7). Assessments indicate that RTV7 represents a substantial improvement over RTV6 with respect to systematic bias in the low-latitude Mishui basin, reaching accuracy levels similar to those of the gauge-adjusted research products. However, such improvement was not found in the high-latitude Laohahe basin, suggesting that the current Version-7 TMPA real-time estimates still have much room for improvement at high latitudes. On the other hand, the post-real-time research product V7, which is expected to provide better precipitation information for water resources management in ungauged regions, generally outperforms V6 over both gauged basins and has the best performance among the four standard TMPA estimates. The seasonal analyses show that the new Version-7 algorithm notably reduces the bias between TMPA and observations during winter months for the low-latitude Mishui basin, but fails to effectively alleviate the serious overestimation of winter precipitation occurring in the high-latitude Laohahe basin. The study also reveals that all the TMPA products significantly underestimate high rain rates over the Mishui basin, especially for strong typhoon events during summer. Thus, caution should be exercised when applying the current Version-7 TMPA products for simulation and prediction of hydrologic extremes associated with heavy rainfall, such as floods or landslides.
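The kinds of statistical measures used in such gauge-based evaluations can be sketched generically. The exact metric set is not listed in this summary; relative bias, root-mean-square error, and Pearson correlation are common choices and serve here only as an illustration.

```python
import numpy as np

def eval_precip(sat, gauge):
    """Three common verification measures for satellite QPE against
    gauge observations: relative bias (%), RMSE, and Pearson
    correlation coefficient.  Generic illustration only."""
    sat, gauge = np.asarray(sat, float), np.asarray(gauge, float)
    rel_bias = 100.0 * (sat.sum() - gauge.sum()) / gauge.sum()
    rmse = np.sqrt(np.mean((sat - gauge) ** 2))
    cc = np.corrcoef(sat, gauge)[0, 1]
    return rel_bias, rmse, cc

# Toy series: satellite overestimates the gauge total by 20%.
rb, rmse, cc = eval_precip([2.0, 4.0, 6.0], [1.0, 4.0, 5.0])
```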
2013: The Ensemble Kalman Filter Analyses and Forecasts of the 8 May 2003 Oklahoma City Tornadic Supercell Storm Using Single- and Double-Moment Microphysics Schemes. Monthly Weather Review, 141, 3388–3412, doi:10.1175/MWR-D-12-00237.1.
A combined mesoscale and storm-scale ensemble data-assimilation and prediction system is developed using the Advanced Research core of the Weather Research and Forecasting Model (WRF-ARW) and the ensemble adjustment Kalman filter (EAKF) from the Data Assimilation Research Testbed (DART) software package for a short-range ensemble forecast of an 8 May 2003 Oklahoma City, Oklahoma, tornadic supercell storm. Traditional atmospheric observations are assimilated into a 45-member mesoscale ensemble over a continental U.S. domain starting 3 days prior to the event. A one-way-nested 45-member storm-scale ensemble is initialized centered on the tornadic event at 2100 UTC on the day of the event. Three radar observation assimilation and forecast experiments are conducted at storm scale using a single-moment, a semi-double-moment, and a full double-moment bulk microphysics scheme. Results indicate that the EAKF initializes the supercell storm into the model with good accuracy after a 1-h-long radar observation assimilation window. The ensemble forecasts capture the movement of the main supercell storm that matches reasonably well with radar observations. The reflectivity structure of the supercell storm using a double-moment microphysics scheme appears to compare better to the observations than that using a single-moment scheme. In addition, the ensemble system predicts the probability of a strong low-level vorticity track of the tornadic supercell that correlates well with the observed rotation track. The rapid 3-min update cycle of the storm-scale ensemble from the radar observations seems to enhance the skill of the ensemble and the confidence of an imminent tornado threat. The encouraging results obtained from this study show promise for a short-range probabilistic storm-scale forecast of supercell thunderstorms, which is the main goal of NOAA's Warn-on-Forecast initiative.
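The ensemble adjustment Kalman filter used here updates the observed quantity deterministically and then regresses the increments onto the model state, following Anderson's EAKF formulation as implemented in DART. A scalar-observation sketch (ensemble size and all numbers are illustrative):

```python
import numpy as np

def eakf_obs_increments(ens, obs, obs_var):
    """Deterministic EAKF update for one observed scalar: shift and
    contract the prior ensemble so its mean and variance match the
    Gaussian posterior exactly, then return per-member increments."""
    prior_mean, prior_var = ens.mean(), ens.var(ddof=1)
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    contraction = np.sqrt(post_var / prior_var)
    return post_mean + contraction * (ens - prior_mean) - ens

def regress_onto_state(state_ens, obs_ens, obs_incs):
    """Spread observation-space increments to an unobserved state
    variable by linear regression on the prior joint ensemble."""
    slope = np.cov(state_ens, obs_ens, ddof=1)[0, 1] / obs_ens.var(ddof=1)
    return state_ens + slope * obs_incs

rng = np.random.default_rng(1)
obs_ens = rng.normal(5.0, 2.0, 45)   # 45-member prior of the observed quantity
incs = eakf_obs_increments(obs_ens, 7.0, 1.0)
updated = obs_ens + incs
# posterior mean moves toward the observation and the spread shrinks
assert updated.std(ddof=1) < obs_ens.std(ddof=1)
```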
2013: The Impact of Mesoscale Environmental Uncertainty on the Prediction of a Tornadic Supercell Storm Using Ensemble Data Assimilation Approach. Advances in Meteorology, 2013, 1–15, doi:10.1155/2013/731647.
Numerical experiments over the past years indicate that incorporating environmental variability is crucial for successful very short-range convective-scale forecasts. To explore the impact of model physics on the creation of environmental variability and its uncertainty, combined mesoscale-convective scale data assimilation experiments are conducted for a tornadic supercell storm. Two 36-member WRF-ARW model-based mesoscale EAKF experiments are conducted to provide background environments using either fixed or multiple physics schemes across the ensemble members. Two 36-member convective-scale ensembles are initialized using background fields from either the fixed-physics or the multiple-physics mesoscale ensemble analyses. Radar observations from four operational WSR-88Ds are assimilated into the convective-scale ensembles using the ARPS model-based 3DVAR system, and ensemble forecasts are launched. Results show that the ensemble with background fields from the multiple-physics ensemble provides more realistic forecasts of the significant tornado parameter, dryline structure, and near-surface variables than the ensemble from fixed-physics background fields. The probabilities of strong low-level updraft helicity from the multiple-physics ensemble correlate better with observed tornado and rotation tracks than the probabilities from the fixed-physics ensemble. This suggests that incorporating physics diversity across the ensemble can be important to successful probabilistic convective-scale forecasts of supercell thunderstorms, which is the main goal of NOAA's Warn-on-Forecast initiative.
2014: A Real-Time Algorithm for Merging Radar QPEs with Rain Gauge Observations and Orographic Precipitation Climatology. Journal of Hydrometeorology, 15, 1794–1809, doi:10.1175/JHM-D-13-0163.1.
High-resolution, accurate quantitative precipitation estimation (QPE) is critical for monitoring and prediction of flash floods and is one of the most important drivers for hydrological forecasts. Rain gauges provide a direct measure of precipitation at a point, which is generally more accurate than remotely sensed observations from radar and satellite. However, high-quality, accurate precipitation gauges are expensive to maintain, and their distributions are too sparse to capture gradients of convective precipitation that may produce flash floods. Weather radars provide precipitation observations with significantly higher resolutions than rain gauge networks, although the radar reflectivity is an indirect measure of precipitation and radar-derived QPEs are subject to errors in reflectivity–rain rate (Z–R) relationships. Further, radar observations are prone to blockages in complex terrain, which often result in a poor sampling of orographically enhanced precipitation. The current study aims at a synergistic approach to QPE by combining radar, rain gauge, and an orographic precipitation climatology. In the merged QPE, radar data depict high-resolution spatial distributions of the precipitation and rain gauges provide accurate precipitation measurements that correct potential biases in the radar QPE. The climatology provides a high-resolution background of the spatial precipitation distribution in the complex terrain where radar coverage is limited or nonexistent. The merging algorithm was tested on heavy precipitation events in different areas of the United States and provided a superior QPE to the individual components. The new QPE algorithm is fully automated and can be easily implemented in an operational system.
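The gauge-merging step can be sketched with a toy inverse-distance scheme: interpolate gauge-minus-radar differences onto the grid and add them back to the radar field. This is a deliberate simplification, not the operational algorithm, which also blends an orographic precipitation climatology where radar coverage is limited; all names and weights are invented for the example.

```python
import numpy as np

def idw_gauge_correction(grid_xy, radar, gauge_xy, gauge_val,
                         radar_at_gauge, power=2.0):
    """Additive bias correction of a radar QPE field using
    inverse-distance-weighted gauge-minus-radar differences."""
    bias = gauge_val - radar_at_gauge               # differences at gauge sites
    d = np.linalg.norm(grid_xy[:, None, :] - gauge_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-6) ** power
    w /= w.sum(axis=1, keepdims=True)               # normalize weights per grid cell
    return np.maximum(radar + w @ bias, 0.0)        # keep precipitation non-negative

# Tiny example: two gauges both read 2 mm higher than the radar, so the
# whole corrected field is raised by 2 mm.
grid = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
radar = np.array([5.0, 6.0, 7.0])
corrected = idw_gauge_correction(grid, radar,
                                 np.array([[0.5, 0.0], [1.5, 0.0]]),
                                 np.array([8.0, 9.0]),
                                 np.array([6.0, 7.0]))
```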
2013: Assimilation of passive microwave streamflow signals for improving flood forecasting: A first study in Cubango River Basin, Africa. Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 6, 2375–2390, doi:10.1109/JSTARS.2013.2251321.
Floods are among the most frequently occurring and disastrous natural hazards in the world. The overarching goal of this study is to investigate the utility of passive microwave AMSR-E signal and TRMM based precipitation estimates in improving flood prediction at the sparsely gauged Cubango River Basin, Africa. This is accomplished by coupling a widely used conceptual rainfall-runoff hydrological model with Ensemble Square Root Filter (EnSRF) to account for uncertainty in both forcing data and model initial conditions. Three experiments were designed to quantify the contributions of the AMSR-E signal to the flood prediction accuracy, in comparison to the benchmark assimilation of in-situ streamflow observations, for both “Open Loop” and “Assimilation” modules. In general, the EnSRF assimilation of both in-situ observations and AMSR-E signal-converted-streamflow effectively improved streamflow modeling performance in terms of three statistical measures. In order to further investigate AMSR-E signals' contribution to extreme events prediction skill, the upper 10th percentile daily streamflow was taken as the threshold. Results show significantly improved skill and detectability of floods as well as reduced false alarm rates. Given the global availability of satellite-based precipitation from current TRMM and future GPM, together with soil moisture information from the current AMSR-E and future SMAP mission at near real-time, this “first attempt” study at a sparsely gauged African basin shows that opportunities exist for an integrated application of a suite of satellite data in improving flood forecasting worldwide by careful fusion of remote sensing and in-situ observations.
2014: Weather Research and Forecasting model simulations of a rare springtime bow echo near the Great Salt Lake. Meteorological Applications, 10, 1–13, doi:10.1002/met.1455.
The semiarid climate and rugged terrain in the interior west of the United States do not favour the development of bow echoes, a type of convective storm associated with intense, damaging winds. However, on 21 April 2011, a bow echo associated with a fast-moving midtropospheric perturbation formed across the Great Salt Lake (GSL) in Utah, producing damaging winds along its path. Intrigued by the rarity of this bow echo and the inability of the North American Mesoscale model (NAM) to forecast it, the authors studied this event using available observations and simulations conducted with the Advanced Research Weather Research and Forecasting (WRF) model. Sensitivities to the microphysics schemes (MPSs), horizontal grid spacing, moisture content, and a physical lake model in the WRF model were examined. It was found that: (a) reduction in grid spacing from 12 and 4 km to 1 km, along with an improved depiction of low-level moisture, substantially improved the bow echo simulation, (b) the presence of the GSL did not impact bow echo development, and (c) the WRF model appeared to inherit a phase error in the passage of the midtropospheric perturbation from the NAM initial and lateral boundary conditions. The phase error resulted in a 1–2 h delay in the bow echo passage. These results highlight the difficulties in simulating such a bow echo event and suggest that similar challenges will be faced by regional climate downscaling studies of future extreme weather in the western United States.
2013: Development of a mesoscale ensemble data assimilation system at the Naval Research Laboratory. Weather and Forecasting, 28, 1322–1336.
An ensemble Kalman filter (EnKF) has been adopted and implemented at the Naval Research Laboratory (NRL) for mesoscale and storm-scale data assimilation to study the impact of ensemble assimilation of high-resolution observations, including those from Doppler radars, on storm prediction. The system has been improved during its implementation at NRL to further enhance its capability of assimilating various types of meteorological data. A parallel algorithm was also developed to increase the system's computational efficiency on multiprocessor computers. The EnKF has been integrated into the NRL mesoscale data assimilation system and extensively tested to ensure that the system works appropriately with the new observational data streams and forecast systems. An innovative procedure was developed to evaluate the impact of assimilated observations on ensemble analyses with no need to exclude any observations for independent validation (as required by the conventional evaluation based on data-denying experiments). The procedure was employed in this study to examine the impacts of ensemble size and localization on data assimilation, and the results reveal a very interesting relationship between the ensemble size and the localization length scale. All the tests conducted in this study demonstrate the capabilities of the EnKF as a research tool for mesoscale and storm-scale data assimilation with potential operational applications.
2013: A Diabatic Lagrangian Technique for the Analysis of Convective Storms. Part I: Description and Validation via an Observing System Simulation Experiment. Journal of Atmospheric and Oceanic Technology, 30, 2248–2265, doi:10.1175/JTECH-D-12-00194.1.
A diabatic Lagrangian analysis (DLA) technique for deriving potential temperature, water vapor and cloud water mixing ratios, and virtual buoyancy from three-dimensional time-dependent Doppler radar wind and reflectivity fields in storms is presented. The DLA method proceeds from heat and water substance conservation along discrete air trajectories via microphysical diabatic heating/cooling and simple damping and surface flux parameterizations in a parcel-following ground-relative reference frame to thermodynamic fields on a regular grid of trajectory endpoints at a common analysis time. Rain and graupel precipitation size distributions are parameterized from observed reflectivity at discrete Lagrangian points to simplify the cloud model–based microphysically driven heating and cooling rate calculations. The DLA approximates the precipitation size distributions from reflectivity assuming conventional inverse exponential size distributions and prescribed input intercept parameter values based on the output of a mature simulated storm. The DLA is demonstrated via an observing system simulation experiment (OSSE), and its analysis compares favorably with the known output buoyancy and water substance fields in the simulated storm case. The DLA-analyzed thermal–solenoidal horizontal vorticity tendency is of comparable magnitude to the corresponding modeled solenoidal vorticity tendency. A test application of the DLA to a radar-observed storm is presented in a companion paper (Part II).
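The reflectivity-to-size-distribution closure described above can be made concrete for rain under Rayleigh scattering: with N(D) = N0 exp(-ΛD) and a prescribed intercept N0, reflectivity is the sixth moment, Z = 720 N0 / Λ^7, so the slope Λ (and hence the water content) follows directly from Z. The Marshall-Palmer intercept used below is only a plausible stand-in for the storm-based values the DLA actually prescribes.

```python
import numpy as np

def exp_dsd_from_z(z_dbz, n0=8000.0):
    """Invert an inverse-exponential drop size distribution
    N(D) = n0 * exp(-lam * D) from reflectivity, assuming Rayleigh
    scattering so Z = 720 * n0 / lam**7.  Units: Z in mm^6 m^-3,
    D and 1/lam in mm, n0 in mm^-1 m^-3 (Marshall-Palmer default)."""
    z_lin = 10.0 ** (z_dbz / 10.0)
    lam = (720.0 * n0 / z_lin) ** (1.0 / 7.0)
    # liquid water content (g m^-3): (pi*rho_w/6) * third moment,
    # with rho_w = 1e-3 g mm^-3, giving pi * 1e-3 * n0 / lam**4
    lwc = np.pi * 1e-3 * n0 / lam ** 4
    return lam, lwc

# 30 dBZ rain gives a slope near 3.4 mm^-1 and a plausible rain content
lam, lwc = exp_dsd_from_z(30.0)
```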
2013: A Diabatic Lagrangian Technique for the Analysis of Convective Storms. Part II: Application to a Radar-Observed Storm. Journal of Atmospheric and Oceanic Technology, 30, 2266–2280, doi:10.1175/JTECH-D-13-00036.1.
A new diabatic Lagrangian analysis (DLA) technique that derives predicted fields of potential temperature, water vapor and cloud water mixing ratios, and virtual buoyancy from three-dimensional, time-dependent wind and reflectivity fields (see Part I) is applied to the radar-observed 9 June 2009 supercell storm during the Second Verification of the Origins of Rotation in Tornadoes Experiment (VORTEX2). The DLA diagnoses fields of rain and graupel content from radar reflectivity and predicts the evolution of analysis variables following radar-inferred air trajectories in the evolving storm with application of the diagnosed precipitation fields to calculate Lagrangian-frame microphysical processes. Simple damping and surface flux terms and initialization of trajectories from heterogeneous, parametric mesoscale analysis fields are also included in the predictive Lagrangian calculations. The DLA output compares favorably with observations of surface in situ temperature and water vapor mixing ratio and accumulated rainfall from a catchment rain gauge in the 9 June 2009 storm.
2014: Signal design to suppress coupling in the polarimetric phased array radar. Journal of Atmospheric and Oceanic Technology, 52, 1063–1077.
Two related modes of polarimetric signal transmission that reduce coupling between the orthogonal components of received signals are examined. For the surveillance scan, with its large unambiguous range and simultaneous transmission of H and V, pulse-to-pulse coding is suggested. It relaxes the requirement on cross-coupling isolation from about 45 dB to about 25 dB while preserving an unambiguous range of over 460 km. For application of systematic codes during Doppler data acquisition, time-multiplexed (back-to-back) H and V pulses are proposed. This approach also relaxes the cross-coupling isolation requirement to about 25 dB. These theoretically predicted values agree with those obtained by emulating the two schemes using oversampled time series data.
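The quoted 460-km unambiguous range follows from the pulse repetition time through the standard relation r_a = c T_s / 2; a quick arithmetic check:

```python
C = 2.998e8  # speed of light (m/s)

def unambiguous_range_km(prt_s):
    """Unambiguous range r_a = c * T_s / 2, returned in km."""
    return C * prt_s / 2.0 / 1000.0

def prt_for_range_ms(r_km):
    """Pulse repetition time needed for a given unambiguous range, in ms."""
    return 2.0 * r_km * 1000.0 / C * 1000.0

# Keeping r_a above 460 km requires a PRT of roughly 3.1 ms.
prt_ms = prt_for_range_ms(460.0)
```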
FY 2013 — 107 publications
2012: Introducing subgrid-scale cloud feedbacks to radiation for regional meteorological and climate modeling. Geophysical Research Letters, 39, L24809–L24809, doi:10.1029/2012GL054031.
Convective systems and associated cloudiness directly influence regional and local atmospheric radiation budgets, as well as dynamics and thermodynamics, through feedbacks. However, most subgrid-scale convective parameterizations in regional weather and climate models do not consider cumulus cloud feedbacks to radiation, resulting in biases in several meteorological parameters. We have incorporated this key feedback process into a convective parameterization and a radiation scheme in the Weather Research and Forecasting model, and evaluated the impacts of including this process in short-term weather and multiyear climate simulations. Introducing subgrid-scale convective cloud–radiation feedbacks leads to a more realistic simulation of the attenuation of downward surface shortwave radiation. Reduced surface shortwave radiation moderates the surface forcing for convection and results in a notable reduction in precipitation biases. Our research reveals a need for more in-depth consideration of the effects of subgrid-scale clouds on radiation, photolysis, cloud mixing, and aerosol indirect effects in regional meteorology/climate and air quality models.
2013: Polarimetric signatures above the melting layer in winter storms: an observational and modeling study. Journal of Applied Meteorology and Climatology, 52, 682–700.
Polarimetric radar observations above the melting layer in winter storms reveal enhanced differential reflectivity ZDR and specific differential phase KDP, collocated with reduced co-polar correlation coefficient ρhv; these signatures often appear as isolated “pockets”. High-resolution RHIs and vertical profiles of polarimetric variables are analyzed for a winter storm that occurred in Oklahoma on 27 January 2009, observed with the polarimetric WSR-88D (KOUN) in Norman, OK. The ZDR maximum and ρhv minimum are located within the temperature range of -10 °C to -15 °C, whereas the KDP maximum is located just below the ZDR maximum. These signatures are coincident with reflectivity factor ZH that increases towards the ground. A simple kinematical, one-dimensional, two-moment bulk microphysical model is developed and coupled with electromagnetic scattering calculations to explain the nature of the observed polarimetric signature. The microphysics model includes nucleation, deposition, and aggregation, and considers only ice phase hydrometeors. Vertical profiles of the polarimetric radar variables (ZH, ZDR, KDP and ρhv) are calculated using the output from the microphysical model. The base model run reproduces the general profile and magnitude of the observed ZH and ρhv, and the correct shape (but not magnitude) of ZDR and KDP. Several sensitivity experiments are presented to determine whether the modeled signatures of all variables can match the observed ones. However, the model is incapable of matching both the observed magnitude and shape of all polarimetric variables. This implies that some processes not included in the model (such as secondary ice generation) are important in producing the signature.
2013: July 2012 Greenland melt extent enhanced by low-level liquid clouds. Nature, 496, 83–86, doi:10.1038/nature12002.
Melting of the world's major ice sheets can affect human and environmental conditions by contributing to sea-level rise. In July 2012, an historically rare period of extended surface melting was observed across almost the entire Greenland ice sheet, raising questions about the frequency and spatial extent of such events. Here we show that low-level clouds consisting of liquid water droplets (‘liquid clouds’), via their radiative effects, played a key part in this melt event by increasing near-surface temperatures. We used a suite of surface-based observations, remote sensing data, and a surface energy-balance model. At the critical surface melt time, the clouds were optically thick enough and low enough to enhance the downwelling infrared flux at the surface. At the same time they were optically thin enough to allow sufficient solar radiation to penetrate through them and raise surface temperatures above the melting point. Outside this narrow range in cloud optical thickness, the radiative contribution to the surface energy budget would have been diminished, and the spatial extent of this melting event would have been smaller. We further show that these thin, low-level liquid clouds occur frequently, both over Greenland and across the Arctic, being present around 30–50 per cent of the time. Our results may help to explain the difficulties that global climate models have in simulating the Arctic surface energy budget, particularly as models tend to under-predict the formation of optically thin liquid clouds at super-cooled temperatures—a process potentially necessary to account fully for temperature feedbacks in a warming Arctic climate.
2013: Tornado Damage Estimation Using Polarimetric Radar. Weather and Forecasting, 28, 139–158.
This study investigates the use of tornadic debris signature (TDS) parameters to estimate tornado damage severity using Norman, Oklahoma (KOUN), polarimetric radar data (polarimetric version of the Weather Surveillance Radar-1988 Doppler radar). Several TDS parameters are examined, including parameters based on the 10th or 90th percentiles of polarimetric variables (lowest tilt TDS parameters) and TDS parameters based on the TDS volumetric coverage (spatial TDS parameters). Two highly detailed National Weather Service (NWS) damage surveys are compared to TDS parameters. The TDS parameters tend to be correlated with the enhanced Fujita scale (EF) rating. The 90th percentile reflectivity, TDS height, and TDS volume increase during tornado intensification and decrease during tornado dissipation. For 14 tornado cases, the maximum or minimum TDS parameter values are compared to the tornado’s EF rating. For tornadoes with a higher EF rating, higher maximum values of the 90th percentile ZHH, TDS height, and volume, as well as lower minimum values of 10th percentile ρHV and ZDR, are observed. Maxima in spatial TDS parameters are observed after periods of severe, widespread tornado damage for violent tornadoes. This paper discusses how forecasters could use TDS parameters to obtain near-real-time information about tornado damage severity and spatial extent.
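The lowest-tilt TDS parameters described above reduce to percentile statistics over debris-flagged radar gates. A minimal sketch (gate selection and quality control, which the paper handles carefully, are omitted; inputs are assumed to be debris-signature gates already):

```python
import numpy as np

def tds_parameters(z_hh, z_dr, rho_hv):
    """Lowest-tilt TDS parameters of the kind compared to EF rating:
    90th percentile of reflectivity and 10th percentiles of ZDR and
    co-polar correlation coefficient over debris-signature gates."""
    return (np.percentile(z_hh, 90),
            np.percentile(z_dr, 10),
            np.percentile(rho_hv, 10))

# Toy gate values for one low-level scan through a debris signature.
z90, zdr10, rho10 = tds_parameters(
    np.array([45.0, 50.0, 55.0, 60.0]),   # ZHH (dBZ)
    np.array([0.5, 0.1, -0.2, 0.3]),      # ZDR (dB)
    np.array([0.60, 0.75, 0.82, 0.55]))   # rho_hv
```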
2013: Severe thunderstorms and climate change. Atmospheric Research, 123, 129–138, doi:10.1016/j.atmosres.2012.04.002.
As the planet warms, it is important to consider possible impacts of climate change on severe thunderstorms and tornadoes. To further that discussion, the current distribution of severe thunderstorms as a function of large-scale environmental conditions is presented. Severe thunderstorms are much more likely to form in environments with large values of convective available potential energy (CAPE) and deep-tropospheric wind shear. Tornadoes and large hail are preferred in high-shear environments and non-tornadic wind events in low shear. Further, the intensity of tornadoes and hail, given that they occur, tends to be almost entirely a function of the shear and only weakly depends on the thermodynamics. Climate model simulations suggest that CAPE will increase in the future and the wind shear will decrease. Detailed analysis has suggested that the CAPE change will lead to more frequent environments favorable for severe thunderstorms, but the strong dependence on shear for tornadoes, particularly the strongest ones, and hail means that the interpretation of how individual hazards will change is open to question. The recent development of techniques to use higher-resolution models to estimate the occurrence of storms of various kinds is discussed. Given the large interannual variability in environments and occurrence of events, caution is urged in interpreting the observational record as evidence of climate change.
2012: Simulated vortex detection using a four-face phased-array Doppler radar. Weather and Forecasting, 27, 1598–1603, doi:10.1175/WAF-D-12-00059.1.
The National Weather Radar Testbed was established in Norman, Oklahoma, in 2002 to evaluate, in part, the feasibility of eventually replacing mechanically scanned parabolic antennas with electronically scanned phased-array antennas on weather surveillance radars. If a phased-array antenna system is to replace the current antenna, among the important decisions that must be made are the design (flat faces, cylinder, etc.) that will be needed to cover 360° in azimuth and the choice of an acceptable beamwidth. Investigating the flat-face option, four faces seem to be a reasonable choice for providing adequate coverage. To help with the beamwidth decision-making process, the influence of beamwidth on the resolution of various-sized simulated vortices is investigated. It is found that the half-power beamwidth across the antenna should be no more than 1.0° (equating to a broadside beamwidth of 0.75°) in order to provide National Weather Service forecasters with at least the same quality of data resolution that is currently available for making tornado and severe storm warnings.
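The relation between the 1.0° worst-case and 0.75° broadside beamwidths quoted above follows from the approximate 1/cos(θ) broadening of a flat phased-array beam steered off broadside, with each of four faces covering about ±45°:

```python
import math

def effective_beamwidth_deg(broadside_deg, steer_deg):
    """Approximate effective beamwidth of a flat-face phased array
    steered steer_deg off broadside: the projected aperture shrinks
    by cos(steer), so the beam broadens by roughly 1/cos(steer)."""
    return broadside_deg / math.cos(math.radians(steer_deg))

# A 0.75 deg broadside beam broadens to about 1.06 deg at the 45 deg
# face edge, i.e. the ~1.0 deg worst-case width discussed above.
edge_width = effective_beamwidth_deg(0.75, 45.0)
```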
2012: Simulation of Dryline Misovortex Dynamics and Cumulus Formation. Monthly Weather Review, 140, 3525–3551, doi:10.1175/MWR-D-11-00189.1.
A dryline and misocyclones have been simulated in a cloud-resolving model by applying specified initial and time-dependent lateral boundary conditions obtained from analyses of the 22 May 2002 International H2O Project (IHOP_2002) dataset. The initial and lateral boundary conditions were obtained from a combination of the time–spaced Lagrangian analyses for temperature and moisture with horizontal velocities from multiple-Doppler wind syntheses. The simulated dryline, horizontal dry-convective rolls (HCRs) and open cells (OCCs), misocyclones, and cumulus clouds are similar to the corresponding observed features. The misocyclones move northward at nearly the mean boundary layer (BL) wind speed, rotate dryline gradients owing to their circulations, and move the local dryline eastward via their passage. Cumuli develop along a secondary dryline, along HCR and OCC segments between the primary and secondary drylines, along HCR and OCC segments that have moved over the dryline, and within the dryline updraft. After the initial shearing instability develops, misocyclogenesis proceeds from tilting and stretching of vorticity by the persistent secondary dryline circulation. The resulting misocyclone evolution is discussed.
2013: The Atmospheric Radiation Measurement (ARM) program network of microwave radiometers: Instrumentation, data, and retrievals. Atmos. Meas. Tech., 6, 2359–2372, doi:10.5194/amt-6-2359-2013.
The Climate Research Facility of the US Department of Energy's Atmospheric Radiation Measurement (ARM) Program operates a network of ground-based microwave radiometers. Data and retrievals from these instruments have been available to the scientific community for almost 20 yr. In the past five years the network has expanded to include a total of 22 microwave radiometers deployed in various locations around the world. The new instruments cover a frequency range between 22 and 197 GHz and are consistently and automatically calibrated. The latest addition to the network is a new generation of three-channel radiometers, currently in the early stage of deployment at all ARM sites. The network has been specifically designed to achieve increased accuracy in the retrieval of precipitable water vapor (PWV) and cloud liquid water path (LWP), with the long-term goal of providing the scientific community with reliable, calibrated radiometric data and retrievals of important geophysical quantities with well-characterized uncertainties. The radiometers provide high-quality, continuous datasets that can be utilized in a wealth of applications and scientific studies. This paper presents an overview of the microwave instrumentation, calibration procedures, data, and retrievals that are available for download from the ARM data archive.
2013: Evolution of Lightning Activity and Storm Charge Relative to Dual-Doppler Analysis of a High-Precipitation Supercell Storm. Monthly Weather Review, 141, 2199–2223, doi:10.1175/MWR-D-12-00258.1.
A high-precipitation tornadic supercell storm was observed on 29–30 May 2004 during the Thunderstorm Electrification and Lightning Experiment. Observational systems included the Oklahoma Lightning Mapping Array, mobile balloon-borne soundings, and two mobile C-band radars. The spatial distribution and evolution of lightning are related to storm kinematics and microphysics, specifically through regions of microphysical charging and the location and geometry of those charge regions. Lightning flashes near the core of this storm were extraordinarily frequent, but tended to be of shorter duration and smaller horizontal extent than typical flashes elsewhere. This is hypothesized to be due to the charge being in many small pockets, with opposite polarities of charge close together in adjoining pockets. Thus, each polarity of lightning leader could propagate only a relatively short distance before reaching regions of unfavorable electric potential. In the anvil, however, lightning extended tens of kilometers from the reflectivity cores in roughly horizontal layers, consistent with the charge spreading through the anvil in broad sheets. The strong, consistent updraft of this high-precipitation supercell storm combined with the large hydrometeor concentrations to produce the extremely high flash rates observed during the analysis period. The strength and size of the updraft also contributed to unique lightning characteristics such as the transient hole of reduced lightning density and discharges in the overshooting top.
2013: Supplementing flash flood reports with impact classifications. Journal of Hydrology, 477, 1–16, doi:10.1016/j.jhydrol.2012.09.036.
In recent years, there has been an increase in flash flood impacts, even as our ability to forecast events and warn areas at risk increases. This increase results from a combination of extreme events and the exposure of vulnerable populations. The issues of exposure and vulnerability to flash floods are not trivial because environmental circumstances in such events are specific and complex enough to challenge the general understanding of natural risks. Therefore, it seems essential to consider physical processes of flash floods concurrently with the impacts they trigger. This paper takes a first step in addressing this need by creating and testing the coherence of an impact-focused database based on two pre-existing public and expert-based survey datasets: the Severe Hazards Analysis and Verification Experiment (SHAVE) and the US National Weather Service (NWS) Storm Data. The SHAVE initiative proposes a new method for collecting near-real-time high-resolution observations on both environmental circumstances and their disastrous consequences (material and human losses) to evaluate radar-based forecasting tools. Forecast verification tools and methods are needed to pursue improving the spatial and temporal accuracy of forecasts. Nevertheless, by enhancing SHAVE and NWS datasets with socially and spatially relevant information, we aim to improve the ability of future forecasts to predict the amount and types of impacts.
This paper describes the procedures developed to classify and rank the impacts from the least to the most severe, then to verify the coherence and relevance of the impact-focused SHAVE dataset via cross-tabulation analysis of reported variables and GIS-sampled spatial characteristics. By crossing impact categories with socio-spatial characteristics, this analysis showed first benchmarks for the use of exposure layers in future flash flood impact forecasting models. The enhanced impact-focused datasets were used to test the capabilities of flash flood forecasting tools in predicting different categories of impacts for two extreme cases of flash flooding in Oklahoma, USA. Results showed a general tendency for the more severe impacts to be associated to higher mean exceedances over tool values. This means that, at least for these particular case studies, the tools were able to make a distinction between less severe and more severe impacts. Finally, a critical analysis of the NWS and SHAVE data collection methodologies was completed and challenges for future work were identified.
2013: Statistical and physical analysis of the vertical structure of precipitation in the mountainous West region of the United States using 11+ years of spaceborne observations from TRMM precipitation radar. Journal of Applied Meteorology and Climatology, 52, 408–424, doi:10.1175/JAMC-D-12-095.1.
This study presents a statistical analysis of the vertical structure of precipitation measured by NASA–Japan Aerospace Exploration Agency’s (JAXA) Tropical Rainfall Measuring Mission (TRMM) precipitation radar (PR) in the region of southern California, Arizona, and western New Mexico, where the ground-based Next-Generation Radar (NEXRAD) network finds difficulties in accurately measuring surface precipitation because of beam blockages by complex terrain. This study has applied TRMM PR version-7 products 2A23 and 2A25 from 1 January 2000 to 26 October 2011. The seasonal, spatial, intensity-related, and type-related variabilities are characterized for the PR vertical profile of reflectivity (VPR) as well as the heights of storm, freezing level, and bright band. The intensification and weakening of reflectivity at low levels in the VPR are studied through fitting physically based VPR slopes. Major findings include the following: precipitation type is the most significant factor determining the characteristics of VPRs, the shape of VPRs also influences the intensity of surface rainfall rates, the characteristics of VPRs have a seasonal dependence with strong similarities between the spring and autumn months, and the spatial variation of VPR characteristics suggests that the underlying terrain has an impact on the vertical structure. The comprehensive statistical and physical analysis strengthens the understanding of the vertical structure of precipitation and advocates for the approach of VPR correction to improve surface precipitation estimation in complex terrain.
2013: Empirical conversion of the vertical profile of reflectivity from Ku-band to S-band frequency. Journal of Geophysical Research, 118, 1814–1825, doi:10.1002/jgrd.50138.
This paper presents an empirical method for converting reflectivity from Ku-band (13.8 GHz) to S-band (2.8 GHz) for several hydrometeor species, which facilitates the incorporation of Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR) measurements into quantitative precipitation estimation (QPE) products from the U.S. Next-Generation Radar (NEXRAD). The development of empirical dual-frequency relations is based on theoretical simulations, which have assumed appropriate scattering and microphysical models for liquid and solid hydrometeors (raindrops, snow, and ice/hail). Particle phase, shape, orientation, and density (especially for snow particles) have been considered in applying the T-matrix method to compute the scattering amplitudes. Gamma particle size distribution (PSD) is utilized to model the microphysical properties in the ice region, melting layer, and raining region of precipitating clouds. The variability of PSD parameters is considered to study the characteristics of dual-frequency reflectivity, especially the variations in radar dual-frequency ratio (DFR). The empirical relations between DFR and Ku-band reflectivity have been derived for particles in different regions within the vertical structure of precipitating clouds. The reflectivity conversion using the proposed empirical relations has been tested using real data collected by TRMM-PR and a prototype polarimetric WSR-88D (Weather Surveillance Radar 88 Doppler) radar, KOUN. The processing and analysis of collocated data demonstrate the validity of the proposed empirical relations and substantiate their practical significance for reflectivity conversion, which is essential to the TRMM-based vertical profile of reflectivity correction approach in improving NEXRAD-based QPE.
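In practice, the conversion reduces to adding a species-dependent dual-frequency ratio (DFR = Z_S − Z_Ku, in dB) to the Ku-band reflectivity. A minimal sketch of that step follows; the polynomial form is typical of such empirical fits, but the coefficient values here are hypothetical stand-ins, not the relations derived in the paper.

```python
# Hypothetical polynomial coefficients for DFR(Z_Ku) in dB; real relations
# are fit per hydrometeor species from T-matrix scattering simulations.
DFR_COEF = {
    "rain": (0.0, 0.005, 0.0004),
    "snow": (0.5, 0.020, 0.0010),
}

def ku_to_s(z_ku_dbz, species="rain"):
    """Convert Ku-band reflectivity (dBZ) to an S-band estimate by adding
    an empirical dual-frequency ratio evaluated at the Ku reflectivity."""
    a0, a1, a2 = DFR_COEF[species]
    dfr = a0 + a1 * z_ku_dbz + a2 * z_ku_dbz ** 2
    return z_ku_dbz + dfr
```

In the Rayleigh regime (small particles, low reflectivity) the DFR tends toward zero, which the rain coefficients above loosely mimic; the larger snow offset reflects the stronger non-Rayleigh effects of large, low-density ice particles at Ku band.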
2013: Hydrological data assimilation with the Ensemble Square-Root-Filter: Use of streamflow observations to update model states for real-time flash flood forecasting. Advances in Water Resources, 59, 209–220, doi:10.1016/j.advwatres.2013.06.010.
The objective of the study is to evaluate the potential of a data assimilation system for real-time flash flood forecasting over small watersheds by updating model states. To this end, the Ensemble Square-Root-Filter (EnSRF) based on the Ensemble Kalman Filter (EnKF) technique was coupled to a widely used conceptual rainfall-runoff model called HyMOD. Two small watersheds susceptible to flash flooding from America and China were selected in this study. The modeling and observational errors were considered in the framework of data assimilation, followed by an ensemble size sensitivity experiment. Once the appropriate model error and ensemble size was determined, a simulation study focused on the performance of a data assimilation system, based on the correlation between streamflow observation and model states, was conducted. The EnSRF method was implemented within HyMOD and results for flash flood forecasting were analyzed, where the calibrated streamflow simulation without state updating was treated as the benchmark or nature run. Results for twenty-four flash-flood events in total from the two watersheds indicated that the data assimilation approach effectively improved the predictions of peak flows and the hydrographs in general. This study demonstrated the benefit and efficiency of implementing data assimilation into a hydrological model to improve flash flood forecasting over small, instrumented basins with potential application to real-time alert systems.
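The EnSRF update can be sketched for a single scalar observation such as streamflow, following the standard square-root form in which the ensemble mean is updated with the Kalman gain and the perturbations with a reduced gain, avoiding perturbed observations. The state vector and observed index below are hypothetical placeholders for the HyMOD storages and simulated discharge.

```python
import numpy as np

def ensrf_update(ens, h_idx, y_obs, r_var):
    """Ensemble Square-Root Filter update for one scalar observation.

    ens   : (n_state, n_mem) ensemble of model states
    h_idx : index of the observed state element (e.g. simulated streamflow)
    y_obs : observed value
    r_var : observation-error variance
    """
    n_mem = ens.shape[1]
    x_mean = ens.mean(axis=1, keepdims=True)
    xp = ens - x_mean                        # ensemble perturbations
    hx = ens[h_idx]                          # observation-space ensemble
    hxp = hx - hx.mean()
    pyy = hxp @ hxp / (n_mem - 1)            # forecast obs-space variance
    pxy = xp @ hxp / (n_mem - 1)             # state-obs covariance vector
    k = pxy / (pyy + r_var)                  # Kalman gain
    alpha = 1.0 / (1.0 + np.sqrt(r_var / (pyy + r_var)))
    x_mean_a = x_mean[:, 0] + k * (y_obs - hx.mean())
    xp_a = xp - np.outer(alpha * k, hxp)     # reduced gain for perturbations
    return x_mean_a[:, None] + xp_a
```

Because the perturbation gain is scaled by alpha < 1, the analysis spread is reduced without adding noise to the observation, which is the defining feature of the square-root variant of the EnKF.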
2013: Evaluation and uncertainty estimation of NOAA/NSSL next-generation National Mosaic Quantitative Precipitation Estimation product (Q2) over the Continental United States. Journal of Hydrometeorology, 14, 1308–1322, doi:10.1175/JHM-D-12-0150.1.
Quantitative precipitation estimation (QPE) products from the next-generation National Mosaic and QPE system (Q2) are cross-compared to the operational, radar-only product of the National Weather Service (Stage II) using the gauge-adjusted and manual quality-controlled product (Stage IV) as a reference. The evaluation takes place over the entire conterminous United States (CONUS) from December 2009 to November 2010. The annual comparison of daily Stage II precipitation to the radar-only Q2Rad product indicates that both have small systematic biases (absolute values < 8%), but the random errors with Stage II are much greater, as noted with a root-mean-squared difference of 4.5 mm day⁻¹ compared to 1.1 mm day⁻¹ with Q2Rad and a lower correlation coefficient (0.20 compared to 0.73). The Q2 logic of identifying precipitation types as being convective, stratiform, or tropical at each grid point and applying differential Z–R equations has been successful in removing regional biases (i.e., overestimated rainfall from Stage II east of the Appalachians) and greatly diminishes seasonal bias patterns that were found with Stage II. Biases and radar artifacts along the coastal mountain and intermountain chains were not mitigated with rain gauge adjustment and thus require new approaches by the community. The evaluation identifies a wet bias by Q2Rad in the central plains and the South and then introduces intermediate products to explain it. Finally, this study provides estimates of uncertainty using the radar quality index product for both Q2Rad and the gauge-corrected Q2RadGC daily precipitation products. This error quantification should be useful to the satellite QPE community who use Q2 products as a reference.
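The statistics quoted in this evaluation (relative bias, root-mean-squared difference, and correlation between an estimate and a reference field) can be computed with a short routine. A minimal sketch; the function name and the missing-data handling are assumptions for illustration.

```python
import numpy as np

def verify_qpe(est, ref):
    """Compare a daily QPE field against a reference: returns relative
    bias (%), root-mean-squared difference (same units as input), and
    Pearson correlation coefficient."""
    est = np.asarray(est, dtype=float).ravel()
    ref = np.asarray(ref, dtype=float).ravel()
    ok = np.isfinite(est) & np.isfinite(ref)   # skip missing grid points
    est, ref = est[ok], ref[ok]
    bias_pct = 100.0 * (est.sum() - ref.sum()) / ref.sum()
    rmsd = np.sqrt(np.mean((est - ref) ** 2))
    corr = np.corrcoef(est, ref)[0, 1]
    return bias_pct, rmsd, corr
```

A product can be nearly unbiased in the mean yet have large random errors, which is why bias and RMSD are reported separately in comparisons like the Stage II versus Q2Rad one above.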
2012: An objective high-resolution hail climatology of the contiguous United States. Weather and Forecasting, 27, 1235–1248, doi:10.1175/WAF-D-11-00151.1.
The threat of damaging hail from severe thunderstorms affects many communities and industries on a yearly basis, with annual economic losses in excess of $1 billion (U.S. dollars). Past hail climatology has typically relied on the National Oceanic and Atmospheric Administration/National Climatic Data Center’s (NOAA/NCDC) Storm Data publication, which has numerous reporting biases and nonmeteorological artifacts. This research seeks to quantify the spatial and temporal characteristics of contiguous United States (CONUS) hail fall, derived from multiradar multisensor (MRMS) algorithms for several years during the Next-Generation Weather Radar (NEXRAD) era, leveraging the Multiyear Reanalysis of Remotely Sensed Storms (MYRORSS) dataset at NOAA’s National Severe Storms Laboratory (NSSL). The primary MRMS product used in this study is the maximum expected size of hail (MESH). The preliminary climatology includes 42 months of quality controlled and reprocessed MESH grids, which spans the warm seasons for four years (2007–10), covering 98% of all Storm Data hail reports during that time. The dataset has 0.01° latitude × 0.01° longitude × 31 vertical levels spatial resolution, and 5-min temporal resolution. Radar-based and reports-based methods of hail climatology are compared. MRMS MESH demonstrates superior coverage and resolution over Storm Data hail reports, and is largely unbiased. The results reveal a broad maximum of annual hail fall in the Great Plains and a diminished secondary maximum in the Southeast United States. Potential explanations for the differences in the two methods of hail climatology are also discussed.
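MESH, the radar product underlying this climatology, is derived from the severe hail index (SHI), a thermally weighted vertical integral of reflectivity-based hail kinetic energy flux, with hail size obtained from the power-law fit of Witt et al. (1998). The sketch below follows the general shape of that formulation only; the simplified trapezoidal integration and the toy sounding heights are illustrative, not the operational MRMS implementation.

```python
import math

def _ramp(x, lo, hi):
    """Linear 0-1 ramp, used for both reflectivity and thermal weights."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def mesh_from_column(z_dbz, heights_m, h0_m, hm20_m):
    """Maximum expected size of hail (mm) from one reflectivity column.

    z_dbz     : reflectivity at each level (dBZ)
    heights_m : level heights (m), ascending
    h0_m      : height of the 0 degC level
    hm20_m    : height of the -20 degC level
    """
    shi = 0.0
    for i in range(1, len(heights_m)):
        dh = heights_m[i] - heights_m[i - 1]
        z = 0.5 * (z_dbz[i] + z_dbz[i - 1])
        h = 0.5 * (heights_m[i] + heights_m[i - 1])
        # hail kinetic energy flux, weighted toward reflectivity > 40 dBZ
        e_dot = 5e-6 * 10 ** (0.084 * z) * _ramp(z, 40.0, 50.0)
        # thermal weight favors the hail-growth layer above the 0 degC level
        shi += 0.1 * _ramp(h, h0_m, hm20_m) * e_dot * dh
    return 2.54 * math.sqrt(shi)  # Witt et al. (1998) power-law fit, mm
```

Gridding the daily maximum of this quantity over many storm days is, in essence, how a radar-based hail climatology such as the MYRORSS-derived one is accumulated.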
2013: On the predictability of supercell thunderstorm evolution. Journal of the Atmospheric Sciences, 70, 1993–2011, doi:10.1175/JAS-D-12-0166.1.
Supercell thunderstorms produce a disproportionate amount of the severe weather in the United States, and accurate prediction of their movement and evolution is needed to warn the public of their hazards. This study explores the practical predictability of supercell thunderstorm forecasts in the presence of typical errors in the preconvective environmental conditions. The Advanced Research Weather Research and Forecasting model (ARW-WRF) is run at 1-km grid spacing and a control run of a supercell thunderstorm is produced using a horizontally homogeneous environment. Forecast errors from supercell environments derived from the 13-km Rapid Update Cycle (RUC) valid at 0000 UTC for forecast lead times up to 3 h are used to define the environmental errors, and 100 runs initialized with environmental perturbations characteristic of those errors are produced for each lead time. The simulations are analyzed to determine the spread and practical predictability of supercell thunderstorm forecasts from a storm-scale model, with the control used as truth.
Most of the runs perturbed with the environmental forecast errors produce supercell thunderstorms; however, there is much less predictability for storm motion and structure. Results suggest that an upper bound to the practical predictability of storm location with the current environmental uncertainty for a 1-h environmental forecast is about 2 h, with the predictability of the storms decreasing to 1 h as lead time increases. Smaller-scale storm features, such as midlevel mesocyclones and regions of heavy rainfall, display much less predictability than storm location. Mesocyclone location is predictable out to 40 min or less, while heavy 5-min rainfall location is not predictable.
2012: Forecasting tornado path lengths using a 3-dimensional object identification algorithm applied to convection-allowing forecasts. Weather and Forecasting, 27, 1090–1113.
A three-dimensional (in space and time) object identification algorithm is applied to high-resolution forecasts of hourly maximum updraft helicity (UH)—a diagnostic that identifies simulated rotating storms—with the goal of diagnosing the relationship between forecast UH objects and observed tornado pathlengths. UH objects are contiguous swaths of UH exceeding a specified threshold. Including time allows tracks to span multiple hours and entire life cycles of simulated rotating storms. The object algorithm is applied to 3 yr of 36-h forecasts initialized daily from a 4-km grid-spacing version of the Weather Research and Forecasting Model (WRF) run in real time at the National Severe Storms Laboratory (NSSL), and forecasts from the Storm Scale Ensemble Forecast (SSEF) system run by the Center for Analysis and Prediction of Storms for the 2010 NOAA Hazardous Weather Testbed Spring Forecasting Experiment. Methods for visualizing UH object attributes are presented, and the relationship between pathlengths of UH objects and tornadoes for corresponding 18- or 24-h periods is examined. For deterministic NSSL-WRF UH forecasts, the relationship of UH pathlengths to tornadoes was much stronger during spring (March–May) than in summer (June–August). Filtering UH track segments produced by high-based and/or elevated storms improved the UH–tornado pathlength correlations. The best ensemble results were obtained after filtering high-based and/or elevated UH track segments for the 20 cases in April–May 2010, during which correlation coefficients were as high as 0.91. The results indicate that forecast UH pathlengths during spring could be a very skillful predictor for the severity of tornado outbreaks as measured by total pathlength.
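The object identification step described above amounts to connected-component labeling over a (time, y, x) grid of UH values. A minimal pure-Python sketch, assuming 6-connectivity and a single fixed threshold (the paper's actual thresholds, grid handling, and connectivity rules may differ):

```python
from collections import deque

def uh_objects(uh, thresh):
    """Label contiguous 3D (time, y, x) regions where uh >= thresh.
    Breadth-first search with 6-connectivity; returns a list of objects,
    each a list of (t, j, i) grid cells."""
    nt, ny, nx = len(uh), len(uh[0]), len(uh[0][0])
    seen = [[[False] * nx for _ in range(ny)] for _ in range(nt)]
    objects = []
    for t in range(nt):
        for j in range(ny):
            for i in range(nx):
                if uh[t][j][i] < thresh or seen[t][j][i]:
                    continue
                cells, queue = [], deque([(t, j, i)])
                seen[t][j][i] = True
                while queue:
                    ct, cj, ci = queue.popleft()
                    cells.append((ct, cj, ci))
                    # visit the six face-adjacent neighbors in t, y, x
                    for dt, dj, di in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                       (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                        t2, j2, i2 = ct + dt, cj + dj, ci + di
                        if (0 <= t2 < nt and 0 <= j2 < ny and 0 <= i2 < nx
                                and not seen[t2][j2][i2]
                                and uh[t2][j2][i2] >= thresh):
                            seen[t2][j2][i2] = True
                            queue.append((t2, j2, i2))
                objects.append(cells)
    return objects
```

Including time in the connectivity is what lets a single labeled object span the full life cycle of a simulated rotating storm, so that an object's horizontal footprint can be compared directly with a tornado pathlength.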
2013: Tornado path length forecasts from 2010–2011 using ensemble updraft helicity. Weather and Forecasting, 28, 387–407.
Examining forecasts from the Storm Scale Ensemble Forecast (SSEF) system run by the Center for Analysis and Prediction of Storms for the 2010 NOAA/Hazardous Weather Testbed Spring Forecasting Experiment, recent research diagnosed a strong relationship between the cumulative pathlengths of simulated rotating storms (measured using a three-dimensional object identification algorithm applied to forecast updraft helicity) and the cumulative pathlengths of tornadoes. This paper updates those results by including data from the 2011 SSEF system, and illustrates forecast examples from three major 2011 tornado outbreaks—16 and 27 April, and 24 May—as well as two forecast failure cases from June 2010. Finally, analysis updraft helicity (UH) from 27 April 2011 is computed using a three-dimensional variational data assimilation system to obtain 1.25-km grid-spacing analyses at 5-min intervals and compared to forecast UH from individual SSEF members.
2013: Dryline position errors in experimental convection-allowing NSSL-WRF Model forecasts and the operational NAM. Weather and Forecasting, 28, 746–761.
This study evaluates 24-h forecasts of dryline position from an experimental 4-km grid-spacing version of the Weather Research and Forecasting Model (WRF) run daily at the National Severe Storms Laboratory (NSSL), as well as the 12-km grid-spacing North American Mesoscale Model (NAM) run operationally by the Environmental Modeling Center of NCEP. For both models, 0000 UTC initializations are examined, and for verification 0000 UTC Rapid Update Cycle (RUC) analyses are used. For the period 1 April–30 June 2007–11, 116 cases containing drylines in all three datasets were identified using a manual procedure that considered specific humidity gradient magnitude, temperature, and 10-m wind. For the 24-h NAM forecasts, no systematic east–west dryline placement errors were found, and the majority of the east–west errors fell within the range ±0.5° longitude. The lack of a systematic bias was generally present across all subgroups of cases categorized according to month, weather pattern, and year. In contrast, a systematic eastward bias was found in 24-h NSSL-WRF forecasts, which was consistent across all subgroups of cases. The eastward biases seemed to be largest for the subgroups that favored “active” drylines (i.e., those associated with a progressive synoptic-scale weather system) as opposed to “quiescent” drylines that tend to be present with weaker tropospheric flow and have eastward movement dominated by vertical mixing processes in the boundary layer.
2013: Verification of Convection-Allowing WRF Model Forecasts of the Planetary Boundary Layer using Sounding Observations. Weather and Forecasting, 28, 842–862, doi:10.1175/MWR-D-11-00026.1.
This study evaluates forecasts of thermodynamic variables from five convection-allowing configurations of the WRF-ARW model that vary only by the planetary boundary layer (PBL) scheme, including three “local” schemes (MYJ, QNSE, MYNN) and two schemes that include “non-local” mixing (ACM2 and YSU). The forecasts are compared to springtime radiosonde observations upstream from deep convection to gain a better understanding of the thermodynamic characteristics of these PBL schemes in this regime.
The morning PBLs are all too cool and dry despite having little bias in PBL depth (except for YSU). In the evening, the local schemes produce shallower PBLs that are often too shallow and too moist compared to non-local schemes. However, MYNN is nearly unbiased in PBL depth, moisture and potential temperature, which is comparable to the background NAM forecasts. This result gives confidence in the use of the MYNN scheme in convection-allowing configurations of WRF-ARW to alleviate the typical cool, moist bias of the MYJ scheme in convective boundary layers upstream from convection.
The morning cool and dry biases lead to an under (over) prediction of MLCAPE (MLCIN) at that time in all schemes. MLCAPE and MLCIN forecasts improve in the evening, with MYJ, QNSE, and MYNN having small mean errors, but ACM2 and YSU have a somewhat low bias. Strong observed capping inversions tend to be associated with an under prediction of MLCIN in the evening, as the model profiles are too smooth. MLCAPE tends to be over (under) predicted by MYJ and QNSE (MYNN, ACM2 and YSU) when the observed MLCAPE is relatively small (large).
2013: Lifting of Ambient Air by Density Currents in Sheared Environments. Journal of the Atmospheric Sciences, 70, 1204–1215, doi:10.1175/JAS-D-12-0149.1.
Two aspects of vorticity associated with cold pools are addressed. First, tilting of horizontal vortex tubes by the updraft at a gust front has been proposed as a means of getting near-ground rotation and hence a tornado. Theory and a numerical simulation are used to show that this mechanism will not work because warm air parcels approaching the gust front decelerate in strong adverse pressure gradient. The near-surface horizontal vorticity available for upward tilting is greatly reduced by horizontal compression before it is tilted. Consequently, uplifting of vortex tubes produces little vertical vorticity near the ground.
Second, it is shown that the baroclinic vorticity generated at the leading edge of the cold pool is transported rearward in the vortex sheet along the interface between cold and warm air, and the barotropic vorticity associated with environmental shear is conserved along streamlines. Warm parcels away from the interface do not acquire baroclinic vorticity to offset their barotropic vorticity, as assumed in a theory for long-lived squall lines. The vortex sheet has a far-field effect on the circulation in the warm air. A steady-state vortex method is used to explain why a steady noncirculating density current exists only when a lid is present and at a specific height.
2013: Links between Central West Western Australian Rainfall Variability and Large-Scale Climate Drivers. Journal of Climate, 26, 2222–2246, doi:10.1175/JCLI-D-12-00129.1.
Over the past century, and especially after the 1970s, rainfall observations show an increase (decrease) of the wet summer (winter) season rainfall over northwest (southwest) Western Australia. The rainfall in central west Western Australia (CWWA), however, has exhibited comparatively much weaker coastal trends, but a more prominent inland increase during the wet summer season. Analysis of seasonally averaged rainfall data from a group of stations, representative of both the coastal and inland regions of CWWA, revealed that rainfall trends during the 1958–2010 period in the wet months of November–April were primarily associated with El Niño–Southern Oscillation (ENSO), and with the southern annular mode (SAM) farther inland. During the wet months of May–October, the Indian Ocean dipole (IOD) showed the most robust relationships. Those results hold when the effects of ENSO or IOD are excluded, and were confirmed using a principal component analysis of sea surface temperature (SST) anomalies, rainfall wavelet analyses, and point-by-point correlations of rainfall with global SST anomaly fields. Although speculative, given their long-term averages, reanalysis data suggest that from 1958 to 2010 the increase in CWWA inland rainfall largely is attributable to an increasing cyclonic anomaly trend over CWWA, bringing onshore moist tropical flow to the Pilbara coast. During May–October, the flow anomaly exhibits a transition from an onshore to offshore flow regime in the 2001–10 decade, which is consistent with the observed weaker drying trend during this period.
2013: The implementation of an explicit charging and discharge lightning scheme within the WRF-ARW model: Benchmark simulations of a continental squall line, a tropical cyclone and a winter storm. Monthly Weather Review, 141, 2390–2415, doi:10.1175/MWR-D-12-00278.1.
This work describes the recent implementation of explicit lightning physics within the Weather Research and Forecasting (WRF) Model. Charging of hydrometeors consists of five distinct noninductive parameterizations, polarization of cloud water, and the exchange of charge during collisional mass transfer. The three components of the ambient electric field are explicitly solved for via the computationally efficient multigrid elliptic solver. The discharge process employs concepts adapted from two well-documented bulk lightning models, whereby charge reduction is imposed within a prescribed volume centered at grid points characterized by electric field magnitudes exceeding a given breakdown threshold.
This lightning model was evaluated through benchmark convection-allowing (3 km) model simulations of three contrasting convective systems: a continental squall line, a major hurricane (Rita 2005), and a winter storm. The areal coverage and magnitude of the simulated hourly flash origin density (FOD) for the continental squall line are qualitatively comparable to that of the total lightning data observations from Earth Networks Total Lightning Network (ENTLN). In agreement with the ENTLN observations, no FOD are simulated for the winter storm case. The simulated spatial FOD pattern of the hurricane and the eyewall gross charge structure were both in reasonable agreement with observations. The simulated FOD for all three cases were also evaluated against those obtained with the recently developed McCaul diagnostic lightning prediction schemes and exhibited overall good qualitative agreement with each other for Rita and the continental squall line.
2012: Tornadic supercell environment analyzed using surface and reanalysis data: A spatiotemporal relational data-mining approach. Journal of Applied Meteorology and Climatology, 51, 2203–2217, doi:10.1175/JAMC-D-11-060.1.
Oklahoma Mesonet surface data and North American Regional Reanalysis data were integrated with the tracks of over 900 tornadic and nontornadic supercell thunderstorms in Oklahoma from 1994 to 2003 to observe the evolution of near-storm environments with data currently available to operational forecasters. These data are used to train a complex data-mining algorithm that can analyze the variability of meteorological data in both space and time and produce a probabilistic prediction of tornadogenesis given variables describing the near-storm environment. The algorithm was assessed for utility in four ways. First, its probability forecasts were scored. The algorithm did produce some useful skill in discriminating between tornadic and nontornadic supercells as well as in producing reliable probabilities. Second, its selection of relevant attributes was assessed for physical significance. Surface thermodynamic parameters, instability, and bulk wind shear were among the most significant attributes. Third, the algorithm’s skill was compared with the skill of single variables commonly used for tornado prediction. The algorithm did noticeably outperform all of the single variables, including composite parameters. Fourth, the situational variations of the predictions from the algorithm were shown in case studies. They revealed instances both in which the algorithm excelled and in which the algorithm was limited.
2013: A Real-Time Weather-Adaptive 3DVAR Analysis System for Severe Weather Detections and Warnings. Weather and Forecasting, 28, 727–745, doi:10.1175/WAF-D-12-00093.1.
A real-time, weather-adaptive three-dimensional variational data assimilation (3DVAR) system has been adapted for the NOAA Warn-on-Forecast (WoF) project to incorporate all available radar observations within a moveable analysis domain. The key features of the system include 1) incorporating radar observations from multiple Weather Surveillance Radars-1988 Doppler (WSR-88Ds) with NCEP forecast products as a background state, 2) the ability to automatically detect and analyze severe local hazardous weather events at 1-km horizontal resolution every 5 min in real time based on the current weather situation, and 3) the identification of strong circulation patterns embedded in thunderstorms. Although still in the early development stage, the system performed very well within NOAA's Hazardous Weather Testbed (HWT) Experimental Warning Program during preliminary testing in spring 2010 when many severe weather events were successfully detected and analyzed. This study represents a first step in the assessment of this type of 3DVAR analysis for use in severe weather warnings. The eventual goal of this real-time 3DVAR system is to help meteorologists better track severe weather events and eventually provide better warning information to the public, ultimately saving lives and reducing property damage.
2013: Impacts of Assimilating Measurements of Different State Variables with a Simulated Supercell Storm and Three-Dimensional Variational Method. Monthly Weather Review, 141, 2759–2777, doi:10.1175/MWR-D-12-00193.1.
This paper investigates the impacts of assimilating measurements of different state variables, which can be potentially available from various observational platforms, on the cycled analysis and short-range forecast of supercell thunderstorms by performing a set of observing system simulation experiments (OSSEs) using a storm-scale three-dimensional variational data assimilation (3DVAR) method. The control experiments assimilate measurements every 5 min for 90 min. It is found that the assimilation of horizontal wind can reconstruct the storm structure rather accurately. The assimilation of vertical velocity, potential temperature, or water vapor can partially rebuild the thermodynamic and precipitation fields but poorly retrieves the wind fields. The assimilation of rainwater mixing ratio can build up the precipitation fields together with a reasonable cold pool but is unable to properly recover the wind fields. Overall, horizontal wind data have the greatest impact, with the other state variables contributing progressively less. The impact of assimilation frequency is examined by comparing results using 1-, 5-, or 10-min assimilation intervals. When horizontal wind is assimilated every 5 or 10 min, the analysis quality can be further improved by the incorporation of additional types of observations; when it is assimilated every minute, the benefit from additional types of observations is largely negligible. It is also found that for some variables more frequent assimilation leads to more accurate analyses, whereas for the thermodynamic variables a 1-min assimilation interval does not produce a better analysis than a 5-min interval.
2013: A unified flash flood database across the United States. Bulletin of the American Meteorological Society, 94, 799–805, doi:10.1175/BAMS-D-12-00198.1.
Despite flash flooding being one of the most deadly and costly weather-related natural hazards worldwide, individual datasets to characterize them in the United States are hampered by limited documentation and can be difficult to access. This study is the first of its kind to assemble, reprocess, describe, and disseminate a georeferenced U.S. database providing a long-term, detailed characterization of flash flooding in terms of spatiotemporal behavior and specificity of impacts. The database is composed of three primary sources: 1) the entire archive of automated discharge observations from the U.S. Geological Survey that has been reprocessed to describe individual flooding events, 2) flash-flooding reports collected by the National Weather Service from 2006 to the present, and 3) witness reports obtained directly from the public in the Severe Hazards Analysis and Verification Experiment during the summers 2008–10. Each observational data source has limitations; a major asset of the unified flash flood database is its collation of relevant information from a variety of sources that is now readily available to the community in common formats. It is anticipated that this database will be used for many diverse purposes, such as evaluating tools to predict flash flooding, characterizing seasonal and regional trends, and improving understanding of dominant flood-producing processes. We envision the initiation of this community database effort will attract and encompass future datasets.
2013: They just don't make storms like this one anymore: Analyzing the anomalous record snowfall event of 1959. J. Operational Meteorology, 1, 52–65.
Extreme weather events are rare but significantly impact society, making their study of the utmost importance. We have examined the synoptic features associated with a historic snowfall during February 1959 on Mount Shasta in northern California. Between 13 and 19 February, Mt. Shasta received 480 cm of snow, setting a single snow-event record for the mountain. The analysis of this event is challenging because of sparse and coarse-resolution atmospheric observations and the absence of satellite imagery; nonetheless, the analysis has contributed to our understanding of the synoptic and mesoscale dynamics associated with extreme snowstorm events. We have used an array of methods, including the National Centers for Environmental Prediction/National Center for Atmospheric Research reanalysis datasets, analysis of regional sounding and precipitation data, archived newspaper articles, and reminiscences from long-term residents of the area. Results indicate that no single mechanism is able to produce a snowstorm of this magnitude. Synoptic components that phased several days prior to the event were the following: 1) amplification and breaking of Rossby waves, 2) availability of extratropical moisture that included enhanced midlevel moisture in the 850–600-hPa layer, and 3) an active subtropical jet stream.
2012: Evaluation of the Storm Prediction Center's Day 1 Convective Outlooks. Weather and Forecasting, 27, 1580–1585.
The Storm Prediction Center has issued daily convective outlooks since the mid-1950s. This paper represents an initial effort to examine the quality of these forecasts. Convective outlooks are plotted on a latitude-longitude grid with 80-km grid spacing and evaluated using storm reports to calculate verification measures including the probability of detection, frequency of hits, and critical success index. Results show distinct improvements in forecast performance over the duration of the study period, some of which can be attributed to apparent changes in forecasting philosophies.
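The verification measures named in this abstract are standard contingency-table scores. A minimal sketch of how they are computed from gridded hit, miss, and false-alarm counts (the counts below are invented for illustration):

```python
# Contingency-table verification scores used for gridded outlook
# verification. Inputs are counts over the 80-km grid: hits (event
# forecast and observed), misses (observed but not forecast), and
# false alarms (forecast but not observed). Example counts are invented.

def pod(hits, misses):
    """Probability of detection: fraction of observed events that were forecast."""
    return hits / (hits + misses)

def foh(hits, false_alarms):
    """Frequency of hits: fraction of forecast events that verified."""
    return hits / (hits + false_alarms)

def csi(hits, false_alarms, misses):
    """Critical success index: hits over all grid points involved in the event."""
    return hits / (hits + false_alarms + misses)

# Example: 60 hits, 20 false alarms, 20 misses
print(pod(60, 20))       # 0.75
print(foh(60, 20))       # 0.75
print(csi(60, 20, 20))   # 0.6
```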
2013: Preliminary investigation of the contribution of supercell thunderstorms to the climatology of heavy and extreme precipitation in the United States. Atmospheric Research, 123, 206–210, doi:10.1016/j.atmosres.2012.06.023.
The hazards attributed to supercell thunderstorms are primarily considered to be tornadoes, large hail, and damaging winds; flooding from heavy and extreme rainfall is less often associated with these storms. As a result, there has been little research on the role that supercells play in the production of heavy and extreme rainfall events. In order to assess the contribution of supercells to the climatology of short-duration precipitation extremes, the present study uses the Warning Decision Support System-Integrated Information to objectively identify supercell thunderstorms from mosaicked radar data during the year 2009 and associate them with high-resolution, accurate multisensor precipitation estimates at time scales of one hour. Supercells are found to be more likely than non-supercells to produce heavy and extreme precipitation, and comparisons are also made between storm types by month.
2013: Objective Limits on Forecasting Skill of Rare Events. Weather and Forecasting, 28, 525–534, doi:10.1175/WAF-D-12-00113.1.
A method for determining baselines of skill for the purpose of the verification of rare-event forecasts is described and examples are presented to illustrate the sensitivity to parameter choices. These “practically perfect” forecasts are designed to resemble a forecast that is consistent with that which a forecaster would make given perfect knowledge of the events beforehand. The Storm Prediction Center’s convective outlook slight risk areas are evaluated over the period from 1973 to 2011 using practically perfect forecasts to define the maximum values of the critical success index that a forecaster could reasonably achieve given the constraints of the forecast, as well as the minimum values of the critical success index that are considered the baseline for skillful forecasts.
Based on these upper and lower bounds the relative skill of convective outlook areas shows little to no skill until the mid-1990s, after which this value increases steadily. The annual frequency of skillful daily forecasts continues to increase from the beginning of the period of study, and the annual cycle shows maxima of the frequency of skillful daily forecasts occurring in May and June.
2013: Impact of Low-Level Jets on the Nocturnal Urban Heat Island Intensity in Oklahoma City. Journal of Applied Meteorology and Climatology, 52, 1779–1802.
Previous analysis of Oklahoma City (OKC), Oklahoma, temperature data indicated that urban heat islands (UHIs) frequently formed at night and the observed UHI intensity was variable (1°–4°C). The current study focuses on identifying meteorological phenomena that contributed to the variability of nocturnal UHI intensity in OKC during July 2003. Two episodes, one with a strong UHI signature and one with a weak signature, were studied in detail using observations along with simulations with the Weather Research and Forecasting model. Mechanical mixing associated with low-level jets (LLJs) played a critical role in moderating the nocturnal UHI intensity. During nights with weak LLJs or in the absence of LLJs, vertical mixing weakened at night and strong temperature inversions developed in the rural surface layer as a result of radiative cooling. The shallow stable boundary layer (SBL < 200 m) observed under such conditions was strongly altered inside the city because rougher and warmer surface characteristics caused vertical mixing that eroded the near-surface inversion. Accordingly, temperatures measured within the urban canopy layer at night were consistently higher than at nearby rural sites of comparable height (by ~3°–4°C). During nights with strong LLJs, however, the jets facilitated enhanced turbulent mixing in the nocturnal boundary layer. As a consequence, atmospheric stability was much weaker and urban effects played a much less prominent role in altering the SBL structure; therefore, UHI intensities were smaller (<1°C) during strong LLJs. The finding that rural inversion strength can serve as an indicator for UHI intensity highlights that the structure of the nocturnal boundary layer is important for UHI assessments.
2013: Using WSR-88D polarimetric data to identify bird-contaminated Doppler velocities. Advances in Meteorology, 2013, 286–298, doi:10.1155/2013/769275.
As an important part of Doppler velocity data quality control for radar data assimilation and other quantitative applications, an automated technique is developed to identify and remove velocities contaminated by birds, especially migrating birds. This technique builds upon the existing hydrometeor classification algorithm (HCA) for dual-polarimetric WSR-88D radars developed at the National Severe Storms Laboratory, and it performs two steps. In the first step, the fuzzy-logic method in the HCA is simplified and used to identify biological echoes (mainly from birds and insects). In the second step, another simple fuzzy-logic method is developed to detect bird echoes among the biological echoes identified in the first step and thus remove bird-contaminated velocities. The membership functions used by the fuzzy-logic method in the second step are extracted from normalized histograms of differential reflectivity and differential phase for birds and insects, respectively; the histograms are constructed from polarimetric data collected during the 2012 fall migration season and sorted into birds and insects. The performance and effectiveness of the technique are demonstrated with real-data examples.
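The two-class fuzzy-logic step can be sketched as follows. This is not NSSL's code: trapezoidal membership functions stand in for the histogram-derived curves, and all breakpoint values below are invented for illustration.

```python
# Hedged sketch of a two-class fuzzy-logic discriminator: trapezoidal
# membership functions for differential reflectivity (ZDR, dB) and
# differential phase (deg) are aggregated per class, and the class with
# the larger score wins. Breakpoints are illustrative, not the paper's.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 outside (a, d), ramps to 1 on [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def classify(zdr, phidp):
    """Return 'bird' or 'insect' by the larger aggregated membership."""
    # Invented breakpoints; real curves come from the 2012 fall-migration
    # histograms described in the abstract.
    bird = trapezoid(zdr, -1, 0, 3, 5) * trapezoid(phidp, 60, 90, 180, 220)
    insect = trapezoid(zdr, 3, 5, 9, 11) * trapezoid(phidp, 10, 30, 70, 100)
    return 'bird' if bird > insect else 'insect'

print(classify(zdr=2.0, phidp=120.0))  # bird
print(classify(zdr=7.0, phidp=50.0))   # insect
```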
2013: Evaluation of a Forward Operator to Assimilate Cloud Water Path into WRF-DART. Monthly Weather Review, 141, 2272–2289.
Assimilating satellite-retrieved cloud properties into storm-scale models has received limited attention despite its potential to provide a wide array of information to a model analysis. Available retrievals include cloud water path (CWP), which represents the amount of cloud water and cloud ice present in an integrated column, and cloud-top and cloud-base pressures, which represent the top and bottom pressure levels of the cloud layers, respectively. These interrelated data are assimilated into an Advanced Research Weather Research and Forecasting Model (ARW-WRF) 40-member ensemble with 3-km grid spacing using the Data Assimilation Research Testbed (DART) ensemble Kalman filter. A new CWP forward operator combines the satellite-derived cloud information with similar variables generated by WRF. This approach is tested using a severe weather event on 10 May 2010. One experiment only assimilates conventional (CONV) observations, while the second assimilates the identical conventional observations and the satellite-derived CWP (PATH). Comparison of the CWP observations at 2045 UTC to CONV and PATH analyses shows that PATH has an improved representation of both the magnitude and spatial orientation of CWP compared to CONV. Assimilating CWP acts both to suppress convection in the model where none is present in satellite data and to encourage convection where it is observed. Oklahoma Mesonet observations of downward shortwave flux at 2100 UTC indicate that PATH reduces the root-mean-square difference errors in downward shortwave flux by 75 W m−2 compared to CONV. Reduction in model error is generally maximized during the initial 30-min forecast period with the impact of CWP observations decreasing for longer forecast times.
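The essence of a CWP forward operator is mapping a model column to the satellite-retrieved quantity by vertically integrating cloud water plus cloud ice. A minimal sketch (not the WRF-DART operator itself; the layer values below are invented):

```python
# Minimal cloud-water-path forward operator: CWP (kg m^-2) is the
# vertical integral of air density times total cloud condensate
# mixing ratio. Layer values below are an invented toy column.

def cloud_water_path(air_density, q_cloud, q_ice, layer_depth):
    """CWP = sum over layers of rho * (qc + qi) * dz, in kg m^-2."""
    return sum(rho * (qc + qi) * dz
               for rho, qc, qi, dz in zip(air_density, q_cloud, q_ice, layer_depth))

# Three-layer toy column: density (kg m^-3), mixing ratios (kg kg^-1), depth (m)
rho = [1.0, 0.8, 0.6]
qc = [0.0005, 0.001, 0.0]
qi = [0.0, 0.0002, 0.0004]
dz = [500.0, 500.0, 500.0]
print(cloud_water_path(rho, qc, qi, dz))  # ~0.85 kg m^-2
```

The ensemble Kalman filter then compares this model-space CWP against the retrieved CWP observation for each ensemble member.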
2013: Assimilation of satellite infrared radiances and Doppler radar observations during a cool season Observing System Simulation Experiment. Monthly Weather Review, 141, 3273–3299.
An Observing System Simulation Experiment is used to examine the impact of assimilating water vapor sensitive satellite infrared brightness temperatures and Doppler radar reflectivity and radial velocity observations on the analysis accuracy of a cool season extratropical cyclone. Assimilation experiments are performed for four different combinations of satellite, radar, and conventional observations using an ensemble Kalman filter assimilation system. Comparison with the high-resolution “truth” simulation indicates that the joint assimilation of satellite and radar observations reduces errors in cloud properties compared to the case in which only conventional observations are assimilated. The satellite observations provide the most impact in the mid- to upper-troposphere, whereas the radar data also improve the cloud analysis near the surface and aloft due to their greater vertical resolution and larger overall sample size. Errors in the wind field are also significantly reduced when radar radial velocity observations are assimilated. Overall, assimilating both satellite and radar data creates the most accurate model analysis, which indicates that both observation types provide independent and complementary information and illustrates the potential of these datasets for improving mesoscale model analyses and ensuing forecasts.
2013: A Feasibility Study for Probabilistic Convection Initiation Forecasts Based on Explicit Numerical Guidance. Bulletin of the American Meteorological Society, 94, 1213–1225, doi:10.1175/BAMS-D-11-00264.1.
The 2011 Spring Forecasting Experiment in the NOAA Hazardous Weather Testbed (HWT) featured a significant component on convection initiation (CI). As in previous HWT experiments, the CI study was a collaborative effort between forecasters and researchers, with equal emphasis on experimental forecasting strategies and evaluation of prototype model guidance products. The overarching goal of the CI effort was to identify the primary challenges of the CI forecasting problem and to establish a framework for additional studies and possible routine forecasting of CI. This study confirms that convection-allowing models with grid spacing ~4 km represent many aspects of the formation and development of deep convection clouds explicitly and with predictive utility. Further, it shows that automated algorithms can skillfully identify the CI process during model integration. However, it also reveals that automated detection of individual convection cells, by itself, provides inadequate guidance for the disruptive potential of deep convection activity. Thus, future work on the CI forecasting problem should be couched in terms of convection-event prediction rather than detection and prediction of individual convection cells.
2013: Comparison of polarimetric signatures of hail at S and C bands for different hail sizes. Atmospheric Research, 123, 323–336.
In this study, severe hail cases in Oklahoma, USA, are investigated by analyzing the data simultaneously collected by two closely located polarimetric weather radars operating at S and C bands. Polarimetric radar variables measured in the presence of hail at C band are quite different from the ones at S band due to more pronounced effects of resonance scattering and much stronger impact of attenuation. The differences are particularly strong in melting hail below the freezing level, but they can be substantial even at higher altitudes where hail is dry or grows in a wet regime. As a consequence, the algorithms for hail detection and determination of its size developed at S band cannot be directly applied at C band. Differences between vertical profiles of radar reflectivity Z, differential reflectivity ZDR, and cross-correlation coefficient ρhv in hail-bearing parts of the storms have been examined for large and giant hail. It is shown that in the presence of hail, ZDR(C) is usually higher than ZDR(S) and ρhv(C) < ρhv(S). The height of the radar resolution volume with respect to the freezing level has to be taken into account in polarimetric hail detection/sizing. It is also demonstrated that giant hail is commonly associated with pronounced depression of ρhv in the areas of hail generation above the freezing level, and the corresponding drop in ρhv at C band is much stronger than at S band. These results are compared to C-band polarimetric data collected in hail-bearing thunderstorms in Austria, where additional small hail size reports are available.
2013: Re-examination of the I-5 dust storm. Journal of Geophysical Research, 118, 1–19, doi:10.1002/jgrd.50131.
The infamous dust storm over the Thanksgiving holiday of 1991 that led to loss of life from numerous automobile accidents on Interstate 5 (I-5) has been re-examined. Pauley and collaborators (Pauley et al. 1996) conducted an earlier investigation of this storm in which synoptic analyses from the U.S. Navy's operational products followed the Danielsen paradigm (Danielsen 1974), a paradigm linked to the tropopause-fold phenomenon and a balanced, thermally indirect circulation about the jet stream. Examination of mesoscale structures in the storm from the recently available North American Regional Reanalysis (NARR) gave evidence of a low-level direct circulation (ascent and cooling above the accident site) that demanded further investigation. A high-resolution Weather Research and Forecasting (WRF) model simulation in concert with surface and upper-air observations was then used to analyze the storm. Principal results from the study follow: (1) Although the model simulation gave evidence of a weak indirect circulation in the upper troposphere in support of the Danielsen paradigm, the dynamic control of the storm stemmed from the lower-tropospheric mesoscale response to geostrophic imbalance, (2) A lower-tropospheric direct circulation led to mass/temperature adjustments that were confirmed by upper-air observations at locations neighboring the accident site, and (3) Boundary layer deepening and destabilization due to these mesoscale processes pinpointed the timing and location of the dust storm. Although this study does not discount the value of analyses that focus on the larger, synoptic scales of motion, it highlights the value of investigations that make use of mesoscale resources to clarify synoptic-mesoscale interactions.
2013: Comparison of TRMM 2A25 products, version 6 and version 7, with NOAA/NSSL ground radar-based National Mosaic QPE. Journal of Hydrometeorology, 14, 661–669, doi:10.1175/JHM-D-12-030.1.
Characterization of the error associated with satellite rainfall estimates is a necessary component of deterministic and probabilistic frameworks involving spaceborne passive and active microwave measurements for applications ranging from water budget studies to forecasting natural hazards related to extreme rainfall events. The authors focus here on the relative error structure of Tropical Rainfall Measurement Mission (TRMM) precipitation radar (PR) quantitative precipitation estimation (QPE) at the ground by comparison of 2A25 products with reference values derived from NOAA/NSSL’s ground radar–based National Mosaic and QPE system (NMQ/Q2). The primary contribution of this study is to compare the new 2A25, version 7 (V7), products that were recently released as a replacement of version 6 (V6). Moreover, the authors supply uncertainty estimates of the rainfall products so that they may be used in a quantitative manner for applications like hydrologic modeling. This new version is considered superior over land areas and will likely be the final version for TRMM PR rainfall estimates. Several aspects of the two versions are compared and quantified, including rainfall rate distributions, systematic biases, and random errors. All analyses indicate that V7 is in closer agreement with the reference rainfall compared to V6.
Keywords: Rainfall, Radars/Radar observations, Remote sensing, Satellite observations, Error analysis, Statistics
2013: Influence of mesonet observations on the accuracy of surface analyses generated by an Ensemble Kalman Filter. Weather and Forecasting, 28, 815–841, doi:10.1175/WAF-D-12-00078.1.
The expansion of surface mesoscale networks (mesonets) across the United States provides a high-resolution observational dataset for meteorological analysis and prediction. To clarify the impact of mesonet data on the accuracy of surface analyses, 2-m temperature, 2-m dewpoint, and 10-m wind analyses for 2-week periods during the warm and cold seasons produced through an ensemble Kalman filter (EnKF) approach are compared to surface analyses created by the Real-Time Mesoscale Analysis (RTMA). Results show in general a similarity between the EnKF analyses and the RTMA, with the EnKF exhibiting a smoother appearance with less small-scale variability. Root-mean-square (RMS) innovations are generally lower for temperature and dewpoint from the RTMA, implying a closer fit to the observations. Kinetic energy spectra computed from the two analyses reveal that the EnKF analysis spectra match more closely to the spectra computed from observations and numerical models in earlier studies. Data-denial experiments using the EnKF completed for the first week of the warm and cold seasons, as well as for two periods characterized by high mesoscale variability within the experimental domain, show that mesonet data removal imparts only minimal degradation to the analyses. This is because of the localized background covariances computed for the four surface variables having spatial scales much larger than the average spacing of mesonet stations. Results show that removing 75% of the mesonet observations has only minimal influence on the analysis.
2012: Freezing of raindrops in deep convective updrafts: A microphysical and polarimetric model. Journal of the Atmospheric Sciences, 69, 3471–3490, doi:10.1175/JAS-D-12-067.1.
Polarimetric radar observations of convective storms routinely reveal positive differential reflectivity ZDR extending above the 0°C level, indicative of the presence of supercooled liquid particles lofted by the storm’s updraft. The summit of such “ZDR columns” is marked by a zone of enhanced linear depolarization ratio LDR or decreased copolar cross-correlation coefficient ρhv and a sharp decrease in ZDR that together mark a particle freezing zone. To better understand the relation between changes in the storm updraft and the observed polarimetric variables, it is necessary to first understand the physics governing this freezing process and the impact of freezing on the polarimetric variables. A simplified, one-dimensional explicit bin microphysics model of stochastic drop nucleation by an immersed foreign particle and subsequent deterministic freezing is developed and coupled with an electromagnetic scattering model to explore the impact of the freezing process on the polarimetric radar variables. As expected, the height of the ZDR column is closely related to the updraft strength and initial drop size distribution. Additionally, the treatment of the stochastic nucleation process can also affect the depth of the freezing zone, underscoring the need to accurately depict this process in parameterizations. Representation of stochastic nucleation and deterministic freezing for each drop size bin yields better agreement between observations and the modeled vertical profiles of the surface reflectivity factor ZH and ZDR than bulk microphysics schemes. Further improvements in the representation of the LDR cap, the observed ZDR gradient in the freezing zone, and the magnitude of the ρhv minimum may require inclusion of accretion, which was not included in this model.
2013: Monitoring and Understanding Trends in Extreme Storms: State of Knowledge. Bulletin of the American Meteorological Society, 94, 499–514, doi:10.1175/BAMS-D-11-00262.1.
The state of knowledge regarding trends and an understanding of their causes is presented for a specific subset of extreme weather and climate types. For severe convective storms (tornadoes, hailstorms, and severe thunderstorms), differences in time and space of practices of collecting reports of events make using the reporting database to detect trends extremely difficult. Overall, changes in the frequency of environments favorable for severe thunderstorms have not been statistically significant. For extreme precipitation, there is strong evidence for a nationally averaged upward trend in the frequency and intensity of events. The causes of the observed trends have not been determined with certainty, although there is evidence that increasing atmospheric water vapor may be one factor. For hurricanes and typhoons, robust detection of trends in Atlantic and western North Pacific tropical cyclone (TC) activity is significantly constrained by data heterogeneity and deficient quantification of internal variability. Attribution of past TC changes is further challenged by a lack of consensus on the physical linkages between climate forcing and TC activity. As a result, attribution of trends to anthropogenic forcing remains controversial. For severe snowstorms and ice storms, the number of severe regional snowstorms that occurred since 1960 was more than twice that of the preceding 60 years. There are no significant multidecadal trends in the areal percentage of the contiguous United States impacted by extreme seasonal snowfall amounts since 1900. There is no distinguishable trend in the frequency of ice storms for the United States as a whole since 1950.
2012: Tuning the Auto-Nowcaster Automatically. Weather and Forecasting, 27, 1568–1579.
The Auto-Nowcaster (ANC) is an automated system that nowcasts thunderstorms, including thunderstorm initiation. However, its parameters have to be tuned to regional environments, a process that is time-consuming, labor-intensive, and quite subjective. When the National Weather Service decided to explore using ANC in forecast operations, a faster, less labor-intensive, and objective mechanism to tune the parameters for all the forecast offices was sought.
In this paper, a genetic algorithm approach to tuning ANC is described. The process consisted of choosing data sets, employing an objective forecast verification technique and devising a fitness function. ANC was modified to create nowcasts offline using weights iteratively generated by the genetic algorithm. The weights were generated by probabilistically combining weights with good fitness, leading to better and better weights as the tuning process proceeded.
The nowcasts created by ANC using the automatically determined weights are compared with the nowcasts created by ANC using weights that were the result of manual tuning. It is shown that nowcasts created using the automatically tuned weights are as skilled as the ones created through manual tuning. In addition, automated tuning can be done in a fraction of the time that it takes experts to analyze the data and tune the weights.
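The tuning loop described above can be sketched as a standard genetic algorithm. This is a toy illustration, not the ANC tuner: the fitness function here is a stand-in (distance of a candidate weight vector to an invented target), whereas the real system scored each weight set with an objective nowcast-verification technique.

```python
# Toy genetic algorithm over 3-element weight vectors. Fitter half of the
# population survives each generation; children come from one-point
# crossover of two surviving parents plus a small random mutation.
import random

random.seed(0)
TARGET = [0.2, 0.5, 0.3]  # invented "ideal" weights for this toy problem

def fitness(weights):
    """Stand-in fitness: negative squared distance to the toy target."""
    return -sum((w - t) ** 2 for w, t in zip(weights, TARGET))

def evolve(pop_size=30, generations=40):
    pop = [[random.random() for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, 3)          # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(3)               # small mutation
            child[i] += random.uniform(-0.05, 0.05)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Iterating selection toward higher fitness is what "probabilistically combining weights with good fitness" accomplishes in the paper's description.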
2013: An improved method to compute radar echo top heights. Weather and Forecasting, 28, 481–488, doi:10.1175/WAF-D-12-00084.1.
It is demonstrated that the traditional method, in widespread use on NEXRAD and other radar systems, to compute echo top heights results in both under- and overestimates. It is proposed that echo tops be computed by interpolating between elevation scans that bracket the echo top threshold. The traditional and proposed techniques are evaluated using simulated radar samples of a modeled thunderstorm and by sampling a high-resolution Range Height Indicator (RHI) of a real thunderstorm. It is shown that the proposed method results in smaller errors when higher elevation scans are available.
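The proposed interpolation can be sketched as follows: instead of reporting the height of the last elevation scan whose reflectivity exceeded the threshold (the traditional method), interpolate linearly between the bracketing scans. The beam heights and reflectivity values below are invented for illustration.

```python
# Echo-top height by interpolating between the elevation scan whose
# reflectivity is at or above the threshold and the next scan above it,
# whose reflectivity has fallen below the threshold. Inputs are beam-center
# heights (km, increasing with elevation angle) and sampled dBZ values.

def echo_top(heights_km, refl_dbz, threshold_dbz=18.0):
    """Return interpolated echo-top height (km), or None if no echo."""
    for lower in range(len(refl_dbz) - 1):
        z0, z1 = refl_dbz[lower], refl_dbz[lower + 1]
        if z0 >= threshold_dbz > z1:
            h0, h1 = heights_km[lower], heights_km[lower + 1]
            frac = (z0 - threshold_dbz) / (z0 - z1)  # linear in dBZ
            return h0 + frac * (h1 - h0)
    if refl_dbz and refl_dbz[-1] >= threshold_dbz:
        return heights_km[-1]  # threshold never bracketed: top above highest scan
    return None

# Toy column: the 18-dBZ threshold is crossed between the 4- and 6-km scans.
print(echo_top([2.0, 4.0, 6.0, 8.0], [45.0, 30.0, 12.0, 5.0]))  # ~5.33 km
```

The traditional method would report 4.0 km here (the last scan above threshold), an underestimate the interpolation avoids.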
2013: Quality control of accumulated fields by applying spatial and temporal constraints. Journal of Atmospheric and Oceanic Technology, 30, 745–757, doi:10.1175/JTECH-D-12-00128.1.
Accumulating gridded fields over time greatly magnifies the impact of impulse noise in the individual grids. A quality control method that takes advantage of spatial and temporal coherence can reduce the impact of such noise in accumulation grids. Such a method can be implemented using the image processing techniques of hysteresis and multiple hypothesis tracking (MHT). These steps are described in this paper and the method is applied to simulated data to quantify the improvements and explain the effect of various parameters. Finally, the quality control technique is applied to some illustrative real-world datasets.
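The hysteresis step borrowed from image processing can be sketched in one dimension: pixels above a high threshold seed regions, which then grow through neighboring pixels above a lower threshold, so isolated weak pixels (impulse noise) are discarded before accumulation. The thresholds and values below are invented for illustration.

```python
# Two-threshold hysteresis on a 1-D field: values >= high seed regions,
# and neighbors with values >= low are absorbed into adjacent kept
# regions; weak isolated pixels never attach to a seed and are dropped.

def hysteresis(values, low, high):
    """Return a keep/discard boolean mask after hysteresis thresholding."""
    keep = [v >= high for v in values]           # seeds
    changed = True
    while changed:                               # grow seeds through 'low' pixels
        changed = False
        for i, v in enumerate(values):
            if keep[i] or v < low:
                continue
            if (i > 0 and keep[i - 1]) or (i + 1 < len(values) and keep[i + 1]):
                keep[i] = True
                changed = True
    return keep

# A weak pixel attached to a strong echo survives; an isolated one does not.
print(hysteresis([0, 3, 9, 4, 0, 3, 0], low=2, high=8))
```

The 2-D version used for radar grids works the same way with 4- or 8-connected neighborhoods.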
2013: Data assimilation as a problem in optimal tracking: Application of Pontryagin's Minimum Principle to Atmospheric Science. Journal of the Atmospheric Sciences, 70, 1257–1277, doi:10.1175/JAS-D-12-0217.1.
A data assimilation strategy based on feedback control has been developed for the geophysical sciences, a strategy that uses model output to control the behavior of the dynamical system. Whereas optimal tracking through feedback control had its early history in application to vehicle trajectories in space science, the methodology has been adapted to geophysical dynamics by forcing the trajectory of a deterministic model to follow observations in accord with observation accuracy. Fundamentally, this off-line approach is based on Pontryagin's minimum principle (PMP), where a least squares fit of idealized path to dynamical law follows from Hamiltonian mechanics. This utilitarian process optimally determines a forcing function that depends on the state (the feedback component) and the observations. It follows that this optimal forcing accounts for the model error. From this model error, a correction to the one-step transition matrix is constructed. The above theory and technique are illustrated using the linear Burgers' equation, which transfers energy from the large scale to the small scale.
2013: Bias Correction for Polarimetric Phased-Array Radar with Idealized Aperture and Patch Antenna Elements. IEEE Trans. on Geoscience and Remote Sensing, January, 473–489, doi:10.1109/TGRS.2012.2198070.
Polarimetric phased-array radar (PPAR) creates biases in observed polarimetric parameters when the beam is pointed off broadside. Thus, a bias correction matrix needs to be applied for each beam direction. A bias correction matrix is developed for array elements consisting of either waveguide apertures or patches. Correction matrices are given for both the Alternate Transmission and Simultaneous Reception mode and the Simultaneous Transmission and Simultaneous Reception mode. The biases of polarimetric parameters measured with a PPAR without the application of a correction matrix are presented.
2013: Scan-to-Scan Correlation of Weather Radar Signals to Identify Ground Clutter. IEEE Geoscience and Remote Sensing Letters, 10, 855–859, doi:10.1109/LGRS.2012.2226233.
The scan-to-scan correlation method to discriminate weather signals from ground clutter, described in this letter, takes advantage of the fact that the correlation time of radar echoes from hydrometeors is typically much shorter than that from ground objects. In this letter, the scan-to-scan correlation method is applied to data from the WSR-88D, and its results are compared with those produced by the WSR-88D's ground clutter detector. A subjective comparison with an operational clutter detection algorithm used on the network of weather radars shows that the scan-to-scan correlation method produces a similar clutter field but presents clutter locations with higher spatial resolution.
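The core idea can be sketched simply: echoes from stationary ground targets remain nearly identical between successive scans, so a high correlation of the per-gate signal flags clutter, while weather echoes decorrelate. This is an illustration only; the gate values and the 0.9 decision threshold below are invented, not the letter's parameters.

```python
# Correlate per-gate echo power from two successive scans of the same
# radial; a high Pearson correlation suggests stationary (clutter) targets.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def is_clutter(scan1_gates, scan2_gates, threshold=0.9):
    """Flag the gates as clutter-dominated if scans are highly correlated."""
    return pearson(scan1_gates, scan2_gates) >= threshold

# Stationary echo: nearly identical between scans. Weather: decorrelated.
clutter_like = ([10.0, 52.0, 49.0, 11.0], [10.5, 51.0, 50.0, 10.0])
weather_like = ([10.0, 52.0, 49.0, 11.0], [48.0, 12.0, 30.0, 45.0])
print(is_clutter(*clutter_like))   # True
print(is_clutter(*weather_like))   # False
```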
2013: A New Approach to Detect Ground Clutter Mixed with Weather Signals. IEEE Trans. Geoscience and Remote Sensing, 51, 2373–2387, doi:10.1109/TGRS.2012.2209658.
Considering that the statistics of the phase and the power of weather signals in the spectral domain are different from those statistics for echoes from stationary objects, a spectrum clutter identification (SCI) algorithm has been developed to detect ground clutter using single polarization radars, but SCI can be extended for dual-pol radars. SCI examines both the power and phase in the spectral domain and uses a simple Bayesian classifier to combine four discriminants: spectral power distribution, spectral phase fluctuations, spatial texture of echo power, and spatial texture of spectrum width to make decisions as to the presence of clutter that can corrupt meteorological measurements. This work is focused on detecting ground clutter mixed with weather signals, even if the clutter power to signal power ratio is low. The performance of the SCI algorithm is shown by applying it to radar data collected by University of Oklahoma-Polarimetric Radar for Innovation in Meteorology and Engineering.
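The simple Bayesian combination named in the abstract can be sketched as a naive-Bayes fusion of the four discriminants, each contributing a likelihood for "clutter" versus "weather". All likelihood values below are invented for illustration; in SCI they would come from the observed distributions of the four features for each class.

```python
# Naive-Bayes fusion of independent discriminants: combine per-feature
# likelihoods P(obs | class) in log space and convert to a posterior.
import math

def posterior_clutter(likelihoods_clutter, likelihoods_weather, prior_clutter=0.5):
    """P(clutter | discriminants), treating the discriminants as independent."""
    log_c = math.log(prior_clutter) + sum(map(math.log, likelihoods_clutter))
    log_w = math.log(1 - prior_clutter) + sum(map(math.log, likelihoods_weather))
    return 1.0 / (1.0 + math.exp(log_w - log_c))

# Four discriminants (e.g. spectral power distribution, spectral phase
# fluctuations, texture of power, texture of spectrum width), each with an
# invented 4:1 likelihood ratio favoring clutter.
p = posterior_clutter([0.8, 0.8, 0.8, 0.8], [0.2, 0.2, 0.2, 0.2])
print(round(p, 4))  # 0.9961
```

Even modestly informative discriminants combine into a decisive posterior, which is why fusing several weak features works well for detecting clutter mixed with weather.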
2013: Coordinated observations of sprites and in-cloud lightning flash structure. Journal of Geophysical Research, 118, 6607–6632, doi:10.1002/jgrd.50459.
The temporal and spatial development of sprite-producing lightning flashes is examined with coordinated observations over an asymmetric mesoscale convective system (MCS) on 29 June 2011 near the Oklahoma Lightning Mapping Array (LMA). Sprites produced by a total of 26 lightning flashes were observed simultaneously on video from Bennett, Colorado, and Hawley, Texas, enabling a triangulation of sprites in comparison with the temporal development of parent lightning (in particular, negatively charged stepped leaders) in three-dimensional space. In general, prompt sprites produced within 20 ms after the causative stroke are less horizontally displaced (typically <30 km) from the ground stroke than delayed sprites, which usually occur over 40 ms after the stroke with significant lateral offsets (>30 km). However, both prompt and delayed sprites are usually centered within 30 km of the geometric center of relevant LMA sources (with affinity to negative stepped leaders) during the prior 100 ms interval. Multiple sprites appearing as dancing/jumping events associated with a single lightning flash could be produced either by distinct strokes of the flash, by a single stroke through a series of current surges superposed on an intense continuing current, or by both. Our observations imply that sprites elongated in one direction are sometimes linked to in-cloud leader structure with the same elongation, and sprites that were more symmetric were produced above the progression of multiple negative leaders. This suggests that the large-scale structure of sprites could be affected by the in-cloud geometry of positive charge removal. Based on an expanded dataset of 39 sprite-parent flashes, obtained by including more sprites recorded by a single camera over the same MCS, the altitude (above mean sea level, MSL) of the positively charged cloud region tapped by sprite-producing strokes declined gradually from ~10 km MSL (-35°C) to around 6 km MSL (-10°C) as the MCS evolved through the mature stage.
On average, the positive charge removal by causative strokes of sprites observed on 29 June is centered at 3.6 km above the freezing level or at 7.9 km above ground level.
2012: Predicting cloud-to-ground and intracloud lightning in weather forecast models. Weather and Forecasting, 27, 1470–1488.
A new prognostic, spatially and temporally dependent variable is introduced to the Weather Research and Forecasting Model (WRF). This variable is called the potential electrical energy (Ep). It was used to predict the dynamic contribution of the grid-scale-resolved microphysical and vertical velocity fields to the production of cloud-to-ground and intracloud lightning in convection-allowing forecasts. The source of Ep is assumed to be the noninductive charge separation process involving collisions of graupel and ice particles in the presence of supercooled liquid water. The Ep dissipates when it exceeds preassigned threshold values and lightning is generated. Four case studies are presented and analyzed. On the 4-km simulation grid, a single cloud-to-ground lightning event was forecast with about equal values of probability of detection (POD) and false alarm ratio (FAR). However, when lightning was integrated onto 12-km and then 36-km grid overlays, there was a large improvement in the forecast skill, and as many as 10 cloud-to-ground lightning events were well forecast on the 36-km grid. The impact of initial conditions on forecast accuracy is briefly discussed, including an evaluation of the scheme in wintertime, when lightning activity is weaker. The dynamic algorithm forecasts are also contrasted with statistical lightning forecasts and differences are noted. The scheme is being used operationally with the Rapid Refresh (13 km) data; the skill scores in these operational runs were very good in clearly defined convective situations.
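The reservoir-and-threshold logic of such a scheme can be caricatured in a few lines; the source term and constants below are illustrative placeholders (the actual Ep source in WRF depends on graupel-ice collisions in the presence of supercooled liquid water).

```python
# Caricature of a prognostic energy variable that is charged by a
# microphysical source term and discharged as lightning when it exceeds
# a threshold (all numbers are illustrative placeholders).

def lightning_from_energy(source_series, threshold=1.0, retain=0.1):
    """Integrate an Ep-like source; each threshold exceedance discharges
    the reservoir to a retained fraction and counts one flash."""
    ep, flashes = 0.0, 0
    for source in source_series:
        ep += source
        if ep > threshold:
            ep *= retain      # dissipation at discharge
            flashes += 1
    return flashes, ep
```

A steady source thus yields quasi-periodic flashing, with frequency controlled by the source magnitude and the threshold.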
2013: Investigating the applicability of error correction ensembles of satellite rainfall products in river flow simulations. Journal of Hydrometeorology, 14, 1194–1211, doi:10.1175/JHM-D-12-074.1.
This study uses a stochastic ensemble-based representation of satellite rainfall error to predict the propagation in flood simulation of three quasi-global-scale satellite rainfall products across a range of basin scales. The study is conducted on the Tar-Pamlico River basin in the southeastern United States based on 2 years of data (2004 and 2006). The NWS Multisensor Precipitation Estimator (MPE) dataset is used as the reference for evaluating three satellite rainfall products: the Tropical Rainfall Measuring Mission (TRMM) real-time 3B42 product (3B42RT), the Climate Prediction Center morphing technique (CMORPH), and the Precipitation Estimation from Remotely Sensed Imagery Using Artificial Neural Networks–Cloud Classification System (PERSIANN-CCS). Both ground-measured runoff and streamflow simulations, derived from the NWS Research Distributed Hydrologic Model forced with the MPE dataset, are used as benchmarks to evaluate ensemble streamflow simulations obtained by forcing the model with satellite rainfall corrected using stochastic error simulations from a two-dimensional satellite rainfall error model (SREM2D). The ability of the SREM2D ensemble error corrections to improve satellite rainfall-driven runoff simulations and to characterize the error variability of those simulations is evaluated. It is shown that by applying the SREM2D error ensemble to satellite rainfall, the simulated runoff ensemble is able to envelope both the reference runoff simulation and observed streamflow. The best (uncorrected) product is 3B42RT, but after applying SREM2D, CMORPH becomes the most accurate of the three products in the prediction of runoff variability. The impact of spatial resolution on the rainfall-to-runoff error propagation is also evaluated for a cascade of basin scales (500–5000 km2). Results show a doubling in the bias from rainfall to runoff at all basin scales. Significant dependency on catchment area is exhibited for the random error propagation component.
Keywords: Satellite observations, Ensembles, Probability forecasts/models/distribution, Hydrologic models, Model errors, Flood events
2013: Total lightning characteristics relative to radar and satellite observations of Oklahoma mesoscale convective systems. Monthly Weather Review, 141, 1593–1611, doi:10.1175/MWR-D-11-00268.1.
The advent of regional very high frequency (VHF) Lightning Mapping Arrays (LMAs) makes it possible to begin analyzing trends in total lightning characteristics in ensembles of mesoscale convective systems (MCSs). Flash initiations observed by the Oklahoma LMA and ground strikes observed by the National Lightning Detection Network were surveyed relative to infrared satellite and base-scan radar reflectivity imagery for 30 mesoscale convective systems occurring over a 7-yr period. Total lightning data were available for only part of the life cycle of most MCSs, but well-defined peaks in flash rates were usually observed for MCSs having longer periods of data. The mean of the maximum 10-min flash rates for the ensemble of MCSs was 203 per min for total flashes and 41 per min for cloud-to-ground flashes (CGs). In total, 21% of flashes were CGs and 13% of CGs lowered positive charge to ground. MCSs with the largest maximum flash rates entered Oklahoma in the evening before midnight. All three MCSs entering Oklahoma in early morning after sunrise had among the smallest maximum flash rates. Flash initiations were concentrated in or near regions of larger reflectivity and colder cloud tops. The CG flash rates and total flash rates frequently evolved similarly, although the fraction of flashes striking ground usually increased as an MCS decayed. Total flash rates tended to peak approximately 90 min before the maximum area of the -52 deg C cloud shield, but closer in time to the maximum area of colder cloud shields. MCSs whose -52 deg C cloud shield grew faster tended to have larger flash rates.
2013: Aerosol Effects on Simulated Storm Electrification and Precipitation in a Two-Moment Bulk Microphysics Model. Journal of the Atmospheric Sciences, 70, 2032–2050, doi:10.1175/JAS-D-12-0264.1.
Cloud condensation nuclei (CCN) concentrations are found to strongly affect the microphysical and electrical evolution of a numerically simulated small multicell storm. The simulations reproduce the well-known effects of updraft invigoration and delay of precipitation formation as increasing CCN from low to intermediate concentrations causes droplet sizes to decrease. Peak updrafts increased from 16 m s−1 at the lowest CCN to a maximum of 21–22 m s−1 at moderate CCN, where condensation latent heating is maximized. The transition from low to high CCN first maximizes warm-rain production before switching over to the ice process as the dominant precipitation mechanism. Average graupel density stays fairly high and constant at lower CCN, but then drops monotonically at higher CCN concentration, although high CCN also foster the appearance of small regions of larger, high-density graupel with high simulated radar reflectivity.
Graupel production increases monotonically as CCN concentration rises from 50 to about 2000 cm−3. The lightning response is relatively weak until the Hallett–Mossop rime-splintering ice multiplication becomes more active at CCN > 700 cm−3. At very high CCN concentrations (>2000 cm−3), graupel production decreases slowly, but lightning activity drops dramatically when the parameterization of Hallett–Mossop rime-splintering ice multiplication is based on the number of large cloud droplets collected by graupel. Conversely, lightning activity remains steady at extremely high CCN concentration when the Hallett–Mossop parameterization is based simply on the rate of rime mass accumulation. The results lend support to the aerosol hypothesis as applied to lightning production, whereby greater CCN concentration tends to lead to greater lightning activity, but with a large sensitivity to ice multiplication.
2013: Dynamics of Local Circulations in Mountainous Terrain during the RHUBC-II Project. Monthly Weather Review, 141, 3641–3656, doi:10.1175/MWR-D-12-00245.1.
The Radiative Heating in Underexplored Bands Campaign (RHUBC-II) project was held from August to October 2009 in the Atacama Desert in Chile at 5320-m altitude. Observations from this experiment and a high-resolution numerical simulation with the Weather Research and Forecasting Model (WRF) were used to understand the structure and evolution of the atmosphere over a region with complex terrain and extremely dry environmental conditions. The mechanisms driving the local circulations during synoptically unperturbed conditions at the field site were studied. The study suggests that the field site is mainly affected by a mountain-scale and a plateau-scale thermally driven circulation, the latter of which seems to dominate. The advection of warm air by downslope flows from higher elevations during nighttime may be the mechanism that counteracts the longwave radiative cooling at the surface, causing a small decrease of near-surface temperature during the night. WRF represents the near-surface and upper atmosphere above the RHUBC-II site reasonably well. Important orographic features are misrepresented in the model terrain, which may cause the observed differences in near-surface winds. The zonal pressure gradient between both sides of the mountain and the static stability of the air mass on the windward side of the terrain control the local circulations over the field site. Consequently, a misrepresentation of these mechanisms in the model may cause differences between the simulated winds and observations.
2013: Recoil leader formation and development. Journal of Electrostatics, 71, 763–768.
The existing interpretation in the lightning literature, based on field measurements, defines recoil leaders as negative leaders. However, recoil leaders are floating conductors and, on this physical basis, they should be defined as bipolar and bidirectional leaders. This physics-based assumption has never previously been verified experimentally. Such verification, reported in this paper, has been obtained from observations of branched upward positive leaders from tall towers using a high-speed video system synchronized with electric and magnetic field change and luminosity measurements on the ground. The analysis of these observations clearly reveals the nature of recoil and dart leaders as bidirectional and bipolar electrodeless discharges that develop from a small region along a path of the decayed channels of a previous positive leader, or of a positively charged return stroke of negative CG flashes.
2013: Enhanced spatiotemporal relational probability trees and forests. Data Mining and Knowledge Discovery, 26, 398–433, doi:10.1007/s10618-012-0261-2.
Many real world domains are inherently spatiotemporal in nature. In this work, we introduce significant enhancements to two spatiotemporal relational learning methods, the spatiotemporal relational probability tree and the spatiotemporal relational random forest, that increase their ability to learn using spatiotemporal data. We enabled the models to formulate questions on both objects and the scalar and vector fields within and around objects, allowing the models to differentiate based on the gradient, divergence, and curl and to recognize the shape of point clouds defined by fields. This enables the model to ask questions about the change of a shape over time or about its orientation. These additions are validated on several real-world hazardous weather datasets. We demonstrate that these additions enable the models to learn robust classifiers that outperform the versions without these new additions. In addition, analysis of the learned models shows that the findings are consistent with current meteorological theories.
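As an illustration of the kind of field attribute the enhanced models can ask about, the sketch below computes centered-difference divergence and curl (vertical vorticity) of a gridded 2-D vector field. It is a generic toy, not the authors' relational-model code.

```python
# Centered-difference divergence and curl of a 2-D vector field sampled
# on a uniform grid (toy illustration; interior points only).

def divergence_curl(u, v, dx=1.0, dy=1.0):
    """u, v: 2-D lists indexed [j][i], rows along y. Returns (div, curl)
    with zeros left on the boundary."""
    ny, nx = len(u), len(u[0])
    div = [[0.0] * nx for _ in range(ny)]
    curl = [[0.0] * nx for _ in range(ny)]
    for j in range(1, ny - 1):
        for i in range(1, nx - 1):
            dudx = (u[j][i + 1] - u[j][i - 1]) / (2 * dx)
            dvdy = (v[j + 1][i] - v[j - 1][i]) / (2 * dy)
            dvdx = (v[j][i + 1] - v[j][i - 1]) / (2 * dx)
            dudy = (u[j + 1][i] - u[j - 1][i]) / (2 * dy)
            div[j][i] = dudx + dvdy
            curl[j][i] = dvdx - dudy
    return div, curl
```

For solid-body rotation (u = -y, v = x) the divergence vanishes and the curl equals twice the angular velocity, so questions about such attributes can distinguish rotation from convergence.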
2013: Axis ratios and flutter angles of cloud ice particles: Retrievals from radar data. Journal of Atmospheric and Oceanic Technology, 30, 1691–1703, doi:10.1175/JTECH-D-12-00212.1.
A novel method is described for retrieving the mean axis ratio (width/length) and the standard deviation of orientation angles (σθ, called herein the intensity of fluttering) of ice cloud particles from polarimetric radar data. The method is based on measurements of differential reflectivity (ZDR) and the copolar correlation coefficient in cloud areas with ZDR > 4 dB. In three analyzed cases, the retrieved axis ratios were in the interval 0.15 to 0.4 and σθ was in the interval from 2 to 20 degrees. The latter values indicate that the particles experienced light to moderate flutter. Uncertainties in the retrievals due to uncertainties in the bulk ice density of the particles and the possible presence of columnar crystals are considered. The retrieval method is applicable to centimeter-wavelength radars; the analyzed data were collected with the dual-polarization S-band WSR-88D radar.
2013: Structures of Bragg scatter observed with the polarimetric WSR-88D. Journal of Atmospheric and Oceanic Technology, 30, 1253–1258, doi:10.1175/JTECH-D-12-00210.1.
Enhancements to signal processing and data collection in the dual-polarization WSR-88D that increase its detection capability yield observations of “fine” structures from Bragg scatterers. Several types of fine structure observed in and above the boundary layer are discussed. These Bragg scatter structures include the top of the convective boundary layer, non-precipitating clouds, strong convective plumes above the boundary layer, and a layer of weak reflections associated with decaying boundary layer turbulence. It is concluded that data from polarimetric WSR-88Ds can be used to obtain the depth of the convective boundary layer.
2013: An automated method for depicting mesocyclone paths and intensities. Weather and Forecasting, 28, 570–585, doi:10.1175/WAF-D-12-00065.1.
The location and intensity of mesocyclone circulations can be tracked in real time by accumulating azimuthal shear values over time at every location of a uniform spatial grid. Azimuthal shear at low (0–3 km AGL) and midlevels (3–6 km AGL) of the atmosphere is computed in a noise-tolerant manner by fitting the Doppler velocity observations in the neighborhood of a pulse volume to a plane and finding the slope of that plane. Rotation tracks created in this manner are contaminated by nonmeteorological signatures caused by poor velocity dealiasing, ground clutter, radar test patterns, and spurious shear values. To improve the quality of these fields for real-time use and for an accumulated multiyear climatology, new dealiasing strategies, data thresholding, and multiple hypothesis tracking (MHT) techniques have been implemented. These techniques remove nearly all nonmeteorological contaminants, resulting in much clearer rotation tracks that appear to match mesocyclone paths and intensities closely.
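The plane-fit step can be sketched in a few lines: fit nearby Doppler velocities to V = a + b*x + c*y by least squares and read the azimuthal shear off the slope in the azimuthal direction. This is a generic toy, not NSSL's implementation; the coordinate convention and neighborhood are placeholders.

```python
# Toy local, linear least squares derivative (LLSD) estimate: fit
# velocities near an analysis point to a plane and return the slope in
# the azimuthal (y) direction.

def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [mr - f * mc for mr, mc in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def llsd_azimuthal_shear(points):
    """points: list of (x, y, v) with x the along-range and y the
    along-azimuth offset (m) from the analysis point. Returns dv/dy."""
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sxx = sum(p[0] ** 2 for p in points); syy = sum(p[1] ** 2 for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    sv = sum(p[2] for p in points)
    sxv = sum(p[0] * p[2] for p in points)
    syv = sum(p[1] * p[2] for p in points)
    # Normal equations of the least-squares plane fit v = a + b*x + c*y
    A = [[n, sx, sy], [sx, sxx, sxy], [sy, sxy, syy]]
    a, b, c = solve3(A, [sv, sxv, syv])
    return c
```

Because the slope is estimated from all points in the neighborhood, single noisy velocity values perturb the result far less than a peak-to-peak difference would.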
2013: Surface-based Inversions above Central Greenland. Journal of Geophysical Research, 118, 1–12, doi:10.1029/2012JD018867.
Surface-based temperature inversions (SBIs) are studied at Summit Station in central Greenland during the period spanning July 2010 to May 2012. The frequency and intensity of SBIs are examined using microwave radiometer (MWR) temperature retrievals, radiosonde profiles, and near-surface meteorological data. Using the MWR’s high temporal resolution, the diurnal, monthly, and annual cycles are investigated. Monthly mean values of SBI occurrence and intensity show that surface-based inversions are prevalent in the winter, with decreasing values in the summer months. A case study on 20 February 2011 suggests that factors other than solar elevation angle influence the intensity of surface-based inversions. An increase in liquid water path corresponds to a decrease in SBI intensity, suggesting that liquid-bearing clouds, especially within the lowest 1 km, are associated with weaker surface-based inversions.
2013: On the mitigation of wind-turbine clutter for weather radars using range-Doppler spectral processing. IET Radar, Sonar & Navigation, 7, 178–190, doi:10.1049/iet-rsn.2012.0225.
The unwanted return signals from wind turbines can contaminate the weather-radar data that are used by forecasters and automatic algorithms to issue forecasts and warnings for severe weather. Since wind turbines have moving components that generate return signals with non-zero Doppler velocity, traditional ground clutter filters are ineffective at removing wind turbine clutter (WTC). In this study, a WTC mitigation algorithm using the range-Doppler spectrum is developed and tested with simulated weather and WTC signals. Once the general locations of the WTC contamination are known, the proposed range-Doppler regression (RDR) algorithm exploits the spatial continuity of weather signals in the range domain to mitigate the WTC contamination while retaining as much weather signal as possible. In contrast to other proposed mitigation algorithms, the RDR algorithm is suited for real-time implementation on typical operational weather radars. Simulated data are used to optimise the parameters of the algorithm and evaluate its performance for stratiform- and convective-precipitation cases with different degrees of WTC contamination. Finally, a real data case is processed to illustrate the RDR algorithm's effectiveness. The results show that the RDR algorithm has the potential to effectively reduce the bias in spectral-moment estimates caused by WTC contamination in an operational environment.
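The range-continuity idea behind RDR can be illustrated with a toy repair step; a simple linear fit is used here for clarity, whereas the actual algorithm operates on range-Doppler spectra and its regression details differ. Uncontaminated range gates constrain a fit, and flagged gates are replaced with fitted values.

```python
# Toy range-continuity repair: fit the uncontaminated gates of a range
# profile with a least-squares line and substitute fitted values at
# gates flagged as wind-turbine clutter (linear model is a placeholder).

def fit_line(xs, ys):
    """Least-squares line y = a + b*x through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def repair_range_profile(power, contaminated):
    """power: per-gate values at one Doppler bin;
    contaminated: per-gate bool flags for wind-turbine clutter."""
    xs = [i for i, c in enumerate(contaminated) if not c]
    ys = [power[i] for i in xs]
    a, b = fit_line(xs, ys)
    return [a + b * i if c else p
            for i, (p, c) in enumerate(zip(power, contaminated))]
```

Only the flagged gates are modified, which is how the approach retains as much of the weather signal as possible.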
2012: Evolution of a Quasi-Linear Convective System Sampled by Phased Array Radar. Monthly Weather Review, 140, 3467–3486, doi:10.1175/MWR-D-12-00003.1.
On 2 April 2010, a quasi-linear convective system (QLCS) moved eastward through Oklahoma during the early morning hours. Wind damage in Rush Springs, Oklahoma, approached (enhanced Fujita) EF1-scale intensity and was likely associated with a mesovortex along the leading edge of the QLCS. The evolution of the QLCS as it produced its first bow echo was captured by the National Weather Radar Testbed Phased Array Radar (NWRT PAR) in Norman, Oklahoma. The NWRT PAR is an S-band radar with an electronically steered beam, allowing for rapid volumetric updates (~1 min) and user-defined scanning strategies. The rapid temporal updates and dense vertical sampling of the PAR created a detailed depiction of the damaging wind mechanisms associated with the QLCS. Key features sampled by the PAR include microbursts, an intensifying midlevel jet, and rotation associated with the mesovortex. In this work, PAR data are analyzed and compared to data from nearby operational radars, highlighting the advantages of using high-temporal-resolution data to monitor storm evolution.
The PAR sampled the events preceding the Rush Springs circulation in great detail. Based on PAR data, the midlevel jet in the QLCS strengthened as it approached Rush Springs, creating an area of strong midlevel convergence where it impinged on the system-relative front-to-rear flow. As this convergence extended to the lower levels of the storm, a preexisting azimuthal shear maximum increased in magnitude and vertical extent, and EF1-scale damage occurred in Rush Springs. The depiction of these events in the PAR data demonstrates the complex and rapidly changing nature of QLCSs.
2013: Range-Correcting Azimuthal Shear in Doppler Radar Data. Weather and Forecasting, 28, 194–211, doi:10.1175/WAF-D-11-00154.1.
The current tornado detection algorithm (TDA) used by the National Weather Service produces a large number of false detections, primarily because it calculates azimuthal shear in a manner that is adversely impacted by noisy velocity data and range-degraded velocity signatures. Coincident with the advent of new radar-derived products and ongoing research involving new weather radar systems, the National Severe Storms Laboratory is developing an improved TDA. A primary component of this algorithm is the local, linear least squares derivatives (LLSD) azimuthal shear field. The LLSD method incorporates rotational derivatives of the velocity field and is affected less strongly by noisy velocity data in comparison with traditional “peak to peak” azimuthal shear calculations. LLSD shear is generally less range dependent than peak-to-peak shear, although some range dependency is unavoidable. The relationship between range and the LLSD shear values of simulated circulations was examined to develop a range correction for LLSD shear. A linear regression and artificial neural networks (ANNs) were investigated as range-correction models. Both methods were used to produce fits for the simulated shear data, although the ANN excelled as it could capture the nonlinear nature of the data. The range-correction methods were applied to real radar data from tornadic and nontornadic events to measure the capacity of the corrected shear to discriminate between tornadic and nontornadic circulations. The findings presented herein suggest that both methods increased shear values during tornadic periods by nearly an order of magnitude, facilitating differentiation between tornadic and nontornadic scans in tornadic events.
2013: Long-term Evaluation of Temperature Profiles Measured by an Operational Raman Lidar. Journal of Atmospheric and Oceanic Technology, 30, 1616–1634, doi:10.1175/JTECH-D-12-00138.1.
This study investigates the accuracy and calibration stability of temperature profiles derived from an operational Raman lidar over a 2-yr period from 1 January 2009 to 31 December 2010. The lidar, which uses the rotational Raman technique for temperature measurement, is located at the U.S. Department of Energy’s Atmospheric Radiation Measurement site near Billings, Oklahoma. The lidar performance specifications, data processing algorithms, and the results of several test runs are described. Calibration and overlap correction of the lidar is achieved using simultaneous and collocated radiosonde measurements. Results show that the calibration coefficients exhibit no significant long-term or seasonal variation but do show a distinct diurnal variation. When the diurnal variation in the calibration is not resolved, the lidar temperature bias exhibits a significant diurnal variation. Test runs in which only nighttime radiosonde measurements are used for calibration show that the lidar exhibits a daytime warm bias that is correlated with the strength of the solar background signal. This bias, which reaches a maximum of ~2.4 K near solar noon, is reduced through the application of a correction scheme in which the calibration coefficients are parameterized in terms of the solar background signal. Comparison between the corrected lidar temperatures and the noncalibration radiosonde temperatures shows a negligibly small median bias of -0.013 K for altitudes below 10 km AGL. The corresponding root-mean-square difference profile is roughly constant at ~2 K below 6 km AGL and increases to about 4.5 K at 10 km AGL.
2013: Monitoring and Understanding Changes in Heat Waves, Cold Waves, Floods and Droughts in the United States: State of Knowledge. Bulletin of the American Meteorological Society, 94, 821–834, doi:10.1175/BAMS-D-12-00066.1.
Weather and climate extremes have been varying and changing on many different time scales. In recent decades, heat waves have generally become more frequent across the United States, while cold waves have been decreasing. While this is in keeping with expectations in a warming climate, it turns out that decadal variations in the number of U.S. heat and cold waves do not correlate well with the observed U.S. warming during the last century. Annual peak flow data reveal that river flooding trends on the century scale do not show uniform changes across the country. While flood magnitudes in the Southwest have been decreasing, flood magnitudes in the Northeast and north-central United States have been increasing. Confounding the analysis of trends in river flooding is multiyear and even multidecadal variability likely caused by both large-scale atmospheric circulation changes and basin-scale “memory” in the form of soil moisture. Droughts also have long-term trends as well as multiyear and decadal variability. Instrumental data indicate that the Dust Bowl of the 1930s and the drought in the 1950s were the most significant twentieth-century droughts in the United States, while tree ring data indicate that the megadroughts over the twelfth century exceeded anything in the twentieth century in both spatial extent and duration. The state of knowledge of the factors that cause heat waves, cold waves, floods, and drought to change is fairly good with heat waves being the best understood.
2012: 3DVAR versus Traditional Dual-Doppler Wind Retrievals of a Simulated Supercell Thunderstorm. Monthly Weather Review, 140, 3487–3494, doi:10.1175/MWR-D-12-00063.1.
Use of the three-dimensional variational data assimilation (3DVAR) framework in dual-Doppler wind analysis (DDA) offers several advantages over traditional techniques. Perhaps the most important is that the errors that result from explicit integration of the mass continuity equation in traditional methods are avoided. In this study, observing system simulation experiments (OSSEs) are used to compare supercell thunderstorm wind retrievals from a 3DVAR DDA technique and three traditional DDA methods. The 3DVAR technique produces better wind retrievals near the top of the storm than the traditional methods in the experiments. This is largely attributed to the occurrence of severe errors aloft in the traditional retrievals whether the continuity equation integration proceeds upward (due to vertically accumulating errors), downward (due to severe boundary condition errors arising from uncertainty in the horizontal divergence field aloft), or in both directions. Smaller, but statistically significant, improvement occurs near the ground using the 3DVAR method. When lack of upper-level observations prevents application of a top boundary condition in the traditional DDA framework, the 3DVAR approach produces better analyses at all levels. These results strongly suggest the 3DVAR DDA framework is generally preferable to traditional formulations.
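A one-dimensional toy shows why explicit continuity integration is fragile. Under a Boussinesq simplification (real retrievals use anelastic continuity with density weighting), w at a level is the running integral of horizontal divergence, so a small per-level divergence bias accumulates linearly with height. The numbers below are illustrative only.

```python
# Toy upward integration of dw/dz = -div_h: a constant divergence bias of
# 1e-4 s^-1 over 20 levels of 500 m grows into a spurious 1 m/s vertical
# velocity error at storm top (Boussinesq simplification).

def integrate_w_upward(divergence, dz, w0=0.0):
    """w at each level from a simple running sum of -div_h * dz."""
    w = [w0]
    for d in divergence:
        w.append(w[-1] - d * dz)
    return w

biased_divergence = [1e-4] * 20   # small bias at every level
w = integrate_w_upward(biased_divergence, dz=500.0)
```

A variational formulation avoids this by treating continuity as a constraint on the whole column rather than integrating it explicitly, which is the advantage the abstract describes.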
2012: Comparison between Dual-Doppler and EnKF Storm-Scale Wind Analyses: Observing System Simulation Experiments with a Supercell Thunderstorm. Monthly Weather Review, 140, 3972–3991, doi:10.1175/MWR-D-12-00044.1.
Kinematical analyses of mobile radar observations are critical to advancing the understanding of supercell thunderstorms. Maximizing the accuracy of these and subsequent dynamical analyses, and appropriately characterizing the uncertainty in ensuing conclusions about storm structure and processes, requires thorough knowledge of the typical errors obtained using different retrieval techniques. This study adopts an observing system simulation experiment (OSSE) framework to explore the errors obtained from ensemble Kalman filter (EnKF) assimilation versus dual-Doppler analysis (DDA) of storm-scale mobile radar data. The radar characteristics and EnKF model errors are varied to explore a range of plausible scenarios.
When dual-radar data are assimilated, the EnKF produces substantially better wind retrievals at higher altitudes, where DDAs are more sensitive to unaccounted flow evolution, and in data-sparse regions such as the storm inflow sector. Near the ground, however, the EnKF analyses are comparable to the DDAs when the radar cross-beam angles (CBAs) are poor, and slightly worse than the DDAs when the CBAs are optimal. In the single-radar case, the wind analyses benefit substantially from using finer grid spacing than in the dual-radar case for the objective analysis of radar observations. The analyses generally degrade when only single-radar data are assimilated, particularly when microphysical parameterization or low-level environmental wind errors are introduced. In some instances, this leads to large errors in low-level vorticity stretching and Lagrangian circulation calculations. Nevertheless, the results show that while multiradar observations of supercells are always preferable, judicious use of single-radar EnKF assimilation can yield useful analyses.
2013: Comparison between Dual-Doppler and EnKF Storm-Scale Wind Analyses: The 29–30 May 2004 Geary, Oklahoma, Supercell Thunderstorm. Monthly Weather Review, 141, 1612–1628, doi:10.1175/MWR-D-12-00308.1.
Kinematical analyses of storm-scale mobile radar observations are critical to advancing our understanding of supercell thunderstorms. Maximizing the accuracy of these analyses, and characterizing the uncertainty in ensuing conclusions about storm structure and processes, requires knowledge of the error characteristics of different retrieval techniques under different observational scenarios. Using storm-scale mobile radar observations of a tornadic supercell, this study examines the impacts on ensemble Kalman filter (EnKF) wind analyses of the number of available radars (one versus two), uncertainty in the model-initialization sounding, the sophistication of the microphysical parameterization scheme (double versus single moment), and assimilating reflectivity observations. The relative accuracy of three-dimensional variational data assimilation (3DVAR) dual-Doppler wind retrievals and single- and dual-radar EnKF wind analyses of the supercell is also explored. The results generally reinforce the findings of a previous study that used observing system simulation experiments to explore the same issues. Both studies suggest that single-radar EnKF wind analyses can be very useful once enough data have been assimilated, but that subsequent analyses that operate on the retrieved wind field gradients should be interpreted with caution. In the present study, severe errors appear to occur in computed Lagrangian circulation time series, imperiling interpretation of the underlying dynamics. This result strongly suggests that dual- and multiple-Doppler radar deployment strategies continue to be used in mobile field campaigns.
2013: Assessing Ensemble Forecasts of Low-Level Supercell Rotation within an OSSE Framework. Weather and Forecasting, 28, 940–960, doi:10.1175/WAF-D-12-00122.1.
Under the envisioned warn-on-forecast (WoF) paradigm, ensemble model guidance will play an increasingly critical role in the tornado warning process. While computational constraints will likely preclude explicit tornado prediction in initial WoF systems, real-time forecasts of low-level mesocyclone-scale rotation appear achievable within the next decade. Given that low-level mesocyclones are significantly more likely than higher-based mesocyclones to be tornadic, intensity and trajectory forecasts of low-level supercell rotation could provide valuable guidance to tornado warning and nowcasting operations. The efficacy of such forecasts is explored using three simulated supercells having weak, moderate, or strong low-level rotation. The results suggest early WoF systems may provide useful probabilistic 30–60-min forecasts of low-level supercell rotation, even in cases of large radar–storm distances and/or narrow cross-beam angles. Given the idealized nature of the experiments, however, they are best viewed as providing an upper-limit estimate of the accuracy of early WoF systems.
2013: A variational method for detecting and characterizing convective vortices in Cartesian wind fields. Monthly Weather Review, 141, 3102–3115, doi:10.1175/MWR-D-13-00015.1.
Vortex detection algorithms are required for both research and operational applications in which data volume precludes timely subjective examination of model or analysis fields. Unfortunately, objective detection of convective vortices is often hindered by the strength and complexity of the flow in which they are embedded. To address this problem, a variational vortex-fitting algorithm previously developed to detect and characterize vortices observed by Doppler radar has been modified to operate on gridded horizontal wind data. The latter are fit to a simple analytical model of a vortex and its proximate environment, allowing the retrieval of important vortex characteristics. This permits the development of detection criteria tied directly to vortex properties (e.g., maximum tangential wind), rather than to more general kinematical properties that may poorly represent the vortex itself (e.g., vertical vorticity) when the background flow is strongly sheared. Thus, the vortex characteristic estimates provided by the technique may permit more effective detection criteria while providing useful information about vortex size, intensity, and trends therein. In tests with two simulated supercells, the technique proficiently detects and characterizes vortices, even in the presence of complex flow. Sensitivity tests suggest the algorithm would work well for a variety of vortex sizes without additional tuning. Possible applications of the technique include investigating relationships between mesocyclone and tornado characteristics, and detecting tornadoes, mesocyclones, and mesovortices in real-time ensemble forecasts.
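The fit-to-an-analytical-model idea can be sketched in a few lines. The Rankine vortex plus uniform background wind used below is an illustrative assumption, not the paper's actual vortex model, and the brute-force candidate search stands in for the variational minimization:

```python
import numpy as np

def rankine_uv(x, y, xc, yc, vmax, rmax):
    """Wind components of a Rankine vortex centered at (xc, yc):
    solid-body rotation inside rmax, decaying tangential wind outside."""
    dx, dy = x - xc, y - yc
    r = np.hypot(dx, dy)
    r_safe = np.maximum(r, 1e-9)                      # avoid divide-by-zero at center
    vt = np.where(r <= rmax, vmax * r / rmax, vmax * rmax / r_safe)
    return -vt * dy / r_safe, vt * dx / r_safe        # (u, v)

def fit_vortex(x, y, u_obs, v_obs, centers, radii):
    """For each candidate center/radius, solve for the background wind (u0, v0)
    and vortex amplitude vmax by linear least squares; keep the best fit."""
    ones = np.ones(u_obs.size)
    zeros = np.zeros(u_obs.size)
    b = np.concatenate([u_obs.ravel(), v_obs.ravel()])
    best = None
    for xc, yc in centers:
        for rmax in radii:
            bu, bv = rankine_uv(x, y, xc, yc, 1.0, rmax)   # unit-amplitude basis
            A = np.vstack([np.column_stack([ones, zeros, bu.ravel()]),
                           np.column_stack([zeros, ones, bv.ravel()])])
            p, *_ = np.linalg.lstsq(A, b, rcond=None)      # p = [u0, v0, vmax]
            cost = float(np.sum((A @ p - b) ** 2))
            if best is None or cost < best[0]:
                best = (cost, xc, yc, rmax, p)
    return best

# Synthetic truth: 5 m/s westerly background plus a vortex (vmax=20, rmax=2 km).
x, y = np.meshgrid(np.linspace(-10, 10, 41), np.linspace(-10, 10, 41))
uv, vv = rankine_uv(x, y, 1.0, -1.0, 20.0, 2.0)
u_obs, v_obs = uv + 5.0, vv + 0.0
cost, xc, yc, rmax, p = fit_vortex(x, y, u_obs, v_obs,
                                   centers=[(0, 0), (1, -1), (2, 2)],
                                   radii=[1.0, 2.0, 3.0])
```

The retrieved amplitude `p[2]` is the maximum tangential wind, so detection thresholds can be tied directly to vortex intensity rather than to raw vorticity.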
2013: Evaluation of the Earth System Research Laboratory's global Observing System Simulation Experiment system. Tellus A, 65, doi:10.3402/tellusa.v65i0.19011.
An Observing System Simulation Experiment (OSSE) system has been implemented at the National Oceanic and Atmospheric Administration Earth System Research Laboratory in the US as part of an international Joint OSSE effort. The setup of the OSSE consists of a Nature Run from a 13-month free run of the European Centre for Medium-Range Weather Forecasts operational model, synthetic observations developed at the National Centers for Environmental Prediction (NCEP) and the National Aeronautics and Space Administration Global Modeling and Assimilation Office, and an operational version of the NCEP Gridpoint Statistical Interpolation data assimilation and Global Forecast System numerical weather prediction model. Synthetic observations included both conventional observations and the following radiance observations: AIRS, AMSU-A, AMSU-B, HIRS2, HIRS3, MSU, GOES radiance and OSBUV. Calibration was performed by modifying the error added to the conventional synthetic observations to achieve a match between data denial impacts on the analysis state in the OSSE system and in the real data system. Following calibration, the performance of the OSSE system was evaluated in terms of forecast skill scores and impact of observations on forecast fields.
2013: VPR correction of bright band effects in radar QPEs using polarimetric radar observations. Journal of Geophysical Research, 118, 3627–3633, doi:10.1002/jgrd.50364.
Vertical profile of reflectivity (VPR) correction of bright band (BB) effects has been a challenge for single-polarization radar quantitative precipitation estimations (QPEs) for mesoscale convective systems and for cool season stratiform precipitation when the freezing level is low. BB is often found in the radar observations of stratiform precipitation, and the inflated reflectivity intensities in the BB often cause positive biases in radar QPEs. A VPR correction is desirable to mitigate the BB contamination and reduce the bias. However, a well-defined BB bottom, while critical for an effective correction of the bias, is often not found in the VPRs. Fortunately, polarimetric radar variables, especially the copolar correlation coefficient (ρHV), can provide a much better depiction of vertical BB structure than does reflectivity. In the current study, an apparent vertical profile of ρHV (AVPρHV) correction scheme is developed. For each tilt of the radar volume scan data, the precipitation echoes are segregated into convective and stratiform regions. An apparent VPR (AVPR) and AVPρHV are computed for the stratiform region in the given tilt. Then the bright band top, peak, and bottom are identified from the AVPR and AVPρHV, and a linear VPR correction model is fit to the AVPR in the BB layer. VPR corrections are applied to the reflectivity field in the given tilt based on the linear correction model, and radar QPEs are derived from the corrected reflectivity field. The new AVPR and AVPρHV combined scheme was tested on two mesoscale convective system events and one cold season event in the United States and was shown to be more effective in reducing the radar QPE bias associated with the BB than the AVPR correction alone.
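As a toy illustration of the linear correction step, the sketch below subtracts a piecewise-linear brightband enhancement between identified BB top, peak, and bottom heights. The profile shape and all numbers are hypothetical; the actual scheme operates on apparent VPRs computed from volume-scan data:

```python
import numpy as np

def bb_correction(heights, vpr_db, bb_top, bb_peak, bb_bottom, ref_db):
    """Remove a piecewise-linear BB enhancement from a reflectivity profile.
    `ref_db` is the reference (rain-layer) reflectivity below the BB."""
    peak_val = vpr_db[np.argmin(np.abs(heights - bb_peak))]
    corr = vpr_db.copy()
    for i, h in enumerate(heights):
        if bb_bottom <= h <= bb_top:
            if h >= bb_peak:          # ramp from full weight at peak to 0 at BB top
                w = (bb_top - h) / (bb_top - bb_peak)
            else:                     # ramp from full weight at peak to 0 at BB bottom
                w = (h - bb_bottom) / (bb_peak - bb_bottom)
            corr[i] = vpr_db[i] - w * (peak_val - ref_db)
    return corr

# Hypothetical VPR: 30 dBZ rain layer with an 8-dB triangular BB bump at 2.5 km.
heights = np.arange(0.0, 5.25, 0.25)
vpr = np.full(heights.size, 30.0)
for i, h in enumerate(heights):
    if 2.0 <= h <= 3.0:
        vpr[i] += 8.0 * ((3.0 - h) / 0.5 if h >= 2.5 else (h - 2.0) / 0.5)

corr = bb_correction(heights, vpr, bb_top=3.0, bb_peak=2.5, bb_bottom=2.0, ref_db=30.0)
```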
2013: The Emergence of Weather-Related Test Beds Linking Research and Forecasting Operations. Bulletin of the American Meteorological Society, 94, 1187–1211, doi:10.1175/BAMS-D-12-00080.1.
Test beds have emerged as a critical mechanism linking weather research with forecasting operations. The U.S. Weather Research Program (USWRP) was formed in the 1990s to help identify key gaps in research related to major weather prediction problems and the role of observations and numerical models. This planning effort ultimately revealed the need for greater capacity and new approaches to improve the connectivity between the research and forecasting enterprise.
Out of this developed the seeds for what is now termed “test beds.” While many individual projects, and even more broadly the NOAA/National Weather Service (NWS) Modernization, were successful in advancing weather prediction services, it was recognized that specific forecast problems warranted a more focused and elevated level of effort. The USWRP helped develop these concepts with science teams and provided seed funding for several of the test beds described.
Based on the varying NOAA mission requirements for forecasting, differences in the organizational structure and methods used to provide those services, and differences in the state of the science related to those forecast challenges, test beds have taken on differing characteristics, strategies, and priorities. Current test bed efforts described have all emerged between 2000 and 2011 and focus on hurricanes (Joint Hurricane Testbed), precipitation (Hydrometeorology Testbed), satellite data assimilation (Joint Center for Satellite Data Assimilation), severe weather (Hazardous Weather Testbed), satellite data support for severe weather prediction (Short-Term Prediction Research and Transition Center), mesoscale modeling (Developmental Testbed Center), climate forecast products (Climate Testbed), testing and evaluation of satellite capabilities [Geostationary Operational Environmental Satellite-R Series (GOES-R) Proving Ground], aviation applications (Aviation Weather Testbed), and observing system experiments (OSSE Testbed).
2013: The Dependence of QPF on the Choice of Microphysical Parameterization for Lake-Effect Snowstorms. Journal of Applied Meteorology and Climatology, 52, 363–377, doi:10.1175/JAMC-D-12-019.1.
Several lake-effect-snow forecasts are compared to assess how the choice of microphysical parameterization affects quantitative precipitation forecasting (QPF). Eight different schemes, with different numbers of moments and categories of hydrometeors, are considered. Half of the schemes are in the steady regime (so named because the precipitation rates are nearly constant with time), and the remaining experiments are in the unsteady regime, which has a high temporal variation in precipitation. The steady-regime members have broader precipitation shields and 24-h accumulations that range from 43 to 50 mm. In the unsteady regime, the precipitation shields are narrower, leading to higher accumulations (ranging from 55 to 94 mm). These differences are the result of lower terminal velocities (Vt) in the steady regime, which allows for relofting or suspension of hydrometeors (assuming the vertical velocity is sufficiently large) and, hence, a longer in-cloud residence time and stronger downstream transport. In the six-category experiments, low Vt values in the steady regime occur in conjunction with a lower production of graupel, which is primarily due to less accretion of rain by snow. In the five-category experiments, differences are due to the way Vt is functionally dependent on environmental temperature and the degree of riming, with the steady regime having a more conservative relation. The steady regime compares better to available observations, although both have notable forecast errors.
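The residence-time argument can be made concrete with back-of-the-envelope arithmetic (all numbers hypothetical): a hydrometeor falling against an updraft descends at the net speed (Vt − w), so a lower Vt means a longer in-cloud residence time and farther downstream transport.

```python
def downstream_transport(depth_m, vt, w, wind):
    """Horizontal drift (km) of a hydrometeor falling through a cloud layer.
    Net fall speed is (vt - w); assumes vt > w so the particle descends."""
    residence_s = depth_m / (vt - w)        # time spent in cloud (s)
    return wind * residence_s / 1000.0      # downstream transport (km)

# Hypothetical layer: 3 km deep, 0.5 m/s updraft, 15 m/s horizontal wind.
steady = downstream_transport(depth_m=3000.0, vt=1.0, w=0.5, wind=15.0)    # low Vt
unsteady = downstream_transport(depth_m=3000.0, vt=2.0, w=0.5, wind=15.0)  # higher Vt
```

Halving the net fall speed triples the drift here, consistent with the broader precipitation shields of the steady-regime members.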
2012: Towards a space-time framework for integrated water and society studies. Bulletin of the American Meteorological Society, 93, ES89–ES91, doi:10.1175/BAMS-D-11-00226.1.
WATER AND SOCIETY: A SPACE–TIME FRAMEWORK FOR INTEGRATED STUDIES (WAS*IS WORKSHOP)
What: Seventeen early-career scientists, doctoral candidates, and operational partners from 10 nations met to develop a new integrated approach based on “scaling” to better understand the physical and social processes governing water resources.
When: 8–13 May 2011
Where: Les Houches, France
2013: Liquid Water Cloud Measurements Using the Raman Lidar Technique: Current Understanding and Future Research Needs. Journal of Atmospheric and Oceanic Technology, 30, 1337–1353, doi:10.1175/JTECH-D-12-00099.1.
This paper describes recent work in the Raman lidar liquid water cloud measurement technique. The range-resolved spectral measurements at the National Aeronautics and Space Administration Goddard Space Flight Center indicate that the Raman backscattering spectra measured in and below low clouds agree well with theoretical spectra for vapor and liquid water. The calibration coefficients of the liquid water measurement for the Raman lidar at the Atmospheric Radiation Measurement Program Southern Great Plains site of the U.S. Department of Energy were determined by comparison with the liquid water path (LWP) obtained with the Atmospheric Emitted Radiance Interferometer (AERI) and the liquid water content (LWC) obtained with the millimeter wavelength cloud radar and water vapor radiometer (MMCR–WVR) together. These comparisons were used to estimate the Raman liquid water cross-sectional value. The results indicate a bias consistent with an effective liquid water Raman cross-sectional value that is 28%–46% lower than published values, which may be explained by the fact that the difference in the detectors' sensitivity has not been accounted for. The LWP of a thin altostratus cloud showed good qualitative agreement between lidar retrievals and AERI. However, the overall ensemble of comparisons of LWP showed considerable scatter, possibly because of the different fields of view of the instruments, the 350-m distance between the instruments, and the horizontal inhomogeneity of the clouds. The LWC profiles for a thick stratus cloud showed agreement between lidar retrievals and MMCR–WVR between the cloud base and 150 m above it, where the optical depth was less than 3. Areas requiring further research in this technique are discussed.
2013: Factors influencing the development and maintenance of nocturnal heavy-rain-producing convective systems in a storm-scale ensemble. Monthly Weather Review, 141, 2778–2801.
From 9 to 11 June 2010, a mesoscale convective vortex (MCV) was associated with several periods of heavy rainfall that led to flash flooding. During the overnight hours, mesoscale convective systems (MCSs) developed that moved slowly and produced heavy rainfall over small areas in south-central Texas on 9 June, north Texas on 10 June, and western Arkansas on 11 June. In this study, forecasts of this event from the Center for Analysis and Prediction of Storms' Storm-Scale Ensemble Forecast system are examined. This ensemble, with 26 members at 4-km horizontal grid spacing, included a few members that very accurately predicted the development, maintenance, and evolution of the heavy-rain-producing MCSs, along with a majority of members that had substantial errors in their precipitation forecasts. The processes favorable for the initiation, organization, and maintenance of these heavy-rain-producing MCSs are diagnosed by comparing ensemble members with accurate and inaccurate forecasts. Even within a synoptic environment known to be conducive to extreme local rainfall, there was considerable spread in the ensemble's rainfall predictions. Because all ensemble members included an anomalously moist environment, the precipitation predictions were insensitive to the atmospheric moisture. However, the development of heavy precipitation overnight was very sensitive to the intensity and evolution of convection the previous day. Convective influences on the strength of the MCV and its associated dome of cold air at low levels determined whether subsequent deep convection was initiated and maintained. In all, this ensemble provides quantitative and qualitative information about the mesoscale processes that are most favorable (or unfavorable) for localized extreme rainfall.
2013: Two-Dimensional Variational Analysis of Near-Surface Moisture from Simulated Radar Refractivity-Related Phase Change Observations. Advances in Atmospheric Sciences, 30, 291–305, doi:10.1007/s00376-012-2087-7.
Because they are most sensitive to atmospheric moisture content, radar refractivity observations can provide high-resolution information about the highly variable low-level moisture field. In this study, simulated radar refractivity-related phase-change data were created using a radar simulator from realistic high-resolution model simulation data for a dryline case. These data were analyzed using the 2DVAR system developed specifically for the phase-change data. Two sets of experiments with the simulated observations were performed, one assuming a uniform target spacing of 250 m and one assuming nonuniform spacing between 250 m and 4 km. Several sources of observation error were considered, and their impacts were examined. They included errors due to ground target position uncertainty, typical random errors associated with radar measurements, and gross error due to phase wrapping. Without any additional information, the 2DVAR system was incapable of dealing with phase-wrapped data directly. When there was no phase wrapping in the data, the 2DVAR produced excellent analyses, even in the presence of both position uncertainty and random radar measurement errors. When a separate pre-processing step was applied to unwrap the phase-wrapped data, quality moisture analyses were again obtained, although the analyses were smoother due to the reduced effective resolution of the observations caused by the interpolation and smoothing involved in the unwrapping procedure. The unwrapping procedure was effective even when significant differences existed between the analyzed state and the state at a reference time. The results affirm the promise of using radar refractivity phase-change measurements for near-surface moisture analysis.
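The phase-wrapping gross error, and why an unwrapping pre-processing step recovers usable data, can be illustrated in one dimension with NumPy's generic unwrapper. This is not the paper's 2DVAR pre-processor, and the smooth phase profile below is hypothetical:

```python
import numpy as np

# True refractivity-related phase grows smoothly along a (hypothetical) radar ray.
rng_km = np.linspace(0.0, 10.0, 200)            # range (km)
true_phase = 0.15 * rng_km ** 2                 # radians, exceeds 2*pi downrange

# The radar measures phase only modulo 2*pi: a gross, non-Gaussian error.
wrapped = np.angle(np.exp(1j * true_phase))     # folded into (-pi, pi]

# Unwrapping restores continuity as long as sample-to-sample jumps stay below pi.
unwrapped = np.unwrap(wrapped)
```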
2013: High and Dry: New Observations of Tropospheric and Cloud Properties above the Greenland Ice Sheet. Bulletin of the American Meteorological Society, 94, 169–186, doi:10.1175/BAMS-D-11-00249.1.
Cloud and atmospheric properties impart strong influences on the mass and energy budgets of the Greenland Ice Sheet (GIS). To address critical gaps in our understanding of these systems, a new suite of cloud- and atmosphere-observing instruments has been installed on the central GIS as part of the Integrated Characterization of Energy, Clouds, Atmospheric state, and Precipitation at Summit (ICECAPS) project. During the first 20 months in operation, this complementary suite of active and passive, ground-based sensors and radiosondes has provided new and unique perspectives on important cloud-atmosphere properties.
High atop the GIS, the atmosphere is extremely dry and cold with strong near-surface static stability predominating throughout the year, particularly in winter. This low-level thermodynamic structure, coupled with frequent moisture inversions, conveys the importance of advection for local cloud and precipitation formation. Cloud liquid water is observed in all months of the year, even the particularly cold and dry winter, while annual cycle observations indicate the largest atmospheric moisture amounts, cloud water contents, and snowfall occur in summer and under southwesterly flow. Many of the basic structural properties of clouds observed at Summit, particularly for low-level stratiform clouds, are similar to their counterparts in other Arctic regions.
The ICECAPS observations and accompanying analyses will be used to improve our understanding of key cloud-atmosphere processes and the manner in which they interact with the GIS. Furthermore, they will facilitate model evaluation and development in this data-sparse, yet environmentally unique, region.
2013: A Satellite-Based Convective Cloud Object Tracking and Multipurpose Data Fusion Tool with Application to Developing Convection. Journal of Atmospheric and Oceanic Technology, 30, 510–525, doi:10.1175/JTECH-D-12-00114.1.
Studying deep convective clouds requires the use of available observation platforms with high temporal and spatial resolution, as well as other non–remote sensing meteorological data (i.e., numerical weather prediction model output, conventional observations, etc.). Such data are often at different temporal and spatial resolutions, and consequently, there exists the need to fuse these different meteorological datasets into a single framework. This paper introduces a methodology to identify and track convective cloud objects from convective cloud infancy [as few as three Geostationary Operational Environmental Satellite (GOES) infrared (IR) pixels] into the mature phase (hundreds of GOES IR pixels) using only geostationary imager IR window observations for the purpose of monitoring the initial growth of convective clouds.
The object tracking system described within builds upon the Warning Decision Support System-Integrated Information (WDSS-II) object tracking capabilities. The system uses an IR-window-based field as input to WDSS-II for cloud object identification and tracking and a Cooperative Institute for Meteorological Satellite Studies at the University of Wisconsin (UW-CIMSS)-developed postprocessing algorithm to combine WDSS-II cloud object output. The final output of the system is used to fuse multiple meteorological datasets into a single cloud object framework. The object tracking system performance analysis shows improved object tracking performance with both increased temporal resolution of the geostationary data and increased cloud object size. The system output is demonstrated as an effective means for fusing a variety of meteorological data, including raw satellite observations, satellite algorithm output, radar observations and derived output, numerical weather prediction model output, and lightning detection data, for studying the initial growth of deep convective clouds and temporal trends of such data.
2012: Convective modes for significant severe thunderstorms in the contiguous United States. Part I: Storm classification and climatology. Weather and Forecasting, 27, 1114–1135, doi:10.1175/WAF-D-11-00115.1.
Radar-based convective modes were assigned to a sample of tornadoes and significant severe thunderstorms reported in the contiguous United States (CONUS) during 2003–11. The significant hail (≥2-in. diameter), significant wind (≥65-kt thunderstorm gusts), and tornadoes were filtered by the maximum event magnitude per hour on a 40-km Rapid Update Cycle model horizontal grid. The filtering process produced 22 901 tornado and significant severe thunderstorm events, representing 78.5% of all such reports in the CONUS during the sample period. The convective mode scheme presented herein begins with three radar-based storm categories: 1) discrete cells, 2) clusters of cells, and 3) quasi-linear convective systems (QLCSs). Volumetric radar data were examined for right-moving supercell (RM) and left-moving supercell characteristics within the three radar reflectivity designations. Additional categories included storms with marginal supercell characteristics and linear hybrids with a mix of supercell and QLCS structures. Smoothed kernel density estimates of events per decade revealed clear geographic and seasonal patterns of convective modes with tornadoes. Discrete and cluster RMs are the favored convective mode with southern Great Plains tornadoes during the spring, while the Deep South displayed the greatest variability in tornadic convective modes in the fall, winter, and spring. The Ohio Valley favored a higher frequency of QLCS tornadoes and a lower frequency of RM compared to the Deep South and the Great Plains. Tornadoes with nonsupercellular/non-QLCS storms were more common across Florida and the high plains in the summer. Significant hail events were dominated by Great Plains supercells, while variations in convective modes were largest for significant wind events.
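The smoothed kernel density estimate used to map events per decade is conceptually simple. The sketch below assumes a Gaussian kernel and uses hypothetical event coordinates; the paper's kernel choice and bandwidth may differ:

```python
import numpy as np

def gaussian_kde2d(lons, lats, grid_lon, grid_lat, bandwidth):
    """Smoothed density of event locations on a lon/lat grid (Gaussian kernel)."""
    glon, glat = np.meshgrid(grid_lon, grid_lat)     # rows = lat, cols = lon
    density = np.zeros_like(glon, dtype=float)
    for x, y in zip(lons, lats):
        density += np.exp(-((glon - x) ** 2 + (glat - y) ** 2) / (2 * bandwidth ** 2))
    return density / (2 * np.pi * bandwidth ** 2 * len(lons))   # normalize to a pdf

# Hypothetical tornado events clustered near (-97E, 35N), southern Great Plains.
rng = np.random.default_rng(0)
lons = -97.0 + 0.5 * rng.standard_normal(200)
lats = 35.0 + 0.5 * rng.standard_normal(200)
dens = gaussian_kde2d(lons, lats,
                      np.linspace(-100, -94, 61), np.linspace(32, 38, 61),
                      bandwidth=0.5)
```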
2013: Progress and challenges with Warn-on-Forecast. Atmospheric Research, 123, 2–16, doi:10.1016/j.atmosres.2012.04.004.
The current status and challenges associated with two aspects of Warn-on-Forecast (a National Oceanic and Atmospheric Administration research project exploring the use of a convective-scale ensemble analysis and forecast system to support hazardous weather warning operations) are outlined. These two project aspects are the production of a rapidly updating assimilation system to incorporate data from multiple radars into a single analysis, and the ability of short-range ensemble forecasts of hazardous convective weather events to provide guidance that could be used to extend warning lead times for tornadoes, hailstorms, damaging windstorms, and flash floods. Results indicate that a three-dimensional variational assimilation system that blends observations from multiple radars into a single analysis shows utility when evaluated by forecasters in the Hazardous Weather Testbed and may help increase confidence in a warning decision. The ability of short-range convective-scale ensemble forecasts to provide guidance that could be used in warning operations is explored for five events: two tornadic supercell thunderstorms, a macroburst, a damaging windstorm, and a flash flood. Results show that the ensemble forecasts of the three individual severe thunderstorm events are very good, while the forecasts from the damaging windstorm and flash flood events, associated with mesoscale convective systems, are mixed. Important interactions between mesoscale and convective-scale features occur for the mesoscale convective system events that strongly influence the quality of the convective-scale forecasts. The development of a successful Warn-on-Forecast system will take many years and require the collaborative efforts of researchers and operational forecasters to succeed.
2013: Upscale Effects of Deep Convection during the North American Monsoon. Journal of the Atmospheric Sciences, 70, 2681–2695, doi:10.1175/JAS-D-13-063.1.
The ability of deep monsoon convection to influence the larger-scale circulation over North America is investigated for a 6-day-long case study during the 2006 North American monsoon. Results from Rossby wave ray tracing and numerical simulations using the Advanced Research Weather Research and Forecasting model indicate that North American monsoon convection provides a source region for stationary Rossby waves. Two wave trains are seen in the numerical model simulations, with behaviors that agree well with expectations from theory and ray tracing. The shorter and faster-moving wave train moves eastward from the source region in Mexico and reaches the western Atlantic within 4 days. The longer and slower-moving wave train travels northeastward and reaches the coastal New England region within 6 days. An upstream tail of anticyclonic vorticity extends westward from the source region into the central Pacific Ocean.
The monsoon convection appears to help cut off the low-level anticyclonic flow by developing low-level southerly flow in the Gulf of Mexico and northerly flow in the eastern Pacific, as suggested in earlier global model studies. However, the stationary Rossby wave trains further alter the location and intensity of deep convection in locations remote from the monsoon. These results suggest that unless a numerical model can correctly predict monsoon convection, the ability of the model to produce accurate forecasts of the large-scale pattern and associated convective activity beyond a few days is in question. This result may be important for global climate modeling, since an inaccurate prediction of monsoon convection would lead to an inaccurate Rossby wave response.
2013: Use of Multiple Verification Methods to Evaluate Forecasts of Convection from Hot- and Cold-Start Convection-Allowing Models. Weather and Forecasting, 28, 119–138, doi:10.1175/WAF-D-12-00022.1.
This study uses both traditional and newer verification methods to evaluate two 4-km grid-spacing Weather Research and Forecasting Model (WRF) forecasts: a “cold start” forecast that uses the 12-km North American Mesoscale Model (NAM) analysis and forecast cycle to derive the initial and boundary conditions (C0) and a “hot start” forecast that adds radar data into the initial conditions using a three-dimensional variational data assimilation (3DVAR)/cloud analysis technique (CN). These forecasts were evaluated as part of the 2009 and 2010 NOAA Hazardous Weather Testbed (HWT) Spring Forecasting Experiments. The Spring Forecasting Experiment participants noted that the skill of CN’s explicit forecasts of convection estimated by some traditional objective metrics often seemed large compared to the subjectively determined skill. The Gilbert skill score (GSS) reveals CN scores higher than C0 at lower thresholds likely due to CN having higher-frequency biases than C0, but the difference is negligible at higher thresholds, where CN’s and C0’s frequency biases are similar. This suggests that if traditional skill scores are used to quantify convective forecasts, then higher (>35 dBZ) reflectivity thresholds should be used to be consistent with experts’ subjective assessments of the lack of forecast skill for individual convective cells. The spatial verification methods show that both CN and C0 generally have little to no skill at scales <8–12Δx starting at forecast hour 1, but CN has more skill at larger spatial scales (40–320 km) than C0 for the majority of the forecasting period. This indicates that the hot start provides little to no benefit for forecasts of convective cells, but that it has some benefit for larger mesoscale precipitation systems.
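For reference, the Gilbert skill score (equitable threat score) and the frequency bias follow from a standard 2×2 contingency table. The definitions below are the standard ones; the counts are hypothetical:

```python
def contingency_scores(hits, misses, false_alarms, correct_negatives):
    """Gilbert skill score (equitable threat score) and frequency bias
    from a 2x2 forecast/observation contingency table."""
    n = hits + misses + false_alarms + correct_negatives
    # Hits expected from a random forecast with the same marginal totals:
    hits_random = (hits + misses) * (hits + false_alarms) / n
    gss = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
    bias = (hits + false_alarms) / (hits + misses)    # >1 means over-forecasting
    return gss, bias

# A high-bias forecast can still post a respectable GSS at a low threshold:
gss, bias = contingency_scores(hits=30, misses=10, false_alarms=60,
                               correct_negatives=900)
```

Here the forecast over-predicts event area by a factor of 2.25 yet still scores a GSS near 0.27, which mirrors how a high-frequency-bias forecast can look skillful by this metric.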
2013: Assimilation of high-resolution, mobile Doppler radar data into EnKF analyses of the 4 May 2007 Greensburg, Kansas supercell storm. Monthly Weather Review, 141, 625–648, doi:10.1175/MWR-D-12-00099.1.
Mobile Doppler radar data, along with observations from a nearby Weather Surveillance Radar-1988 Doppler (WSR-88D), are assimilated with an ensemble Kalman filter (EnKF) technique into a nonhydrostatic, compressible numerical weather prediction model to analyze the evolution of the 4 May 2007 Greensburg, Kansas, tornadic supercell. The storm is simulated via assimilation of reflectivity and velocity data in an initially horizontally homogeneous environment whose parameters are believed to be a close approximation to those of the Greensburg supercell inflow sector. Experiments are conducted to test analysis sensitivity to mobile radar data availability and to the mean environmental near-surface wind profile, which was changing rapidly during the simulation period. In all experiments, a supercell with similar location and evolution to the observed storm is analyzed, but the simulated storm’s characteristics differ markedly. The assimilation of mobile Doppler radar data has a much greater impact on the resulting analyses, particularly at low altitudes (≤2 km), than modifications to the near-surface environmental wind profile. Differences in the analyzed updrafts, vortices, cold pool structure, rear-flank gust front structure, and observation-space diagnostics are documented. An analyzed vortex corresponding to the enhanced Fujita scale 5 (EF-5) Greensburg tornado is stronger and deeper in experiments in which mobile (higher resolution) Doppler radar data are included in the assimilation. This difference is linked to stronger analyzed horizontal convergence, which in turn is associated with increased stretching of vertical vorticity. Changing the near-surface wind profile appears to impact primarily the updraft strength, availability of streamwise vorticity for tilting into the vertical, and low-level vortex strength and longevity.
2013: Macrophysical properties of tropical cirrus clouds from the CALIPSO satellite and from ground-based micropulse and Raman lidars. Journal of Geophysical Research, 118, 1–12, doi:10.1002/jgrd.50691.
Lidar observations of cirrus cloud macrophysical properties over the U.S. Department of Energy Atmospheric Radiation Measurement (ARM) program Darwin, Australia, site are compared from the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellite, the ground-based ARM micropulse lidar (MPL), and the ARM Raman lidar (RL). Comparisons are made using the subset of profiles where the lidar beam is not fully attenuated. Daytime measurements using the RL are shown to be relatively unaffected by the solar background and are therefore suited for checking the validity of diurnal cycles. RL and CALIPSO cloud fraction profiles show good agreement while the MPL detects significantly less cirrus, particularly during the daytime. Both MPL and CALIPSO observations show that cirrus clouds occur less frequently during the day than at night at all altitudes. In contrast, the RL diurnal cycle is significantly different from zero only below about 11 km; where it is of opposite sign (i.e., more clouds during the daytime). For cirrus geometrical thickness, the MPL and CALIPSO observations agree well and both data sets have significantly thinner clouds during the daytime than the RL. From the examination of hourly MPL and RL cirrus cloud thickness and through the application of daytime detection limits to all CALIPSO data, we find that the decreased MPL and CALIPSO cloud thickness during the daytime is very likely a result of increased daytime noise. This study highlights the significant improvement the RL provides (compared to the MPL) in the ARM program’s ability to observe tropical cirrus clouds and will help improve our understanding of these clouds. The RL also provides a valuable ground-based lidar data set for the evaluation of CALIPSO observations.
2013: The DTC ensembles task: A new testing and evaluation facility for mesoscale ensembles. Bulletin of the American Meteorological Society, 94, 321–327, doi:10.1175/BAMS-D-11-00209.1.
(In Box article, Insights and Innovations; no abstract available.)
2013: The importance of accurately measuring the range correlation for range oversampling processing. Journal of Atmospheric and Oceanic Technology, 30, 261–273, doi:10.1175/JTECH-D-12-00085.1.
A fundamental assumption for the application of range-oversampling techniques is that the correlation of oversampled signals in range is accurately known. In this paper, we derive a theoretical framework to quantify the effects of inaccurate range-correlation measurements on the performance of such techniques, which include digital matched filtering and those based on decorrelation (whitening) transformations. It is demonstrated that significant reflectivity biases and increased variance of estimates can occur if the range correlation is not accurately measured. Simulations and real data are used to validate the theoretical results and to illustrate the detrimental effects of mismeasurements. Results from this work underline the need for reliable calibration in the context of range-oversampling processing, and can be used to establish appropriate accuracy requirements for the measurement of the range correlation on modern weather radars.
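The decorrelation (whitening) transformation mentioned above can be illustrated with a minimal sketch. This is not the paper's implementation: the range-correlation matrix below is an idealized triangular (matched-pulse) shape for 4x oversampling, invented for illustration, whereas the paper's point is precisely that the real matrix must be measured accurately.

```python
import numpy as np

def whitening_matrix(C):
    """Inverse Cholesky factor W, so that W @ C @ W.conj().T is the identity."""
    L = np.linalg.cholesky(C)     # C = L @ L^H
    return np.linalg.inv(L)

# Hypothetical range correlation for 4x oversampling with a triangular
# correlation shape -- an idealization, not a measured matrix.
n = 4
C = np.array([[max(0.0, 1.0 - abs(i - j) / n) for j in range(n)] for i in range(n)])
W = whitening_matrix(C)
print(np.allclose(W @ C @ W.conj().T, np.eye(n)))  # → True
```

If the measured correlation used to build `C` is biased, `W` no longer whitens the true signal correlation, which is the mechanism behind the reflectivity biases and increased estimate variance the paper quantifies.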
2013: Regional Characterization of Tornado Activity. Journal of Applied Meteorology and Climatology, 52, 654–659, doi:10.1175/JAMC-D-12-0173.1.
In the United States, tornado activity of a given year is usually assessed in terms of the total number of human-reported tornadoes. Such assessments fail to account for the seldom-acknowledged fact that an active (or inactive) tornado year for the United States does not necessarily equate with activity (or inactivity) everywhere in the country. The authors illustrate this by comparing the geospatial tornado distributions from 1987, 2004, and 2011. Quantified in terms of the frequency of daily tornado occurrence (or “tornado days”), the high activity in the South Atlantic and upper Midwest regions was a major contributor to the record-setting number of tornadoes in 2004. The high activity in 2011 arose from significant tornado occurrences in the Southeast and lower Midwest. The authors also show that the uniqueness of the activity during these years can be determined by modeling the local statistical behavior of tornado days by a gamma distribution.
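Modeling the local statistical behavior of tornado days with a gamma distribution, as described above, can be sketched with a method-of-moments fit. The annual counts below are invented for illustration and are not data from the study.

```python
import numpy as np

# Hypothetical annual tornado-day counts for one region (illustrative only).
days = np.array([38, 45, 52, 41, 60, 33, 47, 55, 44, 50], dtype=float)

# Method-of-moments fit of a gamma distribution:
#   mean = k * theta,  variance = k * theta**2
mean, var = days.mean(), days.var(ddof=1)
theta = var / mean          # scale parameter
k = mean / theta            # shape parameter

# A year's activity can then be judged against the fitted distribution
# rather than against the raw national total.
```

With the fitted shape and scale in hand, the tail probability of a given year's count under the gamma model indicates how unusual that year's regional activity was.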
2012: The far-infrared: Focusing on a relatively underexplored portion of the electromagnetic spectrum. Bulletin of the American Meteorological Society, 93, ES103–ES104, doi:10.1175/BAMS-D-11-00007.1.
The 2011 Workshop on Far-Infrared Remote Sensing convened thirty scientists from four nations to survey the state of the science in measuring and modeling the Earth’s spectral radiance in the far infrared (15–100 μm).
2013: Ground-based Remote Retrievals of Cumulus Entrainment Rates. Journal of Atmospheric and Oceanic Technology, 30, 1460–1471, doi:10.1175/JTECH-D-12-00187.1.
While fractional entrainment rates for cumulus clouds have typically been derived from airborne observations, this limits the size and scope of available datasets. To increase the number of continental cumulus entrainment rate observations available for study, an algorithm for retrieving them from ground-based remote sensing observations has been developed. This algorithm, called the Entrainment Rate In Cumulus Algorithm (ERICA), uses the suite of instruments at the Southern Great Plains (SGP) site of the U.S. Department of Energy’s Atmospheric Radiation Measurement Program (ARM) Climate Research Facility as inputs into a Gauss-Newton optimal estimation scheme, in which an assumed guess of the entrainment rate is iteratively adjusted through intercomparison of modeled cloud attributes to their observed counterparts. The forward model in this algorithm is the explicit mixing parcel model (EMPM), a cloud parcel model that treats entrainment as a series of discrete entrainment events. A quantified value for the uncertainty in the retrieved entrainment rate is also returned as part of the retrieval. Sensitivity testing and information content analysis demonstrate the robust nature of this method for retrieving accurate observations of the entrainment rate without the drawbacks of airborne sampling. Results from a test of ERICA on 3 months of shallow cumulus cloud events show significant variability of the entrainment rate of clouds in a single day and from one day to the next. The mean value of 1.06 km^(-1) for the entrainment rate in this dataset corresponds well with prior observations and simulations of the entrainment rate in cumulus clouds.
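The Gauss-Newton iteration at the heart of such a retrieval can be sketched on a toy problem. Everything here is hypothetical: the forward model (exponential dilution of a conserved tracer with height), the heights, and the first guess are stand-ins, not EMPM or ERICA; only the iterative scheme itself is illustrated.

```python
import numpy as np

def forward(lam, z, q0=1.0):
    """Toy forward model: tracer diluted by entrainment rate lam (km^-1)."""
    return q0 * np.exp(-lam * z)

z = np.array([0.5, 1.0, 1.5, 2.0])   # heights above cloud base (km), hypothetical
true_lam = 1.06                      # mirrors the dataset-mean value quoted above
y = forward(true_lam, z)             # noise-free synthetic "observations"

lam = 0.3                            # first guess
for _ in range(20):
    r = y - forward(lam, z)          # residual between observed and modeled attributes
    J = -z * forward(lam, z)         # Jacobian dF/dlam
    lam += (J @ r) / (J @ J)         # Gauss-Newton update
print(round(lam, 2))  # → 1.06
```

In the real algorithm the update also carries observation-error covariances, which is what yields the quantified uncertainty returned with each retrieved entrainment rate.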
2013: C-Band Polarimetric Radar QPE Based on Specific Differential Propagation Phase for Extreme Typhoon Rainfall. Journal of Atmospheric and Oceanic Technology, 30, 1354–1370, doi:10.1175/JTECH-D-12-00083.1.
To obtain accurate radar quantitative precipitation estimation (QPE) for extreme rainfall events such as land-falling typhoon systems in complex terrain, a new method was developed for C-band polarimetric radars. The new methodology includes a correction method based on vertical profiles of the specific differential propagation phase (VPSDP) for low-level blockage and an optimal relation between rainfall rate (R) and the specific differential phase (KDP). In the VPSDP-based correction approach, a screening process is applied to KDP fields, where missing or unreliable data from lower tilts caused by severe beam blockage are replaced with those from upper and unblocked tilts. The KDP data from upper tilts are adjusted to account for variations in the vertical profile of KDP. The corrected KDP field is then used for rain-rate estimations. To acquire an accurate QPE result, a new R(KDP) relation for C-band polarimetric radars was derived through simulations using drop size distribution (DSD) and drop shape relation (DSR) observations from typhoon systems in Taiwan. The VPSDP-based correction method with the new R(KDP) relation was evaluated using the typhoon cases of Morakot (2009) and Fanapi (2010).
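An R(KDP) relation of the kind derived in the study takes a power-law form. The coefficients below are generic placeholders of roughly the right magnitude for C band, not the typhoon-specific values the paper derives from Taiwan DSD/DSR observations.

```python
# Generic rain-rate estimator R = a * KDP**b.  The coefficients a and b are
# illustrative placeholders, NOT the paper's typhoon-derived values.
def rain_rate_kdp(kdp, a=25.1, b=0.777):
    """Rain rate (mm/h) from specific differential phase KDP (deg/km)."""
    return a * max(kdp, 0.0) ** b    # negative KDP (noise) treated as no rain

print(round(rain_rate_kdp(2.0), 1))  # → 43.0
```

Because KDP is immune to beam blockage and radar miscalibration, an R(KDP) estimator of this form is well suited to the blocked low-level tilts the VPSDP correction addresses.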
2013: Thermodynamic and liquid profiling during the 2010 Winter Olympics. Atmospheric Research, 132–133, 278–290.
Tropospheric observations by a microwave profiling radiometer and six-hour radiosondes were obtained during the Alpine Venue of the 2010 Winter Olympic Games at Whistler, British Columbia, by Environment Canada. The radiometer provided continuous temperature, humidity and liquid (water) profiles during all weather conditions including rain, sleet and snow. Gridded analysis was provided by the U.S. National Oceanic and Atmospheric Administration. We compare more than two weeks of radiometer neural network and radiosonde temperature and humidity soundings including clear and precipitating conditions. Corresponding radiometer liquid and radiosonde wind soundings are shown. Close correlation is evident between radiometer and radiosonde temperature and humidity profiles up to 10 km height and among southwest winds, liquid water and upper level thermodynamics, consistent with up-valley advection and condensation of moist maritime air. We compare brightness temperatures observed by the radiometer and forward-modeled from radiosonde and gridded analysis. Radiosonde-equivalent observation accuracy is demonstrated for radiometer neural network temperature and humidity retrievals up to 800 m height and for variational retrievals that combine radiometer and gridded analysis up to 10 km height.
2013: Incorporating NASA spaceborne radar data into NOAA National Mosaic QPE system for improved precipitation measurement: A physically based VPR identification and enhancement method. Journal of Hydrometeorology, 14, 1293–1307, doi:10.1175/JHM-D-12-0106.1.
This study proposes an approach that identifies and corrects for the vertical profile of reflectivity (VPR) by using Tropical Rainfall Measuring Mission (TRMM) precipitation radar (PR) measurements in the region of Arizona and southern California, where the ground-based Next Generation Weather Radar (NEXRAD) finds difficulties in making reliable estimations of surface precipitation amounts because of complex terrain and limited radar coverage. A VPR identification and enhancement (VPR-IE) method based on the modeling of the vertical variations of the equivalent reflectivity factor using a physically based parameterization is employed to obtain a representative VPR at S band from the TRMM PR measurement at Ku band. Then the representative VPR is convolved with ground radar beam sampling properties to compute apparent VPRs for enhancing NEXRAD quantitative precipitation estimation (QPE). The VPR-IE methodology is evaluated with several stratiform precipitation events during the cold season and is compared to two other statistically based correction methods, that is, the TRMM PR–based rainfall calibration and a range ring–based adjustment scheme. The results show that the VPR-IE has the best overall performance and provides much more accurate surface rainfall estimates than the original ground-based radar QPE. The potential of the VPR-IE method could be further exploited and better utilized when the Global Precipitation Measurement Mission's dual-frequency PR is launched in 2014, with anticipated accuracy improvements and expanded latitude coverage.
Keywords: Precipitation, Rainfall, Radars/Radar observations, Remote sensing, Satellite observations
2013: A new parametric tropical cyclone tangential wind profile model. Monthly Weather Review, 141, 1884–1909, doi:10.1175/MWR-D-12-00115.1.
A new parametric tropical cyclone (TC) wind profile model is presented for depicting representative surface pressure profiles corresponding to multiple-maxima wind profiles that exhibit single-, dual- and triple-maximum concentric-eyewall wind peaks associated with the primary (inner), secondary (first outer) and tertiary (second outer) complete rings of enhanced radar reflectivity. Each profile employs five key parameters: the tangential velocity maximum, the radius of that maximum, and three shape velocity parameters that control different portions of the profile. After tailoring the model for TC applications, a gradient wind is computed from a cyclostrophic wind formulated in terms of the cyclostrophic Rossby number. The pressure, via cyclostrophic balance, is partitioned into separate pressure components that correspond to multiple-maxima cyclostrophic wind profiles in order to quantitatively evaluate the significant fluctuations in central pressure deficits. The model TC intensity in terms of varying growth, size and decay velocity profiles is analyzed in relation to changes in each of the five key parameters. Analytical results show that the first shape velocity parameter, which changes a sharply peaked wind profile to a broadly peaked one, increases the TC intensity and size by producing the corresponding central pressure fall. An increase (decrease) in the second (third) shape velocity parameter yields a pressure rise (fall) by decreasing (increasing) the inner (outer) wind profile inside (outside) the radius of the maximum. When a single-maximum tangential wind profile evolves into multiple-maxima tangential wind profiles during an eyewall replacement cycle, the central pressure falls and rises fluctuate markedly.
2013: A parametric wind-pressure relationship for Rankine versus non-Rankine cyclostrophic vortices. Journal of Atmospheric and Oceanic Technology, 30, 2850–2867, doi:10.1175/JTECH-D-13-00041.1.
A parametric tangential wind profile model is presented for depicting representative pressure deficit profiles corresponding to varying tangential wind profiles of a cyclostrophic, axisymmetric vortex. The model employs five key parameters per wind profile: tangential velocity maximum, radius of the maximum, and three shape parameters that control different portions of the profile. The model coupled with the cyclostrophic balance assumption offers a diagnostic tool for estimating and examining a radial profile of pressure deficit deduced from a theoretical superimposing tangential wind profile in the vortex. Analytical results show that the shape parameters for a given tangential wind maximum of a non-Rankine vortex have an important modulating influence on the behavior of realistic tangential wind and corresponding pressure deficit profiles. The first parameter designed for changing the wind profile from sharply to broadly peaked produces the corresponding central pressure fall. An increase in the second (third) parameter yields the pressure rise by lowering the inner (outer) wind profile inside (outside) the radius of the maximum. Compared to the Rankine vortex, the parametrically constructed non-Rankine vortices have a larger central pressure deficit. It is suggested that the parametric model of non-Rankine vortex tangential winds has good potential for diagnosing the pressure features arising in dust devils, waterspouts, tornadoes, tornado cyclones and mesocyclones. Finally, presented are two examples in which the parametric model is fitted to a tangential velocity profile, one derived from an idealized numerical simulation and the other derived from high-resolution Doppler radar data collected in a real tornado.
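The cyclostrophic balance underlying both of the vortex papers above, dp/dr = ρv²/r, can be checked numerically for the classical Rankine vortex (solid-body rotation inside the radius of maximum wind, 1/r decay outside), for which the central pressure deficit is analytically ρv²max. The density, wind maximum, and radius below are hypothetical, and this is the simple Rankine case, not the papers' five-parameter profiles.

```python
import numpy as np

# Hypothetical vortex: air density (kg m^-3), max wind (m s^-1), its radius (m).
rho, vmax, rmax = 1.1, 50.0, 300.0

r = np.linspace(1.0, 30.0 * rmax, 200_000)
v = np.where(r <= rmax, vmax * r / rmax, vmax * rmax / r)   # Rankine profile

dp_dr = rho * v**2 / r                                      # cyclostrophic balance
# Integrate inward from the far field (trapezoid rule) for the central deficit.
deficit = np.sum(0.5 * (dp_dr[1:] + dp_dr[:-1]) * np.diff(r))

print(round(deficit / (rho * vmax**2), 2))  # → 1.0 (matches the analytic result)
```

The non-Rankine profiles in the paper decay more slowly than 1/r outside the wind maximum, which is why they produce a larger central pressure deficit than the Rankine vortex for the same vmax.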
2012: An adaptive dealiasing method based on variational analysis for radar radial velocities scanned with small Nyquist velocities. Journal of Atmospheric and Oceanic Technology, 29, 1723–1729, doi:10.1175/JTECH-D-12-00145.1.
Previous velocity–azimuth display (VAD)-based methods of dealiasing folded radial velocities have relied heavily on the VAD uniform-wind assumption and, thus, can fail when the uniform-wind assumption becomes poor around azimuthal circles in a vertical layer and the Nyquist velocity is small (≤12 m/s). By using the two-step, alias-robust variational (AR-Var) analysis in place of the alias-robust VAD (AR-VAD) analysis for the reference check, the previous AR-VAD-based dealiasing method is improved to an AR-Var-based dealiasing method adaptively for radar radial velocities scanned with small Nyquist velocities. The method has been tested with severely aliased velocity data scanned by the Oklahoma KTLX radar. The robustness and satisfactory performance of the AR-Var-based dealiasing are exemplified by the results obtained for a severe winter ice storm scanned with the Nyquist velocity reduced to 11.51 m/s.
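Once a reliable reference velocity is available, the unfolding step itself is simple: shift each observed radial velocity by an integer number of Nyquist intervals so it falls closest to the reference. The sketch below shows only that step; the hard problem the paper addresses, building an alias-robust reference via AR-Var when the uniform-wind assumption fails, is not shown.

```python
# Minimal unfolding step (not the AR-Var reference construction itself).
def dealias(v_obs, v_ref, v_nyq):
    """Shift v_obs by multiples of 2*v_nyq to lie closest to v_ref (all m/s)."""
    n = round((v_ref - v_obs) / (2.0 * v_nyq))   # number of Nyquist folds
    return v_obs + 2.0 * n * v_nyq

# A true wind of 30 m/s observed with v_nyq = 11.51 m/s folds to 30 - 2*11.51 = 6.98:
print(round(dealias(6.98, 28.0, 11.51), 2))  # → 30.0
```

With a small Nyquist velocity like 11.51 m/s, even modest reference errors can change `n` by one fold, which is why the stringent reference check matters.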
2013: Improved Doppler velocity dealiasing for radar data assimilation and storm-scale vortex detection. Advances in Meteorology, 2013, 517–526, doi:10.1155/2013/562386.
The Doppler velocity dealiasing technique based on alias-robust VAD and variational (AR-Var) analyses developed at the National Severe Storms Laboratory for radar data quality control and assimilation is further improved in its two-step procedure: the reference check in the first step and the continuity check in the second step. In the first step, the alias-robust variational analysis is modified adaptively and used in place of the alias-robust velocity-azimuth display (VAD) analysis for all scan modes (rather than solely the WSR-88D volume coverage pattern 31 with the Nyquist velocity vN reduced below 12 m/s and the TDWR Mod80 with vN reduced below 15 m/s), so more raw data can pass the stringent threshold conditions used by the reference check. This improves the dealiased data coverage without false dealiasing to better satisfy the high data quality standard required by radar data assimilation. In the second step, new procedures are designed and added to the continuity check to increase the dealiased data coverage over storm-scale areas threatened by intense mesocyclones and the tornadoes they generate. The performance of the improved dealiasing technique versus existing techniques is exemplified by the results obtained for tornadic storms scanned by the operational KTLX radar.
2013: Statistical and hydrological evaluation of TRMM-based Multi-satellite Precipitation Analysis over the Wangchu Basin of Bhutan: Are the latest satellite precipitation products 3B42V7 ready for use in ungauged basins? Journal of Hydrology, 499, 91–99, doi:10.1016/j.jhydrol.2013.06.042.
The objective of this study is to quantitatively evaluate the successive Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) products and further to explore the improvements and error propagation of the latest 3B42V7 algorithm relative to its predecessor 3B42V6 using the Coupled Routing and Excess Storage (CREST) hydrologic model in the mountainous Wangchu Basin of Bhutan. First, the comparison to a decade-long (2001–2010) daily rain gauge dataset reveals that: (1) 3B42V7 generally improves upon 3B42V6’s underestimation both for the whole basin (bias from −41.15% to −8.38%) and for a 0.25° × 0.25° grid cell with high-density gauges (bias from −40.25% to 0.04%), though with modest enhancement of correlation coefficients (CC) (from 0.36 to 0.40 basin-wide and from 0.37 to 0.41 for the grid cell); and (2) 3B42V7 also improves its occurrence frequency across the rain intensity spectrum. Using the CREST model calibrated with rain gauge inputs, the 3B42V6-based simulation shows limited hydrologic prediction skill (NSCE of 0.23 at daily scale and 0.25 at monthly scale) while 3B42V7 performs fairly well (0.66 at daily scale and 0.77 at monthly scale), a skill score comparable to the gauge-driven simulations. After recalibrating the model with the respective TMPA data, significant improvements are observed for 3B42V6 across all categories, but not as much enhancement for the already-well-performing 3B42V7 except for a reduction in bias (from −26.98% to −4.81%). In summary, the latest 3B42V7 algorithm is a significant upgrade from 3B42V6 in precipitation accuracy (i.e., correcting the underestimation), thus improving its potential hydrological utility.
Forcing the model with 3B42V7 rainfall yields comparable skill scores with in situ gauges even without recalibration of the hydrological model by the satellite precipitation, a compensating approach often used but not favored by the hydrology community, particularly in ungauged basins.
Keywords: CREST model, A-priori parameter estimation, Hydrologic modeling evaluation, Precipitation estimation
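The scores used in evaluations like the one above (relative bias, correlation coefficient, and Nash-Sutcliffe efficiency, NSCE) are standard and easy to sketch. The two series below are made-up illustrations, not TMPA or gauge data.

```python
import numpy as np

def rel_bias(sim, obs):
    """Relative bias in percent: positive means the simulation overestimates."""
    return 100.0 * (sim.sum() - obs.sum()) / obs.sum()

def nsce(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 means no better than the mean."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Made-up daily streamflow series for illustration only.
obs = np.array([5.0, 12.0, 3.0, 20.0, 8.0])
sim = np.array([4.0, 10.0, 4.0, 18.0, 7.0])
cc = np.corrcoef(sim, obs)[0, 1]
```

A negative relative bias with a high CC, as in the 3B42V6 results above, indicates a simulation that tracks the timing of events but systematically underestimates their magnitude.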
2013: First evaluation of the climatological calibration algorithm in the real-time TMPA precipitation estimates over two basins at high and low latitudes. Water Resources Research, 49, 2461–2472, doi:10.1002/wrcr.20246.
The TRMM Multi-satellite Precipitation Analysis (TMPA) system underwent a crucial upgrade in early 2009 to include a climatological calibration algorithm (CCA) to its real-time product 3B42RT, and this algorithm will continue to be applied in the future Global Precipitation Measurement era constellation precipitation products. In this study, efforts are focused on the comparison and validation of the Version 6 3B42RT estimates before and after the climatological calibration is applied. The evaluation is accomplished using independent rain gauge networks located within the high-latitude Laohahe basin and the low-latitude Mishui basin, both in China. The analyses indicate the CCA can effectively reduce the systematic errors over the low-latitude Mishui basin but misrepresent the intensity distribution pattern of medium-high rain rates. This behavior could adversely affect TMPA's hydrological applications, especially for extreme events (e.g., floods and landslides). Results also show that the CCA tends to perform slightly worse, in particular, during summer and winter, over the high-latitude Laohahe basin. This is possibly due to the simplified calibration-processing scheme in the CCA that directly applies the climatological calibrators developed within 40° latitude to the latitude belts of 40°N–50°N. Caution should therefore be exercised when using the calibrated 3B42RT for heavy rainfall-related flood forecasting (or landslide warning) over high-latitude regions, as the employment of the smooth-fill scheme in the CCA bias correction could homogenize the varying rainstorm characteristics. Finally, this study highlights that accurate detection and estimation of snow at high latitudes is still a challenging task for the future development of satellite precipitation retrievals.
2013: Spatial-temporal changes of water resources in a typical semiarid basin of North China over the past 50 years and assessment of possible natural and socioeconomic causes. Journal of Hydrometeorology, 14, 1009–1034, doi:10.1175/JHM-D-12-0116.1.
Hydrological processes in most semiarid regions on Earth have been changing under the impacts of climate change, human activities, or combinations of the two. This paper first presents a trend analysis of the spatiotemporal changes in water resources and then diagnoses their underlying atmospheric and socioeconomic causes over 10 catchments in the Laoha basin, a typical semiarid zone of northeast China. The impacts of climate variability and human activities on streamflow change were quantitatively evaluated by the VIC (Variable Infiltration Capacity) model. First, results indicate that six out of the 10 studied catchments have statistically significant downward trends in annual streamflow; however, there is no significant change of annual precipitation for all catchments. Two abrupt changes of annual streamflow at 1979 and 1998 are identified for the four largest catchments. Second, the Laoha basin generally experienced three evident dry–wet pattern switches during the past 50 years. Furthermore, this basin is currently suffering from unprecedented water shortages. Large-scale climate variability has affected the local natural hydrologic system. Third, quantitative evaluation shows human activities were the main driving factors for the streamflow reduction with contributions of approximately 90% for the whole basin. A significant increase in irrigated area, which inevitably resulted in tremendous agricultural water consumption, is the foremost culprit contributing to the dramatic runoff reduction, especially at midstream and downstream of the Laoha basin. This study is expected to enable policymakers and stakeholders to make well-informed, short-term practice decisions and better plan long-term water resource and ecoenvironment management strategies.
Keywords: Hydrologic cycle, Hydrology, Hydrometeorology, Annual variations, Climate variability
2013: Commentary on “Why do tornados and hailstorms rest on weekends?” by D. Rosenfeld and T. Bell. Journal of Geophysical Research, 118, 1–7, doi:10.1002/jgrd.50526.
2013: Short-term quantitative precipitation forecasting using an object-based approach. Journal of Hydrology, 483, 1–15, doi:10.1016/j.jhydrol.2012.09.052.
Short-term Quantitative Precipitation Forecasting (SQPF) is critical for flash-flood warning, navigation safety, and many other applications. The current study proposes a new object-based method, named PERCAST (PERsiann-ForeCAST), to identify, track, and nowcast storms. PERCAST predicts the location and rate of rainfall up to 4 h ahead, using the most recent storm images to extract storm features such as the advection field and changes in storm intensity and size. PERCAST is coupled with a previously developed precipitation retrieval algorithm called PERSIANN-CCS (Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks-Cloud Classification System) to forecast rainfall rates. Four case studies are presented to evaluate the performance of the models. While the first two case studies demonstrate the model's capabilities in nowcasting single storms, the third and fourth evaluate the proposed model over the contiguous US during the summer of 2010. The results show that, by considering storm Growth and Decay (GD) trends in the prediction, PERCAST-GD further improves the predictability of convection by up to 15–20% in verification measures such as Probability of Detection (POD) and False Alarm Ratio (FAR), compared to the baseline PERCAST.
2012: Radar-based quantitative precipitation estimation for the cool season in complex terrain: Case studies from the NOAA Hydrometeorology Testbed. Journal of Hydrometeorology, 13, 1836–1854.
This study explores error sources of the National Weather Service operational radar-based quantitative precipitation estimation (QPE) during the cool season over the complex terrain of the western United States. A new, operationally geared radar QPE was developed and tested using data from the National Oceanic and Atmospheric Administration Hydrometeorology Testbed executed during the 2005/06 cool season in Northern California. The new radar QPE scheme includes multiple steps for removing nonprecipitation echoes, constructing a seamless hybrid scan reflectivity field, applying vertical profile of reflectivity (VPR) corrections to the reflectivity, and converting the reflectivity into precipitation rates using adaptive Z–R relationships. Specific issues in radar rainfall accumulations were addressed, which include wind farm contaminations, blockage artifacts, and discontinuities due to radar overshooting. The new radar QPE was tested in a 6-month period of the 2005/06 cool season and showed significant improvements over the current operational radar QPE (43% reduction in bias and 30% reduction in root-mean-square error) when compared with gauges. In addition, the new technique minimizes various radar artifacts and produces a spatially continuous rainfall product. Such continuity is important for accurate hydrological model predictions. The new technique is computationally efficient and can be easily transitioned into operations. One of the largest remaining challenges is obtaining accurate radar QPE over the windward slopes of significant mountain ranges, where low-level orographic enhancement of precipitation is not resolved by the operational radars leading to underestimation. Additional high-resolution and near-surface radar observations are necessary for more accurate radar QPE over these regions.
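The final step described above, converting reflectivity to precipitation rate through a Z-R relationship, can be sketched as follows. The coefficients shown are the widely used NEXRAD convective default (Z = 300 R^1.4); the scheme in the paper instead selects Z-R relationships adaptively, so these values are illustrative only.

```python
import math

# Z-R power law Z = a * R**b inverted to give rain rate from reflectivity.
# a=300, b=1.4 is the common NEXRAD convective default, used here only as
# an illustration of the functional form.
def rain_rate(dbz, a=300.0, b=1.4):
    """Rain rate (mm/h) from reflectivity (dBZ)."""
    z_linear = 10.0 ** (dbz / 10.0)        # reflectivity factor, mm^6 m^-3
    return (z_linear / a) ** (1.0 / b)

print(round(rain_rate(40.0), 1))  # → 12.2
```

Because the relation is a power law, errors in the reflectivity field (blockage, overshooting, VPR effects) propagate nonlinearly into the rain-rate estimate, which motivates the correction steps listed in the abstract.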
2013: Evaluating and constraining ice cloud parameterizations in CAM5 using aircraft measurements from the SPARTICUS campaign. Atmospheric Chemistry and Physics, 13, 4963–4982, doi:10.5194/acp-13-4963-2013.
This study uses aircraft measurements of relative humidity and ice crystal size distribution collected during the SPARTICUS (Small PARTicles In CirrUS) field campaign to evaluate and constrain ice cloud parameterizations in the Community Atmosphere Model version 5. About 200 h of data were collected during the campaign between January and June 2010, providing the longest aircraft measurements available so far for cirrus clouds in the midlatitudes. The probability density function (PDF) of ice crystal number concentration (Ni) derived from the high-frequency (1 Hz) measurements features a strong dependence on ambient temperature. As temperature decreases from −35 °C to −62 °C, the peak in the PDF shifts from 10–20 L−1 to 200–1000 L−1, while Ni shows a factor of 6–7 increase.
2013: Partial Beam Blockage Correction Using Polarimetric Radar Measurements. Journal of Atmospheric and Oceanic Technology, 30, 861–872, doi:10.1175/JTECH-D-12-00075.1.
A new method for mitigation of partial beam blockage that uses the consistency between reflectivity factor Z and specific differential phase KDP and their radial integrals in rain is presented. The immunity of differential phase ΦDP to partial beam blockage is utilized to estimate the bias of reflectivity factor caused by beam blockage. The algorithm is tested on dual-polarization radar data collected by the NCAR S-band polarimetric Dopper radar system (S-Pol) during the Southwest Monsoon Experiment/Terrain-Influenced Monsoon Rainfall Experiment (SoWMEX/TiMREX) in June 2008 in Taiwan. Corrected reflectivity factors in the blocked sectors are compared with corresponding values deduced from a digital elevation model (DEM) to show the advantage of the suggested method in areas where obstacles such as high-rise buildings cause additional blockage that is not accounted for by DEM. The accuracy and robustness of the method is quantitatively evaluated using a series of radar volume scans obtained in three rainfall events.
FY 2012 — 94 publications
2012: Multiplatform Comparisons of Rain Intensity for Extreme Precipitation Events. IEEE Transactions on Geoscience and Remote Sensing, 50, 675–686, doi:10.1109/TGRS.2011.2162737.
Rainfall intensities during heavy rain events over the continental U.S. are compared for several advanced radar products. These products include the following: 1) Tropical Rainfall Measuring Mission (TRMM) precipitation radar (PR) near-surface estimates; 2) NOAA Quantitative Precipitation Estimation very high resolution (1 km, instantaneous) radar-only national mosaics (Q2); 3) very high resolution gauge-adjusted radar national mosaics, which we have developed by applying a gauge correction on the Q2 products; and 4) several independent C-band dual-polarimetric radar-estimated rainfall samples collected with the Advanced C-band Radar for Meteorological and Operational Research (ARMOR) radar in Alabama. These instantaneous rainfall rate fields [i.e., 1)-3)] can be considered as radar products with the largest coverage currently available from space- and ground-based radar observations. Although accumulated rainfall amounts are often similar, we find the PR and Q2 rain-rate histograms quite different. PR rain-rate histograms are shifted toward lower rain rates, implying a much larger stratiform/convective rain ratio than do products such as Q2. The shift is more evident during strong continental convective storms and not as pronounced in tropical rain. A “continental/maritime regime” behavior is also observed upon adjusting the Q2 products to rain gauges, yet the rain amount more closely agrees with that of PR. The independent PR/ARMOR comparisons confirm this systematic regime behavior. In addition, comparisons are performed over central Florida where PR, Q2, and the NASA TRMM ground validation products are available. These comparisons show large discrepancies among all three products. Resolving the large discrepancies between the products presents an important set of challenges related to improving remote-sensing estimates of precipitation in general and during extreme events in particular.
2012: Synthetic Satellite Imagery for Real-Time High-Resolution Model Evaluation. Weather and Forecasting, 27, 784–795, doi:10.1175/WAF-D-11-00130.1.
Output from a real-time high-resolution numerical model is used to generate synthetic infrared satellite imagery. It is shown that this imagery helps to characterize model-simulated large-scale precursors to the formation of deep-convective storms as well as the subsequent development of storm systems. A strategy for using this imagery in the forecasting of severe convective weather is presented. This strategy involves comparing model-simulated precursors to their observed counterparts to help anticipate model errors in the timing and location of storm formation, while using the simulated storm evolution as guidance.
2011: Understanding radar refractivity: Sources of uncertainty. Journal of Applied Meteorology and Climatology, 50, 2543–2560, doi:10.1175/2011JAMC2648.1.
This study presents a 2-yr-long comparison of Weather Surveillance Radar-1988 Doppler (WSR-88D) refractivity retrievals with Oklahoma Mesonetwork (“Mesonet”) and sounding measurements and discusses some challenges to implementing radar refractivity operationally. Temporal and spatial analyses of radar refractivity exhibit high correlation with Mesonet data; however, periods of large refractivity differences between the radar and Mesonet are observed. Several sources of these differences are examined. One source, specific to nonklystron radars, is magnetron frequency drift, which can introduce errors of up to 10 N-units if the drift is not corrected. Different reference maps made at different times can also “shift” refractivity values. A semiautomated method for producing reference maps is presented, including trade-offs for making reference maps under different conditions. Refractivity from six Mesonet stations within the clutter domain of the Oklahoma City, Oklahoma, WSR-88D (KTLX) is compared with radar refractivity retrievals. The analysis revealed that the six Mesonet stations exhibited a prominent diurnal trend in differences between radar and Mesonet refractivity measurements. The diurnal range of the refractivity differences sometimes exceeded 20 or 30 N-units in the warm season, which translated to a potential dewpoint temperature difference of several degrees Celsius. A seasonal analysis revealed that large refractivity differences primarily occurred during the warm season, when refractivity is most sensitive to moisture. Ultimately, the main factor determining the magnitude of the differences between the two refractivity platforms is the vertical gradient of refractivity, because of the difference in observation height between the radar and a surface station.
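The refractivity being retrieved, N = (n − 1) × 10⁶, is commonly computed from the standard two-term formula below (pressure p and vapor pressure e in hPa, temperature T in K). The worked example shows why warm-season refractivity is so moisture-sensitive: the moist term scales as e/T², so a 1 hPa change in vapor pressure moves N by several units. The specific p, T, e values are illustrative.

```python
# Standard atmospheric refractivity formula:
#   N = 77.6 * (p / T) + 3.73e5 * (e / T**2)
# with p, e in hPa and T in K.  The second (moist) term drives the
# warm-season sensitivity discussed above.
def refractivity(p_hpa, t_kelvin, e_hpa):
    return 77.6 * p_hpa / t_kelvin + 3.73e5 * e_hpa / t_kelvin**2

# Illustrative warm, moist case: vary vapor pressure by 1 hPa.
n1 = refractivity(970.0, 303.0, 25.0)
n2 = refractivity(970.0, 303.0, 26.0)
print(round(n2 - n1, 1))  # → 4.1 N-units per hPa of vapor pressure
```

At 4 N-units per hPa, a 20-30 N-unit radar-Mesonet difference corresponds to a dewpoint discrepancy of several degrees Celsius, consistent with the abstract.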
2011: Polarimetric Estimates of a 1-Month Accumulation of Light Rain with a 3-cm Wavelength Radar. Journal of Hydrometeorology, 12, 1024–1039, doi:10.1175/2011JHM1339.1.
The authors evaluate rainfall estimates from the new polarimetric X-band radar at Bonn, Germany, for a period between mid-November and the end of December 2009 by comparison with rain gauges. The emphasis is on slightly more than 1-month accumulations over areas minimally affected by beam blockage. The rain regime was characterized by reflectivities mainly below 45 dBZ, maximum observed rain rates of 47 mm h−1, a mean rain rate of 0.1 mm h−1, and brightband altitudes between 0.6 and 2.4 km above the ground. Both the reflectivity factor and the specific differential phase are used to obtain the rain rates. The accuracy of rain total estimates is evaluated from the statistics of the differences between radar and rain gauge measurements. Polarimetry provides improvement in the statistics of reflectivity-based measurements by reducing the bias and RMS errors from −25% to 7% and from 33% to 17%, respectively. Essential to this improvement is separation of the data into those attributed to pure rain, those from the bright band, and those due to nonmeteorological scatterers. A type-specific (rain or wet snow) relation is applied to obtain the rain rate by matching on the average the contribution by wet snow to the radar-measured rainfall below the bright band. The measurement of rain using specific differential phase is the most robust and can be applied to the very low rain rates and still produce credible accumulation estimates characterized by a standard deviation of 11% but a bias of −25%. A composite estimator is also tested and discussed.
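The reflectivity-based rain rates evaluated above come from power-law Z–R relations. A generic illustration, using the classic Marshall–Palmer coefficients rather than the relations tuned for this X-band radar:

```python
def rain_rate_z(dbz, a=200.0, b=1.6):
    """Invert the power law Z = a * R**b for rain rate R (mm/h) from
    reflectivity in dBZ. The defaults are the classic Marshall-Palmer
    coefficients, used here purely for illustration."""
    z_lin = 10.0 ** (dbz / 10.0)   # dBZ -> linear reflectivity (mm^6 m^-3)
    return (z_lin / a) ** (1.0 / b)
```

For example, 30 dBZ maps to roughly 2.7 mm/h under these coefficients; estimators based on specific differential phase replace this relation with an R(KDP) power law that is less sensitive to calibration bias.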
2012: Use of Ground Clutter to Monitor Polarimetric Radar Calibration. Journal of Atmospheric and Oceanic Technology, 29, 159–176, doi:10.1175/JTECH-D-11-00036.1.
It is suggested that urban ground clutter can have a role in monitoring calibration of reflectivity factor ZH and differential reflectivity ZDR on polarimetric radars. The median and average values of these variables are considered. Analysis of data from 1 month of cold season in Germany (X-band radar) and 3.5 hot days in Oklahoma (S-band radar) is presented. In the presence of up to moderate rain or snow, a reflectivity threshold suffices for separating significant clutter from precipitation observed with an X-band radar. The same threshold was suitable for observations with an S-band radar in Oklahoma because heavy precipitation was not present. The tests suggest the scheme is worth considering for operational monitoring of ZH, as its median values at both locations were within the quantization interval of 0.5 dB. Environmental factors that can influence reflectivities from clutter are examined. The effects on ZDR can be significant. These are quantified in the data, and possible uses for calibration and monitoring radar status are indicated.
2012: The tornadic vortex signature: An update. Weather and Forecasting, 27, 525–530.
A tornadic vortex signature (TVS) is a degraded Doppler velocity signature of a tornado that occurs when the core region of a tornado is smaller than the half-power beamwidth of the sampling Doppler radar. Soon after the TVS was discovered in the mid-1970s, simulations were conducted to verify that the signature did indeed represent a tornado. The simulations, which used a uniform reflectivity distribution across a Rankine vortex model, indicated that the extreme positive and negative Doppler velocity values of the signature should be separated by about one half-power beamwidth regardless of tornado size or strength. For a Weather Surveillance Radar-1988 Doppler (WSR-88D) with an effective half-power beamwidth of approximately 1.4 deg and data collected at 1.0 deg azimuthal intervals, the two extreme Doppler velocity values should be separated by 1.0 deg. However, with the recent advent of 0.5 deg azimuthal sampling ("superresolution") by WSR-88Ds at lower elevation angles, some of the extreme Doppler velocity values unexpectedly were found to be separated by 0.5 deg instead of 1.0 deg azimuthal intervals. To resolve this discrepancy, the choice of vortex model and reflectivity profile is investigated. It is found that the choice of vortex model does not have a significant effect on the simulation results. However, using a reflectivity profile with a minimum at the vortex center does make a difference. The revised simulations indicate that it is possible for the peak Doppler velocity values of a TVS to be separated by 0.5 deg with superresolution data collection.
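The simulation idea in this update is straightforward to reproduce. A schematic 1D version, assuming a Rankine vortex scanned by a Gaussian beam along the line through the vortex center perpendicular to the beam axis (arbitrary length units; not the authors' code):

```python
import numpy as np

def tvs_extrema(beamwidth, core_radius, refl_weight, vmax=50.0):
    """Locate the Doppler-velocity extrema of a simulated vortex
    signature. A Rankine vortex (solid-body core, potential flow
    outside) is smoothed by a Gaussian beam whose weighting also
    includes a reflectivity profile refl_weight(r)."""
    x = np.linspace(-10.0, 10.0, 4001)   # cross-beam distance from vortex center
    r = np.abs(x)
    v = np.where(r <= core_radius,
                 vmax * r / core_radius,
                 vmax * core_radius / np.maximum(r, 1e-9))
    v *= np.sign(x)                       # radial (Doppler) component
    sigma = beamwidth / 2.354             # half-power beamwidth -> Gaussian sigma
    centers = np.linspace(-4.0, 4.0, 321)
    vd = np.array([np.sum(w * v) / np.sum(w)
                   for w in (refl_weight(r) *
                             np.exp(-0.5 * ((x - c) / sigma) ** 2)
                             for c in centers)])
    return centers[np.argmin(vd)], centers[np.argmax(vd)]
```

With a uniform reflectivity profile the extrema fall roughly one half-power beamwidth apart, consistent with the original simulations; substituting a weighting with a reflectivity minimum at the vortex center (e.g. `lambda r: np.minimum(r, 1.0)`) is the kind of sensitivity experiment the update examines.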
2011: Computing streamfunction and velocity potential in a limited domain. Part II: Numerical methods and test experiments. Adv. Atmos. Sci., 28, 1445–1458, doi:10.1007/s00376-011-0186-5.
Built on the integral formulas in Part I, numerical methods are developed for computing velocity potential and streamfunction in a limited domain. When there is no inner boundary (around a data hole) inside the domain, the total solution is the sum of the internally and externally induced parts. For the internally induced part, three numerical schemes (grid-staggering, local-nesting and piecewise continuous integration) are designed to deal with the singularity of the Green's function encountered in numerical calculations. For the externally induced part, by setting the velocity potential (or streamfunction) component to zero, the other component of the solution can be computed in two ways: (1) Solve for the density function from its boundary integral equation and then construct the solution from the boundary integral of the density function. (2) Use the Cauchy integral to construct the solution directly. The boundary integral can be discretized on a uniform grid along the boundary. By using local-nesting (or piecewise continuous integration), the scheme is refined to enhance the discretization accuracy of the boundary integral around each corner point (or along the entire boundary). When the domain is not free of data holes, the total solution contains a data-hole-induced part, and the Cauchy integral method is extended to construct the externally induced solution with irregular external and internal boundaries. An automated algorithm is designed to facilitate the integrations along the irregular external and internal boundaries. Numerical experiments are performed to evaluate the accuracy and efficiency of each scheme relative to others.
2012: Partly Cloudy with a Chance of Migration: Weather, Radars, and Aeroecology. Bulletin of the American Meteorological Society, 93, 669–686, doi:10.1175/BAMS-D-11-00099.1.
Aeroecology is an emerging scientific discipline that integrates atmospheric science, Earth science, geography, ecology, computer science, computational biology, and engineering to further the understanding of biological patterns and processes. The unifying concept underlying this new transdisciplinary field of study is a focus on the planetary boundary layer and lower free atmosphere (i.e., the aerosphere), and the diversity of airborne organisms that inhabit and depend on the aerosphere for their existence. Here, we focus on the role of radars and radar networks in aeroecological studies. Radar systems scanning the atmosphere are primarily used to monitor weather conditions and track the location and movements of aircraft. However, radar echoes regularly contain signals from other sources, such as airborne birds, bats, and arthropods. We briefly discuss how radar observations can be and have been used to study a variety of airborne organisms and examine some of the many potential benefits likely to arise from radar aeroecology for meteorological and biological research over a wide range of spatial and temporal scales. Radar systems are becoming increasingly sophisticated with the advent of innovative signal processing and dual-polarimetric capabilities. These capabilities should be better harnessed to promote both meteorological and aeroecological research and to explore the interface between these two broad disciplines. We strongly encourage close collaboration among meteorologists, radar scientists, biologists, and others toward developing radar products that will contribute to a better understanding of airborne fauna.
2012: An Overview of the 2010 Hazardous Weather Testbed Experimental Forecast Program Spring Experiment. Bulletin of the American Meteorological Society, 93, 55–74.
The NOAA Hazardous Weather Testbed (HWT) conducts annual spring forecasting experiments organized by the Storm Prediction Center and National Severe Storms Laboratory to test and evaluate emerging scientific concepts and technologies for improved analysis and prediction of hazardous mesoscale weather. A primary goal is to accelerate the transfer of promising new scientific concepts and tools from research to operations through the use of intensive real-time experimental forecasting and evaluation activities conducted during the spring and early summer convective storm period. The 2010 NOAA/HWT Spring Forecasting Experiment (SE2010), conducted 17 May through 18 June, had a broad focus, with emphases on heavy rainfall and aviation weather, through collaboration with the Hydrometeorological Prediction Center (HPC) and the Aviation Weather Center (AWC), respectively. In addition, using the computing resources of the National Institute for Computational Sciences at the University of Tennessee, the Center for Analysis and Prediction of Storms at the University of Oklahoma provided unprecedented real-time conterminous United States (CONUS) forecasts from a multimodel Storm-Scale Ensemble Forecast (SSEF) system with 4-km grid spacing and 26 members and from a 1-km grid spacing configuration of the Weather Research and Forecasting model. Several other organizations provided additional experimental high-resolution model output. This article summarizes the activities, insights, and preliminary findings from SE2010, emphasizing the use of the SSEF system and the successful collaboration with the HPC and AWC.
2012: Views on Applying RKW Theory: An Illustration Using the 8 May 2009 Derecho-Producing Convective System. Monthly Weather Review, 140, 1023–1043, doi:10.1175/MWR-D-11-00026.1.
This work presents an analysis of the vertical wind shear during the early stages of the remarkable 8 May 2009 central U.S. derecho-producing convective system. Comments on applying Rotunno–Klemp–Weisman (RKW) theory to mesoscale convective systems (MCSs) of this type also are provided. During the formative stages of the MCS, the near-surface-based shear vectors ahead of the leading convective line varied with time, location, and depth, but the line-normal component of the shear in any layer below 3 km ahead of where the strong bow echo developed was relatively small (6–9 m s−1). Concurrently, the midlevel (3–6 km) line-normal shear component had magnitudes mostly >10 m s−1 throughout.
In a previous companion paper, it was hypothesized that an unusually strong and expansive low-level jet led to dramatic changes in instability, shear, and forced ascent over mesoscale areas. These mesoscale effects may have overwhelmed the interactions between the cold pool and low-level shear that modulate system structure in less complex environments. If cold pool–shear interactions were critical to producing such a strong system, then the extension of the line-normal shear above 3 km also appeared to be critical. It is suggested that RKW theory be applied with much caution, and that examining the shear above 3 km is important, if one wishes to explain the formation and maintenance of intense long-lived convective systems, particularly complex nocturnal systems like the one that occurred on 8 May 2009.
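The line-normal shear components quoted above are simple vector projections of a layer shear vector onto the direction perpendicular to the convective line. A sketch (a hypothetical helper for illustration, not from the paper, with the line orientation given as an azimuth in degrees from north):

```python
import math

def line_normal_shear(u_low, v_low, u_up, v_up, line_azimuth_deg):
    """Magnitude (m/s) of the layer shear component normal to a
    convective line. (u_low, v_low) and (u_up, v_up) are winds at the
    bottom and top of the layer."""
    du, dv = u_up - u_low, v_up - v_low
    phi = math.radians(line_azimuth_deg)
    lx, ly = math.sin(phi), math.cos(phi)   # unit vector along the line
    along = du * lx + dv * ly               # along-line projection
    nx, ny = du - along * lx, dv - along * ly
    return math.hypot(nx, ny)
```

For a north–south line, a purely westerly 10 m/s layer shear is entirely line-normal, while a purely southerly shear of the same magnitude contributes nothing.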
2012: Verification of RUC 0-1-hour forecasts and SPC Mesoscale Analyses using VORTEX2 Soundings. Weather and Forecasting, 27, 667–683, doi:10.1175/WAF-D-11-00096.1.
This study uses radiosonde observations obtained during VORTEX2 to verify base-state variables and severe-weather-related parameters calculated from Rapid Update Cycle (RUC) analyses and 1-h forecasts, as well as those calculated from the operational surface objective analysis system used at the Storm Prediction Center (the SFCOA). The rapid growth in temperature, humidity, and wind errors from 0 to 1 h seen at all levels in a past RUC verification study (Benjamin et al. 2004) is not seen in the present study. This could be because the verification observations are also assimilated into the RUC in the Benjamin et al. study, whereas the verification observations in the present study are not. In the upper troposphere, the present study shows large errors in relative humidity, mostly related to a large moist bias. The planetary boundary layer tends to be too shallow in the RUC analyses and 1-h forecasts. Wind speeds tend to be too fast in the lowest 1 km and too slow in the 2–4-km layer. RUC and SFCOA 1-h forecast errors for many important severe weather parameters are large relative to their potential impact on convective evolution. However, the SFCOA significantly improves upon the biases seen in most of the 1-h RUC forecasts for the base-state surface variables and most of the other severe-weather-related parameters, indicating that the SFCOA has a more significant impact in reducing the biases in the 1-h RUC forecasts than on the root-mean-square errors.
2012: Use of dual-polarization signatures in diagnosing tornadic potential. Electronic Journal of Operational Meteorology, 13(5), 57–78.
Recent research has suggested that the combination of differential reflectivity (ZDR) and specific differential phase (KDP) can be useful in the assessment of low-level wind shear within a thunderstorm, a critical factor in tornado formation. The two main polarimetric signatures indicated for this diagnosis include an arc of ZDR along the right inflow edge of the thunderstorm near or collocated with a large gradient in horizontal reflectivity, ZH, (indicative of regions of preferentially large raindrops) and a region of enhanced KDP located deeper into the forward flank precipitation shield than the ZDR arc (indicating that the smaller drops are preferentially advected farther from the updraft core by the low-level winds).
Three severe weather events in North Alabama were examined to assess the utility of these ZDR and KDP signatures in determining the potential for tornadic activity. The cases were: 26 October 2010, when many storms indicated tornadic potential from a standard reflectivity and velocity analysis, but very few storms actually produced tornadoes; 28 February 2011, a broken line event that transitioned from a tornadic to high wind threat; and 27 April 2011, when multiple rounds of tornadic storms, associated with quasi-linear convective systems (QLCS) and supercells, thrashed the Tennessee Valley. All three cases displayed strong evidence of ZDR arcs and horizontal separation of KDP and ZDR during tornadic periods. In addition, non-tornadic storms showed consistent signatures of overlapping dual-pol fields. While some variations remain between supercell, broken line, and QLCS tornadoes, common signatures among all storm types indicate a potentially broad application of this type of signature recognition.
2011: Adaptive Range Oversampling to Achieve Faster Scanning on the National Weather Radar Testbed Phased-Array Radar. Journal of Atmospheric and Oceanic Technology, 28, 1581–1597, doi:10.1175/JTECH-D-10-05042.1.
This paper describes a real-time implementation of adaptive range oversampling processing on the National Weather Radar Testbed phased-array radar. It is demonstrated that, compared to conventional matched-filter processing, range oversampling can be used to reduce scan update times by a factor of 2 while producing meteorological data with similar quality. Adaptive range oversampling uses moment-specific transformations to minimize the variance of meteorological variable estimates. An efficient algorithm is introduced that allows for seamless integration with other signal processing functions and reduces the computational burden. Through signal processing, a new dimension is added to the traditional trade-off triangle that includes the variance of estimates, spatial coverage, and update time. That is, by trading an increase in computational complexity, data with higher temporal resolution can be collected and the variance of estimates can be improved without affecting the spatial coverage.
2012: Uncertainties in trajectory analyses within near-surface mesocyclones of simulated supercells. Monthly Weather Review, 140, 2959–2966, doi:10.1175/MWR-D-12-00131.1.
This study addresses the sensitivity of backward trajectories within simulated near-surface mesocyclones to the spatiotemporal resolution of the velocity field. These backward trajectories are compared to forward trajectories computed during run time within the numerical model. It is found that the population of backward trajectories becomes increasingly contaminated with “inflow trajectories” that owe their existence to spatiotemporal interpolation errors in time-varying and strongly curved, confluent flow. These erroneous inflow parcels may mistakenly be interpreted as a possible source of air for the near-surface vortex. It is hypothesized that, unlike forward trajectories, backward trajectories are especially susceptible to errors near the strongly confluent intensifying vortex. Although the results are based on model output, dual-Doppler analysis fields may be equally affected by such errors.
2012: Impact from the environmental wind profile on ensemble forecasts of the 4 May 2007 Greensburg tornado and its associated mesocyclones. Monthly Weather Review, 140, 696–712, doi:10.1175/MWR-D-11-00008.1.
The early tornadic phase of the Greensburg, Kansas, supercell on the evening of 4 May 2007 is simulated using a set of storm-scale (1-km horizontal grid spacing) 30-member EnKF data assimilation and forecast experiments. NEXRAD Level-II radar data from the Dodge City, Kansas, WSR-88D (KDDC) are assimilated into the National Severe Storms Laboratory (NSSL) Collaborative Model for Multiscale Atmospheric Simulation (COMMAS). The initially horizontally homogeneous environments are initialized from one of three reconstructed soundings representative of the early tornadic phase of the storm, when a low-level jet (LLJ) was intensifying. To isolate the impact of the low-level wind profile, 0–3.5-km AGL wind profiles from Vance Air Force Base, Oklahoma, WSR-88D (KVNX) velocity-azimuth display (VAD) analyses at 0130, 0200, and 0230 UTC are used. A sophisticated, double-moment bulk ice microphysics scheme is employed.
For each of the three soundings, ensemble forecast experiments are initiated from EnKF analyses at various times prior to and shortly after the genesis of the Greensburg tornado (0200 UTC). Probabilistic forecasts of the mesocyclone-scale circulation(s) are generated and compared to the observed Greensburg tornado track. Probabilistic measures of significant rotation and observation-space diagnostic statistics are also calculated. It is shown that, in general, the track of the Greensburg tornado is well predicted, and forecasts improve as forecast lead time decreases. Significant variability is also seen across the experiments using different VAD wind profiles. Implications of these results with regard to the choice of initial mesoscale environment, as well as for the “Warn-on-Forecast” paradigm for probabilistic numerical prediction of severe thunderstorms and tornadoes, are discussed.
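For readers unfamiliar with the EnKF machinery underlying these experiments, the core update step can be sketched generically. This is a bare-bones perturbed-observation EnKF for a single scalar observation, without the covariance localization, inflation, or model specifics of the actual COMMAS/EnKF system:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_operator, obs_var, rng):
    """Stochastic (perturbed-observation) EnKF update for one scalar
    observation. ensemble has shape (n_members, n_state)."""
    n = ensemble.shape[0]
    hx = np.array([obs_operator(x) for x in ensemble])       # obs-space ensemble
    x_mean, hx_mean = ensemble.mean(axis=0), hx.mean()
    # sample covariances between state and observed quantity
    pxh = ((ensemble - x_mean) * (hx - hx_mean)[:, None]).sum(axis=0) / (n - 1)
    phh = ((hx - hx_mean) ** 2).sum() / (n - 1)
    k = pxh / (phh + obs_var)                                # Kalman gain
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), n)   # perturbed obs
    return ensemble + (perturbed - hx)[:, None] * k[None, :]
```

Each member is nudged toward its own perturbed copy of the observation in proportion to the ensemble-estimated covariance, which both shifts the ensemble mean toward the observation and shrinks the spread of observed quantities.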
2012: The tornadoes of spring 2011 in the USA: an historical perspective. Weather, 67, 88–94, doi:10.1002/wea.1902.
2012: Using WSR-88D Data and Insolation Estimates to Determine Convective Boundary Layer Depth. Journal of Atmospheric and Oceanic Technology, 29, 581–588, doi:10.1175/JTECH-D-11-00043.1.
Prior work shows that Weather Surveillance Radar-1988 Doppler (WSR-88D) clear-air reflectivity can be used to determine convective boundary layer (CBL) depth. Based on that work, two simple linear regressions are developed that provide CBL depth. One requires only clear-air radar reflectivity from a single 4.5° elevation scan, whereas the other additionally requires the total, clear-sky insolation at the radar site, derived from the radar location and local time. Because only the most recent radar scan is used, the CBL depth can, in principle, be computed for every scan. The “true” CBL depth used to develop the models is based on human interpretation of the 915-MHz profiler data. The regressions presented in this work are developed using 17 summer days near Norman, Oklahoma, that have been previously investigated. The resulting equations and algorithms are applied to a testing dataset consisting of 7 days not previously analyzed. Though the regression using insolation estimates performs best, errors from both models are on the order of the expected error of the profiler-estimated CBL depth values. Of the two regressions, the one that uses insolation yields CBL depth estimates with an RMSE of 208 m, while the regression with only clear-air radar reflectivity yields CBL depth estimates with an RMSE of 330 m.
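The two regression models above can be emulated on synthetic data; the coefficients and predictor ranges below are invented for illustration, and the only point demonstrated is that adding the insolation predictor cannot worsen, and typically improves, the in-sample fit:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60
refl = rng.uniform(-10.0, 15.0, n)      # clear-air reflectivity, dBZ
insol = rng.uniform(300.0, 1000.0, n)   # clear-sky insolation, W m^-2
# synthetic "true" CBL depth (m); coefficients are invented, not the paper's
cbl = 80.0 * refl + 1.2 * insol + 400.0 + rng.normal(0.0, 150.0, n)

def fit_rmse(predictors, y):
    """Least-squares linear fit with intercept; returns in-sample RMSE."""
    design = np.column_stack([predictors, np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return float(np.sqrt(np.mean((design @ coef - y) ** 2)))

rmse_refl = fit_rmse(refl[:, None], cbl)                    # reflectivity only
rmse_both = fit_rmse(np.column_stack([refl, insol]), cbl)   # + insolation
```

This mirrors the paper's comparison qualitatively (two-predictor RMSE of 208 m versus 330 m for reflectivity alone), though the actual regressions were trained against profiler-derived CBL depths.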
2012: Analytical Expressions for Doppler Spectra from a Vertically Directed Radar Beam. Journal of Atmospheric and Oceanic Technology, 29, 500–509, doi:10.1175/JTECH-D-11-00005.1.
A generalized expression is derived for the correlation function of signals backscattered from hydrometeors observed with a vertically pointed radar beam in which particle size distribution, turbulence, and mean wind are not homogeneous. This study extends the work of Fang and Doviak (2008) by including the effects of particle size distribution on the measured Doppler spectrum. It shows the measured Doppler spectrum to be the volumetric mean of the weighted convolution of the normalized Doppler spectra associated with turbulence, mean wind, particle oscillation/wobble, and terminal velocity. Without particle oscillation/wobble, mean wind, and turbulence, the Doppler spectrum is the mirror image of the terminal velocity spectrum under the condition that the second-order effect of finite beamwidth can be ignored. This generalized Doppler spectrum reduces further to a previously derived expression if the particle size distribution, or equivalently reflectivity, is uniform. Provided there is a unique relationship between the particle’s terminal velocity and its effective diameter, the derived equations can be applied to scatterers consisting of ice particles as well as water droplets. This study derives the analytical expression for the Doppler spectrum of mean wind and also shows that, if stationary homogeneous turbulence is the only contributor to spectral broadening, the average of a large number of radar-measured Doppler spectra will be equal to the velocity probability density function of turbulence, independent of the angular, range, and reflectivity weighting functions.
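The central result — the measured spectrum as a convolution of a terminal-velocity spectrum with broadening kernels — can be illustrated numerically. The spectra below are idealized Gaussians chosen for illustration, not the paper's analytical forms:

```python
import numpy as np

v = np.linspace(-15.0, 5.0, 401)   # Doppler velocity axis, m/s (negative = downward)
dv = v[1] - v[0]

# terminal-velocity spectrum: fall speeds centered near -6 m/s (illustrative)
s_fall = np.exp(-0.5 * ((v + 6.0) / 1.5) ** 2)
s_fall /= s_fall.sum() * dv

# turbulence broadening kernel: zero-mean Gaussian on a symmetric axis
# with the same spacing, so mode="same" keeps the velocity axis aligned
u = np.linspace(-10.0, 10.0, 401)
s_turb = np.exp(-0.5 * (u / 1.0) ** 2)
s_turb /= s_turb.sum() * dv

# measured spectrum = convolution of the two
s_meas = np.convolve(s_fall, s_turb, mode="same") * dv
```

The convolution preserves the mean fall speed while adding the variances (here 1.5² + 1.0² = 3.25 m² s⁻²), which is the sense in which turbulence "broadens" the terminal-velocity spectrum.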
2012: Application of a Lightning Data Assimilation Technique in the WRF-ARW Model at Cloud-Resolving Scales for the Tornado Outbreak of 24 May 2011. Monthly Weather Review, 140, 2609–2627, doi:10.1175/MWR-D-11-00299.1.
This study presents the assimilation of total lightning data to help initiate convection at cloud-resolving scales within a numerical weather prediction model. The test case is the 24 May 2011 Oklahoma tornado outbreak, which was characterized by an exceptional synoptic/mesoscale setup for the development of long-lived supercells with large destructive tornadoes. In an attempt to reproduce the observed storms at a predetermined analysis time, total lightning data were assimilated into the Weather Research and Forecasting Model (WRF) and analyzed via a suite of simple numerical experiments. Lightning data assimilation forced deep, moist precipitating convection to occur in the model at roughly the locations and intensities of the observed storms as depicted by observations from the National Severe Storms Laboratory’s three-dimensional National Mosaic and Multisensor Quantitative Precipitation Estimation (QPE)—i.e., NMQ—radar reflectivity mosaic product. The nudging function for the total lightning data locally increases the water vapor mixing ratio (and hence relative humidity) via a simple smooth continuous function using gridded pseudo-Geostationary Lightning Mapper (GLM) resolution (9 km) flash rate and simulated graupel mixing ratio as input variables. The assimilation of the total lightning data for only a few hours prior to the analysis time significantly improved the representation of the convection at the analysis time and at the 1-h forecast within the convection-permitting and convection-resolving grids (i.e., 3 and 1 km, respectively). The results also highlighted possible forecast errors resulting from errors in the initial mesoscale thermodynamic variable fields. Although this case was primarily an analysis rather than a forecast, this simple and computationally inexpensive assimilation technique showed promising results and could be useful when applied to events characterized by moderate to intense lightning activity.
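The nudging function described above can be sketched as a smooth function of flash rate and graupel mixing ratio; the functional form and coefficient values here are illustrative placeholders, not necessarily those of the published scheme:

```python
import math

def nudged_rh(flash_rate, graupel_g_per_kg,
              a=0.81, b=0.2, c=0.01, d=0.25, alpha=2.2):
    """Target relative-humidity fraction toward which water vapor is
    nudged where lightning is observed. It increases monotonically with
    flash rate, while the graupel term suppresses nudging where the
    model already has deep convection. Coefficients are illustrative."""
    return a + b * math.tanh(c * flash_rate) * \
        (1.0 - math.tanh(d * graupel_g_per_kg ** alpha))
```

With no flashes the target stays at the baseline (here 0.81); heavy lightning over a graupel-free column pushes the target toward saturation, forcing convective initiation at the observed storm locations.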
2012: Tropical Oceanic Hot Towers: Need They Be Undilute to Transport Energy from the Boundary Layer to the Upper Troposphere Effectively? An Answer Based on Trajectory Analysis of a Simulation of a TOGA COARE Convective System. Journal of the Atmospheric Sciences, 69, 195–213, doi:10.1175/JAS-D-11-0147.1.
This paper addresses questions resulting from the authors’ earlier simulation of the 9 February 1993 Tropical Ocean Global Atmosphere Coupled Ocean–Atmosphere Research Experiment (TOGA COARE) squall line, which used updraft trajectories to illustrate how updrafts deposit significant moist static energy (in terms of equivalent potential temperature theta-e) in the upper troposphere, despite dilution and a theta-e minimum in the midtroposphere. The major conclusion drawn from this earlier work was that the “hot towers” that Riehl and Malkus showed as necessary to maintain the Hadley circulation need not be undilute. It was not possible, however, to document how the energy (or theta-e) increased above the midtroposphere. To address this relevant scientific question, a high-resolution (300 m) simulation was carried out using a standard 3-ICE microphysics scheme (Lin–Farley–Orville).
Detailed along-trajectory information also allows more accurate examination of the forces affecting each parcel’s vertical velocity W, their displacement, and the processes impacting theta-e, with focus on parcels reaching the upper troposphere. Below 1 km, pressure gradient acceleration forces parcels upward against negative buoyancy acceleration associated with the sum of (positive) virtual temperature excess and (negative) condensate loading. Above 1 km, the situation reverses, with the buoyancy (and thermal buoyancy) acceleration becoming positive and nearly balancing a negative pressure gradient acceleration, slightly larger in magnitude, leading to a W minimum at midlevels. The W maximum above 8 km and concomitant theta-e increase between 6 and 8 km are both due to release of latent heat resulting from the enthalpy of freezing of raindrops and riming onto graupel from 5 to 6.5 km and water vapor deposition onto small ice crystals and graupel pellets above that, between 7 and 10 km.
2012: Climate and Weather Impact Timing of Emergence of Bats. PLoS ONE, 7(8): e42737, 1–8, doi:10.1371/journal.pone.0042737.
Interest in forecasting the impacts of climate change has heightened attention in recent decades to how animals respond to variation in climate and weather patterns. One difficulty in determining animal response to climate variation is the lack of long-term datasets that record animal behaviors over decadal scales. We used radar observations from the national NEXRAD network of Doppler weather radars to measure how group behavior in a colonially roosting bat species responded to annual variation in climate and daily variation in weather over the past 11 years. Brazilian free-tailed bats (Tadarida brasiliensis) form dense aggregations in cave roosts in Texas. These bats emerge from caves daily to forage at high altitudes, which makes them detectable with Doppler weather radars. Timing of emergence in bats is often viewed as an adaptive trade-off between emerging early, which risks predation or increased competition, and emerging late, which restricts foraging opportunities. We used timing of emergence from five maternity colonies of Brazilian free-tailed bats in south-central Texas during the peak lactation period (15 June–15 July) to determine whether emergence behavior was associated with summer drought conditions and daily temperatures. Bats emerged significantly earlier during years with extreme drought conditions than during moist years. Bats emerged later on days with high surface temperatures in both dry and moist years, but there was no relationship between surface temperatures and timing of emergence in summers with normal moisture levels. We conclude that emergence behavior is a flexible animal response to climate and weather conditions and may be a useful indicator for monitoring animal response to long-term shifts in climate.
2012: Degree of Polarization at Horizontal Transmit: Theory and Applications for Weather Radar. IEEE Transactions on Geoscience and Remote Sensing, 50, 1291–1301, doi:10.1109/TGRS.2011.2167516.
This paper considers weather radar measurements at linear depolarization ratio (LDR) mode, consisting of transmission of horizontal polarization and simultaneous reception of the copolar (horizontal) and cross-polar (vertical) components of the returned wave. Such a system yields the coherency matrix, with four degrees of freedom. After a theoretical analysis of its structure and symmetries, we focus on three cross-polarization variables: LDR, cross-polar correlation coefficient at horizontal transmit (ρxh), and degree of polarization at horizontal transmit (pH). The different properties of these variables with respect to backscattering and propagation are analyzed, together with the bias induced by antenna cross-channel coupling. It is demonstrated that the degree of polarization at horizontal transmit possesses attractive properties in terms of robustness to propagation effects and antenna cross-channel coupling.
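The degree of polarization at horizontal transmit is obtained from the measured 2×2 coherency matrix; assuming the standard definition p = sqrt(1 − 4 det(J)/tr(J)²), a minimal sketch:

```python
import numpy as np

def degree_of_polarization(j):
    """Degree of polarization from a 2x2 coherency matrix J (Hermitian,
    positive semidefinite). p = 1 for a fully polarized wave
    (rank-1 J, zero determinant) and p = 0 for unpolarized returns
    (J proportional to the identity)."""
    tr = np.trace(j).real
    det = np.linalg.det(j).real
    return float(np.sqrt(max(0.0, 1.0 - 4.0 * det / tr ** 2)))
```

Because p depends only on the eigenvalue ratio of J, it is invariant under unitary transformations of the received field, which is one way to see the robustness to propagation effects and cross-channel coupling noted above.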
2012: Hail Swaths Observed from Satellite Data and Their Relation to Radar and Surface-Based Observations: A Case Study from Iowa in 2009. Weather and Forecasting, 27, 796–802, doi:10.1175/WAF-D-11-00118.1.
Several storms produced extensive hail damage over Iowa on 9 August 2009. The hail associated with these supercells was observed with radar data, reported by surface observers, and the resulting hail swaths were identified within satellite data. This study includes an initial assessment of cross validation of several radar-derived products and surface observations with satellite data for this storm event. Satellite-derived vegetation index data appear to be a useful product for cross validation of surface-based reports and radar-derived products associated with severe hail damage events. Satellite imagery acquired after the storm event indicated that decreased vegetation index values corresponded to locations of surface reported damage. The areal extent of decreased vegetation index values also corresponded to the spatial extent of the storms as characterized by analysis of radar data. While additional analyses are required and encouraged, these initial results suggest that satellite data of vegetated land surfaces are useful for cross validation of surface and radar-based observations of hail swaths and associated severe weather.
2012: Assimilation of Reflectivity Data in a Convective-Scale, Cycled 3DVAR Framework with Hydrometeor Classification. Journal of the Atmospheric Sciences, 69, 1054–1065.
The impact of assimilating radar reflectivity and radial velocity data with an intermittent, cycled three-dimensional variational assimilation (3DVAR) system is explored using an idealized thunderstorm case and a real data case on 8 May 2003. A new forward operator for radar reflectivity is developed that uses a background temperature field provided by a numerical weather prediction model for automatic hydrometeor classification. Three types of experiments are performed on both the idealized and real data cases. The first experiment uses radial velocity data only, the second experiment uses both radial velocity and reflectivity data without hydrometeor classification, and the final experiment uses both radial velocity and reflectivity data with hydrometeor classification. All experiments advance the analysis state to the next observation time using a numerical model prediction, which is then used as the background for the next analysis. Results from both the idealized and real data cases show that, assimilating only radial velocity data, the model can reconstruct the supercell thunderstorm after several cycles, but the development of precipitation is delayed because of the well-known spinup problem. The spinup problem is reduced dramatically when assimilating reflectivity without hydrometeor classification. The analyses are further improved using the new reflectivity formulation with hydrometeor classification. This study represents a successful first effort in variational convective-scale data assimilation to partition hydrometeors using a background temperature field from a numerical weather prediction model.
2012: Diagnostic Pressure Equation as a Weak Constraint in a Storm-Scale Three-Dimensional Variational Radar Data Assimilation System. Journal of Atmospheric and Oceanic Technology, 29, 1075–1092, doi:10.1175/JTECH-D-11-00201.1.
A diagnostic pressure equation is incorporated into a storm-scale three-dimensional variational data assimilation (3DVAR) system in the form of a weak constraint in addition to a mass continuity equation constraint (MCEC). The goal of this diagnostic pressure equation constraint (DPEC) is to couple different model variables to help build a more dynamically consistent analysis, and therefore improve the data assimilation results and subsequent forecasts. Observing System Simulation Experiments (OSSEs) are first performed to examine the impact of the pressure equation constraint on storm-scale radar data assimilation using an idealized tornadic thunderstorm simulation. The impact of MCEC is also investigated relative to that of DPEC. It is shown that DPEC can improve the data assimilation results slightly after a given period of data assimilation. Including both DPEC and MCEC yields the best data assimilation results. Sensitivity tests show that MCEC is not very sensitive to the choice of its weighting coefficients in the cost function, while DPEC is more sensitive and its weight should be carefully chosen. The updated 3DVAR system with DPEC is further applied to the 5 May 2007 Greensburg, Kansas, tornadic supercell storm case assimilating real radar data. It is shown that the use of DPEC can speed up the spinup of precipitation during the intermittent data assimilation process and also improve the follow-on forecast in terms of the general evolution of storm cells and mesocyclone rotation near the time of the observed tornado.
2012: Determining Key Model Parameters of Rapidly Intensifying Hurricane Guillermo (1997) Using the Ensemble Kalman Filter. Journal of the Atmospheric Sciences, 69, 3147–3171, doi:10.1175/JAS-D-12-022.1.
In this work the authors determine key model parameters for rapidly intensifying Hurricane Guillermo (1997) using the ensemble Kalman filter (EnKF). The approach is to utilize the EnKF as a tool only to estimate the parameter values of the model for a particular dataset. The assimilation is performed using dual-Doppler radar observations obtained during the period of rapid intensification of Hurricane Guillermo. A unique aspect of Guillermo was that during the period of radar observations strong convective bursts, attributable to wind shear, formed primarily within the eastern semicircle of the eyewall. To reproduce this observed structure within a hurricane model, background wind shear of some magnitude must be specified, and turbulence and surface parameters must be set appropriately, so that the impact of the shear on the simulated hurricane vortex can be realized. To identify the complex nonlinear interactions induced by changes in these parameters, an ensemble of model simulations has been conducted in which individual members were formulated by sampling the parameters within a certain range via a Latin hypercube approach. The ensemble and the data (latent heat and horizontal winds derived from the dual-Doppler radar observations) are utilized in the EnKF to obtain varying estimates of the model parameters. The parameters are estimated at each time instance, and a final parameter value is obtained by computing the average over time. Individual simulations were conducted using the estimates, with the simulation using latent heat parameter estimates producing the lowest overall model forecast error.
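The Latin hypercube step above stratifies each parameter range so that a small ensemble still covers the full range of every parameter. A minimal sketch follows; the parameter names and ranges are hypothetical stand-ins, not the study's actual turbulence and surface parameters:

```python
import random

def latin_hypercube(n_samples, bounds, rng=None):
    """Draw n_samples parameter sets, one value per equal-probability
    stratum of each parameter range, shuffled across ensemble members.

    bounds: dict mapping parameter name -> (low, high).
    """
    rng = rng or random.Random(0)
    samples = [{} for _ in range(n_samples)]
    for name, (lo, hi) in bounds.items():
        strata = list(range(n_samples))
        rng.shuffle(strata)                      # decouple parameters
        for i, s in enumerate(strata):
            u = (s + rng.random()) / n_samples   # jitter within the stratum
            samples[i][name] = lo + u * (hi - lo)
    return samples

# Illustrative ranges only (not the paper's values)
ensemble = latin_hypercube(8, {"horiz_mixing": (50.0, 500.0),
                               "drag_coeff": (1.0e-3, 3.0e-3)})
```

Each parameter's eight values land in eight distinct strata, so even a small ensemble samples the extremes as well as the middle of every range.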
2011: Hydrologic Evaluation of Rainfall Estimates from Radar, Satellite, Gauge, and Combinations on Ft. Cobb Basin, Oklahoma. Journal of Hydrometeorology, 12, 973–988, doi:10.1175/2011JHM1287.1.
This study evaluates rainfall estimates from the Next Generation Weather Radar (NEXRAD), operational rain gauges, Tropical Rainfall Measuring Mission (TRMM) Multisatellite Precipitation Analysis (TMPA), and Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks Cloud Classification System (PERSIANN-CCS) as inputs to a calibrated, distributed hydrologic model. A high-density Micronet of rain gauges on the 342-km2 Ft. Cobb basin in Oklahoma was used as reference rainfall to calibrate the National Weather Service's (NWS) Hydrology Laboratory Research Distributed Hydrologic Model (HL-RDHM) at 4-km/1-h and 0.25°/3-h resolutions. The unadjusted radar product was the worst product overall, while the stage IV radar product with hourly rain gauge adjustment had the best hydrologic skill, with a Micronet relative efficiency score of −0.5, only slightly worse than the reference simulation forced by Micronet rainfall. Simulations from TRMM-3B42RT were better than those from PERSIANN-CCS-RT (a real-time version of PERSIANN-CCS) and equivalent to those from the operational rain gauge network. The high degree of hydrologic skill with TRMM-3B42RT forcing was only achievable when the model was calibrated at TRMM's 0.25°/3-h resolution, highlighting the importance of considering rainfall product resolution during model calibration.
2012: Evaluation of tools used for monitoring and forecasting flash floods in the United States. Weather and Forecasting, 27, 158–173, doi:10.1175/WAF-D-10-05043.1.
This paper evaluates, for the first time, flash-flood guidance (FFG) values and the recently developed gridded FFG (GFFG) used by the National Weather Service (NWS) to monitor and predict imminent flash flooding, which is the leading storm-related cause of death in the United States. It is envisioned that results from this study will be used 1) to establish benchmark performance of existing operational flash-flood prediction tools and 2) to provide information to NWS forecasters that reveals how the existing tools can be readily optimized. Sources used to evaluate the products include official reports of flash floods from the NWS Storm Data database, discharge measurements on small basins available from the U.S. Geological Survey, and witness reports of flash flooding collected during the Severe Hazards Analysis and Verification Experiment. Results indicated that the operational guidance values, with no calibration, were marginally skillful, with the highest critical success index of 0.20 occurring with 3-h GFFG. The false-alarm rates fell and the skill improved to 0.34 when the rainfall was first spatially averaged within basins and then required to reach 50% of FFG for 1-h accumulations and to exceed the 3-h FFG. Although the skill of the GFFG values was generally lower than that of their FFG counterparts, GFFG was capable of detecting the spatial variability of reported flash flooding better than FFG was for a case study in an urban setting.
Keywords: Algorithms, Radars/Radar observations, Operational forecasting, Model evaluation/performance
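The skill measures quoted above come from a standard 2x2 warning contingency table of hits, misses, and false alarms. A minimal sketch (the counts below are illustrative, not the study's data):

```python
def csi(hits, misses, false_alarms):
    # Critical success index: hits / (hits + misses + false alarms)
    denom = hits + misses + false_alarms
    return hits / denom if denom else 0.0

def far(hits, false_alarms):
    # False-alarm ratio: fraction of warnings that did not verify
    denom = hits + false_alarms
    return false_alarms / denom if denom else 0.0

# Illustrative counts only
print(csi(20, 30, 50))   # 0.2
print(far(20, 60))       # 0.75
```

CSI penalizes both missed events and false alarms, which is why spatially averaging rainfall within basins (reducing false alarms) raises it even if the hit count is unchanged.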
2012: Exploring Impacts of Rapid-Scan Radar Data on NWS Warning Decisions. Weather and Forecasting, 27, 1031–1044, doi:10.1175/WAF-D-11-00145.1.
Rapid-scan weather radars, such as the S-band phased array radar at the National Weather Radar Testbed in Norman, Oklahoma, improve precision in the depiction of severe storm processes. To explore potential impacts of such data on forecaster warning decision making, 12 National Weather Service forecasters participated in a preliminary study with two control conditions: 1) when radar scan time was similar to volume coverage pattern 12 (4.5 min) and 2) when radar scan time was faster (43 s). Under these control conditions, forecasters were paired and worked a tropical tornadic supercell case. Their decision processes were observed and audio was recorded, interactions with data displays were video recorded, and the products were archived. A debriefing was conducted with each of the six teams independently and jointly, to ascertain the forecaster decision-making process. Analysis of these data revealed that teams examining the same data sometimes came to different conclusions about whether and when to warn. Six factors contributing toward these differences were identified: 1) experience, 2) conceptual models, 3) confidence, 4) tolerance of possibly missing a tornado occurrence, 5) perceived threats, and 6) software issues. The three 43-s teams issued six warnings: three verified, two did not verify, and one event was missed. Warning lead times were the following: tornado, 18.6 and 11.5 min, and severe, 6 min. The three tornado warnings issued by the three 4.5-min teams verified, though warning lead times were shorter: 4.6 and 0 min (two teams). In this case, use of rapid-scan data showed the potential to extend warning lead time and improve forecasters’ confidence, compared to standard operations.
2012: An automated technique to categorize storm type from radar and near-storm environment data. Atmospheric Research, 111, 104–113, doi:10.1016/j.atmosres.2012.03.004.
An automated approach to storm classification that relies on identifying storms from observed radar data and classifying them based on their shape, radar, and near-storm environmental parameters is described in this paper. Storms are identified and clustered within CONUS radar and environmental data using a combined watershed segmentation and k-means clustering technique. Storms were manually classified into short-lived convective cells, supercells, ordinary cells, or convective cells at two scales, using data from selected severe weather events between May 2008 and July 2009. Objects of composite reflectivity were identified and tracked using a clustering technique at two spatial scales, and attributes for every storm cluster were extracted based on radar and near-storm environment data from model analysis fields. Quinlan decision trees were trained on these individual attributes and implemented to nowcast storm types for both scales. It is shown in this paper that storms can be automatically identified and classified using a decision tree, and that these automatic classifications have different climatological properties, which are potentially useful for short-term forecasting.
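A trained Quinlan-style decision tree reduces at run time to nested threshold tests on storm attributes. The toy rule set below only illustrates that structure; the attribute names and thresholds are invented, not the trees learned in the paper:

```python
def classify_storm(attrs):
    """Toy rule set in the spirit of a Quinlan-style decision tree.

    Attribute names and thresholds are illustrative only; the paper's
    trees were trained on radar and near-storm environment attributes.
    """
    if attrs["expected_lifetime_min"] < 30.0:
        return "short-lived convective cell"
    if attrs["azimuthal_shear_s1"] >= 0.004 and attrs["mlcape_j_kg"] >= 1000.0:
        return "supercell"
    return "ordinary cell"

example = {"expected_lifetime_min": 90.0,
           "azimuthal_shear_s1": 0.006,
           "mlcape_j_kg": 2500.0}
print(classify_storm(example))   # supercell
```

Because the trained tree is just such a cascade of comparisons, classification is cheap enough to run on every identified storm cluster in real time.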
2012: Science of nowcasting olympic weather for Vancouver 2010 (SNOW-V10): A world weather research programme project. Pure and Applied Geophysics, 1, 1–24, doi:10.1007/s00024-012-0579-0.
A World Weather Research Programme (WWRP) project entitled the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) was developed in association with the Vancouver 2010 Olympic and Paralympic Winter Games, conducted between 12 February and 21 March 2010. The SNOW-V10 international team augmented the instrumentation associated with the Winter Games, and several new numerical weather forecasting and nowcasting models were added. Both the additional observational and model data were available to the forecasters in real time. This was an excellent opportunity to demonstrate existing capability in nowcasting and to develop better techniques for short-term (0–6 h) nowcasts of winter weather in complex terrain. Better techniques to forecast visibility, low cloud, wind gusts, and precipitation rate and type were evaluated. The weather during the games was exceptionally variable, with many periods of low visibility, low ceilings, and precipitation in the form of both snow and rain. The data collected should improve our understanding of many physical phenomena such as the diabatic effects due to melting snow, wind flow around and over terrain, diurnal flow reversal in valleys associated with daytime heating, and precipitation reductions and increases due to local terrain. Many studies related to these phenomena are described in the Special Issue on SNOW-V10 for which this paper was written. Numerical weather prediction and nowcast models have been evaluated against the unique observational data set now available. It is anticipated that the data set and the knowledge gained as a result of SNOW-V10 will become a resource for other World Meteorological Organization member states interested in improving forecasts of winter weather.
2012: Threshold Calculation for Coherent Detection in Dual-Polarization Weather Radars. IEEE Transactions on Aerospace and Electronic Systems, 48, 2198–2215, doi:10.1109/TAES.2012.6237588.
It is customary to censor signals in conventional weather radar using estimates of signal-to-noise ratio (SNR) and/or magnitude of autocorrelation coefficient at lag one. Dual-polarized weather radar provides a pair of highly correlated signals from the two orthogonally polarized returns. A novel censoring technique, previously proposed, sums powers, autocorrelations, and correlation between signals in the two channels and compares the sum to a threshold. In this paper an efficient procedure for calculating such a threshold is proposed.
2012: Assimilating AIRS Temperature and Mixing Ratio Profiles Using an Ensemble Kalman Filter Approach for Convective-Scale Forecasts. Weather and Forecasting, 27, 541–564, doi:10.1175/WAF-D-11-00090.1.
One satellite data product that has received great interest in the numerical weather prediction community is the temperature and mixing ratio profiles derived from the Atmospheric Infrared Sounder (AIRS) instrument on board the Aqua satellite. This research assesses the impact of assimilating AIRS profiles on high-resolution ensemble forecasts of southern plains severe weather events occurring on 26 May 2009 and 10 May 2010 by comparing two ensemble forecasts. In one ensemble, the 1830 and 2000 UTC level 2 AIRS temperature and dewpoint profiles are assimilated with all other routine observations into a 36-member, 15-km Weather and Research Forecast Model (WRF) ensemble using a Kalman filter approach. The other ensemble is identical, except that only routine observations are assimilated. In addition, 3-km one-way nested-grid ensemble forecasts are produced during the periods of convection. Results indicate that over the contiguous United States, the AIRS profiles do not measurably improve the ensemble mean forecasts of midtropospheric temperature and dewpoint. However, the ensemble mean dewpoint profiles in the region of severe convective development are improved by the AIRS assimilation. Comparisons of the forecast ensemble radar reflectivity probabilities between the 1- and 4-h forecast times with nearby Weather Surveillance Radar-1988 Doppler (WSR-88D) observations show that AIRS-enhanced ensembles consistently generate more skillful forecasts of the convective features at these times.
2011: Value of a Dual-Polarized Gap-Filling Radar in Support of Southern California Post-Fire Debris-Flow Warnings. Journal of Hydrometeorology, 12, 1581–1595, doi:10.1175/JHM-D-11-05.1.
A portable truck-mounted C-band Doppler weather radar was deployed to observe rainfall over the Station Fire burn area near Los Angeles, California, during the winter of 2009/10 to assist with debris-flow warning decisions. The deployments were a component of a joint NOAA–U.S. Geological Survey (USGS) research effort to improve definition of the rainfall conditions that trigger debris flows from steep topography within recent wildfire burn areas. A procedure was implemented to blend various dual-polarized estimators of precipitation (for radar observations taken below the freezing level) using threshold values for differential reflectivity and specific differential phase shift, improving the accuracy of the rainfall estimates over a specific burn area sited with terrestrial tipping-bucket rain gauges. The portable radar outperformed local Weather Surveillance Radar-1988 Doppler (WSR-88D) National Weather Service network radars in detecting rainfall capable of initiating post-fire runoff-generated debris flows. The network radars underestimated hourly precipitation totals by about 50%. Consistent with intensity–duration threshold curves determined from past debris-flow events in burned areas in Southern California, the portable radar-derived rainfall rates exceeded the empirical thresholds over a wider range of storm durations, with a higher spatial resolution, than local National Weather Service operational radars. Moreover, the truck-mounted C-band radar's dual-polarimetric estimates of rainfall intensity provided a better guide to the expected severity of debris-flow events, based on criteria derived from previous events using rain gauge data, than traditional radar-derived rainfall approaches using reflectivity–rainfall relationships for either the portable or operational network WSR-88D radars. Part of the reason for the improvement was that the radar was sited closer to the burn zone than the WSR-88Ds, but use of the dual-polarimetric variables improved the rainfall estimation by ~12% over the use of traditional Z–R relationships.
2011: The role of unbalanced mesoscale circulations in dust storms. Journal of Geophysical Research - D: Atmospheres, 116, 218–247, doi:10.1029/2011JD016218.
In this study, two dust storms in northwestern Nevada (February 2002 and April 2004) are investigated through the use of Weather Research and Forecasting (WRF) model simulations. The focus of the study is twofold: (1) examination of dynamic processes on the meso-β scale for both cases, and (2) analysis of extreme upper-air cooling prior to storm formation and the development of a nearly discontinuous gust front in the 2002 case that could not be validated in an earlier synoptic-scale study. Results of the simulations suggest that the driving mechanism for dust storm dynamics derives from the breakdown of, and subsequent return to, balance between the advection of geostrophic wind and total wind in the exit region of the polar jet. In this process, the deviation from quasi-geostrophic (Q-G) balance creates a plume of ascent along and to the right of the jet's exit region. The cold pool generated in the mid-to-lower troposphere in consequence of this adjustment sets up the kinetic energy in the planetary boundary layer and creates a forward-leaning (sloping from north to south) cold front under the jet exit region. Surface heating is coupled with this frontal structure, and rapid surface pressure falls (rises) occur initially (later) in response to diabatic (adiabatic) processes. The adjustments occur at fast time scales, radically different from those in studies that followed the Q-G tenets of the Danielsen paradigm. The results of this study indicate that meso-β scale features associated with subgeostrophy in the exit region of the curved jet aloft and associated thermal wind imbalance (700–500 hPa) lead to significant velocity divergence aloft. Mass/momentum adjustments and the associated cooling strengthen the baroclinic zone aloft. The restoration to thermal wind balance accompanying this cooling resulted in a narrow zone of surface pressure rise and strong low-level isallobaric winds. The turbulent momentum for dust ablation comes from these strong low-level winds.
2012: Upstream midtropospheric circulation enabling leeside (spillover) precipitation over the Sierra Nevada. Journal of Hydrometeorology, 13, 1372–1394.
Spillover precipitation over the northern Sierra Nevada has been investigated using the Omega model. Results indicate that deep convection over the eastern Pacific Ocean transports low-level water vapor to the upper levels of the troposphere, where it is subsequently carried northeast by the subtropical jet. This stream of moisture interacts with the polar jet as it moves into the northwestern United States and sets the stage for heavy precipitation not only on the west slopes of the Sierra Nevada but also on the lee side, owing to moisture transport in an atmospheric river at levels higher than the mountain range.
2012: Quantifying animal phenology in the aerosphere at a continental scale using NEXRAD weather radars. Ecosphere, 3(2), 1–9, doi:10.1890/ES11-00257.1.
One of the primary ecological manifestations of climate change is a shift in the timing of events in a species’ annual cycle. Such phenological shifts have been documented in numerous taxa, but data for animals have been derived primarily from human observers rather than networks of instruments used for remote sensing. The potential to use the network of weather radars in the United States (NEXRAD) to remotely sense animal phenologies could advance our understanding of the spatiotemporal scaling of phenologies in relation to shifts in local and regional climate. We tested the utility of NEXRAD radar products for quantifying the phenology of the purple martin (Progne subis) at summer roost sites in the United States. We found that the maximum radar reflectivity value in the hour before local sunrise above purple martin roost sites contained a strong phenological signal of significantly increased radar reflectivity during June, July, and August 2010. The seasonal pattern in this radar signal matched our expectation of the timing of formation and dissipation of these seasonal roosts. Radar reflectivity was greater and less variable when considering roosts close to NEXRAD stations (<25 km) than when including all 358 documented roosts; there was a negative relationship between maximum reflectivity and the distance between a roost and the nearest NEXRAD. Our results suggest that: (1) mosaicked NEXRAD radar products are a valuable source of information on the phenology of bioscatter in the aerosphere; (2) citizen scientists who document the locations of roosts on the ground are providing critical information for advancing our understanding of animal phenology and aeroecology; and (3) ongoing research that examines spatiotemporal relationships among radar-derived phenologies in airborne organisms, climate, and land cover change is likely to provide further insights.
2012: Microwave satellite data for hydrologic modeling in ungauged basins. IEEE Geoscience and Remote Sensing Letters, 9, 663–667, doi:10.1109/LGRS.2011.2177807.
An innovative flood-prediction framework is developed using Tropical Rainfall Measuring Mission precipitation forcing and a proxy for river discharge from the Advanced Microwave Scanning Radiometer for Earth Observing System (AMSR-E) onboard the National Aeronautics and Space Administration's Aqua satellite. The AMSR-E-detected water surface signal was correlated with in situ measurements of streamflow in the Okavango Basin in Southern Africa, as indicated by a Pearson correlation coefficient of 0.90. A distributed hydrologic model, with structural data sets derived from remote-sensing data, was calibrated to yield simulations matching the flood frequencies from the AMSR-E-detected water surface signal. Model performance during a validation period yielded a Nash–Sutcliffe efficiency of 0.84. We concluded that remote-sensing data from microwave sensors could be used to supplement stream gauges in large sparsely gauged or ungauged basins to calibrate hydrologic models. Given the global availability of all required data sets, this approach can potentially be expanded to improve flood monitoring and prediction in sparsely gauged basins throughout the world.
Keywords: digital elevation models (DEMs), distributed hydrologic modeling, floods, passive microwave sensors, satellite remote sensing
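Both figures of merit quoted above (the Pearson correlation coefficient and the Nash–Sutcliffe efficiency) are simple functions of paired observed and simulated series; a minimal sketch with made-up data:

```python
import math

def pearson_r(x, y):
    # Linear correlation between two equal-length series
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def nash_sutcliffe(obs, sim):
    # NSE = 1 - sum((obs - sim)^2) / sum((obs - mean_obs)^2); 1.0 is perfect
    mo = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mo) ** 2 for o in obs)
    return 1.0 - num / den

# Illustrative series only, not the Okavango data
obs = [1.0, 2.0, 3.0, 4.0]
sim = [1.1, 1.9, 3.2, 3.8]
r = pearson_r(obs, sim)
nse = nash_sutcliffe(obs, sim)
```

An NSE of 0 means the simulation is no better than predicting the observed mean, so the reported 0.84 indicates substantial skill.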
2012: Toward a framework for systematic error modeling of spaceborne precipitation radar with NOAA/NSSL ground radar-based National Mosaic QPE. Journal of Hydrometeorology, 13, 1285–1300, doi:10.1175/JHM-D-11-0139.1.
Characterization of the error associated with satellite rainfall estimates is a necessary component of deterministic and probabilistic frameworks involving spaceborne passive and active microwave measurements for applications ranging from water budget studies to forecasting natural hazards related to extreme rainfall events. The authors focus here on the error structure of NASA’s Tropical Rainfall Measurement Mission (TRMM) Precipitation Radar (PR) quantitative precipitation estimation (QPE) at ground. The problem is addressed by comparison of PR QPEs with reference values derived from ground-based measurements using NOAA/NSSL ground radar–based National Mosaic and QPE system (NMQ/Q2). A preliminary investigation of this subject has been carried out at the PR estimation scale (instantaneous and 5 km) using a 3-month data sample in the southern part of the United States. The primary contribution of this study is the presentation of the detailed steps required to derive a trustworthy reference rainfall dataset from Q2 at the PR pixel resolution. It relies on a bias correction and a radar quality index, both of which provide a basis to filter out the less trustworthy Q2 values. Several aspects of PR errors are revealed and quantified including sensitivity to the processing steps with the reference rainfall, comparisons of rainfall detectability and rainfall-rate distributions, spatial representativeness of error, and separation of systematic biases and random errors. The methodology and framework developed herein applies more generally to rainfall-rate estimates from other sensors on board low-earth-orbiting satellites such as microwave imagers and dual-wavelength radars such as with the Global Precipitation Measurement (GPM) mission.
2011: Evolving multisensor precipitation estimation methods: Their impacts on flow prediction using a distributed hydrologic model. Journal of Hydrometeorology, 12, 1414–1431, doi:10.1175/JHM-D-10-05038.1.
This study investigates evolving methodologies for radar and merged gauge–radar quantitative precipitation estimation (QPE) to determine their influence on the flow predictions of a distributed hydrologic model. These methods include the National Mosaic and QPE algorithm package (NMQ), under development at the National Severe Storms Laboratory (NSSL), and the Multisensor Precipitation Estimator (MPE) and High-Resolution Precipitation Estimator (HPE) suites currently operational at National Weather Service (NWS) field offices. The goal of the study is to determine which combination of algorithm features offers the greatest benefit toward operational hydrologic forecasting. These features include automated radar quality control, automated Z–R selection, brightband identification, bias correction, multiple radar data compositing, and gauge–radar merging, which all differ between NMQ and MPE–HPE. To examine the spatial and temporal characteristics of the precipitation fields produced by each of the QPE methodologies, high-resolution (4 km and hourly) gridded precipitation estimates were derived by each algorithm suite for three major precipitation events between 2003 and 2006 over subcatchments within the Tar–Pamlico River basin of North Carolina. The results indicate that the NMQ radar-only algorithm suite consistently yielded closer agreement with reference rain gauge reports than the corresponding HPE radar-only estimates did. Similarly, the NMQ radar-only QPE input generally yielded hydrologic simulations that were closer to observations at multiple stream gauging points. These findings indicate that the combination of Z–R selection and freezing-level identification algorithms within NMQ, but not incorporated within MPE and HPE, would have an appreciable positive impact on hydrologic simulations. 
There were relatively small differences between NMQ and HPE gauge–radar estimates in terms of accuracy and impacts on hydrologic simulations, most likely due to the large influence of the input rain gauge information.
2011: Precipitation properties of supercell hook echoes. Electronic Journal of Severe Storms Meteorology, 6(5), 1–21.
Recent studies have suggested that thermodynamic properties of supercell rear-flank downdrafts (RFDs) can affect whether or not tornadogenesis occurs. The thermodynamic characteristics of RFDs are determined in part by microphysical processes such as evaporation of raindrops and melting of hailstones. Whereas in situ measurements of hook-echo particle size distributions (PSDs) are exceedingly rare, polarimetric radars can be used to determine the bulk characteristics of these PSDs remotely. A preliminary analysis of polarimetric radar data from a small sample of supercell hook echoes reveals unusual drop size distributions compared to typical rainfall in Oklahoma, as well as spatially inhomogeneous structures. The inner edge of the hook echo is often characterized by low or moderate reflectivity factor at horizontal polarization (ZH), with very high differential reflectivity factor (ZDR), indicating a sparse population of very large drops. The southern and/or western (back) portion of the hook is characterized by moderate to high ZH and rather low ZDR, indicating a surplus of small drops and/or a lack of larger drops. Hypotheses explaining the unusual drop size distributions are presented. Additionally, the time evolution of these characteristics is explored using data collected with a special rapid scanning strategy.
2012: The impact of size sorting on the polarimetric radar variables. Journal of the Atmospheric Sciences, 69, 2042–2060, doi:10.1175/JAS-D-11-0125.1.
Differential sedimentation of precipitation occurs because heavier hydrometeors fall faster than lighter ones. Updrafts and vertical wind shear can maintain this otherwise transient size sorting, resulting in prolonged regions of ongoing particle sorting in storms. This study quantifies the impact of size sorting on the S-band polarimetric radar variables (radar reflectivity factor at horizontal polarization ZH, differential reflectivity ZDR, specific differential phase KDP, and the copolar cross-correlation coefficient ρhv). These variables are calculated from output of two idealized bin models: a one-dimensional model of pure raindrop fallout and a two-dimensional rain shaft encountering vertical wind shear. Additionally, errors in the radar variables as simulated by single-, double-, and triple-moment bulk microphysics parameterizations are quantified for the same size sorting scenarios.
Size sorting produces regions of sparsely concentrated large drops with a lack of smaller drops, causing ZDR enhancements as large as 1 dB in areas of decreased ZH, often along a ZH gradient. Such areas of enhanced ZDR are offset from those of high ZH and KDP. Illustrative examples of polarimetric radar observations in a variety of precipitation regimes demonstrate the widespread occurrence of size sorting and are consistent with the bin model simulations. Single-moment schemes are incapable of size sorting, leading to large underestimations in ZDR (>2 dB) compared to the bin model solution. Double-moment schemes with a fixed spectral shape parameter produce excessive size sorting by incorrectly increasing the number of large raindrops, overestimating ZDR by 2–3 dB. Three-moment schemes with variable shape parameters better capture the narrowing drop size distribution resulting from size sorting but can underestimate ZDR and overestimate KDP by as much as 20%. Implications for polarimetric radar data assimilation into storm-scale numerical weather prediction models are discussed.
2012: Image processing of weather radar reflectivity data: Should it be done in Z or dBZ? EJSSM, 7, 1–4.
It appears to be a common belief that processing of weather radar reflectivity fields, such as interpolation, clustering, or smoothing, ought to be carried out on the reflectivity factor ($Z$) and not on its logarithm (dB$Z$). It is demonstrated here, by means of a statistical study on a large dataset, that, contrary to this belief, processing in dB$Z$ is better for such applications.
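To make the distinction concrete, here is a minimal sketch (an illustration of the general point, not the paper's actual experiment) of a moving average applied in each domain; the choice matters because Z = 10^(dBZ/10) spans orders of magnitude:

```python
import numpy as np

def smooth_in_dbz(dbz, size=3):
    # moving average applied directly to the logarithmic dBZ values
    kernel = np.ones(size) / size
    return np.convolve(dbz, kernel, mode="valid")

def smooth_in_z(dbz, size=3):
    # convert to linear reflectivity factor Z, average, convert back
    z = 10.0 ** (np.asarray(dbz) / 10.0)
    kernel = np.ones(size) / size
    return 10.0 * np.log10(np.convolve(z, kernel, mode="valid"))

gates = np.array([10.0, 20.0, 60.0])  # dBZ values along a hypothetical ray
print(smooth_in_dbz(gates))           # [30.], a geometric-mean-like result
print(smooth_in_z(gates))             # ~[55.2], dominated by the 60-dBZ gate
```

Averaging in Z lets a single strong gate dominate its neighborhood, which is one intuition for why a dBZ-domain operation can behave better for tasks such as smoothing and clustering.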
2012: Visualizing Model Data Using A Fast Approximation of a Radiative Transfer Model. Journal of Atmospheric and Oceanic Technology, 29, 745–754.
Visualizing model forecasts using simulated satellite imagery has proven very useful because the depiction of forecasts using cloud imagery can provide inferences about meteorological scenarios and physical processes that are not characterized well by depictions of those forecasts using radar reflectivity. A forward radiative transfer model is capable of providing such a visible-channel depiction of numerical weather prediction model output, but present-day forward models are too slow to run routinely on operational model forecasts.
It is demonstrated that it is possible to approximate the radiative transfer model using a universal approximator whose parameters can be determined by fitting the output of the forward model to inputs derived from the raw output of the prediction model. The resulting approximation is very close to the result derived from the complex radiative transfer model and has the advantage that it can be computed in a small fraction of the time required by the forward model. This approximation is carried out on model forecasts to demonstrate its utility as a visualization and forecasting tool.
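The general recipe, sampling a slow forward model offline and fitting a fast surrogate to its output, can be sketched as follows. The sine "forward model" and the polynomial approximator here are stand-ins of my own, not the paper's scheme:

```python
import numpy as np

def expensive_forward_model(x):
    # stand-in for a slow radiative-transfer calculation (hypothetical)
    return np.sin(2.0 * x) + 0.5 * x

# sample the "slow" model once, offline
x_train = np.linspace(0.0, 1.0, 200)
y_train = expensive_forward_model(x_train)

# fit a cheap surrogate; the paper fits a universal approximator to the
# forward-model output, and a low-order polynomial stands in for it here
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=7))

# the surrogate can now be evaluated in a fraction of the time
x_test = np.linspace(0.05, 0.95, 50)
max_err = np.max(np.abs(surrogate(x_test) - expensive_forward_model(x_test)))
```

Once fitted, only the cheap surrogate needs to run on each new forecast, which is what makes routine visualization of operational output feasible.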
2012: A Statistical Approach to Mitigating Persistent Clutter in Radar Reflectivity Data. IEEE J. Selected Topics in Applied Earth Observations and Remote Sensing, 5, 652–662.
A statistical approach to creating a clutter map from "found data," i.e., data not specifically collected in clear air, is described in this paper. Different methods of mitigating ground clutter are then compared using an information theory statistical approach, and the best mitigation approach is chosen.
The technique described in this paper allows for the mitigation of persistent ground clutter returns in archived data where signal processing techniques have not been applied or have been conservatively applied. It is also helpful for correcting mobile radar data where the creation of a clear-air clutter map is impractical. Accordingly, the technique is demonstrated in each of the above situations.
2012: A Physical Model of Branching in Upward Leaders. AerospaceLab, 5, 07-1–07-7.
The physical processes leading to branching, and the physical factors affecting branching features, are poorly understood. We apply a tested physical model of axisymmetric leader development following the streamer–leader transition to the three-dimensional propagation of a leader with branching. The propagation of the leader is driven by the potential drop at the leader tip. Branching occurs when the potential drop at the leader tip reaches a threshold. The space charge around the leaders self-regulates the total number of active branches by reducing the potential available for propagation. The model has been applied to simulate the time evolution of an upward leader started from a tall ground structure and developing in an electric field produced by a mature thunderstorm. The results of the computer simulation of a branching leader closely resemble the branching of upward positive leaders triggered by tall structures, as depicted in high-speed video images.
2011: Transient luminous events above two mesoscale convective systems: Charge moment change analysis. Journal of Geophysical Research: Space Physics, 116, A10306, 1–11, doi:10.1029/2011JA016758.
Charge moment change (ΔMQ) data were examined for 41 positive cloud-to-ground (+CG) lightning discharges that were parents of transient luminous events (TLEs; mainly sprites) over two different storms: 9 May (20 parents) and 20 June 2007 (21). Data were broken down by contributions from the impulse ΔMQ (iΔMQ), within the first 2 ms of the return stroke, and the ΔMQ from the continuing current (CC), which can last tens of ms afterward. Three-dimensional lightning mapping data provided positions for the in-cloud components of the parent +CGs. Charge and charge density neutralized by the strokes were estimated. The 20 June parents were more impulsive than those of 9 May, with increased iΔMQ and CC amplitude but reduced CC duration. Total ΔMQ values between the two storms were very similar, averaging approximately 1800 C km. Estimated charge density on 20 June was nearly twice that on 9 May, consistent with the 20 June storm being more intense with a stronger electrical generator. Lightning metrics were analyzed for nine high-iΔMQ (>300 C km) +CGs that did not produce an observable TLE on 20 June, and compared to that day's TLE parents. Non-TLE +CGs had reduced CC magnitudes and duration, with less total ΔMQ. Photogrammetric estimates of TLE azimuthal swaths were positively correlated with similar metrics of the in-cloud portions of the parent +CGs, as well as with total ΔMQ. The implications of all these results for the ΔMQ theory of sprite initiation, and for the relationship between sprite development and in-cloud discharging, are discussed.
2012: Multilag Correlation Estimators for Polarimetric Radar Measurements in the Presence of Noise. Journal of Atmospheric and Oceanic Technology, 29, 772–795, doi:10.1175/JTECH-D-11-00010.1.
The quality of polarimetric radar data degrades as the signal-to-noise ratio (SNR) decreases. This substantially limits the usage of collected polarimetric radar data to high SNR regions. To improve data quality at low SNRs, multilag correlation estimators are introduced. The performance of the multilag estimators for spectral moments and polarimetric parameters is examined through a theoretical analysis and by the use of simulated data. The biases and standard deviations of the estimates are calculated and compared with those estimates obtained using the conventional method.
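The core idea behind multilag estimation, that white receiver noise biases only the lag-0 correlation while the signal correlation persists at nonzero lags, can be illustrated with a toy simulation. The AR(1) signal model below is my own stand-in, not the Gaussian correlation model used in the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
M = 100_000                 # sample count
S, N = 1.0, 1.0             # true signal and noise powers (0 dB SNR)
rho = 0.95                  # lag-1 correlation of the simulated signal

# AR(1) stand-in for a correlated weather signal, plus white receiver noise
w = rng.normal(scale=np.sqrt(S * (1.0 - rho**2)), size=M)
sig = np.zeros(M)
for i in range(1, M):
    sig[i] = rho * sig[i - 1] + w[i]
x = sig + rng.normal(scale=np.sqrt(N), size=M)

def autocorr(x, lag):
    return np.mean(x * x) if lag == 0 else np.mean(x[:-lag] * x[lag:])

R0, R1, R2 = autocorr(x, 0), autocorr(x, 1), autocorr(x, 2)
S_lag0 = R0                 # conventional lag-0 estimate: biased by noise N
S_multilag = R1**2 / R2     # uses only noise-free lags (exact for AR(1))
```

With these settings S_lag0 comes out near 2.0 (signal plus noise power) while S_multilag recovers roughly the true 1.0; the published estimators generalize this idea to more lags and to the polarimetric variables.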
2012: Herbert Riehl: Intrepid and Enigmatic Scholar. Bulletin of the American Meteorological Society, 93, 963–985, doi:10.1175/BAMS-D-11-00224.1.
Herbert Riehl, known as the “father of tropical meteorology”, certainly made outstanding contributions to this field of study. Yet, when his oeuvre is examined retrospectively, there is strong evidence that his view was global and encompassed processes that cut across the latitudinal bands of the tropics, subtropics, and mid-latitudes. His pathway into meteorology was unique as a Jewish man who immigrated to the USA from Germany in 1933 — that point in time when the fascist regime in Germany gained significant power. Meteorology was not his first choice as a career, but circumstances related to imminent world war led him to the study of meteorology. He was inspired by his teaching and research experiences at the Institute of Tropical Meteorology in Puerto Rico during WWII. Further, he found his scientific calling in the milieu of “Rossby’s School” at the University of Chicago (U of C) following the war. We pay particular attention to his early work from the mid-1940s through the late-1950s while professor at the U of C – a period when he ventured into the relatively unknown field of tropical meteorology. The strength of his early research contributions along with his mastery of language and adeptness in scientific debate drew many first-rate students into the field. Yet, his unorthodox brand of mentorship and his hard-edged nature created challenges that are further examined through first-person verbal portraits or vignettes. In some detail, we explore the interaction between Riehl and one of his students, Joanne Simpson. We end with a discussion of his scientific legacy.
2012: Analyzing projected changes and trends of temperature and precipitation in the southern USA from 16 downscaled global climate models. Theoretical and Applied Climatology, 1–16, doi:10.1007/s00704-011-0567-9.
This study aims to examine how future climate, temperature and precipitation specifically, are expected to change under the A2, A1B, and B1 emission scenarios over the six states that make up the Southern Climate Impacts Planning Program (SCIPP): Oklahoma, Texas, Arkansas, Louisiana, Tennessee, and Mississippi. SCIPP is a member of the National Oceanic and Atmospheric Administration-funded Regional Integrated Sciences and Assessments network, a program which aims to better connect climate-related scientific research with in-the-field decision-making processes. The study found that the average temperature over the study area is anticipated to increase by 1.7°C to 2.4°C in the twenty-first century based on the different emission scenarios, with a rate of change that is more pronounced during the second half of the century. Summer and fall seasons are projected to have more significant temperature increases, while the northwestern portions of the region are projected to experience more significant increases than the Gulf coast region. Precipitation projections, conversely, do not exhibit a discernible upward or downward trend. The late twenty-first century exhibits slightly more precipitation than the early century, based on the A1B and B1 scenarios, and fall and winter are projected to become wetter than in the late twentieth century as a whole. Climate changes at the city level show that greater warming will happen in inland cities such as Oklahoma City and El Paso, and heavier precipitation in Nashville. These changes have profound implications for local water resources management as well as broader regional decision making. These results represent an initial phase of a broader study that is being undertaken to assist SCIPP regional and local water planning efforts, in an effort to more closely link climate modeling to longer-term water resources management and to continue assessing climate change impacts on regional hazards management in the South.
2011: Advancing research and applications with lightning detection and mapping systems. Eos, Trans. Amer. Geophys. Union, 92, 400, doi:10.1029/2011EO450007.
The Southern Thunder 2011 (ST11) Workshop was the fourth in a series intended to accelerate research and operational applications made possible by the expanding availability of ground-based and satellite systems that detect and map all types of lightning (in-cloud and cloud-to-ground). This community workshop, first held in 2004, brings together lightning data providers, algorithm developers, and operational users in government, academia, and industry.
ST11 presentations described the ongoing expansion of regional ground-based networks that map the location of all types of lightning and updated plans for the Geostationary Lightning Mapper (GLM) scheduled for launch on the next-generation Geostationary Operational Environmental Satellite-R series (GOES-R) in late 2015 (http://www.goes-r.gov). Presentations also described new techniques for tracking trends in lightning flash rates, displaying and using those trends in National Weather Service (NWS) forecast offices, and improving lightning safety.
2011: The timing of cloud-to-ground lightning relative to total lightning activity. Monthly Weather Review, 139, 3871–3886, doi:10.1175/MWR-D-11-00047.1.
The first flash produced by a storm usually does not strike ground, but little has been published concerning the time after the first flash before a cloud-to-ground flash occurs, particularly for a variety of climatological regions. To begin addressing this issue, this study analyzed data from very-high-frequency (VHF) lightning mapping systems, which detect flashes of all types, and from the U.S. National Lightning Detection Network (NLDN), which identifies flash type and detects roughly 90% of cloud-to-ground flashes overall. VHF mapping data were analyzed from three regions: north Texas, Oklahoma, and the high plains of Colorado, Kansas, and Nebraska. The percentage of storms in which a cloud-to-ground flash was detected in the first minute of lightning activity varied from 0% in the high plains to 10%–20% in Oklahoma and north Texas. The distribution of delays to the first cloud-to-ground flash varied similarly. In Oklahoma and north Texas, 50% of storms produced a cloud-to-ground flash within 5–10 min, and roughly 10% failed to produce a cloud-to-ground flash within 1 h. In the high plains, however, it required 30 min for 50% of storms to have produced a cloud-to-ground flash, and 20% produced no ground flash within 1 h. The authors suggest that the reason high plains storms take longer to produce cloud-to-ground lightning is that the formation of the lower charge needed to produce most cloud-to-ground flashes is inhibited either by delaying the formation of precipitation in the mid- and lower levels of storms or by many of the storms having an inverted-polarity electrical structure.
2012: Infrared measurements in the Arctic using two Atmospheric Emitted Radiance Interferometers. Atmos. Meas. Tech., 5, 329–344, doi:10.5194/amt-5-329-2012.
The Extended-range Atmospheric Emitted Radiance Interferometer (E-AERI) is a moderate resolution (1 cm−1) Fourier transform infrared spectrometer for measuring the absolute downwelling infrared spectral radiance from the atmosphere between 400 and 3000 cm−1. The extended spectral range of the instrument permits monitoring of the 400–550 cm−1 (20–25 μm) region, where most of the infrared surface cooling currently occurs in the dry air of the Arctic. Spectra from the E-AERI have the potential to provide information about radiative balance, trace gases, and cloud properties in the Canadian high Arctic. Calibration, performance evaluation, and certification of the E-AERI were performed at the University of Wisconsin Space Science and Engineering Center from September to October 2008. The instrument was then installed at the Polar Environment Atmospheric Research Laboratory (PEARL) Ridge Lab (610 m altitude) at Eureka, Nunavut, in October 2008, where it acquired one year of data. Measurements are taken every seven minutes year-round, including polar night when the solar-viewing spectrometers at PEARL are not operated. A similar instrument, the University of Idaho’s Polar AERI (P-AERI), was installed at the Zero-altitude PEARL Auxiliary Laboratory (0PAL), 15 km away from the PEARL Ridge Lab, from March 2006 to June 2009. During the period of overlap, these two instruments provided calibrated radiance measurements from two altitudes. A fast line-by-line radiative transfer model is used to simulate the downwelling radiance at both altitudes; the largest differences (simulation minus measurement) occur in spectral regions strongly influenced by atmospheric temperature and/or water vapour. The two AERI instruments in close proximity but located at two different altitudes are well suited for investigating cloud forcing. As an example, it is shown that a thin, low ice cloud resulted in a 6% increase in irradiance.
The presence of clouds creates a large surface radiative forcing in the Arctic, particularly in the 750–1200 cm−1 region where the downwelling radiance is several times greater than clear-sky radiances, which is significantly larger than in other more humid regions.
2012: The Pretornadic Phase of the Goshen County, Wyoming, Supercell of 5 June 2009 Intercepted by VORTEX2. Part I: Evolution of Kinematic and Surface Thermodynamic Fields. Monthly Weather Review, 140, 2887–2915, doi:10.1175/MWR-D-11-00336.1.
The authors analyze the pretornadic phase (2100–2148 UTC; tornadogenesis began at 2152 UTC) of the Goshen County, Wyoming, supercell of 5 June 2009 intercepted by the second Verification of the Origins of Rotation in Tornadoes Experiment (VORTEX2). The analysis relies on radar data from the Weather Surveillance Radar-1988 Doppler (WSR-88D) in Cheyenne, Wyoming (KCYS), and a pair of Doppler-on-Wheels (DOW) radars, as well as mobile mesonet observations and mobile sounding observations.
The storm resembles supercells that have been observed in the past. For example, it develops a couplet of counter-rotating vortices that straddle the hook echo within the rear-flank outflow and are joined by arching vortex lines, with the cyclonic vortex becoming increasingly dominant in the time leading up to tornadogenesis. The outflow in the hook echo region, where sampled, has relatively small virtual potential temperature θv deficits during this stage of evolution. A few kilometers upstream (north) of the location of maximum vertical vorticity, θv is no more than 3 K colder than the warmest θv readings in the inflow of the storm. Forward trajectories originating in the outflow within and around the low-level mesocyclone rise rapidly, implying that the upward-directed perturbation pressure gradient force exceeds the negative buoyancy.
Low-level rotation intensifies in the 2142–2148 UTC period. The intensification is preceded by the formation of a descending reflectivity core (DRC), similar to others that have been documented in some supercells recently. The DRC is associated with a rapid increase in the vertical vorticity and circulation of the low-level mesocyclone.
2012: The Pretornadic Phase of the Goshen County, Wyoming, Supercell of 5 June 2009 Intercepted by VORTEX2. Part II: Intensification of Low-Level Rotation. Monthly Weather Review, 140, 2916–2938, doi:10.1175/MWR-D-11-00337.1.
The dynamical processes responsible for the intensification of low-level rotation prior to tornadogenesis are investigated in the Goshen County, Wyoming, supercell of 5 June 2009 intercepted by the second Verification of the Origins of Rotation in Tornadoes Experiment (VORTEX2). The circulation of material circuits that converge upon the low-level mesocyclone is principally acquired along the southern periphery of the forward-flank precipitation region, which is a corridor characterized by a horizontal buoyancy gradient; thus, much of the circulation appears to have been baroclinically generated. The descending reflectivity core (DRC) documented in Part I of this paper has an important modulating influence on the circulation of the material circuits. A circuit that converges upon the low-level mesocyclone center prior to the DRC’s arrival at low levels (approximately the arrival of the 55-dBZ reflectivity isosurface in this case) loses some of its previously acquired circulation during the final few minutes of its approach. In contrast, a circuit that approaches the low-level mesocyclone center after the DRC arrives at low levels does not experience the same adversity.
An analysis of the evolution of angular momentum within a circular control disk centered on the low-level mesocyclone reveals that the area-averaged angular momentum in the nearby surroundings of the low-level mesocyclone increases while the mesocyclone is occluding and warm-sector air is being displaced from the near surroundings. The occlusion process reduces the overall negative vertical flux of angular momentum into the control disk and enables the area-averaged angular momentum to continue increasing even though the positive radial influx of angular momentum is decreasing in time.
2012: A Method for Calibrating Deterministic Forecasts of Rare Events. Weather and Forecasting, 27, 531–538, doi:10.1175/WAF-D-11-00074.1.
Convection-allowing models offer forecasters unique insight into convective hazards relative to numerical models using parameterized convection. However, methods to best characterize the uncertainty of guidance derived from convection-allowing models are still unrefined. This paper proposes a method of deriving calibrated probabilistic forecasts of rare events from deterministic forecasts by fitting a parametric kernel density function to the model’s historical spatial error characteristics. This kernel density function is then applied to individual forecast fields to produce probabilistic forecasts.
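A toy version of the procedure, smearing a deterministic yes/no field into probabilities with a spatial kernel, can be sketched as follows. The isotropic Gaussian kernel and its fixed width are my own assumptions; in the paper the kernel parameters are fit to the model's historical spatial error characteristics, which is omitted here:

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    # normalized 2-D Gaussian truncated at the given radius
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return k / k.sum()

def event_probabilities(binary_forecast, sigma=2.0):
    # convolve the deterministic yes/no field with the kernel to
    # spread each forecast event into a spatial probability footprint
    radius = int(3 * sigma)
    k = gaussian_kernel(sigma, radius)
    padded = np.pad(binary_forecast, radius)
    out = np.zeros_like(binary_forecast, dtype=float)
    rows, cols = binary_forecast.shape
    for i in range(rows):
        for j in range(cols):
            window = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            out[i, j] = np.sum(window * k)
    return out

field = np.zeros((20, 20))
field[10, 10] = 1.0          # model predicts the rare event at one grid point
probs = event_probabilities(field)
```

The single deterministic "yes" becomes a smooth probability bump centered on the forecast point; the calibration in the paper comes from choosing the kernel to match historical displacement errors rather than an arbitrary width.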
2012: Comments on “Tornado Risk Analysis: Is Dixie Alley an Extension of Tornado Alley?”. Bulletin of the American Meteorological Society, 93, 405–407, doi:10.1175/BAMS-D-11-00200.1.
2011: Airborne Instrumentation Needs for Climate and Atmospheric Research (2011). Bulletin of the American Meteorological Society, 92, 1193–1196, doi:10.1175/2011BAMS3180.1.
The Atmospheric Radiation Measurement (ARM) program hosted a workshop to bring together graduate students, postdoctoral fellows, and senior researchers working in atmospheric sciences at U.S. and foreign universities and government laboratories to discuss state-of-the-art techniques and necessary advances to realize effective airborne measurements of atmospheric parameters for climate and weather research.
2011: Mapping Bragg Scatter with a Polarimetric WSR-88D. Journal of Atmospheric and Oceanic Technology, 28, 1273–1285, doi:10.1175/JTECH-D-10-05048.1.
Using a polarimetric Weather Surveillance Radar-1988 Doppler (WSR-88D) radar to distinguish Bragg scatterers from insects and birds in an optically clear atmosphere has the potential to provide information on convective boundary layer depth. Measured median differential reflectivities ZDR of Bragg scatterers lie between −0.08 and 0.06 dB, which supports the hypothesis that the intrinsic ZDR of Bragg scatter is 0 dB. Thus, the intrinsic 0-dB ZDR of Bragg scatter can be used for verifying ZDR radar calibration. Measured copolar correlation coefficients ρhv have distributions peaked at about 0.998–1.0. If insects and birds are spatially separated from Bragg scatterers, the dual-polarization capability of the WSR-88D allows distinguishing echoes from these two types of scatterers, since ZDR from biota is significantly larger than 0 dB. In mixtures of Bragg and biota scatter, polarimetric spectral analysis shows differences in portions of the H and V spectra where birds and insects could be contaminating echoes from Bragg scatterers. Enhancements to data collection and signal processing allow power measurement, with a standard deviation of about 1 dB, of weak echoes from Bragg scatterers having equivalent reflectivity factors of about −28 dBZ at a distance of 10 km from the radar. This level of reflectivity corresponds to a refractive index structure parameter of about 4 × 10^(−15) m^(−2/3), a typical magnitude found in maritime air.
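The quoted correspondence between −28 dBZ and Cn² ≈ 4 × 10^(−15) m^(−2/3) can be roughly checked with the standard Bragg-scatter relation η = 0.38·Cn²·λ^(−1/3) and the definition of equivalent reflectivity factor. The wavelength and dielectric factor below are assumed round numbers, not values taken from the paper:

```python
import math

lam = 0.10          # assumed S-band wavelength, m
Kw2 = 0.93          # |Kw|^2 dielectric factor for liquid water
Cn2 = 4e-15         # refractive-index structure parameter, m^(-2/3)

# volume reflectivity of Bragg scatter (Ottersten-type relation), m^-1
eta = 0.38 * Cn2 * lam ** (-1.0 / 3.0)

# equivalent reflectivity factor: Ze = lam^4 * eta / (pi^5 * |Kw|^2)
Ze_m3 = lam**4 * eta / (math.pi**5 * Kw2)   # in m^3
dBZ = 10.0 * math.log10(Ze_m3 * 1e18)       # convert to mm^6 m^-3, then dBZ
print(f"{dBZ:.1f} dBZ")
```

This yields about −29 dBZ, consistent with the −28 dBZ quoted above given the rounded inputs.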
2012: Resonance Effects within S Band in Echoes from Birds. IEEE Geoscience and Remote Sensing Letters, 9, 3, 413–416, doi:10.1109/LGRS.2011.2169933.
It is shown that the scattering resonance effects in echoes from migrating birds are so strong that a 10% frequency deviation within S band can result in more than 10 dB changes in reflectivity values. Differential reflectivity values from adjacent polarimetric WSR-88D weather radars operating at offset frequencies can differ by several dB in “clear air” echoes.
2012: Comparing Information Content of Upwelling Far-Infrared and Midinfrared Radiance Spectra for Clear Atmosphere Profiling. Journal of Atmospheric and Oceanic Technology, 29, 510–526, doi:10.1175/JTECH-D-11-00113.1.
The information content of high-spectral-resolution midinfrared (MIR; 650–2300 cm^(-1)) and far-infrared (FIR; 200–685 cm^(-1)) upwelling radiance spectra is calculated for clear-sky temperature and water vapor profiles. The wavenumber ranges of the two spectral bands overlap at the central absorption line in the CO2 ν2 absorption band, and each contains one side of the full absorption band. Each spectral band also includes a water vapor absorption band; the MIR contains the first vibrational–rotational absorption band, while the FIR contains the rotational absorption band. The upwelling spectral radiances are simulated with the line-by-line radiative transfer model (LBLRTM), and the retrievals and information content analysis are computed using standard optimal estimation techniques. Perturbations in the surface temperature and in the trace gases methane, ozone, and nitrous oxide (CH4, O3, and N2O) are introduced to represent forward-model errors. Each spectrum is observed by a simulated infrared spectrometer, with a spectral resolution of 0.5 cm^(-1), with realistic spectrally varying sensor noise levels. The modeling and analysis framework is applied identically to each spectral range, allowing a quantitative comparison. The results show that for similar sensor noise levels, the FIR shows an advantage in water vapor profile information content and less sensitivity to forward-model errors. With a higher noise level in the FIR, which is a closer match to current FIR detector technology, the FIR information content drops and shows a disadvantage relative to the MIR.
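The "standard optimal estimation techniques" mentioned above reduce to linear algebra on a Jacobian and two covariance matrices. The sketch below computes the degrees of freedom for signal (DFS) for a random toy Jacobian and toy covariances of my own choosing, not the LBLRTM-derived quantities used in the study:

```python
import numpy as np

rng = np.random.default_rng(1)
n_state, n_chan = 10, 40                 # toy sizes: state vector, channels
K = rng.normal(size=(n_chan, n_state))   # hypothetical Jacobian dR/dx
Sa = np.eye(n_state)                     # prior (a priori) covariance
Se = 0.01 * np.eye(n_chan)               # sensor noise covariance

# gain matrix G and averaging kernel A from standard optimal estimation:
#   G = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1,  A = G K
Se_inv = np.linalg.inv(Se)
G = np.linalg.solve(K.T @ Se_inv @ K + np.linalg.inv(Sa), K.T @ Se_inv)
A = G @ K

# trace of A = degrees of freedom for signal: how many independent
# pieces of the state the measurement actually constrains
dfs = np.trace(A)
```

Raising the noise in Se lowers the DFS, which mirrors the qualitative effect the paper reports when FIR detector noise is increased to realistic levels.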
2011: A climatology of nocturnal warming events associated with cold-frontal passages in Oklahoma. Journal of Applied Meteorology and Climatology, 50, 2042–2061, doi:10.1175/JAMC-D-11-020.1.
A sudden increase in temperature during the nighttime hours accompanies the passages of some cold fronts. In some cold front–associated warming events, the temperature can rise by as much as 10°C, and the warming can last from a few minutes to several hours. Previous studies suggest that these events are due to the downward transport of warmer air by the strong and gusty winds associated with the cold-frontal passages. In this study, a climatology of nocturnal warming events associated with cold fronts was created using 6 yr of Oklahoma Mesonetwork (Mesonet) data from 2003 to 2008. Nocturnal warming events associated with cold-frontal passages occurred surprisingly frequently across Oklahoma. Of the cold fronts observed in this study, 91.5% produced at least one warming event at an Oklahoma Mesonet station. The winter months accounted for the most events (37.9%), and the summer months accounted for the fewest (3.8%). When normalized by the monthly number of cold-frontal passages, the winter months still had the greatest number of warming events. The number of warming events increased rapidly from 2300 to 0200 UTC; thereafter, the number of events gradually decreased. A spatial analysis revealed that the frequency of warming events decreased markedly from west to east across the state. In contrast, the average magnitude of the warming increased from west to east. Compared with control periods (associated with cold-frontal passages without nocturnal warming events), warming events were associated with weaker initial winds and stronger initial temperature inversions. Moreover, the nocturnal temperature inversion weakened more during warming events than during control periods, and the surface wind speeds increased more during warming events than during control periods. These results are consistent with previous studies that suggest the warming events are due to the “mixing out” of the nocturnal temperature inversion.
2012: Impact of a Vertical Vorticity Constraint in Variational Dual-Doppler Wind Analysis: Tests with Real and Simulated Supercell Data. Journal of Atmospheric and Oceanic Technology, 29, 32–49.
One of the greatest challenges to dual-Doppler retrieval of the vertical wind is the lack of low-level divergence information available to the mass conservation constraint. This study examines the impact of a vertical vorticity equation constraint on vertical velocity retrievals when radar observations are lacking near the ground. The analysis proceeds in a three-dimensional variational data assimilation (3DVAR) framework with the anelastic form of the vertical vorticity equation imposed along with traditional data, mass conservation, and smoothness constraints. The technique is tested using emulated radial wind observations of a supercell storm simulated by the Advanced Regional Prediction System (ARPS), as well as real dual-Doppler observations of a supercell storm that occurred in Oklahoma on 8 May 2003. Special attention is given to procedures to evaluate the vorticity tendency term, including spatially variable advection correction and estimation of the intrinsic evolution. Volume scan times ranging from 5 min, typical of operational radar networks, down to 30 s, achievable by rapid-scan mobile radars, are considered. The vorticity constraint substantially improves the vertical velocity retrievals in our experiments, particularly for volume scan times smaller than 2 min.
2012: Assessing Errors in Variational Dual-Doppler Wind Syntheses of Supercell Thunderstorms Observed by Storm-Scale Mobile Radars. Journal of Atmospheric and Oceanic Technology, 29, 1009–1025, doi:10.1175/JTECH-D-11-00177.1.
Dual-Doppler wind retrieval is an invaluable tool in the study of convective storms. However, the nature of the errors in the retrieved three-dimensional wind estimates and subsequent dynamical analyses is not precisely known, making it difficult to assign confidence to inferred storm behavior. Using an Observing System Simulation Experiment (OSSE) framework, this study characterizes these errors for a supercell thunderstorm observed at close range by two Doppler radars. Synthetic radar observations generated from a high-resolution numerical supercell simulation are input to a three-dimensional variational data assimilation (3DVAR) dual-Doppler wind retrieval technique. The sensitivity of the analyzed kinematics and dynamics to the dual-Doppler retrieval settings, hydrometeor fall speed parameterization errors, and radar cross-beam angle and scanning strategy is examined.
Imposing the commonly adopted assumptions of spatially constant storm motion and intrinsically steady flow produces large errors at higher altitudes. On the other hand, reasonably accurate analyses are obtained at lower and middle levels, even when the majority of the storm lies outside the 30° dual-Doppler lobe. Low-level parcel trajectories initiated around the main updraft and rear-flank downdraft are generally qualitatively accurate, as are time series of circulation computed around material circuits. Omitting upper-level radar observations to reduce volume scan times does not substantially degrade the lower- and middle-level analyses, which implies that shallower scanning strategies should enable an improved retrieval of supercell dynamics. The results suggest that inferences about supercell behavior based on qualitative features in 3DVAR dual-Doppler and subsequent dynamical retrievals may generally be reliable.
2012: Tornado Climatology of Finland. Monthly Weather Review, 140, 1446–1456.
A tornado climatology for Finland is constructed from 1796 to 2007. The climatology consists of two datasets. A historical dataset (1796–1996) is largely constructed from newspaper archives and other historical archives and datasets, and a recent dataset (1997–2007) is largely constructed from eyewitness accounts sent to the Finnish Meteorological Institute and news reports. This article describes the process of collecting and evaluating possible tornado reports. Altogether, 298 Finnish tornado cases comprise the climatology: 129 from the historical dataset and 169 from the recent dataset. An annual average of 14 tornado cases occur in Finland (1997–2007). A case with a significant tornado (F2 or stronger) occurs in our database on average every other year, comprising 14% of all tornado cases. All documented tornadoes in Finland have occurred between April and November. As in the neighbouring countries in northern Europe, July and August are the months with the maximum frequency of tornado cases, coincident with the highest lightning occurrence both over land and sea. Waterspouts tend to be favored later in the summer, peaking in August. The peak month for significant tornadoes is August. The diurnal peak for tornado cases is 1700–1859 local time.
2011: Impact of CASA Radar and Oklahoma Mesonet Data Assimilation on the Analysis and Prediction of Tornadic Mesovortices in an MCS. Monthly Weather Review, 139, 3422–3445.
The impact of radar and Oklahoma Mesonet data assimilation on the prediction of mesovortices in a tornadic mesoscale convective system (MCS) is examined. The radar data come from the operational Weather Surveillance Radar-1988 Doppler (WSR-88D) and the Engineering Research Center for Collaborative Adaptive Sensing of the Atmosphere’s (CASA) IP-1 radar network. The Advanced Regional Prediction System (ARPS) model is employed to perform high-resolution predictions of an MCS and the associated cyclonic line-end vortex that spawned several tornadoes in central Oklahoma on 8–9 May 2007, while the ARPS three-dimensional variational data assimilation (3DVAR) system in combination with a complex cloud analysis package is used for the data analysis. A set of data assimilation and prediction experiments is performed on a 400-m resolution grid nested inside a 2-km grid, to examine the impact of radar data on the prediction of meso-γ-scale vortices (mesovortices). An 80-min assimilation window is used in radar data assimilation experiments. An additional set of experiments examines the impact of assimilating 5-min data from the Oklahoma Mesonet in addition to the radar data. Qualitative comparison with observations shows that highly accurate forecasts of mesovortices up to 80 min in advance of their genesis are obtained when the low-level shear in advance of the gust front is effectively analyzed. Accurate analysis of the low-level shear profile relies on assimilating high-resolution low-level wind information. The most accurate analysis (and resulting prediction) is obtained in experiments that assimilate low-level radial velocity data from the CASA radars. Assimilation of 5-min observations from the Oklahoma Mesonet has a substantial positive impact on the analysis and forecast when high-resolution low-level wind observations from CASA are absent; when the low-level CASA wind data are assimilated, the impact of Mesonet data is smaller.
Experiments that do not assimilate low-level wind data from CASA radars are unable to accurately resolve the low-level shear profile and gust front structure, precluding accurate prediction of mesovortex development.
2012: Classification of precipitation types during transitional winter weather using the RUC model and polarimetric radar retrievals. Journal of Applied Meteorology and Climatology, 51, 763–779, doi:10.1175/JAMC-D-11-091.1.
A new hydrometeor classification algorithm that combines thermodynamic output from the Rapid Update Cycle (RUC) model with polarimetric radar observations is introduced. The algorithm improves upon existing classification techniques that rely solely on polarimetric radar observations by using thermodynamic information to help to diagnose microphysical processes (such as melting or refreezing) that might occur aloft. This added information is especially important for transitional weather events for which past studies have shown radar-only techniques to be deficient. The algorithm first uses vertical profiles of wet-bulb temperature derived from the RUC model output to provide a background precipitation classification type. According to a set of empirical rules, polarimetric radar data are then used to refine precipitation-type categories when the observations are found to be inconsistent with the background classification. Using data from the polarimetric KOUN Weather Surveillance Radar-1988 Doppler (WSR-88D) located in Norman, Oklahoma, the algorithm is tested on a transitional winter-storm event that produced a combination of rain, freezing rain, ice pellets, and snow as it passed over central Oklahoma on 30 November 2006. Examples are presented in which the presence of a radar bright band (suggesting an elevated warm layer) is observed immediately above a background classification of dry snow (suggesting the absence of an elevated warm layer in the model output). Overall, the results demonstrate the potential benefits of combining polarimetric radar data with thermodynamic information from numerical models, with model output providing widespread coverage and polarimetric radar data providing an observation-based modification of the derived precipitation type at closer ranges.
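The two-step logic described above (a model-based background type refined by radar observations) can be illustrated with a deliberately simplified sketch. The thresholds, category names, and bright-band override rule below are illustrative assumptions for exposition only, not the published algorithm's actual rules.

```python
# Illustrative sketch (not the operational algorithm): derive a background
# precipitation type from a wet-bulb temperature profile, then let a
# polarimetric bright-band observation override it.

def background_type(wet_bulb_profile_c):
    """Classify from a top-down list of wet-bulb temperatures (deg C).

    Simplified rules: a warm (> 0 C) layer aloft over a subfreezing
    surface implies melting aloft followed by refreezing or supercooling
    near the ground.
    """
    surface_tw = wet_bulb_profile_c[-1]
    warm_layer_aloft = any(t > 0.0 for t in wet_bulb_profile_c[:-1])
    if surface_tw > 0.0:
        return "rain"
    if warm_layer_aloft:
        # Melting aloft over subfreezing air: freezing rain vs. ice pellets;
        # use the depth of the cold layer as a crude discriminator.
        cold_depth = sum(1 for t in wet_bulb_profile_c if t <= 0.0)
        return "ice pellets" if cold_depth >= 3 else "freezing rain"
    return "snow"

def refine_with_radar(bg_type, bright_band_observed):
    """If radar shows a bright band (evidence of an elevated warm layer)
    where the model background says dry snow, reclassify accordingly."""
    if bg_type == "snow" and bright_band_observed:
        return "freezing rain"   # model apparently missed the warm layer aloft
    return bg_type
```

The second function mirrors the paper's example in which an observed bright band contradicts a background classification of dry snow and triggers a radar-based correction.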
2012: Precipitation observations with NSSL’s X-band polarimetric radar during the SNOW-V10 campaign. Pure and Applied Geophysics, doi:10.1007/s00024-012-0569-2.
In support of SNOW-V10, the National Oceanic and Atmospheric Administration/National Severe Storms Laboratory (NOAA/NSSL) mobile dual-polarized X-band (NO-XP) radar was deployed to Birch Bay State Park in Birch Bay, Washington, from 3 January 2010 to 17 March 2010. In addition to being made available in real time for Science and NOWcasting of the Olympic Weather for Vancouver 2010 (SNOW-V10) operations, NO-XP data are used here to demonstrate the capabilities of easily deployable, polarimetric X-band radar systems, especially for regions where mountainous terrain results in partial beam blockage. A rainfall estimator based on specific attenuation is shown to mitigate the effects of partial beam blockage and provide potential improvement in rainfall estimation. The ability of polarimetric X-band radar to accurately detect melting layer (ML) height is also shown. A 16-hour comparison of radar reflectivity (Z), differential reflectivity (ZDR), and correlation coefficient (RhoHV) measurements from NO-XP with vertically pointing Micro Rain Radar observations indicates that the two instruments provide estimates of ML height evolution that exhibit consistent temporal trends. Since even slight changes in the ML height in regions of mountainous terrain might result in a change in precipitation type measured at the surface, this shows that horizontally extensive information on ML height fluctuations, such as that provided by the NO-XP, is useful in determining short-term changes in expected precipitation type. Finally, range-height indicator (RHI) scans of NO-XP Z, ZDR, and RhoHV fields from SNOW-V10 are used to demonstrate the ability of polarimetric radar to diagnose microphysical processes (both above and below the ML) that otherwise remain unseen by conventional radar. Near-surface enhancements in ZDR are attributed to either differential sedimentation or the preferential evaporation of smaller drops.
Immediately above the ML, regions of high Z, low ZDR, and high RhoHV are believed to be associated with convective turrets containing heavily aggregated or rimed snow that supply water/ice mass, which later results in regions of enhanced precipitation near the surface. Higher up, horizontally extensive regions of enhanced ZDR are attributed to rapid dendritic growth and the onset of snow aggregation, a feature that has been widely observed with both S-band and C-band radars.
Online publication; this paper does not have page numbers. Please follow the DOI link for the full text of the paper.
2011: Observations of the Surface Boundary Structure within the 23 May 2007 Perryton, Texas, Supercell. Monthly Weather Review, 139, 3730–3749, doi:10.1175/MWR-D-10-05078.1.
In situ data collected within a weakly tornadic, high-precipitation supercell occurring on 23 May 2007 near Perryton, Texas, are presented. Data were collected using a recently developed fleet of 22 durable, rapidly deployable probes dubbed “StickNet” as well as four mobile mesonet probes. Kinematic and thermodynamic observations of boundaries within the supercell are described in tandem with an analysis of data from the Shared Mobile Atmospheric Research and Teaching Radar.
Observations within the rear-flank downdraft of the storm exhibit large deficits of both virtual potential temperature and equivalent potential temperature, with a secondary rear-flank downdraft gust front trailing the mesocyclone. A primarily thermodynamic boundary resided across the forward-flank reflectivity gradient of the supercell. This boundary is characterized by small deficits in virtual potential temperature coupled with positive perturbations of equivalent potential temperature. The opposing thermodynamic perturbations appear to be representative of modified storm inflow, with a flux of water vapor responsible for the positive perturbations of the equivalent potential temperature. Air parcels exhibiting negative perturbations of virtual potential temperature and positive perturbations of equivalent potential temperature can serve as a source of both baroclinically generated streamwise horizontal vorticity and greater potential buoyancy if ingested by the low-level mesocyclone.
2011: Probabilistic Forecast Guidance for Severe Thunderstorms Based on the Identification of Extreme Phenomena in Convection-Allowing Model Forecasts. Weather and Forecasting, 26, 714–728, doi:10.1175/WAF-D-10-05046.1.
With the advent of convection-allowing NWP models (CAMs) comes the potential for new forms of forecast guidance. While CAMs lack the required resolution to simulate many severe phenomena associated with convection (e.g., large hail, downburst winds, and tornadoes), they can still provide unique guidance for the occurrence of these phenomena if “extreme” patterns of behavior in simulated storms are strongly correlated with observed severe phenomena. This concept is explored using output from a series of CAM forecasts generated on a daily basis during the spring of 2008. This output is mined for the presence of extreme values of updraft helicity (UH), a diagnostic field used to identify supercellular storms. Extreme values of the UH field are flagged as simulated “surrogate” severe weather reports and the spatial correspondence between these surrogate reports and actual observed severe reports is determined. In addition, probabilistic forecasts [surrogate severe probabilistic forecasts (SSPFs)] are created from each field’s simulated surrogate severe reports using a Gaussian smoother. The simulated surrogate reports are capable of reproducing the seasonal climatology observed within the field of actual reports. The SSPFs created from the surrogates are verified using ROC curves and reliability diagrams and the sensitivity of the verification metrics to the smoothing parameter in the Gaussian distribution is tested. The SSPFs produce reliable forecast probabilities with minimal calibration. These results demonstrate that a relatively straightforward postprocessing procedure, which focuses on the characteristics of explicitly predicted convective entities, can provide reliable severe weather forecast guidance. It is anticipated that this technique will be even more valuable when implemented within a convection-allowing ensemble forecasting system.
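The surrogate-report procedure described above can be sketched in a few lines: threshold the UH field to flag surrogate reports, then smooth the binary report field with a normalized Gaussian kernel to obtain probabilities. The threshold, grid size, and smoothing length below are arbitrary illustrative values, not those used in the study.

```python
import numpy as np

def surrogate_probabilities(uh, threshold, sigma, kernel_radius=10):
    """Flag grid points with UH >= threshold as surrogate severe reports,
    then smooth the binary report field with a normalized 2-D Gaussian
    kernel so the result lies in [0, 1]."""
    reports = (uh >= threshold).astype(float)
    # Build a normalized Gaussian kernel.
    ax = np.arange(-kernel_radius, kernel_radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    kernel /= kernel.sum()
    # Direct 2-D correlation with zero padding (kernel is symmetric, so
    # this equals convolution).
    pad = kernel_radius
    padded = np.pad(reports, pad)
    prob = np.zeros_like(reports)
    for i in range(reports.shape[0]):
        for j in range(reports.shape[1]):
            window = padded[i:i + 2 * pad + 1, j:j + 2 * pad + 1]
            prob[i, j] = np.sum(window * kernel)
    return prob
```

Because the kernel is normalized and the report field is binary, each output value is a bona fide probability; widening `sigma` trades sharpness for reliability, which is the sensitivity the paper tests with ROC curves and reliability diagrams.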
2012: Comparison of Ensemble Kalman Filter–Based Forecasts to Traditional Ensemble and Deterministic Forecasts for a Case Study of Banded Snow. Weather and Forecasting, 27, 85–105, doi:10.1175/WAF-D-11-00030.1.
The ensemble Kalman filter (EnKF) technique is compared to other modeling approaches for a case study of banded snow. The forecasts include a 12- and 3-km grid-spaced deterministic forecast (D12 and D3), a 12-km 30-member ensemble (E12), and a 12-km 30-member ensemble with EnKF-based four-dimensional data assimilation (EKF12). In D12 and D3, flow patterns are not ideal for banded snow, but they have similar precipitation accumulations in the correct location. The increased resolution did not improve the quantitative precipitation forecast. The E12 ensemble mean has a flow pattern favorable for banding and precipitation in the approximately correct location, although the magnitudes and probabilities of relevant features are quite low. Six members produced good forecasts of the flow patterns and the precipitation structure. The EKF12 ensemble mean has an ideal flow pattern for banded snow and the mean produces banded precipitation, but relevant features are about 100 km too far north. The EKF12 has a much lower spread than does E12, a consequence of their different initial conditions. Comparison of the initial ensemble means shows that EKF12 has a closed surface low and a region of high low- to midlevel humidity that are not present in E12. These features act in concert to produce a stronger ensemble-mean cyclonic system with heavier precipitation at the time of banding.
2012: The Ewiem Nimdie Summer School Series in Ghana. Bulletin of the American Meteorological Society, 93, 595–601, doi:10.1175/BAMS-D-11-00098.1.
Capacity Building in Meteorological Education and Research - Lessons Learned and Future Prospects.
2012: The Impact of Signal Processing on the Range-Weighting Function for Weather Radars. Journal of Atmospheric and Oceanic Technology, 29, 796–806, doi:10.1175/JTECH-D-11-00135.1.
The range-weighting function (RWF) determines how individual scatterer contributions are weighted as a function of range to produce the meteorological data associated with a single resolution volume. The RWF is commonly defined in terms of the transmitter pulse envelope and the receiver filter impulse response, and it determines the radar range resolution. However, the effective RWF also depends on the range-time processing involved in producing estimates of meteorological variables. This is a third contributor to the RWF that has become more significant in recent years as advanced range-time processing techniques have become feasible for real-time implementation on modern radar systems. In this work, a new formulation of the RWF for weather radars that incorporates the impact of signal processing is proposed. Following the derivation based on a general signal processing model, typical scenarios are used to illustrate the variety of RWFs that can result from different range-time signal processing techniques. Finally, the RWF is used to measure range resolution and the range correlation of meteorological data.
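The three contributions to the effective RWF identified above (transmitter pulse envelope, receiver filter, and range-time processing) can be illustrated under the simplifying assumption that each stage acts as a linear convolution. The rectangular pulse, Gaussian receiver response, and three-gate averaging below are illustrative choices, not a specific radar design.

```python
import numpy as np

def effective_rwf(pulse, receiver_ir, range_weights):
    """Compose transmitter pulse envelope, receiver-filter impulse
    response, and range-time processing weights by successive
    convolution, then normalize the result to unit area."""
    rwf = np.convolve(pulse, receiver_ir)
    rwf = np.convolve(rwf, range_weights)
    return rwf / rwf.sum()

# Illustrative stages, in units of range-time samples (assumed shapes):
pulse = np.ones(8)                           # rectangular pulse envelope
t = np.arange(-6, 7)
receiver_ir = np.exp(-0.5 * (t / 2.0) ** 2)  # Gaussian receiver response
no_processing = np.array([1.0])              # estimator uses a single gate
averaging = np.ones(3) / 3.0                 # 3-gate range averaging

rwf_plain = effective_rwf(pulse, receiver_ir, no_processing)
rwf_avg = effective_rwf(pulse, receiver_ir, averaging)
```

Comparing the second moments of `rwf_plain` and `rwf_avg` shows how range-time averaging broadens the effective RWF and coarsens range resolution, which is the kind of effect the formulation in the paper is designed to quantify.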
2012: Impact of modifying the longwave water vapor continuum absorption model on community Earth system model simulations. Journal of Geophysical Research, 117, D04106, 1–11, doi:10.1029/2011JD016440.
The far-infrared (wavelengths longer than 17 µm) has been shown to be extremely important for radiative processes in the earth’s atmosphere. The strength of the water vapor continuum absorption in this spectral region has largely been predicted using observations at other wavelengths that have been extrapolated using semiempirical approaches such as the Clough-Kneizys-Davies (CKD) family of models. Recent field experiments using new far-infrared instrumentation have supported a factor of 2 decrease in the modeled strength of the foreign continuum at 50 µm and a factor of 1.5 increase in the self-continuum at 24 µm in the Clough-Kneizys-Davies continuum model (CKD v2.4); these changes are incorporated in the Mlawer-Tobin-CKD continuum model (MT_CKD v2.4). The water vapor continuum in the Community Earth System Model (CESM v1.0) was modified to use the newer model, and the impacts of this change were investigated by comparing output from the original and modified CESM for 20-year integrations with prescribed sea surface temperatures. The change results in an increase in the net upward longwave flux of order 0.5 W m^(-2) between 300 and 400 mb, and a decrease in this flux of about the same magnitude for altitudes below 600 mb. The radiative impact results in a small but statistically significant change in the mean temperature and humidity fields, and also a slight decrease (order 0.5%) of high-cloud amount. The change in the cloud amount modified the longwave cloud radiative forcing, which partially offset the radiative heating caused by the change in the water vapor continuum absorption model.
2012: Ground-based high spectral resolution observations of the entire terrestrial spectrum under extremely dry conditions. Geophysical Research Letters, 39, L10801, 1–5, doi:10.1029/2012GL051542.
A field experiment was conducted in northern Chile at an altitude of 5.3 km to evaluate the accuracy of line-by-line radiative transfer models in regions of the spectrum that are typically opaque at sea level due to strong water vapor absorption. A suite of spectrally resolved radiance instruments collected simultaneous observations that, for the first time ever, spanned the entire terrestrial thermal spectrum (i.e., from 10 to 3000 cm^(-1), or 1000 to 3.3 microns). These radiance observations, together with collocated water vapor and temperature profiles, are used to provide an initial evaluation of the accuracy of water vapor absorption in the far-infrared of two line-by-line radiative transfer models. These initial results suggest that the more recent of the two models is more accurate in the strongly absorbing water vapor pure rotation band. This result supports the validity of the Turner et al. (2012) study that demonstrated that the use of the more recent water vapor absorption model in climate simulations resulted in significant radiative and dynamical changes in the simulation relative to the older water vapor model.
2011: The CI-FLOW Project: A System for Total Water Level Prediction from the Summit to the Sea. Bulletin of the American Meteorological Society, 92, 1427–1442, doi:10.1175/2011BAMS3150.1.
The objective of the Coastal and Inland Flooding Observation and Warning (CI-FLOW) project is to prototype new hydrometeorologic techniques to address a critical NOAA service gap: routine total water level predictions for tidally influenced watersheds. Since February 2000, the project has focused on developing a coupled modeling system to accurately account for water at all locations in a coastal watershed by exchanging data between atmospheric, hydrologic, and hydrodynamic models. These simulations account for the quantity of water associated with waves, tides, storm surge, rivers, and rainfall, including interactions at the tidal/surge interface.
Within this project, CI-FLOW addresses the following goals: i) apply advanced weather and oceanographic monitoring and prediction techniques to the coastal environment; ii) prototype an automated hydrometeorologic data collection and prediction system; iii) facilitate interdisciplinary and multiorganizational collaborations; and iv) enhance techniques and technologies that improve actionable hydrologic/hydrodynamic information to reduce the impacts of coastal flooding. Results are presented for Hurricane Isabel (2003), Hurricane Earl (2010), and Tropical Storm Nicole (2010) for the Tar–Pamlico and Neuse River basins of North Carolina. This area was chosen, in part, because of the tremendous damage inflicted by Hurricanes Dennis and Floyd (1999). The vision is to transition CI-FLOW research findings and technologies to other U.S. coastal watersheds.
2011: Ensemble prediction of Mediterranean high-impact events using potential vorticity perturbations. Part I: Comparison against the multiphysics approach. Atmospheric Research, 102, 227–241, doi:10.1016/j.atmosres.2011.07.01.
The western Mediterranean is a very cyclogenetic area and many of the cyclones developed over this region are associated with high-impact weather phenomena that affect the society of the coastal countries. Two ensemble prediction systems (EPSs) based on multiphysics and perturbed initial and boundary conditions (IBC) are designed in order to improve the forecast of these heavy rain episodes. The MM5 mesoscale model nested in the ECMWF forecast fields provides the simulations, run at 22.5 km resolution for a two-day period.
The multiphysics ensemble combines different model physical parameterization schemes while the other ensemble perturbs the initial state and boundary forcing of the model with the aid of a PV inversion scheme. A PV error climatology derived from the large-scale fields allows the ECMWF PV fields to be perturbed within the appropriate error range.
The verification procedure indicates that even though both EPSs are skillful, the perturbed IBC ensemble is more skillful than the multiphysics EPS for the 19 Mediterranean cyclonic events with heavy rain considered in the study. Therefore the results show a more dominant role of the uncertainties in the initial and boundary conditions than of the model error, although both contribute significantly to improving the predictability of western Mediterranean high-impact weather situations.
2012: RACORO Extended-term Aircraft Observations of Boundary Layer Clouds. Bulletin of the American Meteorological Society, 93, 861–878, doi:10.1175/BAMS-D-11-00189.1.
A first-of-its-kind, extended-term cloud aircraft campaign was conducted to obtain an in situ statistical characterization of continental boundary layer clouds needed to investigate cloud processes and refine retrieval algorithms. Coordinated by the Atmospheric Radiation Measurement (ARM) Aerial Facility (AAF), the Routine AAF Clouds with Low Optical Water Depths (CLOWD) Optical Radiative Observations (RACORO) field campaign operated over the ARM Southern Great Plains (SGP) site from 22 January to 30 June 2009, collecting 260 h of data during 59 research flights. A comprehensive payload aboard the Center for Interdisciplinary Remotely-Piloted Aircraft Studies (CIRPAS) Twin Otter aircraft measured cloud microphysics, solar and thermal radiation, physical aerosol properties, and atmospheric state parameters. Proximity to the SGP's extensive complement of surface measurements provides ancillary data that support modeling studies and facilitates evaluation of a variety of surface retrieval algorithms. The five-month duration enabled sampling a range of conditions associated with the seasonal transition from winter to summer. Although about two-thirds of the flights during which clouds were sampled occurred in May and June, boundary layer cloud fields were sampled under a variety of environmental and aerosol conditions, with about 77% of the cloud flights occurring in cumulus and stratocumulus. Preliminary analyses illustrate use of these data to analyze aerosol-cloud relationships, characterize the horizontal variability of cloud radiative impacts, and evaluate surface-based retrievals. We discuss how an extended-term campaign requires a simplified operating paradigm that is different from that used for typical, short-term, intensive aircraft field programs.
2012: Lightning in the Anvils of Supercell Thunderstorms. Monthly Weather Review, 140, 2064–2079, doi:10.1175/MWR-D-11-00312.1.
This study uses data from the Oklahoma Lightning Mapping Array (OK-LMA), the National Lightning Detection Network, and the Norman, Oklahoma (KOUN), prototype Weather Surveillance Radar-1988 Doppler (WSR-88D) radar to examine the evolution and structure of lightning in the anvils of supercell storms as they relate to storm dynamics and microphysics. Several supercell storms within the domain of the OK-LMA were examined to determine whether they had lightning in the anvil region, and if so, the time and location of the initiation of the anvil flashes were determined. Every warm-season supercell storm had some flashes that were initiated in or near the stronger reflectivities of the parent storm and propagated 40–70 km downstream to penetrate well into the anvil. Some supercell storms also had flashes that were initiated within the anvil itself, 40–100 km beyond the closest 30-dBZ contour of the storm. These flashes were typically initiated in one of three locations: 1) coincident with a local reflectivity maximum, 2) between the uppermost storm charge and a screening-layer charge of opposite polarity near the cloud boundary, or 3) in a region in which the anvils from two adjoining storms intersected. In some storms, anvil flashes struck ground beneath a reflectivity maximum in which reflectivity ≥20 dBZ had extended below the 0°C isotherm, possibly leading to the formation of embedded convection. This relationship may be useful for identifying regions in which there is a heightened risk for cloud-to-ground strikes beneath anvil clouds. In one storm, however, anvil lightning struck ground even though this reflectivity signature was absent.
2012: Application of a WRF Mesoscale Data Assimilation System to Springtime Severe Weather Events 2007-09. Monthly Weather Review, 140, 1539–1557, doi:10.1175/MWR-D-11-00106.1.
An ensemble-based data assimilation system using the Weather Research and Forecasting (WRF) model has been used to initialize forecasts of prolific severe weather events from the springs of 2007–2009. These experiments build on previous work that has shown the ability of ensemble Kalman filter (EnKF) data assimilation to produce realistic mesoscale features, such as drylines and convectively driven cold pools, which often play an important role in future convective development. For each event in this study, severe weather parameters are calculated from an experimental ensemble forecast started from EnKF analyses, and then compared to a control ensemble forecast in which no ensemble-based data assimilation is performed. Root mean square errors for surface observations averaged across all events are generally smaller for the experimental ensemble over the 0-6 h forecast period. At model grid points nearest tornado reports, the ensemble-mean significant tornado parameter (STP) and the probability of STP > 1 are often greater in the experimental 0-6 h ensemble forecasts than in the control forecasts. Likewise, the MCS maintenance probability (MMP) is often greater with the experimental ensemble at model grid points nearest wind reports. Severe weather forecasts can be sharpened by coupling the respective severe weather parameter with the probability of measurable rainfall at model grid points. The differences between the two ensembles are found to be significant at the 95% level, suggesting that even a short period of ensemble data assimilation can yield improved forecast guidance for severe weather events.
2011: Simulated Tornadic Vortex Signatures of Tornado-Like Vortices Having One- and Two-Celled Structures. Journal of Applied Meteorology and Climatology, 50, 2338–2342, doi:10.1175/JAMC-D-11-0118.1.
A tornadic vortex signature (TVS) is a degraded Doppler velocity signature that occurs when the tangential velocity core region of a tornado is smaller than the effective beamwidth of a sampling Doppler radar. Early Doppler radar simulations, which used a uniform reflectivity distribution across an idealized Rankine vortex, showed that the extreme Doppler velocity peaks of a TVS profile are separated by approximately one beamwidth. The simulations also indicated that neither the size nor the strength of the tornado is recoverable from a TVS. The current study was undertaken to investigate how the TVS might change if vortices having more realistic tangential velocity profiles were considered. The one-celled (axial updraft only) Burgers–Rott vortex model and the two-celled (annular updraft with axial downdraft) Sullivan vortex model were selected. Results of the simulations show that the TVS peaks still are separated by approximately one beamwidth—signifying that the TVS not only is unaffected by the size or strength of a tornado but also is unaffected by whether the tornado structure consists of one or two cells.
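A minimal numerical sketch of the beamwidth-limited sampling described above, assuming a one-celled Rankine-like vortex viewed along a transect through its core and a Gaussian effective beam pattern (both simplifications relative to the paper's simulations):

```python
import numpy as np

def rankine_los(x, vmax=75.0, rcore=150.0):
    """Doppler (line-of-sight) component of the vortex tangential wind
    along a transect through the core: an odd function of the lateral
    distance x (m) from the vortex center."""
    r = np.abs(x)
    outer = vmax * rcore / np.maximum(r, 1e-9)   # potential-flow tail ~ 1/r
    v = np.where(r <= rcore, vmax * r / rcore, outer)
    return np.sign(x) * v

def simulated_tvs(beam_fwhm, half_width=20000.0, dx=10.0):
    """Smooth the vortex Doppler profile with a Gaussian effective beam of
    the given half-power full width (m); return beam-center positions and
    the beam-averaged Doppler velocity."""
    x = np.arange(-half_width, half_width + dx, dx)
    sigma = beam_fwhm / 2.355                    # sigma from half-power width
    half = int(5 * sigma / dx)
    k = np.arange(-half, half + 1) * dx
    kernel = np.exp(-0.5 * (k / sigma) ** 2)
    kernel /= kernel.sum()
    return x, np.convolve(rankine_los(x), kernel, mode="same")
```

With the 150-m core radius much smaller than a 2-km half-power beamwidth, the extreme Doppler velocities of the smoothed profile fall roughly one beamwidth apart and their magnitudes are far below the true 75 m/s core speed, consistent with the degraded signature the abstract describes.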
2012: Verification of the Origins of Rotation in Tornadoes Experiment 2: VORTEX2. Bulletin of the American Meteorological Society, 93, 1147–1170.
The second Verification of the Origins of Rotation in Tornadoes Experiment, VORTEX2, the field phases of which occurred in May and June 2009 and 2010, was designed to explore (a) the physical processes of tornadogenesis, maintenance, and demise, (b) the relationships among tornadoes, tornadic storms, and the larger-scale environment, (c) numerical weather prediction and forecasting of supercell thunderstorms and tornadoes, and (d) the wind field near the ground in tornadoes. VORTEX2 is by far the largest and most ambitious observational and modeling study of tornadoes and tornadic storms ever undertaken. It employed thirteen mobile mesonet instrumented vehicles, eleven ground-based mobile radars (several of which had dual-polarization capability and two of which were phased-array rapid-scan), a mobile Doppler lidar, four mobile balloon sounding systems, forty-two deployable in situ observational weather stations, an unmanned aerial system, video and photogrammetric teams, damage survey teams, deployable disdrometers, and other experimental instrumentation as well as extensive modeling studies of tornadic storms. Participants were drawn from more than 15 universities and laboratories, and at least five nations, with over 80 students participating in field activities. The VORTEX2 field phases spanned two years in order to increase the probability of intercepting significant tornadoes, which are rare events. The field phase of VORTEX2 collected data in over three dozen tornadic and nontornadic supercell thunderstorms with unprecedented detail and diversity of measurements. Some preliminary data and analyses from the ongoing analysis phase of VORTEX2 are shown.
2011: Completeness of Normal Modes for Symmetric Perturbations in Vertically Bounded Domain. J. Met. Soc. Japan, 89, 389–397.
Perturbations generated by symmetric instability can be characterized, in terms of growing normal modes, by slantwise vertical motion bands similar to those observed in frontal rainbands. Nonmodal growths of symmetric perturbations, characterized also by slantwise vertical motion bands, can be produced by linear combinations of normal modes even before the basic state becomes symmetrically unstable to generate growing modes. In this paper, normal modes for nonhydrostatic symmetric perturbations in a vertically bounded domain are revisited and constructed by free modes obtained in unbounded domain. The constructed modes form a complete set in the full-solution space and thus can construct any admissible solutions to further explore nonmodal growths of symmetric perturbations in the vertically bounded domain beyond previous studies.
2011: Computing streamfunction and velocity potential in a limited domain. Part I: Theory and integral formulae. Adv. Atmos. Sci, 28, 1433–1444, doi:10.1007/s00376-011-0185-6.
The non-uniqueness of solution and compatibility between the coupled boundary conditions in computing velocity potential and streamfunction from horizontal velocity in a limited domain of arbitrary shape are revisited theoretically with rigorous mathematical treatments. Classic integral formulas and their variants are used to formulate solutions for the coupled problems. In the absence of data holes, the total solution is the sum of two integral solutions. One is the internally induced solution produced purely and uniquely by the domain internal divergence and vorticity, and its two components (velocity potential and streamfunction) can be constructed by applying the Green's function for the Poisson equation in an unbounded domain to the divergence and vorticity inside the domain. The other is the externally induced solution produced purely but non-uniquely by the domain external divergence and vorticity, and the non-uniqueness is caused by the harmonic nature of the solution and the unknown divergence and vorticity distributions outside the domain. By setting either the velocity potential (or streamfunction) component to zero, the other component of the externally induced solution can be expressed by the imaginary (or real) part of the Cauchy integral constructed using the coupled boundary conditions and solvability conditions that exclude the internally induced solution. The streamfunction (or velocity potential) for the externally induced solution can also be expressed by the boundary integral of a double-layer (or single-layer) density function. In the presence of data holes, the total solution includes a data-hole-induced solution in addition to the above internally and externally induced solutions.
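As a hedged illustration of the internally induced solution described above (notation assumed here: ζ the vorticity, δ the divergence, Ω the limited domain), the standard free-space Green's-function construction reads:

```latex
% Internally induced components from the 2-D free-space Green's function
% G(\mathbf{x},\mathbf{x}') = \tfrac{1}{2\pi}\ln\lvert\mathbf{x}-\mathbf{x}'\rvert:
\psi_{\mathrm{in}}(\mathbf{x})
  = \frac{1}{2\pi}\iint_{\Omega} \ln\lvert\mathbf{x}-\mathbf{x}'\rvert\,
    \zeta(\mathbf{x}')\,\mathrm{d}A',
\qquad
\chi_{\mathrm{in}}(\mathbf{x})
  = \frac{1}{2\pi}\iint_{\Omega} \ln\lvert\mathbf{x}-\mathbf{x}'\rvert\,
    \delta(\mathbf{x}')\,\mathrm{d}A',
% with the horizontal wind recovered from both components:
\mathbf{v} = \mathbf{k}\times\nabla\psi + \nabla\chi .
```

These satisfy ∇²ψ = ζ and ∇²χ = δ inside Ω; the externally induced part, being harmonic inside the domain, is the piece whose non-uniqueness the paper analyzes.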
2011: Measuring information content from observations for data assimilation: Utilities of spectral formulations for radar data compression. Tellus, 63A, 1014–1027, doi:10.1111/j.1600-0870.2011.00542.x.
Utilities of the spectral formulations for measuring information content from observations are explored and demonstrated with real radar data. It is shown that the spectral formulations can be used (i) to precisely compute the information content from one-dimensional radar data uniformly distributed along the radar beam, (ii) to approximately estimate the information content from two-dimensional radar observations non-uniformly distributed on the conical surface of a radar scan, and thus (iii) to estimate the information losses caused by super-observations generated by local averaging at a series of successively coarsened resolutions, in order to find an optimally coarsened resolution for radar data compression with zero or near-zero loss of information. The results obtained from the spectral formulations are verified against results computed accurately, but at much greater cost, from the singular-value formulations. Because the background and observation error power spectra can be derived analytically for the above utilities, the spectral formulations are computationally far more efficient and affordable than the singular-value formulations, especially when the background and observation spaces are both so large that the singular-value formulations become intractable.
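A minimal sketch of why the two formulations agree for uniformly spaced 1-D data: when the background-error covariance is circulant (periodic, uniform spacing) and observation errors are uncorrelated, the eigenvalues needed by the singular-value formulation are simply the FFT of the covariance's first row, so the information content 0.5·Σ ln(1 + λ_k) can be computed spectrally. The correlation length and error variances below are toy values, not those of the paper.

```python
import numpy as np

n = 64
L, sig_b, sig_o = 5.0, 1.0, 0.5            # toy length scale and error std devs

# Circulant background-error covariance from a Gaussian correlation function
# on a periodic 1-D grid; observation errors are uncorrelated (variance sig_o**2).
d = np.minimum(np.arange(n), n - np.arange(n))     # periodic distance
row = sig_b**2 * np.exp(-0.5 * (d / L) ** 2)
B = np.array([np.roll(row, k) for k in range(n)])

# Singular-value (eigenvalue) formulation: accurate but costly for large n.
lam = np.linalg.eigvalsh(B) / sig_o**2
H_sv = 0.5 * np.sum(np.log1p(np.maximum(lam, 0.0)))

# Spectral formulation: eigenvalues of a circulant matrix are the FFT of its first row.
power = np.real(np.fft.fft(row))
H_sp = 0.5 * np.sum(np.log1p(np.maximum(power, 0.0) / sig_o**2))

print(H_sv, H_sp)
```

The spectral route avoids the O(n³) eigendecomposition entirely, which is the efficiency advantage the abstract describes.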
2012: Semibalance Model in Terrain-Following Coordinates. Journal of the Atmospheric Sciences, 69, 2201–2206, doi:10.1175/JAS-D-12-012.1.
By partitioning the hydrostatically balanced flow into a nonlinearly balanced primary-flow part and a remaining secondary-flow part and then truncating the secondary-flow vorticity advection and stretching–tilting terms in the vector vorticity equation, the previous semibalance model (SBM) in pseudoheight coordinates is rederived in terrain-following pressure coordinates, called η coordinates. The involved truncation is topologically the same as that in pseudoheight coordinates but the truncated terms in η coordinates are not equivalent to those in pseudoheight coordinates. Because its potential vorticity (PV) is conserved and invertible, the rederived SBM is suitable for studying balanced dynamics via “PV thinking” in real weather events, such as slowly varying vortices and curved fronts in which the primary-flow velocity and secondary-flow vorticity are nearly parallel in η coordinates.
2011: Phased Array Weather / Multipurpose Radar. IEEE Aerospace and Electronic Systems Magazine, 26, 12–15, doi:10.1109/MAES.2011.6065653.
The first phased array radar dedicated to weather observation and analysis is now instrumented with eight simultaneous digital receivers. These additional channels will enable the use of advanced signal processing to improve the signal-to-clutter ratio in an adaptive mode. Elimination of strong point and ground clutter returns from the low-level, volumetrically distributed weather cell returns is a new application of adaptive processing. The NSF-funded 8-channel receiver has been added to the National Weather Radar Testbed (NWRT) system in Norman, Oklahoma, to enable operation as a multi-function and/or adaptive processing system. Herein, we describe the system concept, its installation, and early results from fielded weather data returns.
2012: Multichannel Receiver Design, Instrumentation, and First Results at the National Weather Radar Testbed. IEEE Transactions on Instrumentation and Measurement, 61, 2022–2033, doi:10.1109/TIM.2011.2178671.
When the National Weather Radar Testbed (NWRT) was installed in 2004, a single-channel digital receiver was implemented so that the radar could mimic typical Weather Surveillance Radar (WSR) version 1988 Doppler (WSR-88D) capability. This, however, left eight other channels, built into the antenna, unused. This paper describes the hardware instrumentation of a recently completed project that digitizes the radar signals produced by these channels. The NWRT is the nation’s first phased array devoted to weather observations, and this testbed serves as an evaluation platform to test new hardware and signal processing concepts. The multichannel digital data will foster a new generation of adaptive/fast scanning techniques and space-antenna/interferometry measurements, which will then be used for improved weather forecasting via data assimilation. The multichannel receiver collects signals from the sum, azimuth-difference, elevation-difference, and five broad-beamed auxiliary channels. One of the major advantages of the NWRT is the capability to adaptively scan weather phenomena at a higher temporal resolution than is possible with the WSR-88D. Access to the auxiliary channels will enable clutter mitigation and advanced array processing for higher data quality with shorter dwell times. Potential benefits of higher quality and higher resolution data include: better understanding of storm dynamics and convective initiation; better detection of small-scale phenomena, including tornadoes and microbursts; and crossbeam wind, shear, and turbulence estimates. These items have the distinct possibility to ultimately yield increased lead time for warnings and improved weather prediction. Finally, samples of recently collected data are presented in the results section of this paper.
2011: A novel multiple flow direction algorithm for computing the topographic wetness index. Hydrology Research, 43, 135–145, doi:10.2166/nh.2011.115.
The topographic wetness index (TWI), frequently used in approximately characterizing the spatial distribution of soil moisture and surface saturation within a watershed, has been widely applied in topography-related geographical processes and hydrological models. However, it is still questionable whether the current algorithms of TWI can adequately model the spatial distribution of topographic characteristics. Based upon the widely-used multiple flow direction approach (MFD), a novel MFD algorithm (NMFD) is proposed for improving the TWI derivation using a Digital Elevation Model (DEM) in this study. Compared with MFD, NMFD improves the mathematical equations of the contributing area and more precisely calculates the effective contour length. Additionally, a varying exponent strategy is adopted to dynamically determine the downslope flow-partition exponent. Finally, a flow-direction tracking method is employed to address grid cells in flat terrain. The NMFD algorithm is first applied to a catchment located upstream of the Hanjiang River in China to demonstrate its accuracy and improvements. Then NMFD is quantitatively evaluated by using four types of artificial mathematical surfaces. The results indicate that the error generated by NMFD is generally lower than that computed by MFD, and NMFD is able to more accurately represent the hydrological similarity of watersheds.
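For orientation, the quantity being derived is TWI = ln(a / tan β), with the specific catchment area a accumulated by partitioning each cell's upslope area among downslope neighbors in proportion to slope raised to an exponent. The sketch below is a generic MFD-style accumulation on a tiny DEM, a simplification of (not the actual) NMFD algorithm; the exponent value and flat-cell guard are assumptions.

```python
import numpy as np

def twi(dem, cellsize=1.0, p=1.1):
    """Toy TWI = ln(a / tan(beta)) with slope-weighted MFD area partitioning."""
    ny, nx = dem.shape
    acc = np.full(dem.shape, cellsize**2)        # each cell contributes its own area
    slope = np.zeros_like(dem, dtype=float)
    order = np.argsort(dem.ravel())[::-1]        # process cells from high to low
    nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    for idx in order:
        j, i = divmod(idx, nx)
        targets, drops = [], []
        for dj, di in nbrs:
            jj, ii = j + dj, i + di
            if 0 <= jj < ny and 0 <= ii < nx and dem[jj, ii] < dem[j, i]:
                dist = cellsize * np.hypot(dj, di)
                s = (dem[j, i] - dem[jj, ii]) / dist
                targets.append(((jj, ii), s**p))
                drops.append(s)
        if targets:
            slope[j, i] = max(drops)
            tot = sum(w for _, w in targets)
            for (jj, ii), w in targets:          # partition area by slope**p
                acc[jj, ii] += acc[j, i] * w / tot
    a = acc / cellsize                           # specific catchment area
    return np.log(a / np.maximum(slope, 1e-3))   # guard flat/outlet cells

dem = np.array([[3., 3., 3.],
                [3., 2., 3.],
                [3., 2., 1.]])
print(twi(dem, 1.0))
```

The outlet cell, which receives the whole catchment's area and has no downslope neighbor, ends up with the largest wetness index, matching the intuition that TWI highlights convergent, saturated terrain.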
2012: Assessment of evolving TRMM-based multisatellite real-time precipitation estimation methods and their impacts on hydrologic prediction in a high latitude basin. Journal of Geophysical Research, 117, D09108, 1–21, doi:10.1029/2011JD017069.
The real-time availability of satellite-derived precipitation estimates provides hydrologists an opportunity to improve current hydrologic prediction capability for medium to large river basins. Owing to the availability of new satellite data and upgrades to the precipitation algorithms, the Tropical Rainfall Measuring Mission (TRMM) Multisatellite Precipitation Analysis real-time estimates (TMPA-RT) have undergone several important revisions over the past ten years. In this study, changes in the relative accuracy and hydrologic potential of TMPA-RT estimates over its three major evolving periods were evaluated and intercompared at daily, monthly and seasonal scales in the high-latitude Laohahe basin in China. Assessment results show that the performance of TMPA-RT in terms of precipitation estimation and streamflow simulation was significantly improved after 3 February 2005. Overestimation during winter months was noteworthy and consistent, which is suggested to be a consequence of snow-cover interference with the passive microwave retrievals. Rainfall estimated by the new version 6 of TMPA-RT, in use from 1 October 2008 to the present, has higher correlations with independent gauge observations and tends to perform better in detecting rain compared to the prior periods, although it suffers from larger mean error and relative bias. After a simple bias correction, this latest data set of TMPA-RT exhibited the best capability in capturing hydrologic response among the three tested periods. In summary, this study demonstrated that there is an increasing potential in the use of TMPA-RT in hydrologic streamflow simulations over its three algorithm upgrade periods, but still with significant challenges during winter snow events.
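One common form a "simple bias correction" can take is multiplicative scaling of the satellite field by the ratio of gauge to satellite totals over collocated pairs. The abstract does not specify its method, so the sketch below is a hypothetical illustration with made-up numbers.

```python
def ratio_bias_correction(sat, gauge):
    """Scale satellite rain by the gauge/satellite total ratio (multiplicative
    bias correction); falls back to no correction if the satellite total is zero."""
    s, g = sum(sat), sum(gauge)
    factor = g / s if s > 0 else 1.0
    return [factor * v for v in sat]

sat   = [2.0, 0.0, 5.0, 3.0]   # satellite daily rain (mm), illustrative values
gauge = [1.5, 0.0, 4.0, 2.5]   # collocated gauge rain (mm), illustrative values
print(ratio_bias_correction(sat, gauge))
```

With these numbers the satellite total (10 mm) exceeds the gauge total (8 mm), so every satellite value is scaled down by 0.8, removing the overall bias while preserving the relative day-to-day pattern.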
2012: Comparison of Single-Parameter and Multiparameter Ensembles for Assimilation of Radar Observations Using the Ensemble Kalman Filter. Monthly Weather Review, 140, 562–586, doi:10.1175/MWR-D-10-05074.1.
Observational studies indicate that the densities and intercept parameters of hydrometeor distributions can vary widely among storms and even within a single storm. Therefore, assuming a fixed set of microphysical parameters within a given microphysics scheme can lead to significant errors in the forecasts of storms. To explore the impact of variations in microphysical parameters, Observing System Simulation Experiments are conducted based on both perfect- and imperfect-model assumptions. Two sets of ensembles are designed using either fixed or variable parameters within the same single-moment microphysics scheme. The synthetic radar observations of a splitting supercell thunderstorm are assimilated into the ensembles over a 30-min period using an ensemble Kalman filter data assimilation technique followed by 1-h ensemble forecasts. Results indicate that in the presence of model error, a multiparameter ensemble with a combination of different hydrometeor density and intercept parameters leads to improved analyses and forecasts and better captures the truth within the forecast envelope compared to single-parameter ensemble experiments with single, constant, inaccurate hydrometeor intercept and density parameters. This conclusion holds when examining the general storm structure, the intensity of midlevel rotation, surface cold pool strength, and the extreme values of the model fields that are most helpful in determining and identifying potential hazards. Under a perfect-model assumption, the single- and multiparameter ensembles perform similarly as model error does not play a role in these experiments. This study highlights the potential for using a variety of realistic microphysical parameters across the ensemble members in improving the analyses and very short-range forecasts of severe weather events.
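Mechanically, a "multiparameter ensemble" just means each member draws its own microphysical constants instead of sharing one fixed set. The sketch below shows that sampling step only; the parameter names and ranges are illustrative assumptions, not the values used in the paper.

```python
import random

random.seed(42)  # reproducible illustration

def build_ensemble(n_members):
    """Assign each ensemble member its own hydrometeor intercept and
    density parameters, drawn from plausible (assumed) ranges."""
    members = []
    for _ in range(n_members):
        members.append({
            "n0_rain":  10 ** random.uniform(5.0, 7.0),   # rain intercept (m^-4)
            "n0_hail":  10 ** random.uniform(3.0, 5.0),   # hail intercept (m^-4)
            "rho_hail": random.uniform(400.0, 900.0),     # hail density (kg m^-3)
        })
    return members

ens = build_ensemble(50)
print(len(ens), min(m["rho_hail"] for m in ens), max(m["rho_hail"] for m in ens))
```

A single-parameter ensemble would instead pass one fixed dictionary to every member; the spread in parameters here is what lets the ensemble envelope capture the truth when the scheme's fixed values are wrong.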
2012: Quantitative precipitation nowcasting: A Lagrangian pixel-based approach. Atmospheric Research, 118, 418–434, doi:10.1016/j.atmosres.2012.07.001.
Short-term high-resolution precipitation forecasting has important implications for navigation, flood forecasting, and other hydrological and meteorological concerns. This article introduces a pixel-based algorithm for Short-term Quantitative Precipitation Forecasting (SQPF) using radar-based rainfall data. The proposed algorithm, called Pixel-Based Nowcasting (PBN), tracks severe storms with a hierarchical mesh-tracking algorithm to capture storm advection in space and time at high resolution from radar imagery. The extracted advection field is then extended to nowcast the rainfall field over the next 3 h based on a pixel-based Lagrangian dynamic model. The proposed algorithm is compared with two other nowcasting algorithms for ten thunderstorm events over the conterminous United States. Object-based verification metrics and traditional statistics have been used to evaluate the performance of the proposed algorithm. It is shown that the proposed algorithm is superior to the other algorithms and is effective in tracking and predicting severe storm events for the next few hours.
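The core Lagrangian idea can be sketched in a few lines: estimate a displacement from two consecutive radar frames, then repeatedly shift the latest field along that displacement to extrapolate forward. This toy stand-in uses a single uniform advection vector and brute-force correlation, far simpler than PBN's hierarchical mesh tracker; the search window and grid are assumptions.

```python
import numpy as np

def estimate_shift(prev, curr, window=3):
    """Best integer (dy, dx) displacement by brute-force cross-correlation."""
    best, best_score = (0, 0), -np.inf
    for dy in range(-window, window + 1):
        for dx in range(-window, window + 1):
            score = np.sum(np.roll(prev, (dy, dx), axis=(0, 1)) * curr)
            if score > best_score:
                best, best_score = (dy, dx), score
    return best

def nowcast(curr, shift, steps):
    """Advect the current rain field forward 'steps' time steps along 'shift'."""
    out = curr.copy()
    for _ in range(steps):
        out = np.roll(out, shift, axis=(0, 1))
    return out

prev = np.zeros((10, 10)); prev[4, 4] = 1.0      # a single rain "cell"
curr = np.roll(prev, (1, 1), axis=(0, 1))        # cell moved one pixel per step
shift = estimate_shift(prev, curr)
fcst = nowcast(curr, shift, steps=2)             # two steps ahead
print(shift, np.argwhere(fcst == 1.0))
```

A real nowcaster would use a spatially varying advection field and sub-pixel interpolation, but the extrapolation loop is the same Lagrangian step applied per pixel.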
2011: National Mosaic and Multi-sensor QPE (NMQ) System: Description, Results, and Future Plans. Bulletin of the American Meteorological Society, 92, 1321–1338.
The National Mosaic and Multi-sensor QPE (Quantitative Precipitation Estimation), or “NMQ”, system was initially developed from a joint initiative between the National Oceanic and Atmospheric Administration's National Severe Storms Laboratory, the Federal Aviation Administration's Aviation Weather Research Program, and the Salt River Project. Further development has continued with additional support from the National Weather Service (NWS) Office of Hydrologic Development, the NWS Office of Climate, Water, and Weather Services, and the Central Weather Bureau of Taiwan. The objectives of NMQ research and development (R&D) are 1) to develop a hydrometeorological platform for assimilating different observational networks toward creating high spatial and temporal resolution multisensor QPEs for flood warnings and water resource management and 2) to develop a seamless high-resolution national 3D grid of radar reflectivity for severe weather detection, data assimilation, numerical weather prediction model verification, and aviation product development.
Through about ten years of R&D, a real-time NMQ system has been implemented (http://nmq.ou.edu). Since June 2006, the system has been generating high-resolution 3D reflectivity mosaic grids (31 vertical levels) and a suite of severe weather and QPE products in real-time for the conterminous United States at a 1-km horizontal resolution and 2.5 minute update cycle. The experimental products are provided in real-time to end users ranging from government agencies and universities to research institutes and the private sector, and have been utilized in various meteorological, aviation, and hydrological applications. Further, a number of operational QPE products generated from different sensors (radar, gauge, satellite) and by human experts are ingested in the NMQ system, and the experimental products are evaluated against the operational products as well as independent gauge observations in real time.
The NMQ is a fully automated system. It facilitates systematic evaluations and advances of hydrometeorological sciences and technologies in a real-time environment and serves as a test bed for rapid science-to-operation infusions. This paper describes scientific components of the NMQ system and presents initial evaluation results and future development plans of the system.
2012: Toward understanding of differences in current cloud retrievals of ARM ground-based measurements. Journal of Geophysical Research, 117, D10206, 1–21, doi:10.1029/2011JD016792.
Accurate observations of cloud microphysical properties are needed for evaluating and improving the representation of cloud processes in climate models and for better estimates of the Earth's radiative budget. However, large differences are found in current cloud products retrieved from ground-based remote sensing measurements using various retrieval algorithms. Understanding these differences is an important step in addressing uncertainties in the cloud retrievals. In this study, an in-depth analysis of nine existing ground-based cloud retrievals using ARM remote sensing measurements is carried out. We place emphasis on boundary layer overcast clouds and high-level ice clouds, which are the focus of many current retrieval development efforts due to their radiative importance and relatively simple structure. Large systematic discrepancies in cloud microphysical properties are found in these two types of clouds among the nine cloud retrieval products, particularly for the cloud liquid and ice particle effective radius. Notably, the differences among some retrieval products are even larger than the prescribed uncertainties reported by the retrieval algorithm developers. It is shown that most of these large differences have their roots in the retrieval theoretical bases and assumptions, as well as input and constraint parameters. This study suggests the need to further validate current retrieval theories and assumptions, and possibly to develop new retrieval algorithms with more observations under different cloud regimes.
FY 2011 — 73 publications
2010: A digitized global flood inventory (1998-2008): Compilation and preliminary results. J. Natural Hazards, 55, 405–422, doi:10.1007/s11069-010-9537-2.
2011: Multi-platform comparisons of rain intensity for extreme precipitation events. IEEE Transactions on Geoscience and Remote Sensing, 99, 1–12.
Rainfall intensities during heavy rain events over the continental U.S. are compared for several advanced radar products. These products include the following: 1) Tropical Rainfall Measuring Mission (TRMM) precipitation radar (PR) near-surface estimates; 2) NOAA Quantitative Precipitation Estimation very high resolution (1 km, instantaneous) radar-only national mosaics (Q2); 3) very high resolution gauge-adjusted radar national mosaics, which we have developed by applying a gauge correction on the Q2 products; and 4) several independent C-band dual-polarimetric radar-estimated rainfall samples collected with the Advanced C-band Radar for Meteorological and Operational Research (ARMOR) radar in Alabama. These instantaneous rainfall rate fields [i.e., 1)–3)] can be considered as radar products with the largest coverage currently available from space- and ground-based radar observations. Although accumulated rainfall amounts are often similar, we find the PR and Q2 rain-rate histograms quite different. PR rain-rate histograms are shifted toward lower rain rates, implying a much larger stratiform/convective rain ratio than do products such as Q2. The shift is more evident during strong continental convective storms and not as pronounced in tropical rain. A “continental/maritime regime” behavior is also observed upon adjusting the Q2 products to rain gauges, yet the rain amount more closely agrees with that of PR. The independent PR/ARMOR comparisons confirm this systematic regime behavior. In addition, comparisons are performed over central Florida where PR, Q2, and the NASA TRMM ground validation products are available. These comparisons show large discrepancies among all three products. Resolving the large discrepancies between the products presents an important set of challenges related to improving remote-sensing estimates of precipitation in general and during extreme events in particular.
2011: Investigating cloud radar sensitivity to optically thin cirrus using collocated Raman lidar observations. Geophysical Research Letters, 38, L05807, doi:10.1029/2010GL046365.
The sensitivity of the millimeter cloud radar (MMCR) to optically thin single-layer cirrus at the Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) site is investigated using collocated Raman lidar observations. The sensitivity is characterized in terms of cloud optical depth (OD) and infrared (IR) radiative flux using over three years of coincident Raman lidar and MMCR observations. For cases when the Raman lidar is not fully attenuated (OD < 2.0), the MMCR detects approximately 70% of the total cloud OD, with the majority of missed cloud OD occurring near cloud top. If only MMCR observations are used for computing cloudy top-of-the-atmosphere (TOA) IR flux, the missed cloud OD results in TOA flux biases from 0 to over 100 W/m2; however, the most frequently occurring bias is approximately 16 W/m2. This result highlights the importance of combining Raman lidar, or other sensitive cloud lidars that are able to measure cloud extinction directly, with the MMCR in order to accurately characterize the cloud radiative forcing for thin cirrus cases.
2011: Attenuation and differential attenuation of 5-cm-wavelength radiation in melting hail. Journal of Applied Meteorology and Climatology, 50, 59–76.
Presented are quantitative estimates of specific attenuation and specific differential attenuation of 5-cm-wavelength radiation (C band) obtained by comparison with measurements at 10-cm wavelength (S band), which are much less affected by attenuation. The data originated from two almost-collocated radars in central Oklahoma. To avoid biases in the estimates, the slopes with respect to range of the differences in reflectivities and differential reflectivities are assumed to represent the specific attenuations. Observations on a day with no reports of hail on the ground and on a day with large hail are contrasted. A simple one-dimensional model of melting hail is used to qualify these observations. Examples of volumetric fields of the polarimetric variables obtained at the two wavelengths are presented to illustrate that much can be learned about the size, orientation, and phase of hydrometeors over volumes that play a role in precipitation formation.
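The slope method described above is easy to demonstrate on synthetic data: treating the S-band reflectivity as unattenuated, the range slope of Z_S − Z_C (in dB) equals twice the one-way specific attenuation at C band. The profile and attenuation value below are invented for illustration only.

```python
import numpy as np

rng_km = np.arange(0.0, 50.0, 1.0)        # range gates (km)
A_true = 0.4                              # assumed one-way specific attenuation (dB/km)

z_s = 45.0 - 0.1 * rng_km                 # S-band reflectivity (dBZ), unattenuated
z_c = z_s - 2.0 * A_true * rng_km         # C band suffers two-way path attenuation

# Fit the range slope of the reflectivity difference; half of it is the
# one-way specific attenuation estimate.
diff = z_s - z_c
slope_db_per_km = np.polyfit(rng_km, diff, 1)[0]
A_est = slope_db_per_km / 2.0
print(A_est)
```

On real data the fit would be restricted to rain-filled segments and the intercept absorbed calibration offsets between the two radars, which is why the paper uses slopes rather than raw differences.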
2011: Evaluation of European Storm Forecast Experiment (ESTOFEX) forecasts. Atmospheric Research, 100, 538–546, doi:10.1016/j.atmosres.2010.09.004.
Three years of forecasts of lightning and severe thunderstorms from the European Storm Forecast Experiment (ESTOFEX) have been evaluated. The forecasts exhibit higher quality in summer than in winter and there is some evidence that they have improved over the course of the evaluation. Five individual forecasters made the majority of the forecasts and differences in their forecasts are on the order of the overall variability of the forecast quality. As a result, the forecasts appear to come from a single unit, rather than from a group of individuals. The graphical description of the probability of detection and frequency of hits recently developed by Roebber is a valuable tool for displaying the time series of lightning forecast performance. It also appears that, even though they are not intended for that purpose, using the lightning forecasts as a low-end forecast of severe thunderstorms is potentially useful for decision makers.
2011: A five-year climatology of tornado false alarms. Weather and Forecasting, 26, 534–544.
During 2008 approximately 75% of tornado warnings issued by the National Weather Service (NWS) were false alarms. This study investigates some of the climatological trends in the issuance of false alarms and highlights several factors that impact false alarm ratio (FAR) statistics. All tornadoes and tornado warnings issued across the continental U.S. between 2000 and 2004 were analyzed, and the data were sorted by hour of the day, month of the year, geographical region and weather forecast office (WFO), the number of tornadoes observed on a day in which a false alarm was issued, distance of the warned area from the nearest NWS radar, county population density and county area.
Analysis of the tornado false alarm data identified six specific trends. First, the FAR was highest during non-peak storm periods, such as at night and during the winter and late summer. Second, the FAR was strongly tied to the number of tornadoes warned per day. Nearly one-third of all false alarms were issued on days when no tornadoes were confirmed within the WFO’s county warning area. Third, the FAR varied with distance from radar, with significantly lower values found beyond 150 km from radar. Fourth, the FAR varied with population density. For warnings within 50 km of an NWS radar, FAR increased with population density; however, for warnings beyond 150 km from radar, FAR decreased regardless of population density. Fifth, the FAR also varied as a function of county size. The FAR was generally highest for the smallest counties; the FAR was ~80% for all counties < 1000 km2 regardless of distance from radar. Finally, the combined effects of distance from radar, population density, and county size led to significant variability across geographic regions.
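For readers unfamiliar with the metric: the false alarm ratio is the fraction of issued warnings that were not verified by an observed event, FAR = FA / (FA + hits). A minimal sketch, with counts chosen to reproduce the ~75% figure quoted above:

```python
def false_alarm_ratio(hits, false_alarms):
    """FAR = false alarms / total warnings issued (hits + false alarms)."""
    total = hits + false_alarms
    return false_alarms / total if total else 0.0

# Illustrative counts only: 100 warnings, 25 verified by a tornado.
print(false_alarm_ratio(hits=25, false_alarms=75))
```

Note that FAR is computed over warnings issued, not over tornadoes that occurred, which is why it can be high even when most tornadoes are successfully warned.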
2010: Formation of Charge Structures in a Supercell. Monthly Weather Review, 138, 3740–3761, doi:10.1175/2010MWR3160.1.
Lightning mapping, electric field, and radar data from the 26 May 2004 supercell in central Oklahoma are used to examine the storm’s charge structure. An initial arc-shaped maximum in lightning activity on the right flank of the storm’s bounded weak echo region was composed of an elevated normal polarity tripole in the region of precipitation lofted above the storm’s weak echo region. Later in the storm, two charge structures were associated with precipitation that reached the ground. To the left of the weak echo region, six charge regions were inferred, with positive charge carried on hail at the bottom of the stack. Farther forward in the storm’s precipitation region, four charge regions were inferred, with negative charge at the bottom of the stack. There were different charge structures in adjacent regions of the storm at the same time, and regions of opposite polarity charge were horizontally adjacent at the same altitude. Flashes occasionally lowered positive charge to ground from the forward charge region. A conceptual model is presented that ties charge structure in different regions of the storm to storm structure inferred from radar reflectivity.
2011: Evaluation of water permittivity models from ground-based observations of cold clouds at frequencies between 23 and 170 GHz. IEEE Transactions on Geoscience and Remote Sensing, 49, 2999–3008, doi:10.1109/TGRS.2011.2121074.
Accurate retrievals of liquid water path (LWP) from passive microwave radiometers rely on the use of radiative transfer models to describe the absorption of radiation by various atmospheric components. When clouds are present, atmospheric absorption is affected by the dielectric properties of liquid water. In this paper, we use measurements from four microwave radiometers to assess four models of the complex permittivity of water. The observations are collected at five frequencies between 23.8 and 170 GHz. The purpose of the study is to compare measurements of microwave absorption with model computations in supercooled liquid clouds that have temperatures between 0°C and −30°C. Models of liquid water permittivity in this temperature range suffer from a lack of laboratory measurements and are generally derived from the extrapolation of available data. An additional rationale for this work is to examine to what degree the use of different dielectric models affects the retrieval of LWP in supercooled liquid clouds. Inaccuracies in modeling the water permittivity at low temperatures are likely one of the largest sources of retrieval uncertainty in supercooled clouds, uncertainty that could offset the advantages offered by the enhanced sensitivity of channels at frequencies at and above 90 GHz relative to lower frequencies.
2011: Computing Rossby potential vorticity in terrain-following coordinates. Monthly Weather Review, 139, 2955–2961.
2011: Thermodynamic atmospheric profiling during the 2010 Winter Olympics using ground-based microwave radiometry. IEEE Trans. Geosci. Rem. Sens., 99, 1–11, doi:10.1109/TGRS.2011.2154337.
Ground-based microwave radiometer profilers in the 20–60-GHz range operate continuously at numerous sites in different climate regions. Recent work suggests that a 1-D variational (1-DVAR) technique, coupling radiometric observations with outputs from a numerical weather prediction model, may outperform traditional retrieval methods for temperature and humidity profiling. The 1-DVAR technique is applied here to observations from a commercially available microwave radiometer deployed at Whistler, British Columbia, which was operated by Environment Canada to support nowcasting and short-term weather forecasting during the Vancouver 2010 Winter Olympic and Paralympic Winter Games. The analysis period included rain, sleet, and snow events (∼235-mm total accumulation and rates up to 18 mm/h). The 1-DVAR method is applied “quasi-operationally,” i.e., as it could have been applied in real time, as no data were culled. The 1-DVAR-achieved accuracy has been evaluated by using simultaneous radiosonde and ceilometer observations as reference. For atmospheric profiling from the surface to 10 km, we obtain retrieval errors within 1.5 K for temperature and 0.5 g/m3 for water vapor density. The retrieval accuracy for column-integrated water vapor is 0.8 kg/m2, with small bias (−0.1 kg/m2) and excellent correlation (0.96). The retrieval of cloud properties shows a high probability of detection of cloud/no cloud (0.8/0.9, respectively), low false-alarm ratio (0.1), and cloud-base height estimate error within ∼0.60 km.
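In its linear form, the 1-DVAR analysis step combines the background profile x_b with observations y through the gain matrix: x_a = x_b + B Hᵀ (H B Hᵀ + R)⁻¹ (y − H x_b). The sketch below implements that single step with toy dimensions and error covariances; the two "channels" acting as layer averages are an assumption for illustration, not the radiometer's actual weighting functions.

```python
import numpy as np

def onedvar_step(x_b, y, H, B, R):
    """One linear 1-DVAR (optimal estimation) update:
    x_a = x_b + K (y - H x_b), with Kalman gain K = B H^T (H B H^T + R)^-1."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return x_b + K @ (y - H @ x_b)

n_state, n_obs = 5, 2
x_b = np.zeros(n_state)                 # background temperature-anomaly profile
H = np.zeros((n_obs, n_state))          # two channels, each an (assumed) layer average
H[0, :3] = 1.0 / 3.0
H[1, 2:] = 1.0 / 3.0
B = 1.0 * np.eye(n_state)               # background-error covariance (toy)
R = 0.25 * np.eye(n_obs)                # observation-error covariance (toy)
y = np.array([1.0, 0.5])                # observed brightness-temperature departures

x_a = onedvar_step(x_b, y, H, B, R)
print(x_a)
```

Coupling with an NWP model, as in the paper, enters through x_b and B: the forecast supplies the background profile and its error statistics, which is what lets 1-DVAR outperform purely radiometric regressions.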
2010: Neighborhood-based verification of precipitation forecasts from convection-allowing NCAR WRF Model simulations and the operational NAM. Weather and Forecasting, 25, 1495–1509.
2011: Probabilistic precipitation forecast skill as a function of ensemble size and spatial scale in a convection-allowing ensemble. Monthly Weather Review, 139, 1410–1418, doi:10.1175/2010MWR3624.1.
2011: Environment and Early Evolution of the 8 May 2009 Derecho-Producing Convective System. Monthly Weather Review, 139, 1083–1102, doi:10.1175/2010MWR3413.1.
This study documents the complex environment and early evolution of the remarkable derecho that traversed portions of the central United States on 8 May 2009. Central to this study is the comparison of the 8 May 2009 derecho environment to that of other mesoscale convective systems (MCSs) that occurred in the central United States during a similar time of year. Synoptic-scale forcing was weak and thermodynamic instability was limited during the development of the initial convection, but several mesoscale features of the environment appeared to contribute to initiation and upscale growth, including a mountain wave, a midlevel jet streak, a weak midlevel vorticity maximum, a “Denver cyclone,” and a region of upper-tropospheric inertial instability.
The subsequent MCS developed in an environment with an unusually strong and deep low-level jet (LLJ), which transported exceptionally high amounts of low-level moisture northward very rapidly, destabilized the lower troposphere, and enhanced frontogenetical circulations that appeared to aid convective development. The thermodynamic environment ahead of the developing MCS contained unusually high precipitable water (PW) and very large midtropospheric lapse rates, compared to other central plains MCSs. Values of downdraft convective available potential energy (DCAPE), mean winds, and 0–6-km vertical wind shear were not as anomalously large as the PW, lapse rates, and LLJ. In fact, the DCAPE values were lower than the mean values in the comparison dataset. These results suggest that the factors contributing to updraft strength over a relatively confined area played a significant role in generating the strong outflow winds at the surface, by providing a large volume of hydrometeors to drive the downdrafts.
2011: CORRIGENDUM. Monthly Weather Review, 139, 2686–2688.
2011: Comparing Theory and Measurements of Cross-Polar Fields of a Phased Array Weather Radar. IEEE Geoscience and Remote Sensing Letters, 8, 1002–1006, doi:10.1109/LGRS.2011.2146753.
Cross-polar measurements made with an agile-beam phased-array weather radar are compared with theory. The intensity of cross-polar fields places conditions on the accuracy of meteorological measurements. Results reported herein support the hypothesis that polarimetric phased-array radar for weather observations can be designed to allow use of the polarimetric data acquisition mode being implemented by the National Weather Service on upgraded WSR-88Ds, which use parabolic reflector antennas.
2011: Ensemble Kalman filter assimilation of radar observations of the 8 May 2003 Oklahoma City supercell: Influences of reflectivity observations on storm-scale analyses. Monthly Weather Review, 139, doi:10.1175/2010MWR3438.1.
Ensemble Kalman filter (EnKF) techniques have been proposed for obtaining atmospheric state estimates on the scale of individual convective storms from radar and other observations, but tests of these methods with observations of real convective storms are still very limited. In the current study, radar observations of the 8 May 2003 Oklahoma City tornadic supercell thunderstorm were assimilated into the National Severe Storms Laboratory (NSSL) Collaborative Model for Multiscale Atmospheric Simulation (NCOMMAS) with an EnKF method. The cloud model employed 1-km horizontal grid spacing, a single-moment bulk precipitation-microphysics scheme, and a base state initialized with sounding data. A 50-member ensemble was produced by randomly perturbing base-state wind profiles and by regularly adding random local perturbations to the horizontal wind, temperature, and water vapor fields in and near observed precipitation.
In a reference experiment, only Doppler-velocity observations were assimilated into the NCOMMAS ensemble. Then, radar-reflectivity observations were assimilated together with Doppler-velocity observations in subsequent experiments. Influences that reflectivity observations have on storm-scale analyses were revealed through parameter-space experiments by varying observation availability, observation errors, ensemble spread, and choices for what model variables were updated when a reflectivity observation was assimilated. All experiments produced realistic storm-scale analyses that compared favorably with independent radar observations. Convective storms in the NCOMMAS ensemble developed more quickly when reflectivity observations and velocity observations were both assimilated rather than only velocity, presumably because the EnKF utilized covariances between reflectivity and unobserved model fields such as cloud water and vertical velocity in efficiently developing realistic storm features.
Recurring spatial patterns in the differences between predicted and observed reflectivity were noted particularly at low levels, downshear of the supercell’s updraft, in the anvil of moderate-to-light precipitation, where reflectivity in the model was typically lower than observed. Bias errors in the predicted rain mixing ratios and/or the size distributions that the bulk scheme associates with these mixing ratios are likely responsible for this reflectivity underprediction. When a reflectivity observation is assimilated, bias errors in the model fields associated with reflectivity (rain, snow, and hail–graupel) can be projected into other model variables through the ensemble covariances. In the current study, temperature analyses in the downshear anvil at low levels, where reflectivity was underpredicted, were very sensitive both to details of the assimilation algorithm and to ensemble spread in temperature. This strong sensitivity suggests low confidence in analyses of low-level cold pools obtained through reflectivity-data assimilation.
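The mechanism invoked above, ensemble covariances projecting a reflectivity innovation onto unobserved fields, can be sketched with a minimal single-point mean update (an illustrative toy, not the NCOMMAS/EnKF implementation; all numbers are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 50-member ensemble at one grid point: column 0 is reflectivity (dBZ),
# column 1 is an unobserved field (say, vertical velocity). Both are driven
# by a shared signal, so they are correlated, as in a real storm ensemble.
n = 50
signal = rng.normal(size=n)
ens = np.column_stack([30.0 + 5.0 * signal + rng.normal(0.0, 1.0, n),
                       5.0 + 2.0 * signal + rng.normal(0.0, 0.5, n)])

obs = 40.0          # reflectivity observation (invented)
obs_err_var = 4.0   # observation-error variance (invented)

# EnKF-style mean update: the gain spreads the reflectivity innovation into
# every state variable through its sample covariance with reflectivity.
x_mean = ens.mean(axis=0)
anom = ens - x_mean
cov_with_refl = anom.T @ anom[:, 0] / (n - 1)   # cov(state, reflectivity)
gain = cov_with_refl / (cov_with_refl[0] + obs_err_var)
x_mean_updated = x_mean + gain * (obs - x_mean[0])
```

Because the two variables covary across the ensemble, assimilating the single reflectivity observation also shifts the unobserved field, which is exactly the pathway by which biased reflectivity can contaminate temperature analyses.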
2010: On characterizing the error in a remotely sensed liquid water content profile. Atmospheric Research, 98, 57–68, doi:10.1016/j.atmosres.2010.06.002.
The accuracy of a liquid water content profile retrieval using microwave radiometer brightness temperatures and/or cloud radar reflectivities is investigated for two realistic cloud profiles. The interplay of the errors of the a priori profile, measurements and forward model on the retrieved liquid water content error and on the information content of the measurements is analyzed in detail. It is shown that the inclusion of the microwave radiometer observations in the liquid water content retrieval increases the number of degrees of freedom (independent pieces of information) by about 1 compared to a retrieval using data from the cloud radar alone. Assuming realistic measurement and forward model errors, it is further demonstrated that the error in the retrieved liquid water content is 60% or larger if no a priori information is available, and that a priori information is essential for better accuracy. However, there are few observational datasets available to construct accurate a priori profiles of liquid water content, and thus more observational data are needed to improve the knowledge of the a priori profile and consequently the corresponding error covariance matrix. Accurate liquid water content profiles are essential for cloud-radiation interaction studies. For the two cloud profiles of this study, the impact of a 30% liquid water content error on the shortwave and longwave surface fluxes and on the atmospheric heating rates is illustrated.
2011: Cloud statistics and cloud radiative effect for a low-mountain site. Quarterly Journal of the Royal Meteorological Society, 137, 306–324, doi:10.1002/qj.748.
In 2007, the Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF) was operated for a nine-month period in the Murg Valley, Black Forest, Germany, in support of the Convective and Orographically-induced Precipitation Study (COPS). The synergy of AMF and COPS partner instrumentation was exploited to derive a set of high-quality thermodynamic and cloud property profiles with 30 s resolution. In total, clouds were present 72% of the time, with multi-layer mixed phase (28.4%) and single-layer water clouds (11.3%) occurring most frequently. A comparison with the Cloudnet sites Chilbolton and Lindenberg for the same time period revealed that the Murg Valley exhibits lower liquid water paths (LWPs; median = 37.5 g m−2) compared to the two sites located in flat terrain. In order to evaluate the derived thermodynamic and cloud property profiles, a radiative closure study was performed with independent surface radiation measurements. In clear sky, average differences between calculated and observed surface fluxes are less than 2% and 4% for the short wave and long wave part, respectively. In cloudy situations, differences between simulated and observed fluxes, particularly in the short wave part, are much larger, but most of these can be related to broken cloud situations. The daytime cloud radiative effect (CRE), i.e. the difference of cloudy and clear-sky net fluxes, has been analysed for the whole nine-month period. For overcast, single-layer water clouds, sensitivity studies revealed that the CRE uncertainty is determined in roughly equal measure by uncertainties in liquid water content and effective radius. For low LWP clouds, CRE uncertainty is dominated by LWP uncertainty; therefore refined retrievals, such as using infrared and/or higher microwave frequencies, are needed.
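As a reminder of the definition used above, the CRE is simply the cloudy-minus-clear difference in net surface fluxes; a minimal sketch with invented flux values:

```python
# Invented surface fluxes in W m^-2; "net" means downwelling minus upwelling.
sw_down_cloudy, sw_up_cloudy = 420.0, 60.0
lw_down_cloudy, lw_up_cloudy = 350.0, 400.0
sw_down_clear, sw_up_clear = 700.0, 100.0
lw_down_clear, lw_up_clear = 300.0, 400.0

def net(down, up):
    return down - up

net_cloudy = net(sw_down_cloudy, sw_up_cloudy) + net(lw_down_cloudy, lw_up_cloudy)
net_clear = net(sw_down_clear, sw_up_clear) + net(lw_down_clear, lw_up_clear)
cre = net_cloudy - net_clear  # negative here: clouds cut SW gain more than they add LW
```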
2011: The NSSL hydrometeor classification algorithm in winter surface precipitation: evaluation and future development. Weather and Forecasting, 26, 756–765.
The National Severe Storms Laboratory (NSSL) has developed a hydrometeor classification algorithm (HCA) for use with the polarimetric upgrade of the current WSR-88D radar network. The algorithm was developed specifically for warm-season convection, but it runs regardless of season, so its performance on surface precipitation type during winter events is examined here. The HCA output is compared to collocated (in time and space) observations of precipitation type provided by the public. The Peirce skill score (PSS) shows that the NSSL HCA applied to winter surface precipitation displays little skill, with a PSS of only 0.115. Further analysis indicates that HCA failures are strongly linked to the inability of the HCA to accommodate refreezing below the first freezing level and to errors in the melting-level detection algorithm. However, entrants in the 2009 American Meteorological Society Seventh Annual Artificial Intelligence competition developed classification methods that yield a PSS of 0.35 using a subset of available radar data merged with limited environmental data. Thus, when polarimetric radar data and environmental data are appropriately combined, more information about winter surface precipitation type is available than from either data source alone.
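For reference, in the two-category case the Peirce skill score reduces to the probability of detection minus the probability of false detection; a minimal sketch with hypothetical contingency-table counts:

```python
# Hypothetical 2x2 contingency table for a two-category precipitation-type forecast.
hits, misses = 120, 80                 # event observed
false_alarms, correct_nulls = 60, 740  # event not observed

pod = hits / (hits + misses)                          # probability of detection
pofd = false_alarms / (false_alarms + correct_nulls)  # probability of false detection
pss = pod - pofd                                      # Peirce skill score
```

A PSS of 0 means no skill over random chance and 1 means a perfect categorical forecast, which is why 0.115 versus 0.35 is a meaningful gap.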
2011: Lightning activity in a hail-producing storm observed with phased-array radar. Monthly Weather Review, 139, 1809–1825, doi:10.1175/2010MWR3574.1.
This study examined lightning activity relative to the rapidly evolving kinematics of a hail-producing storm on 15 August 2006. Data were provided by the National Weather Radar Testbed Phased-Array Radar, the Oklahoma Lightning Mapping Array, and the National Lightning Detection Network.
This analysis is the first to compare the electrical characteristics of a hail-producing storm with reflectivity and radial velocity structure at temporal resolutions of <1 min. Total flash rates increased to ~220 per minute as the storm’s updraft first intensified, leveled off during its first mature stage, and then decreased for 2–3 min despite the simultaneous development of another updraft surge. This reduction in flash rate occurred as wet hail formed in the new updraft and was likely related to the wet growth; wet growth is not conducive to hydrometeor charging and probably contributed to the formation of a “lightning hole” without a mesocyclone. Total flash rates subsequently increased to ~450 per minute as storm volume and inferred graupel volume increased, and then decreased as the storm dissipated. Vertical charge structure in the storm initially formed a positive tripole (midlevel negative charge between upper and lower positive charges). Charge structure in the second updraft surge consisted of negative charge above deep midlevel positive charge, a reversal consistent with the effect of large liquid water contents on hydrometeor charge polarity in laboratory experiments. Prior to the second updraft surge, the storm produced two cloud-to-ground flashes, both lowering the usual negative charge to ground. Shortly before hail likely reached ground, the storm produced four cloud-to-ground flashes, all lowering positive charge. Episodes of high singlet VHF sources were observed at ~13–15 km during the initial formation and later intensification of the storm’s updraft.
2011: Significance of the Coupled Term in the Doppler Weather Radar Spectrum Width Equation. Journal of Atmospheric and Oceanic Technology, 28, 539–547, doi:10.1175/2010JTECHA1442.1.
There is an additional zero mean random variable term that couples mean wind shear and turbulence in the Doppler radar spectrum width equation. This random variable, labeled the “coupled term”, has been neglected heretofore in the literature. Herein the variance of the squared spectrum width ascribed to this coupled term is determined from data collected with a WSR-88D in two snow storms; it can exceed 1 m^4 s^-4. Thus this coupled term can be a significant contributor to the variance of the spectrum width and must be considered when using spectrum width to deduce turbulence.
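Schematically, and in notation chosen here rather than taken from the paper, the decomposition under discussion can be written as

```latex
\sigma_v^2 \;=\; \sigma_s^2 \;+\; \sigma_t^2 \;+\; c,
\qquad
c \;=\; 2\,\overline{\bigl[S\,(r - r_0)\bigr]\, v_t(r)},
```

where \(\sigma_s^2\) is the mean-shear contribution, \(\sigma_t^2\) the turbulence contribution, \(S\) the mean radial shear, \(v_t\) the turbulent radial velocity, and the overbar a reflectivity-weighted average over the resolution volume. Because \(v_t\) has zero mean, \(c\) has zero expectation, but its variance adds to the variance of the squared spectrum width, and that added variance is the contribution quantified here from the WSR-88D snowstorm data.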
2011: Bias in copolar correlation coefficient caused by antenna radiation patterns. IEEE Transactions on Geoscience and Remote Sensing, 49, 2274–2284.
We present a theoretical study of the bias in the copolar correlation coefficient caused by cross-polar radiation patterns and by unmatched horizontal and vertical copolar radiation patterns. The analysis of the bias induced by cross-polarization radiation is carried out for both modes of operation of polarimetric radars, designated as the simultaneous transmission and reception of horizontally and vertically polarized waves and the alternate transmission of horizontally and vertically polarized waves, respectively. The bias caused by unmatched horizontal and vertical copolar radiation patterns as a function of slight differences in pointing angles and beamwidths is also analyzed. In well-designed weather radars, for the purpose of hydrometeor classification, the overall acceptable bias in the copolar correlation coefficient should be less than about 0.01. The levels of cross-to-copolar gain ratios for acceptable performance are indicated. Ultimately, pointing angle and beamwidth tolerances are indicated for horizontal and vertical copolar antenna patterns.
2011: Long-term trends in downwelling spectral infrared radiance over the U.S. Southern Great Plains. Journal of Climate, 24, 4831–4843, doi:10.1175/2011JCLI4210.1.
A trend analysis was applied to a 14-yr time series of downwelling spectral infrared radiance observations from the Atmospheric Emitted Radiance Interferometer (AERI) located at the Atmospheric Radiation Measurement Program (ARM) site in the U.S. Southern Great Plains. The highly accurate calibration of the AERI instrument, performed every 10 min, ensures that any statistically significant trend in the observed data over this time can be attributed to changes in the atmospheric properties and composition, and not to changes in the sensitivity or responsivity of the instrument. The measured infrared spectra, numbering more than 800 000, were classified as clear-sky, thin cloud, and thick cloud scenes using a neural network method. The AERI data record demonstrates that the downwelling infrared radiance is decreasing over this 14-yr period in the winter, summer, and autumn seasons but it is increasing in the spring; these trends are statistically significant and are primarily due to long-term change in the cloudiness above the site. The AERI data also show many statistically significant trends on annual, seasonal, and diurnal time scales, with different trend signatures identified in the separate scene classifications. Given the decadal time span of the dataset, effects from natural variability should be considered in drawing broader conclusions. Nevertheless, this dataset has high value owing to the ability to infer possible mechanisms for any trends from the observations themselves and to test the performance of climate models.
2010: Remote collection and analysis of high-resolution data on flash floods. Journal of Hydrology, 394, 53–62, doi:10.1016/j.jhydrol.2010.05.042.
Typically, flash floods are studied ex post facto in response to a major impact event. A complement to field investigations is developing a detailed database of flash flood events, including minor events and null reports (i.e., where heavy rain occurred but there was no flash flooding), based on public survey questions conducted in near-real time. The Severe Hazards Analysis and Verification Experiment (SHAVE) has been in operation at the National Severe Storms Laboratory (NSSL) in Norman, OK, USA during the summers since 2006. The experiment employs undergraduate students to analyse real-time products from weather radars, target specific regions within the conterminous US, and poll public residences and businesses regarding the occurrence and severity of hail, wind, tornadoes, and now flash floods. In addition to providing a rich learning experience for students, SHAVE has also been successful in creating high-resolution datasets of severe hazards used for algorithm and model verification. This paper describes the criteria used to initiate the flash flood survey, the specific questions asked and information entered to the database, and then provides an analysis of results for flash flood data collected during the summer of 2008. It is envisioned that specific details provided by the SHAVE flash flood observation database will complement databases collected by operational agencies (i.e., US National Weather Service Storm Data reports) and thus lead to better tools to predict the likelihood of flash floods and ultimately reduce their impacts on society.
2011: Relationship between sounding derived parameters and the strength of tornadoes in Europe and the USA from reanalysis data. Atmospheric Research, 100, 479–488, doi:10.1016/j.atmosres.2010.11.011.
Proximity soundings from reanalysis data for tornado events in Europe for the years 1958 to 1999 and in the US for the years 1991 to 1999 have been used for generating distributions of parameter combinations important for severe convection. They include parcel updraft velocity (WMAX) and deep-layer shear (DLS), lifting condensation level (LCL) and deep-layer shear (DLS), and LCL and shallow-layer shear (LLS) for weak and significant tornadoes. We investigate how well they discriminate between weak and significant tornadoes. For Europe, these distributions have been generated for unrated, F0 and F1 tornadoes as well to discover if the unrated tornadoes can be associated with the weak tornadoes. The pattern of parameter combination distributions for unrated tornadoes in Europe strongly resembles the pattern of F0 tornadoes. Thus, the unrated tornadoes are likely to consist of mostly F0 tornadoes. Consequently, the unrated tornadoes have been included into the weak tornadoes and distributions of parameter combinations have been generated for these. In Europe, none of the three combinations can discriminate well between weak and significant tornadoes, but all can discriminate if the unrated tornadoes are included with the weak tornadoes (unrated/weak). In the US, the combinations of LCL and either of the shear parameters discriminate well between weak and significant tornadoes, with significant tornadoes occurring at lower LCL and higher shear values than the weak ones. In Europe, the shear shows the same behavior, but the LCL behaves differently, with significant tornadoes occurring at higher LCL than the unrated/weak ones. The combination of WMAX and DLS is a good discriminator between unrated/weak and significant tornadoes in Europe, but not in the US, with significant tornadoes occurring at a higher WMAX and DLS than the unrated/weak tornadoes.
2011: Polarimetric attenuation correction in heavy rain at C band. Journal of Applied Meteorology and Climatology, 50, 39–58.
2011: Assimilation of surface-based boundary-layer profiler observations during a cool season observation system simulation experiment. Part 2: Forecast assessment. Monthly Weather Review, 139, 2327–2346, doi:10.1175/2011MWR3623.1.
In this study, atmospheric analyses obtained through assimilation of temperature, water vapor, and wind profiles from a potential network of ground-based remote sensing boundary layer profiling instruments were used to generate short-range ensemble forecasts for each assimilation experiment performed in Part I. Remote sensing systems evaluated during this study include the Doppler wind lidar (DWL), Raman lidar (RAM), microwave radiometer (MWR), and the Atmospheric Emitted Radiance Interferometer (AERI). Overall, the results show that the most accurate forecasts were achieved when mass (temperature and humidity profiles from the RAM, MWR, and/or AERI) and momentum (wind profiles from the DWL) observations were assimilated simultaneously, which is consistent with the main conclusion from Part I. For instance, the improved wind and moisture analyses obtained through assimilation of these observations contributed to more accurate forecasts of moisture flux convergence and the intensity and location of accumulated precipitation (ACPC) due to improved dynamical forcing and mesoscale boundary layer thermodynamic structure. An object-based verification tool was also used to assess the skill of the ACPC forecasts. Overall, total interest values for ACPC matched objects, along with traditional forecast skill statistics like the equitable threat score and critical success index, were most improved in the multisensor assimilation cases.
2011: High-temporal-resolution capabilities of the National Weather Radar Testbed Phased-Array Radar. Journal of Applied Meteorology and Climatology, 50, 579–593.
Since 2007 the advancement of the National Weather Radar Testbed Phased-Array Radar (NWRT PAR) hardware and software capabilities has been supporting the implementation of high-temporal-resolution (1 min) sampling. To achieve the increase in computational power and data archiving needs required for high-temporal-resolution sampling, the signal processor was upgraded to a scalable, Linux-based cluster with a distributed computing architecture. The development of electronic adaptive scanning, which can reduce update times by focusing data collection on significant weather, became possible through functionality added to the radar control interface and real-time controller. Signal processing techniques were implemented to address data quality issues, such as artifact removal and range-and-velocity ambiguity mitigation, absent from the NWRT PAR at its installation. The hardware and software advancements described above have made possible the development of conventional and electronic scanning capabilities that achieve high-temporal-resolution sampling. Those scanning capabilities are sector- and elevation-prioritized scanning, beam multiplexing, and electronic adaptive scanning. Each of these capabilities and related sampling trade-offs are explained and demonstrated through short case studies.
2011: A Preliminary Look at the Social Perspective of Warn-on-Forecast: Preferred Tornado Warning Lead Time and the General Public's Perceptions of Weather Risks. Weather, Climate, and Society, 3, 128–140.
Tornado warnings are currently issued an average of 13 minutes in advance of a tornado (Golden and Adams 2000) and are based on a warn-on-detection paradigm (Erickson and Brooks 2006). However, computer model improvements may allow for a new warning paradigm, warn-on-forecast, to be established in the future (Stensrud et al. 2009). This would mean that tornado warnings could be issued one to two hours in advance, prior to storm initiation. In anticipation of the technological innovation, this study inquires whether the “warn-on-forecast” paradigm for tornado warnings may be preferred by the public (i.e., individuals and single families). Our sample is drawn from visitors to the National Weather Center in Norman, Oklahoma. During the summer and fall of 2009, surveys were distributed to 320 participants to assess their understanding and perception of weather risks and preferred tornado warning lead-time.
Responses were analyzed according to several different parameters including age, region of residency, educational level, number of children, and prior tornado experience. A majority of the respondents answered many of the weather risk questions correctly. They seemed to be familiar with tornado seasons; however, they were unaware of the relative number of fatalities caused by tornadoes and several additional weather phenomena each year in the United States. The preferred lead-time was 34.3 minutes according to average survey responses. This suggests that while the general public may currently prefer a longer average lead-time than the present system offers, the preference does not extend to the one to two hour time-frame theoretically offered by the warn-on-forecast system. When asked what they would do if given a one-hour lead-time, respondents reported that taking shelter was a lesser priority than when given a 15-minute lead-time, and fleeing the area became a slightly more popular alternative. A majority of respondents also reported the situation would feel less life threatening if given a one-hour lead-time. These results suggest that how the public responds to longer lead times may be complex and situationally dependent, and further study must be conducted to ascertain the users for whom the longer lead-times would carry the most value. These results form the basis of an informative stated-preference approach to predicting public response to long (> 1 hour) warning lead times, using public understanding of the risks posed by severe weather events to contextualize lead-time demand.
2010: Automatic detection of wind turbine clutter for weather radars. Journal of Atmospheric and Oceanic Technology, 27, 1868–1880.
2011: A reanalysis of MODIS fine mode fraction over ocean using OMI and daily GOCART simulations. Atmospheric Chemistry and Physics, 11, 5805–5817, doi:10.5194/acp-11-5805-2011.
Using daily Goddard Chemistry Aerosol Radiation and Transport (GOCART) model simulations and columnar retrievals of 0.55 μm aerosol optical thickness (AOT) and fine mode fraction (FMF) from the Moderate Resolution Imaging Spectroradiometer (MODIS), we estimate the satellite-derived aerosol properties over the global oceans between June 2006 and May 2007 due to black carbon (BC), organic carbon (OC), dust (DU), sea-salt (SS), and sulfate (SU) components. Using Aqua-MODIS aerosol properties embedded in the CERES-SSF product, we find that the mean MODIS FMF values for each aerosol type are SS: 0.31±0.09, DU: 0.49±0.13, SU: 0.77±0.16, and (BC+OC): 0.80±0.16. We further combine information from the ultraviolet spectrum using the Ozone Monitoring Instrument (OMI) onboard the Aura satellite to improve the classification process, since dust and carbonaceous aerosols have positive Aerosol Index (AI) values > 0.5 while other aerosol types have near zero values. By combining MODIS and OMI datasets, we were able to identify and remove data in the SU, OC, and BC regions that were not associated with those aerosol types. The same methods used to estimate aerosol size characteristics from MODIS data within the CERES-SSF product were applied to Level 2 (L2) MODIS aerosol data from both Terra and Aqua satellites for the same time period. As expected, FMF estimates from L2 Aqua data agreed well with the CERES-SSF dataset, also from Aqua. However, the FMF estimate for DU from Terra data was significantly lower (0.37 vs. 0.49) indicating that sensor calibration, sampling differences, and/or diurnal changes in DU aerosol size characteristics were occurring. Differences for other aerosol types were generally smaller. Sensitivity studies show that a difference of 0.1 in the estimate of the anthropogenic component of FMF produces a corresponding change of 0.2 in the anthropogenic component of AOT (assuming a unit value of AOT).
This uncertainty would then be passed along to any satellite-derived estimates of anthropogenic aerosol radiative effects.
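The MODIS+OMI screening described above can be sketched as a simple consistency check (the pixel records and field layout are hypothetical; the AI > 0.5 threshold is the one quoted in the abstract):

```python
# Hypothetical per-pixel records: (aerosol type label, MODIS FMF, OMI AI).
pixels = [
    ("SU", 0.78, 0.05),  # sulfate: non-absorbing, near-zero AI -> consistent
    ("SU", 0.75, 1.20),  # labeled sulfate but strongly absorbing -> screened out
    ("DU", 0.49, 1.10),  # dust: absorbing, AI > 0.5 -> consistent
    ("DU", 0.50, 0.02),  # labeled dust but non-absorbing -> screened out
]

AI_THRESHOLD = 0.5                    # threshold quoted in the abstract
ABSORBING_TYPES = {"DU", "BC", "OC"}  # dust and carbonaceous aerosols

def consistent(aerosol_type, ai):
    """Keep a pixel only if its OMI AI matches the expected absorption."""
    return (ai > AI_THRESHOLD) == (aerosol_type in ABSORBING_TYPES)

screened = [p for p in pixels if consistent(p[0], p[2])]
```

Pixels whose UV absorption signal contradicts their assigned type are dropped before the FMF statistics are computed.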
2010: Assessing advances in the assimilation of radar data within a collaborative forecasting-research environment. Weather and Forecasting, 25, 1510–1521, doi:10.1175/2010WAF2222405.1.
The impacts of assimilating radar data and other mesoscale observations in real-time, convection-allowing model forecasts were evaluated during the spring seasons of 2008 and 2009 as part of the Hazardous Weather Testbed Spring Experiment activities. In tests of a prototype continental U.S.-scale forecast system, focusing primarily on regions with active deep convection at the initial time, assimilation of these observations had a positive impact. Daily interrogation of output by teams of modelers, forecasters, and verification experts provided additional insights into the value-added characteristics of the unique assimilation forecasts. This evaluation revealed that the positive effects of the assimilation were greatest during the first 3–6 h of each forecast, appeared to be most pronounced with larger convective systems, and may have been related to a phase lag that sometimes developed when the convective-scale information was not assimilated. These preliminary results are currently being evaluated further using advanced objective verification techniques.
2010: Extracting unique information from high resolution forecast models: Monitoring selected fields and phenomena every time step. Weather and Forecasting, 25, 1536–1542, doi:10.1175/2010WAF2222430.1.
A new strategy for generating and presenting model diagnostic fields from convection-allowing forecast models is introduced. The fields are produced by computing temporal-maximum values for selected diagnostics at each horizontal grid point between scheduled output times. The two-dimensional arrays containing these maximum values are saved at the scheduled output times. The additional fields have minimal impacts on the size of the output files and the computation of most diagnostic quantities can be done very efficiently during integration of the Weather Research and Forecasting Model. Results show that these unique output fields facilitate the examination of features associated with convective storms, which can change dramatically within typical output intervals of 1–3 h.
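The strategy is essentially a running maximum that is reset at each scheduled output time; a minimal sketch with an invented toy diagnostic field (not the WRF implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy diagnostic (e.g. an updraft proxy) on a small grid, one field per model
# time step; files are written only every `output_interval` steps, but the
# maximum is tracked at every step in between.
nsteps, ny, nx = 12, 4, 4
output_interval = 6

running_max = np.full((ny, nx), -np.inf)
saved_fields = []

for step in range(1, nsteps + 1):
    field = 50.0 * rng.random((ny, nx))            # stand-in for a model diagnostic
    running_max = np.maximum(running_max, field)   # updated every time step
    if step % output_interval == 0:                # scheduled output time
        saved_fields.append(running_max.copy())    # small 2-D array, cheap to store
        running_max = np.full((ny, nx), -np.inf)   # reset for the next interval
```

Only one extra 2-D array per diagnostic is carried between output times, which is why the approach adds little to output file size while still capturing short-lived extremes between writes.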
2011: Effects of aerosols on precipitation and hail production in a midlatitude storm as seen from simulations using spectral (bin) microphysics model. Atmospheric Research, 99, 129–146.
2011: Satellite remote sensing and hydrological modeling for flood inundation mapping in Lake Victoria Basin: Implications for hydrologic prediction in ungauged basins. IEEE Transactions on Geoscience and Remote Sensing, 49, 85–95, doi:10.1109/TGRS.2010.2057513.
2010: Rapid-scan super-resolution observations of a cyclic supercell with a dual-polarization WSR-88D. Monthly Weather Review, 138, 3762–3786, doi:10.1175/2010MWR3322.1.
In recent years, there has been widespread interest in collecting and analyzing rapid updates of radar data in severe convective storms. To this end, conventional single-polarization rapid-scan radars and phased array radar systems have been employed in numerous studies. However, rapid updates of dual-polarization radar data in storms are not widely available. For this study, a rapid scanning strategy is developed for the polarimetric prototype research Weather Surveillance Radar-1988 Doppler (WSR-88D) radar in Norman, Oklahoma (KOUN), which emulates the future capabilities of a polarimetric multifunction phased array radar (MPAR). With this strategy, data are collected over an 80° sector with 0.5° azimuthal spacing and 250-m radial resolution (“super resolution”), with 12 elevation angles. Thus, full volume scans over a limited area are collected every 71–73 s.
The scanning strategy was employed on a cyclic nontornadic supercell storm in western Oklahoma on 1 June 2008. The evolution of the polarimetric signatures in the supercell is analyzed. The repetitive pattern of evolution of these polarimetric features is found to be directly tied to the cyclic occlusion process of the low-level mesocyclone. The cycle for each of the polarimetric signatures is presented and described in detail, complete with a microphysical interpretation. In doing so, for the first time the bulk microphysical properties of the storm on small time scales (inferred from polarimetric data) are analyzed. The documented evolution of the polarimetric signatures could be used operationally to aid in the detection and determination of various stages of the low-level mesocyclone occlusion.
2011: Dust Storm over the Black Rock Desert: Larger-scale Dynamic Signatures. Journal of Geophysical Research - D: Atmospheres, 116, 1–23, doi:10.1029/2010JD014784.
A dust storm that originated over the Black Rock Desert (BRD) of northwestern Nevada is investigated. Our primary goal is to more clearly understand the sequence of dynamical processes that generate surface winds responsible for entraining dust from this desert. In addition to reliance on conventional surface and upper-air observations, we make use of reanalysis datasets (NCAR/NCEP and NARR) — blends of primitive equation model forecasts and observations. From these datasets, we obtain the evolution of vertical motion patterns and ageostrophic motions associated with the event. In contrast to earlier studies that have emphasized the importance of indirect transverse circulations about an upper-level jet streak, our results indicate that the transition from indirect to direct circulation across the exit region of upper-level jet streak is central to creation of low-level winds that ablate dust from the desert. It is further argued that the transition of vertical circulation patterns is in response to adjustments to geostrophic imbalance — where the adjustment time scale is the order of 6–9 h. Although unproven, we suggest that antecedent rainfall over the alkali desert two weeks prior to the event was instrumental in lowering the bulk density of sediments and thereby improved the chances for dust ablation. We comprehensively compare/contrast our results with those of earlier investigators, and we present an alternative view of key dynamical signatures in atmospheric flow that portend the likelihood of dust storms over the western United States.
2011: Evaluation of TRIGRS (Transient Rainfall Infiltration and Grid-based Regional Slope-Stability Analysis)’s predictive skill for hurricane-triggered landslides: A case study in Macon County, North Carolina. Natural Hazards, 58, 325–339, doi:10.1007/s11069-010-9670-y.
2011: Lightning development associated with two gigantic jets. Geophysical Research Letters, 38, L12801, doi:10.1029/2011GL047662.
We report observations of two negative polarity gigantic jets sufficiently near very high‐frequency (VHF) lightning mapping networks that the associated lightning characteristics and charge transfer could be investigated. In both cases the gigantic jet‐producing flash began as ordinary intracloud lightning with upper level channels attempting to exit the cloud, and then produced the upward gigantic jet. Neither flash had developed channels to ground, confirming that the major charge transfer during gigantic jets occurred between the cloud and ionosphere. The leader progression of one event was detected at altitudes above 20 km, demonstrating the possibility of detecting and tracking the propagation of negative jets above the cloud with VHF techniques.
2011: Time-expanded sampling for ensemble-based filters: assimilation experiments with real radar observations. Advances in Atmospheric Sciences, 28, 743–757, doi:10.1007/s00376-010-0021-4.
By sampling perturbed state vectors from each ensemble prediction run at properly selected time levels in the vicinity of the analysis time, the recently proposed time-expanded sampling approach can enlarge the ensemble size without increasing the number of prediction runs and, hence, can reduce the computational cost of an ensemble-based filter. In this study, this approach is tested for the first time with real radar data from a tornadic thunderstorm. In particular, four assimilation experiments were performed to test the time-expanded sampling method against the conventional ensemble sampling method used by ensemble-based filters. In these experiments, the ensemble square-root filter (EnSRF) was used with 45 ensemble members generated by the time-expanded sampling and conventional sampling from 15 and 45 prediction runs, respectively, and quality-controlled radar data were compressed into super-observations with properly reduced spatial resolutions to improve the EnSRF performances. The results show that the time-expanded sampling approach not only can reduce the computational cost but also can improve the accuracy of the analysis, especially when the ensemble size is severely limited due to computational constraints for real-radar data assimilation. These potential merits are consistent with those previously demonstrated by assimilation experiments with simulated data.
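The sampling idea described above can be sketched in a few lines: each of N prediction runs contributes its state at three time levels around the analysis time, tripling the effective ensemble size without additional model runs. The array shapes and function name below are illustrative, not from the paper.

```python
import numpy as np

def time_expanded_ensemble(runs, analysis_idx, offset):
    """Build an enlarged ensemble by sampling each prediction run at
    three time levels (t0 - dt, t0, t0 + dt) around the analysis time.

    runs         : array (n_runs, n_times, n_state) of model trajectories
    analysis_idx : time index of the analysis time t0
    offset       : index offset corresponding to the sampling interval dt
    """
    members = []
    for run in runs:
        for idx in (analysis_idx - offset, analysis_idx, analysis_idx + offset):
            members.append(run[idx])
    return np.asarray(members)  # shape (3 * n_runs, n_state)

# toy example: 5 prediction runs, 7 output times, state dimension 4
rng = np.random.default_rng(0)
runs = rng.normal(size=(5, 7, 4))
ens = time_expanded_ensemble(runs, analysis_idx=3, offset=1)
print(ens.shape)  # (15, 4)
```

This mirrors the 15-run/45-member configuration in the abstract: 15 runs sampled at three time levels yield the same ensemble size as 45 conventional runs.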
2011: Physical processes during development of upward leaders from tall structures. Journal of Electrostatics, 69, 97–110.
2011: Indirect and semi-direct aerosol campaign (ISDAC): The impact of Arctic aerosols on clouds. Bulletin of the American Meteorological Society, 92, 183–201, doi:10.1175/2010BAMS2935.1.
A comprehensive dataset of microphysical and radiative properties of aerosols and clouds in the arctic boundary layer in the vicinity of Barrow, Alaska, was collected in April 2008 during the Indirect and Semi-Direct Aerosol Campaign (ISDAC) sponsored by the Department of Energy Atmospheric Radiation Measurement (ARM) and Atmospheric Science Programs. The primary aim of ISDAC was to examine effects of aerosols on clouds that contain both liquid and ice water for clean and polluted environments. ISDAC utilized the ARM permanent observational facilities at Barrow. These include a cloud radar, a polarized micropulse lidar, and an atmospheric emitted radiance interferometer as well as instruments specially deployed for ISDAC measuring aerosol, ice fog, precipitation and spectral shortwave radiation. The National Research Council of Canada Convair-580 flew 27 sorties during ISDAC, collecting data using an unprecedented 42 state-of-the-art cloud and aerosol instruments for more than 100 hours on 12 different days. Data were obtained on a number of days, including above, below and within single-layer stratus on 8 April and 26 April 2008. These data enable a process-oriented understanding of how aerosols affect the microphysical and radiative properties of arctic clouds influenced by different surface conditions and aerosol loads. Observations acquired on a heavily polluted day, 19 April 2008, are enhancing this understanding. Data acquired in cirrus on transit flights between Fairbanks and Barrow are improving our understanding of the performance of cloud probes in ice. Ultimately the ISDAC data will be used to improve the representation of cloud and aerosol processes in models covering a variety of spatial and temporal scales and pollution regimes, and to determine the extent to which long-term surface-based measurements can provide retrievals of aerosols, clouds, precipitation and radiative heating in the Arctic.
2010: Understanding severe weather processes through spatiotemporal relational random forests. Proceedings of 2010 Conference on Intelligent Data Understanding, NASA, 213–227.
2011: Identifying predictive multi-dimensional time series motifs: An application to severe weather prediction. Data Mining and Knowledge Discovery, 22, 232–258.
2011: Using spatiotemporal relational random forests to improve our understanding of severe weather processes. Statistical Analysis and Data Mining, 4, 407–429.
Major severe weather events can cause a significant loss of life and property. We seek to revolutionize our understanding of and our ability to predict such events through the mining of severe weather data. Because weather is inherently a spatiotemporal phenomenon, mining such data requires a model capable of representing and reasoning about complex spatiotemporal dynamics, including temporally and spatially varying attributes and relationships. We introduce an augmented version of the Spatiotemporal Relational Random Forest, which is a random forest that learns with spatiotemporally varying relational data. Our algorithm maintains the strength and performance of random forests but extends their applicability, including the estimation of variable importance, to complex spatiotemporal relational domains. We apply the augmented Spatiotemporal Relational Random Forest to three severe weather data sets. These are: predicting atmospheric turbulence across the continental United States, examining the formation of tornadoes near strong frontal boundaries, and understanding the spatial evolution of drought across the southern plains of the United States. The results on such a wide variety of real-world domains demonstrate the extensive applicability of the Spatiotemporal Relational Random Forest. Our long-term goal is to significantly improve the ability to predict and warn about severe weather events. We expect that the tools and techniques we develop will be applicable to a wide range of complex spatiotemporal phenomena.
2011: Prospects of the WSR-88D Radar for Cloud Studies. Journal of Applied Meteorology and Climatology, 50, 859–872, doi:10.1175/2010JAMC2303.1.
Sounding of clouds with the 10-cm wavelength Weather Surveillance Radar-1988 Doppler (WSR-88D) is discussed. Enhancements to signal processing and volume coverage patterns of the WSR-88D allow observations of a variety of clouds with reflectivities as low as −25 dBZ (at a range of 10 km). The high sensitivity of the WSR-88D, its wide velocity and unambiguous range intervals, and the absence of attenuation allow accurate measurements of the reflectivity factor, Doppler velocity, and spectrum width fields in clouds to ranges of about 50 km. Fields of polarimetric variables in clouds, observed with a research polarimetric WSR-88D, demonstrate an abundance of information and help to resolve Bragg and particulate scatter. The scanning, Doppler, and polarimetric capabilities of the WSR-88D allow real-time, three-dimensional mapping of cloud processes, such as transformations of hydrometeors between liquid and ice phases. Pockets of high differential reflectivities are frequently observed in clouds; maximal values of differential reflectivity exceed 8 dB, far above the level observed in rain. The establishment of a WSR-88D network of 157 polarimetric radars will make it possible to collect cloud data at any radar site, making the network a potentially powerful tool for climatic studies.
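The quoted sensitivity of −25 dBZ at 10 km can be scaled to other ranges using the r² dependence of the weather radar equation (minimum detectable reflectivity rises by 20 dB per decade of range). A minimal sketch, with function name and reference values taken from the abstract:

```python
import math

def min_detectable_dbz(range_km, ref_dbz=-25.0, ref_range_km=10.0):
    """Minimum detectable reflectivity at a given range, scaled from a
    reference sensitivity via the r^2 range dependence of the radar
    equation (+20 dB per decade of range)."""
    return ref_dbz + 20.0 * math.log10(range_km / ref_range_km)

print(round(min_detectable_dbz(50.0), 1))  # -11.0
```

At the 50-km limit cited for accurate cloud measurements, the sensitivity is thus roughly −11 dBZ, still low enough for many nonprecipitating clouds.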
2011: Assimilation of surface-based boundary-layer profiler observations during a cool season observation system simulation experiment. Part 1: Analysis impact. Monthly Weather Review, 139, 2309–2326, doi:10.1175/2011MWR3622.1.
In this study, an Observing System Simulation Experiment was used to examine how the assimilation of temperature, water vapor, and wind profiles from a potential array of ground-based remote sensing boundary layer profiling instruments impacts the accuracy of atmospheric analyses when using an ensemble Kalman filter data assimilation system. Remote sensing systems evaluated during this study include the Doppler wind lidar (DWL), Raman lidar (RAM), microwave radiometer (MWR), and the Atmospheric Emitted Radiance Interferometer (AERI). The case study tracked the evolution of several extratropical weather systems that occurred across the contiguous United States during 7–8 January 2008. Overall, the results demonstrate that using networks of high-quality temperature, wind, and moisture profile observations of the lower troposphere has the potential to improve the accuracy of wintertime atmospheric analyses over land. The impact of each profiling system was greatest in the lower and middle troposphere on the variables observed or retrieved by that instrument; however, some minor improvements also occurred in the unobserved variables and in the upper troposphere, particularly when RAM observations were assimilated. The best analysis overall was achieved when DWL wind profiles and temperature and moisture observations from the RAM, AERI, or MWR were assimilated simultaneously, which illustrates that both mass and momentum observations are necessary to improve the analysis accuracy.
2011: Observations of the 10 May 2010 tornado outbreak using OU-PRIME: Potential for new science with high-resolution polarimetric radar. Bulletin of the American Meteorological Society, 92, 871–891, doi:10.1175/2011BAMS3125.1.
A tornado outbreak occurred in central Oklahoma on 10 May 2010, including two tornadoes with enhanced Fujita scale ratings of 4 (EF-4). Tragically, three deaths were reported along with significant property damage. Several strong and violent tornadoes occurred near Norman, Oklahoma, which is a major hub for severe storms research and is arguably one of the best observed regions of the country with multiple Doppler radars operated by both the federal government and the University of Oklahoma (OU). One of the most recent additions to the radars in Norman is the high-resolution OU Polarimetric Radar for Innovations in Meteorology and Engineering (OU-PRIME). As the name implies, the radar is used as a platform for research and education in both science and engineering studies using polarimetric radar. To facilitate usage of the system by students and faculty, OU-PRIME was constructed adjacent to the National Weather Center building on the OU research campus. On 10 May 2010, several tornadoes formed near the campus while OU researchers were operating OU-PRIME in a sector scanning mode, providing polarimetric radar data with unprecedented resolution and quality. In this article, the environmental conditions leading to the 10 May 2010 outbreak will be described, an overview of OU-PRIME will be provided, and several examples of the data and possible applications will be summarized. These examples will highlight supercell polarimetric signatures during and after tornadogenesis, and they will describe how the polarimetric signatures are related to observations of reflectivity and velocity.
2011: Comparison between low-flash and non-lightning-producing convective areas within a mature mesoscale convective system. Weather and Forecasting, 26, 468–486, doi:10.1175/WAF-D-10-05012.1.
Two small multicellular convective areas within a larger mesoscale convective system that occurred on 20 June 2004 were examined to assess vertical motion, radar reflectivity, and dual-polarimetric signatures between flash-producing and non-flash-producing convection. Both of the convective areas had similar life cycles and general structures. Yet, one case produced two flashes, one of which may have been a cloud-to-ground flash, while the other convective area produced no flashes. The non-lightning-producing case had a higher peak reflectivity up to 6 km. Hence, if a reflectivity-based threshold were used as a precursor to lightning, it would have yielded misleading results. The peak upward motion in the mixed-phase region for both cases was 8 m/s or less. However, the lightning-producing storm contained a wider region where the updraft exceeded 5 m/s. Consistent with the broader updraft region, the lightning-producing case exhibited a distinct graupel signature over a broader region than the non-lightning-producing convection. Slight differences in vertical velocity affected the quantity of graupel present in the mixed-phase region, thereby providing the subtle differences in polarimetric signatures that were associated with lightning activity. If the results here are generally applicable, then graupel volume may be a better precursor to a lightning flash than radar reflectivity. With the dual-polarimetric upgrade to the national observing radar network, it should be possible to better distinguish between lightning- and non-lightning-producing areas in weak convective systems that pose a potential safety hazard to the public.
2012: A dual-wavelength polarimetric analysis of the 16 May 2010 Oklahoma City extreme hailstorm. Monthly Weather Review, 140, 1385–1403.
A comparative analysis of a supercell hailstorm using simultaneous observations with S-band and C-band polarimetric radars supported by abundant ground truth reports is presented in this study. The storm occurred on 16 May 2010 and produced a swath of extremely damaging hail across a large portion of the Oklahoma City metro area. Hail sizes over 10 cm in diameter and hail drifts upwards of 1.5 m in height were reported. Both S-band (KOUN) and C-band (OU-PRIME) polarimetric radars in Norman, OK sampled the storm at ranges less than 60 km, so that high-resolution dual-wavelength polarimetric data were obtained. Among the issues investigated in the study are the relation of hail size measured at the surface to the polarimetric signatures at both wavelengths, the difference between polarimetric signatures at the two wavelengths of hail aloft and near the surface (where melting hail is mixed with rain), and the three-body scattering signature (TBSS) associated with large hail.
2011: Assessment of forecasts during persistent valley cold pools in the Bonneville Basin by the North American Mesoscale model. Weather and Forecasting, 26, 447–467.
The North American Mesoscale (NAM) model forecasts of low-level temperature and dewpoint during persistent valley cold pools in the Bonneville Basin of Utah are assessed. Stations near the east sidewall have a daytime cold and nighttime warm bias. This is due to a poor representation of the steep slopes on this side of the basin. Basin stations where the terrain is better represented by the model have a distinct warm, moist bias at night. Stations in snow-covered areas have a cold bias for both day and night. Biases are not dependent on forecast lead or validation time. Several potential causes for the various errors are considered in a series of sensitivity experiments. An experiment with 4-km grid spacing, which better resolves the gradient of the slopes on the east side of the basin, yields smaller errors along the east corridor of the basin. The NAM model assumes all soil water freezes at a temperature of 273 K. This is likely not representative of the freezing temperature in the salt flats in the western part of the basin, since salt reduces the freezing point of water. An experiment testing this hypothesis shows that reducing the freezing point of soil water in the salt flats leads to an average error reduction between 1.5 and 4 K, depending on the station and time of day. Using a planetary boundary layer scheme that has greater mixing alleviates the cold bias over snow somewhat, but the exact source of this bias could not be determined.
2010: Multifunction phased‐array radar: time balance scheduler for adaptive weather sensing. Journal of Atmospheric and Oceanic Technology, 27, 1854–1867.
2011: Polarimetric radar observation operator for a cloud model with spectral microphysics. Journal of Applied Meteorology and Climatology, 50, 873–894.
2011: The Analysis and Prediction of the 8–9 May 2007 Oklahoma Tornadic Mesoscale Convective System by Assimilating WSR-88D and CASA Radar Data Using 3DVAR. Monthly Weather Review, 139, 224–246, doi:10.1175/2010MWR3336.1.
The Advanced Regional Prediction System (ARPS) model is employed to perform high-resolution numerical simulations of a mesoscale convective system (MCS) and associated cyclonic line-end vortex (LEV) that spawned several tornadoes in central Oklahoma on 8–9 May 2007. The simulation uses a 1000 km × 1000 km domain with 2-km horizontal grid spacing. The ARPS three-dimensional variational data assimilation (3DVAR) is used to assimilate a variety of data types. All experiments assimilate routine surface and upper-air observations as well as wind profiler and Oklahoma Mesonet data over a 1-h assimilation window. A subset of experiments assimilates radar data. Cloud and hydrometeor fields as well as in-cloud temperature are adjusted based on radar reflectivity data through the ARPS complex cloud analysis procedure. Radar data are assimilated from the Weather Surveillance Radar-1988 Doppler (WSR-88D) network as well as from the Engineering Research Center for Collaborative and Adaptive Sensing of the Atmosphere (CASA) network of four X-band Doppler radars. Three-hour forecasts are launched at the end of the assimilation window. The structure and evolution of the forecast MCS and LEV are markedly better throughout the forecast period in experiments in which radar data are assimilated. The assimilation of CASA radar data in addition to WSR-88D data increases the structural detail of the modeled squall line and MCS at the end of the assimilation window, which appears to yield a slightly better forecast track of the LEV.
2011: Real-time, rapidly updating severe weather products for virtual globes. Computers and Geosciences, 37, 3–12, doi:10.1016/j.cageo.2010.03.023.
It is critical that weather forecasters are able to put severe weather information from a variety of observational and modeling platforms into a geographic context so that warning information can be effectively conveyed to the public, emergency managers, and disaster response teams. The availability of standards for the specification and transport of virtual globe data products has made it possible to generate spatially precise, geo-referenced images and to distribute these centrally created products via a web server to a wide audience. In this paper, we describe the data and methods for enabling severe weather threat analysis information inside a KML framework. The method of generating severe weather diagnosis products and translating them to KML and image files is described. We illustrate some of the practical applications of these data when they are integrated into a virtual globe display. The availability of standards for interoperable virtual globe clients has not completely alleviated the need for custom solutions. We conclude by pointing out several of the limitations of the general-purpose virtual globe clients currently available.
2011: Downwelling infrared radiance temperature climatology for the Atmospheric Radiation Measurement Southern Great Plains site. Journal of Geophysical Research - D: Atmospheres, 116, D08212, doi:10.1029/2010JD015135.
Fourteen years of data from the Atmospheric Emitted Radiance Interferometer were used to determine the distribution of downwelling infrared radiance at 10 microns at the Department of Energy’s Atmospheric Radiation Measurement (ARM) site in north central Oklahoma. A neural network classification algorithm was applied to each infrared radiance observation to separate clear‐sky from cloudy conditions, with the latter being separated into two broad categories using a simple threshold on the downwelling radiance temperature to separate “thick” from “thin” clouds. This distribution shows a prominent trimodal character. The mode associated with the highest downwelling radiance values is associated with thick low‐altitude opaque clouds, whereas the other two modes each contain both clear‐sky and thin cloud samples. The distribution of the downwelling radiance in each classification is qualitatively similar for each year in the larger data set. A strong seasonal dependence is seen in the distribution of the radiance in the three classifications, with the clear‐sky classification being well correlated with the seasonal distribution of precipitable water vapor.
2010: Quantitative assessment of climate change and human impacts on long-term hydrologic response: a case study in a sub-basin of the Yellow River, China. International Journal of Climatology, 30, 2130–2137, doi:10.1002/joc.2023.
2011: The coupled routing and excess storage (CREST) distributed hydrological model. Hydrological Sciences Journal, 56, 84–98, doi:10.1080/02626667.2010.543087.
2010: Quasi-decadal spectral peaks of tropical western Pacific SSTs as a precursor for tropical cyclone threat. Geophysical Research Letters, 37, doi:10.1029/2010GL044709.
2011: Cross validation of spaceborne radar and ground polarimetric radar aided by polarimetric echo classification of hydrometeor types. Journal of Applied Meteorology and Climatology, 50, 1389–1402, doi:10.1175/2011JAMC2622.1.
2011: Characteristics of Sonoran Desert Microbursts. Weather and Forecasting, 26, 94–108, doi:10.1175/2010WAF2222388.1.
During the 2008 North American monsoon season, 140 microburst events were identified in Phoenix, Arizona, and the surrounding Sonoran Desert. The Sonoran microbursts were examined for their frequency and characteristics, based on data collected from three Doppler radars and electrical power infrastructure damage reports. Sonoran microburst events were wet microbursts and occurred most frequently in the evening hours (19–21 MST). Stronger maximum differential velocities (20–25 m/s) were observed more frequently in Sonoran microbursts than in many previously documented microbursts. Alignment of Doppler radar data to reports of wind-related damage to electrical power infrastructure in Phoenix allowed a comparison of microburst wind damage versus gust-front wind damage. For these damage reports, microburst winds caused more significant damage than gust-front winds.
2011: A new parametric model of vortex tangential-wind profiles: Development, testing, and verification. Journal of the Atmospheric Sciences, 68, 990–1006, doi:10.1175/2011JAS3588.1.
A new parametric model of vortex tangential-wind profiles is presented that is primarily designed to depict realistic-looking tangential wind profiles such as those in intense atmospheric vortices arising in dust devils, waterspouts, tornadoes, mesocyclones, and tropical cyclones. The profile employs five key parameters: maximum tangential wind, radius of maximum tangential wind, and three power-law exponents that shape different portions of the velocity profile. In particular, a new parameter is included that controls how broadly or sharply the profile peaks in the annular zone of maximum tangential velocity. Different combinations of the model parameters are varied to investigate and understand their effects on the physical behaviors of the tangential wind and corresponding vertical vorticity profiles. Additionally, the parametric tangential velocity and vorticity profiles compare favorably to those of an idealized Rankine model and also those of a theoretical stagnant core vortex model in which no tangential velocity exists within a core boundary and a potential flow occurs outside the core. Furthermore, the parametric profiles are evaluated against and compared to those of two other idealized vortex models (Burgers-Rott and Sullivan). The comparative profiles indicate very good agreement, with low root-mean-square errors of a few tenths of a meter per second and correlation coefficients of nearly one. Thus, the veracity of the parametric model is demonstrated.
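The paper's five-parameter profile is not reproduced in the abstract, but the Rankine model it is benchmarked against is standard: solid-body rotation inside the radius of maximum wind and 1/r (potential-flow) decay outside. A minimal sketch of that reference profile, with illustrative parameter values:

```python
import numpy as np

def rankine_tangential_wind(r, v_max, r_max):
    """Idealized Rankine vortex: tangential wind grows linearly to v_max
    at r_max (solid-body rotation), then decays as 1/r outside the core."""
    r = np.asarray(r, dtype=float)
    inner = v_max * r / r_max                               # solid-body core
    outer = v_max * r_max / np.where(r == 0.0, np.inf, r)   # potential flow
    return np.where(r <= r_max, inner, outer)

r = np.array([0.0, 50.0, 100.0, 200.0])  # radii, m (illustrative)
v = rankine_tangential_wind(r, v_max=60.0, r_max=100.0)
print(v)  # [ 0. 30. 60. 30.]
```

The parametric model in the paper generalizes this shape, replacing the sharp Rankine peak with an adjustable, smoothly rounded maximum.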
2011: A VAD-based dealiasing method for radar velocity data quality control. Journal of Atmospheric and Oceanic Technology, 28, 50–62.
This paper describes a new VAD-based dealiasing method developed for automated radar radial-velocity data quality control to satisfy the high quality standard and efficiency required by operational radar data assimilation. The method is built on an alias-robust velocity azimuth display (AR-VAD) analysis. It upgrades and simplifies the previous three-step dealiasing method in three major aspects. First, the AR-VAD is used with sufficiently stringent threshold conditions in place of the original modified VAD for the preliminary reference check to produce alias-free seed data in the first step. Second, the AR-VAD is more accurate than the traditional VAD for the refined reference check in the original second step, so the original second step becomes unnecessary and is removed. Third, a block-to-point continuity check procedure is developed, in place of the point-to-point continuity check in the original third step, to enhance the use of available seed data in a properly enlarged block area around each flagged data point that is being checked with multiple threshold conditions to avoid false dealiasing. The new method has been tested extensively with aliased radial-velocity data collected under various weather conditions, including hurricane high-wind conditions. The robustness of the new method is exemplified by the result tested with a hurricane case. The limitations of the new method and possible improvements are discussed.
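The VAD analysis underlying the method fits a sinusoid in azimuth to the radial velocities on a scan circle. A plain least-squares version (ignoring the alias-robust extensions that distinguish AR-VAD) recovers the mean horizontal wind; the function name and synthetic data below are illustrative, not from the paper.

```python
import numpy as np

def vad_fit(azimuth_deg, vr, elev_deg=0.5):
    """Least-squares VAD fit of Vr = a0 + u*sin(az)*cos(el) + v*cos(az)*cos(el).
    Returns the (u, v) mean-wind estimate; a0 absorbs divergence/fall-speed terms."""
    az = np.radians(azimuth_deg)
    ce = np.cos(np.radians(elev_deg))
    A = np.column_stack([np.ones_like(az), np.sin(az) * ce, np.cos(az) * ce])
    coef, *_ = np.linalg.lstsq(A, vr, rcond=None)
    return coef[1], coef[2]

# synthetic scan circle for a uniform wind u = 10 m/s, v = -5 m/s
az = np.arange(0.0, 360.0, 5.0)
ce = np.cos(np.radians(0.5))
vr = 10.0 * np.sin(np.radians(az)) * ce - 5.0 * np.cos(np.radians(az)) * ce
u, v = vad_fit(az, vr)
print(round(u, 2), round(v, 2))  # 10.0 -5.0
```

In the dealiasing context, such a fitted wind profile supplies the alias-free reference against which observed velocities are checked and unfolded.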
2011: Measuring information content from observations for data assimilation: Spectral formulations and their implications to observational data compression. Tellus, 63A, 793–804.
The previous singular-value formulations for measuring information content from observations are transformed into spectral forms in the wavenumber space for univariate analyses of uniformly distributed observations. The transformed spectral formulations exhibit the following advantages over their counterpart singular-value formulations: (i) The information contents from densely distributed observations can be calculated very efficiently even if the background and observation space dimensions become both too large to compute by using the singular-value formulations. (ii) The information contents and their asymptotic properties can be analyzed explicitly for each wavenumber. (iii) Super-observations can be not only constructed by a truncated spectral expansion of the original observations with zero or minimum loss of information but also explicitly related to the original observations in the physical space. The spectral formulations reveal that (i) uniformly thinning densely distributed observations will always cause a loss of information and (ii) compressing densely distributed observations into properly coarsened super-observations by local averaging may cause no loss of information under certain circumstances.
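The local-averaging compression in conclusion (ii) can be illustrated with non-overlapping block means over a one-dimensional observation array; the block size and names here are illustrative, not from the paper.

```python
import numpy as np

def superob_average(obs, block):
    """Compress densely spaced 1-D observations into super-observations
    by non-overlapping local (block) averaging."""
    n = (len(obs) // block) * block          # drop any incomplete trailing block
    return obs[:n].reshape(-1, block).mean(axis=1)

dense = np.arange(12.0)                      # 12 raw observations
coarse = superob_average(dense, block=3)     # 4 super-observations
print(coarse)  # [ 1.  4.  7. 10.]
```

Under the paper's spectral view, such averaging preserves the information carried by the resolvable low wavenumbers, whereas simply discarding (thinning) every other observation always loses information.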
2011: Retrievals of cloud optical depth and effective radius from thin-cloud rotating shadowband radiometer measurements. Journal of Geophysical Research - D: Atmospheres, 116, D23208, 1–9, doi:10.1029/2011JD016192.
A Thin-Cloud Rotating Shadowband Radiometer (TCRSR) was developed and deployed in a field test at the Atmospheric Radiation Measurement Climate Research Facility’s Southern Great Plains site. The TCRSR measures the forward-scattering lobe of the direct solar beam (i.e., the solar aureole) through an optically thin cloud (optical depth < 8). We applied the retrieval algorithm of Min and Duan (2005) to the TCRSR measurements of the solar aureole to derive simultaneously the cloud optical depth (COD) and cloud drop effective radius (DER), subsequently inferring the cloud liquid-water path (LWP). After careful calibration and preprocessing, our results indicate that the TCRSR is able to retrieve simultaneously these three properties for optically thin water clouds. Colocated instruments, such as the MultiFilter Rotating Shadowband Radiometer (MFRSR), atmospheric emitted radiance interferometer (AERI), and Microwave Radiometer (MWR), are used to evaluate our retrieval results. The relative difference between retrieved CODs from the TCRSR and those from the MFRSR is less than 5%. The distribution of retrieved LWPs from the TCRSR is similar to those from the MWR and AERI. The differences between the TCRSR-based retrieved DERs and those from the AERI are apparent in some time periods, and the uncertainties of the DER retrievals are discussed in detail in this article.
2011: Polarimetric Phased Array Radar for Weather Measurement: A Planar or Cylindrical Configuration? Journal of Atmospheric and Oceanic Technology, 28, 63–73.
This paper suggests a cylindrical configuration for agile beam polarimetric phased-array radar (PPAR) for weather surveillance. The most often used array configuration for PAR is a planar array antenna. The planar configuration, however, has significant deficiencies for polarimetric measurements, as well as other limitations, such as increases in beamwidth, decreases of sensitivity, and changes in the polarization basis when the beam scans off its broadside. The cylindrical polarimetric phased-array radar (CPPAR) is proposed to avoid these deficiencies. The CPPAR principle and potential performance are demonstrated through theoretical analysis and simulation. It is shown that the CPPAR has the advantage of a scan-invariant polarization basis, and thus avoids the inherent limitations of the planar PPAR (i.e., PPPAR).
2011: Multipatterns of the National Weather Radar Testbed Mitigate Clutter Received via Sidelobes. Journal of Atmospheric and Oceanic Technology, 28, 401–409, doi:10.1175/2010JTECHA1453.1.
The Phased Array Radar (PAR) of the National Weather Radar Testbed (NWRT) has a unique hybrid (mechanical and electrical) azimuth scan capability, allowing weather observations with different antenna patterns. Observations show that the standard deviation of the sample mean power of weather echoes received through the main lobe of a set of squinted beams is less than that of clutter received via sidelobes. This allows use of a multipattern technique to cancel sidelobe echoes from moving scatterers, echoes that cannot be filtered with a ground-clutter canceller. Although the multipattern technique was developed to cancel clutter received through sidelobes, results show that clutter from objects moving within the beam can also be canceled.
2011: Winter precipitation microphysics characterized by polarimetric radar and video disdrometer observations in central Oklahoma. Journal of Applied Meteorology and Climatology, 50, 1558–1570.
The study of precipitation in different phases is important to understanding the physical processes that occur in storms, as well as to improving their representation in numerical weather prediction models. A 2D video disdrometer was deployed about 30 km from a polarimetric weather radar (KOUN) in Norman, Oklahoma, to observe winter precipitation events during the 2006/07 winter season. These events contained periods of rain, snow, and mixed-phase precipitation. Five-minute particle size distributions were generated from the disdrometer data and fitted to a gamma distribution; polarimetric radar variables were also calculated for comparison with KOUN data. It is found that snow density adjustment improves the comparison substantially, indicating the importance of accounting for the density variability in representing model microphysics.
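A gamma fit of the kind described above is often done by the method of moments. The sketch below is an illustrative reconstruction, not the authors' code: it recovers the gamma DSD parameters (N0, mu, Lambda) of N(D) = N0 D^mu exp(-Lambda D) from the 2nd, 4th, and 6th moments, one common choice of moment triplet.

```python
import math

def gamma_dsd_moments(n0, mu, lam, orders=(2, 4, 6)):
    # Analytic moments of the gamma DSD N(D) = n0 * D**mu * exp(-lam*D):
    # M_n = n0 * Gamma(mu + n + 1) / lam**(mu + n + 1)
    return [n0 * math.gamma(mu + n + 1) / lam ** (mu + n + 1) for n in orders]

def fit_gamma_from_moments(m2, m4, m6):
    """Recover (n0, mu, lam) from the 2nd, 4th, and 6th DSD moments."""
    # Moment ratio G = M4^2/(M2*M6) = (mu+3)(mu+4) / ((mu+5)(mu+6))
    g = m4 ** 2 / (m2 * m6)
    # Rearranged: (g-1)*mu^2 + (11g-7)*mu + (30g-12) = 0; take the physical root
    a, b, c = g - 1.0, 11.0 * g - 7.0, 30.0 * g - 12.0
    mu = (-b - math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    lam = math.sqrt((mu + 3.0) * (mu + 4.0) * m2 / m4)
    n0 = m2 * lam ** (mu + 3.0) / math.gamma(mu + 3.0)
    return n0, mu, lam
```

Feeding the analytic moments of a known DSD back through the fit recovers its parameters, which is a quick self-consistency check before applying the fit to noisy 5-min disdrometer spectra.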
2010: A real-time algorithm for the correction of bright band effects in radar-derived precipitation estimation. J. Hydrometeorology, 11, 1157–1171.
The bright band (BB) is a layer of enhanced reflectivity due to melting of aggregated snow and ice crystals. The locally high reflectivity causes significant overestimation in radar precipitation estimates if an appropriate correction is not applied. The main objective of the current study is to develop a method that automatically corrects for large errors due to BB effects in a real-time national radar quantitative precipitation estimation (QPE) product. An approach that combines the mean apparent vertical profile of reflectivity (VPR) computed from a volume scan of radar reflectivity observations and an idealized linear VPR model was used for computational efficiency. The methodology was tested for eight events from different regions and seasons in the United States. The VPR correction was found to be effective and robust in reducing overestimation errors in radar-derived QPE, and the corrected radar precipitation fields showed physically continuous distributions. The correction worked consistently well for radars in flat land regions because of the relatively uniform spatial distributions of the BB in those areas. For radars in mountainous regions, the performance of the correction is mixed because of limited radar visibility in addition to large spatial variations of the vertical precipitation structure due to underlying topography.
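The idealized linear VPR correction can be sketched as follows. This is a minimal illustration under assumed parameters, not the operational algorithm: the profile is taken as flat below the BB bottom, rising linearly to a peak enhancement at the BB center, then falling linearly to a deficit at the BB top; the correction subtracts the apparent enhancement at the beam height from the observed reflectivity.

```python
def idealized_vpr(h, bb_bottom, bb_peak, bb_top, peak_db, top_db):
    """Idealized piecewise-linear VPR, in dB relative to the near-surface
    reference. h and the bb_* heights are in km; peak_db is the BB
    enhancement, top_db the deficit above the BB (both hypothetical)."""
    if h <= bb_bottom:
        return 0.0                       # below the melting layer: no bias
    if h <= bb_peak:                     # rising limb into the BB peak
        return peak_db * (h - bb_bottom) / (bb_peak - bb_bottom)
    if h <= bb_top:                      # falling limb above the peak
        return peak_db + (top_db - peak_db) * (h - bb_peak) / (bb_top - bb_peak)
    return top_db                        # snow region above the BB

def correct_dbz(dbz_obs, beam_height_km, vpr_params):
    """Remove the apparent VPR bias from an observed reflectivity (dBZ)."""
    return dbz_obs - idealized_vpr(beam_height_km, *vpr_params)
```

With a hypothetical BB between 2.5 and 3.5 km peaking at +6 dB, a 40-dBZ observation at the BB center would be corrected down to 34 dBZ before rain-rate conversion, removing the overestimation the abstract describes.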
2011: Spatial verification using a true metric. Atmospheric Research, 102, 408–419.
Verifying high-resolution forecasts is challenging because forecasts can be considered good by their end-users even when there is no pixel-to-pixel correspondence between the forecast and the verification field. Many of the verification methods that have been proposed to address the verification of high-resolution forecasts are based on filtering, warping or searching within a neighborhood of pixels in the forecast and/or the verification fields in order to retain the capability to use a simple metric. This is because it is necessary for a verification score to be a metric to allow comparisons of forecasts. In this paper, we devise a computationally simple scalar spatial verification metric that is capable of ordering forecasts without preprocessing the fields. The metric is based on the insight that in the verification problem, the observation field can be considered a reference field that forecast fields are ordered against. This new metric is demonstrated on synthetic and real model forecasts of precipitation.
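The requirement that a verification score be a true metric can be made concrete with a brute-force check of the metric axioms (non-negativity, identity, symmetry, triangle inequality) over a set of fields. The Euclidean distance below is a simple stand-in, not the metric proposed in the paper; the point is that any score passing this check induces a consistent ordering of forecasts against the observation field.

```python
import math
import itertools

def euclid(a, b):
    # Euclidean distance between two fields flattened to value lists
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_metric(dist, fields, tol=1e-9):
    """Brute-force test of the metric axioms on a finite set of fields."""
    for a in fields:
        if abs(dist(a, a)) > tol:                      # identity
            return False
    for a, b, c in itertools.product(fields, repeat=3):
        if dist(a, b) < -tol:                          # non-negativity
            return False
        if abs(dist(a, b) - dist(b, a)) > tol:         # symmetry
            return False
        if dist(a, c) > dist(a, b) + dist(b, c) + tol:  # triangle inequality
            return False
    return True
```

Given an observation field `obs`, forecasts can then be ranked by `dist(obs, fcst)` without any filtering or warping preprocessing.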
2010: Bias in differential reflectivity due to cross-coupling through the radiation patterns of polarimetric weather radars. Journal of Atmospheric and Oceanic Technology, 49, 1624–1637.
Examined is bias in differential reflectivity and its effect on estimates of rain rate due to coupling of the vertically and horizontally polarized fields through the radiation patterns. To that end, a brief review of the effects of the bias on quantitative rainfall measurements is given. Suggestions for tolerable values of this bias are made. Of utmost interest is the bias produced by radars simultaneously transmitting horizontally and vertically polarized fields, as this configuration has been chosen for pending upgrades to the U.S. national network of radars (Weather Surveillance Radar-1988 Doppler; WSR-88D). The bias strongly depends on the cross-polar radiation pattern. Two patterns, documented in the literature, are considered.
2011: Bias correction and Doppler measurement for polarimetric phased-array radar. IEEE Transactions on Geoscience and Remote Sensing, 49, 843–853.
This paper discusses ways to avoid and/or mitigate biases in polarimetric variables inherent to agile-beam planar phased-array radars. Two bias-avoiding schemes produce unbiased estimates of the polarimetric backscattering covariance matrix, which are then combined into bias-free polarimetric variables. One concerns full polarimetric measurements and calls for adjusting the amplitudes and phases of the array elements so that the transmitted field equals that generated by a mechanically steered polarimetric weather radar antenna; this is followed by an additional adjustment of the received fields. The second scheme is also applicable to full polarimetric measurements but involves adjustments only of the received fields. Crucial to both schemes is decoupling of the Doppler effects from the terms of the covariance matrix. This decoupling is a significant part of the bias issue that had not previously been addressed. A scheme to reduce bias applicable to nondepolarizing media (i.e., diagonal backscattering matrix) is also addressed; it calls for multiplication of the fields received by each dipole, as opposed to the combination of multiplication and addition required for full correction. The schemes are applied to the alternate transmission and simultaneous reception polarimetric mode and the simultaneous transmission and simultaneous reception mode.
FY 2010 — 66 publications
2010: Multiple-scattering in radar systems: a review. J. Quantitative Spectroscopy and Radiative Transfer, 111, 917–947.
Although extensively studied within the lidar community, the multiple scattering phenomenon has always been considered a rare curiosity by radar meteorologists. Until a few years ago its appearance had only been associated with two- or three-body-scattering features (e.g., hail flares and mirror images) involving highly reflective surfaces.
Recent atmospheric research aimed at a better understanding of the water cycle and the role played by clouds and precipitation in affecting the Earth's climate has driven the deployment in space of high-frequency radars, like the TRMM 13.5-GHz, the CloudSat 94-GHz, the upcoming EarthCARE 94-GHz, and the GPM dual 13–35-GHz radars. These systems have the potential to detect the vertical distribution of hydrometeors and thus provide crucial feedback for radiation and climate studies. The shift towards higher frequencies increases the sensitivity to hydrometeors, improves the spatial resolution, and reduces the size and weight of the radar systems. On the other hand, higher-frequency radars are affected by stronger extinction, especially in the presence of large precipitating particles (e.g., raindrops or hail), which may eventually drive the signal below the minimum detection threshold. In such circumstances the interpretation of the radar equation via the single-scattering approximation may be erroneous if the radiation emitted from the radar, after interacting more than once with the medium, still contributes substantially to the received power. This is the case if the transport mean free path becomes comparable with the instrument footprint (determined by the antenna beamwidth and the platform altitude).
This situation resembles what has already been experienced in lidar observations, but with a predominance of wide- versus small-angle scattering events. At millimeter wavelengths, radiation is diffused by hydrometeors relatively isotropically compared to the visible or near-infrared region, where scattering is predominantly in the forward direction. A complete understanding of radiation transport modeling and data analysis methods under wide-angle multiple scattering conditions is mandatory for a correct interpretation of echoes observed by spaceborne millimeter radars.
This paper reviews the status of research in this field. Different numerical techniques currently implemented to account for higher-order scattering are reviewed and their weaknesses and strengths highlighted. Examples of simulated radar backscattering profiles are provided, with particular emphasis given to situations in which the multiple scattering contributions become comparable to or overwhelm the single scattering signal. We show evidence of multiple scattering effects from airborne and CloudSat observations, i.e., unique signatures which cannot be explained within the framework of single scattering theory. Ideas on how to identify and tackle multiple scattering effects are discussed. Finally, perspectives and suggestions for future work are outlined.
This work represents a reference guide for studies focused on modeling the radiation transport and on interpreting data from high-frequency spaceborne radar systems that probe highly opaque scattering media, such as thick ice clouds or precipitating clouds.
2010: An assessment of the absolute accuracy of the Atmospheric Infrared Sounder v5 precipitable water vapor product at tropical, mid-latitude, and Arctic ground-truth sites: September 2002 through August 2008. Journal of Geophysical Research - D: Atmospheres, 115, D17310, doi:10.1029/2009JD013139.
The Atmospheric Infrared Sounder (AIRS) is the first of a series of satellite sensors that exploit high spectral resolution and broad spectral coverage of the midinfrared to improve the retrieval accuracy of passive infrared sounding. The AIRS atmospheric retrieval goals are to obtain 1 K accuracy for 1 km layers below 100 mb for temperature and 10% for 2 km layers for water vapor in clear and most cloud conditions. The AIRS total column precipitable water vapor (PWV) is obtained by integrating the vertical profile of water vapor mixing ratio derived from cloud‐cleared radiances. The accuracy goal of the AIRS PWV product is 5%. This paper provides a validation of the AIRS PWV product at three distinct climate sites over the nearly full range of total water amounts observed on Earth (between 0.1 and 6.5 cm). Six years (September 2002 to August 2008) of AIRS v5 retrievals of PWV are evaluated against ground‐based microwave radiometer (MWR) data at three Department of Energy Atmospheric Radiation Measurement program (ARM) sites. The accuracy of the MWR PWV retrieval is estimated to be between 1% and 3%. This study shows that the agreement between the MWR and AIRS retrievals of PWV is within 5% at all three ARM climate sites for most conditions.
The notable exceptions are (1) very dry cases (PWV < 1 cm) over the Southern Great Plains (SGP) land site during both daytime and nighttime, where AIRS is too moist by 15%–30%, and (2) nighttime observations over the SGP land site for PWV > 1 cm, where AIRS is too dry by about 10%. The moist bias for low water amounts (usually observed during the winter) over land could be a surface emissivity–related error, since very little bias is seen at the ARM Arctic site for similar water amounts. The cause of the dry bias at nighttime over land for moderate water amounts is not determined by this study. However, a spatial map of the diurnal bias in monthly AIRS water amount suggests that this effect is related to meteorological conditions in the U.S. Great Plains, which in the summertime are characterized by a moist boundary layer. The diurnal error in AIRS PWV at the SGP site seen with respect to the MWR data is confirmed by PWV amounts derived from a coincident ground-based GPS receiver.
2010: A case study on the impact of moisture variability on convection initiation using radar refractivity retrievals. Journal of Applied Meteorology and Climatology, 49, 1766–1778.
A case study illustrating the impact of moisture variability on convection initiation in a synoptically active environment without strong moisture gradients is presented. The preconvective environment on 30 April 2007 nearly satisfied the three conditions for convection initiation: moisture, instability, and a low-level lifting mechanism. However, a sounding analysis showed that a low-level inversion layer and high LFC would prevent convection initiation because the convective updraft velocities required to overcome the convective inhibition (CIN) were much higher than updraft velocities typically observed in convergence zones. Radar refractivity retrievals from the Twin Lakes, Oklahoma (KTLX), Weather Surveillance Radar-1988 Doppler (WSR-88D) showed a moisture pool contributing up to a 2°C increase in dewpoint temperature where the initial storm-scale convergence was observed. The analysis of the storm-relative wind field revealed that the developing storm ingested the higher moisture associated with the moisture pool. Sounding analyses showed that the moisture pool reduced or nearly eliminated CIN, lowered the LFC by about 500 m, and increased CAPE by 2.5 times. Thus, these small-scale moisture changes increased the likelihood of convection initiation within the moisture pool by creating a more favorable thermodynamic environment. The results suggest that refractivity data could improve convection initiation forecasts by assessing moisture variability at finer scales than the current observation network.
2010: Evaluation of Distributed Collaborative Adaptive Sensing for Detection of Low-Level Circulations and Implications for Severe Weather Warning Operations. Weather and Forecasting, 25, 173–189, doi:10.1175/2009WAF2222233.1.
The Center for Collaborative Adaptive Sensing of the Atmosphere (CASA) is a multiyear engineering research center established by the National Science Foundation for the development of small, inexpensive, low-power radars designed to improve the scanning of the lowest levels (<3 km AGL) of the atmosphere. Instead of sensing autonomously, CASA radars are designed to operate as a network, collectively adapting to the changing needs of end users and the environment; this network approach to scanning is known as distributed collaborative adaptive sensing (DCAS). DCAS optimizes the low-level volume coverage scanning and maximizes the utility of each scanning cycle. A test bed of four prototype CASA radars was deployed in southwestern Oklahoma in 2006 and operated continuously while in DCAS mode from March through June of 2007.
This paper analyzes three convective events observed during April–May 2007, during CASA’s intense operation period (IOP), with a special focus on evaluating the benefits and weaknesses of CASA radar system deployment and DCAS scanning strategy of detecting and tracking low-level circulations. Data collected from nearby Weather Surveillance Radar-1988 Doppler (WSR-88D) and CASA radars are compared for mesocyclones, misocyclones, and low-level vortices. Initial results indicate that the dense, overlapping coverage at low levels provided by the CASA radars and the high temporal (60 s) resolution provided by DCAS give forecasters more detailed feature continuity and tracking. Moreover, the CASA system is able to resolve a whole class of circulations—misocyclones—far better than the WSR-88Ds. In fact, many of these are probably missed completely by the WSR-88D. The impacts of this increased detail on severe weather warnings are under investigation. Ongoing efforts include enhancing the DCAS data quality and scanning strategy, improving the DCAS data visualization, and developing a robust infrastructure to better support forecast and warning operations.
2010: Polarimetric radar rain estimation through retrieval of drop size distribution using Bayesian approach. Journal of Applied Meteorology and Climatology, 49, 973–990, doi:10.1175/2009JAMC2227.1.
This study proposes a Bayesian approach to retrieve raindrop size distributions (DSDs) and to estimate rainfall rates from radar reflectivity in horizontal polarization ZH and differential reflectivity ZDR. With this approach, the authors apply a constrained-gamma model with an updated constraining relation to retrieve DSD parameters. Long-term DSD measurements made in central Oklahoma by the two-dimensional video disdrometer (2DVD) are first used to construct a prior probability density function (PDF) of DSD parameters, which are estimated using truncated gamma fits to the second, fourth, and sixth moments of the distributions. The forward models of ZH and ZDR are then developed based on a T-matrix calculation of raindrop backscattering amplitude with the assumption of drop shape. The conditional PDF of ZH and ZDR is assumed to be a bivariate normal function with appropriate standard deviations. The Bayesian algorithm has a good performance according to the evaluation with simulated ZH and ZDR. The algorithm is also tested on S-band radar data for a mesoscale convective system that passed over central Oklahoma on 13 May 2005. Retrievals of rainfall rates and 1-h rain accumulations are compared with in situ measurements from one 2DVD and six Oklahoma Mesonet rain gauges, located at distances of 28–54 km from Norman, Oklahoma. Results show that the rain estimates from the retrieval agree well with the in situ measurements, demonstrating the validity of the Bayesian retrieval algorithm.
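The Bayesian machinery above can be sketched with a discrete parameter grid: each candidate DSD parameter set is weighted by its prior density times a Gaussian likelihood of the observed (ZH, ZDR), mirroring the bivariate-normal conditional PDF assumed in the paper, and the posterior mean is read off the weights. The forward model in the test is a hypothetical linear stand-in for the T-matrix calculation, used only to exercise the code.

```python
import math

def bayes_posterior_mean(obs, grid, forward, prior, sigmas):
    """Grid-based Bayesian retrieval: posterior mean of the parameter vector.
    obs: observed (zh, zdr); grid: list of parameter tuples;
    forward(p) -> predicted (zh, zdr); prior(p) -> prior density;
    sigmas: observation-error standard deviations for (zh, zdr)."""
    weights = []
    for p in grid:
        pred = forward(p)
        # Gaussian (bivariate-normal, diagonal) likelihood of the observations
        ll = math.exp(-0.5 * sum(((o - m) / s) ** 2
                                 for o, m, s in zip(obs, pred, sigmas)))
        weights.append(ll * prior(p))
    total = sum(weights)
    ndim = len(grid[0])
    return tuple(sum(w * p[i] for w, p in zip(weights, grid)) / total
                 for i in range(ndim))
```

With a flat prior and a well-resolved grid, the posterior mean collapses onto the parameters whose forward-modeled (ZH, ZDR) match the observations; an informative prior built from the 2DVD climatology would pull the estimate toward climatologically likely DSDs, which is the role of the prior PDF in the paper.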
2010: Convection-allowing and Convection-parameterizing ensemble forecasts of a mesoscale convective vortex and associated severe weather environment. Weather and Forecasting, 25, 1052–1081.
2010: Evaluation of WRF model output for severe weather forecasting from the 2008 NOAA Hazardous Weather Testbed Spring Experiment. Weather and Forecasting, 25, 408–427, doi:10.1175/2009WAF2222258.1.
This study assesses forecasts of the preconvective and near-storm environments from the convection-allowing models run for the 2008 National Oceanic and Atmospheric Administration (NOAA) Hazardous Weather Testbed (HWT) spring experiment. Evaluating the performance of convection-allowing models (CAMs) is important for encouraging their appropriate use and development for both research and operations. Systematic errors in the CAM forecasts included a cold bias in mean 2-m and 850-hPa temperatures over most of the United States and smaller than observed vertical wind shear and 850-hPa moisture over the high plains. The placement of airmass boundaries was similar in forecasts from the CAMs and the operational North American Mesoscale (NAM) model that provided the initial and boundary conditions. This correspondence contributed to similar characteristics for spatial and temporal mean error patterns. However, substantial errors were found in the CAM forecasts away from airmass boundaries. The result is that the deterministic CAMs do not predict the environment as well as the NAM. It is suggested that parameterized processes used at convection-allowing grid lengths, particularly in the boundary layer, may be contributing to these errors. It is also shown that mean forecasts from an ensemble of CAMs were substantially more accurate than forecasts from deterministic CAMs. If the improvement seen in the CAM forecasts when going from a deterministic framework to an ensemble framework is comparable to improvements in mesoscale model forecasts when going from a deterministic to an ensemble framework, then an ensemble of mesoscale model forecasts could predict the environment even better than an ensemble of CAMs. Therefore, it is suggested that the combination of mesoscale (convection parameterizing) and CAM configurations is an appropriate avenue to explore for optimizing the use of limited computer resources for severe weather forecasting applications.
2010: Environmental Factors in the Upscale Growth and Longevity of MCSs Derived from Rapid Update Cycle Analyses. Monthly Weather Review, 138, 3514–3539.
Composite environments of mesoscale convective systems (MCSs) are produced from Rapid Update Cycle (RUC) analyses to explore the differences between rapidly- and slowly-developing MCSs as well as the differences ahead of long-lived and short-lived MCSs. The composite analyses capture the synoptic-scale features known to be associated with MCSs and depict the inertial oscillation of the nocturnal low-level jet (LLJ), which remains strong but tends to veer away from decaying MCSs. The composite first-storms environment for the rapidly-developing MCSs contains a stronger LLJ located closer to the first storms region, much more conditional instability, potential instability, and energy available for downdrafts, smaller 3 – 10 km vertical wind shear, and smaller geostrophic potential vorticity in the upper troposphere, when compared to the environment for the slowly-developing MCSs. The weaker shear above 3 km for the rapidly-developing MCSs is consistent with supercell or discrete cell modes being less likely in weaker deep layer shear and the greater potential for a cold pool to trigger convection when the shear is confined to lower levels. Furthermore, these results suggest that low values of upper-level potential vorticity may signal a rapid transition to an MCS. The composite environment ahead of the genesis of long-lived MCSs contains a broader LLJ, a better-defined frontal zone, stronger low-level frontogenesis, deeper moisture and stronger wind shear above 2 km, when compared to short-lived MCSs. The larger shear above 2 km for the long-lived MCSs is consistent with the importance of shear elevated above the ground to help organize and maintain convection that feeds on the elevated unstable parcels after dark and is indicative of the enhanced baroclinicity ahead of the MCSs.
2010: Revisiting the 3-4 April 1974 super outbreak of tornadoes. Weather and Forecasting, 25, doi:10.1175/2009WAF2222297.1.
The Super Outbreak of tornadoes over the central and eastern United States on 3–4 April 1974 remains the most outstanding severe convective weather episode on record in the continental United States. The outbreak far surpassed previous and succeeding events in severity, longevity, and extent. In this paper, surface, upper-air, radar, and satellite data are used to provide an updated synoptic and subsynoptic overview of the event. Emphasis is placed on identifying the major factors that contributed to the development of the three main convective bands associated with the outbreak, and on identifying the conditions that may have contributed to the outstanding number of intense and long-lasting tornadoes. Selected output from a 29-km, 50-layer version of the Eta forecast model, a version similar to that available operationally in the mid-1990s, also is presented to help depict the evolution of thermodynamic stability during the event.
2010: Comparison of Evaporation and Cold Pool Development between Single-Moment and Multimoment Bulk Microphysics Schemes in Idealized Simulations of Tornadic Thunderstorms. Monthly Weather Review, 138, doi:10.1175/2009MWR2956.1.
Idealized simulations of the 3 May 1999 Oklahoma tornadic supercell storms are conducted at various horizontal grid spacings ranging from 1 km to 250 m, using a sounding extracted from a prior 3-km grid spacing real-data simulation. A sophisticated multimoment bulk microphysics parameterization scheme capable of predicting up to three moments of the particle or drop size distribution (DSD) for several liquid and ice hydrometeor species is evaluated and compared with traditional single-moment schemes. The emphasis is placed on the impact of microphysics, specifically rain evaporation and size sorting, on cold pool strength and structure, and on the overall reflectivity structure of the simulated storms. It is shown through microphysics budget analyses and examination of specific processes within the low-level downdraft regions that the multimoment scheme has important advantages, which lead to a weaker and smaller cold pool and better reflectivity structure, particularly in the forward-flank region of the simulated supercells. Specifically, the improved treatment of evaporation and size sorting, and their effects on the predicted rain DSDs by the multimoment scheme helps to control the cold bias often found in the simulations using typical single-moment schemes. The multimoment results are more consistent with observed (from both fixed and mobile mesonet platforms) thermodynamic conditions within the cold pools of the discrete supercells of the 3 May 1999 outbreak.
2010: A far-infrared radiative closure study in the Arctic: Application to water vapor. Journal of Geophysical Research - D: Atmospheres, 115, D17106, doi:10.1029/2009JD012968.
Far‐infrared (lambda > 15.0 mm) (far‐IR) radiative processes provide a large fraction of Earth’s outgoing longwave radiation and influence upper tropospheric vertical motion. Water vapor, because of its abundance and strong absorption properties over an extended spectral range, is the primary source of these radiative processes. Historically, the lack of spectrally resolved radiometric instruments and the opacity of the lower atmosphere have precluded extensive studies of far‐IR water vapor absorption properties. The U.S. Department of Energy Atmospheric Radiation Measurement (ARM) program has organized a series of field experiments, the Radiative Heating in Underexplored Bands Campaigns (RHUBC), to address this deficiency. The first phase of RHUBC took place in 2007 at the ARM North Slope of Alaska Climate Research Facility. Measurements taken before and during this campaign have provided the basis for a clear‐sky radiative closure study aimed at reducing key uncertainties associated with far‐IR radiative transfer models. Extended‐range Atmospheric Emitted Radiance Interferometer infrared radiance observations taken in clear sky conditions were compared against calculations from the Line‐By‐Line Radiative Transfer Model. The water vapor column amounts used in these calculations were retrieved from 183 GHz radiometer measurements. The uncertainty in these integrated water vapor retrievals is approximately 2%, a notable improvement over past studies. This far‐IR radiative closure study resulted in an improvement to the Mlawer‐Tobin Clough‐Kneiyzs‐Davies (MT_CKD) water vapor foreign continuum model and updates to numerous, far‐IR water vapor line parameters from their values in the circa 2006 version of the HITRAN molecular line parameter database.
2010: Use of novel lightning data and advanced modeling approaches to predict maritime cyclogenesis. Bulletin of the American Meteorological Society, 91, 1091–1093, doi:10.1175/2010BAMS2926.1.
2010: An Empirical Latent Heat Flux Parameterization for the Noah Land Surface Model. Journal of Applied Meteorology and Climatology, 49, 1696–1713, doi:10.1175/2010JAMC2180.1.
Proper partitioning of the surface energy fluxes that drive the evolution of the planetary boundary layer in numerical weather prediction models requires an accurate representation of initial land surface conditions. Unfortunately, soil temperature and moisture observations are unavailable in most areas, and routine daily estimates of vegetation coverage and biomass are not easily available. This gap in observational capabilities seriously hampers the evaluation and improvement of land surface parameterizations, since model errors likely relate to improper initial conditions as much as to inaccuracies in the parameterizations. Two unique datasets help to overcome these difficulties. First, 1-km fractional vegetation coverage and leaf area index values can be derived from biweekly maximum normalized difference vegetation index composites obtained from daily observations by the Advanced Very High Resolution Radiometer onboard NOAA satellites. Second, the Oklahoma Mesonet supplies multiple soil temperature and moisture measurements at various soil depths each hour. Combined, these two datasets provide significantly improved initial conditions for a land surface model and allow an evaluation of the accuracy of the land surface model with much greater confidence than previously. Forecasts that both include and neglect these unique land surface observations are used to evaluate the value of these two data sources to land surface initializations. The dense network of surface observations afforded by the Oklahoma Mesonet, including surface flux data derived from special sensors, provides verification of the model results, which indicate that predicted latent heat fluxes still differ from observations by as much as 150 W m-2. This result provides a springboard for assessing parameterization errors within the model. A new empirical parameterization developed using principal-component regression reveals simple relationships between latent heat flux and other surface observations. Periods of very dry conditions observed across Oklahoma are used advantageously to derive a parameterization for evaporation from bare soil. Combining this parameterization with an empirical canopy transpiration scheme yields improved sensible and latent heat flux forecasts and better partitioning of the surface energy budget. Surface temperature and mixing ratio forecasts show improvement when compared with observations.
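Principal-component regression of the kind used above can be sketched in a few lines. This is a generic illustration, assuming standardized predictors and ordinary least squares on the leading principal components; it is not the Noah parameterization itself, and the variable names are placeholders.

```python
import numpy as np

def pcr_fit_predict(X, y, n_comp):
    """Principal-component regression: regress y on the leading n_comp
    principal components of the predictor matrix X (n_samples x n_features)."""
    # Standardize predictors so each column has zero mean, unit variance
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    # Eigendecomposition of the predictor covariance matrix
    evals, evecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    order = np.argsort(evals)[::-1][:n_comp]   # leading components first
    T = Z @ evecs[:, order]                    # principal-component scores
    # Ordinary least squares of y on [1, T]
    A = np.column_stack([np.ones(len(T)), T])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[0] + T @ coef[1:]
```

Retaining only the first few components discards near-collinear directions among the surface observations, which is the usual motivation for PCR over a direct multiple regression.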
2009: Evaluation of incremental improvements to quantitative precipitation estimates in complex terrain. J. Hydrometeor, 10, 1507–1520, doi:10.1175/2009JHM1125.1.
2010: Intercomparison of rainfall estimates from radar, satellite, gauge, and combinations for a season of record rainfall. Journal of Applied Meteorology and Climatology, 49, 437–452.
2010: Impacts of polarimetric radar observations on hydrologic simulation. J. Hydrometeor, 11, 781–796.
2009: Radar Refractivity Retrievals in Oklahoma: Insights into Operational Benefits and Limitations. Weather and Forecasting, 24, 1345–1361, doi:10.1175/2009WAF2222256.1.
The 2007 and 2008 spring refractivity experiments at KTLX investigated the potential utility of high-resolution, near-surface refractivity measurements to operational forecasting. During these experiments, forecasters at the Norman, Oklahoma, National Weather Service Forecast Office (NWSFO) assessed refractivity and scan-to-scan refractivity change fields retrieved from the Weather Surveillance Radar-1988 Doppler weather radar near Oklahoma City—Twin Lakes, Oklahoma (KTLX). Both quantitative and qualitative analysis methods were used to analyze the 41 responses from seven forecasters to a questionnaire designed to measure the impact of refractivity fields on forecast operations. The analysis revealed that forecasts benefited from the refractivity fields on 25% of the days included in the evaluation. In each of these cases, the refractivity fields provided complementary information that somewhat enhanced the forecasters’ capability to analyze the near-surface environment and boosted their confidence in moisture trends. A case in point was the ability to track a retreating dryline after its location was obscured by a weak reflectivity bloom caused by biological scatterers. Forecasters unanimously agreed, however, that the impact of this complementary information on their forecasts was too insignificant to justify its addition as an operational dataset. The applicability of these findings to other NWSFOs may be limited to locations with similar weather situations and access to surface data networks like the Oklahoma Mesonet.
2009: The Use of Coherency to Improve Signal Detection in Dual-Polarization Weather Radars. Journal of Atmospheric and Oceanic Technology, 26, 2474–2487, doi:10.1175/2009JTECHA1154.1.
Currently, signal detection and censoring in operational weather radars are performed using thresholds of estimated signal-to-noise ratio (SNR) and/or the magnitude of the autocorrelation coefficient at the first temporal lag. The growing popularity of polarimetric radars prompts the quest for improved detection schemes that take advantage of the signals from the two orthogonally polarized electric fields. A hybrid approach is developed based on the sum of the cross-correlation estimates as well as the powers and autocorrelations from each of the dual-polarization returns. The hypothesis that "signal is present" is accepted if the sum exceeds a predetermined threshold; otherwise, the data are considered to represent noise and are censored. The threshold is determined by the acceptable rate of false detections, which must be less than or equal to a preset value. The scheme is evaluated both in simulations and through implementation on time series data collected by the research weather surveillance radar (KOUN) in Norman, Oklahoma.
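The decision rule described in the abstract above can be sketched in a few lines. The following is a minimal numpy illustration of the general idea only (sum of per-channel lag-1 autocorrelation magnitudes plus the lag-0 cross-correlation magnitude, compared against a threshold), not the paper's exact estimator; the function names and normalizations are our assumptions.

```python
import numpy as np

def detection_statistic(v_h, v_v):
    """Hybrid statistic: lag-1 autocorrelation magnitudes from each channel
    plus the lag-0 cross-correlation magnitude between the two channels.
    v_h, v_v: complex sample time series for one range gate (H and V)."""
    n = len(v_h)
    # lag-1 autocorrelation magnitude estimates for each polarization channel
    r1_h = np.abs(np.vdot(v_h[:-1], v_h[1:])) / (n - 1)
    r1_v = np.abs(np.vdot(v_v[:-1], v_v[1:])) / (n - 1)
    # lag-0 cross-correlation magnitude between H and V returns
    c_hv = np.abs(np.vdot(v_h, v_v)) / n
    return r1_h + r1_v + c_hv

def censor(v_h, v_v, threshold):
    """Accept 'signal present' only if the statistic exceeds the threshold;
    otherwise the gate would be censored as noise."""
    return detection_statistic(v_h, v_v) > threshold
```

Because weather echoes are coherent in both channels and correlated between them while noise is not, the statistic separates the two cases far better than a single-channel test at the same false-detection rate.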
2009: The role of upstream mid-tropospheric circulations in Sierra Nevada leeside (spillover) precipitation. Part II: Secondary atmospheric river accompanying a mid-level jet. J. Hydrometeor, 10, 1327–1354.
2010: The impact of evaporation on polarimetric characteristics of rain: Theoretical model and practical implications. Journal of Applied Meteorology and Climatology, 49, 1247–1267.
Soon the National Weather Service WSR-88D radar network will be upgraded with dual-polarization capabilities. It is therefore imperative to understand and identify microphysical processes using the polarimetric variables. Although melting and size sorting of hydrometeors have been investigated, relatively little attention has been devoted to the impact of evaporation on the polarimetric characteristics of rainfall. In this study, a simple explicit bin microphysics one-dimensional rainshaft model is constructed to quantify the impact of evaporation (neglecting the collisional processes) on vertical profiles of polarimetric radar variables in rain. Results of this model are applicable for light to moderate rain (< 10 mm hr-1).
The modeling results indicate that the amount of evaporation that occurs in the subcloud layer is strongly dependent on the initial shape of the drop size distribution (DSD) aloft, which can be assessed with polarimetric measurements. Understanding how radar-estimated rainfall rates may change with height due to evaporation is important for quantitative precipitation estimation, especially in regions far from the radar or in regions of complex terrain where low levels may not be adequately sampled. In addition to quantifying the effects of evaporation, we offer a simple method of estimating the amount of evaporation that occurs in a given environment based on polarimetric radar measurements of reflectivity factor ZH and differential reflectivity ZDR aloft. Such a technique may be useful to operational meteorologists and hydrologists in estimating the amount of precipitation reaching the surface, especially in regions of poor low-level radar coverage such as mountainous regions or locations at large distances from the radar.
2010: Strengths and limitations of current radar systems for two stakeholder groups in the Southern Plains. Bulletin of the American Meteorological Society, 91, 899–910.
Advancements in radar technology since the deployment of the Weather Surveillance Radar-1988 Doppler (WSR-88D) network have prompted consideration of radar replacement technologies. In order for the outcomes of advanced radar research and development to be the most beneficial to users, an understanding of user needs must be established early in the process and considered throughout. As an important early step in addressing this need, this study explored the strengths and limitations of current radar systems for nine participants from two key stakeholder groups: NOAA's NWS and broadcast meteorologists. Critical incident interviews revealed the role of each stakeholder group and elicited stories that exemplified radar strengths and limitations in their respective roles.
NWS forecasters emphasized using radar as an essential tool to assess the current weather situation and communicate hazards to key stakeholder groups. TV broadcasters emphasized adding meaning and value to NWS information and using radar to effectively communicate weather information to viewers. The stories told by our participants vividly illustrated the advancing nature of weather detection with radar, and why there are still issues with weather radar and radar-derived information. Analysis of the stories, which ranged from accounts of severe weather to winter weather, revealed four underlying radar needs: 1) clean, accurate data without intervention, 2) higher spatial- and temporal-resolution data than that provided by the WSR-88D, 3) consistent and low-altitude information, and 4) more accurate information on precipitation type, size, intensity, and distribution.
A supplement to this article is available online.
2009: Data Mining Storm Attributes from Spatial Grids. Journal of Atmospheric and Oceanic Technology, 26, 2353–2365.
A technique to identify storms and capture scalar features within the geographic and temporal extent of the identified storms is described. The identification technique relies on clustering grid points in an observation field to find self-similar and spatially coherent clusters that meet the traditional understanding of what storms are. From these storms, geometric, spatial and temporal features can be extracted. These scalar features can then be data mined to answer many types of research questions in an objective, data-driven manner. This is illustrated by using the technique to answer questions of forecaster skill and lightning predictability.
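The identification step described above amounts to grouping contiguous grid points of an observation field into storm clusters and then extracting scalar attributes from each. The sketch below is a simplified stand-in, assuming a plain threshold plus 4-connected labeling rather than the paper's self-similarity clustering; the function names and the chosen features are illustrative only.

```python
import numpy as np
from collections import deque

def label_storms(field, threshold):
    """Label 4-connected clusters of grid points exceeding `threshold`.
    Returns an integer label array (0 = background) and the cluster count."""
    mask = field >= threshold
    labels = np.zeros(field.shape, dtype=int)
    current = 0
    for i in range(field.shape[0]):
        for j in range(field.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                labels[i, j] = current
                queue = deque([(i, j)])
                while queue:  # breadth-first flood fill of one cluster
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < field.shape[0] and 0 <= nx < field.shape[1]
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = current
                            queue.append((ny, nx))
    return labels, current

def storm_features(field, labels, k):
    """Scalar attributes of storm k: area (grid points), peak value, centroid."""
    ys, xs = np.nonzero(labels == k)
    vals = field[ys, xs]
    return {"area": len(vals), "max": float(vals.max()),
            "centroid": (float(ys.mean()), float(xs.mean()))}
```

Feature tables built this way, one row per identified storm per time step, are what a data-mining step can then operate on in an objective, data-driven manner.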
2010: A Technique to Censor Biological Echoes in Radar Reflectivity Data. Journal of Applied Meteorology and Climatology, 49, 435–462.
Weather radar data are susceptible to several artifacts due to anomalous propagation, ground clutter, electronic interference, sun angle, second-trip echoes, and biological contaminants such as insects, bats, and birds. Several methods of censoring radar reflectivity data have been devised and described in the literature; however, they all rely on analyzing the local texture and vertical profile of reflectivity fields. The local texture of reflectivity fields suffices to remove most artifacts, except for biological echoes, which have proved difficult to remove because they can have the same returned power and vertical profile as stratiform rain or snow. In this paper, we describe a soft-computing technique based on clustering, segmentation, and a two-stage neural network to censor all non-precipitating artifacts in weather radar reflectivity data. We demonstrate that the technique is capable of discriminating among light snow, stratiform rain, and deep biological "bloom".
2010: A Gaussian Mixture Model Approach to Forecast Verification. Weather and Forecasting, 25, 908–920.
Verification methods for high-resolution forecasts have been based either on filtering or on objects created by thresholding the images. The filtering methods do not easily permit the use of deformation, while identifying objects based on thresholds can be problematic. In this paper, we introduce a new approach in which the observed and forecast fields are broken down into a mixture of Gaussians, and the parameters of the Gaussian Mixture Model fit are examined to identify translation, rotation, and scaling errors. We discuss the advantages of this method relative to the traditional filtering or object-based methods and interpret the resulting scores on a standard verification dataset.
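As a toy illustration of the idea, the snippet below fits a single Gaussian to each field by intensity-weighted moments and reads a translation error off the difference of the fitted means. The actual method fits a multi-component mixture via EM; this one-component moment fit is only meant to show how Gaussian parameters map onto displacement (and, via the covariance, scaling/rotation) errors.

```python
import numpy as np

def gaussian_moments(field):
    """Fit one Gaussian to a non-negative field by intensity-weighted moments:
    the mean gives the feature's location, the covariance its size/orientation."""
    ys, xs = np.indices(field.shape)
    w = field / field.sum()                      # normalize to a weight field
    mean = np.array([(w * ys).sum(), (w * xs).sum()])
    dy, dx = ys - mean[0], xs - mean[1]
    cov = np.array([[(w * dy * dy).sum(), (w * dy * dx).sum()],
                    [(w * dy * dx).sum(), (w * dx * dx).sum()]])
    return mean, cov

def translation_error(obs, fcst):
    """Displacement (dy, dx) of the forecast feature relative to the observed."""
    m_obs, _ = gaussian_moments(obs)
    m_fc, _ = gaussian_moments(fcst)
    return m_fc - m_obs
```

With a true mixture fit, the same comparison is made component by component after matching forecast Gaussians to observed ones.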
2010: An Objective Method of Evaluating and Devising Storm Tracking Algorithms. Weather and Forecasting, 25, 721–729.
We introduce a set of easily computable bulk statistics that can be used to directly evaluate the performance of tracking algorithms on specific characteristics. We apply the evaluation method to a diverse set of radar reflectivity data cases and note the characteristic behavior of five different storm tracking algorithms proposed in the literature and now employed in widely used nowcasting systems. Based on this objective evaluation, we devise a storm tracking algorithm that performs consistently and better than any of the previously suggested techniques.
2010: Reaching scientific consensus through a competition. Bulletin of the American Meteorological Society, 91, 1423–1427.
2010: Forward Sensitivity Approach to Dynamic Data Assimilation. Advances in Meteorology, 2010, 1–13, doi:10.1155/2010/375615.
The least squares fit of observations with known error variance to a strong-constraint dynamical model has been developed through use of the time evolution of sensitivity functions – the derivatives of model output with respect to the elements of control (initial conditions, boundary conditions, and physical/empirical parameters). Model error is assumed to stem from incorrect specification of the control elements. The optimal corrections to control are found through solution to an inverse problem. Duality between this method and the standard 4D-Var assimilation using adjoint equations has been proved. The paper ends with an illustrative example based on a simplified version of turbulent heat transfer at the sea/air interface.
2010: Transient luminous events above two mesoscale convective systems: Storm structure and evolution. Journal of Geophysical Research: Space Physics, 115, A00E22, doi:10.1029/2009JA014500.
Two warm‐season mesoscale convective systems (MCSs) were analyzed with respect to their production of transient luminous events (TLEs), mainly sprites. The 20 June 2007 symmetric MCS produced 282 observed TLEs over a 4 h period, during which the storm’s intense convection weakened and its stratiform region strengthened. TLE production corresponded well to convective intensity. The convective elements of the MCS contained normal‐polarity tripole charge structures with upper‐level positive charge (<−40°C), midlevel negative charge (−20°C), and low‐level positive charge near the melting level. In contrast to previous sprite studies, the stratiform charge layer involved in TLE production by parent positive cloud‐to‐ground (+CG) lightning resided at upper levels. This layer was physically connected to upper‐level convective positive charge via a downward sloping pathway. The average altitude discharged by TLE‐parent flashes during TLE activity was 8.2 km above mean sea level (MSL; −25°C). The 9 May 2007 asymmetric MCS produced 25 observed TLEs over a 2 h period, during which the storm’s convection rapidly weakened before recovering later. Unlike 20 June, TLE production was approximately anticorrelated with convective intensity. The 9 May storm, which also had a normal tripole in its convection, best fit the conventional model of low‐altitude positive charge playing the dominant role in sprite production; however, the average altitude discharged during the TLE phase of flashes still was higher than the melting level: 6.1 km MSL (−15°C). Based on these results, it is inferred that sprite production and sprite‐parent positive charge altitude depend on MCS morphology.
2010: Reducing the effects of noise on atmospheric imaging radars using multilag correlation. Radio Science, 45, doi:10.1029/2008RS003989.
2010: Suomi: Pragmatic Visionary. Bulletin of the American Meteorological Society, 91, 561–577, doi:10.1175/2009BAMS2897.1.
The steps on Verner Suomi's path to becoming a research scientist are examined. We argue that his research style – his natural interests in science and engineering, and his methodology in pursuing answers to scientific questions – was developed in his youth on the Iron Range of northeastern Minnesota, as an instructor in the cadet program at the University of Chicago (U of C) during World War II and as a fledgling academician at University of Wisconsin - Madison. We examine several of his early experiments that serve to identify his style. The principal results of the study are: 1) despite austere living conditions on the Iron Range during the Great Depression, Suomi benefitted from excellent industrial arts courses at Eveleth High School; 2) with his gift for designing instruments, his more practical approach to scientific investigation flourished in the company of world-class scientific thinkers at U of C; 3) his dissertation on the heat budget over a cornfield in the mid-1950s served as a springboard for studying the Earth-atmosphere energy balances in the space-age environment of the late 1950s; and 4) his design of radiometers – the so-called ping-pong radiometer and its sequel, the hemispheric bolometer – flew aboard Explorer VI and VII in the late 1950s, and analysis of the radiances from these instruments led to the first accurate estimate of the Earth's mean albedo.
2009: Relationships between Lightning Location and Polarimetric Radar Signatures in a Small Mesoscale Convective System. Monthly Weather Review, 137, 4151–4170, doi:10.1175/2009MWR2860.1.
On 19 June 2004, the Thunderstorm Electrification and Lightning Experiment observed electrical, microphysical, and kinematic properties of a small mesoscale convective system (MCS). The primary observing systems were the Oklahoma Lightning Mapping Array, the KOUN S-band polarimetric radar, two mobile C-band Doppler radars, and balloon-borne electric field meters. During its mature phase, this MCS had a normal tripolar charge structure (lightning involved a midlevel negative charge between an upper and a lower positive charge), and flash rates fluctuated between 80 and 100 flashes per min. Most lightning was initiated within one of two altitude ranges (3-6 km MSL or 7-10 km MSL) and within the 35 dBZ contours of convective cells embedded within the convective line. The properties of two such cells were investigated in detail, the first lasting approximately 40 min and producing only 12 flashes and the second lasting over an hour and producing 105 flashes. In both, lightning was initiated in or near regions containing graupel. The upper lightning initiation region (7-10 km MSL) was near 35-47.5 dBZ contours, with graupel inferred below and ice crystals inferred above. The lower lightning initiation region (3-6 km MSL) was in the upper part of melting or freezing layers, often near differential reflectivity columns extending above the 0 deg C isotherm, which is suggestive of graupel formation. Both lightning initiation regions are consistent with what is expected from the noninductive graupel-ice thunderstorm electrification mechanism, though inductive processes may also have contributed to initiations in the lower region.
2010: Simulated electrification of a small thunderstorm with two-moment bulk microphysics. Journal of the Atmospheric Sciences, 67, 171–194, doi:10.1175/2009JAS2965.1.
Electrification and lightning are simulated for a small continental multicell storm. The results are consistent with observations and thus provide additional understanding of the charging processes and evolution of this storm. The first six observed lightning flashes were all negative cloud-to-ground (CG) flashes, after which intracloud (IC) flashes also occurred between middle and upper levels of the storm. The model simulation reproduces the basic evolution of lightning from low and middle levels to upper levels. The observed lightning indicated an initial charge structure of at least an inverted dipole (negative charge above positive). The simulations show that noninductive charge separation higher in the storm can enhance the main negative charge sufficiently to produce negative CG flashes before upper-level IC flashes commence. The result is a "bottom-heavy" tripole charge structure with midlevel negative charge and a lower positive charge region that is more significant than the upper positive region, in contrast to the traditional tripole structure that has a less significant lower positive charge region. Additionally, the occurrence of cloud-to-ground lightning is not necessarily a result of excess net charge carried by the storm, but is primarily caused by the local potential imbalance between the lowest charge regions.
The two-moment microphysics scheme used for this study predicted mass mixing ratio and number concentration of cloud droplets, rain, ice crystals, snow, and graupel. Bulk particle density of graupel was also predicted, which allows a single category to represent a greater range of particle characteristics. (An additional hail category is available but was not needed for the present study.) The prediction of hydrometeor number concentration is particularly critical for charge separation at higher temperatures (-5 < T < -20 deg C) in the mixed phase region, where ice crystals are produced by rime fracturing (Hallett–Mossop process) and by splintering of freezing drops. Cloud droplet concentration prediction also affected the rates of inductive charge separation between graupel and droplets.
2010: On Sedimentation and Advection in multi-moment bulk microphysics. Journal of the Atmospheric Sciences, 67, 3084–3094.
In two-moment bulk microphysics schemes, the practice of using different weighted fall velocities for the various moments is known to lead to artificial growth in reflectivity values for fast-falling particles, particularly at the downward leading edge of a precipitation column. Two simple correction schemes that prevent these artifacts while still allowing some effects of size sorting are presented. The corrections are obtained by comparing particle number concentrations that result from two or three different sedimentation calculations. The corrections do not conserve particle number concentrations, but they prevent spurious reflectivity growth automatically without the need to place ad hoc limits on mean particle size.
Multi-moment bulk microphysics schemes often have used inconsistent variables in terms of the appropriate advection equation (for example, mass mixing ratio and particle number concentration). A brief review of consistent advection and turbulent mixing for such variables is presented to provide clarification.
2010: Approaches for Compression of Super-Resolution WSR-88D Data. IEEE Geoscience and Remote Sensing Letters, 8, 191–195, doi:10.1109/LGRS.2010.2058089.
2009: Short-Wavelength Technology and the Potential For Distributed Networks of Small Radar Systems. Bulletin of the American Meteorological Society, 90, 1797–1817, doi:10.1175/2009BAMS2507.1.
Dense networks of short-range radars capable of mapping storms and detecting atmospheric hazards are described. Composed of small X-band (9.4 GHz) radars spaced tens of kilometers apart, these networks defeat the Earth curvature blockage that limits today's long-range weather radars and enable observing capabilities fundamentally beyond those of the operational state-of-the-art radars. These capabilities include multiple Doppler observations for mapping horizontal wind vectors, subkilometer spatial resolution, and rapid-update (tens of seconds) observations extending from the boundary layer up to the tops of storms. The small physical size and low-power design of these radars permit the consideration of commercial electronic manufacturing approaches and radar installation on rooftops, communications towers, and other infrastructure elements, leading to cost-effective network deployments. The networks can be architected in such a way that the sampling strategy dynamically responds to changing weather to simultaneously accommodate the data needs of multiple types of end users. Such networks have the potential to supplement, or replace, the physically large long-range civil infrastructure radars in use today.
2009: Polarimetric radar properties of smoke plumes: A model. Journal of Geophysical Research - D: Atmospheres, 114, doi:10.1029/2009JD012647.
Smoke plumes can be recognized with the polarimetric WSR-88D weather radar using the low correlation coefficient between signals in the horizontal and vertical channels. A model that describes radar measurements at 10-cm wavelength is developed. Using the model, it is concluded that the smoke scatterers have a needle-like form. The scatterers flutter with a mean angle of 23-27 deg between the particle's major axis and the horizontal plane. It is inferred that the scatterers have a major-to-minor axis ratio larger than 6.
2009: The severe hazards analysis and verification experiment. Bulletin of the American Meteorological Society, 90, 1519–1530, doi:10.1175/2009BAMS2815.1.
During the springs and summers of 2006 through 2008, scientists from the National Severe Storms Laboratory and students from the University of Oklahoma conducted an enhanced severe-storm verification effort. The primary goal for the Severe Hazards Analysis and Verification Experiment (SHAVE) was the remote collection of high spatial and temporal resolution hail, wind (or wind damage), and flash-flooding reports from severe thunderstorms. This dataset has a much higher temporal and spatial resolution than the traditional storm reports collected by the National Weather Service and published in Storm Data (tens of square kilometers and 1–5 min versus thousands of square kilometers and 30–60 min) and also includes reports of nonsevere storms that are not included in Storm Data. The high resolution of the dataset makes it useful for validating high-resolution, gridded warning guidance applications.
SHAVE is unique not only for the type of data collected and the resolution of that data but also for how the data are collected. The daily operations of the project are largely student led and run. To complete the remote, high-resolution verification, the students use Google Earth to display experimental weather data and geographic information databases, such as digital phonebooks. Using these data, the students then make verification phone calls to residences and businesses, throughout the United States, thought to have been affected by a severe thunderstorm. The present article summarizes the data collection facilities and techniques, discusses applications of these data, and shows comparisons of SHAVE reports to reports currently available from Storm Data.
2010: Polarimetric and electrical characteristics of a lightning ring in a supercell storm. Monthly Weather Review, 138, 2405–2425, doi:10.1175/2009MWR3210.1.
On 30 May 2004, a supercell storm was sampled by a suite of instrumentation that had been deployed as part of the Thunderstorm Electrification and Lightning Experiment (TELEX). The instrumentation included the Oklahoma Lightning Mapping Array (OK-LMA), the National Severe Storms Laboratory S-band Weather Surveillance Radar-1988 Doppler (WSR-88D) polarimetric radar at Norman, Oklahoma, and two mobile C-band, Shared Mobile Atmospheric Research and Teaching Radars (SMART-R). Combined, datasets collected by these instruments provided a unique opportunity to investigate the possible relationships among the supercell’s kinematic, microphysical, and electrical characteristics. This study focuses on the evolution of a ring of lightning activity that formed near the main updraft at approximately 0012 UTC, matured near 0039 UTC, and collapsed near 0050 UTC. During this time period, an F2-intensity tornado occurred near the lightning-ring region. Lightning density contours computed over 1-km layers are overlaid on polarimetric and dual-Doppler data to assess the low- and midlevel kinematic and microphysical characteristics within the lightning-ring region. Results indicate that the lightning ring begins in the middle and upper levels of the precipitation-cascade region, which is characterized by inferred graupel. The second time period shows that the lightning source densities take on a horizontal u-shaped pattern that is collocated with midlevel differential reflectivity and correlation coefficient rings and with the strong cyclonic vertical vorticity noted in the dual-Doppler data. The final time period shows dissipation of the u-shaped pattern and the polarimetric signatures as well as an increase in the lightning activity at the lower levels associated with the development of the rear-flank downdraft (RFD) and the envelopment of the vertical vorticity maximum by the RFD.
2009: Synoptic-scale flow and valley cold pool evolution in the Western United States. Weather and Forecasting, 24, 1625–1643, doi:10.1175/2009WAF2222234.1.
Valley cold pools (VCPs), which are trapped, cold layers of air at the bottoms of basins or valleys, pose a significant problem for forecasters because they can lead to several forms of difficult-to-forecast and hazardous weather such as fog, freezing rain, or poor air quality. Numerical models have historically failed to routinely provide accurate guidance on the formation and demise of VCPs, making the forecast problem more challenging. In some case studies of persistent wintertime VCPs, there is a connection between the movement of upper-level waves and the timing of VCP formation and decay. Herein, a 3-yr climatology of persistent wintertime VCPs for five valleys and basins in the western United States is performed to see how often VCP formation and decay coincides with synoptic-scale (~200-2000 km) wave motions. Valley cold pools are found to form most frequently as an upper-level ridge approaches the western United States and in response to strong midlevel warming. The VCPs usually last as long as the ridge is over the area and usually only end when a trough, and its associated midlevel cooling, move over the western United States. In fact, VCP strength appears to be almost entirely dictated by midlevel temperature changes, which suggests large-scale forcing is dominant for this type of VCP most of the time.
2009: Next-day convection-allowing WRF model guidance: A second look at 2 vs. 4 km grid spacing. Monthly Weather Review, 137, 3351–3372, doi:10.1175/2009MWR2924.1.
During the 2007 NOAA Hazardous Weather Testbed (HWT) Spring Experiment, the Center for Analysis and Prediction of Storms (CAPS) at the University of Oklahoma produced convection-allowing forecasts from a single deterministic 2 km model and a 10-member 4 km resolution ensemble. In this study, the 2 km deterministic output was compared with forecasts from the 4 km ensemble control member. Other than the difference in horizontal resolution, the two sets of forecasts featured identical WRF-ARW configurations, including vertical resolution, forecast domain, initial and lateral boundary conditions, and physical parameterizations. Therefore, forecast disparities were attributed solely to differences in horizontal grid spacing.
This study is a follow-up to similar work that was based on results from the 2005 Spring Experiment. Unlike the 2005 Experiment, however, model configurations were more rigorously controlled in the present study, providing a more robust dataset and a cleaner isolation of the dependence on horizontal resolution. Additionally, in this study, the 2 and 4 km output were compared to 12 km forecasts from the North American Mesoscale (NAM) model.
Model forecasts were analyzed using objective verification of mean hourly precipitation and visual comparison of individual events, primarily during the 21- to 33-hour forecast period to examine the utility of the models as next-day guidance. On average, both the 2 and 4 km model forecasts showed substantial improvement over the 12 km NAM. However, although the 2 km forecasts produced more detailed structures on the smallest resolvable scales, the patterns of convective initiation, evolution, and organization were remarkably similar to the 4 km output. Moreover, on average, metrics such as equitable threat score, frequency bias, and fractions skill score revealed no statistical improvement of the 2 km forecasts compared to the 4 km forecasts. These results, based on the 2007 dataset, corroborate previous findings, suggesting that decreasing horizontal grid spacing from 4 to 2 km provides little added value as next-day guidance for severe convective storm and heavy rain forecasters in the United States.
2010: Toward Improved Convection-Allowing Ensembles: Model Physics Sensitivities and Optimizing Probabilistic Guidance with Small Ensemble Membership. Weather and Forecasting, 25, 263–280, doi:10.1175/2009WAF2222267.1.
During the 2007 NOAA Hazardous Weather Testbed Spring Experiment, the Center for Analysis and Prediction of Storms (CAPS) at the University of Oklahoma produced a daily 10-member 4-km horizontal resolution ensemble forecast covering approximately three-fourths of the continental United States. Each member used the Advanced Research version of the Weather Research and Forecasting (WRF-ARW) model core, which was initialized at 2100 UTC, ran for 33 h, and resolved convection explicitly. Different initial condition (IC), lateral boundary condition (LBC), and physics perturbations were introduced in 4 of the 10 ensemble members, while the remaining 6 members used identical ICs and LBCs, differing only in terms of microphysics (MP) and planetary boundary layer (PBL) parameterizations. This study focuses on precipitation forecasts from the ensemble.
The ensemble forecasts reveal WRF-ARW sensitivity to MP and PBL schemes. For example, over the 7-week experiment, the Mellor–Yamada–Janjić PBL and Ferrier MP parameterizations were associated with relatively high precipitation totals, while members configured with the Thompson MP or Yonsei University PBL scheme produced comparatively less precipitation. Additionally, different approaches for generating probabilistic ensemble guidance are explored. Specifically, a “neighborhood” approach is described and shown to considerably enhance the skill of probabilistic forecasts for precipitation when combined with a traditional technique of producing ensemble probability fields.
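A minimal version of such a neighborhood approach can be written down directly: each grid point credits an ensemble member with a "hit" if the event occurs anywhere within a given radius, and the hits are averaged across members. This is a generic sketch of the technique under our own assumptions (square neighborhood, illustrative names), not the specific CAPS implementation.

```python
import numpy as np

def neighborhood_probability(members, threshold, radius):
    """Ensemble exceedance probability with a square neighborhood:
    a grid point counts a member as a hit if the event (value >= threshold)
    occurs anywhere within `radius` grid points, then hits are averaged
    over all members."""
    prob = np.zeros(members[0].shape)
    ny, nx = members[0].shape
    for m in members:
        exceed = m >= threshold
        hit = np.zeros_like(exceed)
        for i in range(ny):
            for j in range(nx):
                y0, y1 = max(0, i - radius), min(ny, i + radius + 1)
                x0, x1 = max(0, j - radius), min(nx, j + radius + 1)
                hit[i, j] = exceed[y0:y1, x0:x1].any()
        prob += hit
    return prob / len(members)
```

Compared with point-wise ensemble probabilities, the neighborhood field rewards near misses, which is why it tends to improve skill scores for convection-allowing precipitation forecasts.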
2010: Doppler weather radar based nowcasting of Cyclone Ogni. J. Earth Syst. Sci., 119, 183–199.
2010: Satellite-observed cold-ring-shaped features atop deep convective clouds. Atmospheric Research, 97, 80–96, doi:10.1016/j.atmosres.2010.03.009.
This paper focuses on deep convective storms which exhibit a distinct long-lived cold ring at their cloud top, as observed in enhanced infrared (IR) window satellite imagery. The feature seems to be closely linked to a similar phenomenon, cold-U/V (enhanced-V) shape, or in general to storms which exhibit an enclosed warm spot or larger warm area downwind of the overshooting tops, surrounded by colder parts of the storm anvil. While storms exhibiting some form of warm spots seem to be quite common, storms exhibiting distinct cold rings or cold-U/Vs are significantly less frequent. The cold-ring feature is described here for storms which occurred above the Czech Republic and Austria on 25 June 2006. Compared to other cold-ring-shaped storms, this case was extraordinary not only by the magnitude and duration of the cold ring and its central warm spot, but also by storm cloud-top heights, reaching 16-17 km, as determined from ground-based C-band radar observations. The paper also addresses a possible link between cold-ring-shaped storms with those exhibiting a cold-U/V (enhanced-V) feature, indicating (based on model results) that the stratification and wind shear just above the tropopause are key conditions for the cold-ring to exist. The case from 25 June 2006 also shows that the cloud top height, derived from satellite radiances, has significant error when applied to this particular type of storm. Finally, we discuss the potential of the satellite-observed cold-ring feature as an indicator of storm severity.
2010: Spatially Variable Advection Correction of Radar Data. Part I: Theoretical Considerations. Journal of the Atmospheric Sciences, 67, 3445–3456, doi:10.1175/2010JAS3465.1.
Radar data–based analysis products, such as accumulated rainfall maps, dual-Doppler wind syntheses, and thermodynamic retrievals, are prone to substantial error if the temporal sampling interval is too coarse. Techniques to mitigate these errors typically make use of advection-correction procedures (space-to-time conversions) in which the analyzed radial velocity or reflectivity field is idealized as a pattern of unchanging form that translates horizontally at constant speed. The present study is concerned with an advection-correction procedure for the reflectivity field in which the pattern-advection components vary spatially. The analysis is phrased as a variational problem in which errors in the frozen-turbulence constraint are minimized subject to smoothness constraints. The Euler–Lagrange equations for this problem are derived and a solution is proposed in which the trajectories, pattern-advection fields, and reflectivity field are analyzed simultaneously using a combined analytical and numerical procedure. The potential for solution nonuniqueness is explored.
2010: Spatially Variable Advection Correction of Radar Data. Part II: Test Results. Journal of the Atmospheric Sciences, 67, 3457–3470, doi:10.1175/2010JAS3466.1.
The spatially variable advection-correction/analysis procedure introduced in Part I is tested using analytical reflectivity blobs embedded in a solid-body vortex, and Terminal Doppler Weather Radar (TDWR) and Weather Surveillance Radar-1988 Doppler (WSR-88D) data of a tornadic supercell thunderstorm that passed over central Oklahoma on 8 May 2003. In the TDWR tests, plan position indicator (PPI) data at two volume scan times are input to the advection-correction procedure, while PPI data from a third scan time, intermediate between the two input times, are used to validate the results. The procedure yields analyzed reflectivity fields with lower root-mean-square errors and higher correlation coefficients than reflectivity fields advection corrected with any constant advection speed.
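As a rough illustration of the validation metrics used in these tests (root-mean-square error and correlation coefficient between an analyzed field and a withheld observed field), the sketch below computes both for two gridded fields. The array shapes, noise level, and function names are invented for illustration and are not taken from the paper.

```python
import numpy as np

def rmse(analysis, observed):
    """Root-mean-square error between two gridded fields."""
    return float(np.sqrt(np.mean((analysis - observed) ** 2)))

def correlation(analysis, observed):
    """Pearson correlation coefficient between two gridded fields."""
    return float(np.corrcoef(analysis.ravel(), observed.ravel())[0, 1])

# Invented example: a "truth" reflectivity grid and a noisy analysis of it
rng = np.random.default_rng(0)
truth = rng.uniform(0.0, 60.0, size=(100, 100))            # dBZ
analysis = truth + rng.normal(0.0, 2.0, size=truth.shape)  # 2-dBZ noise

print(rmse(analysis, truth))         # near the 2-dBZ noise level
print(correlation(analysis, truth))  # near 1 for a good analysis
```

A better advection correction lowers the first number and raises the second, which is how the constant-advection and spatially variable schemes are compared.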
2009: Convective-scale warn on forecast: A vision for 2020. Bulletin of the American Meteorological Society, 90, 1487–1499, doi:10.1175/2009BAMS2795.1.
The National Oceanic and Atmospheric Administration’s (NOAA’s) National Weather Service (NWS) issues warnings for severe thunderstorms, tornadoes, and flash floods since these phenomena are a threat to life and property. These warnings are presently based upon either visual confirmation of the phenomena or the observational detection of proxy signatures that are largely based upon radar observations. Convective-scale weather warnings are unique in the NWS by having little reliance on direct numerical forecast guidance. Since increasing severe thunderstorm, tornado, and flash flood warning lead times is a key NOAA strategic mission goal designed to reduce the loss of life, injury, and economic costs of these high impact weather phenomena, a new warning paradigm is needed in which numerical model forecasts play a larger role in convective-scale warnings. This new paradigm shifts the warning process from warn-on-detection to warn-on-forecast and has the potential to dramatically increase warning lead times.
A warn-on-forecast system is envisioned as a probabilistic convective-scale ensemble analysis and forecast system that assimilates in-storm observations into a high-resolution convection-resolving model ensemble. The building blocks needed for such a system are presently available and initial research results clearly illustrate the value of radar observations to the production of accurate analyses of convective weather systems and improved forecasts. While a number of scientific and cultural challenges still need to be overcome, the potential benefits are significant. A probabilistic convective-scale warn-on-forecast system is a vision worth pursuing.
2010: Importance of Horizontally Inhomogeneous Environmental Initial Conditions to Ensemble Storm-Scale Radar Data Assimilation and Very Short-Range Forecasts. Monthly Weather Review, 138, 1250–1272, doi:10.1175/2009MWR3027.1.
The assimilation of operational Doppler radar observations into convection-resolving numerical weather prediction models for very short-range forecasting represents a significant scientific and technological challenge. Numerical experiments over the past few years indicate that convective-scale forecasts are sensitive to the details of the data assimilation methodology, the quality of the radar data, the parameterized microphysics, and the storm environment. In this study, the importance of horizontal environmental variability to very short-range (0–1 h) convective-scale ensemble forecasts initialized using Doppler radar observations is investigated for the 4–5 May 2007 Greensburg, Kansas, tornadic thunderstorm event. Radar observations of reflectivity and radial velocity from the operational Doppler radar network at 0230 UTC 5 May 2007, during the time of the first large tornado, are assimilated into each ensemble member using a three-dimensional variational data assimilation system (3DVAR) developed at the Center for Analysis and Prediction of Storms (CAPS). Very short-range forecasts are made using the nonhydrostatic Advanced Regional Prediction System (ARPS) model from each ensemble member and the results are compared with the observations. Explicit three-dimensional environmental variability information is provided to the convective-scale ensemble using analyses from a 30-km mesoscale ensemble data assimilation system. Comparisons between convective-scale ensembles with initial conditions produced by 3DVAR using 1) background fields that are horizontally homogeneous but vertically inhomogeneous (i.e., have different vertical environmental profiles) and 2) background fields that are horizontally and vertically inhomogeneous are undertaken.
Results show that the ensemble with horizontally and vertically inhomogeneous background fields provides better predictions of thunderstorm structure, mesocyclone track, and low-level circulation track than the ensemble with horizontally homogeneous background fields. This suggests that knowledge of horizontal environmental variability is important to successful convective-scale ensemble predictions and needs to be included in real-data experiments.
2009: Unusually high differential attenuation at C-band: Results from a two-year analysis of the French Trappes polarimetric radar data. Journal of Applied Meteorology and Climatology, 48, 2037–2053.
2010: Alternating dual-pulse, dual-frequency techniques for range and velocity ambiguity mitigation on weather radars. Journal of Atmospheric and Oceanic Technology, 27, 1461–1475.
2009: A Statistical Methodology to Discover Precipitation Microclimates in Southeast Louisiana: Implications for Coastal Watersheds. J. Hydrometeorol., 10, 1184–1202, doi:10.1175/2009JHM1076.1.
This study quantifies the spatial distribution of precipitation patterns on an annual basis for southeast Louisiana. To compile a long-term record of 24-h rainfall, rainfall reports collected by National Weather Service (NWS) cooperative observers were gathered from National Climatic Data Center (NCDC) archives, private collections of observational data held at regional and local libraries, NWS offices, and local utility providers. The reports were placed into a digital database in which each station's record was subjected to an extensive quality control process. This process produced a database of daily rainfall reports for 59 south Louisiana stations for the period 1836–2002, with extensive documentation for each site outlining the differences between the study's data and the data available from the NCDC Web page. A statistical methodology was developed to determine whether the four NCDC climate divisions for southeast Louisiana accurately depict average monthly rainfall for the area. This method employs cluster analysis, using Euclidean distance as the measure of dissimilarity. To fill missing rainfall observations, an imputation scheme was developed that uses the two most similar stations (based on Euclidean distance) to determine appropriate values. Results from this testing structure show statistical evidence of precipitation microclimates across south Louisiana at finer spatial scales than those of the NCDC climate zones. Quantifying the spatial extent of daily precipitation and documenting historical trends of precipitation provides critical design information for regional infrastructure within this highly vulnerable area of the central Gulf Coast region.
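The imputation step described above (pick the two most similar stations by Euclidean distance, then use them to fill a missing report) can be sketched roughly as follows. The function names, the use of monthly-mean climatologies as the similarity vector, and the simple averaging of the two donor stations are illustrative assumptions, not the study's exact scheme.

```python
import numpy as np

def two_nearest_stations(target_clim, other_clims):
    """Rank candidate stations by Euclidean distance between rainfall
    climatology vectors; return indices of the two most similar."""
    dists = [np.linalg.norm(target_clim - c) for c in other_clims]
    order = np.argsort(dists)
    return int(order[0]), int(order[1])

def impute_missing(target_series, donor_a, donor_b):
    """Fill missing (NaN) daily reports with the mean of the two most
    similar stations' reports on the same days."""
    filled = target_series.copy()
    gaps = np.isnan(filled)
    filled[gaps] = 0.5 * (donor_a[gaps] + donor_b[gaps])
    return filled

# Invented three-month climatologies for one target and three candidates
target = np.array([3.0, 4.0, 5.0])
candidates = [np.array([3.1, 4.1, 5.1]),   # very similar
              np.array([9.0, 9.0, 9.0]),   # dissimilar
              np.array([3.0, 4.2, 4.9])]   # similar
i, j = two_nearest_stations(target, candidates)
print(i, j)  # indices of the two most similar candidates
```

In the study itself the similarity ranking also drives the cluster analysis; this sketch shows only the gap-filling use.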
2009: Difficulties with Correcting Radar Rainfall Estimates Based on Rain Gauge Data: A Case Study of Severe Weather in Montana on 16–17 June 2007. Weather and Forecasting, 24, 1334–1344.
2010: On the predictability of mesoscale convective systems: Three-dimensional simulations. Monthly Weather Review, 138, 863–885, doi:10.1175/2009MWR2961.1.
Mesoscale convective systems (MCSs) are a dominant climatological feature of the central United States and are responsible for a substantial fraction of warm-season rainfall. Yet very little is known about the predictability of MCSs. To help address this situation, a previous paper by the authors examined a series of ensemble MCS simulations using a two-dimensional version of a storm-scale (dx = 1 km) model. Ensemble member perturbations in the preconvective environment, namely, wind speed, relative humidity, and convective instability, are based on current 24-h forecast errors from the North American Model (NAM). That work is now extended using a full three-dimensional model.
Results from the three-dimensional simulations of the present study resemble those found in two dimensions. The model successfully produces an MCS within 100 km of the location of the control run in around 70% of the ensemble runs using perturbations to the preconvective environment consistent with 24-h forecast errors, while reducing the preconvective environment uncertainty to the level of current analysis errors improves the success rate to nearly 85%. This magnitude of improvement in forecasts of environmental conditions would represent a radical advance in numerical weather prediction. The maximum updraft and surface wind forecast uncertainties are of similar magnitude to their two-dimensional counterparts. However, unlike the two-dimensional simulations, in three dimensions the improvement in the forecast uncertainty of storm features requires the reduction of preconvective environmental uncertainty for all perturbed variables. The MCSs in many of the runs resemble bow echoes, but surface winds associated with these solutions, and the perturbation profiles that produce them, are nearly indistinguishable from the nonbowing solutions, making any conclusions about the bowlike systems difficult.
2010: Ship wave signature at the cloud top of deep convective storms. Atmospheric Research, 97, 294–302, doi:10.1016/j.atmosres.2010.03.015.
We identify certain features atop some thunderstorms observed by meteorological satellites as ship wave-like. A few examples of satellite visible images are shown and the ship wave signature patterns in them are identified and discussed. The presence of ship wave signatures implies the existence of a dynamical mechanism in the storm that behaves like an obstacle to the ambient flow. We use a numerical storm model simulation to show that this mechanism is due to the strong updraft and divergence in the upper part of the storm.
2010: NAM Model forecasts of warm-season quasi-stationary frontal environments in the central United States. Weather and Forecasting, 25, 1281–1292.
2010: The Impact of Assimilating Surface Pressure Observations on Severe Weather Events in a WRF Mesoscale Ensemble System. Monthly Weather Review, 138, 1673–1694, doi:10.1175/2009MWR3042.1.
Surface pressure observations are assimilated into a Weather Research and Forecasting (WRF) ensemble using an ensemble Kalman filter (EnKF) approach and the results are compared with observations for two severe weather events. Several EnKF experiments are performed to evaluate the relative impacts of two very different pressure observations: altimeter setting (a total pressure field) and 1-h surface pressure tendency. The primary objective of this study is to determine the surface pressure observation that is most successful in producing realistic mesoscale features, such as convectively driven cold pools, which often play an important role in future convective development. Results show that ensemble-mean pressure analyses produced from the assimilation of surface temperature, moisture, and winds possess significant errors in regard to mesohigh strength and location. The addition of surface pressure tendency observations within the assimilation yields limited ability to constrain such errors, while the assimilation of altimeter setting yields accurate depictions of the mesoscale pressure patterns associated with mesoscale convective systems. The mesoscale temperature patterns produced by all the ensembles are quite similar and tend to reproduce the observed features. Results suggest that even though surface pressure observations can have large cross covariances with temperature and the wind components, the resulting analyses fail to improve upon the EnKF temperature and wind analyses that exclude the surface pressure observations. Ensemble forecasts following the assimilation period show the potential to improve short-range forecasting of surface pressure.
2009: An unconventional approach for assimilating aliased radar radial velocities. Tellus, 61A, 621–630.
An aliasing operator is introduced to mimic the effect of aliasing that causes discontinuities in radial-velocity observations, and to modify the observation term in the cost function for direct assimilation of aliased radar radial-velocity observations into numerical models. It is found that if the aliasing operator is treated as a part of the observation operator and applied to the analysed radial velocity in a conventional way, then the analysis is not ensured to be aliased (or not aliased) in consistency with the aliased (or not aliased) observation at every observation point. Thus, the analysis-minus-observation term contains a large alias error whenever an inconsistency occurs at an observation point. This causes fine-structure discontinuities in the cost function. An unconventional approach is thus introduced to apply the aliasing operator to the entire analysis-minus-observation term at each observation point in the observation term of the cost function. With this approach, the cost function becomes smooth and concave upwards in the vicinity of the global minimum. The usefulness of this approach for directly assimilating aliased radar radial-velocity observations under certain conditions is demonstrated by illustrative examples.
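The aliasing operator in question folds a velocity into the Nyquist interval, and the unconventional approach applies that folding to the whole analysis-minus-observation difference. A minimal numerical sketch, with an invented Nyquist velocity of 25 m/s and hypothetical function names, contrasts the two formulations of the observation term:

```python
import numpy as np

V_NYQ = 25.0  # m/s; invented Nyquist velocity for this example

def fold(v, v_nyq=V_NYQ):
    """Aliasing operator: fold a velocity into the Nyquist interval
    [-v_nyq, v_nyq)."""
    return (v + v_nyq) % (2.0 * v_nyq) - v_nyq

def obs_term_conventional(v_analysis, v_obs_aliased):
    """Conventional form: alias the analysed velocity, then difference
    it with the aliased observation. When the two fold inconsistently,
    the residual jumps by multiples of 2*V_NYQ."""
    return 0.5 * (fold(v_analysis) - v_obs_aliased) ** 2

def obs_term_unconventional(v_analysis, v_obs_aliased):
    """Unconventional form: fold the entire analysis-minus-observation
    difference, removing the alias jump from the residual."""
    return 0.5 * fold(v_analysis - v_obs_aliased) ** 2

# A true wind of 30 m/s is observed, aliased, as fold(30.0) = -20 m/s.
# For a candidate analysis of 24 m/s (true residual -6 m/s):
print(obs_term_conventional(24.0, fold(30.0)))    # 0.5 * 44**2 = 968.0
print(obs_term_unconventional(24.0, fold(30.0)))  # 0.5 * (-6)**2 = 18.0
```

In the example, the conventional term sees a spurious 44 m/s residual because the analysed 24 m/s is not aliased while the observation is, whereas folding the difference recovers the physically sensible -6 m/s residual; this is why the cost function becomes smooth near the global minimum.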
2009: Bayesian perspective of the unconventional approach for assimilating aliased radar radial velocities. Tellus, 61A, 631–634.
The global minimization problem for directly assimilating aliased radial velocities is derived in terms of Bayesian estimation by folding the domain of the original Gaussian non-aliased observation probability density function (pdf) into the Nyquist interval. By truncating the folded tails of the observation pdf, the observation term in the cost function recovers the aliased observation term formulated previously by an unconventional approach. This establishes the theoretical basis for the unconventional approach and quantifies the involved approximation. The alias-robust radar wind analysis developed based on the unconventional approach is also revisited from the Bayesian perspective.
2010: Fitting VAD wind to aliased Doppler radial-velocity observations – A minimization problem with multiple minima. Quart. J. Roy. Meteor. Soc., 136, 451–461, doi:10.1002/qj.589.
When the horizontal vector wind is estimated by the traditional velocity azimuth display (VAD) analysis from radar radial-velocity observations on a selected range circle, the observations should be thoroughly de-aliased first. When the effect of aliasing is formulated into the cost function, the VAD analysis can be applied to raw aliased radial-velocity observations, but the minimization problem for the VAD fitting is complicated by the multiple local minima caused by the zigzag-discontinuities of the aliasing operator. An efficient two-step VAD algorithm is thus developed in this paper to find the global minimum in properly transformed subspaces of the VAD wind parameters. The algorithm is then extended into a three-step volume velocity processing (VVP) method to estimate the vertical profile of horizontal winds from each volume of radar radial-velocity scans. Examples are presented to illustrate the capability and robustness of the method.
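For reference, the classical VAD fit that the paper generalizes can be written as a small least-squares problem. The sketch below assumes already de-aliased radial velocities on a single range circle with an invented wind and elevation angle, and it does not implement the paper's two-step global-minimum search over aliased data.

```python
import numpy as np

def vad_fit(azimuths_deg, v_r, elev_deg=0.5):
    """Classical least-squares VAD fit of a horizontal wind (u, v) to
    de-aliased radial velocities on one range circle:
        v_r(phi) = (u*sin(phi) + v*cos(phi)) * cos(elev)
    The paper's two-step algorithm instead finds the global minimum
    over raw aliased data; this sketch omits that step."""
    phi = np.radians(azimuths_deg)
    ce = np.cos(np.radians(elev_deg))
    A = np.column_stack([np.sin(phi) * ce, np.cos(phi) * ce])
    (u, v), *_ = np.linalg.lstsq(A, v_r, rcond=None)
    return float(u), float(v)

# Synthetic check with an invented wind of (u, v) = (10, -5) m/s
az = np.arange(0.0, 360.0, 10.0)
ce = np.cos(np.radians(0.5))
vr = (10.0 * np.sin(np.radians(az)) - 5.0 * np.cos(np.radians(az))) * ce
print(vad_fit(az, vr))  # recovers approximately (10.0, -5.0)
```

With aliased observations the residual surface of this fit develops the multiple local minima the paper describes, which is what motivates searching transformed subspaces of the wind parameters.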
2010: A 3.5-Dimensional Variational Method for Doppler Radar Data Assimilation and Its Application to Phased-Array Radar Observations. Advances in Meteorology, 2010, 61–74, doi:10.1155/2010/797265.
A 3.5-dimensional variational method is developed for Doppler radar data assimilation. In this method, incremental analyses are performed in three steps to update the model state upon the background state provided by the model prediction. First, radar radial-velocity observations from three consecutive volume scans are analyzed on the model grid. The analyzed radial-velocity fields are then used in step 2 to produce incremental analyses for the vector velocity fields at two time levels between the three volume scans. The analyzed vector velocity fields are used in step 3 to produce incremental analyses for the thermodynamic fields at the central time level, accompanied by adjustments in water vapor and hydrometeor mixing ratios based on radar reflectivity observations. Finite-element B-spline representations and a recursive filter are used to reduce the dimension of the analysis space and enhance the computational efficiency. The method is applied to a squall-line case observed by the phased-array radar with rapid volume scans at the National Weather Radar Testbed, and is shown to be effective in assimilating the phased-array radar observations and improving the prediction of the subsequent evolution of the squall line.
2010: Modal and Nonmodal Growths of Symmetric Perturbations in Unbounded Domain. Journal of the Atmospheric Sciences, 67, 1996–2017.
2010: Hydrologic evaluation of Multisatellite Precipitation Analysis standard precipitation products in basins beyond its inclined latitude band: A case study in Laohahe Basin, China. Water Resources Research, 46, doi:10.1029/2009WR008965.
2010: Impact of Phased-Array Radar Observations over a Short Assimilation Period: Observing System Simulation Experiments Using an Ensemble Kalman Filter. Monthly Weather Review, 138, 517–538, doi:10.1175/2009MWR2925.1.
The conventional Weather Surveillance Radar-1988 Doppler (WSR-88D) scans a given weather phenomenon in approximately 5 min, and past results suggest that it takes 30–60 min to establish a storm into a model assimilating these data using an ensemble Kalman filter (EnKF) data assimilation technique. Severe weather events, however, can develop and evolve very rapidly. Therefore, assimilating observations for a 30–60-min period prior to the availability of accurate analyses may not be feasible in an operational setting. A shorter assimilation period also is desired if forecasts are produced to increase the warning lead time. With the advent of the emerging phased-array radar (PAR) technology, it is now possible to scan the same weather phenomenon in less than 1 min. Therefore, it is of interest to see if the faster scanning rate of PAR can yield improvements in storm-scale analyses and forecasts from assimilating over a shorter period of time. Observing system simulation experiments are conducted to evaluate the ability to quickly initialize a storm into a numerical model using PAR data in place of WSR-88D data. Synthetic PAR and WSR-88D observations of a splitting supercell storm are created from a storm-scale model run using a realistic volume-averaging technique in native radar coordinates. These synthetic reflectivity and radial velocity observations are assimilated into the same storm-scale model over a 15-min period using an EnKF data assimilation technique followed by a 50-min ensemble forecast. Results indicate that assimilating PAR observations at 1-min intervals over a short 15-min period yields significantly better analyses and ensemble forecasts than those produced using WSR-88D observations. Additional experiments are conducted in which the adaptive scanning capability of PAR is utilized for thunderstorms that are either very close to or far away from the radar location.
Results show that the adaptive scanning capability improves the analyses and forecasts when compared with the nonadaptive PAR data. These results highlight the potential for flexible rapid-scanning PAR observations to help quickly and accurately initialize storms into numerical models, yielding improved storm-scale analyses and very short range forecasts.
2009: Phased Array Radar Polarimetry for Weather Sensing: A Theoretical Formulation for Bias Corrections. IEEE Transactions on Geoscience and Remote Sensing, 47, 3679–3689.
It is becoming widely accepted that radar polarimetry provides accurate and informative weather measurements, while phased array technology can shorten data updating time. In this paper, a theory of phased array radar polarimetry is developed to establish the relation between electric fields at the antenna of phased array radar and the fields in a resolution volume filled with hydrometeors. It is shown that polarimetric measurements with an electronically steered beam can cause measurement biases that are comparable to or even larger than the intrinsic polarimetric characteristics of hydrometeors. However, these biases are correctable if the transmitted electric fields are known. A correction to the measured scattering matrix is derived that removes biases in meteorological variables. The challenges and opportunities for weather sensing with polarimetric phased array radar are discussed.
2010: The impact of spatial variations of low-level stability on the life cycle of a simulated supercell storm. Monthly Weather Review, 138, 1738–1766, doi:10.1175/2009MWR3010.1.
This study reports on the dynamical evolution of simulated, long-lived right-moving supercell storms in a high-CAPE, strongly sheared mesoscale environment, which initiate in a weakly capped region and subsequently move into a cold boundary layer (BL) and inversion region before dissipating. The storm simulations realistically approximate the main morphological features and evolution of the 22 May 1981 Binger, Oklahoma, supercell storm by employing time-varying inflow lateral boundary conditions for the storm-relative moving grid, which in turn are prescribed from a parent, fixed steady-state mesoscale analysis to approximate the observed inversion region to the east of the dryline on that day. A series of full life cycle storm simulations have been performed in which the magnitude of boundary layer coldness and the convective inhibition are varied to examine the ability of the storm to regenerate and sustain its main updraft as it moves into environments with increasing convective stability. The analysis of the simulations employs an empirical expression for the theoretical speed of the right-forward-flank outflow boundary relative to the ambient, low-level storm inflow that is consistent with simulated cold-pool boundary movement. The theoretical outflow boundary speed in the direction opposite to the ambient flow increases with an increasing cold-pool temperature deficit relative to the ambient BL temperature, and it decreases as ambient wind speed increases. The right-moving, classic (CL) phase of the simulated supercells is supported by increasing precipitation content and a stronger cold pool, which increases the right-moving cold-pool boundary speed against the constant ambient BL winds.
The subsequent decrease of the ambient BL temperature with eastward storm movement decreases the cold-pool temperature deficit and reduces the outflow boundary speed against the ambient winds, progressing through a state of stagnation to an ultimate retrogression of the outflow boundary in the direction of the ambient flow. Onset of a transient, left-moving low-precipitation (LP) phase is initiated as the storm redevelops on the retrograding outflow boundary. The left-moving LP storm induces compensating downward motions in the inversion layer that desiccate the inflow, elevate the cloudy updraft parcel level of free convection (LFC), and lead to the final storm decay. The results demonstrate that inversion-region simulations support isolated, long-lived supercells. Both the degree of stratification and the coldness of the ambient BL regulate the cold-pool intensity and the strength and capacity of the outflow boundary to lift BL air through the LFC and thus regenerate convection, resulting in variation of supercell duration in the inversion region of approximately 1–2 h. In contrast, horizontally homogeneous conditions lacking an inversion region result in the development of secondary convection from the initial isolated supercell, followed by rapid upscale growth after 3 h to form a long-lived mesoscale convective system.
2010: Three-body scattering and hail size. Journal of Applied Meteorology and Climatology, 49, 687–700.
The three-body scattering signature is an appendage seen behind strong storm cells on weather radar displays of reflectivity. It is caused by multiple scattering between hydrometeors and the ground. The radar equation for this phenomenon is reexamined and corrected to include the coherent wave component, producing 3 dB more power than previously reported. Furthermore, the possibility of gauging the size of the hail causing this phenomenon is explored. A model of forward scattering by spherical hail and accepted values of ground backscattering cross sections are used in an attempt to reconcile the reflectivity in this signature with observations. This work demonstrates that the signature can be caused by small (<10 mm) to moderate (20 mm) size hail. An attempt is made to gauge hail size by comparing the direct return from hail with the three-body scattered return. The theory indicates fundamental ambiguities in size retrieval due to resonance effects. Although the theory eliminates dependence on the number of hailstones per unit volume, the shape of the hail size distribution and the cross section of the ground contribute additional uncertainty to the retrieval.
Books, FY 2010–2014
2009: Detection Thresholds for Spectral Moments and Polarimetric Variables. VDM Verlag Dr. Müller, 199 pp.
The most significant advancement in weather radars over the last two decades is polarization diversity, so much so that the US National Weather Service is introducing this capability to its national network of Doppler radars. The introduction of dual polarization brings new information that can be used for separating signals from noise. Classical approaches apply thresholds to the estimated signal-to-noise ratio (SNR) and/or the magnitude of the autocorrelation coefficient at lag one. Because the weather signals from the two orthogonally polarized electric fields are highly correlated, this feature can be used to enhance detection. This book provides a comprehensive analysis of a novel approach that combines estimates of powers, autocorrelations, and cross-correlation to effectively enhance signal detection on polarimetric weather radars. The book gives a detailed description of a computationally efficient method suited for real-time implementation. The principles and approaches laid out are general and can be applied to other cases where sensing of partially coherent signals is of interest. Moreover, methods for evaluating probabilities at the tails of density functions are presented.
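A highly simplified sketch of the underlying idea, flagging a range gate as signal when either channel's SNR or the H-V cross-correlation magnitude is high, is given below. The threshold values, function name, and simple OR logic are placeholders for illustration and are not the book's calibrated detection algorithm, which combines powers, autocorrelations, and cross-correlation into a single statistic.

```python
import numpy as np

def detect_signal(power_h, power_v, cross_corr_mag, noise_h, noise_v,
                  snr_thresh_db=2.0, corr_thresh=0.8):
    """Flag range gates as containing weather signal when either
    channel's SNR exceeds a threshold or the magnitude of the H-V
    cross-correlation is high. Thresholds here are illustrative
    placeholders, not calibrated detection values."""
    snr_h = 10.0 * np.log10(power_h / noise_h)
    snr_v = 10.0 * np.log10(power_v / noise_v)
    return (snr_h > snr_thresh_db) | (snr_v > snr_thresh_db) | \
           (cross_corr_mag > corr_thresh)

# Two invented gates: a strong echo and a noise-only gate
flags = detect_signal(np.array([10.0, 1.0]), np.array([10.0, 1.0]),
                      np.array([0.99, 0.10]), noise_h=1.0, noise_v=1.0)
print(flags)  # first gate detected, second rejected
```

The gain from dual polarization comes from the cross-correlation test: a weather echo keeps the two channels correlated even when both SNRs are near the detection threshold.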
2012: Automating the Analysis of Spatial Grids: A Practical Guide to Data Mining Geospatial Images for Human and Environmental Applications. Springer, 323 pp.
As remotely sensed datasets increase in volume and frequency, the ability to create automated algorithms to analyze and detect patterns in gridded spatial data is increasingly important, whether in business, social science, ecology, meteorology, or urban planning. This book provides students with a foundation in topics of digital image processing and data mining as applied to geospatial datasets. The aim is for readers to be able to devise and implement automated techniques to extract information from spatial grids such as radar, satellite, or high-resolution survey imagery.
2012: Heinz-Wolfram Kasemir: His Collected Works. AGU Geopress-Wiley, 718 pp.
PREFACE Historically, the science of atmospheric electricity has evolved, largely, based on field observations and measurements taken during fair weather and thunderstorms. In its early stages, experimentalists were the primary developers of this field, reporting and interpreting, to the best of their abilities, the different manifestations of the electrical processes found in the atmosphere. Meanwhile, as a branch of physics, this new field required a sound knowledge of theoretical physics and mathematics, in order to correctly interpret observations that were frequently obtained with sensors of limited capabilities. In the late 1940s, the first critical studies of the relations of observed variables to the laws of physics were undertaken by Heinz-Wolfram Kasemir, and he continued to advance such studies throughout the rest of his scientific career. Heinz-Wolfram Kasemir (Heinz, as we, the Editors, called him) was a physicist by education, and also a talented and tireless experimentalist and innovative designer of scientific instruments. Many of the physical concepts presented by Kasemir in his manuscripts on atmospheric electricity and lightning physics contradicted the prevailing contemporary interpretations of the physics of atmospheric electrical processes. Not surprisingly, therefore, many of the papers Kasemir submitted to peer-reviewed publications in the United States had serious difficulties with reviewers, and the majority of the papers were rejected. This could have been because either the reviewers’ abilities were simply not equal to understanding the new physical concepts developed by Kasemir, or the reviewers were biased toward the prevailing interpretations of that time. In the face of such constantly frustrating, exhausting fights for acceptance of his papers in scientific journals, Kasemir finally stopped submitting them to peer-reviewed journals. As his close associates and friends, we knew of his feelings on this issue. 
Most of Kasemir’s publications, therefore, are either in technical reports, or in the proceedings of scientific conferences, making them difficult or impossible for interested researchers to access. Eduard Bazelyan, a well-known Russian physicist in the field of spark discharges, recently shared with us an interesting story related to Kasemir: “None of us thought about the possibility of starting and developing a lightning flash without any contact with a high-voltage electrode in the laboratory, until we started working on our book in the late 1990s. The possibility of the simultaneous development of positive and negative leaders in the volume between high-voltage electrodes appeared to us to be a brilliant idea, and was supported by our laboratory experiment. We were so proud of ourselves for coming up with it. Our euphoria, however, lasted only for a couple of weeks until, by chance, we came across a paper written by Kasemir a half-century earlier, where he had described a similar idea. After this understandably-great disappointment, we found satisfaction in proceeding with numerical calculations to develop this idea, which Kasemir could not perform because of the lack of powerful computers. So, Kasemir’s idea provided a quantitative foundation, and his name received the well-deserved recognition and respect of our research community. It is simply a pity that it took so long.” Heinz’s early, fundamental papers were published in German, and were not translated into English until now, for this book. Most of the work he conducted in the U.S. was presented in conference proceedings and technical reports, with only a few papers published in scientific journals. 
The main reasons for publishing this collection of Kasemir’s papers and presentations are that Kasemir's ideas were far ahead of their time, that many of his publications are not readily accessible, and that it is important to make them available to researchers currently pursuing a better understanding of atmospheric electricity and lightning physics, a field making rapid advances at the moment. It would be a huge loss to the research community if most of Kasemir’s scientific legacy were to remain unavailable. Since most of the papers were not reviewed by his peers, this book is a rare opportunity to experience the real, “uncensored” thinking of a prominent scientist in the field of atmospheric electricity. Kasemir’s papers are not easy reading, but a persistent reader will find great pleasure in discovering in them clear ideas expressed in very precise language. In the late 1990s, we approached Heinz with the suggestion of publishing a collection of his work. He was very receptive to this idea, and we started, with his assistance, the preliminary assembly of his publications. The project stalled, however, because he insisted on revisiting his old papers, to critically evaluate their contents for possible additional commentaries. We told him that we would wait until he finished this process. Sadly, however, it became obvious to us very soon that Heinz would not be able to proceed with these revisions: the unmistakable signs of dementia had already become very noticeable by this time. In 2010, we submitted a proposal to the National Science Foundation, asking for funding of the publication of a collection of the scientific papers by Heinz-Wolfram Kasemir. This was not a typical NSF proposal, so we solicited the co-participation of the Book Department of the American Geophysical Union (AGU) as our partner in publishing it. We are most grateful for the consideration, support and promotion of our idea for this book by Dr. Bradley F. 
Smull, Program Director of Physical and Dynamic Meteorology in the Division of Atmospheric and Geospace Sciences at the NSF. Also, we could not have had better assistance in our project than that provided by Ms. Colleen Matan from the Book Department of the AGU, whose suggestions were always on-target and highly appreciated, and who has tirelessly advanced this project as an example for future projects of this type by the AGU. The excellent German-to-English translation of Kasemir’s early papers was done by Apex Translation Inc. Our friend, Tom Warner, generously contributed a photograph from his collection of exceptionally beautiful images of thunderstorms for the cover of this book. The lightning literature includes several books that are essentially reference books, which review the published results of the lightning research community (e.g., the two volumes of Lightning edited by R. H. Golde, 1977; The Lightning Discharge by M. Uman, 1987; and Lightning: Physics and Effects by V. Rakov and M. Uman, 2003). The first book to directly address the various issues of the physical concepts of lightning was Lightning Physics and Lightning Protection by E. M. Bazelyan and Yu. P. Raizer, 2000. There is still, however, a void in the literature on issues that address the physical interpretations of many lightning observations and measurements. The publication of this collection of Kasemir’s papers is intended, to some degree, to fill this void. The reproduction of some of the earlier papers was a challenging task, even with the use of modern technology; so, the clarity of some images and formulas in those papers is not as good as we would like it to be, and reading these papers may require some effort and patience.
This collection of the work of Heinz-Wolfram Kasemir consists of 55 manuscripts, which comprise journal articles, conference presentations and technical reports, and is organized into five topics: Fair Weather Electricity, Global Circuit, Thunderstorm Electricity, Lightning Physics, and Measurement Techniques. Twelve early papers by Kasemir, published in German (and now translated into English), are included in the collection. In the Table of Contents, for each paper listed, the comments by the Editors serve to emphasize the important points of each manuscript, and to connect it to current issues in the fields of atmospheric electricity and lightning research.
2013: Radar for Meteorological and Atmospheric Observations. Springer, 537 pp.
This book is written for scientists, engineers, students, and other researchers interested in meteorological and atmospheric observations, and it bridges the gap in our understanding of weather and atmospheric radar. The book consists of two parts: the first half, Chapters 1–7, mainly discusses the theoretical basis of weather and atmospheric radar, and the second half, Chapters 8–12, describes actual systems and observations made with these radars.