Winds of Change

How computer models are playing an increasingly important role in catastrophe losses

January 09, 2020

From Jan. 1 to Oct. 4, 2019, there were 41,074 wildfires nationwide, with 4.4 million acres burned, according to the National Interagency Fire Center. California also sustained the largest fire in state history in 2018—the Mendocino Complex Fire, which burned 459,123 acres.

Storms were no less destructive: flooding and severe weather in the U.S. caused an estimated $11.8 billion in economic losses, according to Aon Benfield’s report “Global Catastrophe Recap: First Half of 2019.”

Measuring the impact of wind on wildfire, hurricane, and severe-weather losses involves, among other tools, sophisticated software programs and models. Let’s review wind/computer modeling, explain how it is employed in claims litigation, and provide recommendations on its use.

The Court’s View

Given the potential exposures involved in today’s catastrophe losses, claims-related wind/computer modeling has increased dramatically in recent years. For example, in Pacific Gas & Electric’s (PG&E) bankruptcy filing, losses being assessed in the Northern California PG&E wildfire subrogation matters are estimated at $20 billion, not including individual and business claims, which are asserted to be much higher.

With losses of that size now common in large wildfire subrogation matters, wind/computer modeling is being used to help assess factors such as fire origin, cause, and spread issues. Plaintiffs and defendants can evaluate a variety of factors, including the wind’s impact on power lines and how that relates to the origin of fires, how far embers may travel, and a multitude of other disputed issues involved in catastrophic losses.

One reported case involving wind/computer modeling is U.S. v. Black Hills Power Inc. The U.S. and the state of South Dakota filed an action alleging that the defendant failed to clear aspen trees from its transmission line right-of-way. One of the trees contacted the defendant’s transmission line, causing an arc that set off the Grizzly Gulch fire.

The district court reviewed the experts’ use of computer-based mathematical formulas. One expert assessed line sag: when a line’s temperature rises, the line sags closer to the ground, and increasing the electrical load on a line raises its temperature. Wind, under certain circumstances, can cool transmission lines when it blows perpendicular to them. Wind can also cause lines to sway and move horizontally, potentially striking nearby trees or other structures.

The defendant challenged the expert’s qualifications, calculations, and conclusions, leading the court to review the expert’s work, the materials he relied upon, and the inputs to his calculations regarding the height and location of trees. Ultimately, the court upheld the expert’s use of sag and wind-displacement calculations and permitted him to testify at trial.
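
To see how such sag calculations work in principle, consider the sketch below. It uses the common parabolic sag approximation and a typical thermal expansion coefficient for steel-reinforced conductor; the span and conductor-length values are hypothetical. This illustrates the physics the court described, not the expert’s actual model.

    # Toy illustration: conductor temperature -> thermal expansion -> sag.
    # Parabolic approximation: conductor length Lc ~ S + 8*D^2 / (3*S),
    # so mid-span sag D = sqrt(3 * S * (Lc - S) / 8). Values are hypothetical.
    import math

    ALPHA = 1.9e-5       # thermal expansion per deg C, typical for ACSR conductor
    SPAN = 300.0         # horizontal span between towers, meters (hypothetical)
    LEN_AT_REF = 301.0   # conductor length at the reference temperature, meters
    T_REF = 15.0         # reference temperature, deg C

    def sag(temp_c):
        """Approximate mid-span sag in meters at a given conductor temperature."""
        length = LEN_AT_REF * (1 + ALPHA * (temp_c - T_REF))
        return math.sqrt(3 * SPAN * (length - SPAN) / 8)

    for t in (15, 40, 75, 100):
        print(f"{t:>4} C -> sag {sag(t):5.2f} m")

Even this toy model shows sag growing by more than two meters as the conductor heats from 15 C to 100 C, which is why load, ambient temperature, and wind cooling all mattered to the analysis.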

The court also reviewed another expert, who employed a computer model called BEHAVE to predict the ignition time of the fire at issue in the case. That program models the rate of fire growth based on input variables. The defendant again challenged the expert’s qualifications and the scientific basis and validity of the BEHAVE model. The court found that the expert witness was well qualified. It next reviewed the modeling program itself, noting that the model was renowned and peer reviewed, and that the defense expert had used the program himself.

The defense’s challenge was that the program was not a proper tool for estimating the ignition time of the fire. The plaintiff’s expert had gathered input data on fuel, topography, wind speed, and the ignitability of the fuel, based on his own observations as well as weather information provided by witnesses and other experts. A first responder reported that at 2:30 p.m. the fire was about 30 yards long and 30 yards wide, or roughly 10,000 square feet. Using the model, the expert estimated how long it would take a fire ignited at the origin location to spread to 10,000 square feet: between 44 and 68 minutes. From that, the expert estimated the fire started between 1:20 p.m. and 1:45 p.m.

The program assumes the variables remain constant for the entire run. To account for that assumption, the expert built three different models from different inputs, determining the fastest and the slowest rates at which the fire could have grown. That allowed him to give the range of spread times noted above, and those bracketed times likely include the actual time of ignition. The expert defended the modeling by noting that the time from ignition to sustainable fire was short. The court concluded that the range-of-time estimate was appropriate, in part because it did not pinpoint a time certain for ignition, given the variances in the computer model.
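
The bracketing logic the court accepted is straightforward to illustrate. The sketch below subtracts the fastest and slowest modeled spread times from the time of first observation to bound the ignition time; the date is illustrative, and the computed window closely tracks the expert’s 1:20 p.m. to 1:45 p.m. estimate.

    # Minimal sketch of the bracketing approach: run the spread model under
    # the fastest and slowest plausible conditions, then subtract those spread
    # times from the first observation to bound the time of ignition.
    from datetime import datetime, timedelta

    observed_at = datetime(2002, 6, 29, 14, 30)  # fire seen at ~10,000 sq ft (illustrative date)
    fastest_spread = timedelta(minutes=44)       # model run with fastest-growth inputs
    slowest_spread = timedelta(minutes=68)       # model run with slowest-growth inputs

    earliest = observed_at - slowest_spread
    latest = observed_at - fastest_spread
    print(f"Ignition between {earliest:%I:%M %p} and {latest:%I:%M %p}")
    # -> Ignition between 01:22 PM and 01:46 PM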

Wind/Computer Models Put to Use

Fire departments are modeling weather data during emergencies to help determine how best to fight fires. WindNinja is a computer program that computes wind fields for wildland fire events, where wind is a significant factor: complex terrain causes local changes in wind speed that affect both wildfires and those attempting to suppress them. WindNinja was developed to help fire managers predict these winds and then decide how best to fight the fire.

Designed for fast simulation times, it can be run in different modes. One mode uses coarser weather data from the U.S. National Weather Service to forecast future winds. A second mode uses more specific wind measurements to develop a wind field for the affected area. The last mode uses a user-specified average surface wind speed and direction. Other inputs can include elevation data, time and date, and vegetation type.
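
As a rough illustration of what the last mode consumes, the hypothetical sketch below bundles the inputs a domain-average run needs. This is not WindNinja’s actual API; the program is normally driven through its own interface, and the file name and values here are invented.

    # Hypothetical container for the inputs of a domain-average WindNinja-style
    # run; illustrative only, not WindNinja's real interface.
    from dataclasses import dataclass

    @dataclass
    class DomainAverageRun:
        elevation_file: str         # digital elevation model of the terrain
        wind_speed_mph: float       # user-specified average surface wind speed
        wind_direction_deg: float   # direction the wind blows from, in degrees
        vegetation: str             # dominant cover, e.g. "grass", "brush", "trees"
        mesh_resolution_m: float    # resolution of the computed wind field, meters

    run = DomainAverageRun(
        elevation_file="fire_area_dem.tif",  # invented file name
        wind_speed_mph=25.0,
        wind_direction_deg=315.0,            # wind from the northwest
        vegetation="trees",
        mesh_resolution_m=100.0,
    )
    print(run)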

Power companies are also modeling weather events. Diablo winds—a name used to describe the hot, dry winds that blow from the northeast in Northern California during the spring and fall—recently forced a historic preemptive power shutdown in Northern California, leaving 150,000 Bay Area homes and businesses without power. PG&E determined that those winds posed a dangerous hazard for its estimated 2,500 miles of transmission lines and 25,000 miles of distribution lines. It feared that the Diablo winds and California’s dry climate, coupled with PG&E’s equipment, could spark another round of wildfires.

The decision to cut off power was made with input from PG&E’s emergency operations center, which houses dozens of computers, and from meteorologists in its wildfire safety operations center who model scenarios based on data from weather stations (low humidity, sustained wind and gusts, dry fuel percentage), high-definition cameras, National Weather Service warnings, MesoWest, and other proprietary sources.
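
A heavily simplified sketch of how such weather-station feeds might roll up into a shutoff recommendation appears below. The thresholds and the decision rule are hypothetical illustrations, not PG&E’s actual criteria.

    # Toy shutoff-decision rule combining the weather-station inputs the
    # article mentions. Thresholds are hypothetical, not PG&E's criteria.
    from dataclasses import dataclass

    @dataclass
    class StationReading:
        humidity_pct: float        # relative humidity
        sustained_wind_mph: float  # sustained wind speed
        gust_mph: float            # peak gust speed
        dry_fuel_pct: float        # share of dead fuel at critical dryness

    def shutoff_recommended(r):
        # Hypothetical trigger: critically dry fuel plus low humidity and high wind.
        fire_weather = r.humidity_pct < 20 and (r.sustained_wind_mph > 25 or r.gust_mph > 45)
        return fire_weather and r.dry_fuel_pct > 70

    reading = StationReading(humidity_pct=12, sustained_wind_mph=28, gust_mph=52, dry_fuel_pct=85)
    print(shutoff_recommended(reading))  # True -> escalate for emergency operations review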

California Gov. Gavin Newsom has budgeted $1 billion in new funding for fire response and preparedness. A warning center will be operated by the state firefighting agency CalFire, along with the California Public Utilities Commission and the California Department of Energy. Data collected will be shared with federal, state, and local authorities. Data will be obtained from planes equipped with high-definition cameras, radar, and infrared equipment, and relayed to a UC San Diego research team running a lab called WIFIRE, which uses a supercomputer to model how a fire might spread.

Another initiative involves data collected by 300 cameras in high-fire-risk areas that can see 100 miles at night using infrared and 70 miles in daylight. That information is fed into a program that uses artificial intelligence to learn what constitutes normal conditions for a given area. When the program detects an anomaly, it notifies local and state agencies.
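
The anomaly-detection idea can be shown with a toy baseline model: learn per-pixel statistics from "normal" frames, then flag frames in which an unusual share of pixels deviates sharply, as a smoke plume would. Real camera systems use far richer models; all data below is synthetic.

    # Toy per-pixel anomaly detector over synthetic camera frames.
    import numpy as np

    rng = np.random.default_rng(0)
    normal_frames = rng.normal(100, 5, size=(50, 64, 64))  # 50 synthetic baseline frames

    baseline_mean = normal_frames.mean(axis=0)
    baseline_std = normal_frames.std(axis=0) + 1e-6

    def is_anomalous(frame, z_thresh=4.0, frac=0.01):
        """Flag a frame if over `frac` of its pixels sit more than
        z_thresh standard deviations from the learned baseline."""
        z = np.abs(frame - baseline_mean) / baseline_std
        return (z > z_thresh).mean() > frac

    plume = normal_frames[0].copy()
    plume[:16, :16] += 60  # simulate a bright smoke plume in one corner
    print(is_anomalous(normal_frames[1]), is_anomalous(plume))  # False True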

Best Practices

Employing sophisticated wind/computer modeling for wildfires, hurricanes, and other severe storm events such as flooding requires attention to detail. Once a need is identified, qualified experts should be retained to work within their specialty areas. When hiring an expert, focusing on the specific area where expert assistance is needed is critical to ensure that modeling work performed later can be introduced at trial.

Moreover, if the claims matter ends up in litigation, ensure that counsel is retained to protect the expert’s work under the attorney-work-product doctrine. Physical evidence the expert will rely upon for modeling, along with any applicable documentation, must be preserved. If that evidence is not under your control, demand access and preserve it. The expert will likely need to test those materials to obtain data for the modeling program. Failing to preserve that evidence opens a line of attack: that data taken from exemplar evidence is inconsistent with, or not substantially similar to, the actual evidence in the claim.

Witness recollections of what happened must also be preserved, as your expert may rely on their observations, as seen with the first responder’s account of the fire’s size in the Black Hills Power case discussed previously. Claims professionals, counsel, and experts should agree on how to proceed to avoid misunderstandings, unintended actions, and unnecessary costs.

About The Authors
Peter A. Lynch

Peter A. Lynch is a member of the subrogation and recovery department at Cozen O’Connor. He can be reached at plynch@cozen.com or followed on Twitter @firesandrain.
