Colorado, US: Esri and Google Maps have presented continuously updated maps of the wildfires in Colorado, demonstrating an increasingly popular method for disseminating emergency information.
“The primary use is for residents wanting very current information on where the fire is and whether they’re going to need to evacuate or not,” said Michael Goodchild, a geographer at the University of California, Santa Barbara, who has studied how residents of his own state use live-updating online maps during wildfires. People may use them to find the nearest shelter and to plan a driving route to their evacuation points, Goodchild explained.
In California, people often rely on maps like Esri’s and Google’s even more than they rely upon official government websites. Local government servers often cannot handle the sudden surge in traffic during emergencies, making their sites slow or even non-functional, Goodchild told InnovationNewsDaily.
Unofficial maps also may update more frequently than official ones because their fact-checking standards aren’t as stringent, Goodchild said. Unofficial maps can post information as soon as it is received, whereas government maps may need to wait for a verifying source. The Esri and Google maps of the Colorado fires combine governmental and nongovernmental sources, suggesting they balance verified facts with fresh information.
Google’s map shows the locations of fires, how well contained different fires are, the locations of shelters for evacuees, and photos taken within the last hour. Google draws its data from governmental sources including the state of Colorado, the US Geological Survey, NASA satellite imagery and the National Oceanic and Atmospheric Administration.
Esri’s map shows locations of fires, burned areas, wind direction, precipitation and areas in danger, as well as tweets and YouTube videos that people have made about different locations. Besides Twitter and YouTube, the map draws its data from the same governmental sources Google uses.
“So-called crisis maps show it’s possible to automatically scrape the Web for relevant information, then summarise the results in a way that’s easy to use and understand. Many institutions, including the FBI, are interested in harvesting and summarising online data,” noted Goodchild.
Source: Live Science