Wildfire smoke can travel quickly and pose significant health hazards to communities. The University of Nevada, Reno is currently leading an effort to explore big data techniques for tracking wildfire smoke and predicting air quality.
The collaborative effort with the College of William and Mary has recently received a National Science Foundation BIGDATA award totaling $1.34 million, with $983,012 funded by NSF and $358,571 by Amazon. The University's share is $1,043,260.
The objective of this research is to develop big data techniques to enable real-time, fine-grained wildfire smoke tracking and air quality forecasting.
"Such a prestigious award demonstrates the excellence and national competitiveness of our program," Manos Maragakis, dean of the College of Engineering, said.
The main team members at the University include principal investigator Feng Yan and co-principal investigator Lei Yang, both of the Department of Computer Science and Engineering, and co-principal investigator Heather Holmes of the Department of Physics. Their efforts have also been supported by the ALERTWildfire team, led by Graham Kent and Kenneth Smith, and by the Pronghorn HPC cluster, managed by the High-Performance Computing Team at the Office of Information Technology, including Steve Smith, Mike Nicks and Sebastian Smith.
Zach Newell, engineering computing manager at the College, and technical solutions engineer Jake Wheeler are also part of the team.
"In today's era of data-driven science and engineering, it's important to collaborate with researchers to create innovative solutions associated with health hazards using advancements in technologies," said Sanjay Padhi, head of Amazon Web Service Research Initiatives. "Our work with the NSF BIGDATA program supports this project to help communities affected by wildfires."
Holmes explained that exposure to wildfire smoke can cause serious health problems, especially for vulnerable people with conditions such as asthma and heart disease, which makes it an important topic in both science and public health.
Challenges in battling wildfires impact firefighters and researchers alike
Yan said that tracking wildfire smoke, along with its impact, can be challenging: the smoke moves quickly, can cover hundreds of miles and can cause sudden drops in air quality.
"It is a very dynamic thing that can be affected by many things like the fire, terrain and wind," Yang said.
The traditional way of tracking wildfire smoke relies on satellite imagery and meteorological data, which update slowly and have low resolution. To enable real-time, fine-grained wildfire smoke tracking and air quality prediction, researchers need to look for new data sources.
"The recent advances in machine learning and edge computing has shined a light on this as fine-grained image and video data can obtained by cameras, which can potentially provide fine-grained information in real time," Yan said. "Fortunately, we have a unique camera network infrastructure right here to facilitate this idea: the system, built by Dr. Graham Kent, Dr. Kenneth Smith, and their team."
Computer science and engineering Ph.D. student Heyang Qin, co-advised by Yan and Yang, has built deep learning models using data collected from the ALERTWildfire system. Preliminary experimental results verified the idea of using machine learning to detect smoke density from camera data.
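As a rough illustration of what such a model might look like, here is a minimal sketch of a convolutional network that maps a camera frame to a smoke-density score. The architecture, input size and density scale are assumptions for illustration only, not the team's actual model.

```python
# Illustrative sketch only: a toy PyTorch regressor in the spirit of the
# smoke-density work described above. Layer sizes, the [0, 1] density scale
# and all hyperparameters are assumptions, not the project's architecture.
import torch
import torch.nn as nn

class SmokeDensityNet(nn.Module):
    """Toy convolutional regressor: camera frame -> smoke-density score in [0, 1]."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),        # global pooling -> 32-dim descriptor
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(32, 1), nn.Sigmoid())

    def forward(self, x):                    # x: (batch, 3, H, W) RGB frames
        return self.head(self.features(x))   # (batch, 1) density estimates

model = SmokeDensityNet()
frames = torch.rand(4, 3, 224, 224)          # stand-in for camera frames
density = model(frames)                      # training would regress against labels
print(density.shape)                         # torch.Size([4, 1])
```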
"The traditional data source and the new camera data source are actually complementary as a traditional data source offers a global view while the camera data source gives fine-grained real-time local details, and gives us an opportunity to develop a hybrid approach by using both," Yang said.
This project is also expected to bring significant benefits to local communities in Nevada, as the wildfire smoke situation has been getting worse over the past few years. In 2018, more than 660,240 acres of U.S. Forest Service, Bureau of Land Management and private land burned in 138 wildfires, according to an article from the Elko Daily Free Press.
"To implement this in practice imposes lots of challenges. Actually, even for the course-grained traditional data, it is already at hundreds of terabyte level, where tens of thousands of central processing unit hours are needed to perform the corresponding analysis," Yan said. "Collecting large amount of high-resolution camera data in real time is almost infeasible due to the networking constraints while processing the collected data and combining with the results with traditional data makes it even more challenging."
To address these challenges, the team has proposed preprocessing techniques that run on the camera devices themselves and new cloud computing techniques for fast processing of the collected data. They will work with their industry collaborators at Amazon, HP Labs and Microsoft Research to develop a scalable and efficient system to prototype the proposed research.
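To give a sense of what on-camera preprocessing could look like under the bandwidth constraints Yan describes, here is a minimal sketch that downsamples each frame and uploads it only when it differs enough from the last uploaded frame. The threshold, downsampling factor and upload step are assumptions for illustration, not the team's actual pipeline.

```python
# Illustrative edge-side filter: spend scarce uplink bandwidth only on
# frames that carry new information. All parameters are assumptions.
from typing import Optional
import numpy as np

def should_upload(frame: np.ndarray, last_sent: Optional[np.ndarray],
                  threshold: float = 8.0, stride: int = 4) -> bool:
    """Decide on-camera whether a grayscale frame is worth uploading."""
    small = frame[::stride, ::stride].astype(np.float32)   # cheap downsample
    if last_sent is None:
        return True                                        # always send the first frame
    prev = last_sent[::stride, ::stride].astype(np.float32)
    return float(np.abs(small - prev).mean()) > threshold  # mean per-pixel change

# Example loop: keep only frames that changed noticeably since the last upload.
last = None
for frame in (np.random.randint(0, 256, (480, 640), dtype=np.uint8) for _ in range(5)):
    if should_upload(frame, last):
        last = frame  # in a real deployment, the upload to the cloud would go here
```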
Finally, the project will also provide opportunities to engage more students from underrepresented groups and will impact students' education through the College's K-12 outreach program, along with mentoring for undergraduate and graduate students. It also gives students in the big data minor program hands-on experience in this rapidly expanding field.
"As one of the department's three main research focus areas, big data has definitely demonstrated its potential having a significant real-world impact," said Eelke Folmer, chair of Department of Computer Science and Engineering.