The Memorial Bridge between Kittery, ME and Portsmouth, NH carries several sensors on and below the structure that record raw data in an on-bridge database. This project extracts data from the on-bridge database into a new researcher-facing database that will offer publicly available dataset extractions and on-server calculations for researchers who wish to understand the work being done on the Living Bridge. The work involves understanding the sensor data on the bridge and the existing database schema, developing a new schema for the researcher-facing database, and populating that database with data from the bridge. Future work will involve presenting and visualizing the bridge data as parameters are entered.
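The extract-transform-load pattern described above can be sketched in miniature. This is an illustrative example only: the table names, column names, and sensor identifiers below are assumptions, not the actual Memorial Bridge schemas, and in-memory SQLite stands in for both databases.

```python
import sqlite3

# Hypothetical sketch; table/column names are assumptions, not the real schemas.
raw = sqlite3.connect(":memory:")       # stand-in for the on-bridge database
research = sqlite3.connect(":memory:")  # stand-in for the researcher-facing database

# Raw on-bridge form: one wide table of unprocessed sensor readings.
raw.execute("CREATE TABLE raw_readings (sensor_id TEXT, ts TEXT, value REAL)")
raw.executemany(
    "INSERT INTO raw_readings VALUES (?, ?, ?)",
    [("strain_01", "2021-06-01T00:00:00", 4.2),
     ("strain_01", "2021-06-01T00:00:10", 4.3),
     ("tilt_02",   "2021-06-01T00:00:00", 0.7)],
)

# Researcher-facing form: a sensor catalog split out from the readings so
# dataset extractions can filter on sensor metadata.
research.execute("CREATE TABLE sensors (sensor_id TEXT PRIMARY KEY, kind TEXT)")
research.execute("CREATE TABLE readings (sensor_id TEXT, ts TEXT, value REAL)")

# Extract and load: derive the sensor catalog, then copy the readings over.
rows = raw.execute("SELECT sensor_id, ts, value FROM raw_readings").fetchall()
for sid in {r[0] for r in rows}:
    research.execute("INSERT INTO sensors VALUES (?, ?)", (sid, sid.split("_")[0]))
research.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)

# One example of an "on-server calculation": mean reading per sensor.
means = dict(research.execute(
    "SELECT sensor_id, AVG(value) FROM readings GROUP BY sensor_id"))
print(means)
```

A real implementation would add units, calibration metadata, and incremental loading, but the shape of the work is the same: read the raw form, normalize it into the researcher-facing schema, and expose aggregate calculations server-side.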
Funding is requested from the Northeast CyberTeam to support an undergraduate intern who will help advance remote sensing deep learning workflows supporting Vermont’s high-resolution land cover initiative. The internship will be based out of the University of Vermont Spatial Analysis Laboratory (SAL) and supervised by SAL Director and faculty member Jarlath O’Neil-Dunne. This internship will make extensive use of the Vermont Advanced Computing Core (VACC), particularly the DeepGreen GPU cluster.
The State of Vermont is under both regulatory and public pressure to improve the water quality of Lake Champlain. State agencies must have access to high-resolution land cover information that is detailed enough to provide parcel-level quantification of land cover features. The University of Vermont, with funding from the State of Vermont, led the development of the 2016 statewide, high-resolution land cover dataset. This 2016 land cover dataset is the most accurate, detailed, and comprehensive land cover map ever made of Vermont. The existing workflows employed to develop this land cover dataset are slow and expensive, running on individual desktop computer workstations. Moreover, the land cover dataset was already out of date the moment it was produced.
In February 2020, a meeting was held with representatives from state agencies, the Vermont Advanced Computing Core, and the Spatial Analysis Laboratory. State agencies voiced their desire for an approach to land cover mapping that would allow more rapid updates of high-resolution land cover products and capture fine-scale changes that could influence water quality, such as the construction of a new building.
This project will focus on integrating deep learning approaches into the SAL’s feature extraction workflows. Deep learning has shown tremendous potential for mapping land cover from high-resolution remotely sensed datasets. Deep learning techniques by themselves, however, may not always be optimal for updating existing land cover datasets: apparent change can stem from differences in the source data or from errors in the mapping itself, producing false change. We propose to leverage deep learning to more efficiently update the State’s high-resolution land cover maps through a hybrid approach. Our desire is to take advantage of the potential that deep learning offers while still employing the methodologies that ensure quality specifications are met. The goal of this hybrid approach is a faster, more efficient, and more accurate way to update existing high-resolution land cover products. High-performance computing will be employed to tackle the most computationally intensive aspect of deep learning: model training. The trained models will then be integrated into the existing workflows to identify areas of change, which will be combined with the existing high-resolution land cover to enable rapid updating of the statewide land cover dataset. This project will leverage the University of Vermont’s recent investments in high-performance computing architecture; DeepGreen, an NSF-funded supercomputer, will be employed.
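The hybrid update idea can be illustrated with a small sketch. This is not the SAL's actual workflow (which uses trained TensorFlow models and eCognition); it only shows, with NumPy arrays standing in for rasters and model outputs, how a confidence threshold can suppress false change while carrying forward confident changes into the existing map.

```python
import numpy as np

# Illustrative sketch, not the production workflow. Class codes, grid size,
# and the 0.8 confidence threshold are all assumptions for demonstration.

# Existing land cover raster (class codes, e.g. 1=tree, 2=grass, 3=impervious).
existing = np.ones((8, 8), dtype=np.int64)

# Stand-in for per-pixel class probabilities from a trained model.
probs = np.full((8, 8, 3), [0.9, 0.05, 0.05])  # mostly confident "tree"
probs[2:4, 2:4] = [0.05, 0.05, 0.9]            # confident new impervious patch
probs[6, 6] = [0.35, 0.25, 0.4]                # ambiguous pixel (likely noise)

predicted = probs.argmax(axis=-1) + 1          # class codes 1..3
confidence = probs.max(axis=-1)

# Hybrid rule: accept the model's label only where it disagrees with the
# existing map AND is confident; otherwise keep the existing land cover.
threshold = 0.8
changed = (predicted != existing) & (confidence >= threshold)
updated = np.where(changed, predicted, existing)

print(int(changed.sum()))  # 4 pixels flagged as real change
```

Here the confident 2x2 patch of new impervious surface is carried into the updated map, while the low-confidence disagreement at one pixel is treated as false change and the existing label is retained, which is the behavior the hybrid approach is after.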
The phases for this project are: 1) deep learning system design, 2) deep learning system development, 3) deep learning system implementation, 4) integration of deep learning into the object-based feature extraction workflow, and 5) production of an updated statewide land cover map. The software technologies employed will include TensorFlow for deep learning, eCognition for object-based feature extraction, and ArcGIS for visualization.
This project is incredibly valuable to the State of Vermont, which is struggling to meet regulatory requirements to reduce non-point source pollution to Lake Champlain, the state’s largest lake, which extends into New York and Quebec. Access to current, accurate, high-resolution land cover is imperative if the State is going to make decisions on how best to reduce non-point source pollution and fund these activities. Furthermore, the State has no dedicated remote sensing scientists on staff and lacks the computing and technical resources to carry out land cover mapping on this scale. The intern funded as part of this project will work with a talented team that consists of individuals who are internationally recognized for their expertise in automated feature extraction.