Neural Networks and Photogrammetry for Analysis of Marine Remote Sensing Data

Project Summary

KC and Patrick led two hands-on data workshops for ENVIRON 335: Drones in Marine Biology, Ecology, and Conservation. These labs introduced students to examples of how drones are currently used as a remote sensing tool to monitor marine megafauna and their environments, and how machine learning can efficiently analyze remote sensing datasets. The first lab focused on how drones are used to collect aerial images of whales and measure changes in body condition to help monitor populations. Students were introduced to the methods for making accurate measurements and then had the opportunity to measure whales themselves. The second lab introduced analysis methods that use computer vision and deep neural networks to detect, count, and measure objects of interest in remote sensing data. This work gave students in the environmental sciences an introduction to new techniques in machine learning and remote sensing that can be powerful multipliers of effort when analyzing large environmental datasets.

Themes and Categories

Graduate Students: KC Bierlich and Patrick Gray

Faculty: Dr. David Johnston

Undergraduate Course: "Drones in Marine Biology, Ecology, and Conservation" (ENVIRON 335)

Course Summary

Learning Objectives

Lab 1:

  • Understand how to accurately measure objects from aerial images, accounting for the camera/sensor used and the altitude at which the photo was taken.
  • Understand why it is important to measure body condition and health of top predators, such as whales, in an ecosystem.
  • Hands-on activity to measure the total length of blue, humpback, and minke whales from California and Antarctica using drone imagery.
  • Compare results with classmates and discuss reasons for variation in measurements.
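The measurement workflow above rests on photogrammetric scaling: the ground sampling distance (GSD) converts a length in pixels to a length in meters using the drone's altitude and the camera's focal length and sensor width. A minimal sketch of the calculation (the camera and altitude values below are illustrative, not from the actual lab imagery):

```python
def ground_sampling_distance(altitude_m, focal_length_mm, sensor_width_mm, image_width_px):
    """Width of one image pixel projected onto the sea surface, in meters."""
    return (altitude_m * sensor_width_mm) / (focal_length_mm * image_width_px)

def pixel_length_to_meters(length_px, altitude_m, focal_length_mm, sensor_width_mm, image_width_px):
    """Convert a length measured in pixels (e.g. traced in ImageJ) to meters."""
    gsd = ground_sampling_distance(altitude_m, focal_length_mm, sensor_width_mm, image_width_px)
    return length_px * gsd

# Illustrative values: a 35 mm lens on a full-frame (36 mm wide) sensor at 50 m altitude.
gsd = ground_sampling_distance(altitude_m=50, focal_length_mm=35,
                               sensor_width_mm=36, image_width_px=4000)
whale_m = pixel_length_to_meters(1200, 50, 35, 36, 4000)  # a 1200 px whale ≈ 15.4 m
```

This also shows why students' measurements vary: small errors in recorded altitude scale every measurement linearly.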

Lab 2:

  • Understand basic remote sensing technology and the temporal, spectral, spatial, and radiometric characteristics of different sensors.
  • Learn some intuitive ideas about machine learning and its applications in marine science.
  • Build up a library of examples of how remote sensing and machine learning can be used for marine science.
  • Learn how deep learning works and gain an understanding of the theory behind convolutional neural networks, then apply this knowledge, together with the remote sensing concepts above, to develop a neural network model that classifies animals and everyday objects.

The first lab consisted of a short lecture on the theory of photogrammetry and its importance to conservation, after which students worked directly with the drone imagery using ImageJ for hands-on photogrammetry. The second lab session began with a short crash course in machine learning and a summary of its applications in marine remote sensing. Students then had time to familiarize themselves with the datasets and the Python tools in a guided format, using Jupyter notebooks stored on GitHub and run in the Google Colaboratory environment. We then trained our neural network models as a group, with each student running the code independently on Colab.

Artificial intelligence-based techniques are bringing incredible advances to difficult research problems, but these methods are often inaccessible to non-technical ecologists. Developing an intuitive understanding of neural networks, without the complicated setup and coding challenges, was an essential goal of these labs, with the hope that students will be empowered to bring these new techniques into their own research.


We used three datasets during this project. Lab 1 used images of blue, humpback, and Antarctic minke whales collected by our research group in Antarctica and California. Lab 2's initial neural network development was done on the publicly available MNIST and CIFAR datasets. Students used these to experiment with model design, and as a final component of the project we demonstrated how a neural network can identify and measure cetaceans using the same dataset as Lab 1.
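To give a flavor of what the convolutional layers at the heart of these models compute, here is a dependency-light NumPy sketch of a single 2D convolution with a vertical-edge filter — the basic operation a CNN stacks and learns to tune. The filter values and toy image are ours for illustration, not from the lab notebooks:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation, as used in CNN layers (no padding, stride 1)."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A tiny "image": dark left half, bright right half.
image = np.zeros((5, 5))
image[:, 3:] = 1.0

# A hand-set vertical-edge filter; a trained CNN learns many such kernels from data.
kernel = np.array([[-1.0, 1.0],
                   [-1.0, 1.0]])

response = conv2d(image, kernel)  # strongest exactly where the dark/bright boundary sits
```

In the classroom models, frameworks like those used on Colab perform this same operation over many learned kernels at once; writing it out by hand is only meant to build the intuition the lab aims for.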

Course Materials

Drone images of whales that students measured can be found here.

Students also downloaded ImageJ, an open-source image analysis program, which they used to make their measurements. All code from Lab 2 can be found here. An hour-long lecture providing an overview of both deep learning and remote sensing can be found here.


Figure 1) Equation and example of measuring whales using aerial photogrammetry.
Figure 2) Example of drone imagery in raw form and after being analyzed by a convolutional neural network to identify the species and measure each animal.
Figure 3) CIFAR dataset for neural network model development.


Related Projects

This two-week teaching module in an introductory-level undergraduate course invites students to explore the power of Twitter in shaping public discourse. The project supplements the close-reading methods that are central to the humanities with large-scale social media analysis. This exercise challenges students to consider how applying visualization techniques to a dataset too vast for manual apprehension might enable them to identify for granular inspection smaller subsets of data and individual tweets—as well as to determine what factors do not lend themselves to close-reading at all. Employing an original dataset of almost one million tweets focused on the contested 2018 Florida midterm elections, students develop skills in using visualization software, generating research questions, and creating novel visualizations to answer those questions. They then evaluate and compare the affordances of large-scale data analytics with investigation of individual tweets, and draw on their findings to debate the role of social media in shaping public conversations surrounding major national events. This project was developed as a collaboration among the English Department (Emma Davenport and Astrid Giugni), Math Department (Hubert Bray), Duke University Library (Eric Monson), and Trinity Technology Services (Brian Norberg).

Understanding how to generate, analyze, and work with datasets in the humanities is often a difficult task without learning how to code or program. In humanities-centered courses, we often privilege close reading or qualitative analysis over other ways of knowing, but by learning some new quantitative techniques we better prepare students to tackle new forms of reading. This class will work with data from the HathiTrust to develop ideas for thinking about how large groups and different discourse communities thought of queens of antiquity like Cleopatra and Dido.


We introduced students to spatial analysis in QGIS and R using location data from two whale species tagged with satellite transmitters. Students were given satellite tracks from five Cuvier’s beaked whales (Ziphius cavirostris) and five short-finned pilot whales (Globicephala macrorhynchus) tagged off the North Carolina coast. Students then used RStudio to calculate two metrics of these species' spatial ranges: home range (where a species spends 95% of its time) and core range (where a species spends 50% of its time). Next, students used QGIS to visualize the data, producing maps that displayed the whales' tracks and their ranges.
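The 95% home range and 50% core range are both contours of a utilization distribution. As a rough illustration of the idea (not the R/QGIS workflow the students used), one can grid the tag locations and keep the smallest set of cells that contains the given fraction of fixes; the coordinates below are randomly generated stand-ins, not real whale tracks:

```python
import numpy as np

def range_cells(lons, lats, fraction, bins=25):
    """Approximate a utilization range: the smallest number of grid cells
    that together hold `fraction` of the location fixes."""
    counts, _, _ = np.histogram2d(lons, lats, bins=bins)
    order = np.sort(counts.ravel())[::-1]            # densest cells first
    cum = np.cumsum(order)
    return int(np.searchsorted(cum, fraction * counts.sum()) + 1)

rng = np.random.default_rng(0)
lons = rng.normal(-75.0, 0.2, 500)  # hypothetical fixes off the North Carolina coast
lats = rng.normal(35.5, 0.2, 500)

home = range_cells(lons, lats, 0.95)  # 95% home range, in grid-cell units
core = range_cells(lons, lats, 0.50)  # 50% core range, in grid-cell units
```

Because the core range keeps only the densest cells, it is by construction nested within, and smaller than, the home range; multiplying the cell count by each cell's area yields an area estimate.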