Introduction

In its simplest form, my research centers on building new remote sensing and robotic platforms, and the artificial intelligence algorithms to support them, that help answer the NASA Science Mission Directorate’s fundamental Earth Science question: “How is the planet changing and what are the consequences for life on Earth?” This is a complex and pressing task, yet the technical development and research breakthroughs that may emerge from this work need not be isolated in the ivory towers of academia nor restricted in applicability to the study of Earth. My research vision is to use the ocean and Antarctic environment as a testbed, improving our understanding of our world and its incredible biodiversity, while proving out the platforms and technologies we will send to other planetary bodies and nearby stars.

My PhD work has three primary components. First is the application of novel and rapidly improving artificial intelligence algorithms to remotely sensed data. Satellites and observatories are our primary tools for understanding the composition of stars, asteroids, and exoplanets, but they are also the only way to comprehensively monitor our own world in real time. It only makes sense to push analysis capabilities forward collaboratively, using lessons learned in both fields. Second, I am developing new drone platforms and advancing the logistics for deploying drones in extreme marine environments such as the Southern Ocean and around Palmer Station on the Western Antarctic Peninsula. This work is advancing our knowledge of Earth’s polar regions and will support the critical need to advance the instruments and vehicles that will aid explorers on new worlds and help maintain long-term interstellar craft. The third and final objective of my PhD is to understand how the tools we use to monitor our dynamic oceans should be coordinated, using the multi-faceted Ocean Observatories Initiative as a case study. Satellites, long-endurance aerial drones, and in-situ robotic assets, such as underwater vehicles and rovers, all supply necessary pieces of the puzzle of understanding our oceans. But how should they be coordinated, and how can they best complement each other? This is a question we must answer in order to effectively monitor our own world, and answering it will help us decide what tools we will send out beyond it.

Drs. Kevin Hand and Chris German, of the NASA Jet Propulsion Laboratory and the Woods Hole Oceanographic Institution, put it well in a recent paper calling for more collaboration between planetary scientists and oceanographers: “Long before rovers carved tracks on Mars, scientists and engineers tested similar rovers in a variety of deserts here on Earth. Similarly, before we send robotic explorers to distant worlds, we will have the opportunity to test the platforms and instruments in analogous environments here on Earth. Robotic vehicles and instruments for planetary and ocean exploration will first be tested and utilized in Earth’s ocean; and on, within and beneath Earth’s cryosphere. Developing technologies for ocean exploration is a win–win for Earth science, planetary science and astrobiology. Clearly, much of the investment for the exploration of other planetary bodies could be leveraged to improve exploration capabilities here on Earth.” (Hand and German, 2018)

Isolated ocean and polar environments provide the ideal space to quickly and cost-effectively test out the technologies, logistics, and psychological requirements for humanity to venture out into the cosmos, while simultaneously developing the systems and science that will ensure our current home remains healthy long into the future.

Research Components

I. Applying machine learning to remote sensing data

Before an interstellar mission can build the critical mass of funding and support it requires, it needs a destination. Finding a destination will require advances in both our ability to see into other solar systems and our ability to analyze what we’re seeing. A more near-term task that will help boost us to another star, prospecting for resources in our own solar system, has similar requirements: remote sensing advances on our satellite platforms and an increase in the amount of data we’re able to process rapidly. Before setting sail beyond Sol, we will need to understand what resources are contained within our own system and at our destination. An early step in this process, a major component of my PhD funded by NASA’s North Carolina Space Grant, and the focus of my next six months, is to develop a deep-learning-based convolutional neural network to identify objects of interest, such as whales, ice floes, and mineral deposits, on the Western Antarctic Peninsula from satellite imagery.

Figure 1. Earth and Venus to scale: satellite remote sensing allows us to better understand our own planet and our neighbors near and far. Photo Credit: NASA / JPL.

Technological trends within satellite remote sensing toward higher spatial and spectral resolution, faster revisit rates, and increased data availability to researchers are permitting novel projects such as space-based observation of whales, monitoring of individual ships, and detection of vegetation or mineral change in real time (Fretwell et al., 2014). With satellite spatial resolution now sufficient to detect these small objects of interest, the sheer area of ocean becomes the obstacle. Our inability to process this volume of data is hampering effective management, for example by preventing population-level insight into endangered species or blocking the situational awareness of sea ice needed to ensure maritime safety in the Southern Ocean. New analysis techniques beyond manual inspection are required.

Convolutional neural networks (CNNs) are a subset of deep learning, inspired by the neural connections in the human brain, that have been particularly successful in analyzing imagery (LeCun et al., 1999). Beginning with computer vision and image processing, CNNs are now powering advances in research and industry ranging from exoplanet detection to natural language processing to pharmaceutical discovery. CNNs are a promising detection method and make it feasible to continuously analyze large spatial expanses of our ocean or other planetary bodies.
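To make the approach concrete, here is a minimal, purely illustrative sketch of a CNN’s building blocks in NumPy: a convolution, a ReLU nonlinearity, max pooling, and a final score for a single image patch. The kernel, weights, and patch size here are placeholders of my own choosing; a real detector learns its filters from labeled imagery and stacks many such layers.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation, as in most CNN libraries)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Rectified linear unit: zero out negative activations."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Downsample by taking the max over non-overlapping size x size windows."""
    h, w = x.shape
    h, w = h - h % size, w - w % size  # drop ragged edges
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def detect_score(patch, kernel, weights, bias):
    """Score one image patch: conv -> ReLU -> pool -> linear -> sigmoid."""
    features = max_pool(relu(conv2d(patch, kernel)))
    logit = features.ravel() @ weights + bias
    return 1.0 / (1.0 + np.exp(-logit))
```

Sliding this scoring function across tiles of a satellite scene, with learned filters in place of the placeholders, is the essence of how a CNN turns terabytes of imagery into a map of candidate detections.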

My research group, the Duke Marine Robotics and Remote Sensing Lab, has significant expertise in geospatial analysis and oceanographic remote sensing. I have the support of the lab’s resources and expert personnel throughout this project. This work is also facilitated by my lab’s membership in the Palmer Antarctica Long Term Ecological Research (LTER) program, by access to WorldView-3 satellite imagery at two-week intervals from the Polar Geospatial Center, and by access to weekly Planet satellite imagery as a member of the Planet Research and Education Program.

II. Building new drones and autonomous systems

NASA recently announced it will be sending a small drone, the Mars Helicopter, to the Red Planet along with the Mars 2020 Rover. Another drone mission, Dragonfly, is in the final selection round of NASA’s New Frontiers program to explore Titan from the air. NASA has already deployed its SPHERES robots to the ISS, semi-autonomous drones that aid astronauts with tasks and allow remote researchers to access the Station. The next generation of SPHERES, Astrobee, is planned for launch in 2019. Small platforms, both remote controlled and fully autonomous, capable of operating in harsh environments from the Martian atmosphere to hard vacuum, will be critical for exploring new worlds and maintaining large-scale spacecraft that could sustain life on the journey to another star. They will be vital for ensuring both the safety of humans during extra-vehicular activities (EVAs) and the health of the spacecraft.

Figure 2. Astronaut Scott Kelly and NASA’s SPHERES semi-autonomous robots. Photo Credit: NASA.

The second component of my PhD, and a project that just received full funding from the Wildlife Conservation Society, is to develop an aerial drone platform capable of locating and tracking a single radio signal and navigating to it in order to rapidly download data from a sensor at that location. Our specific use case is finding animals such as birds, insects, and small mammals that have been tagged with a complex sensor package to better understand their ecology. These sensor packages are often too small to include a satellite transponder for positioning or data uplink, so they need a local receiver, such as a nearby drone, to act as the data mule. The parallels between this system and resource prospecting or atmospheric exploration on other planets are incredibly exciting.
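As a rough illustration of the homing behavior, and emphatically not our actual flight code, the sketch below simulates a drone greedily climbing a received-signal-strength gradient toward a tag. The log-distance path-loss model and the four-direction probe strategy are simplifying assumptions of my own for this example.

```python
import math

def rssi(drone, source, tx_power=-30.0, n=2.0):
    """Hypothetical log-distance path-loss model: signal strength in dBm
    falls off with distance from the transmitter."""
    d = max(math.dist(drone, source), 1.0)  # clamp to avoid log(0)
    return tx_power - 10.0 * n * math.log10(d)

def home_on_signal(start, source, step=5.0, max_iters=200):
    """Greedy homing: probe the four compass directions, move toward the
    strongest reading, and stop at a local maximum, i.e. close enough to
    the tag to begin the data download."""
    pos = start
    for _ in range(max_iters):
        here = rssi(pos, source)
        candidates = [(pos[0] + dx, pos[1] + dy)
                      for dx, dy in ((step, 0), (-step, 0), (0, step), (0, -step))]
        best = max(candidates, key=lambda p: rssi(p, source))
        if rssi(best, source) <= here:
            break  # no neighbor is stronger: hold position
        pos = best
    return pos
```

A fielded system would of course contend with noisy readings, antenna directionality, wind, and battery limits, but the core loop of measure, compare, and step toward the stronger signal is the same.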

Our work will support critical conservation priorities here on Earth: endangered birds in Southeast Asia and threatened seals in the Antarctic are the first two study species. Our modular system of flight-proven open-source components will allow future scientists and engineers to further develop this system and pull useful modules into their own work. Success here will prove out another use of autonomous aerial systems and open the door for future applications.

One potential use case would be the deployment of a large number of low-cost, non-guided, in-situ sensors over a low-gravity asteroid, where they bounce to disperse across a large study area. Replacing the need for comprehensive coverage from an orbiting satellite, a small autonomous craft could zero in on signals from these dispersed sensors and act as a data mule for the information they collect.

Another use case could be a similar deployment of lightweight, highly energy-efficient floating sensors in the oceans of Titan or Europa. Making these sensors small and low power increases the number that can be brought on a single mission. A single aerial drone could provide uplink capability that would otherwise require numerous orbiting craft, and could solve the communication issues inherent in a thick atmosphere such as Titan’s by flying low to collect data and flying high to uplink it to an orbiter or a communication relay in deep space.

Finally, this category of platforms, tested on Earth and in low Earth orbit, will likely be the predecessor of a new generation of maintenance drones, similar to NASA’s Astrobee, that will be critical for maintaining large spacecraft and preventing human residents of an interstellar craft from needing to perform frequent, high-risk EVAs. This work is part of a larger movement to provide explorers and future astronauts with semi-autonomous robotic assistants, and it is incredibly exciting to see how the advances in intelligence, power storage, and sensor technology of the last decade are making these robots an enabler for humans to live in and explore new environments.

III. Coordinating across different levels of remote sensing and robotics

When building technical systems for exploration, whether the destination is Antarctica, Europa, or TRAPPIST-1e, no single platform will be sufficient to understand whether that area has useful resources, contains life, or could potentially support it. The final component of my PhD is building out our understanding of how different remote sensing platforms, in-situ robots, and human operators should coordinate their work, both for studying our oceans here on Earth and for planning what we will send into the cosmos.

The Ocean Observatories Initiative (OOI) is a suite of instruments and platforms that study the physical, chemical, biological, and geological properties of the ocean (Smith et al., 2018). The OOI merges monitoring across domains, from satellites gathering oceanographic-scale data on a daily basis, to deep-sea platforms focused on a single hydrothermal vent, to ship-based scientists conducting more varied and complex sampling, but only for short periods. Given the expense and risk of sending new platforms out beyond Earth’s gravity well, systems like the OOI provide an ideal testbed for coordinating the roles different robotic systems can play, understanding where humans are still necessary, and prototyping new technologies.

My specific work with the OOI uses multispectral satellites, long-endurance drones, and oceanographic gliders to track great whales across the Southern Ocean. In this system, which is still being developed, satellite imagery, analyzed by a CNN, will be used to map whale locations at a weekly time step. Oceanographic gliders are underwater robots that can stay out in the field for months at a time with a wide array of sensors. The satellite data will guide gliders to whale location hotspots, where the gliders will collect environmental data and use acoustic sensors to locate whales that may be miles from their initial satellite-derived location. The gliders will pinpoint the whale locations and communicate them in real time to a swarm of aerial drones, which will rapidly arrive to collect imagery of the whale pod and microbiome health data from the breath of the animals. This system is being funded as a part of the National Science Foundation’s Palmer Antarctic Station ecological research directive and is setting the stage for further integration of different robotic systems and increasingly complicated coordination between platforms.
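The cascade above can be caricatured as a simple tasking pipeline. The sketch below is purely illustrative; the asset names and the greedy nearest-asset assignment are my own simplifications, not the deployed system, which must also handle communications latency, asset endurance, and weather.

```python
import math

def assign_nearest(assets, targets):
    """Greedy one-to-one assignment: each target gets the closest free asset.
    Each asset is a dict with a "name" and a "pos" (x, y) tuple."""
    free = list(assets)
    plan = {}
    for target in targets:
        if not free:
            break  # more targets than assets: leave the rest unassigned
        best = min(free, key=lambda a: math.dist(a["pos"], target))
        free.remove(best)
        plan[best["name"]] = target
    return plan

def coordinate(satellite_detections, gliders, drones, acoustic_fix):
    """Sketch of the cascade: weekly satellite hotspots task gliders; each
    glider's refined acoustic fix then tasks a drone for close-up sampling."""
    glider_plan = assign_nearest(gliders, satellite_detections)
    refined = [acoustic_fix(hotspot) for hotspot in glider_plan.values()]
    drone_plan = assign_nearest(drones, refined)
    return glider_plan, drone_plan
```

The key structural point is the narrowing of scale at each stage: coarse, wide-area satellite detections seed the gliders, and the gliders’ precise fixes seed the drones, so each platform only searches the area the previous one could not resolve.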

As we begin to implement the next generation of long-term semi-autonomous monitoring systems across our own solar system and decide what we will equip spacecraft with for long-duration human spaceflight, understanding the tradeoffs among different technologies, and where humans fit into the system, will be of the utmost importance. The expense and risk of testing these tradeoffs in space are simply not tolerable in our current political and financial climate.

The need to test these systems in our oceans calls to mind a romantic quote attributed to the Antarctic explorer Ernest Shackleton nearly a century ago: “Unlike the land, where courage and the simple will to endure can often see a man through, the struggle against the sea is an act of physical combat, and there is no escape. It is a battle against a tireless enemy in which man never actually wins; the most that he can hope for is not to be defeated.” While this was true at the turn of the 20th century, it is no longer the case. We now have the means to endure, survive, and thrive on the sea for indefinite periods of time. Technologists are planning long-term “sea-steading” communities, and we travel from pole to pole without issue. This transformation in our capability to thrive on the world’s oceans came from constant testing, rapid iteration, and the relatively low cost of failure of any specific vessel or platform design. Now, in the 21st century, Shackleton’s words hold true not for humans on the sea but for humans in space. And it is the lessons we are learning today, from the incredibly diverse array of platforms, ships, and robotic systems working together to ensure the safety of the maritime world and help us understand its dynamic nature, that will let us bring the same reality to human life in space.

Figure 3. The Ocean Observatories Initiative’s many robotic components. Photo Credit: Woods Hole Oceanographic Institution.

Conclusion

Finishing with another quote from Hand and German: “We should leverage the scientific and technological lessons learned from both Earth and planetary exploration. Moving forward, the opportunity to make great discoveries in our ocean and beyond will be advanced best by a shared vision for exploration.” (Hand and German, 2018) I believe this shared vision for exploration is absolutely possible, has growing momentum from the scientific community, and will take us to the stars. Beyond that, we must maintain our own world in order to succeed in our interstellar ambitions.

This grant will provide vital funding for me to attend ocean- and space-focused conferences, presenting my work and learning from both communities. It will additionally support my time over the coming months as we develop novel drone platforms to explore the Antarctic. I am thrilled at the opportunity to represent TVIW at these conferences and in the platforms we build, communicating our shared vision and working toward an interstellar future together.

References

Fretwell, P.T., Staniland, I.J., Forcada, J., 2014. Whales from space: Counting southern right whales by satellite. PLoS One 9, 1–9. https://doi.org/10.1371/journal.pone.0088655

Hand, K.P., German, C.R., 2018. Exploring ocean worlds on Earth and beyond. Nature Geoscience 11, 2–4.

LeCun, Y., Haffner, P., Bottou, L., Bengio, Y., 1999. Object recognition with gradient-based learning, in: Shape, Contour and Grouping in Computer Vision. Springer, Berlin, pp. 319–345. https://doi.org/10.1007/3-540-46805-6_19

Smith, L.M., Barth, J.A., Kelley, D.S., Plueddemann, A., Rodero, I., Ulses, G.A., Vardaro, M.F., Weller, R., 2018. The Ocean Observatories Initiative. Oceanography 31, 16–35.