Researchers seeking new ways to explore the ocean have developed an underwater “robotic plankton” inexpensive enough to be deployed in swarms to study the ocean from a plankton-eye viewpoint.
Not that the devices truly resemble plankton. Plankton is a mix of tiny organisms, including diatoms, microscopic crustaceans, and the larvae of larger marine creatures. The new devices are the size of a football and look like a cross between a tin can and Thor’s hammer. Not to mention that they’re bright yellow, so researchers can easily retrieve them at the end of a mission.
What they have in common with plankton is that they can drift with the current, adjusting depth as necessary.
One goal is to understand the motions of larvae within the plankton.
Jules Jaffe, a research oceanographer at Scripps Institution of Oceanography, La Jolla, California, is enthusiastic about the idea.
“We can make them mimic planktonic larvae,” he says. “We’re in the reference frame of the organisms.”
Because the devices cost only $3000 to $5000 each, it’s possible to deploy a dozen or more at a time, allowing researchers to determine how entire groups of plankton drift on currents. In particular, Jaffe says, researchers want to figure out how larvae harness the forces of wave action to find their way from the deep sea to coastal waters, where they can mature and reproduce.
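The article doesn’t say how the floats hold a chosen depth while the current carries them along, but the basic idea can be sketched as a feedback loop on buoyancy. The snippet below is a minimal illustration under that assumption; the function name, gain, and units are hypothetical, not the actual Scripps design.

```python
# A minimal sketch of depth-hold control for a drifting float, assuming a
# simple proportional controller on buoyancy. All names and values here are
# illustrative only.

def buoyancy_command(measured_depth_m: float, target_depth_m: float,
                     gain_ml_per_m: float = 0.5) -> float:
    """Return a buoyancy adjustment in millilitres of displaced volume.

    Positive commands pump oil into an external bladder so the float rises;
    negative commands withdraw it so the float sinks.
    """
    error_m = measured_depth_m - target_depth_m   # positive when too deep
    return gain_ml_per_m * error_m                # rise when too deep, sink when too shallow


# Example: a float at 12.3 m trying to hold 10 m asks for a small rise.
print(buoyancy_command(12.3, 10.0))   # -> 1.15 ml of added displacement
```

Because a float like this holds depth rather than position, its horizontal track becomes a record of the current at that depth, which is what makes a swarm of cheap drifters useful for mapping how a patch of “plankton” spreads.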
But that’s not the only way these devices can be used. They could also track the movement of subsurface oil spills, such as the plume created by the 2010 Deepwater Horizon disaster, in which much of the oil was trapped far below the surface. Or they could eavesdrop on whales, monitor red tides, or look for subsurface links between marine sanctuaries.
Other robotic devices are larger. For more than a decade, an international project called Argo has maintained an array of 3000 floats drifting around the world’s oceans. Every 10 days, each of these floats dives to 2000 metres and returns to the surface. So far, says Dean Roemmich, a physical oceanographer from Scripps, these dives have produced 1.8 million salinity and temperature profiles. The next step is to deploy 1200 next-generation floats to carry the process all the way to the ocean bottom.
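Those figures are easy to sanity-check with rough arithmetic. The check below ignores the years when the array was still being built up, so it is a consistency check rather than a reconstruction of the programme’s history.

```python
# Back-of-envelope check on the Argo figures quoted above.
floats = 3000
cycle_days = 10                        # one temperature/salinity profile per cycle

profiles_per_year = floats * 365 / cycle_days           # ~109,500 profiles a year
years_at_full_strength = 1_800_000 / profiles_per_year  # ~16 years to reach 1.8 million

print(f"{profiles_per_year:,.0f} profiles per year at full strength")
print(f"{years_at_full_strength:.1f} years to accumulate 1.8 million profiles")
```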
Robots aren’t the only way in which scientists hope to harness computer technology to probe the ocean’s mysteries. Stuart Sandin, another Scripps researcher, is using computer-driven image processing to stitch photos into detailed 3D images of 100-square-metre portions of coral reefs such as Palmyra Atoll in the equatorial Pacific, south of Hawaii.
From these reconstructions, he says, it is possible to identify individual types of coral and map their distributions across the reef. In one recent paper, he says, his team was able to map 45,000 individual coral colonies across 16 sites — something that would have been inconceivable with traditional surveys conducted by scuba divers.
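The article doesn’t name the software involved, but the core step, matching features across overlapping photos and warping them into a single model, can be sketched in a few lines. The snippet below produces only a flat 2D mosaic with OpenCV’s stitcher; the reef work goes further and recovers full 3D structure (photogrammetry). The file names are hypothetical.

```python
# A much-simplified flavour of the approach: stitching overlapping survey
# photos into one mosaic with OpenCV. The underlying step of matching features
# across overlapping images is the same one a 3D photogrammetry pipeline uses.
import cv2

photos = [cv2.imread(p) for p in ("reef_001.jpg", "reef_002.jpg", "reef_003.jpg")]

stitcher = cv2.Stitcher_create()            # detect features, match, warp, blend
status, mosaic = stitcher.stitch(photos)

if status == cv2.Stitcher_OK:
    cv2.imwrite("reef_mosaic.jpg", mosaic)  # one continuous image of the survey plot
else:
    print(f"Stitching failed with status code {status}")
```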
Better yet, this type of analysis can be done by anyone with computer access to the images. “I have 30 to 50 undergraduates in my lab that may never have seen a coral reef who are learning the taxonomy of corals,” he says.
It’s also possible to return to the same site time and again, observing how the coral reacts to changing conditions. “The scientific potential is really at its infancy,” Sandin says.
Ultimately, it may even be possible to take human taxonomists out of the loop entirely. Jaffe is developing artificial-intelligence methods to scan microscope images of plankton in order to determine what types of organisms are present.
The goal is to put the microscopes in the water, on robotic vehicles. But rather than have those vehicles use satellite links to transmit images whenever they surface, a time-consuming task, he wants onboard microprocessors to do the processing in place. Then, he says, “we won’t even have to transmit the images. The satellite [uploads] will just tell us what’s there.”
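The payoff of classifying onboard is easy to see in code: instead of uploading raw frames, the vehicle uplinks a summary a few hundred bytes long. The classifier below is a crude stand-in (Jaffe’s actual methods aren’t described here); the point is the shape of the output.

```python
# Sketch of the "transmit the answer, not the pictures" idea. classify_frame()
# is a placeholder for whatever trained model would run onboard.
import json
from collections import Counter

def classify_frame(frame) -> str:
    """Stand-in for a trained classifier. A real system would run a neural
    network on the microscope image and return a taxon label."""
    brightness = sum(sum(row) for row in frame) / (len(frame) * len(frame[0]))
    return "diatom" if brightness > 0.5 else "copepod"   # toy rule only

def summarise_dive(frames) -> str:
    """Reduce a dive's worth of frames to a tiny JSON tally, so the satellite
    link carries the answer rather than the imagery."""
    counts = Counter(classify_frame(f) for f in frames)
    return json.dumps(dict(counts))

# Toy usage: two 2x2 "images" of pixel intensities in [0, 1].
frames = [
    [[0.9, 0.8], [0.7, 0.6]],   # bright -> "diatom" under the toy rule
    [[0.1, 0.2], [0.3, 0.1]],   # dark   -> "copepod"
]
print(summarise_dive(frames))   # {"diatom": 1, "copepod": 1}
```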