S.C. Sea Grant Consortium

Coastal Heritage Magazine

New Technology: Driving Advances in Coastal Science

In the past 25 years, technology has accelerated extraordinary advances in how scientists record, measure, and process information, and thus has revolutionized research.

A person operating an all-terrain vehicle on a beach.

Quick Surveys. The U.S. Army Corps of Engineers transformed an all-terrain vehicle into the Rapid Assessment Mobile LiDAR, or RAMbLr, to map beach topography. The device measured the impact of 2016’s Hurricane Matthew on South Carolina beaches in a fraction of the time such work took after 1989’s Hurricane Hugo or 1999’s Hurricane Floyd. Photo by Sara Corbett, U.S. Army Corps of Engineers.

For decades, the standard river flood forecasts in South Carolina were based primarily on rainfall totals from several dozen weather stations and stream flow data from gauges on the largest rivers in the state.

Now powerful supercomputers crunch terabytes of data to provide near real-time forecasts for every lake, pond, river, and stream as water flows from the mountains to the sea.

After Hurricane Hugo in 1989, survey crews trudged with tripods across the state’s beaches for weeks to determine how much sand had been lost to the powerful storm’s surge. After Hurricane Matthew in 2016, more precise measurements were available in days thanks to an all-terrain vehicle equipped with laser-based technology.

Coastal science is constantly evolving; moving the field forward is the goal of research. In the past 25 years, technology has accelerated extraordinary advances in how scientists record, measure, and process information, and thus has revolutionized research.

An underwater glider built from a kit.

Stoking Future Researchers. The SeaGlide® is designed to recruit the next generation of scientists and deep-sea explorers. Made from a kit created by the U.S. Navy’s Naval Surface Warfare Center Carderock Division in Bethesda, Maryland, the SeaGlide® can be used to teach K-12 students the basics of robotics. Photo by Daniel Daglis, U.S. Navy.

Many of these recent advances trace their roots to major breakthroughs of the 20th century: development of electronic computers, the first great number crunchers, in the 1940s; decoding of the structure and synthesis of genetic building blocks deoxyribonucleic acid (DNA) and ribonucleic acid (RNA), and the launch of Sputnik and other early satellites in the 1950s; and discovery in the 1960s of a process for amplifying light to create a laser.

The trunks of research from those roots grew rapidly and relatively straight for decades. Then, about a generation ago, the branches began proliferating.

Those first computers led to more powerful models that sparked today’s big data revolution. Sputnik fueled the space race that made possible mapping breakthroughs with the Global Positioning System (GPS). Understanding the structure of DNA and RNA sparked major advances in genetics and genomics. The laser begat Light Detection and Ranging (LiDAR), a process that has revolutionized precise spatial measurement.

Today, computers crunch massive amounts of data for daily weather forecasts, autonomous vehicles utilize GPS to map the flood potential of individual residential lots or the composition of the deep-sea floor, and specially programmed robotic devices can quickly identify species of fish eggs through genetics.

The breakneck speed of new technological advances isn’t about to slow down. John R. Delaney, a professor of oceanography at the University of Washington, addressed the potential as co-author with Roger S. Barga of a 2009 essay entitled “A 2020 Vision for Ocean Science.”

“This new era will draw deeply on the emergence, and convergence, of many rapidly evolving new technologies. These changes are setting the scene for what Marcel Proust called ‘the real voyage of discovery, [which] lies not in seeking new landscapes, but in having new eyes.’”

More information expands horizons. The vast majority of new technologies have some connection to advancements in “big data,” the collection, storage, and processing of massive amounts of information. For generations, printed journals and books in libraries were the storage venue for data gathered by researchers. Capacity was limited by square footage.

Then extraordinary breakthroughs in computer storage from the 1980s through the 2000s took us from data stored in kilobytes (2 KB equal about one page of type) on floppy disks to petabytes (1 PB equals 1,000 copies of the 32-volume Encyclopedia Britannica) stored in the entire cloud that is the internet. Equally rapid development of processing capacity allowed computer analysis of these larger datasets.

Viktor Mayer-Schonberger details the process in Big Data: A Revolution That Will Change How We Live, Work and Think, the 2014 book he wrote with Kenneth Cukier.

“Just as the telescope enabled us to comprehend the universe and the microscope allowed us to understand germs,” Mayer-Schonberger writes, “new techniques for collecting and analyzing huge bodies of data will help us make sense of our world in ways we are just starting to appreciate.”

Weather forecasting is among the most obvious of the scientific advances sparked by big data collection and processing. Nearly from the dawn of the supercomputer, the National Oceanic and Atmospheric Administration (NOAA) has run computer models to predict how hurricanes build and move over oceans. Those models become more accurate each year with the growth of computer storage and processing.

Unlike hurricanes or even general atmospheric forecasting, however, river flooding forecasts were a low priority on overcommitted NOAA supercomputers for years, says Ed Clark, director of the National Water Center in Tuscaloosa, Alabama.

“The serendipitous moment came in 2014 as more supercomputing capability became available and we no longer had performance constraints,” Clark says. “That began a new genre of hydrology science.”

The National Water Center team in 2000 began developing a national hydrology dataset and models that included information from 7.5 million miles of streams and rivers and 6.5 million lakes and ponds. To create a true water model, rainfall totals have to be merged with land-surface models, soil-moisture readings, evapotranspiration rates, and plant-canopy information in each of those watersheds.

But gathering the data is only half of the equation; it also has to be put to use. “And when you have these large datasets, the human mind can’t really integrate it in a meaningful way,” Clark says.

NOAA’s computer capacity upgrade in 2014 allowed the team to make the most of its big data, about 4 terabytes gathered daily. (If a smart phone had 4 terabytes of memory, it could hold about two million photos.)

The National Water Model was still in its test phase when Hurricane Matthew hit the Atlantic coast in October 2016, but it accurately predicted the river flooding that occurred in coastal South Carolina in the weeks after the storm. Models such as these will be critical in areas like Georgetown County, where a lack of long-term data-collection gauges on rivers frustrated emergency personnel during the October 2015 flood.

Next up for Clark and his team is fine-tuning the basic model while adding details such as what type of runoff to expect downstream based on whether upstream rain falls on farms, forests, or urban areas.

“We turn on the TV today to get a weather forecast,” Clark says. “In five years, I want people to be thinking about a water forecast and what it means to their day-to-day lives.”

High-tech Utility Vehicle

Sometimes, even new technology can require a boost from a low-tech household item to serve a specific real-life use. The RAMbLr, a device created by the Charleston District of the U.S. Army Corps of Engineers (ACE), is a prime example.

In the Army tradition of acronyms, RAMbLr stands for Rapid Assessment Mobile LiDAR. It’s a standard all-terrain vehicle with a Dynascan® mobile LiDAR mapping system mounted atop a custom-built superstructure.

The Dynascan® unit, acquired in 2013, consists of a laser scanner that spins 360 degrees and collects data along with two GPS antennas. It’s high tech. The superstructure, however, is remarkably utilitarian.

The antennas need to be separated to provide the proper orientation. So the RAMbLr team attached a standard 8-foot aluminum ladder to the top of the all-terrain vehicle’s superstructure, with the main scanner and GPS antenna on one end of the ladder and another GPS antenna perched on the other end.

In the LiDAR process, a laser is fired at an object. The time it takes for the light to reflect back is used to measure the precise distance between the two points. The key to using LiDAR in mapping, or to measuring the volume of a beach’s sand, is basing the measurements on at least three known points derived from GPS satellites. Using more satellites improves the accuracy even more.
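
To make the time-of-flight idea concrete, the round-trip travel time of a laser pulse converts to a one-way distance by multiplying by the speed of light and dividing by two. The sketch below is only an illustration of that arithmetic, with made-up pulse times, not the Dynascan® unit’s actual processing.

```python
# Minimal sketch of LiDAR time-of-flight ranging (illustrative only).
# Distance = (speed of light x round-trip time) / 2.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # vacuum value; air is nearly the same

def lidar_range_m(round_trip_seconds: float) -> float:
    """Convert a laser pulse's round-trip travel time to one-way distance in meters."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

# Hypothetical pulse times: a return after ~200 nanoseconds corresponds to ~30 m.
for t in (100e-9, 200e-9, 667e-9):
    print(f"{t * 1e9:6.0f} ns round trip -> {lidar_range_m(t):7.1f} m")
```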

The U.S. Department of Defense began placing navigation satellites in set orbits in 1978 and by 1995 had a full array of 24 spaced out to cover the entire planet. Microwave signals sent from those satellites to receivers determine the location of the receiver. The full capacity of the GPS system at first was available only to the United States military. When that capacity was opened to everyone in 2000, the field of mapping science and research exploded. The S.C. Sea Grant Consortium and U.S. Geological Survey researchers used the technology in a 2006 study of erosion of state beaches.
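
To see how “at least three known points” pin down a location, consider that a receiver which knows each satellite’s position and its range to that satellite can solve for the spot that best fits all of the ranges. The sketch below is a deliberately simplified, two-dimensional example with invented coordinates and noise-free ranges; real GPS receivers work in three dimensions and also solve for the receiver’s clock error.

```python
import numpy as np

# Simplified 2-D trilateration sketch (not real GPS processing).
# Known "satellite" positions and the measured ranges to an unknown receiver.
sats = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
true_pos = np.array([30.0, 60.0])
ranges = np.linalg.norm(sats - true_pos, axis=1)  # noise-free distances

# Subtracting the range equation for the first satellite from the others
# linearizes the problem:  2*(p_i - p_0) . x = |p_i|^2 - |p_0|^2 - (r_i^2 - r_0^2)
A = 2.0 * (sats[1:] - sats[0])
b = (np.sum(sats[1:] ** 2, axis=1) - np.sum(sats[0] ** 2)
     - (ranges[1:] ** 2 - ranges[0] ** 2))
estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated position:", estimate)  # recovers ~[30. 60.]
```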

The RAMbLr was built in 2013 to improve the accuracy of topographic data at ACE project sites along the coast. First used to measure dredging disposal sites, the vehicle allowed for remarkably quick surveys of the damage to beach-renourishment project sites at Folly Beach and Myrtle Beach after Hurricane Matthew.

The RAMbLr team surveyed three miles of Folly dunes in less than a day, a process that took weeks of shooting individual elevations using tripods after Hurricane Hugo in 1989. They did 30 miles of Horry County’s beaches in two nights after Matthew. (The ACE later mapped the damage from Miami to Ocean City, Maryland, with 76 flights in LiDAR-equipped aircraft over 36 days.)

As fun as the RAMbLr is, the real star of the process in South Carolina is a decade-old project of the S.C. Geodetic Survey. The S.C. Virtual Reference Station Network set up 45 satellite receivers statewide and connected them to computer servers in Columbia. By combining that network with GPS and the Russian GLONASS satellite system, surveyors cut the uncertainty of satellite measurements from a few meters (like you might get now with the mapping app on your smart phone) to a few centimeters.

The Virtual Reference Station Network was designed to aid road construction crews, but it has been a godsend for any number of users—including the local ACE office. It’s critical for LiDAR on land and sonar used for accurately dredging shipping channels under the water.

“The Network allowed us to cut the cord on the prior type of survey methods we employed,” says Matt Foss, chief of survey for the Charleston office. “That was a game-changer in terms of increasing the accuracy and reducing time and manpower required to collect our surveys.”

A man assembles a drone.

Drone Mapping. Norm Levine says drones, combined with the Global Positioning System and high-resolution photography, have made it possible to quickly fly over an area and map elevation with much finer detail than is available from satellite images. Photo by Grace Beahm Alford.

Research Eyes in the Sky

Another major advance soon could provide extremely focused data for flood forecasts. Norm Levine, an associate professor at the College of Charleston and director of the Lowcountry Hazards Center, is using the latest in drone-based image technology to measure the city’s peninsula in such fine detail that undulations in the green space in Marion Square show up on an elevation map.

Such extremely accurate maps can indicate where slight changes in elevation or the amount of impervious surface can influence the volume of water flowing into local streams or stormwater systems. Those details are extremely important in a flood-prone city like Charleston, and unmanned aerial vehicles, or drones, are the best way to get them.

“Drones allow us to look at hard-to-see and hard-to-understand features,” Levine says. Advances in drone technology “have changed everything about how we think about mapping.”

Drones have been around for more than a century, from the first balloon flights to the earliest unmanned planes. Military uses dominated, however, until the early 2000s. Then a new wave of less expensive drones took advantage of advances in miniaturization that allowed for lighter motors and better cameras. Top-of-the-line research drones capable of precise programmed paths and high-resolution photography still cost six figures, but less specialized drones range from $30 to $500. As a result, drone use exploded, for hobbyists as well as researchers.

In 2010, the Federal Aviation Administration (FAA) estimated there would be 15,000 drones in use by civilians in the United States by 2020. A 2016 FAA report updated the estimate to 543,500 civilian drones by 2020. Most of those are flown by individuals as a hobby. The ones used for research face extra requirements for pilot certification, rules that have changed several times in recent years.

Aerial view of a city park.

Multiple Choice. Drone images with high resolution can reveal slight elevation changes in urban areas like Charleston’s Marion Square, helping map the flow of rainfall runoff. Photo by Norm Levine, College of Charleston.

Despite the challenge of keeping up with new regulations, drone research has expanded quickly. Drones are used by agricultural researchers to track crop productivity, by wildlife biologists to count endangered species in hard-to-reach places, and by geographers in all types of mapping. College of Charleston drones—the school has five—have been used to locate unmarked graves in a cemetery and, this summer, will help locate potential dinosaur fossils on a huge tract the university leases in Wyoming.

In mapping, precision is what makes drones so important. For centuries before aerial imagery, the standard for quality maps was one inch on the map equaled one mile, or a ratio of 1 to 63,360. Satellite imagery allowed that ratio to drop to 1 to 1,200. By combining high-resolution photography with GPS location, high-end drones today can cut the ratio almost to 1 to 10.

With basic satellite imagery, the dirt walkways in Charleston’s Marion Square would have blended in with the grass. When the resolution gets down to 1 to 10, the walkways worn down a few inches through the years show up as a different elevation from the grass. Maps drawn using drones help visualize small changes in elevation that can have large impacts on the flow of rainfall runoff.
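
Those scale ratios translate directly into ground distance per map unit. The short calculation below simply restates the ratios mentioned above; the labels pairing each ratio with a mapping method follow the article’s comparison and are meant only as an illustration.

```python
# What each map scale means on the ground, per inch of map (illustration).
INCHES_PER_FOOT = 12
INCHES_PER_MILE = 63_360

for name, denominator in [("traditional map", 63_360),
                          ("satellite imagery", 1_200),
                          ("high-end drone", 10)]:
    ground_inches = denominator          # 1 map inch covers this many real inches
    ground_feet = ground_inches / INCHES_PER_FOOT
    print(f"1:{denominator:>6} ({name}): 1 map inch = {ground_feet:,.1f} ft "
          f"({ground_inches / INCHES_PER_MILE:.4f} mi) on the ground")
```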

“The last 10 years have seen exponential advances in spatial resolution,” Levine says.

He envisions similar advancements in spectral resolution, which picks up variations in wavelengths of light reflected from various surfaces and allows the identification of individual species of plants or deposits of minerals in the soil. Improvements in spectral resolution have lagged behind spatial resolution, but they seem inevitable considering the relentless nature of technology.

Tiny Technology, Huge Changes

In terms of technological breakthroughs, nanotechnology is a newcomer. While discussed as a concept for centuries, the field only began revving up in the 1980s when developments in electron microscopy allowed the engineering of particles at the molecular level. Nanotechnology took off in the 1990s, quickly becoming integral in creating lighter-weight products, more efficient sunscreen, and smaller, faster computer processors.

A nanometer is one-billionth of a meter, and nanoparticle generally refers to something measuring from one to 100 nanometers, or about 100,000 times smaller than the width of a human hair. In sunscreen, for instance, the chemical compound titanium dioxide reflects and absorbs ultraviolet light, protecting the skin from the sun’s harmful rays. When titanium dioxide is broken down into particles 25 to 50 nanometers in size, the sunscreen is transparent on the skin and can be more effective than sunscreens with larger reflective particles.
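
For a sense of that scale, the quick calculation below compares the 1-to-100-nanometer range to a human hair, assuming a hair width of roughly 100 micrometers; hair widths vary, so the ratios are only approximate.

```python
# Rough scale comparison for nanoparticles (approximate figures).
NANOMETERS_PER_METER = 1e9
hair_width_nm = 100e-6 * NANOMETERS_PER_METER   # ~100 micrometers = 100,000 nm

for particle_nm in (1, 25, 50, 100):
    ratio = hair_width_nm / particle_nm
    print(f"a {particle_nm:>3} nm particle is about {ratio:,.0f}x "
          "smaller than the width of a human hair")
```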

But those nanoparticles, and others used in soaps, shampoos, and hundreds of other consumer products, rinse off and make their way into the soil and waterways, where they end up in the food chain. The field of nanotechnology has spawned a parallel field of research into safe use of nanoparticles; thus the dual purpose in the title of the University of South Carolina’s Center for Environmental Nanoscience and Risk (CENR).

Two large magnets next to a jar of water.

Nanotech Solution. Researchers at the Center for Environmental Nanoscience and Risk are working on a process that could have implications for oil spills. Oil in polluted water attaches to magnetic nanoparticles, allowing the oil to be separated from the water. Photo by Grant Jackson, University of South Carolina.

Jamie Lead, director of CENR, says measuring the risk nanoparticles pose and developing safer alternatives are tremendous challenges. As a relatively new technology, manufactured nanoparticles are at low levels in the environment now “but they are growing at such a rate that, in the future, they could be a significant problem. Even now in some cases, there is a potential risk to the environment.”

Researchers at CENR are examining how nanoparticles behave in seawater and other parts of the environment, how they end up in organisms such as oysters, and at what rate they accumulate in the food chain. Once in the food chain, they could ultimately affect human health. The key to sustainable and successful use of nanotechnology is to determine if the risks outweigh the benefits, and the benefits are potentially massive.

For instance, CENR researchers are working on two nanoparticle-driven processes that could revolutionize oil-spill cleanups. Both involve magnetic nanoparticles coated in specific polymers. Oil in polluted water attaches to the polymer when the nanoparticles are put into the water-oil mixture. Then magnets are used to separate the nanoparticles and oil, leaving clean water to return to the ocean. Early tests of the process show great promise, Lead says.

CENR researchers also are working with nanoparticles that reduce the toxicity of oil while serving as nutrients for bacteria that naturally degrade oil in salt water. This approach can be used at the site of a spill to stimulate natural degradation processes.

Both processes would require the production of large amounts of specialized nanoparticles, which is likely to be less expensive and more effective than conventional methods for cleaning oil spills. As the nanoparticle processes are refined, Lead predicts they will become the conventional method for cleaning spills. Before that happens, though, scientists will want a clear understanding of the residual impact of introducing those nanoparticles in the ocean environment.

“That’s the underlying nature of our challenge,” Lead says. “We try to understand better which particles are most effective and cause fewest problems.”

Genetics, Robotics, and Species Identification

New technology and microscopic particles also are the basis for Dianne Greenfield’s research at the Hollings Marine Laboratory in Charleston. A research associate professor with the University of South Carolina’s Belle W. Baruch Institute of Marine and Coastal Sciences and director of the S.C. Department of Natural Resources’ Algal Ecology Laboratory, Greenfield has been working for years with a process to rapidly identify and quantify microscopic organisms using their ribosomal RNA (rRNA). The process, sandwich hybridization assay (SHA), traces its roots to the original identification and understanding of genetic material in the 1950s.

A researcher's gloved hand operates a piece of scientific equipment.

Genetic Analysis. As part of the sandwich hybridization assay process, homogenized ribosomal RNA samples from fish eggs are exposed to reagents in a robotic device to determine the species of the eggs. Photo by Grace Beahm Alford.

The SHA process features two molecular probes designed to bind with a species’ rRNA in a homogenized sample, one on each side of the target, creating an rRNA “sandwich.” The sandwiched rRNA, along with a series of enzymes, creates a reaction that is processed using a programmed robotic device about the size of a small toaster oven. The device exposes the sample to different reagents. At the end of the reaction, a change in sample color indicates if a species is present in the sample.

“You can rapidly and very accurately quantify an organism of interest,” Greenfield says. “It’s an alternative to microscopy, which is a powerful tool but takes longer to perform and requires specialized knowledge and experience. Some different species don’t look very different under a microscope, which is a problem if you want to distinguish morphologically similar species—like eggs of different fish species or visually similar harmful algae.”

Much of Greenfield’s work has been related to marine and coastal phytoplankton, the type that might cause harmful algal blooms in local waterways, but her research team recently has devised a process to identify the eggs of red drum. Their work was detailed in the March 3, 2015 issue of Canadian Journal of Fisheries and Aquatic Sciences.

As part of that study, then-graduate student Rebecca Mortensen refined a method for breaking down the protective shell and chorion membrane surrounding the developing red drum embryo to get to the genetic material. Then the team designed the specific SHA to differentiate the genetic material in red drum eggs from that of other, similar species. It was the first use of SHA to identify a vertebrate species, and it opens opportunities for other species, Greenfield says.

Devising an SHA for a species can take several months. But once that initial work is done, the identification of a species from a homogenized sample can be performed within a few hours.

Stephen Arnott, associate marine scientist with the S.C. Department of Natural Resources, says previous spawning studies have been limited because red drum eggs are difficult to differentiate from those of similar species. Tracking egg production using the SHA process could help “quantify egg production on an annual basis and help us look at long-term spawning trends.”

Test tubes.

Cold Storage. Water samples are filtered then stored in cryovials in liquid nitrogen before sandwich hybridization assay processing. Photo by Grace Beahm Alford.

Shedding Light on Oceans

Another way to track fish species is to watch them under water. Technology has made that much easier in recent years.

Lance Horn started working at the University of North Carolina Wilmington three decades ago on a ship equipped as a surface support system for undersea divers. A year later, the school sold the ship, bought a Remotely Operated Vehicle (ROV), and a high-tech niche was born.

“We saw right away that not many researchers wanted to get in diving gear, and they could only go 300 feet down,” Horn says. “They could explore only a limited amount of the water column.”

ROVs, directed by trained pilots at the ocean surface, allow exploration deeper and farther from the base ship. Short dives were replaced by trips limited only by the endurance of an operator on board the base ship.

“We were out there seeing things people had never seen before,” Horn says.

As much as those early vehicles advanced the scope of research, however, they had limitations. The support lines attaching the vehicles to the base ship were copper-based and bulky, and the charts marking the location of the ships were on paper. The introduction of fiber-optic cables and GPS in the 1990s broke those shackles.

The university later acquired an Autonomous Underwater Vehicle (AUV), which explores based on a programmed set of directions and isn’t connected physically to the launch ship.

Then along came less-expensive multibeam sonar devices. Sonar measures distances based on the time required for sound waves to echo back from a surface. The process is similar to LiDAR and GPS, but sonar is more effective under water because sound travels through water far better than laser light or microwaves do.

Standard sonar calculates the time and distance of sound bouncing off a focused area. Multibeam sonar incorporates additional directional information to measure a wide swath rather than a single point. While the concept of multibeam sonar has been around since the 1960s, the expense of multibeam devices limited its use to military and large-scale research vessels for decades. When costs began to drop in the past decade, multibeam sonar became affordable for researchers, including Horn.
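
The echo-timing arithmetic works like LiDAR’s, only with the speed of sound in seawater (roughly 1,500 meters per second) in place of the speed of light, and the width of seafloor a multibeam ping covers grows with depth and the swath angle. The sketch below uses assumed round numbers for both and is an illustration, not a model of any particular sonar.

```python
import math

# Illustrative sonar arithmetic (assumed values, not any specific instrument).
SOUND_SPEED_SEAWATER_M_PER_S = 1500.0  # roughly; varies with temperature and salinity

def echo_depth_m(round_trip_seconds: float) -> float:
    """Depth below the transducer, from the echo's round-trip time."""
    return SOUND_SPEED_SEAWATER_M_PER_S * round_trip_seconds / 2.0

def multibeam_swath_width_m(depth_m: float, swath_angle_deg: float = 120.0) -> float:
    """Width of seafloor covered by one multibeam ping at a given depth."""
    return 2.0 * depth_m * math.tan(math.radians(swath_angle_deg / 2.0))

depth = echo_depth_m(0.2)  # a 0.2-second echo corresponds to ~150 m of water
print(f"depth: {depth:.0f} m, swath covered: {multibeam_swath_width_m(depth):.0f} m")
```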

“At that point, we knew exactly where the ship was,” Horn says. “We used to say, ‘Let’s go west and hope we hit a feature.’ Now we were able to go right to it, and the science could be more focused on the habitat.”

Today, the university’s Undersea Vehicles Program is the go-to provider for researchers who want to explore under the surface off the East Coast. The vehicles are available only for scientific research. Once supported by grants as a NOAA National Undersea Research Center, the program now stands on its own. Researchers pay for equipment and services, from a basic price of $1,100 for a day up to a full seven-day package costing nearly $19,000.

CH-SPRING-2017-UNC-Wilmington

Diving Deep. This SubAtlantic Mohawk 18 is a workhorse of the University of North Carolina Wilmington’s Undersea Vehicles Program, which utilizes remotely operated vehicles and autonomous underwater vehicles to take research cameras and monitoring equipment to the ocean depths. Photo by University of North Carolina Wilmington.

The program’s vehicles were instrumental in the mapping of Marine Protected Areas in the Southeast and Gulf of Mexico, helping determine the ideal boundaries for these underwater environments designated for managing overfished species. “And now we can go back to see if they are working,” Horn says.

The technological advances make it easy to take nine computers out on the ship and guide an ROV to shoot high-definition video in exactly the same water column where grouper or snapper were counted a year or two earlier.

Students Prep to Take Technology Forward

The University of North Carolina Wilmington program also takes on a science-education outreach role. Several times in recent years, it has donated time on its vehicles for College of Charleston students. Those undergraduates use the ROV to “ground truth” maps they already have created.

Leslie Sautter, an associate professor in the Department of Geology and Environmental Geosciences at College of Charleston, started the Benthic Acoustic Mapping and Survey (BEAMS) Program in 2007. Sautter’s background was in micropaleontology, but she recognized the expanded use of multibeam sonar was opening undersea mapping as a new field of research and employment for her students.

About 140 students have come through the program in the past 10 years, taking courses in marine geology and seafloor mapping, then participating in related research projects. Much of the work involves learning the ins and outs of complex mapping software, but the students also have taken survey trips on NOAA ships and on a research vessel out of the University of Georgia’s Skidaway Institute of Oceanography.

In addition to using sonar to measure ocean depths, the sophisticated technology can determine the character of the seafloor—rocky vs. sandy, for instance—based on the strength of the returning signal in a process called backscatter.
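
As a toy illustration of the backscatter idea, harder bottoms generally return a stronger signal than soft sand or mud, so a crude classifier could bin returns by intensity. The thresholds and decibel values below are invented for the example; real seafloor characterization relies on calibrated data and far more sophisticated analysis.

```python
# Toy backscatter classification (invented thresholds, illustrative only).
# Stronger returns (less negative dB) suggest a harder seafloor.

def classify_backscatter(intensity_db: float) -> str:
    if intensity_db > -20:
        return "rock / hard bottom"
    elif intensity_db > -30:
        return "gravel or coarse sand"
    else:
        return "fine sand or mud"

# Hypothetical per-ping intensities from a survey line, in decibels.
for ping_db in (-15.0, -24.0, -37.0):
    print(f"{ping_db:6.1f} dB -> {classify_backscatter(ping_db)}")
```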

“Twenty years ago, what we’re doing would have been very difficult,” Sautter says. “It used to be more like mapping with a Fish Finder.”

Today, multibeam sonar allows the mapping of complete areas of the seafloor to create detailed 3-D visualizations.

“What we’re doing now is as exploratory as mapping Mars,” Sautter says. “But we can’t see the seafloor the way we can see the surface of Mars. With sonar technology, we’re finding new seafloor features all the time.”

Of course, it’s not just gee-whiz science. The students’ work has major applications in fishery management—many fish and deep-coral ecosystems thrive on harder seafloor surfaces. Better maps also can identify sand resources for beach renourishment, locate suitable fiber-optic cable routes, aid in determining the best wind farm sites, and update navigation charts.

“Every student who has come through the program and has sought a job in this industry has gotten one,” Sautter says. “BEAMS alums are all over the place and the program is recognized internationally.”

Current student Ryan Hawsey, who just finished his junior year, landed a prestigious summer internship at Woods Hole Oceanographic Institution in Massachusetts. He arrived at College of Charleston as a marine biology major, but a geology class and a meeting with Sautter convinced him to change majors.

A student pulls in a sampling device aboard a research vessel.

Double-Checking Data. College of Charleston student Ryan Hawsey, working on R/V Savannah out of the Skidaway Institute of Oceanography, pulls in a sediment grab sampler used to ground truth the results of a multibeam-sonar scan of the ocean floor. A multibeam scan can reveal the character—whether hard or soft—of a large area. Photo by Leslie Sautter, College of Charleston, BEAMS Program.

“I joke with my friends that I became a geology major to go hiking and pick up rocks, but now all I do is sit in front of a computer,” Hawsey says. He has, however, also sailed on three research vessels for more than 20 days at sea, including two weeks mapping off Ireland’s coast.

A member of a generation raised on computers, Hawsey’s comfortable dealing with complicated mapping software. He could see himself working on a survey ship for NOAA or ACE, or doing academic research. “There are just so many directions you can go with these skills,” he says.

He envisions a future when a base ship will launch 10 AUVs or ROVs with their own multibeam scanners. They will be programmed for 10 different paths. When they return to the ship after a day, they will have done the equivalent of 10 days of mapping as currently practiced.

And that might be in just a few years. The pace of technology—whether related to mapping, underwater soundscapes, fishery management, or weather-data analysis—continues to amaze.

“It seems like when a new technology comes out, we say that’s the best we’ll ever get,” says Hawsey, whose perception belies his youth. “And then something is developed that exceeds current capabilities.”

Current students like those in Sautter’s program will be the ones charting the future, and Sautter can’t wait to see where they go.

“It’s truly unimaginable where this technology is heading,” Sautter says. “It’s for today’s dreamers—and BEAMers —to figure out.”

Future Dilemma: Extracting Value from Expanding Data

Jyotika Virmani encourages those dreamers as senior director at XPRIZE, a California-based non-profit that designs and manages competitions to come up with technological breakthroughs. The $7 million Shell Ocean Discovery XPRIZE, for instance, is designed to spark deep-sea exploration. Many entries involve expansion in the capabilities of autonomous vehicles paired with miniaturization, Virmani says.

AUVs with energy-saving designs and stronger power supplies could launch from shore, thus reducing the hassle and expense of going out on a ship. Drones that operate as well in the air as under water could be even more efficient. Underwater robots could sniff out sulfur typical of biological deep-sea hotspots.

Considering the warp-speed advances in drone technology in the past few years, Virmani fully expects autonomous vehicles envisioned by XPRIZE contestants to be operational soon. And as with most technology, the capabilities will expand as equipment costs drop.

“Technology frees up research to look at other questions,” Virmani says. “Instead of spending money to gather the data, we can say ‘Here’s all the data.’ Then we can ask questions.”

With the rise of big data, machines will do some of that work, too. What’s referred to as Artificial Intelligence (AI) involves creating algorithms that use the data to answer specific research questions. AI will be the key to learning from the vast accumulation of data from satellites, sensors, and autonomous vehicles throughout the ocean, Virmani says.

Oceanographers often say we know less about the ocean depths than we know about the surface of Mars. That is quickly changing thanks to emerging technology. Delaney’s “A 2020 Vision for Ocean Science” was written in 2009, at the beginning of the effort to build the Cabled Axial Seamount Array, a system of fiber-optic cables, cameras, and sensors covering nearly 550 miles of the Pacific floor off the U.S. coast.

Underwater shot of an array.

Volcano Watch. The Cabled Axial Seamount Array uses high-resolution electro-optic cables to provide power and internet connections to a camera in a caldera on the floor of the Pacific Ocean, thus allowing the first real-time video of an eruption of a volcano at such depths. Photo courtesy of National Science Foundation, Ocean Observatories Initiative.

The design of the cabled array depended on emerging technology—high-resolution electro-optic cables—to provide power to distant cameras and sensors without use of batteries. The cables also needed to withstand the harsh ocean environment while transmitting massive amounts of data back to be shared on the internet.

The technology worked so well that the cabled array’s cameras in April 2015 captured the eruption of the Axial Seamount, 298 miles off the coast of Oregon. The images shed new light on how underwater volcanoes reshape the seafloor.

“The most exciting thing so far has been the eruption, but it isn’t just about the eruption,” Delaney told a student newspaper reporter at the University of Washington early this year. “It’s about the fact that because of this system, human beings actually have a presence out in the environment that we’ve never had before.”

Real-time video of a deep-sea volcano eruption was the realm of science fiction 50 years ago, and not much more than a dream of researchers 20 years ago. Technology made it come true.

Sidebar

A Brief History of Data Storage

Information is the backbone of research and science. Thus, the exponential expansion of data-storage capacity fueled advancements in those fields in the past 30 years.

Data storage moved forward at a snail’s pace from the invention of the printing press by Johannes Gutenberg in the 1400s until the 20th century. With the advent of magnetic tape storage and the hard disk drive for storing digital information in the 1950s, however, the capacity to store and process data exploded.

The pace of expansion in the past 30 years has been extraordinary.

  • In 1986, researchers stored kilobytes of data on floppy disks. About two kilobytes equal a page of type.
  • In 1993, CD-ROMs were the new storage platform of choice, holding megabytes, each about 1,000 kilobytes.
  • In 2000, storage began to move to disk arrays that could handle data in terabytes, each about one million megabytes.
  • By 2007, cloud storage on the internet meant researchers could store data in terms of petabytes, each about 1,000 terabytes.
Floppy disks.

When trying to wrap your brain around these numbers, consider that if one petabyte of data were translated to text on paper, it would fill about 20 million standard four-drawer filing cabinets.

Martin Hilbert and Priscilla Lopez tracked the world’s capacity to store data in a research paper that appeared in the April 1, 2011 issue of the journal Science. They estimated the amount of information stored on major analog (including books) and digital platforms worldwide in 1986 was 2.6 exabytes. An exabyte is 1,000 petabytes. Worldwide storage went up to 15.8 exabytes during the CD-ROM era, to 54.8 exabytes during the disk array era, and to 295 exabytes in 2007 as cloud storage took over.
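
The sidebar’s unit jumps are easier to follow as powers of 1,000. The snippet below walks up that ladder and roughly reproduces the filing-cabinet comparison, assuming about two kilobytes per page (as above) and on the order of 25,000 pages per four-drawer cabinet; both figures are approximations.

```python
# Walking up the data-unit ladder (decimal units, 1,000-to-1 steps).
UNITS = ["kilobyte", "megabyte", "gigabyte", "terabyte", "petabyte", "exabyte"]
for i, unit in enumerate(UNITS):
    print(f"1 {unit} = 10^{3 * (i + 1)} bytes")

# Rough reproduction of the sidebar's comparisons (approximate assumptions).
BYTES_PER_PAGE = 2_000        # ~2 KB per page of type, per the sidebar
PAGES_PER_CABINET = 25_000    # assumed capacity of a four-drawer filing cabinet
petabyte = 10 ** 15
pages = petabyte / BYTES_PER_PAGE
print(f"1 petabyte ~ {pages:,.0f} pages ~ {pages / PAGES_PER_CABINET:,.0f} filing cabinets")

# Growth in worldwide storage reported by Hilbert and Lopez, 1986 vs. 2007:
print(f"295 exabytes / 2.6 exabytes ~ {295 / 2.6:.0f}-fold increase")
```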

In 2016, the National Aeronautics and Space Administration Earth Exchange collaboration encouraged research projects that required processing of petabytes of data. Projects include creating a tree-cover map of the United States at one-meter resolution and utilizing more than 65 years of data to create climate change projections.

Reading and Websites

Benthic Acoustic Mapping and Survey Program, College of Charleston.

Delaney, John, and Roger Barga. “A 2020 Vision for Ocean Science,” an essay in The Fourth Paradigm: Data-Intensive Scientific Discovery, Microsoft Research, 2009.

Lapine, Lewis A., and Matthew J. Wellslager. “GPS + GLONASS for Precision: South Carolina’s GNSS Virtual Reference Network,” InsideGNSS, July/August 2007.

Mayer-Schonberger, Viktor, and Kenneth Cukier. Big Data: A Revolution That Will Change How We Live, Work and Think. Eamon Dolan/Houghton Mifflin Harcourt. 2014.

Mortensen, Rebecca A., Stephen A. Arnott, William J. Jones, and Dianne I. Greenfield. “Development of a sandwich hybridization assay for the identification and quantification of red drum (Sciaenops ocellatus) eggs: a novel tool for fishery research and management,” Canadian Journal of Fisheries and Aquatic Sciences, March 3, 2015.

National Oceanic and Atmospheric Administration National Water Model.

National Science Foundation Ocean Observatories Initiative, Cabled Array projects.

Shell Ocean Discovery XPRIZE.

SmartState Center for Environmental Nanoscience and Risk, University of South Carolina.

University of North Carolina Wilmington Undersea Vehicles Program.

U.S. Army Corps of Engineers news release on the RAMbLr.