
SC Sea Grant Consortium
287 Meeting Street
Charleston, SC 29401
p: 843.953.2078
f: 843.953.2080
Coastal Heritage – Spring 2017
New Technology: Driving Advances in Coastal Science
VOLUME 30, NUMBER 2, SPRING 2017               

Coastal Heritage is a quarterly publication of the S.C. Sea Grant Consortium, a science-based state agency supporting research, education, and outreach to conserve coastal resources and enhance economic opportunity for the people of South Carolina.

Subscriptions are free upon request. To subscribe, or to comment on this or future issues of Coastal Heritage, contact:

Executive Director: M. Richard DeVoe
Director of Communications: Susan Ferris Hill
Editor: Joey Holleman
Art Director: Pam Hesse Graphic Design



New Technology:
Driving Advances in Coastal Science

By Joey Holleman


For decades, the standard river flood forecasts in South Carolina were based primarily on rainfall totals from several dozen weather stations and stream flow data from gauges on the largest rivers in the state.

Now powerful supercomputers crunch terabytes of data to provide near real-time forecasts for every lake, pond, river, and stream as water flows from the mountains to the sea.

After Hurricane Hugo in 1989, survey crews trudged with tripods across the state’s beaches for weeks to determine how much sand had been lost to the powerful storm’s surge. After Hurricane Matthew in 2016, more precise measurements were available in days thanks to an all-terrain vehicle equipped with laser-based technology.

Coastal science is constantly evolving; moving the field forward is the goal of research. In the past 25 years, technology has accelerated extraordinary advances in how scientists record, measure, and process information, and thus has revolutionized research.

Many of these recent advances trace their roots to major breakthroughs of the 20th century: the development of electronic computers, the first great number crunchers, in the 1940s; the decoding of the structure and synthesis of the genetic building blocks deoxyribonucleic acid (DNA) and ribonucleic acid (RNA), and the launch of Sputnik and other early satellites, in the 1950s; and the discovery in the 1960s of a process for amplifying light to create a laser.

The trunks of research from those roots grew rapidly and relatively straight for decades. Then, about a generation ago, the branches began proliferating.

Those first computers led to more powerful models that sparked today’s big data revolution. Sputnik fueled the space race that made possible mapping breakthroughs with the Global Positioning System (GPS). Understanding the structure of DNA and RNA sparked major advances in genetics and genomics. The laser begat Light Detection and Ranging (LiDAR), a process that has revolutionized precise spatial measurement.

Today, computers crunch massive amounts of data for daily weather forecasts, autonomous vehicles utilize GPS to map the flood potential of individual residential lots or the composition of the deep-sea floor, and specially programmed robotic devices can quickly identify species of fish eggs through genetics.

The breakneck speed of new technological advances isn’t about to slow down. John R. Delaney, a professor of oceanography at the University of Washington, addressed the potential as co-author with Roger S. Barga of a 2009 essay entitled “A 2020 Vision for Ocean Science.”

“This new era will draw deeply on the emergence, and convergence, of many rapidly evolving new technologies,” they wrote. These changes are setting the scene for what Marcel Proust called “the real voyage of discovery, [which] lies not in seeking new landscapes, but in having new eyes.”

More information expands horizons. The vast majority of new technologies have some connection to advancements in “big data,” the collection, storage, and processing of massive amounts of information. For generations, printed journals and books in libraries were the storage venue for data gathered by researchers. Capacity was limited by square footage.

Then extraordinary breakthroughs in computer storage from the 1980s through the 2000s took us from data stored in kilobytes (2 KB equals about one page of type) on floppy disks to petabytes (1 PB equals 1,000 copies of the 32-volume Encyclopedia Britannica) stored across the cloud that is the internet. Equally rapid development of processing capacity allowed computer analysis of these larger datasets.
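A few lines of Python give a feel for the storage scales described above. The figures are illustrative only: the two-kilobytes-per-page rule of thumb comes from the article, while the floppy-disk capacity and decimal (SI) byte prefixes are assumptions of this sketch.

```python
# Rough sense of the storage scales mentioned above.
# Assumptions (illustrative): one typed page is ~2 KB of plain text;
# decimal prefixes, so 1 PB = 1e15 bytes.

BYTES_PER_PAGE = 2_000  # ~2 KB per page of type

def pages_in(n_bytes: float) -> float:
    """How many typed pages fit in n_bytes of storage."""
    return n_bytes / BYTES_PER_PAGE

floppy = 1.44e6   # a common 1.44 MB floppy disk
petabyte = 1e15

print(f"floppy disk:  {pages_in(floppy):,.0f} pages")
print(f"one petabyte: {pages_in(petabyte):,.0f} pages")
```

The jump from a floppy's few hundred pages to a petabyte's hundreds of billions is the revolution the paragraph describes.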

Viktor Mayer-Schönberger details the process in Big Data: A Revolution That Will Transform How We Live, Work, and Think, the 2013 book he wrote with Kenneth Cukier.

“Just as the telescope enabled us to comprehend the universe and the microscope allowed us to understand germs,” Mayer-Schönberger writes, “new techniques for collecting and analyzing huge bodies of data will help us make sense of our world in ways we are just starting to appreciate.”

Weather forecasting is among the most obvious of the scientific advances sparked by big data collection and processing. Nearly from the dawn of the supercomputer, the National Oceanic and Atmospheric Administration (NOAA) has run computer models to predict how hurricanes build and move over oceans. Those models become more accurate each year with the growth of computer storage and processing.

Unlike hurricanes or even general atmospheric forecasting, however, river flooding forecasts were a low priority on overcommitted NOAA supercomputers for years, says Ed Clark, director of the National Water Center in Tuscaloosa, Alabama.

“The serendipitous moment came in 2014 as more supercomputing capability became available and we no longer had performance constraints,” Clark says. “That began a new genre of hydrology science.”

The National Water Center team in 2000 began developing a national hydrology dataset and models that included information from 7.5 million miles of streams and rivers and 6.5 million lakes and ponds. To create a true water model, rainfall totals have to be merged with land-surface models, soil-moisture readings, evapotranspiration rates, and plant-canopy information in each of those watersheds.

But gathering the data is only half of the equation. It needed to be put to use. “And when you have these large datasets, the human mind can’t really integrate it in a meaningful way,” Clark says.

NOAA’s computer capacity upgrade in 2014 allowed the team to make the most of its big data, about 4 terabytes gathered daily. (If a smart phone had 4 terabytes of memory, it could hold about two million photos.)

The National Water Model was still in its test phase when Hurricane Matthew hit the Atlantic coast in October 2016, but it accurately predicted the river flooding that occurred in coastal South Carolina in the weeks after the storm. Models such as these will be critical in areas like Georgetown County, where a lack of long-term data-collection gauges on rivers frustrated emergency personnel during the October 2015 flood.

Next up for Clark and his team is fine-tuning the basic model while adding details such as what type of runoff to expect downstream based on whether upstream rain falls on farms, forests, or urban areas.

“We turn on the TV today to get a weather forecast,” Clark says. “In five years, I want people to be thinking about a water forecast and what it means to their day-to-day lives.”

High-tech utility vehicle

Sometimes, even new technology can require a boost from a low-tech household item to serve a specific real-life use. The RAMbLr, a device created by the Charleston District of the U.S. Army Corps of Engineers (ACE), is a prime example.

In the Army tradition of acronyms, RAMbLr stands for Rapid Assessment Mobile LiDAR. It’s a standard all-terrain vehicle with a Dynascan® mobile LiDAR mapping system mounted atop a custom-built superstructure.

The Dynascan® unit, acquired in 2013, consists of a laser scanner that spins 360 degrees and collects data along with two GPS antennas. It’s high tech. The superstructure, however, is remarkably utilitarian.

The antennas need to be separated to provide the proper orientation. So the RAMbLr team attached a standard 8-foot aluminum ladder to the top of the all-terrain vehicle’s superstructure, with the main scanner and GPS antenna on one end of the ladder and another GPS antenna perched on the other end.

In the LiDAR process, a laser is fired at an object. The time it takes for the light to reflect back is used to measure the precise distance between the two points. The key to using LiDAR in mapping, or to measure volume of a beach’s sand, is basing the measurements on at least three known points derived from GPS satellites. Using more satellites improves the accuracy even more.
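The time-of-flight ranging at the heart of LiDAR reduces to one line of arithmetic: distance is the speed of light times the round-trip time, halved. A minimal sketch, with the example pulse time chosen for illustration:

```python
# Time-of-flight ranging as described above: fire a laser pulse,
# time the round trip of the reflection, halve it for one-way distance.
C = 299_792_458.0  # speed of light (m/s); vacuum value, close enough in air

def lidar_range_m(round_trip_s: float) -> float:
    """One-way distance to the reflecting surface, in meters."""
    return C * round_trip_s / 2.0

# A return arriving ~667 nanoseconds after the pulse means the target
# is roughly 100 meters away.
print(round(lidar_range_m(667e-9), 1))
```

The same halved-round-trip logic underlies GPS ranging (with microwaves) and sonar (with sound), which is why the article groups the three together.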

The U.S. Department of Defense began placing navigation satellites in set orbits in 1978 and by 1995 had a full array of 24 spaced out to cover the entire planet. Microwave signals sent from those satellites to receivers determine the location of the receiver. The full capacity of the GPS system at first was available only to the United States military. When that capacity was opened to everyone in 2000, the field of mapping science and research exploded. The S.C. Sea Grant Consortium and U.S. Geological Survey researchers used the technology in a 2006 study of erosion of state beaches.

The RAMbLr was built in 2013 to improve the accuracy of topographic data at ACE project sites along the coast. First used to measure dredging disposal sites, the vehicle allowed for remarkably quick surveys of the damage to beach-renourishment project sites at Folly Beach and Myrtle Beach after Hurricane Matthew.

The RAMbLr team surveyed three miles of Folly dunes in less than a day, a process that took weeks of shooting individual elevations using tripods after Hurricane Hugo in 1989. They did 30 miles of Horry County’s beaches in two nights after Matthew. (The ACE later mapped the damage from Miami to Ocean City, Maryland, with 76 flights in LiDAR-equipped aircraft over 36 days.)

As fun as the RAMbLr is, the real star of the process in South Carolina is a decade-old project of the S.C. Geodetic Survey. The S.C. Virtual Reference Station Network set up 45 satellite receivers statewide and connected them to computer servers in Columbia. Combining that system with existing GPS satellites as well as a Russian satellite system cut the margin of error of satellite measurements from a few meters (like you might get now with the mapping app on your smartphone) to a few centimeters.

The Virtual Reference Station Network was designed to aid road construction crews, but it has been a godsend for any number of users—including the local ACE office. It’s critical for LiDAR on land and sonar used for accurately dredging shipping channels under the water.

“The Network allowed us to cut the cord on the prior type of survey methods we employed,” says Matt Foss, chief of survey for the Charleston office. “That was a game-changer in terms of increasing the accuracy and reducing time and manpower required to collect our surveys.”

Research eyes in the sky

Another major advance soon could provide extremely focused data for flood forecasts. Norm Levine, an associate professor at the College of Charleston and director of the Lowcountry Hazards Center, is using the latest in drone-based image technology to measure the city’s peninsula in such fine detail that undulations in the green space in Marion Square show up on an elevation map.

Such extremely accurate maps can indicate where slight changes in elevation or the amount of impervious surface can influence the volume of water flowing into local streams or stormwater systems. Those details are extremely important in a flood-prone city like Charleston, and unmanned aerial vehicles, or drones, are the best way to get them.

“Drones allow us to look at hard-to-see and hard-to-understand features,” Levine says. Advances in drone technology “have changed everything about how we think about mapping.”

Drones have been around for more than a century, from the first balloon flights to the earliest unmanned planes. Military uses dominated, however, until the early 2000s. Then a new wave of less expensive drones took advantage of advances in miniaturization that allowed for lighter motors and better cameras. Top-of-the-line research drones capable of precise programmed paths and high-resolution photography still cost six figures, but less specialized drones range from $30 to $500. As a result, drone use exploded, for hobbyists as well as researchers.

In 2010, the Federal Aviation Administration (FAA) estimated there would be 15,000 drones in use by civilians in the United States by 2020. A 2016 FAA report updated the estimate to 543,500 civilian drones by 2020. Most of those are flown by individuals as a hobby. The ones used for research face extra requirements for pilot certification, rules that have changed several times in recent years.

Despite the challenge of keeping up with new regulations, drone research has expanded quickly. Drones are used by agricultural researchers to track crop productivity, by wildlife biologists to count endangered species in hard-to-reach places, and by geographers in all types of mapping. College of Charleston drones—the school has five—have been used to locate unmarked graves in a cemetery and, this summer, will help locate potential dinosaur fossils on a huge tract the university leases in Wyoming.

In mapping, precision is what makes drones so important. For centuries before aerial imagery, the standard for quality maps was one inch on the map equaled one mile, or a ratio of 1 to 63,360. Satellite imagery allowed that ratio to drop to 1 to 1,200. By combining high-resolution photography with GPS location, high-end drones today can cut the ratio almost to 1 to 10.
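The scale ratios above have a simple reading: a 1:N map means one unit on the map covers N of the same units on the ground. A short sketch (the function name and the feet-per-inch framing are choices of this example, not from the article):

```python
# Map-scale arithmetic: at a scale of 1:N, one inch of map represents
# N inches of ground. Dividing by 12 converts those inches to feet.

def ground_feet_per_map_inch(scale_denominator: int) -> float:
    """Ground distance, in feet, represented by one inch of map."""
    return scale_denominator / 12.0

for name, n in [("classic one-inch-to-a-mile", 63_360),
                ("satellite-era", 1_200),
                ("high-end drone", 10)]:
    print(f"1:{n} ({name}): 1 inch = {ground_feet_per_map_inch(n):,.2f} ft")
```

The 1:63,360 case works out to exactly 5,280 feet, confirming the one-inch-equals-one-mile rule; at 1:10, an inch of imagery covers less than a foot of ground, which is why a worn footpath becomes visible.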

With basic satellite imagery, the dirt walkways in Charleston’s Marion Square would have blended in with the grass. When the resolution gets down to 1 to 10, the walkways worn down a few inches through the years show up as a different elevation from the grass. Maps drawn using drones help visualize small changes in elevation that can have large impacts on the flow of rainfall runoff.

“The last 10 years have seen exponential advances in spatial resolution,” Levine says.

He envisions similar advancements in spectral resolution, which picks up variations in wavelengths of light reflected from various surfaces and allows the identification of individual species of plants or deposits of minerals in the soil. Improvements in spectral resolution have lagged behind spatial resolution, but they seem inevitable considering the relentless nature of technology.

Tiny technology, HUGE changes

In terms of technological breakthroughs, nanotechnology is a newcomer. While discussed as a concept for centuries, the field only began revving up in the 1980s when developments in electron microscopy allowed the engineering of particles at the molecular level.

Nanotechnology took off in the 1990s, quickly becoming integral in creating lighter-weight products, more efficient sunscreen, and smaller, faster computer processors. A nanometer is one-billionth of a meter, and “nanoparticle” generally refers to something measuring from one to 100 nanometers, or about 100,000 times smaller than the width of a human hair. In sunscreen, for instance, the chemical compound titanium dioxide reflects and absorbs ultraviolet light, protecting the skin from the sun’s harmful rays. When titanium dioxide is broken down into particles 25 to 50 nanometers in size, the sunscreen is transparent on the skin and can be more effective than sunscreens with larger reflective particles.

But those nanoparticles, and others used in soaps, shampoos, and hundreds of other consumer products, rinse off and make their way into the soil and waterways, where they end up in the food chain. The field of nanotechnology has spawned a parallel field of research into the safe use of nanoparticles; thus the dual purpose in the title of the University of South Carolina’s Center for Environmental Nanoscience and Risk (CENR).

Jamie Lead, director of CENR, says measuring the risk nanoparticles pose and developing safer alternatives are tremendous challenges. As a relatively new technology, manufactured nanoparticles are at low levels in the environment now, “but they are growing at such a rate that, in the future, they could be a significant problem. Even now in some cases, there is a potential risk to the environment.”

Researchers at CENR are examining how nanoparticles behave in seawater and other parts of the environment, how they end up in organisms such as oysters, and at what rate they accumulate in the food chain. Once in the food chain, they could affect human health. The key to sustainable and successful use of nanotechnology is to determine whether the benefits outweigh the risks, and the benefits are potentially massive.

For instance, CENR researchers are working on two nanoparticle-driven processes that could revolutionize oil-spill cleanups. They both involve magnetic nanoparticles which are coated in specific polymers. Oil in polluted water attaches to the polymer when the nanoparticles are put into the water-oil mixture. Then magnets are used to separate the nanoparticles and oil, leaving clean water to return to the ocean. Early tests of the process show great promise, Lead says.

CENR researchers also are working with nanoparticles that reduce the toxicity of oil while serving as nutrients to bacteria that naturally degrade oil in salt water. These can be applied at the site of a spill to stimulate natural degradation processes.

Both processes would require the production of large amounts of specialized nanoparticles, which is likely to be less expensive and more effective than conventional methods for cleaning oil spills. As the nanoparticle processes are refined, Lead predicts they will become the conventional method for cleaning spills. Before that happens, though, scientists will want a clear understanding of the residual impact of introducing those nanoparticles into the ocean environment.

“That’s the underlying nature of our challenge,” Lead says. “We try to understand better which particles are most effective and cause the fewest problems.”

Genetics, robotics, and species identification

New technology and microscopic particles also are the basis for Dianne Greenfield’s research at the Hollings Marine Laboratory in Charleston. A research associate professor with the University of South Carolina’s Belle W. Baruch Institute of Marine and Coastal Sciences and director of the S.C. Department of Natural Resources’ Algal Ecology Laboratory, Greenfield has been working for years with a process to rapidly identify and quantify microscopic organisms using their ribosomal RNA (rRNA). The process, sandwich hybridization assay (SHA), traces its roots to the original identification and understanding of genetic material in the 1950s.

The SHA process features two molecular probes designed to bind with a species’ rRNA in a homogenized sample placed between them, creating an rRNA “sandwich.” The sandwiched rRNA, along with a series of enzymes, creates a reaction that is processed using a programmed robotic device about the size of a small toaster oven. The device exposes the sample to different reagents. At the end of the reaction, a change in sample color indicates whether a species is present in the sample.

“You can rapidly and very accurately quantify an organism of interest,” Greenfield says. “It’s an alternative to microscopy, which is a powerful tool but takes longer to perform and requires specialized knowledge and experience. Some different species don’t look very different under a microscope, which is a problem if you want to distinguish morphologically similar species—like eggs of different fish species or visually similar harmful algae.”

Much of Greenfield’s work has been related to marine and coastal phytoplankton, the type that might cause harmful algal blooms in local waterways, but her research team recently has devised a process to identify the eggs of red drum. Their work was detailed in the March 3, 2015 issue of Canadian Journal of Fisheries and Aquatic Sciences.

As part of that study, then-graduate student Rebecca Mortensen refined a method for breaking down the protective shell and chorion membrane surrounding the developing red drum embryo to get to the genetic material. Then the team designed the specific SHA to differentiate the genetic material in red drum eggs from those of other similar species. It was the first use of SHA to identify a vertebrate species, and it opens opportunities for other species, Greenfield says.

Devising an SHA for a species can take several months. But once that initial work is done, the identification of a species from a homogenized sample can be performed within a few hours.

Stephen Arnott, associate marine scientist with the S.C. Department of Natural Resources, says previous spawning studies have been limited because red drum eggs are difficult to differentiate from those of similar species. Tracking egg production using the SHA process could help “quantify egg production on an annual basis and help us look at long-term spawning trends.”

Shedding light on oceans

Another way to track fish species is to watch them under water. Technology has made that much easier in recent years.

Lance Horn started working at the University of North Carolina Wilmington three decades ago on a ship equipped as a surface support system for undersea divers. A year later, the school sold the ship, bought a Remotely Operated Vehicle (ROV), and a high-tech niche was born.

“We saw right away, not many researchers wanted to get in diving gear, and they could only go 300 feet down,” Horn says. “They could explore only a limited amount of the water column.”

ROVs, directed by trained pilots at the ocean surface, allow exploration deeper and farther from the base ship. Short dives were replaced by trips limited only by the endurance of the operators on board the base ship.

“We were out there seeing things people had never seen before,” Horn says.

As much as those early vehicles advanced the scope of research, however, they had limitations. The support lines attaching the vehicles to the base ship were copper-based and bulky, and the charts marking the location of the ships were on paper. The introduction of fiber-optic cables and GPS in the 1990s broke those shackles.

The university later acquired an Autonomous Underwater Vehicle (AUV), which explores based on a programmed set of directions and isn’t connected physically to the launch ship.

Then along came less-expensive multibeam sonar devices. Sonar measures distances based on the time required for sound waves to echo back from a surface. The process is similar to LiDAR and GPS ranging, but sonar is more effective under water because sound travels through water far better than laser light or microwaves do.

Standard sonar calculates the time and distance of sound bouncing off a focused area. Multibeam sonar incorporates additional directional information to measure a wide swath rather than a single point. While the concept of multibeam sonar has been around since the 1960s, the expense of multibeam devices limited its use to military and large-scale research vessels for decades. When costs began to drop in the past decade, multibeam sonar became affordable for researchers, including Horn.
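The echo-ranging and swath arithmetic above can be sketched in a few lines. The 1,500 m/s sound speed is a standard nominal value for seawater (it varies with temperature, salinity, and depth), and the 120-degree fan angle is an assumption of this example, not a figure from the article:

```python
import math

# Echo ranging: sound travels down, bounces off the seafloor, and returns;
# depth is the sound speed times the round-trip time, halved.
SOUND_SPEED_SEAWATER = 1500.0  # m/s, nominal value

def sonar_depth_m(echo_round_trip_s: float) -> float:
    """Depth below the transducer from a round-trip echo time."""
    return SOUND_SPEED_SEAWATER * echo_round_trip_s / 2.0

def swath_width_m(depth_m: float, fan_angle_deg: float) -> float:
    """Seafloor width covered by a multibeam fan of the given total angle."""
    return 2.0 * depth_m * math.tan(math.radians(fan_angle_deg / 2.0))

depth = sonar_depth_m(0.4)  # a 0.4 s round trip means 300 m of water
print(depth, round(swath_width_m(depth, 120.0)))
```

The second function shows why multibeam matters: in 300 meters of water, a single-beam sounder measures one point, while a 120-degree fan maps a strip of seafloor more than a kilometer wide on each pass.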

“At that point, we knew exactly where the ship was,” Horn says. “We used to say, ‘Let’s go west and hope we hit a feature.’ Now we were able to go right to it, and the science could be more focused on the habitat.”

Today, the university’s Undersea Vehicles Program is the go-to provider for researchers who want to explore under the surface off the East Coast. The vehicles are available only for scientific research. Once supported by grants as a NOAA National Undersea Research Center, the program now stands on its own. Researchers pay for equipment and services, from a basic price of $1,100 for a day up to a full seven-day package costing nearly $19,000.

The program’s vehicles were instrumental in the mapping of Marine Protected Areas in the Southeast and Gulf of Mexico, helping determine the ideal boundaries for these underwater environments designated for managing overfished species. “And now we can go back to see if they are working,” Horn says.

The technological advances make it easy to take nine computers out on the ship and guide an ROV to shoot high-definition video in exactly the same water column where grouper or snapper were counted a year or two earlier.

Students prep to take technology forward

The University of North Carolina Wilmington program also takes on a science-education outreach role. Several times in recent years, it has donated time on its vehicles for College of Charleston students. Those undergraduates use the ROV to “ground truth” maps they already have created.

Leslie Sautter, an associate professor in the Department of Geology and Environmental Geosciences at College of Charleston, started the Benthic Acoustic Mapping and Survey (BEAMS) Program in 2007. Sautter’s background was in micropaleontology, but she recognized the expanded use of multibeam sonar was opening undersea mapping as a new field of research and employment for her students.

About 140 students have come through the program in the past 10 years, taking courses in marine geology and seafloor mapping, then participating in related research projects. Much of the work involves learning the ins and outs of complex mapping software, but the students also have taken survey trips on NOAA ships and on a research vessel out of the University of Georgia’s Skidaway Institute of Oceanography.

In addition to measuring ocean depths, sophisticated sonar can determine the character of the seafloor—rocky vs. sandy, for instance—based on the strength of the returning signal, in a process called backscatter.

“Twenty years ago, what we’re doing would have been very difficult,” Sautter says. “It used to be more like mapping with a Fish Finder.”

Today, multibeam sonar allows the mapping of complete areas of the seafloor to create detailed 3-D visualizations.

“What we’re doing now is as exploratory as mapping Mars,” Sautter says. “But we can’t see the seafloor the way we can see the surface of Mars. With sonar technology, we’re finding new seafloor features all the time.”

Of course, it’s not just gee-whiz science. The students’ work has major applications in fishery management—many fish and deep-coral ecosystems thrive on harder seafloor surfaces. Better maps also can identify sand resources for beach renourishment, locate suitable fiber-optic cable routes, aid in determining the best wind farm sites, and update navigation charts.

“Every student who has come through the program and has sought a job in this industry has gotten one,” Sautter says. “BEAMS alums are all over the place and the program is recognized internationally.”

Current student Ryan Hawsey, who just finished his junior year, landed a prestigious summer internship at Woods Hole Oceanographic Institution in Massachusetts. He arrived at College of Charleston as a marine biology major, but a geology class and a meeting with Sautter convinced him to change majors.

“I joke with my friends that I became a geology major to go hiking and pick up rocks, but now all I do is sit in front of a computer,” Hawsey says. He has, however, also sailed on three research vessels for more than 20 days at sea, including two weeks mapping off Ireland’s coast.

A member of a generation raised on computers, Hawsey is comfortable dealing with complicated mapping software. He could see himself working on a survey ship for NOAA or ACE, or doing academic research. “There are just so many directions you can go with these skills,” he says.

He envisions a future when a base ship will launch 10 AUVs or ROVs with their own multibeam scanners. They will be programmed for 10 different paths. When they return to the ship after a day, they will have done the equivalent of 10 days of mapping as currently practiced.

And that might be in just a few years. The pace of technology—whether related to mapping, underwater soundscapes, fishery management, or weather-data analysis—continues to amaze.

“It seems like when a new technology comes out, we say that’s the best we’ll ever get,” says Hawsey, whose perception belies his youth. “And then something is developed that exceeds current capabilities.”

Current students like those in Sautter’s program will be the ones charting the future, and Sautter can’t wait to see where they go.

“It’s truly unimaginable where this technology is heading,” Sautter says. “It’s for today’s dreamers—and BEAMers —to figure out.”

Future dilemma: Extracting value from expanding data

Jyotika Virmani encourages those dreamers as senior director at XPRIZE, a California-based non-profit that designs and manages competitions to come up with technological breakthroughs. The $7 million Shell Ocean Discovery XPRIZE, for instance, is designed to spark deep-sea exploration. Many entries involve expansion in the capabilities of autonomous vehicles paired with miniaturization, Virmani says.

AUVs with energy-saving designs and stronger power supplies could launch from shore, thus reducing the hassle and expense of going out on a ship. Drones that operate as well in the air as under water could be even more efficient. Underwater robots could sniff out sulfur typical of biological deep-sea hotspots.

Considering the warp-speed advances in drone technology in the past few years, Virmani fully expects autonomous vehicles envisioned by XPRIZE contestants to be operational soon. And as with most technology, the capabilities will expand as equipment costs drop.

“Technology frees up research to look at other questions,” Virmani says. “Instead of spending money to gather the data, we can say ‘Here’s all the data.’ Then we can ask questions.”

With the rise of big data, machines will do some of that work, too. What’s referred to as Artificial Intelligence (AI) involves creating algorithms that use the data to answer specific research questions. AI will be the key to learning from the vast accumulation of data from satellites, sensors, and autonomous vehicles throughout the ocean, Virmani says.

Oceanographers often say we know less about the ocean depths than we know about the surface of Mars. That is quickly changing thanks to emerging technology. Delaney’s “A 2020 Vision for Ocean Science” was written in 2009, at the beginning of the effort to build the Cabled Axial Seamount Array, a system of fiber-optic cables, cameras, and sensors covering nearly 550 miles of the Pacific floor off the U.S. coast.

The design of the cabled array depended on emerging technology—high-resolution electro-optic cables—to provide power to distant cameras and sensors without use of batteries. The cables also needed to withstand the harsh ocean environment while transmitting massive amounts of data back to be shared on the internet.

The technology worked so well that the cabled array’s cameras in April 2015 captured the eruption of the Axial Seamount, 298 miles off the coast of Oregon. The images shed new light on how underwater volcanoes reshape the seafloor.

“The most exciting thing so far has been the eruption, but it isn’t just about the eruption,” Delaney told a student newspaper reporter at the University of Washington early this year. “It’s about the fact that because of this system, human beings actually have a presence out in the environment that we’ve never had before.”

Real-time video of a deep-sea volcano eruption was the realm of science fiction 50 years ago, and not much more than a dream of researchers 20 years ago. Technology made it come true.




Coastal Science Serving South Carolina
Copyright © 2001-2019 South Carolina Sea Grant Consortium