As climate change increasingly becomes a threat to global coastlines, figuring out how, where and when waves tear up shores will be even more important.

THE GIST

Stereo vision could solve the problem of remotely monitoring surf conditions.

The trick is to get the computers behind the cameras to match patterns and make a 3D image, like the human brain does.

The technology grows more important as climate change drives larger storms and rowdier surf.

A remote-controlled stereo vision system now under development could revolutionize the science of watching how, where and when waves tear up coastlines, say researchers.

Storm swells that pound at coastlines -- thrilling surfers and worrying coastal engineers -- have always been hard to monitor. Wave-measuring buoys don't work well in the surf zone, and like other methods they only provide data for one small point of surf rather than a beach-wide, broader-scale picture of the event.

Radar systems, for their part, provide nice wave speed information -- like a police officer's radar gun -- but are not very good at measuring wave heights.

By linking two visible-light cameras together, however, and processing the images much as the human brain does with the views from two eyes, far more wave data can be collected without even getting a toe wet. The potential applications of the technique are growing more important as climate change drives larger surf that threatens more coastlines worldwide.

"It's like two eyes," explained researcher David Hill of Oregon State University. "It doesn't just see the waves and see them moving, but how high they are."

Hill worked with Dutch researchers on a stereo vision system that is beginning to yield specific wave height information for a swath of surf that is more on the scale of a recreational beach. Their research results will be published in the March 2011 issue of the journal Coastal Engineering.

"As you get closer to shore, there's a lot more variability" caused by shoaling waters, refraction of waves and other shoreline effects, Hill explained to Discovery News. There have been other attempts to use stereo vision to measure waves, he said, but not over a realistic and useful area of surf.

Their latest experiment involved two off-the-shelf digital cameras on a pier on the Dutch coast. The data was processed by an ordinary desktop computer.

The data processing is the hard part, he said, because they had to come up with an algorithm for the computer to recognize and match up waves seen from two cameras in different positions.

"Your eyes match features, then the brain does the reconstruction," said Hill. "The main bit of work is to come up with ways for the computer to match the images."
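The study's own matching algorithm isn't described in detail here, but the geometry that follows the matching step can be sketched in a few lines. Once a feature -- say, a wave crest -- has been located in both images, its distance from the cameras follows from the disparity, the horizontal pixel offset between the two views. The camera numbers below are purely illustrative, not values from the Dutch experiment.

```python
# Minimal sketch of stereo triangulation for a rectified camera pair:
# after features are matched between the two images, depth comes
# straight from the disparity. All parameters here are hypothetical.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Return the depth (meters) of a matched feature.

    focal_px     -- camera focal length, in pixels
    baseline_m   -- separation between the two cameras, in meters
    disparity_px -- pixel offset of the feature between the two images
    """
    if disparity_px <= 0:
        raise ValueError("a matched feature must have positive disparity")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 1200 px focal length, cameras 2.5 m apart,
# a wave crest shifted 30 px between the two views.
print(depth_from_disparity(1200, 2.5, 30))  # 100.0 m to the crest
```

Repeating this for every matched point across the two images yields the 3D surface of the sea -- and hence wave heights -- which is exactly the step that a single camera, or a radar gun measuring only speed, cannot provide.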

Of course, this is not the first attempt to make artificial stereo vision. Robotics researchers, for instance, have worked on this problem.

"For robotics it's quite close," said oceanographer Robert Holman, also of Oregon State University but not involved with the Dutch study. Robots normally work with objects at close range, Holman explained, and their stereo vision has no great need to gather accurate distance and height data.

"If you think of oceanographic problems, the surf zone is one of the toughest," Holman said. There's not only huge spatial variation, but rowdy waves, changing sea bottoms and other hazards. "It's hard to measure that with traditional instruments. So remote sensing is a natural match for that."