
Breakthroughs: Honorable Mentions


Aerial Photography for Coastal Mapping

Since the 1800s, the Coast and Geodetic Survey (C&GS) has been mapping our coastlines.  Early on, plane table surveying was the primary method for creating maps, a technique that required surveyors to be on the ground and in the field for extended periods.  This changed in 1921, when the C&GS became the first civilian government agency to formally incorporate aerial photography into the development of its maps.  Aerial photography provided a "bird's-eye view" of our coasts and allowed the C&GS to map areas faster and at lower cost than previously possible, forever changing the way the agency created navigational maps.

Over the years, new technologies (many created by the C&GS), such as the C&GS nine-lens camera, have made aerial photography a cornerstone of the cartographic process.  Additionally, since the late 1930s, the aerial photographs that provide shoreline information for nautical charts and maps have also been used to define ownership and jurisdiction boundaries, make coastal management decisions, plan waterfront development, and locate features or obstructions to marine and air navigation.

Bilby Towers

In 1926, Chief Signalman Jasper Bilby of the Coast and Geodetic Survey (C&GS) introduced the Bilby Tower, greatly enhancing the speed and accuracy with which the C&GS conducted surveys.  Used for surveying by the C&GS (and later by the National Geodetic Survey) for almost 60 years, the Bilby Tower raised surveying equipment to a sufficient height to clear obstructions and provide the necessary clear line of sight between adjacent survey marks.  While not the first tower used for surveying, the Bilby Tower could be assembled more quickly and easily than earlier towers and was more rigid, allowing more precise survey observations.  The widespread use of Bilby Towers directly led to the faster expansion of the National Spatial Reference System.  The towers were retired in the 1980s, when the advent of global positioning system technology meant that surveyors no longer needed a (horizontal) line of sight between survey marks.

Bycatch Reduction Devices

Shrimp trawling has been identified as the largest source of human-caused sea turtle mortality in the United States, as turtles become trapped in shrimp nets and cannot escape.  To combat the incidental death of protected turtles, NOAA scientists designed and developed Turtle Excluder Devices (TEDs) in the early 1980s.  These devices allow sea turtles to be safely released without any loss of shrimp harvest.  The successful development, introduction, and enforced use of TEDs by NOAA have reduced incidental deaths of sea turtles in shrimp trawls by 97 percent while allowing the shrimp fishery to continue unfettered.

Pelagic (open ocean) longline fishing gear used to catch swordfish has also been a source of sea turtle bycatch.  Experiments on the Grand Banks (2001-2003) demonstrated that feasible modifications to this gear resulted in significant reductions in the bycatch of sea turtles while maintaining the catches of swordfish.  Using 18/0 circle hooks and mackerel bait, scientists observed reductions of 65 percent in leatherback turtle catches and 90 percent in loggerhead catches.  The circle hooks also reduced the percentage of loggerheads that swallowed the hooks (from 69 percent to 27 percent), which presumably indicates a reduction in mortality.  Tools developed during this experiment can be used to remove fishing hooks and line from the turtles that are captured, further reducing their mortality.  This technology (circle hooks and gear removal tools) is now required in both the Atlantic and Pacific U.S. pelagic longline fleets, drastically reducing mortality of protected resources, and is being exported worldwide.

CoastWatch

In 1987, a harmful algal bloom spread off the coast of North Carolina, ultimately causing $25 million in losses to fisheries and tourism in the area.  In an effort to find the source of the bloom, scientists turned to sea surface temperature data collected by NOAA's polar-orbiting satellites and were able to determine that the Gulf Stream had transported toxic algae cells from Florida into the colder coastal waters of North Carolina.  This marked the beginning of the NOAA CoastWatch Program.

Operational since 1995, CoastWatch was the first operational application of satellite data to oceanography.  Since its early days, the program has grown from providing only sea surface temperature data for the East Coast to providing a variety of data from several different platforms (e.g., satellite imagery, aircraft observations, and various sensors) covering all U.S. coastal waters.  A number of diverse projects now depend on CoastWatch data for uses such as near-real-time monitoring of water quality, tracking sediment plumes, detecting harmful algal blooms, and identifying physical features important to coastal fisheries.

Cromwell Current

The Cromwell Current (also called the Pacific Equatorial Undercurrent or Equatorial Undercurrent) is the major subsurface current in the Pacific equatorial current system.  The current transports around 30 million cubic meters of water per second, about 1,000 times the volume of the Mississippi River.  It plays a key role in determining the physics, chemistry, and biology of the equatorial Pacific.  Despite its volume and importance, the current remained undiscovered until 1952.

In 1951, scientists conducting experimental longline fishing for tunas in the equatorial Pacific noticed that when fishing gear was set deep, it drifted eastward even though the surface currents traveled westward.  Townsend Cromwell, an oceanographer at the Honolulu Laboratory of the Pacific Oceanic Fisheries Investigations (which later became part of NOAA's Southwest Fisheries Science Center), recognized that these observations were significant.  In 1952, he led an oceanographic cruise to investigate how the Pacific equatorial currents varied as a function of depth and discovered the eastward-flowing subsurface current, which would eventually be named after him.  The discovery of the Cromwell Current was fundamental to the field of oceanography and remains critical to our understanding of ocean and ecosystem dynamics in the Pacific.

Fishery Acoustic Surveys

Fishery surveys are used to estimate the total population of a fish stock, information that is critical to making fishery management decisions.  Since 1970, NOAA's Alaska Fisheries Science Center has pioneered the use of acoustics, or sound waves, to conduct these surveys.  Innovative technologies developed by the Center, such as digital data collection systems and multibeam sonar, as well as new techniques for collecting acoustic data, have improved the quality of the data collected, allowing better overall management of our nation's fisheries.
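
As a rough illustration of the echo-integration idea that underlies such acoustic surveys, the short sketch below converts a measured volume backscattering strength into an approximate fish density.  The decibel values, transect dimensions, and resulting numbers are illustrative assumptions, not parameters from any actual NOAA survey.

    # Minimal sketch of echo integration for an acoustic fishery survey.
    # All values below are illustrative, not actual survey parameters.

    def fish_density(mean_sv_db, target_strength_db):
        """Estimate fish per cubic meter from volume backscattering strength.

        mean_sv_db         : mean volume backscattering strength Sv (dB re 1/m)
        target_strength_db : expected target strength of one fish (dB re 1 m^2)
        Density = linear(Sv) / linear(TS), i.e. total backscatter divided by
        the backscattering cross-section of a single fish.
        """
        sv_linear = 10 ** (mean_sv_db / 10)          # volume backscattering coefficient
        sigma_bs = 10 ** (target_strength_db / 10)   # single-fish cross-section
        return sv_linear / sigma_bs

    # Hypothetical numbers: Sv of -60 dB and a single-fish target strength of -35 dB
    density = fish_density(-60.0, -35.0)             # fish per cubic meter
    surveyed_volume_m3 = 5_000 * 200 * 50            # illustrative transect volume
    print(f"~{density:.4f} fish/m^3, ~{density * surveyed_volume_m3:,.0f} fish in transect")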

Harmful Algal Bloom Forecast System

Harmful algal blooms (HABs) occur when algae, simple plants that live in the sea, produce toxic or harmful effects on people, fish, shellfish, marine mammals, and birds.  Economists estimate that harmful algal blooms result in losses of over $80 million each year.  Over the past two decades, NOAA scientists have been building a tool to mitigate the harmful effects of these blooms: the Harmful Algal Bloom Forecast System.  Development of the service began in 1987, when scientists first used satellite sea surface temperature data to determine the source of a bloom off the coast of North Carolina.  While satellite images allowed scientists to monitor blooms, monitoring was not enough.  To protect the public, forecasters must be able to predict where and when a bloom will appear, much as weather forecasters predict thunderstorms or other weather events.  Thus, researchers worked to build an integrated tool that not only monitors HABs, but also forecasts how a bloom will grow and move.  Launched September 30, 2004, for the coast of Florida, the HAB Forecast System uses models together with data from satellite sensors, research vessels, and in-water instruments to monitor and forecast HABs.  HAB bulletins containing this information are developed by integrating data from various ocean-observing systems, including satellite imagery, meteorological data from NOAA observing stations, and field data collected by state and university monitoring programs.
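
One simplified building block of satellite-based bloom monitoring is a chlorophyll anomaly: comparing a recent chlorophyll image against a longer-term background so that unusually high patches stand out.  The sketch below shows that idea in schematic form only; the threshold and toy arrays are assumptions for illustration, not the operational system's actual parameters.

    # Simplified chlorophyll-anomaly screen for possible bloom pixels.
    # Threshold and arrays are illustrative; the operational forecast combines
    # such imagery with models, meteorological data, and field sampling.

    import numpy as np

    def bloom_candidates(recent_chl, background_chl, anomaly_threshold=1.0):
        """Flag pixels whose recent chlorophyll exceeds the background mean.

        recent_chl     : 2-D array of recent chlorophyll concentration (mg/m^3)
        background_chl : 2-D array of the longer-term mean for the same pixels
        Returns a boolean mask of candidate bloom pixels.
        """
        anomaly = recent_chl - background_chl
        return anomaly > anomaly_threshold

    # Hypothetical 3x3 scene: one pixel is well above its background value.
    recent = np.array([[0.5, 0.6, 0.4],
                       [0.7, 3.2, 0.5],
                       [0.6, 0.5, 0.4]])
    background = np.full((3, 3), 0.5)
    mask = bloom_candidates(recent, background)
    print("Candidate bloom pixels (row, col):", np.argwhere(mask).tolist())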

The HAB Forecast System is the first service of its kind in the nation to provide accurate, reliable advance information about current and projected HAB locations and intensities, and it represents a vast improvement over the field-based sampling and monitoring used previously.  The HAB Bulletin was issued approximately 230 times between October 1, 2004, and August 31, 2006.  Research is ongoing to develop operational HAB forecasting in other regions.

Hurricane Forecast Model

When a hurricane threatens our shores, we want to know where and when it will strike and how strong it will be when it arrives.  Not so long ago, we could not answer these questions more than a few hours or a day in advance.  Today, however, NOAA provides answers with greater accuracy and longer lead times using an arsenal of forecasters, instruments, and computer-based tools.  One of these computer-based tools is the hurricane model developed by NOAA's Geophysical Fluid Dynamics Laboratory (GFDL).

The groundwork for this model was established in the 1970s, as GFDL researchers expanded early experiments on hurricane dynamics to build a simulation that more closely mimicked realistic hurricane structure.  After two decades of research, the model became operational in 1995.  Since that time, the GFDL hurricane model has been among the leading models used by the National Hurricane Center.  Over the last three hurricane seasons, the model has had the lowest hurricane track forecast error of any model in use.  The latest version of the model, which went into production before the start of the 2006 hurricane season, is expected to improve hurricane track forecasts and allow forecasters to predict where a hurricane will make landfall two to five days in advance, with a 20 percent reduction in forecast error.
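
Track forecast error of the kind cited above is conventionally measured as the great-circle distance between a forecast storm position and the position later observed.  The sketch below computes that distance with the haversine formula; the coordinates are invented for illustration.

    # Great-circle (haversine) distance between a forecast and an observed
    # hurricane position -- the basic quantity behind track-error statistics.
    # Positions below are invented for illustration.

    import math

    EARTH_RADIUS_KM = 6371.0

    def track_error_km(forecast_lat, forecast_lon, observed_lat, observed_lon):
        """Return the great-circle distance in kilometers between two points."""
        phi1, phi2 = math.radians(forecast_lat), math.radians(observed_lat)
        dphi = math.radians(observed_lat - forecast_lat)
        dlmb = math.radians(observed_lon - forecast_lon)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
        return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

    # Hypothetical 72-hour forecast position versus later observed position
    print(f"Track error: {track_error_km(27.0, -80.0, 27.8, -81.2):.0f} km")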

NOAA Next Generation Water Level Measurement System

Precise knowledge of the times, heights, and extent of inflow and outflow of tidal waters is critical to a wide range of purposes, from navigating waterways to selecting the best times for fishing or surfing. NOAA and its predecessor organizations have maintained systematic tidal records for 200 years.  However, the multi-year (1980-2000) implementation of the NOAA Next Generation Water Level Measurement System has revolutionized the ability of the NOAA National Water Level Program to deliver tidal information. 

Under the initiative, water-level measurement systems were changed from the float/wire stilling-well systems of the past to newly engineered air acoustic and pressure sensor systems that reduced known error sources.  These systems can operate for a full year without maintenance and collect more than just water levels: they record a wide variety of environmental measurements, such as wind speed, air and water temperature, barometric pressure, and conductivity.  Data are now collected using an electronic data acquisition system and, instead of being mailed on tape once a month to NOAA headquarters for processing, are transmitted over the NOAA geostationary satellite system every three hours.  Overall, the NOAA Next Generation Water Level Measurement System has fundamentally changed and modernized the way water-level data are measured, collected, quality-controlled and processed, managed and archived, and disseminated to users.
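
As a rough sense of what quality control of such measurements can involve, the sketch below reduces a burst of rapid water-level samples to a single value by rejecting obvious outliers (such as wave spikes) and averaging the rest.  The sample values and the rejection tolerance are illustrative assumptions, not the system's actual specifications.

    # Reduce a burst of rapid water-level samples to one quality-controlled value.
    # Sample values and the rejection tolerance are illustrative only.

    from statistics import mean, median

    def reduce_burst(samples, tolerance=0.25):
        """Average a burst of samples after rejecting values far from the median."""
        med = median(samples)
        kept = [x for x in samples if abs(x - med) <= tolerance]
        return mean(kept), len(samples) - len(kept)

    # Hypothetical one-second readings (meters) with one wave-induced spike
    burst = [1.52, 1.53, 1.51, 1.52, 1.54, 2.10, 1.53, 1.52, 1.51, 1.52]
    level, rejected = reduce_burst(burst)
    print(f"Water level: {level:.3f} m ({rejected} sample(s) rejected)")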

Passive Integrated Transponder Tags for Fisheries Research

The passive integrated transponder (PIT) tag is a small, internal tag that allows individual fish to be passively identified as they pass detection stations.  Unlike much larger electronic tags that require batteries, or external tags and marks that require recapturing the fish, the PIT tag will continue to identify and provide valuable information about a tagged fish throughout its lifetime without the need to rehandle the fish.

The PIT tag was developed after Mr. Earl Prentice, a scientist at NOAA's Northwest Fisheries Science Center, took a conceptual vision of using electronic tags to identify people and translated it into a device that has revolutionized fisheries research by allowing scientists to collect data on the migratory timing, passage, behavior, and survival of many fish populations under different environmental conditions.  Mr. Prentice began working on the tag in 1982 and has since guided its development and helped refine many applications of the technology.
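
To make the kind of data the tags yield concrete, the sketch below turns detection records keyed by tag code into travel times between two detection sites, the sort of calculation that underlies migratory-timing and passage studies.  The tag codes, site names, timestamps, and records are invented for illustration.

    # Turn PIT-tag detection records into travel times between two sites.
    # Tag codes, site names, and timestamps are invented for illustration.

    from datetime import datetime

    # Each record: (tag_code, site, detection_time)
    detections = [
        ("3D9.1BF1", "upstream site",   datetime(2006, 5, 1, 8, 15)),
        ("3D9.1BF1", "downstream site", datetime(2006, 5, 9, 14, 40)),
        ("3D9.2AC4", "upstream site",   datetime(2006, 5, 2, 6, 5)),
        ("3D9.2AC4", "downstream site", datetime(2006, 5, 12, 19, 30)),
    ]

    def travel_times(records, start_site, end_site):
        """Days from first detection at start_site to first detection at end_site, per tag."""
        first_seen = {}
        for tag, site, when in records:
            first_seen.setdefault((tag, site), when)
        results = {}
        for (tag, site), when in first_seen.items():
            if site == start_site and (tag, end_site) in first_seen:
                results[tag] = (first_seen[(tag, end_site)] - when).total_seconds() / 86400
        return results

    for tag, days in travel_times(detections, "upstream site", "downstream site").items():
        print(f"{tag}: {days:.1f} days downstream")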

Today, PIT tags are used in many types of fisheries research and have revolutionized certain aspects of experimental biology by providing a rapid, non-lethal means of identifying individual experimental animals.  The tags have been used extensively to evaluate management actions associated with Pacific Northwest salmon, and they are now also used with sturgeon, halibut, red drum, bass, gar, eels, crayfish, king and tanner crab, seals, sea otters, turtles, and alligators.

Telepresence Technology

Prior to 2005, a scientist who wanted to participate in a NOAA research cruise needed to make arrangements to spend weeks or months on a ship out to sea.  However, in the summer of 2005, a scientist and her team sitting in Seattle, Washington, commanded a NOAA research cruise taking place in the middle of the Atlantic Ocean.  This marked NOAA’s first application of “telepresence technology,” including satellite communications and high-speed Internet connections, opening doors for a greater number of scientists to participate in “at-sea” research.  Continued application of the technology by NOAA will ultimately mean a greater number of top minds working to tackle the mysteries of our ocean.  Also, through telepresence technology, children and the general public can “participate” in ocean explorations as never before, instilling in current and future generations an appreciation and understanding of the wonder and importance of our oceans.

Tropical Instability Waves

The longest waves in the ocean do not crash along a sandy beach; they are found meandering within the east-west currents that flow along the equator.  With lengths of over 1,000 kilometers, these waves remained theoretical until 1975, when NOAA's newest geostationary operational environmental satellite (GOES) detected tropical instability waves (TIWs) in the eastern equatorial Pacific.  The GOES system provided the key observational platform for discovering these waves beneath a relatively cloudy atmosphere.  Because GOES takes snapshots every 30 minutes, scientists had access to up to 48 images of the Pacific each day.  Composites built from these satellite images ultimately revealed the monster waves.  Similar waves were subsequently detected in the Atlantic and Indian oceans.
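
One simple way to build a composite from many partly cloudy snapshots is to keep, for each pixel, the warmest valid value in the stack, since clouds appear cold in infrared imagery.  The sketch below shows that idea with toy arrays standing in for real GOES scenes; it is a schematic illustration, not the compositing method actually used.

    # Build a simple "warmest pixel" composite from a stack of partly cloudy
    # sea surface temperature snapshots; clouds look cold in infrared imagery,
    # so the per-pixel maximum tends to keep cloud-free ocean values.
    # The tiny arrays are toy stand-ins for real satellite scenes.

    import numpy as np

    def warmest_pixel_composite(snapshots):
        """Per-pixel maximum over a stack of SST images (NaN = cloud or missing)."""
        return np.nanmax(np.stack(snapshots), axis=0)

    nan = np.nan
    snapshots = [
        np.array([[26.1, nan ], [25.0, 24.2]]),
        np.array([[nan,  27.0], [25.3, nan ]]),
        np.array([[26.4, 26.8], [nan,  24.5]]),
    ]
    print(warmest_pixel_composite(snapshots))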

The observation of TIWs generated a great deal of interest in the ocean science community and helped justify the use of satellites for ocean observations.  To study TIWs, NOAA installed a permanent buoy array along the equator in the Pacific, used new satellite microwave instruments to make the first cloud-free movie loops of the waves, and used visible-spectrum measurements to estimate the chlorophyll content along the equator resulting from upwelling associated with the long waves.

Scientists have determined that the waves usually appear seasonally each year between May and December, but the absence of the long waves can be a harbinger of El Niño and warmer equatorial waters.  The equatorial long waves have become one of the most studied equatorial ocean phenomena due to their link to El Niño and La Niña cycles.