Tutorials
MTS/OES members: $175 per half-day tutorial, $275 per full-day tutorial. Non-members: $275 per half-day tutorial, $375 per full-day tutorial.
#1 Design of synthetic aperture sonar systems for high-resolution seabed imaging Full Day: Monday, September 18, 2006, 8:30 am - 4:30 pm
This tutorial will review the key aspects of the design of synthetic aperture sonar (SAS) systems for high resolution seabed imaging. After a quick overview of the expected benefits and main features of SAS, the design of the transmitter and receiver arrays will be discussed, with emphasis on the mitigation of spatial aliasing with multi-element receiver arrays, wideband operation and extension to interferometric SAS for estimating the seabed bathymetry.
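As background to the array-design discussion, the following minimal sketch (Python, with assumed example values for element size, element count, and maximum range, none of which come from the tutorial) illustrates the standard along-track sampling argument for multi-element receiver arrays: the ping rate is limited by the two-way travel time to maximum range, and each ping may advance the platform by at most half the physical receiver array length if spatial aliasing is to be avoided.

```python
# Back-of-the-envelope SAS design check (illustrative values only).
# To avoid spatial aliasing, consecutive pings must sample the synthetic aperture
# at intervals no larger than half the receiver element size (the phase-centre
# spacing). An N-element receiver array lets the platform advance N*d/2 per ping,
# while the ping rate is limited by the two-way travel time to maximum range.

SOUND_SPEED = 1500.0   # m/s, nominal sea water
R_MAX = 200.0          # m, maximum imaging range (assumed)
D_ELEM = 0.05          # m, along-track size of one receiver element (assumed)
N_ELEM = 24            # number of receiver elements (assumed)

prf_max = SOUND_SPEED / (2.0 * R_MAX)          # maximum ping rate, Hz
advance_per_ping = N_ELEM * D_ELEM / 2.0       # allowed along-track advance per ping, m
v_max = advance_per_ping * prf_max             # maximum platform speed, m/s

print(f"Max ping rate:      {prf_max:.2f} Hz")
print(f"Advance per ping:   {advance_per_ping:.2f} m")
print(f"Max platform speed: {v_max:.2f} m/s ({v_max * 1.944:.1f} kn)")
```

With these assumed numbers the speed limit is about 2.3 m/s, which shows why longer receiver arrays (larger N) are the main lever for increasing area coverage rate.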
Next, the most difficult issue in SAS will be addressed in detail: the micronavigation problem, i.e. estimating the unwanted platform motions with the required sub-wavelength accuracy. The emphasis will be on methods that have proved their value at sea, combining inertial navigation systems (INS) with data-driven methods based on the Displaced Phase Centre Antenna (DPCA) technique. The topics covered will include the theory of spatial backscatter coherence; the derivation of ping-to-ping motion estimates using time delay estimation theory, including the use of bandwidth for phase unwrapping and the appropriate range-dependent near-field corrections needed to arrive at unbiased estimates; and the establishment of the Cramer-Rao lower bounds for motion estimation, which demonstrate the need for fusion with an INS to achieve full performance. The geometrical relationship between the DPCA and INS projection frames, which is necessary for accurate fusion, will be established and shown to depend also on the local seabed slope. The estimation of this slope with interferometric sonar will be discussed.
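As a concrete illustration of the ping-to-ping motion estimation step, the minimal sketch below (Python/NumPy, with hypothetical inputs `ping_a` and `ping_b`) cross-correlates the overlapping phase-centre returns of two consecutive pings and refines the correlation peak by parabolic interpolation to obtain a sub-sample time delay, which maps to a range displacement through the sound speed. The near-field corrections, Cramer-Rao bounds, and INS fusion discussed above are beyond the scope of this sketch.

```python
import numpy as np

def dpca_delay(ping_a: np.ndarray, ping_b: np.ndarray, fs: float, c: float = 1500.0):
    """Estimate the time delay between two overlapping phase-centre returns.

    ping_a, ping_b : equal-length, real-valued echo segments from consecutive
                     pings sharing a common (displaced) phase centre (hypothetical inputs).
    fs             : sampling frequency in Hz.
    Returns (delay_s, displacement_m): sub-sample delay and the equivalent
    one-way range displacement (delay * c / 2).
    """
    a = ping_a - ping_a.mean()
    b = ping_b - ping_b.mean()
    xc = np.correlate(b, a, mode="full")          # cross-correlation
    k = int(np.argmax(xc))                        # coarse (integer-sample) lag
    # Parabolic interpolation around the peak for sub-sample accuracy.
    if 0 < k < len(xc) - 1:
        y0, y1, y2 = xc[k - 1], xc[k], xc[k + 1]
        frac = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)
    else:
        frac = 0.0
    lag = (k - (len(a) - 1)) + frac               # lag of b relative to a, in samples
    delay_s = lag / fs
    return delay_s, delay_s * c / 2.0
```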
Furthermore, the impact of the environment, and in particular of the multipath structure at large range-to-water-depth ratios, will be discussed. Multipath will be shown to degrade the quality of the SAS imagery as well as to adversely affect the accuracy of interferometric estimates, including DPCA. Means to mitigate multipath by management of the vertical transmission and reception beams will be discussed, with experimental results that point to some of the limitations of existing sonar performance prediction tools.
Finally, different design tradeoffs between computational efficiency and robustness for micronavigated SAS imaging algorithms will be discussed, and an example of a real-time implementation suited to operation on board an autonomous underwater vehicle will be described.
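The tutorial does not prescribe a particular imaging algorithm; as an illustration of the robust but computationally expensive end of the tradeoff mentioned above, the sketch below outlines time-domain delay-and-sum backprojection of motion-compensated pings onto a pixel grid, assuming complex baseband, range-compressed echoes and known phase-centre positions. The faster wavenumber-domain algorithms typically used in real-time systems are not shown.

```python
import numpy as np

def backproject(echoes, centres, xs, ys, fs, fc, c=1500.0):
    """Time-domain backprojection SAS imaging (illustrative sketch, assumed inputs).

    echoes  : (n_pings, n_samples) complex baseband, range-compressed echoes.
    centres : (n_pings, 2) motion-compensated phase-centre positions (x, y) in m.
    xs, ys  : 1-D arrays defining the image pixel grid (m).
    fs, fc  : sampling frequency and carrier frequency (Hz).
    """
    X, Y = np.meshgrid(xs, ys)                    # pixel coordinates
    image = np.zeros(X.shape, dtype=complex)
    n_samples = echoes.shape[1]
    for echo, (px, py) in zip(echoes, centres):
        rng = np.hypot(X - px, Y - py)            # one-way range to each pixel
        tau = 2.0 * rng / c                       # two-way travel time
        idx = tau * fs                            # fractional sample index
        i0 = np.clip(idx.astype(int), 0, n_samples - 2)
        w = idx - i0                              # linear interpolation weight
        samp = (1.0 - w) * echo[i0] + w * echo[i0 + 1]
        image += samp * np.exp(2j * np.pi * fc * tau)   # undo baseband carrier phase
    return np.abs(image)
```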
Bio: Marc Pinto
Marc Pinto was born in Wellington, India in 1960. He graduated from the Ecole Nationale des Ponts et Chaussées, Paris (France) in 1983. From 1985 to 1993 he worked as a research engineer for Thomson-CSF, specializing in the development of finite element techniques for solving non-linear magnetostatics to support the modeling of the magnetic recording process. In 1991, he received the Ph.D. degree in Solid State Physics from the University of Paris, Orsay. In 1993 he joined Thomson-Sintra ASM (now Thales Underwater Systems) as Head of the Signal Processing Group, specializing in research into advanced MCM and airborne ASW sonar. In 1997 he joined the NATO Saclant Undersea Research Center, La Spezia, Italy, as principal scientist. He was appointed Head of the Mine Countermeasures Group in the Signal and Systems Division in 1998 and held this position until the group was dissolved in 2000. From 2000 to 2004 he conducted, as project leader, research into synthetic aperture sonar systems for hunting proud and buried mines. In 2004 he was appointed Head of the Expeditionary MCM and Port Protection Department, where he presently oversees research into AUV-based minehunting, electronic mine countermeasures and harbour defence.
#2 AUV Technology and Application Basics Half Day: Monday, September 18, 2006, 8:30 am - noon
AUV Application Basics is a short course providing an overview of AUV technology and operations. The objective is to explain what AUV systems can provide and the best practices for their use. The class is targeted at scientists interested in using AUVs for oceanographic applications. The attendee will gain a basic understanding of AUV types, technologies, and navigation techniques, including discussion of the comparative strengths of AUVs and alternative methods of data collection. The attendee will also gain an understanding of trade-offs in AUV operations, including power estimation, endurance considerations, and mission structure to acquire the desired data sets.
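As a simple illustration of the power and endurance trade-off mentioned above, the minimal sketch below (Python, with hypothetical vehicle parameters that are not Dorado specifications) estimates endurance and range from a battery capacity, a fixed hotel load, and a propulsion load that grows roughly with the cube of speed.

```python
# Illustrative AUV endurance budget -- all numbers are assumptions, not vehicle specs.
BATTERY_WH = 2000.0   # usable battery energy, watt-hours
HOTEL_W = 60.0        # hotel load: computers, sensors, payload (W)
DRAG_COEFF = 12.0     # propulsion power coefficient, P_prop ~ k * v**3 (W per (m/s)^3)

def endurance_hours(speed_mps: float) -> float:
    """Hours of operation at a constant speed for the assumed vehicle."""
    total_w = HOTEL_W + DRAG_COEFF * speed_mps ** 3
    return BATTERY_WH / total_w

for v in (1.0, 1.5, 2.0):
    hours = endurance_hours(v)
    print(f"{v:.1f} m/s: {hours:5.1f} h endurance, {hours * v * 3.6:6.1f} km range")
```

With these assumed numbers, range peaks at an intermediate speed: slower is not always farther once the hotel load dominates, which is exactly the kind of mission-planning trade-off the course addresses.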
Key points are illustrated by applications and results from the Monterey Bay Aquarium Research Institute's (MBARI) Dorado AUV and other AUV operations. Topics include: Basic AUV technology, AUV at-sea Operation, Payload Considerations, Mission Planning, Upper and Mid-Water AUV missions, Benthic and Mapping AUV missions, Data Collection and Reduction, AUV Integration into Sampling Networks, and a look at coming AUV advances.
Intended Participants:
This class is intended for scientists interested in applying AUVs to particular problems, persons interested in AUV applications and the impact of AUV technology, and graduates in oceanographic fields seeking a broader understanding of the application of AUV platforms.
Bio: W.J. Kirkwood
Bill is currently the Associate Director of Engineering at the Monterey Bay Aquarium Research Institute (MBARI) located in Monterey Bay, California. Bill has a BS in Mechanical Engineering and an MS in Computer Science, which he has applied to controls and automation of electromechanical systems and robotics since 1978. Bill has been with MBARI for 15 years as a lead mechanical engineer and program manager developing the Tiburon remotely operated vehicle. Bill also served as program manager and lead engineer developing the Dorado class autonomous underwater vehicles at MBARI. Bill's current efforts are focused on precision and/or automated in situ instrumentation and laser Raman spectroscopy for deep ocean science.
#3 Matlab Tools for Processing Data from Acoustic Doppler Current Meters Deployed on Deep Water Moorings Half Day: Monday, September 18, 2006, 8:30 am - noon
Data collected using deep-sea moored instruments must be processed to engineering units, inspected for quality assurance (QA), and edited to remove unwanted data points before final archival storage. Community-accepted archival formats, such as EPIC netCDF, require fully processed and QA'd data, as well as meta-data. Acoustic Doppler Current Profilers (ADCPs) produce large amounts of data, making manual methods of data editing and reformatting cumbersome. For ADCPs, the necessary editing tasks include:
- Removal of data collected before completion of mooring deployment and after initiation of mooring retrieval
- Removal of data collected beyond the effective range of the ADCP, beyond the sea surface for upward-looking ADCPs, or beyond the bottom for downward-looking ADCPs
- Verification of data quality
- Flagging of questionable data due to occasional instrument malfunction or unfavorable environmental conditions
- Depth mapping of data collected by sub-surface moored ADCPs whose depth varies with current drag forces. The ADCP's depth will increase (draw down) as the current increases. The ADCP's measurement cells are defined in terms of distance away from the ADCP and do not represent fixed depths in the water column. Mapping the data to fixed depth horizons is necessary for time series analysis (a minimal sketch of this step follows the list).
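The sketch below illustrates this depth-mapping step in Python/NumPy rather than the Matlab of the actual tools, and with hypothetical array names: for each ensemble, the measurement cells are placed at their true depths using the instrument's time-varying depth, and the velocities are then interpolated onto fixed depth horizons.

```python
import numpy as np

def map_to_fixed_depths(vel, bin_dist, adcp_depth, fixed_depths, upward=True):
    """Map ADCP velocity bins to fixed depth horizons (illustrative sketch).

    vel          : (n_ens, n_bins) velocity component per ensemble and bin.
    bin_dist     : (n_bins,) distance of each bin from the transducer (m).
    adcp_depth   : (n_ens,) instrument depth for each ensemble (from a pressure
                   sensor, surface tracking, or another instrument on the mooring).
    fixed_depths : (n_depths,) target depth horizons (m).
    upward       : True for an upward-looking ADCP (bins lie above the instrument).
    """
    sign = -1.0 if upward else 1.0
    n_ens = vel.shape[0]
    out = np.full((n_ens, len(fixed_depths)), np.nan)
    for i in range(n_ens):
        bin_depths = adcp_depth[i] + sign * bin_dist     # true depth of each bin
        order = np.argsort(bin_depths)                   # np.interp needs ascending x
        out[i] = np.interp(fixed_depths, bin_depths[order], vel[i, order],
                           left=np.nan, right=np.nan)    # no extrapolation
    return out
```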
To streamline these tasks, Woods Hole Group, Inc. (WHG) has developed a set of Matlab-based data processing and QA tools for use with data from R.D. Instruments ADCPs. These tools are based on the ADCPTools package developed by USGS, which was intended for processing data from bottom-mounted ADCPs in shallow water. Unlike the original USGS ADCPTools package, however, the new processing tools are intended specifically for processing data from deep-sea moored ADCPs. The original ADCPTools software has been available publicly from USGS for several years. WHG intends to make its new deep-water version of ADCPTools available on the same basis, as a work in progress, so that other users may join with WHG to use, maintain and improve the software. The use of the new WHG ADCPTools will be described in detail and demonstrated using data from deep-moored ADCPs. Processing begins with raw binary data in a file as recovered from the ADCP, and ends with quality-checked and edited data in a netCDF file following EPIC conventions for Doppler current meter data.
Attendees will be shown how to:
- Enter meta-data required for the netCDF format that are not included in the raw file
- Make an initial assessment of the data from depth/time color contour plots of raw data and plan the rest of the processing.
- Pick the beginning and end of the good data range (after deployment and before retrieval) from a plot of depth (pressure) over the entire length of the record and confirm (or adjust) the depth mapping range. The new ADCPTools then performs mapping of data from depth cells relative to distance from the ADCP to fixed depth cells using the actual time-dependent ADCP depth. ADCP depth can be determined from a pressure sensor if the ADCP is so equipped, from an estimate of the distance to surface based on acoustic data if the ADCP is upward looking, or from another instrument on the same mooring.
- Specify various criteria that will be used to automatically edit the data (see the sketch after this list). These criteria include physical parameters (out-of-range, excessive rate of change, etc.) as well as criteria based on the ADCP's reported QA diagnostic parameters (vertical and error velocity exceedance, echo amplitude, correlation magnitude, and percent good pings).
- Examine the data graphically to determine whether the masking is having the desired effect, and manually mark additional data as bad or, alternatively, mark masked data as good.
- Repeat the editing process with different criteria until a satisfactory result is obtained.
- Specify the number of bins to trim at the end of the data range, since an ADCP is usually set up to record more bins than will actually contain good data. Unlike the editing process, which flags data in a particular bin at a particular time as "bad", the trimming process completely removes all data for the specified bins from the record.
- Create a final EPIC netCDF data file in a format that is standard within the oceanographic community and can be read by many analysis packages for post-processing.
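A minimal sketch of the automated flagging step is shown below (Python/NumPy for illustration only; the actual ADCPTools are Matlab-based, and the threshold values here are hypothetical, not recommended settings). Samples failing any criterion are masked rather than deleted, so the editing can be repeated with different criteria.

```python
import numpy as np

def qa_mask(speed, err_vel, echo_amp, pct_good,
            max_speed=2.5, max_err_vel=0.05, min_echo=40, min_pct_good=75):
    """Build a boolean 'bad data' mask from simple QA criteria.

    All inputs are (n_ens, n_bins) arrays of current speed (m/s), error velocity
    (m/s), echo amplitude (counts) and percent-good pings as reported by the ADCP.
    Thresholds are illustrative assumptions only.
    """
    bad = (
        (np.abs(speed) > max_speed) |        # physically implausible speed
        (np.abs(err_vel) > max_err_vel) |    # error velocity exceedance
        (echo_amp < min_echo) |              # weak echo amplitude
        (pct_good < min_pct_good)            # too few good pings in the ensemble
    )
    return bad

# Flag rather than delete: keep the raw values and set masked samples to NaN in a copy.
# edited = np.where(qa_mask(spd, ev, ea, pg), np.nan, spd)
```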
As time permits, attendees will have the opportunity to use the tools themselves. Attendees wishing to try these tools at the tutorial on their own data (collected with a Teledyne RD Instruments ADCP) should contact Bruce Andrews at bandrews@whgrp.com.
Bio: J. Bruce Andrews
J. Bruce Andrews has more than 30 years' experience at MIT, EG&G, and Woods Hole Group, Inc. in software development, scientific data processing, and numerical modeling for a wide range of commercial and government clients. Mr. Andrews earned his M.S. in Ocean Engineering at MIT in 1969. His fields of expertise include applications programming, system programming, systems integration, instrumentation software design, data analysis, data acquisition, data presentation and display, processing system design and implementation, physical oceanography, signal processing, and numerical modeling. He has experience with many different programming languages, including FORTRAN, C, and Visual Basic. His most recent work has been with the LabWindows/CVI (National Instruments) development environment and Matlab (MathWorks). He has developed software used to collect, analyze, and report on data for numerous deepwater current measurement programs worldwide, including programs in the USA, offshore Brazil, offshore West Africa, offshore Australia, and offshore Indonesia. He is also involved in ongoing development of software tools for processing and analyzing data from various oceanographic and meteorological instrumentation, with emphasis on interactive quality assessment and data editing as well as archiving in standard (netCDF) formats.
Bio: Bruce A. Magnell, Ph.D.
Bruce A. Magnell, Ph.D., has more than 30 years' experience at MIT, EG&G, and Woods Hole Group, Inc. in applied science and ocean engineering for a wide range of commercial and government clients. He is a recognized expert in the field of physical oceanography in industry, government, and academia. His fields of expertise include physical oceanography, electrical engineering, coastal ocean dynamics, oceanographic instrumentation design and evaluation, signal processing, data analysis, data acquisition, real-time telemetry, processing system design and implementation, technology evaluation, business management and business development. He has extensive experience in the analysis of coastal ocean dynamics, especially wind-driven circulation and mixing on the continental shelf. Specifically, he has collected, analyzed, and reported on numerous deepwater current measurement programs worldwide, including some of the first observations of the Loop Current in the Gulf of Mexico. In the past few years, he has been the Principal Investigator for oceanographic data collection programs in the USA, offshore Brazil, offshore West Africa, offshore Australia, and offshore Indonesia. He also has participated in and directed large-scale oceanographic measurement programs for the MMS, such as the Northern California Coastal Circulation Study and the New England Outer Continental Shelf Physical Oceanography Program.
#4 Acoustic seabed classification with multibeam and sidescan images Half Day: Monday, September 18, 2006, 8:30 am - noon
Acoustic seabed classification is the organization of the sea floor and shallow subsurface sediment into discrete classes based on information in the echoes. Geoacoustic sediment properties such as grain size and porosity are not available from acoustic backscatter alone, but the survey area can be segmented into regions of similar acoustic character. Systematically exploiting details in backscatter is the basis of acoustic segmentation.
This tutorial presents theory and applications of image-based acoustic classification, from the early papers through to recent applications. The acoustic principles of classifying with echoes from single beams at normal incidence are presented first, since they relate to the principles of image classification. Near nadir, the amplitudes and shapes of sounder echoes are rich in sediment information. Away from vertical incidence, echoes carry sediment information in their amplitudes and their noise characteristics, but not in their shapes. Echoes from imaging sonars, with their wide horizontal beamwidths, become rasters in sonar images, so noise in these echoes becomes image texture. Macro-roughness such as sand waves and changes in sediment also contribute to texture. Image amplitude and texture are both heavily influenced by sediment type and are exploited for segmentation.
Sonar calibration is not necessary for image-based acoustic classification. Image amplitudes are made consistent throughout a survey, but remain in relative, not absolute, units. Since calibrating imaging sonars is challenging, the ability to use systems that need only be consistent offers cost-effective practical classification for military and civil purposes. Topics in this tutorial include:
- Quality control, suppressing system artifacts.
- Compensating images for beam patterns and grazing angle effects.
- Features that capture amplitude and texture characteristics.
- Classification with amplitude: backscatter, backscatter vs. grazing angle.
- Classification with texture: Pace, Haralick, fractal, wavelet.
- Differences between classifying multibeam and sidescan images: resolution, using bathymetric data for compensation, benefits of images stitched together from backscatter in beams.
- Supervised classification, training sets.
- Unsupervised classification, PCA, manual and automated clustering (see the sketch after this list).
- Using non-acoustic data to relate acoustic classes to sediment geoacoustic properties.
- Categorical interpolation.
- Maps with acoustic classes in similarity colours.
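As a small illustration of the texture-and-clustering approach listed above, the sketch below (Python, using scikit-image and scikit-learn purely as stand-ins for the tutorial's classification software suites) computes Haralick-style grey-level co-occurrence features on image patches, reduces them with PCA, and clusters them without supervision. Compensation, categorical interpolation, and similarity colouring are not shown, and the patch size, feature set, and class count are assumptions.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def classify_patches(image, patch=32, n_classes=5):
    """Unsupervised acoustic-class map from a backscatter image (illustrative).

    image : 2-D uint8 backscatter mosaic, already compensated for beam pattern
            and grazing angle. Returns a (rows, cols) array of class labels,
            one per non-overlapping patch.
    """
    rows, cols = image.shape[0] // patch, image.shape[1] // patch
    feats = []
    for r in range(rows):
        for c in range(cols):
            p = image[r * patch:(r + 1) * patch, c * patch:(c + 1) * patch]
            glcm = graycomatrix(p, distances=[2], angles=[0, np.pi / 2],
                                levels=256, symmetric=True, normed=True)
            feats.append([p.mean(), p.std(),                      # amplitude features
                          graycoprops(glcm, "contrast").mean(),   # texture features
                          graycoprops(glcm, "homogeneity").mean(),
                          graycoprops(glcm, "energy").mean()])
    feats = np.asarray(feats)
    feats = (feats - feats.mean(axis=0)) / feats.std(axis=0)      # standardize
    scores = PCA(n_components=3).fit_transform(feats)             # compact feature space
    labels = KMeans(n_clusters=n_classes, n_init=10).fit_predict(scores)
    return labels.reshape(rows, cols)
```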
The techniques presented in this tutorial are wide-ranging and do not concentrate on a single technical approach. As time allows, hands-on experience with classification software suites will reinforce the tutorial material. Participants are invited to bring their own laptops for this part of the tutorial, and will be able to continue classifying data sets after the session has ended.
Through this combination of theory and experience, participants in this tutorial can expect to gain a thorough understanding of principles and practice of image-based sediment classification.
Bio: Dr. Jon Preston
Dr. Jon Preston (PhD, University of British Columbia) is Senior Scientist at Quester Tangent Corporation, Sidney, BC, and an adjunct professor at the University of Victoria. In his seven years at QTC he has led the development of software suites for rigorous statistical classification of multibeam and sidescan images, interpolation and visualization of acoustic classes, and automated objective clustering through simulated annealing.
#5 Outline for Workshop on Airborne Hyperspectral Imaging Half Day: Monday, September 18, 2006, 1:00 pm - 4:30 pm
This half-day workshop will focus on the "do's and don'ts" of preparing for, conducting and then dealing with the data collected during an airborne hyperspectral survey. The session will cover:
- background to the technology
- planning an airborne survey (design and layout of flight lines, things to avoid; see the sketch after this list)
- execution of the survey (type of aircraft to use, when to fly)
- data pre-processing (geocorrection, atmospheric correction)
- overview of data analysis
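As a simple illustration of the flight-line layout considerations mentioned above, the sketch below (Python, with assumed sensor and survey values that are not part of the workshop material) computes swath width from flying height and sensor field of view, and the line spacing needed for a chosen sidelap between adjacent swaths.

```python
import math

# Illustrative flight-line geometry -- sensor and survey numbers are assumptions.
ALTITUDE_M = 1500.0       # flying height above ground (m)
FOV_DEG = 40.0            # total across-track field of view of the sensor (deg)
SIDELAP = 0.30            # desired overlap between adjacent swaths (30%)
BLOCK_WIDTH_M = 10_000.0  # across-track extent of the survey area (m)

swath = 2.0 * ALTITUDE_M * math.tan(math.radians(FOV_DEG / 2.0))  # swath width (m)
spacing = swath * (1.0 - SIDELAP)                                  # line spacing (m)
n_lines = math.ceil(BLOCK_WIDTH_M / spacing) + 1                   # lines to cover block

print(f"Swath width:  {swath:7.0f} m")
print(f"Line spacing: {spacing:7.0f} m")
print(f"Flight lines: {n_lines}")
```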
The session leader has been involved in airborne hyperspectral surveys for seventeen years and has organized and conducted airborne surveys in more than twenty-five countries on four continents. The session will include plenty of real-life examples from many of these projects.
Bio: Herb Ripley
Herb has been trained in geography and remote sensing/geomatics and has spent his entire career with Atlantic Canadian firms. Herb has comprehensive experience in all aspects of aerial photography and airborne remote sensing data collection, acquired in over 25 years of project work. This experience includes projects conducted both nationally and internationally. Herb's world-recognized casi project experience includes numerous applications; in particular, he has considerable experience on coastal projects, including mapping invasive species, coral reef surveys, mapping benthic habitats and mapping near-shore vegetation. Herb has been principal author and co-author on numerous technical publications. During his career Herb has been very active in industry development activities and has held executive positions with the Nova Scotia Oceans Initiative, the Champlain Institute, the Alliance for Marine Remote Sensing, the Geomatics Industry Association of Canada and the Geomatics Association of Nova Scotia. In recognition of his professional standing, Herb was appointed a Fellow of the Remote Sensing Society (U.K.) several years ago.
#6 Signal Processing Methods for Underwater Acoustic Communications Half Day: Monday, September 18, 2006, 1:00 pm - 4:30 pm
Wireless information transmission through the ocean is one of the enabling technologies for the development of future ocean-observation systems, whose applications include gathering of scientific data, pollution control, climate recording, detection of objects on the ocean floor, and transmission of images from remote sites. Implicitly, wireless signal transmission is crucial for control of autonomous underwater vehicles (AUVs) which will serve as mobile nodes in the future information networks of distributed underwater sensors. Wireless communication provides advantages of collecting data without the need to retrieve instruments, and maneuvering underwater vehicles and robots without the burden of cables.
Acoustic wireless communications are governed by three factors: limited bandwidth, time-varying multipath propagation, and low speed of sound in the ocean. Together, these factors result in a communication channel of poor quality and high latency (thus ironically combining the worst of mobile radio and satellite channels). To achieve high information throughput on such channels, coherent modulation/detection techniques, such as PSK and QAM, must be considered because of their bandwidth efficiency. Signal processing methods for underwater acoustic channels are based on the same principles as those for radio communications; yet, they differ substantially due to the amount of time-spreading introduced by the channel, as well as frequency-spreading introduced by the system mobility.
Signal processing methods for high speed underwater communications have been a topic of extensive research over the past decade, resulting in the development of the first high-speed underwater acoustic modems. In this lecture, we focus on signal processing methods of adaptive equalization, digital synchronization, and multichannel combining for bandwidth-efficient underwater communication systems. We also address methods for multiple-access underwater communications, which form the basis of future underwater wireless communication networks, and discuss the need for scalable network architectures that provide efficient use of channel resources by a large number of AUVs. Finally, we outline the principles used in today's real-time implementations of these techniques. The performance of various techniques is discussed through a series of experimental results, which include transmission over distances ranging from a few kilometers in shallow water to hundreds of kilometers in deep water, at the highest bit rates demonstrated to date.
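To make the adaptive equalization idea concrete, the minimal sketch below (Python/NumPy) runs a training-based complex LMS linear equalizer over a toy QPSK transmission through an assumed static two-path channel. The decision-feedback, carrier phase tracking, and multichannel combining structures used in practice, and any specific modem implementation, are beyond the scope of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy QPSK transmission over an assumed static two-path channel with additive noise.
n_sym = 2000
bits = rng.integers(0, 4, n_sym)
tx = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))          # QPSK symbols
channel = np.array([1.0, 0.0, 0.45 - 0.3j])               # assumed multipath response
rx = np.convolve(tx, channel)[:n_sym]
rx += 0.05 * (rng.standard_normal(n_sym) + 1j * rng.standard_normal(n_sym))

# Training-based complex LMS linear equalizer (all symbols used as training here).
n_taps, mu = 11, 0.01
w = np.zeros(n_taps, dtype=complex)
out = np.zeros(n_sym, dtype=complex)
for k in range(n_taps - 1, n_sym):
    x = rx[k - n_taps + 1:k + 1][::-1]         # newest received sample first
    y = np.vdot(w, x)                          # equalizer output, w^H x
    e = tx[k] - y                              # error against the known training symbol
    w += mu * x * np.conj(e)                   # complex LMS tap update
    out[k] = y

print("residual symbol error power:", np.mean(np.abs(tx[-200:] - out[-200:]) ** 2))
```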
Bio: Milica Stojanovic
Milica Stojanovic graduated from the University of Belgrade, Serbia, in 1988, and received the M.S. and Ph.D. degrees in electrical engineering from Northeastern University, Boston, Massachusetts, in 1991 and 1993. She is currently a Principal Scientist at the Massachusetts Institute of Technology, and also a Guest Investigator at the Woods Hole Oceanographic Institution. Her research interests include digital communications theory and statistical signal processing, and their applications to mobile radio and underwater acoustic communication systems. Milica is an Associate Editor for Communications with the IEEE Vehicular Technology Society.
Bio: Lee Freitag
Lee Freitag holds BS and MS degrees in Electrical Engineering from the University of Alaska, Fairbanks, received in 1986 and 1987. He is currently a Senior Engineer at the Woods Hole Oceanographic Institution, where he has worked on projects related to underwater acoustics for 15 years. His research programs focus on underwater acoustic communication and navigation, with emphasis on UUVs, sensors and submarine systems.