In recent weeks, a range of technologies has been used in attempts to locate Malaysia Airlines flight 370 (MH370), which disappeared on March 8. Chris Strother and Zac Miller, two experts from the University of North Georgia's Institute for Environmental and Spatial Analysis (IESA), explain satellite technology and remote sensing and how both have been used in the search for MH370.
What type of technology was used to determine the likely path of the missing plane?
A communications satellite recorded "pings" for several hours after all other contact with MH370 was lost. Once an hour, the satellite asks an airliner's data network "Are you still there?" and the ping, also called a "handshake," is an airliner answering "yes." The increasing amount of time it took for MH370's answering ping to reach the satellite each hour indicated the airliner was moving away from the satellite's stationary position over the Indian Ocean. Additional analysis determined MH370 most likely crashed into the Indian Ocean south of Australia.
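The geometry behind that inference can be sketched simply: the longer a ping's round trip, the farther the aircraft is from the satellite. The function name and the example delay below are illustrative, not actual MH370 figures.

```python
# Speed of light in km/s.
C = 299_792.458

def range_from_delay(round_trip_seconds):
    """Estimate straight-line distance to the aircraft from a ping's
    round-trip delay (the signal travels out and back, hence the /2)."""
    return C * round_trip_seconds / 2

# Illustrative delay: a round trip of ~0.24 s places the aircraft
# roughly 36,000 km from the satellite, on the order of the distance
# from a geostationary satellite down to Earth's surface.
print(round(range_from_delay(0.24)))
```

Each measured delay places the aircraft somewhere on a ring of constant distance from the satellite; the hourly pings with growing delays traced successively wider rings.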
Some communications and weather satellites are geostationary, meaning they are designed to remain over the same spot at all times in a geosynchronous orbit – calculated to match the exact speed of Earth's rotation – around the equator. They orbit at more than 22,000 miles above Earth's surface.
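That 22,000-mile figure falls out of Kepler's third law: a satellite whose orbital period matches one Earth rotation (one sidereal day) must sit at one specific orbital radius. A quick check of the arithmetic:

```python
import math

GM = 3.986004418e14       # Earth's gravitational parameter, m^3/s^2
SIDEREAL_DAY = 86164.1    # one Earth rotation, seconds
EARTH_RADIUS_KM = 6378.1  # Earth's equatorial radius, km

# Kepler's third law: r^3 = GM * T^2 / (4 * pi^2)
orbit_radius_km = (GM * SIDEREAL_DAY**2 / (4 * math.pi**2)) ** (1 / 3) / 1000
altitude_km = orbit_radius_km - EARTH_RADIUS_KM
altitude_miles = altitude_km / 1.609344

print(round(altitude_km), round(altitude_miles))  # roughly 35786 km, 22236 mi
```

The result, about 22,200 miles above the equator, is why every geostationary satellite sits in the same narrow belt.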
How are the satellites being used to search for the wreckage different?
"Remote sensing" satellites are tasked with imaging the entire world as often as possible, among other things. Unlike geostationary satellites synchronized with Earth's rotation, many remote-sensing satellites fly in near-polar orbits that take the satellites over the north and south poles.
These satellites use a low Earth orbit (LEO) that keeps them within about 700 kilometers (around 450 miles) of Earth's surface, because the closer a sensor is to the surface, the higher the resolution of its images. Their sensors can also be pointed in a particular direction. Many commercial remote sensing satellites, and most likely many military ones as well, are now focused on the southern Indian Ocean.
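The resolution advantage of a low orbit comes from simple optics: the patch of ground one pixel covers (the ground sample distance) grows linearly with altitude. The sensor parameters below are invented for illustration and do not describe any particular satellite.

```python
def ground_sample_distance_m(altitude_m, pixel_pitch_m, focal_length_m):
    """Size (in meters) of the ground patch one detector pixel covers,
    looking straight down; smaller is sharper."""
    return altitude_m * pixel_pitch_m / focal_length_m

# Hypothetical sensor: 8-micron pixels behind an 11.2 m focal length.
low_orbit = ground_sample_distance_m(700_000, 8e-6, 11.2)      # ~0.5 m/pixel
geo_orbit = ground_sample_distance_m(36_000_000, 8e-6, 11.2)   # ~25.7 m/pixel
```

The same optics that resolve half-meter detail from 700 km would blur anything smaller than a house from geostationary altitude, which is why imaging satellites fly low.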
Certain remote sensing satellites have an amazing level of resolution, but these satellites must have a clear line of sight, so cloud cover or even a choppy ocean could affect images.
What's the process of locating the wreckage?
Remote sensing is used to classify large areas of Earth's surface through image analysis; IESA uses remote sensing to study land cover, among other environmental phenomena. Looking for wreckage from a plane crash in the water is different from looking for it on land: wreckage on land stays put, while debris at sea drifts, so you need multiple images from different times, depending on how far you expect the debris to have moved.
It's easier to use remote sensing to find an abnormality on the water than to find a piece of wing in a forested or urban area because, on a calm day, the ocean surface looks uniform. It's a binary problem: every pixel is either water or not water, so the image analysis can be automated by feeding images into a program that distinguishes sunlight reflecting off water from light reflecting off metal or other materials.
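That automated, binary analysis can be sketched in a few lines, assuming a single brightness band in which calm water reflects little sunlight back to the sensor. The threshold and pixel values here are made up for illustration.

```python
import numpy as np

def flag_non_water(reflectance, water_max=0.05):
    """Mark pixels too bright to be open water (possible debris)."""
    return reflectance > water_max

# Toy 2x3 scene: mostly dark water with two bright pixels.
scene = np.array([[0.02, 0.03, 0.31],
                  [0.02, 0.60, 0.03]])
candidates = flag_non_water(scene)
print(int(candidates.sum()))  # 2 flagged pixels
```

Real classifiers are more sophisticated, but the principle is the same: against a uniform ocean background, anything that is "not water" stands out.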
Human eyes are being used in the image analysis, too. One of the initiatives throughout the search has been run by Tomnod, a crowdsourcing company that helps in disasters like this. It uploads the satellites' high-resolution images to its website and asks the general public to comb through image after image, looking for anything unusual; in this case, they're looking for oil slicks and debris. Thousands of volunteers, including us and our students, have logged into the platform to search for debris.