Seas and exclusive economic zones are vast areas where satellites are the only effective means of surveillance. Synthetic Aperture Radar (SAR) is widely used for maritime applications, as it offers very attractive capabilities: it is independent of sunlight and can see through clouds. For ocean applications, SAR has an excellent ability to read what is happening at the surface of the ocean, whether related to the dynamics of the sea surface or to human activities at sea. However, interpreting SAR imagery can be challenging, as our eyes are not always familiar with this kind of information.
SAR intelligence is driven by human intelligence, training and, crucially, experience
I like to dream about the idea that remote sensing is a form of very ancient language. In the end, energy is the way the universe communicates, and its words can be translated into information that we humans can understand and use. Electromagnetic waves have existed since the universe was born, and now we are able to design sensors capable of capturing that language. Several decades ago, we decided to put instruments in orbit pointing at the Earth to help us understand what happens on our planet, giving birth to what we now call Earth Observation. A satellite image is the result of how electromagnetic waves interact with natural and artificial surfaces. In the case of Synthetic Aperture Radar (SAR), we are talking about an active sensor that sends pulses, in the microwave range, towards our planet and reads the energy that comes back. Rough surfaces reflect the energy in many directions, some of which returns to the sensor, while smooth surfaces tend to act like a mirror, bouncing the energy away from the sensor. We call this phenomenon radar backscattering, and it is normally represented in the image as a scale of grey, where black means very low backscattered signal and white means high backscatter. When dealing with maritime applications, we have to bear in mind that this effect is modulated by numerous physical, chemical and biological processes happening at the surface of the ocean.
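To give a concrete feel for how backscatter becomes shades of grey, here is a minimal sketch in Python. The function name, the display bounds in decibels and the sample values are illustrative assumptions for this article, not parameters of any particular mission or product:

```python
import numpy as np

def sigma0_to_grey(sigma0, vmin_db=-25.0, vmax_db=0.0):
    """Map linear backscatter values to 8-bit grey levels.

    Low backscatter (smooth, mirror-like surfaces) maps to dark pixels;
    high backscatter (rough surfaces) maps to bright pixels.
    vmin_db / vmax_db are illustrative display bounds, not standard values.
    """
    sigma0 = np.asarray(sigma0, dtype=float)
    db = 10.0 * np.log10(np.clip(sigma0, 1e-10, None))  # linear -> decibels
    scaled = (db - vmin_db) / (vmax_db - vmin_db)        # normalise to [0, 1]
    return (np.clip(scaled, 0.0, 1.0) * 255).astype(np.uint8)

# A smooth patch (low sigma0) renders near-black, a rough patch much brighter.
print(sigma0_to_grey([0.001, 0.05, 0.5]))
```

The logarithmic step reflects the huge dynamic range of radar returns; the final stretch is purely a display choice, which is exactly why the same grey level can hide very different physics underneath.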
In a sense, the sensor and the ocean “speak” to each other, and we have to learn that language to properly understand what a SAR image is trying to tell us.
When trying to detect oil spills at sea, we profit from the fact that oil films dampen surface capillary waves. Capillary waves are wind-generated ripples with wavelengths of less than a few centimetres. The attenuation of these very small waves creates a “mirror” at the sea surface and, therefore, a region of low backscatter. The oil spill reveals itself in the image as an area of dark grey or black. This visual effect can provoke two misleading perceptions in observers. The first is the subconscious association of this black color with the black color of the thick oil spills we are familiar with from pictures of major accidents. Unfortunately, very thin films can produce a similar effect, so the darkness of the pixels cannot tell us the thickness of the oil slick or the quantity spilled. Oil spreads enormously when spilled, and only very thick oil layers look black when observed with the naked eye. The second misleading perception comes from assuming that every dark pixel in the image corresponds to a mineral oil spill. As mentioned, numerous processes are capable of modulating surface roughness, some of them creating patches that can look like oil spills. The most common one is associated with the biological dimension of the oceans. Fortunately for us, ocean waters are populated by plankton, which makes life possible on Earth. These tiny creatures exchange chemical compounds with their environment, some of which are natural oils that tend to aggregate at the surface in calm weather. These natural films, known as biogenic slicks, are ubiquitous and excellent tracers of other physical phenomena like ocean fronts, ocean eddies, upwelling, internal waves, etc. Therefore, the analysis of a SAR image requires care, ancillary data and years of experience to provide an accurate outcome.
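As a toy illustration of why darkness alone is not enough, here is a deliberately naive dark-spot detector. Everything in it (the function name, the sensitivity parameter, the synthetic one-dimensional “scene”) is a hypothetical sketch; note that the mask it returns says nothing about whether a dark patch is mineral oil, a biogenic slick or simply a calm, wind-sheltered area:

```python
import numpy as np

def dark_spot_mask(db_image, k=1.5):
    """Flag pixels whose backscatter (in dB) falls well below the scene
    background. A deliberately naive global threshold: operational
    detectors use local statistics, contextual features and ancillary
    data (e.g. wind), and still need an analyst to rule out look-alikes.
    k is an illustrative sensitivity parameter, not a standard value.
    """
    img = np.asarray(db_image, dtype=float)
    background = np.median(img)   # robust estimate of the clean-sea level
    spread = np.std(img)
    return img < background - k * spread

# Toy 1-D "scene": clean sea around -10 dB with one slick-like dip.
scene = np.array([-10.2, -9.8, -10.1, -21.0, -22.5, -20.8, -10.0, -9.9])
print(dark_spot_mask(scene))
```

The three depressed values are flagged, but the detector cannot know what caused them, which is precisely the ambiguity the paragraph above describes.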
The next image shows a SAR image taken three months before the Wakashio accident (left) and another one from the week the ship started leaking oil (right), revealing how low backscatter alone cannot drive a high-quality analysis.
Often, natural oil films and mineral oil films (oil spills) coexist in the same satellite image, so interpretation of the image by an experienced observer is crucial.
The next natural step is to transform this human-based process into a digital, automatic one, boosting its scalability. New Space rules impose a definite recipe: cost reduction, commercialisation, disruptive innovation, agility and flexibility. Following the examples we see in other markets like health, AI solutions will become the standard in image interpretation. However, we still need radiologists to teach a machine how a certain illness reveals itself in an X-ray image. Human brain patterns, acquired over years in the profession, will translate into artificial brain patterns, making the process agile, accurate and scalable thanks to deep learning algorithms. Similarly, the automatic detection of oil spills in satellite imagery will need experts in oceanography and remote sensing with proven experience in the field. We call this expert-driven AI, and although labelling skin cancer images is far more challenging than labelling images of cats, dogs or cars, it opens the opportunity to drive disruptive innovation in the field, bringing convenience, availability and affordability where complexity and high cost have become the status quo.
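To make the expert-driven idea tangible, the sketch below trains a trivial classifier on synthetic, “expert-labelled” patch features. The feature choices, numbers and labels are invented for illustration; a real system would learn from thousands of analyst-annotated SAR scenes with deep learning rather than this toy logistic regression:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for expert-labelled SAR patches. Each patch is
# summarised by two hand-crafted features an analyst might look at:
# mean backscatter (dB) and local contrast. Labels: 1 = oil slick,
# 0 = look-alike. All numbers are illustrative, not real measurements.
slicks     = rng.normal(loc=[-22.0, 1.0], scale=[1.5, 0.3], size=(50, 2))
lookalikes = rng.normal(loc=[-16.0, 3.0], scale=[1.5, 0.3], size=(50, 2))
X = np.vstack([slicks, lookalikes])
y = np.array([1] * 50 + [0] * 50)
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardise for stable training

# Minimal logistic regression trained by gradient descent: the "artificial
# brain pattern" distilled from the expert's labels.
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probability of "slick"
    w -= 0.1 * (X.T @ (p - y)) / len(y)
    b -= 0.1 * np.mean(p - y)

accuracy = np.mean(((1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5) == y)
print(f"training accuracy: {accuracy:.2f}")
```

The classifier only ever knows what the labels teach it, which is the point: the quality of the expert annotation bounds the quality of the automation.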