
Φ-lab workshop on Sentinel-1 SAR data


Last week, ESA’s Φ-lab welcomed partners from the World Wildlife Fund (WWF) and the Food and Agriculture Organization of the United Nations (UN-FAO) for a collaborative workshop exploring the potential use of Sentinel-1 SAR data for some of their projects. Both organisations work extensively in tropical regions, where cloud cover hinders regular environmental mapping with optical datasets.

It is well known that SAR sensors, being active instruments, can acquire data independently of cloud cover and time of day. On the other hand, the underlying physical principles are fundamentally different from those of optical sensors such as Sentinel-2 or Landsat. New users therefore often struggle with proper image processing and interpretation, as well as with adequate use of the data for the various tasks involved in environmental monitoring.

The workshop therefore aimed to demystify SAR data by presenting practical examples of Sentinel-1 processing workflows and adapting them to specific problem statements, such as mapping deforestation and mangrove forests, identifying water holes, and large-scale mapping of crop types.
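One of the simplest of these workflows, detecting open water such as water holes, exploits the fact that smooth water surfaces reflect the radar signal away from the sensor and thus appear very dark in SAR imagery. A minimal sketch of this idea follows; the array values and the threshold are illustrative only, not calibrated figures from the workshop.

```python
import numpy as np

# Hypothetical 2D patch of Sentinel-1 VV backscatter in dB
# (synthetic values for illustration only).
backscatter_db = np.array([
    [-8.0, -7.5, -19.0],
    [-9.1, -21.3, -20.5],
    [-7.8, -8.2, -18.9],
])

# Smooth open water scatters the signal away from the sensor,
# so it shows up as very low backscatter; a simple threshold
# gives a first-cut water mask (threshold chosen for this toy data).
WATER_THRESHOLD_DB = -16.0
water_mask = backscatter_db < WATER_THRESHOLD_DB

print(int(water_mask.sum()))  # → 4 pixels flagged as water
```

In practice a single-date threshold is noisy; operational approaches typically combine multi-temporal statistics and speckle filtering before thresholding.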

RGB Sentinel-1 Timescan composite over the northeast of Borneo island in Malaysia. The green area in the center is the Tabin Wildlife Resort
WWF, ESA and FAO participants

A special focus was put on the innovative use of the free and openly available data of Sentinel-1, whose radar eyes cover the entire Earth at least every 12 days and produce around 10 TB of raw data every day. While this amount of data enables completely new ways of extracting ever more detailed information at large scales, it also confronts users with challenges in data handling and information extraction. Both partner organisations mainly use Google Earth Engine to tackle this issue, and the corresponding processing strategies were presented. In addition, the SNAP-based Open SAR Toolkit was introduced, which allows almost fully automatic production of large-scale, analysis-ready SAR imagery and provides a more customisable way of processing for non-SAR experts. Its incorporation of Jupyter Notebooks makes it easier to use on remote machines in cloud environments such as the Copernicus Data and Information Access Services (DIASes) or FAO’s SEPAL platform.
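The Timescan composite shown above is one example of such large-scale products: temporal statistics of the backscatter time series are mapped to colour channels, so that stable and changing surfaces take on distinct hues. The sketch below illustrates one possible recipe with synthetic data (the choice of statistics per channel varies between applications and is assumed here for illustration).

```python
import numpy as np

# Toy stack of co-registered Sentinel-1 backscatter images
# with shape (time, rows, cols), in dB; values are synthetic.
rng = np.random.default_rng(0)
stack = rng.uniform(-20.0, -5.0, size=(12, 4, 4))

# One common "timescan" idea: map temporal statistics to RGB,
# e.g. R = temporal max, G = temporal mean, B = temporal min
# (an illustrative combination, not the only one in use).
r = stack.max(axis=0)
g = stack.mean(axis=0)
b = stack.min(axis=0)

def stretch(band):
    """Linearly stretch a band to the 0..1 range for display."""
    return (band - band.min()) / (band.max() - band.min())

rgb = np.dstack([stretch(r), stretch(g), stretch(b)])
print(rgb.shape)  # → (4, 4, 3)
```

Surfaces with a stable backscatter over time end up with similar values in all three channels (greyish tones), whereas seasonally changing targets such as cropland separate the channels and appear coloured.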

Finally, techniques for ingesting the analysis-ready SAR data into machine-learning and AI frameworks were discussed. While such techniques have been around for a while, they are gaining ever more importance with the constantly growing volume of satellite data. Building on this, future collaborations between Φ-lab and both organisations are foreseen, in order to support an effective use of Copernicus data and help WWF and FAO achieve important goals such as wildlife conservation and a world without hunger.
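The core pattern behind such ingestion is simple: once the SAR data is analysis-ready, each pixel becomes a feature vector (for instance, temporal statistics of VV and VH backscatter) that a standard classifier can learn from. A minimal sketch with synthetic, clearly separable classes, using scikit-learn as an assumed example framework:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic "analysis-ready" pixel features: each row is one pixel,
# columns are e.g. mean VV and VH backscatter in dB (invented values).
rng = np.random.default_rng(42)
n = 200
water = rng.normal(loc=[-20.0, -25.0], scale=1.0, size=(n, 2))
forest = rng.normal(loc=[-8.0, -14.0], scale=1.0, size=(n, 2))
X = np.vstack([water, forest])
y = np.array([0] * n + [1] * n)  # 0 = water, 1 = forest

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)

# Classify two new pixels: one water-like, one forest-like.
print(clf.predict([[-19.5, -24.0], [-7.5, -13.5]]))  # → [0 1]
```

Real training data would of course come from field surveys or reference maps rather than synthetic distributions, and deep-learning frameworks follow the same feed-features/fit/predict pattern at larger scale.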

Post contributed by Andreas Vollrath.
