Report on the 2012 IEEE GRSS Data Fusion Contest: Multi-Modal/Multi-Temporal Fusion

September 13, 2012

The Data Fusion Contest is organized by the Data Fusion Technical Committee of the IEEE Geoscience and Remote Sensing Society (GRSS). The Committee serves as a global, multi-disciplinary network for geospatial data fusion, with the aim of connecting people and resources, educating students and professionals, and promoting best practices in data fusion applications.

The Contest has been held annually since 2006. It is open to everyone, not only IEEE members, with the goal of evaluating existing methodologies, at the research or operational level, for solving remote sensing problems using data from a variety of sensors.

1. OVERVIEW OF PREVIOUS DATA FUSION CONTESTS

The focus of the 2006 Contest was the fusion of multispectral and panchromatic images [1]. Six simulated Pleiades images were provided by the French National Space Agency (CNES), each data set consisting of one very high spatial resolution panchromatic image (80 cm) and the corresponding multispectral image (3.2 m resolution). A multispectral airborne image served as the ground reference; it was used by the organizing committee for evaluation but was not distributed to the participants.

In 2007, the Contest theme was urban mapping using synthetic aperture radar (SAR) and optical data, and nine ERS amplitude data sets and two Landsat multispectral images were made available [2]. The task was to obtain a classification map, depicting the land cover and land use patterns of the urban area under study, that was as accurate as possible with respect to a ground reference unknown to the participants.

The 2008 Contest was dedicated to the classification of very high spatial resolution (1.3 m) hyperspectral imagery [3]. The data set was distributed to every participant, and the task was again to obtain a classification map as accurate as possible with respect to a ground reference unknown to the participants. The data were collected by the Reflective Optics System Imaging Spectrometer (ROSIS-03), an optical sensor with 115 bands covering the 0.43–0.86 μm spectral range.

In 2009–2010, the aim of the Contest was to perform change detection using multi-temporal and multi-modal data [4]. Two pairs of data sets, acquired before and after a flood event, were available over Gloucester, UK. The data set contained SPOT and ERS images (before and after the disaster); the optical and SAR images were provided by CNES. As in previous years' Contests, the ground truth used to assess the results was not provided to the participants. Each set of results was first ranked using the Kappa coefficient. The best five results were then combined by decision fusion with majority voting, and a re-ranking was carried out by evaluating each submission's level of improvement with respect to the fused result.
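
The sketch below is a minimal Python illustration of the two ingredients of that protocol, the Kappa coefficient and pixel-wise majority voting, assuming each submission is an integer-labeled class map stored as a NumPy array co-registered with the reference map; the names submissions and ground_truth are hypothetical and not part of any Contest code. The Kappa coefficient is used rather than overall accuracy because it discounts agreement that would occur by chance.

import numpy as np

def kappa_coefficient(predicted, reference):
    # Cohen's Kappa between a predicted label map and the reference map.
    predicted = np.asarray(predicted).ravel()
    reference = np.asarray(reference).ravel()
    classes = np.union1d(predicted, reference)
    n = predicted.size
    # Build the confusion matrix (rows: reference, columns: prediction).
    cm = np.zeros((classes.size, classes.size), dtype=float)
    for i, ci in enumerate(classes):
        for j, cj in enumerate(classes):
            cm[i, j] = np.sum((reference == ci) & (predicted == cj))
    observed = np.trace(cm) / n                                  # overall agreement
    expected = np.sum(cm.sum(axis=0) * cm.sum(axis=1)) / n ** 2  # chance agreement
    return (observed - expected) / (1.0 - expected)

def majority_vote(label_maps):
    # Pixel-wise majority vote over a list of equally shaped label maps;
    # ties are resolved in favor of the smallest label value.
    stack = np.stack(label_maps, axis=0)
    def vote(labels_at_pixel):
        values, counts = np.unique(labels_at_pixel, return_counts=True)
        return values[np.argmax(counts)]
    return np.apply_along_axis(vote, 0, stack)

# Hypothetical usage: rank submissions by Kappa, fuse the five best by
# majority voting, then compare each submission against the fused map.
# ranked = sorted(submissions, key=lambda m: kappa_coefficient(m, ground_truth), reverse=True)
# fused = majority_vote(ranked[:5])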

A set of WorldView-2 multi-angular images was provided by DigitalGlobe for the 2011 Contest [5]. This unique set was composed of five Ortho Ready Standard multi-angular acquisitions, each including 16-bit panchromatic and 8-band multispectral images. The data were collected over Rio de Janeiro (Brazil) in January 2010 within a three-minute time frame, with satellite elevation angles of 44.7°, 56.0°, and 81.4° in the forward direction, and 59.8° and 44.6° in the backward direction. Since there was a large variety of possible applications, each participant was allowed to choose the research topic to work on, exploring the most creative use of optical multi-angular information. At the end of the Contest, each participant was asked to submit a paper describing in detail the problem addressed, the method used, and the final result.

2. 2012 DATA FUSION CONTEST AND ITS DATA SET

The 2012 Contest was designed to investigate the potential of multi-modal/multi-temporal fusion of very high spatial resolution imagery in various remote sensing applications. As shown in Figure 1, three different types of data sets (optical, SAR, and LiDAR) over downtown San Francisco were made available by DigitalGlobe, Astrium Services, and the United States Geological Survey (USGS), including QuickBird, WorldView-2, TerraSAR-X, and LiDAR imagery. The image scenes covered a number of large buildings, skyscrapers, commercial and industrial structures, a mixture of community parks and private housing, and highways and bridges.

Fig. 1. Composition of the optical, SAR, and LiDAR data sets over downtown San Francisco.

The optical and SAR data sets comprised eight images acquired in 2007 and 2011, as shown in Table 1.

Table 1. Sensors and acquisition dates for the images distributed during the Contest.

Sensor                   Acquisition 1                   Acquisition 2
QuickBird/WorldView-2    11 November 2007                9 October 2011
TerraSAR-X               5, 16, and 27 December 2007     2, 13, and 24 October 2011
LiDAR                    June 2010                       —

Following the success of the multi-angular Data Fusion Contest, each participant was again asked to submit a paper at the end of the Contest describing in detail the problem addressed, the method used, and the final result. The submitted papers were automatically formatted to hide the names and affiliations of the authors, in order to ensure the neutrality and impartiality of the reviews.

The Data Fusion Award Committee consisted of eight independent judges from universities, government institutions, and industry:

  • Jocelyn Chanussot, Grenoble Institute of Technology, France
  • Curt Davis, University of Missouri, USA
  • Jenny Q. Du, Mississippi State University, USA
  • Paolo Gamba, University of Pavia, Italy
  • Karl Heidemann, USGS, USA
  • Oliver Lang, Astrium Services, Germany
  • Fabio Pacifici, DigitalGlobe, Inc., USA
  • Uwe Sörgel, Leibniz Universität Hannover, Germany

Papers were judged in terms of sound scientific reasoning, problem definition, methodology, validation, and presentation.

3. OUTCOME OF THE CONTEST

More than 1,150 researchers from across the globe registered for the Contest, an increase of more than 51% over the previous year. This demonstrates the great interest of the Earth observation research and application community. The data set was downloaded from 78 different countries, with a large number of downloads from less developed areas. Figure 2 shows the geographical distribution of the subscribers, where "Other" groups the countries with fewer than 16 participants. Also, as illustrated in Figure 3, about 20% of the participants were from corporations or government agencies.

Fig. 2. Geographical distribution of the registered users for countries with more than 16 participants.

Fig. 3. Affiliation of the registered users.

A number of interesting research topics were submitted, demonstrating the numerous possibilities and variety of applications that multi-modal/multi-temporal remote sensing images can offer, such as change detection, land cover classification, road extraction, moving object detection, information fusion, and image super-resolution.

Final results were announced at the 2012 IEEE International Geoscience and Remote Sensing Symposium held in Munich, Germany. The winners of the 2012 Data Fusion Contest are:

1. C. Berger, M. Voltersen, R. Eckardt, J. Eberle, T. Heyer, N. Salepci, S. Hese, and C. Schmullius, from the University of Jena, Germany, with a paper entitled "Fusion of High-Resolution Optical Imagery and Object Height Information for an Integrated Assessment of Urban Density (UD)"

2. J. Tao (1), S. Auer (2), and R. Bamler (1), from (1) German Aerospace Center (DLR) and (2) Technische Universität München, Germany, with a paper entitled "Combination of LiDAR and SAR Data with Simulation Techniques for Image Interpretation and Change Detection"

3. K. Ewald (1), M. Gartley (2), J. Jacobson (3), and A. Buswell (1), from (1) Ball Aerospace & Technologies Corp., (2) Rochester Institute of Technology, and (3) National Air and Space Intelligence Center, United States, with a paper entitled "Radiosity Technique for Reflectance Retrieval Applied to WorldView-2 Data"

Congratulations to the winners, whose papers were judged to be superior in terms of sound scientific reasoning, problem definition, methodology, validation, and presentation!

The winning teams were awarded IEEE GRSS Certificates of Appreciation during the Technical Committees and Chapters Luncheon. Additionally, this year the Data Fusion Technical Committee was pleased to offer a monetary prize to the winning teams as follows:

  • First Prize: $800
  • Second Prize: $500
  • Third Prize: $300

As is tradition, a manuscript summarizing the Contest outcomes will be submitted for peer review to the IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (JSTARS). To further enhance its impact in the community, the Data Fusion Technical Committee will support its open-access publication cost, with funding provided by the IEEE Geoscience and Remote Sensing Society and DigitalGlobe, Inc.

At the end of the Contest, K. Ewald, M. Gartley, J. Jacobson, and A. Buswell communicated to the Data Fusion Technical Committee their intention to donate their monetary prize to United Way, a non-profit charitable organization that supports education, income, and health (www.unitedway.org).

4. REFERENCES

[1] L. Alparone, L. Wald, J. Chanussot, C. Thomas, P. Gamba, and L. M. Bruce, "Comparison of pansharpening algorithms: Outcome of the 2006 GRS-S data fusion contest," IEEE Transactions on Geoscience and Remote Sensing, vol. 45, no. 10, pp. 3012–3021, October 2007.

[2] F. Pacifici, F. Del Frate, W. J. Emery, P. Gamba, and J. Chanussot, "Urban mapping using coarse SAR and optical data: Outcome of the 2007 GRS-S data fusion contest," IEEE Geoscience and Remote Sensing Letters, vol. 5, no. 3, pp. 331–335, July 2008.

[3] G. Licciardi, F. Pacifici, D. Tuia, S. Prasad, T. West, F. Giacco, J. Inglada, E. Christophe, J. Chanussot, and P. Gamba, "Decision fusion for the classification of hyperspectral data: Outcome of the 2008 GRS-S data fusion contest," IEEE Transactions on Geoscience and Remote Sensing, vol. 47, no. 11, pp. 3857–3865, November 2009.

[4] N. Longbotham, F. Pacifici, T. Glenn, A. Zare, M. Volpi, D. Tuia, E. Christophe, J. Michel, J. Inglada, J. Chanussot, and Q. Du, "Multi-modal change detection, application to the detection of flooded areas: Outcome of the 2009–2010 Data Fusion Contest," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 5, no. 1, pp. 331–342, February 2012.

[5] F. Pacifici and Q. Du, "Foreword to the Special Issue on Optical Multiangular Data Exploitation and Outcome of the 2011 GRSS Data Fusion Contest," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 5, no. 1, pp. 3–7, February 2012.

5. ACKNOWLEDGMENTS

The IEEE GRSS Data Fusion Technical Committee would like to express its great appreciation to DigitalGlobe, Astrium Services, and USGS/CLICK for donating data sets to the scientific community and for their continuing support in providing resources for this initiative.

Reprinted with permission, IEEE GRSS Data Fusion Technical Committee
