SAROPT (2017-2020)

Funding Body: German Research Foundation (DFG)
Duration: 2017 to 2020
Budget: 273,000 EUR
Role: Principal Investigator (PI) and sole author of the proposal

The scientific goal of the SAROPT project is to investigate the stereogrammetric fusion of optical and SAR remote sensing data for the reconstruction of 3D information in urban areas. The main focus of the project is on deep learning techniques for matching SAR and optical imagery, since image matching is the core challenge of SAR-optical stereo analysis. The results of the project contribute to a highly flexible analysis of heterogeneous satellite image data, in particular with respect to rapid 3D mapping in time-critical applications.
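
To illustrate the patch-matching idea at the heart of the project, below is a minimal sketch of a pseudo-siamese matching network in PyTorch: two separate convolutional branches for the SAR and optical modalities whose features are concatenated and passed to a small classifier that decides whether two patches show the same scene (cf. Hughes et al. 2018). All layer sizes, the 112x112 patch size, and the class names are illustrative assumptions, not the networks developed in SAROPT.

# Minimal sketch of a pseudo-siamese SAR-optical patch matcher.
# Illustrative only; architecture details are assumptions, not the SAROPT models.
import torch
import torch.nn as nn

class Branch(nn.Module):
    """Convolutional feature extractor for one modality (SAR or optical)."""
    def __init__(self, in_channels):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )

    def forward(self, x):
        return self.features(x).flatten(1)  # (batch, 128) feature vector

class PseudoSiameseMatcher(nn.Module):
    """Two unshared branches (SAR vs. optical) plus a small classifier that
    predicts whether a SAR patch and an optical patch correspond."""
    def __init__(self):
        super().__init__()
        self.sar_branch = Branch(in_channels=1)   # single-channel SAR intensity
        self.opt_branch = Branch(in_channels=3)   # RGB optical patch
        self.classifier = nn.Sequential(
            nn.Linear(256, 64), nn.ReLU(), nn.Linear(64, 1),
        )

    def forward(self, sar_patch, opt_patch):
        fused = torch.cat(
            [self.sar_branch(sar_patch), self.opt_branch(opt_patch)], dim=1
        )
        return self.classifier(fused)  # logit: corresponding / not corresponding

# Usage example: score a random SAR/optical patch pair.
model = PseudoSiameseMatcher()
sar = torch.randn(1, 1, 112, 112)
opt = torch.randn(1, 3, 112, 112)
logit = model(sar, opt)

The unshared ("pseudo-siamese") branches reflect the fact that SAR and optical patches have very different radiometric and geometric characteristics, so the two modalities are not forced through identical weights before fusion.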

Selected Publications:

  • Hughes LH, Marcos D, Lobry S, Tuia D, Schmitt M (2020) A deep learning framework for matching of SAR and optical imagery. ISPRS Journal of Photogrammetry and Remote Sensing 169: 166-179
  • Hughes LH, Schmitt M, Mou L, Wang Y, Zhu XX (2018) Identifying corresponding patches in SAR and optical images with a pseudo-siamese CNN. IEEE Geoscience and Remote Sensing Letters 15 (5): 784-788
  • Qiu C, Schmitt M, Zhu XX (2018) Towards automatic SAR-optical stereogrammetry over urban areas using very high resolution images. ISPRS Journal of Photogrammetry and Remote Sensing 138: 218-231
  • Bürgmann T, Koppe W, Schmitt M (2019) Matching of TerraSAR-X derived ground control points to optical image elements using deep learning. ISPRS Journal of Photogrammetry and Remote Sensing 158: 241-248
  • Hughes LH, Schmitt M, Zhu XX (2018) Mining hard negative samples for SAR-optical image matching using generative adversarial networks. Remote Sensing 10 (10): 1552