Over the last decade, huge volumes of remote sensing (RS) images have been acquired, leading to massive Earth Observation (EO) data archives from which mining and retrieving useful information is challenging. Volunteered Geographic Information (VGI) such as OpenStreetMap (OSM) offers rich geometric and semantic information that goes beyond land use tags and can greatly facilitate accessing and extracting vital information from big EO archives. However, user-provided tags within OSM can be noisy, incomplete and redundant.
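The kinds of tag problems mentioned above can be made concrete with a small sketch. The sample elements, the synonym table and the `audit` helper below are illustrative assumptions, not part of OSM's actual taxonomy or the project's tooling:

```python
# Minimal sketch: spotting noisy, incomplete and redundant land-use tags
# in a batch of hypothetical OSM elements. Tag values and the CANONICAL
# normalization table are illustrative assumptions.
from collections import Counter

# Hypothetical sample of OSM elements: (element id, tag dictionary)
elements = [
    (1, {"landuse": "residential", "building": "yes"}),
    (2, {"landuse": "Residential"}),   # redundant spelling variant
    (3, {"building": "yes"}),          # incomplete: no land-use tag at all
    (4, {"landuse": "grass"}),         # variant value mapped to a broader class
]

# Assumed normalization table mapping tag variants to a canonical class.
CANONICAL = {"residential": "residential", "grass": "greenery"}

def audit(elements):
    """Return per-class counts plus ids of elements missing a land-use tag."""
    counts, incomplete = Counter(), []
    for eid, tags in elements:
        value = tags.get("landuse")
        if value is None:
            incomplete.append(eid)
            continue
        counts[CANONICAL.get(value.lower(), value.lower())] += 1
    return counts, incomplete

counts, incomplete = audit(elements)
print(counts)      # spelling variants collapse into one class
print(incomplete)  # elements lacking any land-use information
```

Even this toy audit shows why tag normalization and completeness checks are a prerequisite before OSM labels can serve as training data.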
The IDEAL-VGI project aims to address important scientific and practical problems by focusing on two main challenges:
- VGI for land use classification: the lack of a framework to exploit the rich semantic information present at different scales, and the uncertainty of OSM-derived land use classes.
- Big EO data: the characterization, indexing and search of RS images in massive archives.
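One common family of techniques for the second challenge is hashing-based retrieval, where images are represented by compact binary codes and searched by Hamming distance. The sketch below is a toy illustration of that general idea under stated assumptions; the 4-D descriptors, thresholds and archive contents are all hypothetical and not the project's method:

```python
# Minimal sketch of hashing-based image retrieval, one common approach
# to searching massive EO archives. Feature vectors, the thresholding
# scheme and the archive itself are illustrative assumptions.

def binarize(features, thresholds):
    """Map a real-valued feature vector to a compact binary code."""
    return tuple(int(f > t) for f, t in zip(features, thresholds))

def hamming(a, b):
    """Count the positions where two binary codes differ."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical 4-D descriptors for archived RS images.
archive = {
    "img_a": [0.9, 0.1, 0.8, 0.2],
    "img_b": [0.2, 0.9, 0.1, 0.8],
    "img_c": [0.8, 0.2, 0.9, 0.1],
}
thresholds = [0.5, 0.5, 0.5, 0.5]
codes = {name: binarize(v, thresholds) for name, v in archive.items()}

query = binarize([0.85, 0.15, 0.75, 0.25], thresholds)
# Rank archive images by Hamming distance of their codes to the query code.
ranked = sorted(codes, key=lambda name: hamming(codes[name], query))
print(ranked)  # nearest codes first
```

The appeal for big archives is that binary codes are tiny and Hamming distance is cheap, so candidate images can be shortlisted quickly before any finer comparison.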
To this end, we will develop innovative methods that significantly improve the state of the art, both in theory and in the tools currently available. In particular, novel methods will be developed to:
- identify the importance, uncertainty and quality of different OSM-derived features;
- enhance quality assessment methods that promote relevant OSM semantic content and integrate complementary VGI data streams;
- develop machine learning/deep learning algorithms in the framework of RS image classification for automatic OSM tag refinement and assignment;
- develop RS image classification, search and retrieval methods that take OSM tags and their uncertainty into account;
- improve both the OSM semantic land use description and RS image classification by comparing the two classification approaches;
- make full use of VGI to generate accurately annotated data sets and improve labelling accuracy, yielding more reliable training data.
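The comparison idea behind tag refinement can be sketched as follows: flag an OSM land-use label for review when an RS image classifier disagrees with it at high confidence. The parcel ids, labels, predictions and the confidence threshold below are all illustrative assumptions, not project results:

```python
# Minimal sketch of OSM tag refinement via classifier disagreement.
# All labels, predictions and the threshold are hypothetical examples.

THRESHOLD = 0.8  # assumed minimum classifier confidence to challenge a tag

# (parcel id, OSM label, classifier prediction, classifier confidence)
samples = [
    ("p1", "residential", "residential", 0.95),  # agreement, keep the tag
    ("p2", "forest",      "farmland",    0.91),  # confident disagreement
    ("p3", "industrial",  "commercial",  0.55),  # low-confidence disagreement
]

def flag_for_refinement(samples, threshold=THRESHOLD):
    """Return parcel ids whose OSM tag a confident classifier contradicts."""
    return [
        pid for pid, osm_label, prediction, confidence in samples
        if osm_label != prediction and confidence >= threshold
    ]

flagged = flag_for_refinement(samples)
print(flagged)  # only the confident disagreement is flagged
```

Keeping the uncertainty information explicit, as the confidence column does here, is what lets the two classification approaches correct each other rather than simply overwrite one another.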