Geospatial imagery data is an indispensable tool for increasingly complex projects. As imagery becomes more accessible and readily available, many aggregators and providers offer products drawn from a wide range of sources. When data originates from different platforms, varying standards and quality measures can affect consistency and quality, which directly impacts your project results.
Here are six crucial areas for determining the overall quality of a data source and for recognizing the significant differences among data sources that can influence your project.
Accuracy

To avoid spatial errors that propagate to downstream applications, foundational base maps should be encoded with the highest accuracy possible; avoid the common practice of re-digitizing features from far less accurate data sources. Trustworthy data sources rely on vast networks of third-party ground control points. Accuracy can be validated against standards set by the American Society for Photogrammetry and Remote Sensing (ASPRS), using metrics such as CE90 (circular error at 90% confidence), CL95 (95% confidence level), and RMSE (root mean square error).
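As a rough illustration of how these metrics relate, the sketch below computes RMSE from checkpoint residuals and derives approximate CE90/CE95 values using the standard NSSDA multipliers (valid when x and y errors are similar in magnitude). The function name and sample residuals are illustrative, not from any specific provider or standard document.

```python
import math

def horizontal_accuracy(residuals):
    """Compute RMSE and approximate CE90/CE95 from checkpoint residuals.

    residuals: list of (dx, dy) offsets in metres between surveyed ground
    control checkpoints and their measured positions in the imagery.
    """
    n = len(residuals)
    rmse_x = math.sqrt(sum(dx * dx for dx, _ in residuals) / n)
    rmse_y = math.sqrt(sum(dy * dy for _, dy in residuals) / n)
    rmse_r = math.sqrt(rmse_x ** 2 + rmse_y ** 2)
    # NSSDA approximations, assuming rmse_x is roughly equal to rmse_y:
    ce90 = 1.5175 * rmse_r  # circular error, 90% confidence
    ce95 = 1.7308 * rmse_r  # circular error, 95% confidence
    return {"rmse_r": rmse_r, "ce90": ce90, "ce95": ce95}

# Example: four checkpoints with offsets of a few centimetres
result = horizontal_accuracy([(0.10, -0.05), (-0.08, 0.12),
                              (0.05, 0.07), (-0.11, -0.09)])
print(result)
```

A lower CE90 means tighter horizontal accuracy; providers typically publish these figures per product tier, so the same arithmetic lets you compare claims on equal terms.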
Image quality

Best-in-class providers use advanced photogrammetric algorithms and quality control processes to ensure image quality, which can be measured by cross-regional consistency, resolution repeatability, and the absence of artifacts in the data. Other considerations include collection within the same acquisition period, consistent sun angles to limit shadowing and tonal variation, and near-invisible seamlines.
A final quality control check by imagery professionals can elevate data quality even further by correcting seamlines and color grading and by making detailed image adjustments. While today's processing software delivers high image quality, human curation can catch nuances and imperfections that even the best processing algorithms miss.
Coverage and frequency
Projects need comprehensive area coverage, but mixing data sets from various sources to attain that coverage can introduce consistency problems. Look for providers that have delivered complete coverage of an area consistently over the years and that commit to a regular refresh schedule. Refresh frequency matters: images captured in the same season and with the same system improve change detection and automated feature extraction.
Resolution

Image resolution must be right for the job, so know exactly what you need upfront. For most projects, 15 cm/6 in resolution in metropolitan areas and roughly 30 cm/12 in in rural areas is a good, affordable starting point. For core urban locations, requirements can be as fine as 5 cm/2 in.
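One way to translate "right for the job" into a number is the rule of thumb that a feature should span a few pixels to be reliably detected. The helper below is a hypothetical sketch of that rule; the three-pixel default is an assumption for illustration, not an industry standard.

```python
def min_feature_size_cm(gsd_cm: float, pixels_needed: float = 3.0) -> float:
    """Smallest feature (in cm) that imagery at a given ground sample
    distance (GSD) can be expected to resolve, assuming a feature must
    span `pixels_needed` pixels to be reliably detected (rule of thumb)."""
    return gsd_cm * pixels_needed

# At 15 cm GSD (the metro recommendation above), features of roughly
# 45 cm and larger are detectable; at 5 cm GSD the threshold drops to ~15 cm.
for gsd in (5, 15, 30):
    print(f"{gsd} cm GSD -> features >= {min_feature_size_cm(gsd):.0f} cm")
```

Running the target feature sizes for your project through a check like this makes it easier to justify paying for finer resolution only where you actually need it.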
Licensing and vendor commitments

Each imagery provider makes different commitments to data quality, acquisition timelines, data usage rights, and support. For large-scale projects, it's critical to select a vendor that best aligns with your needs, such as the licensing rights to extract analytics from the data via machine learning algorithms or commitments to an acquisition schedule.
Delivery method

The delivery method you choose affects how you obtain, manage, use, and share images. It also affects cost, IT infrastructure, and security and access control.
In conclusion, focusing on these key areas will help you set expectations as you begin your search and ultimately improve your project's results. As you evaluate data providers, look at the big picture and weigh additional factors in the decision-making process, such as different capture methods and the traits of a good data provider. Download our eBook to learn more about how to select the right geospatial imagery for the job.