Geospatial data is increasingly central to decision-making in all walks of life. In fact, greater use of location-based data has the potential to generate $700 billion of value for end users.

However, anyone who relies on data for a living knows that not all data is fit for purpose and the cost of bad data can be huge.

Poor data leads to poor outcomes, wasting time, increasing human intervention and costing money along the way. Analysts spend more time re-working data than analysing it. Incorrect location information delays projects.

IBM recently estimated that poor-quality data (across all data types) costs the US economy alone as much as $3.1 trillion each year.

As geospatial, or location, data is used and shared more widely, accuracy and reliability become of paramount importance. Just a few metres' error on a road layout can send vital emergency services many miles and precious minutes in the wrong direction. Misrepresenting the position of an electricity line can endanger the lives of workers and lead to power cuts across a city. Operations relying on inadequate spatial data can waste large sums of money and damage their reputation for customer service.

Managing spatial data quality can seem a daunting task. Typically, an organisation's spatial data comes from different sources. It was collected over different periods, at different frequencies, to different levels of accuracy and for different purposes. Often, it is stored in different formats and at different levels of quality and completeness. Integrating that data to support a single, valid decision is hard. Managing and maintaining that data for regular interrogation is even harder.

Instead of treating data quality as a series of discrete (and expensive) projects, far-sighted organisations are beginning to take a more holistic approach. They are creating data quality strategies aligned to organisational needs. They are adopting tools to automate data quality routines, reducing the cost and time involved in keeping their data continuously fit for purpose.

It's an approach that ensures their data is always available, up to date and accurate.

It's an approach in which 1Spatial specialises. Our automated, rules-based approach to data quality helps organisations like Ordnance Survey Great Britain, Northumbrian Water and the Ministry of Defence to cost-effectively manage their critical spatial data.
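To make the rules-based idea concrete, here is a minimal sketch of what an automated quality rule might look like. The library choice (Python with shapely), the attribute names and the rules themselves are illustrative assumptions, not 1Spatial's actual rule engine.

```python
# A minimal sketch of a rules-based spatial data quality check.
# Assumptions: Python with the shapely library, GeoJSON-like feature
# dicts, and two illustrative rules -- this is not 1Spatial's product.
from shapely.geometry import Polygon

def rule_geometry_is_valid(feature):
    """Fail features whose geometry is self-intersecting or malformed."""
    return feature["geometry"].is_valid

def rule_required_attributes(feature):
    """Fail features missing mandatory attributes (names are assumed)."""
    required = ("id", "source", "captured_on")
    return all(feature["properties"].get(k) is not None for k in required)

RULES = [rule_geometry_is_valid, rule_required_attributes]

def run_rules(features):
    """Return a list of (feature id, failed rule names) for review."""
    failures = []
    for feature in features:
        failed = [rule.__name__ for rule in RULES if not rule(feature)]
        if failed:
            failures.append((feature["properties"].get("id"), failed))
    return failures

features = [
    # A well-formed square with complete attribution: passes both rules.
    {"geometry": Polygon([(0, 0), (1, 0), (1, 1), (0, 1)]),
     "properties": {"id": "A1", "source": "survey", "captured_on": "2018-01-15"}},
    # A self-intersecting "bow-tie" polygon missing attributes: fails both.
    {"geometry": Polygon([(0, 0), (1, 1), (1, 0), (0, 1)]),
     "properties": {"id": "B2", "source": None, "captured_on": None}},
]

print(run_rules(features))
# [('B2', ['rule_geometry_is_valid', 'rule_required_attributes'])]
```

Running checks like these automatically, every time data is loaded or updated, is what turns data quality from a one-off project into a continuous process.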

Over the next few weeks, we'll be looking at the impact of poor data quality and exploring how automation can help organisations build smarter data for smarter decisions.

To learn more, download our Little Book of Data Quality here: https://1spatial.com/capabilities/little-book-data-quality/
