Have you ever tried to answer a question using data you knew your organisation had (or had been told existed), but couldn’t find it?
Was this data in a zip file stored on a USB in someone’s desk drawer? All too often, valuable data is captured in the field for projects or single-use tasks and then never re-used because it is not easily locatable, is inconsistent with other data, has no metadata or is simply forgotten.
Data capture and fieldwork are relatively expensive tasks that organisations undertake themselves, or commission from external parties, to gain an understanding of a particular issue. Yet once the insight has been gained, the data itself is typically forgotten rather than treated as a tangible asset with future uses. All too often the data is captured in a schema that differs from what others capture, further hindering the opportunity to consolidate data into a single repository. Attempting to carry out analysis on data that is unstructured and does not follow a common schema is a task fraught with challenges.
When little emphasis is placed on the importance of the data that is created or collected, time is not set aside to ensure that data quality standards are met on individual projects, let alone implemented consistently over time. Accuracy, consistency, completeness and accessibility are the basic indicators that can be used to measure data quality. These indicators can be refined for an organisation by considering how coherent a dataset is, what it was captured to achieve and how it should be used in the future. How often is this analysed, let alone stored in accessible metadata within an organisation? The most common data quality issues are completeness and accessibility, because data captured for a single purpose is often treated as a means to an end rather than as an important asset in its own right.
Accuracy + consistency + completeness + accessibility = data quality
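Indicators like these can be measured rather than just discussed. A minimal sketch in Python of scoring a small dataset for completeness and consistency (the record structure and field names are illustrative, not from any particular system):

```python
# Minimal sketch: scoring records against two of the basic quality
# indicators. Field names and values are illustrative only.

records = [
    {"site_id": "S1", "species": "eucalyptus", "count": 12},
    {"site_id": "S2", "species": "", "count": 7},               # gap
    {"site_id": "S3", "species": "Eucalyptus", "count": None},  # gap
]

required_fields = ["site_id", "species", "count"]

def completeness(recs):
    """Share of required fields that are populated across all records."""
    filled = sum(
        1 for r in recs for f in required_fields if r.get(f) not in (None, "")
    )
    return filled / (len(recs) * len(required_fields))

def consistency(recs, field):
    """Share of values matching the most common spelling, ignoring case."""
    values = [r[field].lower() for r in recs if r.get(field)]
    if not values:
        return 0.0
    most_common = max(set(values), key=values.count)
    return values.count(most_common) / len(values)

print(f"completeness: {completeness(records):.2f}")          # 7 of 9 fields
print(f"species consistency: {consistency(records, 'species'):.2f}")
```

Even a simple score like this, recomputed over time, turns “what is our data quality?” from an opinion into a trend.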
With fieldwork and data capture occurring across many different parts of an organisation, opportunities for cooperative planning across departments and timeframes are sometimes missed. Simple things done in the field, such as taking geo-tagged photos, can reduce or even remove the need for future field trips, because the required knowledge can be extracted from previously captured information.
There are many solutions to the common data management problems that plague organisations. The solutions themselves can be simple; deciding who is responsible for managing and coordinating them can be more complex. Basic standards around data creation, processing, metadata, fieldwork planning and workflow processes can achieve a high degree of data uniformity if correctly implemented and practised across the business. An understanding of an organisation’s overall objectives needs to include where the data lifecycle sits within those objectives and how it is implemented.
Data standards for field- and office-based data capture are an integral part of intelligent data creation, and indeed of efficient workflows. They enable consistent data capture across staff, teams and offices and, more importantly, over time. As some projects run for years, data standards make future consolidation and integration of newly captured data into the existing data model seamless, particularly for analysis of change over time. For example, a simple data capture standard may specify the fields to be captured, which of them are mandatory, and domain lists that limit free-text entry. A more complex standard may conform to a common schema specific to the organisation type or to a common data use. Either way, it is valuable to capture data with a wider perspective of uses in mind and later derive narrower subsets for particular purposes.
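The simple standard described above — mandatory fields plus domain lists — can be sketched as a small validation schema. The field names and domain values below are hypothetical examples, not a prescribed standard:

```python
# Minimal sketch of a field data capture standard: mandatory fields plus
# domain lists that constrain free-text entry. Schema values are illustrative.

SCHEMA = {
    "site_id":   {"mandatory": True,  "domain": None},
    "condition": {"mandatory": True,  "domain": ["good", "fair", "poor"]},
    "notes":     {"mandatory": False, "domain": None},
}

def validate(record):
    """Return a list of problems; an empty list means the record conforms."""
    problems = []
    for field, rules in SCHEMA.items():
        value = record.get(field)
        if rules["mandatory"] and value in (None, ""):
            problems.append(f"{field}: mandatory field missing")
        elif value and rules["domain"] and value not in rules["domain"]:
            problems.append(f"{field}: '{value}' not in domain {rules['domain']}")
    return problems

print(validate({"site_id": "S1", "condition": "good"}))       # []
print(validate({"site_id": "S1", "condition": "excellent"}))  # domain violation
```

Running validation at the point of capture, rather than weeks later back in the office, is what makes the later consolidation seamless.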
Forward fieldwork planning within organisations can increase efficiencies in data capture and post-processing. Planning, travel and safety considerations all add to the cost of fieldwork, so combining fieldwork in the same areas across teams or departments can deliver economic efficiencies and/or reduce the total amount of fieldwork required. Post-capture quality assurance (QA) is a fundamental step before any data is used for analysis or shared with others. QA should catch and fix gaps and illogical entries, flag outliers, and ensure the data is consistent and complete for the purpose it was captured.
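The QA checks just listed can be automated for numeric field data. A minimal sketch, assuming hypothetical depth readings where negative values are physically impossible and anything beyond two standard deviations is worth a second look:

```python
# Minimal sketch of post-capture QA: catch illogical entries and flag
# statistical outliers. Field values and thresholds are illustrative.

import statistics

readings = [4.1, 3.9, 4.3, -2.0, 4.0, 19.5, 4.2]  # e.g. water depth in metres

def qa_flags(values):
    """Flag illogical (negative) entries and values beyond 2 std devs."""
    valid = [v for v in values if v >= 0]
    mean = statistics.mean(valid)
    stdev = statistics.stdev(valid)
    flags = []
    for i, v in enumerate(values):
        if v < 0:
            flags.append((i, v, "illogical: negative depth"))
        elif abs(v - mean) > 2 * stdev:
            flags.append((i, v, "outlier: review before use"))
    return flags

for index, value, reason in qa_flags(readings):
    print(f"row {index}: {value} -> {reason}")
```

An illogical entry is fixed or discarded; an outlier is only flagged, since it may be a genuine extreme observation rather than an error.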
Forward planning can make data a more useful asset for the organisation; however, careful thought should be given to how suitable that data is for decision making. There are distinct limitations to extrapolating targeted data to catchment scale that need to be taken into account. A consolidated repository of organisational data is a key asset for sharing and increasing the knowledge held by an organisation. Do you have a forward fieldwork plan? What is your data quality as of today? Is it accurate, consistent, complete and accessible?
Meet Megan
Regional Lead - Location Intelligence
+61 7 3316 3274
Email Megan Stanley