“The Sell Sider” is a column written by the sell side of the digital media community.
Today’s column is written by Declan Owens, digital analytics expert at Piano.
As the deprecation of third-party cookies looms, the vision of first-party data as the backbone of marketing is clearer than ever. But before they make the switch, publishers and brands alike need to ensure the data they’re collecting meets quality standards for reliability and security.
Here are six ways to assess your company’s data quality and improve data flow through your organization:
Control the efficiency of your data collection
Data sampling is a widespread practice that evaluates only a subset of your data, which is then used to estimate overall results. But when it comes to critical decisions, an estimate isn’t enough. Your data must also capture all cross-device user behavior, preserving a complete view of your customer and your relative performance. Complete data sets, available without sampling, are the best way to avoid undermining your company’s decision-making with skewed data that may not fully reflect reality.
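To illustrate why an estimate isn’t enough, here is a minimal sketch using invented pageview data. Web analytics distributions are typically skewed by a few heavy users, so a scaled-up sample can drift from the true total; the numbers and the 10% sample rate are assumptions for illustration only.

```python
import random

random.seed(7)

# Hypothetical per-session pageview counts: mostly light sessions,
# plus a small group of heavy users that skews the distribution.
sessions = [1] * 9000 + [2] * 800 + [50] * 200

# The full, unsampled total.
full_total = sum(sessions)

# A 10% random sample, scaled up to estimate the total.
sample = random.sample(sessions, len(sessions) // 10)
estimated_total = sum(sample) * 10

print(f"full total:       {full_total}")
print(f"sampled estimate: {estimated_total}")
```

Depending on how many heavy sessions land in the sample, the estimate can land well above or below the true figure, which is exactly the risk when critical decisions ride on sampled data.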

Audit your tagging regularly
Data owners should have access to easy quality control and reliable tagging, including regular procedures such as automated testing that lets you check for the presence of all tags and confirm data reliability. Tools that facilitate tag audits are essential, especially when you’re updating your digital platforms.
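An automated tag-presence check of the kind described above can be sketched as follows. This is a simplified, offline illustration: the page HTML, the URLs and the tag URL `tags.example.com/analytics.js` are all hypothetical, and a real audit would fetch live pages with an HTTP client or headless browser.

```python
import re

# Hypothetical crawl results: page URL -> raw HTML.
pages = {
    "/home":    '<html><head><script src="//tags.example.com/analytics.js"></script></head></html>',
    "/article": '<html><head><title>No tag here</title></head></html>',
}

# The analytics tag we expect to find on every page (assumed URL).
TAG_PATTERN = re.compile(r"tags\.example\.com/analytics\.js")

def audit(pages: dict) -> list:
    """Return the URLs of pages missing the expected analytics tag."""
    return [url for url, html in pages.items() if not TAG_PATTERN.search(html)]

missing = audit(pages)
print("pages missing the tag:", missing)
```

Running a check like this on every release catches the common failure mode where a template change silently drops a tag from part of the site.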
Ensure the accuracy of your measurements
It’s important to have full transparency into how metrics are calculated by understanding what goes into your service provider’s data pipeline and how the metrics are built. To understand your real traffic volume, you’ll also need to identify and exclude traffic from bots that browse your sites. Bot traffic can account for more than half of all web traffic, so it can significantly skew statistics if it’s not addressed.
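A minimal sketch of bot exclusion by user-agent is shown below. The marker list and the sample hits are assumptions for illustration; production analytics platforms combine curated bot lists with behavioral signals rather than relying on substring matching alone.

```python
# Assumed user-agent substrings that commonly indicate automated traffic.
BOT_MARKERS = ("bot", "crawler", "spider", "headless")

def is_bot(user_agent: str) -> bool:
    """Crude user-agent check: flag hits whose UA contains a bot marker."""
    ua = user_agent.lower()
    return any(marker in ua for marker in BOT_MARKERS)

# Hypothetical raw hits from a server log.
hits = [
    {"url": "/home",    "ua": "Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"},
    {"url": "/home",    "ua": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
    {"url": "/pricing", "ua": "HeadlessChrome/119.0"},
]

human_hits = [h for h in hits if not is_bot(h["ua"])]
print(f"{len(hits)} raw hits, {len(human_hits)} after bot filtering")
```

Even this crude filter removes two of the three hits above, which shows how dramatically unfiltered bot traffic can inflate reported volumes.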
Additionally, many ad-blocking extensions and browsers block some trackers by default, even when the legal context (consent, for example) gives you the legitimacy to collect this data. To address this, it’s important to deploy first-party domain measurement to take back control and foster trust, especially in the relationship between your customers and your business.
Continuously clean your data to improve its integrity
Make sure your data is accessible, well-formatted and being captured exactly as intended. By checking how your data is retrieved and presented, you ensure your values are correctly reflected in your reports. During collection, you should check that the numbers in your analysis are plausible and validated.
With customized tagging tools that set custom data-processing rules, you can fix errors caused by tagging problems, improve the data collected and exclude unwanted traffic. Tools with automated quality assurance let you independently update your data when needed without relying on technical support to change the code.
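Collection-time validation of the kind described above can be sketched as a set of per-field rules. The field names, ranges and sample events here are all invented for illustration; the point is that implausible values are flagged at collection rather than discovered later in reports.

```python
# Hypothetical validation rules: field name -> predicate the value must satisfy.
RULES = {
    "page_load_ms": lambda v: isinstance(v, (int, float)) and 0 <= v < 60_000,
    "country":      lambda v: isinstance(v, str) and len(v) == 2 and v.isupper(),
    "price":        lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(event: dict) -> list:
    """Return the names of fields in the event that fail their rule."""
    return [field for field, rule in RULES.items()
            if field in event and not rule(event[field])]

good = {"page_load_ms": 843, "country": "FR", "price": 12.5}
bad  = {"page_load_ms": -5,  "country": "France", "price": 12.5}

print("good event failures:", validate(good))
print("bad event failures: ", validate(bad))
```

Flagging failures instead of silently dropping or keeping the event leaves the choice — correct, quarantine or exclude — to a documented processing rule.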
Centralize data around a single reference point
Implementing effective data governance promotes consistent use of data by establishing a single source of truth. Common analysis tools facilitate sound decision-making based on data that is collected, calculated and processed in the same way. With common reference indicators for consistency, you determine which indicators, based on their definition and calculation, should be used, for example, as performance benchmarks for each business function or to run macro-level reporting.
Maintain strict regulatory compliance
By aligning your approach to data collection and processing with applicable data protection laws, you ensure by default that your data is accurate and compliant. You must have total clarity on the data you’re collecting, how it’s processed, where it’s stored and for how long, and be able to modify or delete data when needed.
Data has incredible potential, but too often we see organizations trying to turn lead into gold instead of building pipelines. In other words, if data is the oil of tomorrow, we are trying to put it in our engines without refining it first. The result is explosive.
Without the effort required at all stages of the chain (collection, processing, storage, restitution, distribution), data cannot properly irrigate the decision-making systems that guide a business’s strategy. An executive team that backs a strategic decision based on a questionable indicator without quality control may be worse off than one relying on good old intuition.
Follow Piano (@piano_io) and AdExchanger (@adexchanger) on Twitter.