
ITP 2.1: What is changing and how do we deal with it?

Apple has announced plans to tighten the ITP (Intelligent Tracking Prevention) regulations in its Safari browser. ITP version 2.1 is now live and immediately has a major impact on digital marketing and analytics due to its handling of third-party cookies. Firefox has announced similar tracking prevention, also cracking down on first-party cookies in addition to third-party ones. In this blog we bring you up to speed on what these tracking preventions mean for organisations and how we have resolved this for the users of our Datastreams Platform.

What is ITP?

ITP stands for Intelligent Tracking Prevention. It represents Apple’s stand against online tracking and has been causing concerns for companies applying personalised marketing since its first incarnation. The first version started by limiting the possibilities for placing third-party cookies, with later releases increasingly limiting the potential for workarounds and alternatives. The previous version 2.0 blocked the placement of third-party cookies altogether. First-party cookies were largely unaffected by ITP. Until now, with the release of ITP 2.1.

What is changing?

The most important change in ITP version 2.1 for organisations engaged in digital marketing is the way both first- and third-party cookies are handled. After the update, first-party client-side cookies created through JavaScript’s document.cookie will expire after seven days. Third-party cookies, created by domains other than the current website, remain blocked, as was the case in ITP 2.0.
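To illustrate, here is a minimal sketch of a client-side cookie written the way ITP 2.1 targets. The cookie name and value are hypothetical examples; the point is that however distant an expiry date the script requests, Safari with ITP 2.1 will cap the cookie’s lifetime at seven days.

```typescript
// Build a cookie string for document.cookie. The name and value
// here are hypothetical examples for illustration.
function buildClientCookie(name: string, value: string, days: number): string {
  const expires = new Date(Date.now() + days * 24 * 60 * 60 * 1000);
  return `${encodeURIComponent(name)}=${encodeURIComponent(value)}` +
    `; expires=${expires.toUTCString()}; path=/`;
}

// Requesting a one-year lifetime; under ITP 2.1 Safari will still
// expire a cookie written this way after seven days.
const cookie = buildClientCookie("visitor_id", "abc123", 365);
// In a browser: document.cookie = cookie;
```

Analytics tools that store a visitor identifier this way will therefore see the same person as a new visitor once the seven days have passed.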

Where the blocking of third-party cookies had severe consequences for marketers, the seven-day expiry of client-side first-party cookies has the potential to significantly impact analytics. Since site visitors who return after seven days will no longer be counted as returning visitors, current solutions for conversion tracking based on these cookies risk breaking down.

What are we doing about it?

Currently, the solutions to ITP 2.1 are twofold. First, drastically limit reliance on third-party cookies. DimML, the language at the core of the Datastreams Platform, already enables our users to do this by allowing a script to be delivered through the same domain as the webpage from which it was loaded. Second, place first-party cookies through a server-side method instead of through the client-side document.cookie implementation.
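As a sketch of that second approach: a cookie set through an HTTP Set-Cookie response header from the site’s own domain is created server-side, so it is not subject to ITP 2.1’s cap on cookies written through document.cookie. The server, cookie name and attributes below are illustrative assumptions, not the platform’s actual implementation.

```typescript
import * as http from "http";

// Build a Set-Cookie header value. Cookies set through an HTTP
// response header, rather than through document.cookie, are not
// subject to ITP 2.1's seven-day client-side cap.
function buildSetCookieHeader(name: string, value: string, maxAgeDays: number): string {
  const maxAge = maxAgeDays * 24 * 60 * 60; // lifetime in seconds
  return `${name}=${value}; Max-Age=${maxAge}; Path=/; Secure; SameSite=Lax`;
}

// Minimal first-party server sketch: the response sets the cookie
// server-side on the site's own domain.
const server = http.createServer((req, res) => {
  res.setHeader("Set-Cookie", buildSetCookieHeader("visitor_id", "abc123", 365));
  res.end("ok");
});
// server.listen(8080); // left commented so the sketch stays inert
```

Because the header is emitted by the first-party domain itself, the browser treats the cookie as an ordinary server-set first-party cookie.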

We’ve released a new component within our platform that allows our customers to integrate the complete Datastreams Platform, with all its capabilities, within their own domain. This means that the Datastreams Platform becomes part of your IT architecture rather than a third-party application. Data ownership and compliant data management are at the core of our architecture, so it will not be affected by ITP 2.1. This is a core differentiator from many SaaS marketing-technology and consent-management providers: we give you full control over how you manage your first-party data, with accurate and compliant data ownership driven by our state-of-the-art data architecture.

As the data and privacy landscape continues to change, we will continue to ensure the users of our Datastreams Platform can perform data analysis in an easy, secure and compliant manner. Do you want more information about how we are dealing with the ITP 2.1 update? Contact us!

Cleaning your database means ensuring that data is valid, complete, stored in the right places and accurate across the organisation.

Why you should (not have to) clean your company database

Spring is here, which means it’s time for a thorough spring cleaning. Aside from clearing the unnecessary papers from those clogged filing cabinets, consider turning your attention to your company database this year, because according to recent studies of the data practices of contemporary organisations, you probably need to clean your database.

In a world where companies are growing increasingly data-driven, business success increasingly depends on analytics based on large quantities of high-quality, trusted data. While many organisations are succeeding in acquiring large amounts of data and applying analytics to them, data quality often leaves a lot to be desired. In a study conducted by Experian, 95% of organisations indicated experiencing wasted resources and unnecessary costs due to poor-quality data. This is not surprising, since organisations on average believe 29% of their data to be inaccurate, and as is often said in the field of data science: ‘garbage in, garbage out’.

It is clear from the percentages above that, statistically, it is highly likely that your company can benefit from a good spring cleaning of your database. Ensuring that data is valid, complete, stored in the right places and accurate across the organisation empowers you to trust your data again. This means you won’t have to waste time and money on marketing campaigns that are based on unreliable analytics. However, cleaning your data can be very time-consuming, especially if your data infrastructure is not designed to be managed easily by business professionals. Additionally, data will need to be cleaned regularly to keep your data environment healthy and usable. Luckily, a good data quality monitoring & assurance solution can make your life a lot easier by preventing dirty data from entering your database in the first place and making cleaning a lot easier.

Data professionals know that data cleaning is a key part of any database management strategy. However, just cleaning your data periodically is not enough: if you don’t ensure data quality at the source, polluted data will continue to build up between cleaning sessions, potentially throwing off your analytics. That is why a strategy for validating data at the source, before it is analysed or enters your database, is crucial. Our Data Quality and Assurance module increases the overall quality of your data ecosystem by ensuring that only quality data enters your database, and it continuously monitors your data streams to ensure they keep supplying complete, high-quality data. Together with the streamlining and seamless integration of data streams by the main Datastreams Platform, this gives companies a clean and orderly environment in which to manage their data.
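The idea of validating at the source can be sketched in a few lines. The record shape and validation rules below are assumptions made for the example, not the actual rules of the Datastreams Quality and Assurance module: incoming records are checked before they are forwarded, so malformed data never reaches the database.

```typescript
// Illustrative sketch of source-side validation. The field names
// and rules are assumptions for this example only.
interface Visit {
  visitorId: string;
  timestamp: string; // ISO 8601
  pageUrl: string;
}

function isValidVisit(record: Partial<Visit>): boolean {
  return typeof record.visitorId === "string" && record.visitorId.length > 0 &&
    typeof record.timestamp === "string" && !Number.isNaN(Date.parse(record.timestamp)) &&
    typeof record.pageUrl === "string" && record.pageUrl.startsWith("http");
}

// Only records passing validation are forwarded; the rest are
// rejected at the source instead of polluting the database.
const incoming: Partial<Visit>[] = [
  { visitorId: "abc123", timestamp: "2019-04-01T10:00:00Z", pageUrl: "https://example.com/" },
  { visitorId: "", timestamp: "not-a-date", pageUrl: "ftp://bad" },
];
const clean = incoming.filter(isValidVisit);
```

Filtering at this point is what keeps periodic cleaning sessions small: only the first, well-formed record would be passed on.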

Implementing our solution does not mean you will never have to clean your data (cleaning remains imperative for keeping your data up to date and removing data you no longer need), but it makes these periodic cleanings a lot less time-consuming. Want to know more about our Data Quality and Assurance module and how it works? Visit our page about it.