
Firefox V69.0

Have you heard of the latest change in Mozilla Firefox? As of version 69.0, Firefox blocks third-party tracking cookies and cryptomining by default!

As Firefox is the browser with the second-largest desktop install base, this has a major impact on your website analytics data!


Dreams of the future: Marketing

A peek into the future

A customer is navigating his way around the internet. On one of his favorite websites, he comes across a piece of sponsored content. Although the ad is clearly marked as such, he does not perceive it as annoying or intrusive, but as a piece of relevant content presenting a way to style clothing he had been interested in before. Experiencing this kind of relevant, targeted content convinces our customer to give our advertiser permission to use his data for advertising purposes and to turn off his adblocker.

Enticed by an offer, the customer clicks on the advertisement and is directed to the advertiser’s website, where he notices the option to subscribe to the company’s newsletter. Because the advertiser’s content has been relevant and enjoyable, he subscribes to the newsletter, knowing that if he ever wants to unsubscribe it will be easy and the company will respect his wishes. When he receives the newsletter, he is pleasantly surprised to find it filled with articles relevant to him personally, as if it was created specifically for him.

Made possible by data streaming solutions

Positive online experiences like the one above are facilitated by collecting, combining & streaming relevant data to create the best possible personal experience. Let’s take a look behind the scenes at how a seamless & positive customer experience can be achieved:

  • A (new) customer may not realize it, but the way he is approached online during his customer journey is carefully planned and monitored. Campaign performance is consistently evaluated, and campaigns are adjusted dynamically to respond to customer behavior and outside forces (such as weather patterns or current events). An advertisement may even be initiated based on his behavior in real-time through marketing triggers based on streams of customer behavior data. This enables very timely and relevant marketing efforts to be performed.
  • The marketing department of the company has achieved a 360-degree customer view, based on all data relevant to him. This makes it possible to construct a newsletter consisting of content selected only for him by extensive machine learning algorithms.
  • In the marketing world of the future, customers feel secure in allowing their data to be shared by companies for advertising purposes. This is because they know their data will be handled securely and with full respect of their privacy. They also know that they can rescind the consent given for using their data at any time. This trust is facilitated by organisations adopting safe and transparent data processing practices and solutions, such as our Datastreams Platform and its consent management module.
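
The real-time marketing trigger described in the first bullet can be sketched in a few lines. This is a purely illustrative Python sketch, not part of our platform; the window size, threshold and campaign naming are assumptions:

```python
from collections import defaultdict, deque

# Hypothetical sketch, not part of the Datastreams Platform: fire a marketing
# trigger when a visitor views the same product category three times within
# their most recent events. Window size and threshold are assumptions.
WINDOW = 10    # most recent events kept per visitor (assumed)
THRESHOLD = 3  # views of one category needed to trigger (assumed)

class TriggerEngine:
    def __init__(self):
        self.history = defaultdict(lambda: deque(maxlen=WINDOW))

    def process(self, visitor_id, category):
        """Record one behaviour event; return a trigger dict, or None."""
        events = self.history[visitor_id]
        events.append(category)
        if list(events).count(category) >= THRESHOLD:
            return {"visitor": visitor_id, "campaign": f"retarget-{category}"}
        return None
```

In a real deployment the trigger would of course feed into a campaign system rather than return a dictionary, but the principle is the same: decisions are made on the event stream as it flows in.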

Ready for the future of customer-friendly, relevant and valuable marketing? Read more about how our Datastreams Platform helps online marketeers here.

The Future of Healthcare, enabled by data streaming

Dreams of the future: Healthcare

As a leading expert in the field of (big) data solutions, we help develop the technologies that are going to shape our future. In this blog we discuss the future of healthcare, which will become more customer friendly and efficient as we learn to leverage the power of data streaming.

A peek into the future

A patient walks into a clinic or hospital. He suddenly fell ill during his holiday, and an online consultation with his own physician resulted in the advice to see a GP locally. He goes to the GP, who has already received his patient file and the information from the consultation with his own physician. Upon inspection, the GP refers our patient to the hospital for further examination. When he arrives, the hospital already knows who he is, what his symptoms are and any other conditions they should be aware of. This means the hospital is already aware of the patient’s allergy to penicillin, even if he forgets to mention it himself. After being examined (using an MRI scan, amongst other equipment), the patient is swiftly diagnosed. Luckily, his condition does not require a stay in the hospital. He receives his treatment and is sent on his way with a prescription and a healthcare wearable to monitor his condition remotely. When he arrives back home, his practitioner is aware of everything and takes over care from the hospital. Additionally, a refill of his medication is already waiting at his local pharmacy for pickup. Even though different institutions and people were involved in the care of this patient, he experienced a seamless medical journey.

Enabled by data streaming

The seamless healthcare journey our patient experiences is made possible by data streaming solutions enabling seamless data-driven collaboration. Take a look behind the scenes:

  • Solutions for sharing patient records in a secure manner ensure that healthcare practitioners already know vital information about patients and their symptoms when they arrive. This makes care more personal, efficient and safe. It also makes it easy for patients to switch between healthcare practitioners.
  • Our patient doesn’t know it, but the MRI machine used to examine him is part of a healthcare equipment sharing programme. This means that the MRI machine is also used by other healthcare providers and the MRI images are securely streamed to the appropriate institution. This makes healthcare equipment more affordable for smaller institutions. Solutions for handling this type of MRI imaging data responsibly are required to bring such equipment sharing programmes to life.
  • The healthcare wearable our patient wears continuously monitors his biometric data and streams this data to his healthcare practitioner. A solution dedicated to processing and responding to this real-time data allows healthcare practitioners to respond to any warning signals before our patient even notices them.
  • During this visit, our patient has caused plenty of treatment and administrative data to be generated. After leaving the hospital, this data is stored anonymously and used to improve care in the future. Solutions for processing and analysing large amounts of data are instrumental for doing this in an efficient manner. Using a solution like the Datastreams Platform, this data can also be shared with other healthcare institutions to improve R&D and enable faster healthcare innovation.
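
The wearable monitoring described in the third bullet can be pictured as a simple filter over a stream of readings. A minimal, hypothetical Python sketch, not our actual implementation; the safe heart-rate range is an assumption for illustration:

```python
# Illustrative sketch (not our actual implementation): scan a stream of
# wearable heart-rate readings and surface a warning the moment a value
# leaves the patient's safe range. The range below is an assumption.
SAFE_RANGE = (50, 120)  # resting heart rate bounds in bpm (assumed)

def monitor(readings, safe_range=SAFE_RANGE):
    """Yield (timestamp, bpm) alerts for readings outside the safe range."""
    low, high = safe_range
    for timestamp, bpm in readings:
        if not low <= bpm <= high:
            yield (timestamp, bpm)

# A short example stream: one reading spikes above the safe range.
stream = [("09:00", 72), ("09:05", 75), ("09:10", 138), ("09:15", 74)]
alerts = list(monitor(stream))
```

The generator form matters here: alerts surface as readings arrive, so a practitioner can be notified in real time rather than after a batch run.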

The future is not here yet, but we’re moving towards it every day. Ready to move towards the future with us? Read more about how our Datastreams Platform can help healthcare institutions here.


Dreams of the future: Finance

As a leading expert in the field of (big) data solutions, we help develop the technologies that are going to shape our future. In this blog we discuss the future of financial service institutions, which will only become more data-driven as pressure on these institutions increases. Today we present a snapshot of the financial institution of the future and the data streaming technologies that power it.

A view of the future

The financial institution of the future is faster and more customer-friendly than the financial institutions of our current time. When interacting with their bank online, the customer has a seamless customer experience that is tailored to their preferences. Additionally, they are proactively approached about possible issues or opportunities. For instance, a customer might be approached about the possibility of scheduling a weekly payment when the system notices weekly manual transfers, or be asked if they have made an error when exhibiting abnormal behaviour (e.g. transferring €1050 instead of €10,50). These suggestions are frequently made at relevant times while the customer is actively interacting with the system, making banking more personal, engaging and meaningful.

Customers in the future also enjoy a more open and seamless financial experience across platforms. Open banking regulations mean they can choose from a plethora of third-party applications for managing their finances in a seamless and easy manner. The financial customer of the future has no problem consenting to their data being shared for this purpose, trusting the high security and privacy standards upheld by financial organisations. 

Finally, aside from offering a better banking experience for customers, the financial institution of the future is also much safer and less likely to facilitate fraud or money laundering. Banks know their customers and can instantly detect behaviours that are indicative of (future) fraudulent behaviour or that suggest an account has been compromised. When warning signs are detected, immediate action can be taken and financial crimes such as fraud and money laundering can be prevented in real-time.   

Enabled by data streaming technology

The future of the financial industry described above will be enabled for a large part by (real-time) data integration, streaming data analytics and event stream processing.

  • More personal customer experiences are achieved by utilising some of the over 2.5 exabytes of data that customers are generating daily. Bringing together customer data from different sources and performing analytics on this integrated source of customer data helps institutions know their customers and engage them in a more personal manner with marketing or customer relation messages.
  • Real-time analytics and event-stream processing, combined with historical customer data, are at the core of enabling real-time personalised engagement with customers. They allow financial institutions to respond to customer behaviour as it is happening, enabling timely, relevant cross-selling and immediate responses to uncharacteristic behaviour. It also enables issues to be detected and resolved pro-actively before they hinder a customer’s online experience.
  • Sharing customer data with certified third parties upon request, while also adhering to the strict GDPR regulations regarding data sharing and customer consent, requires a solution capable of performing a delicate balancing act. A platform like the Datastreams Platform, dedicated both to data governance and data collaboration, allows financial institutions to build a system where financial data can be shared safely, securely and seamlessly.
  • Real-time analytics and event-stream processing enable organisations to monitor ongoing transactions and immediately recognise suspicious, fraudulent patterns. This allows banks to detect fraud within seconds and shut down transactions before any damage is done. Comparing ongoing transaction data with historic transaction data also allows identification of uncharacteristic customer behaviour, potentially safeguarding customers from losing money in cases of hacked or stolen accounts.
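
As a rough illustration of the last bullet, uncharacteristic transactions can be flagged by comparing each amount against the customer's historical pattern. A toy Python sketch, not a production fraud model; the z-score cutoff of 3 is an assumption:

```python
import statistics

# Toy sketch, not a production fraud model: flag a transaction as
# uncharacteristic when its amount lies far outside the customer's
# historical spending pattern. The z-score cutoff of 3 is an assumption.
def is_suspicious(amount, history, cutoff=3.0):
    """True when `amount` deviates more than `cutoff` standard deviations
    from the mean of the customer's past transaction amounts."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return amount != mean
    return abs(amount - mean) / stdev > cutoff
```

A real system would combine many such signals (merchant, location, timing) and score them on the live event stream, but the core idea is this comparison of the present against the past.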

The future is not here yet, but we’re moving towards it every day. Ready to move towards the future with us? Read more about how our Datastreams Platform can help financial institutions here.

Data streaming, the future of education

Dreams of the future: Education & research

As a leading expert in the field of (big) data solutions, we help develop the technologies that are going to shape our future. In this blog we discuss the future of Education and research, which will allow research institutions to become more efficient, collaborative and student-friendly through data streaming technologies.

The Future of education

In the research and education institution of the future, collaboration is central. Collaboration between research institutions and the corporate world, between researchers from different research institutions and collaboration between students and researchers. Data-driven research is no longer largely limited to a single institution. Instead, each research institution has data collaboration solutions set up that allow researchers to easily combine data collected in their experiments with open-source data and data from previous researchers at other institutions. The data collected and generated by their own research is subsequently shared with other research institutions upon request. This improves the data available to researchers in the network of research institutions, improving research outcomes and replicability.

The data-driven collaboration in the research institutions of the future also extends to students. As part of their studies, students are invited to participate in research projects from their own university or from external research organisations. This allows students to gain experience in working with real data, improving the level and practical applicability of their education. It also enables research institutions to conduct faster and more efficient research by tapping into the passion and skill of students learning about state-of-the-art analytics and research methods.

Enabled by data streaming

The future of education and research we envision will require research and education organisations to share (research) data with each other in a safe and secure manner. It will also require these organisations to be able to use data from other third-party sources and integrate it with their own in-house data easily and reliably. Platforms designed for data collaboration will enable organisations to do so seamlessly, providing a controlled and monitored, yet easily accessible, infrastructure for secure data sharing between organisations. These platforms will also allow researchers and students to easily integrate data from different sources for their research projects without requiring extensive technological skills.

The future is not here yet, but we’re moving towards it every day. Ready to move towards the future with us? Read more about how our Datastreams Platform can help education and research institutions here.


Solutions for 360 customer journey data under the GDPR

In 2016, Gartner recognised the market need for improved customer journey analytics. In that same year, the GDPR was adopted by the European Parliament, drastically limiting the data collection activities at the core of these analytics. Since the regulation went into effect last year, we have been keeping an eye on current discussions in data-driven organisations. We have noticed that many companies desire to implement a 360° customer journey data strategy but are not sure how to do this in an effective and GDPR-compliant manner. We discuss the current landscape of 360° customer journey analytics under the GDPR and examine what data-driven businesses look for in relevant solutions.

A challenge in data wrangling

360° customer journeys require organisations to handle the ever-increasing volume, variety, velocity and veracity of all the ‘Big Data’ present in their databases. Many organisations struggle to do so, often because their data is distributed across many silos and owned by different stakeholders and teams. This causes marketing analysts to use many different tools to leverage this data. Web and mobile analytics, social analytics, media analytics, customer journey analytics and voice-of-the-customer analytics are all performed on different datasets with different tools.

Using many different tools on different datasets is not only inefficient, but also hinders attempts to optimise customer journey analytics. The key challenge in 360° customer journey analytics is, in the words of Gartner: “Today’s marketing analyst uses too many tools and all of these are looking at bits and pieces of what to a person is – whether called a target or audience or prospect or customer, it’s still a person, people – a single relationship”. Paramount in 360° customer journey analytics is therefore bringing together these different sources of information into an integrated customer profile. This requires effective solutions for streaming the right data, in the right place, in the right format, at the right time, and doing so any time, all the time.
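
To make the idea of an integrated customer profile concrete, here is a minimal, hypothetical Python sketch that stitches partial records from different silos into one profile per person. Keying on email address, and the field names used, are assumptions for illustration:

```python
# Minimal, hypothetical sketch of profile stitching: partial records from
# different silos (web analytics, CRM, email) are folded into one profile
# per customer. Keying on email address is an assumption for illustration.
def build_profile(records):
    """Fold partial customer records into per-customer profiles."""
    profiles = {}
    for record in records:
        key = record["email"]
        profile = profiles.setdefault(key, {"email": key, "sources": []})
        profile["sources"].append(record["source"])
        for field, value in record.items():
            if field not in ("email", "source"):
                profile.setdefault(field, value)  # first source wins on conflicts
    return profiles
```

Real identity resolution is far harder than a shared key (people use multiple addresses and devices), but the end result is the same shape: one profile, many sources.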

A challenge in GDPR compliance

With the GDPR in effect, platforms capable of facilitating integrated data streaming are no longer enough to ensure compliant customer journey analytics. Platforms also need to incorporate the required functions and restrictions that allow customer journey analytics in a GDPR-compliant manner. This involves asking customers for informed consent before collecting their data, storing it securely according to GDPR standards, processing it according to Privacy by Design principles and deleting all traces of a customer when they ask for it. To ensure this can be accomplished, a solution including a strong data governance layer is paramount.
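
The consent requirement can be illustrated with a small gate: collection only proceeds while an active consent record exists for that customer and purpose, and withdrawal takes effect immediately. This is a hypothetical Python sketch, not our actual consent management module:

```python
# Hypothetical sketch, not our consent management module: data for a given
# purpose is only collected while the customer has an active consent record
# for that purpose, and withdrawing consent takes effect immediately.
class ConsentRegistry:
    def __init__(self):
        self._consents = set()  # active (customer_id, purpose) pairs

    def grant(self, customer_id, purpose):
        self._consents.add((customer_id, purpose))

    def withdraw(self, customer_id, purpose):
        self._consents.discard((customer_id, purpose))

    def collect(self, customer_id, purpose, payload, sink):
        """Append `payload` to `sink` only when consent is on record."""
        if (customer_id, purpose) in self._consents:
            sink.append(payload)
            return True
        return False
```

Making the gate purpose-specific matters under the GDPR: consent for analytics does not imply consent for marketing, so each purpose is tracked as its own record.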

Additional Industry demands

We have established that 360° customer journey analytics under the GDPR requires a solution that is capable of not only integrating data from a variety of sources in a variety of formats, but also of doing so in a fully GDPR-compliant manner. The industry places additional requirements and demands on solutions for 360° customer journey data management. We have observed the following main themes in what DPOs and Chief Privacy, Revenue, Data and Marketing Officers look for in solutions:

  • While some organisations are seeking complete end-to-end GDPR solutions from a single provider, this isn’t always the case. Demand for solutions covering specific areas within the legislative requirements is prevalent, especially universal consent management and data stream governance.
  • Organisations want enhanced data security, governance and auditing of user access controls built in. Key users of GDPR solutions are going to be DPOs who, as part of their enhanced role under the GDPR, should be able to advise, view, take control and sign off on data streaming from one place to another. They also need to be able to report on and audit this when fulfilling GDPR data subject rights.
  • We are seeing a lot of traction from enquiries about building collaborative partnerships that combine smart, specialist technology from multiple suppliers.
  • The majority of enquiries we receive are cloud/SaaS-based, but an easy-to-deploy on-premise implementation is also a requirement the market is asking for.
  • Customers want to be able to quickly start (or improve) their GDPR data journey within one business area, data source and destination. This means that a solution which is quickly scalable to multiple areas, sources and destinations is highly desirable.

We have combined the two-sided challenge of 360° customer journey management under the GDPR with the industry demands to create the Datastreams Platform. This platform allows companies to collect, integrate and process data from a variety of sources in a GDPR-compliant manner. Modules such as our Consent Management Solution can be included in the full infrastructure or implemented separately if desired. Additionally, the system can easily be linked with technologies by many other suppliers and can be implemented rapidly. On-site implementation is also available. Finally, the system is highly user-friendly, allowing DPOs to take control of their data streams and easily create data-streaming reports when required. Interested in learning more? Request a personal demo.


The future of an advanced Smart City

As a leading expert in the field of (big) data solutions, we help develop the technologies that are going to shape our future. In this blog we discuss the future of smart cities, which are about to become not only a lot smarter, but also a lot more citizen-friendly.

A glimpse of the future

The idea of a smart city generally conjures up images of a technological metropolis. However, in the future, it won’t just be big capital cities that are powered by smart city technology. The increasing presence and availability of the technologies required to make a city smart will lead every city to become a “smart city”.

The smart city of the future runs like clockwork. Traffic lights won’t be needed, as the systems of driverless cars and the city’s IoT systems route traffic intelligently and prevent traffic jams from occurring. Passengers rarely have to wait longer than a few minutes for a car to take them to their destination, as cars are all shared and continuously move along the most efficient routes. The city is also clean: smart waste solutions ensure that trash cans around the city are emptied frequently and smartly. On top of that, air quality is constantly monitored, and actions are taken to prevent pollution levels from rising above acceptable levels.

The city is not just a smoothly running machine, however. The smart city of the future is, above everything, a people-oriented city. Citizens are always invited to give their opinion on current developments or improvements for the future through an accessible survey system. Additionally, future needs and opportunities of citizens are anticipated ahead of time: elderly citizens are informed about possibilities for assistive technology and citizens in need are pro-actively approached about support programs.

The effect of data stream technology

Planning the smart city infrastructure to be set up in a city requires a thorough understanding of the way a city functions, where it struggles and how it can be improved. Explorative analytics on an integrated set of urban data helps city planners set up smart cities in an optimal manner. Technologies for collecting, combining and running analytics on large amounts of disparate data will be essential for accomplishing this. Enabling seamless traffic management in and around cities requires cars to communicate not only with each other, but also with systems holding current and past traffic and congestion data. This will require an infrastructure capable of rapid real-time data streaming, integration and processing on a large, distributed scale. Such an infrastructure lies at the root of many other smart city initiatives, such as smart waste solutions and air-quality monitoring programs.

Running a continuous system for citizen participation requires a survey system that can handle streams of possibly sensitive response data in a secure manner and integrate them to create understandable and accurate reports that reflect citizen feedback. A data streaming platform is the perfect foundation to build such a system on. A smart city relies to a large extent on citizens’ willingness to share their data and to engage with the smart city technology. This is especially important for pro-actively supporting citizens in need. Because of this, citizens need to be confident that their data is handled in a transparent, responsible, secure and compliant manner. A platform that has proven to handle sensitive data in this manner is crucial for earning citizen trust and support for smart city initiatives.

What can you do to contribute to the development of a Smart City? Learn how you can collaborate with your partners using our Smart City Solution.


The future of advanced operational excellence in Logistics

As a leading expert in the field of (big) data solutions, we develop the technologies that are going to shape our future. To prepare for that future we analyse the industries we operate in. In this blog we share our image of the future of the field of logistics and how technological innovations will push operational excellence to its very limit.

A glimpse of the future

Imagine a warehouse just outside a highly technologically advanced smart city. This small warehouse doesn’t stand out in size, but in its diversity: the warehouse is completely stocked with goods from a variety of clients. Just as a truck is leaving the warehouse, another truck arrives with goods to fill up the space that just opened up. This is not a coincidence, it’s planned; no storage space goes unused. As the cargo leaves the warehouse, clients are informed about the current location and the condition of their goods and billing is handled automatically. If a problem occurs when the goods are loaded on the trucks or in the warehouse, it will instantly be documented and all parties involved will be directly informed.

Outside the warehouse, the driverless truck that has just left sets off towards its destinations. It is completely full of packages from a variety of clients with a variety of destinations. Some packages need to be delivered to other warehouses in the distributed warehouse network, others need to be delivered to customers’ homes. Upon leaving the warehouse, the ideal route is computed to deliver all the packages. The automatic vehicle drives from destination to destination, using real-time traffic information and weather data to expertly avoid any traffic jams and deliver packages to their intended location at the promised time.

After all deliveries have been completed, the truck picks up a new load that was ordered as recently as minutes ago. Newly filled with goods destined for customers and other warehouses in the network, it is ready to make a new set of deliveries. A seamless logistics system running 24 hours a day, 7 days a week, around the globe.

The impact of data streaming technology

This glimpse of the future might seem an idealistic or futuristic one, but it’s not as far away as you may think. Many of the techniques required to make this dream come true are already being developed and implemented in streaming solutions such as our Datastreams Platform:

  • Ensuring optimal warehouse capacity by instantly utilising newly freed space requires a system to continuously monitor future incoming and outgoing deliveries from a variety of clients. Technologies enabling extensive data collaboration and business-to-business data streaming are an essential part of enabling this kind of dynamic shared warehousing.
  • Optimally handling issues with broken or incomplete goods requires an automated system for recognising, documenting and photographing any issues when they arise. This information then needs to be disseminated to the relevant parties and integrated into relevant software systems. We built a system capable of doing this by combining our Datastreams Platform architecture with a custom-made app. Read more about it here.
  • Monitoring each client’s storage space currently in use and handling the pay-per-shelf billing for this requires a seamlessly integrated data architecture that is capable of handling incoming and outgoing deliveries in real-time. Technologies for integrating different streams of data in a structured manner will be required to accomplish this.
  • Utilising transport capacity sharing will require a collaboration platform where different clients can pool their data on plans for incoming and outgoing deliveries. A technology platform where organisations feel secure in sharing potentially sensitive data will play an important part in this.
  • Allowing a (driverless) vehicle to deliver packages will require a lot of data to be collected, streamed and processed in real-time. This is not only required to make such a car functional and safe, but also to allow the routing systems to make decisions based on weather and traffic data to continuously find the optimal route to take. Additionally, integrating the routing system data with the delivery database and customer database will allow customers to continuously be up to date on the status and arrival time of their packages.
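
The dynamic re-routing in the last bullet can be illustrated with a classic shortest-path computation whose edge weights are updated from live traffic data. A toy Python sketch with an assumed road graph, not a description of any real routing system:

```python
import heapq

# Toy sketch: recompute the best delivery route when live traffic data
# changes a road's travel time. The road graph used here is an assumption.
def shortest_route(graph, start, goal):
    """Dijkstra over {node: {neighbour: minutes}}; returns (minutes, path)."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, minutes in graph[node].items():
            if neighbour not in seen:
                heapq.heappush(queue, (cost + minutes, neighbour, path + [neighbour]))
    return float("inf"), []
```

When a congestion report raises a road's travel time, running the same computation over the updated graph immediately yields the new best route, which is exactly the kind of decision a streaming routing system makes continuously.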

The future is not here yet, but we’re building it brick by brick. Want a piece of the future we are building? Our Datastreams Platform empowers logistics providers to bring a little bit of that future into the present, enabling more efficient processes with big data in every part of the supply chain. Want to know more about what it can do? View our page on logistics here.


ITP 2.1: What is changing and how do we deal with it?

Apple has announced plans to sharpen the ITP (Intelligent Tracking Prevention) regulations for its Safari browser. ITP version 2.1 is now live and has an immediate, major impact on digital marketing and analytics due to its handling of third-party cookies. Firefox has announced similar tracking prevention, also cracking down on first-party cookies in addition to third-party ones. In this blog we bring you up to speed on what these tracking preventions mean for organisations and how we have resolved this for the users of our Datastreams Platform.

What is ITP?

ITP stands for Intelligent Tracking Prevention. It represents Apple’s stance against online tracking and has been causing concern for companies applying personalised marketing since its first incarnation. The first version started by limiting the possibilities for placing third-party cookies, with later releases increasingly limiting the potential for workarounds and alternatives. The previous version, 2.0, blocked the placement of third-party cookies altogether. First-party cookies were largely unaffected by ITP. Until now, with the release of ITP 2.1.

What is changing?

The most important change for organisations engaging in digital marketing is the way ITP version 2.1 handles both first- and third-party cookies. After the update, first-party client-side cookies created through JavaScript’s document.cookie will expire after seven days. Third-party cookies created by domains other than the current website continue to be blocked, as was the case in ITP 2.0.

Where the blocking of third-party cookies had severe consequences for marketeers, the seven-day cap on client-side first-party cookies has the potential to significantly impact analytics. Since site visitors who return after seven days will no longer be counted as returning visitors, current solutions for conversion tracking based on these cookies risk breaking down.

What are we doing about it?

Currently, the solution to ITP 2.1 is two-fold: first, drastically limit reliance on third-party cookies. DimML, the language at the core of the Datastreams Platform, already enables our users to do this by allowing a script to be delivered through the same domain as the webpage from which it was loaded. The second solution is to place first-party cookies through a server-side method instead of through the client-side document.cookie implementation.
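
The server-side method can be illustrated as follows: instead of writing the cookie with client-side JavaScript (document.cookie), the server sets it via an HTTP Set-Cookie response header with an explicit expiry, which ITP 2.1 does not cap at seven days. A hypothetical Python sketch; the cookie name, attributes and one-year lifetime are assumptions, not a description of our implementation:

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

# Hypothetical sketch of a server-side first-party cookie: the server emits
# a Set-Cookie header with an explicit expiry, rather than relying on
# client-side document.cookie (whose lifetime ITP 2.1 caps at seven days).
def build_set_cookie(name, value, now, days=365):
    """Return a Set-Cookie header value with a server-set expiry date."""
    expires = format_datetime(now + timedelta(days=days), usegmt=True)
    return f"{name}={value}; Expires={expires}; Path=/; Secure; HttpOnly; SameSite=Lax"
```

In practice the header would be attached to an HTTP response by the web framework in use; the point is simply that the expiry is chosen server-side.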

We’ve released a new component within our platform that allows our customers to integrate the complete Datastreams Platform, with all its capabilities, within their own domain. This means that the Datastreams Platform becomes part of your IT architecture rather than a third-party application. Data ownership and compliant data management are at the core of our architecture, so it will not be affected by ITP 2.1. This is a core differentiator from many SaaS marketing technology and consent management providers: we give you full control over how you manage your first-party data, with accurate and compliant data ownership driven by our state-of-the-art data architecture.

As the data and privacy landscape continues to change, we will continue to ensure the users of our Datastreams Platform can perform data analysis in an easy, secure and compliant manner. Do you want more information about how we are dealing with the ITP 2.1 update? Contact us!


Why you should (not have to) clean your company database

Spring is here, which means it’s time for a thorough spring cleaning. Aside from cleaning out the unnecessary papers from those clogged filing cabinets, consider turning your attention to your company database this year, because according to recent studies of the data practices of contemporary organisations, you probably need to clean your database.

In a world where companies are growing increasingly data-driven, business success increasingly depends on analytics based on large quantities of high-quality, trusted data. While many organisations are succeeding in acquiring large amounts of data and applying analytics to them, data quality often leaves a lot to be desired. In a study conducted by Experian, 95% of organisations indicated experiencing wasted resources and unnecessary costs due to poor quality data. This is not surprising, since organisations on average believe 29% of their data to be inaccurate, and as is often said in the field of data science: ‘garbage in, garbage out’.

It is clear from the percentages above that, statistically, it is highly likely that your company can benefit from a good spring cleaning of your database. Ensuring that data is valid, complete, stored in the right places and accurate across the organisation empowers you to trust your data again. This means you won’t have to waste time and money on marketing campaigns that are based on unreliable analytics. However, cleaning your data can be very time consuming, especially if your data infrastructure is not designed to be managed easily by business professionals. Additionally, data will need to be cleaned regularly to keep your data environment healthy and useable. Luckily, a good data quality monitoring & assurance solution can make your life a lot easier by preventing dirty data from entering your database in the first place and making cleaning a lot easier.

Data professionals know that data cleaning is a key part of any database management strategy. However, just cleaning your data periodically is not enough. If you don’t ensure data quality at the source, polluted data will continue to build up between cleaning sessions, potentially throwing off your analytics. That is why a strategy for validating data at the source, before it is analysed or enters your database, is crucial. Our Data Quality and Assurance module increases the overall quality of your data ecosystem by ensuring only quality data enters your database, and it continuously monitors your data streams to ensure they continue to supply complete, high-quality data. This, together with the streamlining and seamless integration of data streams in your company by the main Datastreams Platform, ensures companies have a clean and orderly environment in which to manage their data.
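
Validation at the source can be pictured as a small gate in front of the database: each incoming record is checked against completeness and format rules before it is admitted. A purely illustrative Python sketch, not our actual module; the field rules below are assumptions:

```python
import re

# Illustrative sketch of validation at the source: each incoming record is
# checked against simple completeness and format rules before it is allowed
# into the database, so dirty data never accumulates between cleanings.
# The field rules below are assumptions for the example.
RULES = {
    "email": lambda v: isinstance(v, str)
             and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "age": lambda v: isinstance(v, int) and 0 <= v <= 130,
    "country": lambda v: isinstance(v, str) and len(v) == 2,
}

def validate(record):
    """Return a list of field names that fail their rule (empty = clean)."""
    return [field for field, rule in RULES.items()
            if not rule(record.get(field))]

def ingest(records, database):
    """Admit only clean records; return the rejects with their errors."""
    rejects = []
    for record in records:
        errors = validate(record)
        if errors:
            rejects.append((record, errors))
        else:
            database.append(record)
    return rejects
```

Because rejects are returned together with the failing fields, the same gate doubles as a monitoring signal: a sudden rise in rejects on one field points to a broken data stream upstream.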

Implementing our solution does not mean you won’t ever have to clean your data (cleaning is imperative for keeping your data up to date and removing data you no longer need), but it makes these periodic cleanings a lot less time-consuming. Want to know more about our Data Quality and Assurance module and how it works? Visit our page about it.