
Solutions for 360 customer journey data under the GDPR

In 2016, Gartner recognised the market need for improved customer journey analytics. In that same year, the GDPR was adopted by the European Parliament, drastically limiting the data collection activities at the core of these analytics. Since the regulation went into effect last year, we have been keeping an eye on current discussions in data-driven organisations. We have noticed that many companies desire to implement a 360° customer journey data strategy but are not sure how to do this in an effective and GDPR-compliant manner. We discuss the current landscape of 360° customer journey analytics under the GDPR and examine what data-driven businesses look for in relevant solutions.

A challenge in data wrangling

360° customer journeys require organisations to handle the ever-increasing volume, variety, velocity and veracity of all the ‘Big Data’ present in their databases. Many organisations struggle to do so, often because their data is distributed across many silos and owned by different stakeholders and teams. As a result, marketing analysts use many different tools to leverage this data. Web and mobile analytics, social analytics, media analytics, customer journey analytics and voice of the customer analytics are all performed on different datasets with different tools.

Using many different tools on different datasets is not only inefficient, but also hinders attempts to optimise customer journey analytics. The key shortcoming of current 360° customer journey analytics is, in the words of Gartner: “Today’s marketing analyst uses too many tools and all of these are looking at bits and pieces of what to a person is – whether called a target or audience or prospect or customer, it’s still a person, people – a single relationship”. Paramount in 360° customer journey analytics is therefore bringing together these different sources of information into an integrated customer profile. This requires effective solutions for streaming the right data, in the right place, in the right format, at the right time, and doing so any time, all the time.

A challenge in GDPR compliance

With the GDPR in effect, platforms capable of facilitating integrated data streaming are no longer enough to ensure compliant customer journey analytics. Platforms also need to incorporate the functions and restrictions that allow customer journey analytics to be performed in a GDPR-compliant manner. This involves asking customers for informed consent before collecting their data, storing it securely according to GDPR standards, processing it according to Privacy by Design principles and deleting all traces of a customer when they request it. To ensure this can be accomplished, a solution including a strong data governance layer is paramount.
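
To make this concrete, here is a minimal sketch (in Python, with hypothetical names such as ConsentStore and erase_subject; it illustrates the principle, not the Datastreams implementation) of a governance layer that gates data collection on recorded consent and supports the right to erasure:

```python
from dataclasses import dataclass, field


@dataclass
class ConsentStore:
    """Records, per data subject, the purposes they have consented to."""
    consents: dict = field(default_factory=dict)  # subject_id -> set of purposes

    def grant(self, subject_id: str, purpose: str) -> None:
        self.consents.setdefault(subject_id, set()).add(purpose)

    def has_consent(self, subject_id: str, purpose: str) -> bool:
        return purpose in self.consents.get(subject_id, set())


class JourneyDatabase:
    def __init__(self, consent_store: ConsentStore):
        self.consent_store = consent_store
        self.events: list = []

    def collect(self, subject_id: str, event: dict, purpose: str = "analytics") -> bool:
        # Privacy by Design: refuse to store anything without prior consent.
        if not self.consent_store.has_consent(subject_id, purpose):
            return False
        self.events.append({"subject": subject_id, **event})
        return True

    def erase_subject(self, subject_id: str) -> int:
        """Right to erasure: delete all traces of one data subject."""
        before = len(self.events)
        self.events = [e for e in self.events if e["subject"] != subject_id]
        self.consent_store.consents.pop(subject_id, None)
        return before - len(self.events)
```

In practice such a layer also has to cover secure storage, audit trails and consent withdrawal, but the principle stays the same: no consent, no collection; one erasure request, no remaining traces.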

Additional industry demands

We have established that 360° customer journey analytics under the GDPR requires a solution that is capable of not only integrating data from a variety of sources in a variety of formats, but also of doing so in a fully GDPR-compliant manner. The industry places additional requirements and demands on solutions for 360° customer journey data management. We have observed the following main themes in what DPOs and Chief Privacy, Revenue, Data and Marketing Officers look for in solutions:

  • While some organisations are seeking complete end-to-end GDPR solutions from a single provider, many are not. Demand for solutions that address specific areas within the legislative requirements is prevalent, especially universal consent management and data stream governance.
  • Organisations want enhanced data security, governance and user access auditing built in. Key users of GDPR solutions are going to be DPOs, who, as part of their enhanced role under the GDPR, should be able to advise on, view, take control of and sign off on data streaming from one place to another. They also need to be able to report on and audit this when fulfilling GDPR data subject rights.
  • We are seeing a lot of traction in enquiries about building collaborative partnerships that combine smart, specialist technology from multiple suppliers.
  • The majority of enquiries we receive are cloud/SaaS-based, but the market is also asking for solutions with easy-to-deploy on-premise implementation.
  • Customers want to be able to quickly start (or improve) their GDPR data journey within one business area, data source and destination. This means that a solution which scales quickly to multiple areas, sources and destinations is highly desirable.

We have combined the two-sided challenge of 360° customer journey management under the GDPR with these industry demands to create the Datastreams Platform. This platform allows companies to collect, integrate and process data from a variety of sources in a GDPR-compliant manner. Modules such as our Consent Management Solution can be included in the full infrastructure or implemented separately if desired. Additionally, the system can easily be linked with technologies from many other suppliers and can be implemented rapidly. On-premise implementation is also available. Finally, the system is highly user-friendly, allowing DPOs to take control of their data streams and easily create data-streaming reports when required. Interested in learning more? Request a personal demo.


The future of an advanced Smart City

As a leading expert in the field of (big) data solutions, we help develop the technologies that are going to shape our future. In this blog we discuss the future of smart cities, which are about to become not only a lot smarter, but also a lot more citizen-friendly.

A glimpse of the future

The idea of a smart city generally conjures up images of a technological metropolis. However, in the future, it won’t just be big capital cities that are powered by smart city technology. The increasing presence and availability of the technologies required to make a city smart will lead every city to become a “smart city”.

The smart city of the future runs like clockwork. Traffic lights won’t be needed, as the systems of driverless cars and the city’s IoT systems route traffic intelligently and prevent traffic jams from occurring. Passengers rarely have to wait longer than a few minutes for a car to take them to their destination, as all cars are shared and continuously move along the most efficient routes. The city is also clean: smart waste solutions ensure that trash cans around the city are emptied frequently and intelligently. On top of that, air quality is constantly monitored, and actions are taken to prevent pollution from rising above acceptable levels.

The city is not just a smoothly running machine, however. The smart city of the future is, above everything, a people-oriented city. Citizens are always invited to give their opinion on current developments or improvements for the future through an accessible survey system. Additionally, the future needs and opportunities of citizens are anticipated: elderly citizens are informed about possibilities for assistive technology ahead of time, and citizens in need are pro-actively approached about support programs.

The effect of data stream technology

Planning the smart city infrastructure to be set up in a city requires a thorough understanding of the way a city functions, where it struggles and how it can be improved. Explorative analytics on an integrated set of urban data helps city planners set up smart cities in an optimal manner. Technologies for collecting, combining and running analytics on large amounts of disparate data will be essential for accomplishing this.

Enabling seamless traffic management in and around cities requires cars to communicate not only with each other, but also with systems holding current and past traffic and congestion data. This will require an infrastructure capable of rapid real-time data streaming, integration and processing on a large, distributed scale. Such an infrastructure lies at the root of many other smart city initiatives, such as smart waste solutions and air-quality monitoring programs. Running a continuous system for citizen participation requires a survey system that can handle streams of possibly sensitive response data in a secure manner and integrate them to create understandable and accurate reports that reflect citizen feedback. A data streaming platform is the perfect foundation to build such a system on.

A smart city relies to a large extent on citizens’ willingness to share their data and to engage with the smart city technology. This is especially important for pro-actively supporting citizens in need. Because of this, citizens need to be confident that their data is handled in a transparent, responsible, secure and compliant manner. A platform that has proven it can handle sensitive data in this manner is crucial for earning citizen trust and support for smart city initiatives. What can you do in the development of a Smart City? Learn how you can collaborate with your partners with our Smart City Solution.

The future of advanced operational excellence in Logistics

As a leading expert in the field of (big) data solutions, we develop the technologies that are going to shape our future. To prepare for that future, we analyse the industries we operate in. In this blog, we share our image of the future of the field of logistics and how technological innovations will push operational excellence to its very limit.

A glimpse of the future

Imagine a warehouse just outside a highly technologically advanced smart city. This small warehouse doesn’t stand out in size, but in its diversity: the warehouse is completely stocked with goods from a variety of clients. Just as a truck is leaving the warehouse, another truck arrives with goods to fill up the space that just opened up. This is not a coincidence, it’s planned; no storage space goes unused. As the cargo leaves the warehouse, clients are informed about the current location and the condition of their goods and billing is handled automatically. If a problem occurs when the goods are loaded on the trucks or in the warehouse, it will instantly be documented and all parties involved will be directly informed.

Outside the warehouse, the driverless truck that has just left sets off towards its destinations. It is completely full of packages from a variety of clients with a variety of destinations. Some packages need to be delivered to other warehouses in the distributed warehouse network, others to customers’ homes. Upon leaving the warehouse, the ideal route to deliver all the packages is computed. The autonomous vehicle drives from destination to destination, using real-time traffic information and weather data to expertly avoid any traffic jams and deliver packages to their intended location at the promised time.

After all deliveries have been completed, the truck picks up a new load that was ordered as recently as minutes ago. Newly filled with goods destined for customers and other warehouses in the network, it is ready to make a new set of deliveries. A seamless logistics system running 24 hours a day, 7 days a week, around the globe.

The impact of data streaming technology

This glimpse of the future might seem an idealistic or futuristic one, but it’s not as far away as you may think. Many of the techniques required to make this dream come true are already being developed and implemented in streaming solutions such as our Datastreams Platform:

  • Ensuring optimal warehouse capacity by instantly utilising newly freed space requires a system that continuously monitors future incoming and outgoing deliveries from a variety of clients. Technologies enabling extensive data collaboration and business-to-business data streaming are an essential part of enabling this kind of dynamic shared warehousing.
  • Optimally handling issues with broken or incomplete goods requires an automated system for recognising, documenting and photographing any issues when they arise. This information then needs to be disseminated to the relevant parties and integrated into the relevant software systems. We built a system capable of doing this by combining our Datastreams Platform architecture with a custom-made app. Read more about it here.
  • Monitoring each client’s storage space currently in use and handling the pay-per-shelf billing for this requires a seamlessly integrated data architecture that is capable of handling incoming and outgoing deliveries in real-time. Technologies for integrating different streams of data in a structured manner will be required to accomplish this.
  • Utilising transport capacity sharing will require a collaboration platform where different clients can pool their data on plans for incoming and outgoing deliveries. A technology platform where organisations feel secure in sharing potentially sensitive data will play an important part in this.
  • Allowing a (driverless) vehicle to deliver packages on its own will require a lot of data to be collected, streamed and processed in real-time. This is not only required to make such a car functional and safe, but also to allow the routing systems to make decisions based on weather and traffic data and continuously find the optimal route to take. Additionally, integrating the routing system data with the delivery database and customer database will allow customers to continuously stay up to date on the status and arrival time of their packages.

The future is not here yet, but we’re building it brick by brick. Want a piece of the future we are building? Our Datastreams Platform empowers logistics providers to get a little bit of that future right now, enabling more efficient processes with big data in every part of the supply chain. Want to know more about what it can do? View our page on logistics here.


ITP 2.1: What is changing and how do we deal with it?

Apple has announced plans to sharpen the ITP (Intelligent Tracking Prevention) restrictions in their Safari browser. ITP version 2.1 is now live and instantly has a major impact on digital marketing and analytics due to its handling of third-party cookies. Firefox has announced similar tracking prevention, also cracking down on first-party cookies in addition to third-party ones. In this blog we bring you up to speed on what these tracking preventions mean for organisations and how we have resolved this for the users of our Datastreams Platform.

What is ITP?

ITP stands for Intelligent Tracking Prevention. It represents Apple’s stand against online tracking and has been causing concerns for companies applying personalised marketing since its first incarnation. The first version started by limiting the possibilities for placing third-party cookies, with later releases increasingly limiting the potential for workarounds and alternatives. The previous version 2.0 blocked the placement of third-party cookies altogether. First-party cookies were largely unaffected by ITP. Until now, with the release of ITP 2.1.

What is changing?

The most important change for organisations engaging in digital marketing in ITP version 2.1 is the way that both first and third-party cookies will be handled. After the update, first-party client-side cookies created through JavaScript’s document.cookie will expire after seven days. Third-party cookies created by domains other than the current website continue to be blocked, as was the case in ITP 2.0.

Where the blocking of third-party cookies had severe consequences for marketers, the seven-day cap on client-side first-party cookies has the potential to significantly impact analytics. Since site visitors who return after seven days will no longer be counted as returning visitors, current solutions for conversion tracking based on these cookies risk breaking down.

What are we doing about it?

Currently, the solutions to ITP 2.1 are two-fold. The first is to drastically limit reliance on third-party cookies. DimML, the language at the core of the Datastreams Platform, already enables our users to do this by allowing a script to be delivered through the same domain as the webpage from which it was loaded. The second is to place first-party cookies through a server-side method instead of through the client-side document.cookie implementation.
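
As an illustration of the second approach, the sketch below (using Python and Flask purely as an example stack, not the DimML implementation) sets a persistent first-party visitor cookie through the Set-Cookie response header rather than through client-side JavaScript:

```python
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)


@app.route("/")
def index():
    # Reuse the existing visitor id if the browser sent one back.
    visitor_id = request.cookies.get("visitor_id") or str(uuid.uuid4())
    resp = make_response("Hello")
    # Set-Cookie from the server: because this cookie arrives as an HTTP
    # response header on the site's own domain, ITP 2.1's seven-day cap on
    # cookies created through JavaScript's document.cookie does not apply.
    resp.set_cookie(
        "visitor_id",
        visitor_id,
        max_age=60 * 60 * 24 * 365,  # one year
        secure=True,
        httponly=True,
        samesite="Lax",
    )
    return resp


if __name__ == "__main__":
    app.run()
```

The design choice is the point: the cookie is issued by the server on the site’s own domain, so returning visitors keep their identifier beyond seven days.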

We’ve released a new component within our platform that allows our customers to integrate the complete Datastreams Platform, with all its capabilities, within their own domain. This means that the Datastreams Platform is a part of your IT architecture and not a third-party application. Data ownership and compliant data management are at the core of our architecture, so it will not be affected by ITP 2.1. This is a core differentiator from many SaaS marketing technology and consent management providers: we give you full control over how you manage your first-party data, with accurate and compliant data ownership driven by our state-of-the-art data architecture.

As the data and privacy landscape continues to change, we will continue to ensure the users of our Datastreams Platform can perform data analysis in an easy, secure and compliant manner. Do you want more information about how we are dealing with the ITP 2.1 update? Contact us!


Why you should (not have to) clean your company database

Spring is here, which means it’s time for a thorough spring cleaning. Aside from clearing the unnecessary papers out of those clogged filing cabinets, consider turning your attention to your company database this year, because according to recent studies of the data practices of contemporary organisations, you probably need to clean your database.

In a world where companies are growing increasingly data-driven, business success increasingly depends on analytics based on large quantities of high-quality, trusted data. While many organisations are succeeding in acquiring large amounts of data and applying analytics to them, data quality often leaves a lot to be desired. In a study conducted by Experian, 95% of organisations indicated experiencing wasted resources and unnecessary costs due to poor quality data. This is not surprising, since organisations on average believe 29% of their data to be inaccurate, and as is often said in the field of data science: ‘garbage in, garbage out’.

It is clear from the percentages above that, statistically, it is highly likely that your company can benefit from a good spring cleaning of your database. Ensuring that data is valid, complete, stored in the right places and accurate across the organisation empowers you to trust your data again. This means you won’t have to waste time and money on marketing campaigns that are based on unreliable analytics. However, cleaning your data can be very time-consuming, especially if your data infrastructure is not designed to be managed easily by business professionals. Additionally, data will need to be cleaned regularly to keep your data environment healthy and usable. Luckily, a good data quality monitoring & assurance solution can make your life a lot easier by preventing dirty data from entering your database in the first place and making cleaning a lot easier.

Data professionals know that data cleaning is a key part of any database management strategy. However, just cleaning your data periodically is not enough. If you don’t ensure data quality at the source, polluted data will continue to build up between cleaning sessions, potentially throwing off your analytics. That is why a strategy for validating data at the source, before it is analysed or enters your database, is crucial. Our Data Quality and Assurance module increases the overall quality of your data ecosystem by ensuring that only quality data enters your database, and it continuously monitors your data streams to ensure they continue to supply data that is complete and of high quality. This, together with the streamlining and seamless integration of data streams in your company by the main Datastreams Platform, ensures companies have a clean and orderly environment in which to manage their data.
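
As a simplified sketch of what validating data at the source can look like (in Python, with made-up field names and rules; our Data Quality and Assurance module is of course far more extensive), incoming records are checked before they are allowed into the database, and anything suspect is quarantined for review:

```python
import re

# Hypothetical validation rules: each required field maps to a predicate.
RULES = {
    "customer_id": lambda v: isinstance(v, str) and v.strip() != "",
    "email": lambda v: isinstance(v, str)
    and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "order_total": lambda v: isinstance(v, (int, float)) and v >= 0,
}


def validate(record: dict) -> list:
    """Return a list of problems; an empty list means the record is clean."""
    problems = [f"missing field: {f}" for f in RULES if f not in record]
    problems += [
        f"invalid value for {f}: {record[f]!r}"
        for f, ok in RULES.items()
        if f in record and not ok(record[f])
    ]
    return problems


def ingest(records: list, database: list, quarantine: list) -> None:
    # Only clean records enter the database; the rest are quarantined
    # for review instead of silently polluting later analytics.
    for record in records:
        (database if not validate(record) else quarantine).append(record)
```

Rejecting (or quarantining) dirty records at the point of entry is what keeps pollution from accumulating between periodic cleaning sessions.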

Implementing our solution does not mean you won’t ever have to clean your data (cleaning remains imperative for keeping your data up to date and removing data you no longer need), but it makes these periodic cleanings a lot less time-consuming. Want to know more about our Data Quality and Assurance module and how it works? Visit our page about it.


Martijn Lamers: a talk about company involvement in contemporary education

As a young, innovative company we love to invite students into our office and have them work with us on one of our projects, which is one of the many reasons we work together with the Fontys University of Applied Sciences. Martijn Lamers is a teacher there and director of the data science minor. In the context of ‘company case’ assignments, he has supervised many of the students during their time working with us. We invited him for a good talk on data science, student projects and the future of education.

It is clear that Martijn Lamers has a heart for data science, people and teaching. With a background in both psychology and IT, he has a solid understanding of both people and data science. As a company always concerned with keeping the ‘human touch’ in big data alive, we instantly feel a kind of kinship with this well-spoken lecturer.

The role of companies in education
We believe that companies have an important role to play in education, now and in the future. Lamers agrees with us, telling us about the so-called ‘proftaak’: a practical project by a group of six students to put their skills into practice. “In this project, students need to run a project from beginning to end, from data collection to reporting. Everything comes together, it’s no longer fragmented and theoretical,” Lamers explains. He tells us that these projects are often done in collaboration with companies, for good reasons.

Lamers indicates four reasons for collaboration between education and companies. The first is that finding enough big, interesting and available datasets for several groups of students is not easy without involving companies. Secondly, he explains that companies know what is happening in the industry, so student projects completed for a company fit better with the contemporary industry. Thirdly, working with companies makes students feel like they are contributing something to society. “The project doesn’t just disappear in the bin after it’s finished,” he jokes. “It’s much more fun for students to work with real data in a real company, solving real problems.”

Additionally, working for a company is an important part of a student’s personal development. “At some point you need to break through the passive mentality of a student waiting for an assignment.” Working in a company is a good way to teach students a new ‘working’ mentality that they will need when they start their careers.

Finally, Lamers explains why companies are motivated to work with Fontys: “Both parties benefit from the project: companies get access to young, motivated students to help them with their projects at limited cost, and it allows students to learn from companies and work with real data.”

The future of education
Regarding the future of education, Lamers clearly envisions the role that companies will continue to play. “I think the role of companies will become bigger in the education of the future,” he explains. “At Fontys, we now start inviting companies into the classroom earlier than we did before. It gives students access to the knowledge held by industry professionals.”

It does affect the role of the teacher. “Students are sometimes left to ‘figure out’ a lot for themselves, especially when companies are involved. That is not necessarily bad, but it is still important for students to be taught theory and be guided by a teacher.”

Working with Datastreams
We are proud to be one of the companies that Fontys can turn to when a group of students needs a good dataset or an exciting project to work on. In fact, five groups have worked with us since last year, with positive results. When asked for feedback on his and his students’ experience in working with us, Lamers praises the fact that we like to provide sizeable sample sets within a short timeframe, allowing students to get to work quickly. Some students have chosen to continue working with us after their project ended, a clear sign that we are doing the right thing!

We understand that, more than ever, to get the most out of their talents, students need to work with real data on real projects in innovative companies. Why not start a collaboration today?


Young & Talented: 5 reasons why we love working with students

We are Datastreams, and we love data. We help companies to collaborate with data and create new opportunities. We are always looking for talented students to join our team of data scientists. Why students? Because we love working with people who are like us: smart, talented & ambitious. Want to know more? Here are five more reasons why we love welcoming students to our office.

Students are willing to learn

We are always looking for young talent to share our knowledge with. In our experience, students are open to learning new things and less likely to get bogged down in their presuppositions about how things should work. A willingness to learn and the ability to adapt to new situations are the best skills you can have. You can gain experience over time, after all.

Students are passionate

Students are passionate about the work they do and highly motivated to use their skills to solve actual challenges in a real business. Student passion is not just a great contribution to our projects; it also keeps reminding us of why we love what we do.

Students are technologically savvy

As a data science company, we are no strangers to new, innovative technology. In fact, we have developed our fair share ourselves. However, many current students have grown up in a world permeated by IT in every facet of life, making navigating websites and applications second nature. A natural affinity with technology, combined with quality education about current and future trends, means students are better equipped than ever to work with complex IT applications; both now and in the future.

Students are willing to take risks

More so than the generations preceding them, students are willing to take risks. They are willing to go abroad to make memories or to give up a comfortable place to live to find their own footing. It is this willingness to take a gamble that we like in students: instead of working on tried-and-true projects that are industry standards, students are willing to (and often want to) try new, innovative solutions. This makes students perfect candidates to work on more unorthodox, experimental projects. Sometimes all it takes is someone willing to take that leap of faith to get amazing results.

Students are fun!

It’s not all business. The final reason why we like working with students is that they always bring life and energy into our office. Most of our team consists of young adults who still know what it was like to be a student and enjoy interacting with them. Whether they play in our office foosball tournament, organise ice-skating trips or just regale us with stories about student life, it’s always more fun when we have a student (or two) in our office.

Who will be our next student colleague?


Why students love working with us

At Datastreams, we always have some students helping around the office. We already discussed the reasons we love to work with these students in our blog ‘5 reasons we love working with students’. However, every story has two sides, so let’s look at the reasons students love working with us (according to two of our current student employees).

1. Our students work on varied projects with real data

Students have told us that one of the best parts of working with us is the opportunity to work with real data. Students working with us get the opportunity to work on a variety of projects for real clients. This not only allows students to experience the trials and tribulations that can come with working with real data, but also to see their projects implemented by companies: “Many of the projects I’ve worked on are still being used,” one of our student employees told us. “It felt good to work on real, useful projects in addition to studying.”

2. Our office is a great learning environment

Learning about data science and IT at school is very valuable, but our students often tell us that being immersed in our data-driven environment is a fantastic learning experience. Data is the lifeblood of our company; it is our core business and ingrained in everything we do. Maybe it is because of this that students tell us they learn a lot by listening to our data professionals and working on innovative projects. Got a question about data science? Chances are someone in our office knows the answer.

3. We offer flexible working hours

From our experience with students, we know that sometimes they have a lot of time to work and sometimes it’s exam week and they are completely swamped. We empathise. In fact, many of us still vividly remember it. That’s why we offer our students flexible working hours and the possibility to work from home.

4. Our office is never boring

Our office is full of young, enthusiastic, friendly people who are as happy to talk about data science as they are to share a beer or play foosball. That’s why it is never boring in our building.

Are you interested in working with us or do you want to know more about who we are and what we do? Don’t hesitate to contact us!


Dear Santa, people don’t want to be on your list anymore

Dear Santa Claus, last year we expressed our concerns about your data collecting activities. We advised you to make some big changes to avoid being fined under the GDPR. One of the changes we suggested was to ask people for consent before tracking them with your Elv3s software. We know that these lists are an important part of your business and help you give everybody the perfect personalised present, but we think it really might be time to change with the times. Because, Santa, as it turns out, more and more people don’t want to be on your list anymore.

According to data gathered across our own platforms, the number of people indicating they do not want to be tracked is steadily increasing. We have observed a 26% increase in Do Not Track headers in the last three months across our platforms. It seems clear that many people do not wish to have their behaviour tracked. We know that you understand, more than anyone else, the importance of granting people’s wishes. It is how you earned your jolly reputation, after all!

We understand that finding the perfect presents for people who don’t want to be on your list might be a bit more difficult. However, we are sure you will find a way to make everybody smile this Christmas, whether they are on your list or not. Merry Christmas, Santa!

P.S. If you need some help getting your consent practices up to date, we are happy to help you. That’s our gift to you, Santa!


The world of data is changing

The world of data is constantly changing and evolving. New technologies, legislation and policies pressure companies to re-examine the way they deal with data. Because it’s better to be prepared than to be surprised, here are five interesting ways in which the world of data is changing.

1. More and stricter legislation

The General Data Protection Regulation (GDPR) was not the first piece of legislation cracking down on irresponsible data use, and it certainly won’t be the last. Government and non-government institutions around the world are establishing new policies and laws for processing data in a more ethical, transparent and secure manner. Some examples of recent laws are the California Consumer Privacy Act (CCPA), the Indian Data Protection Bill and the Brazilian General Data Privacy Law (Lei Geral de Proteção de Dados Pessoais or “LGPD”). Even in Africa, a continent where more than half the countries have no data protection law, change might be on the horizon, with Kenya drafting a new law to protect customer data. With a future of increased legislative pressure, solutions built for compliant consent and encryption are becoming increasingly important.

2. People are more aware than ever

The days when data subjects were ignorant of the data being collected about them are gone. In May 2018 the Global Alliance of Data-Driven Marketing Associations (GDMA) published their research on global privacy attitudes, based on a survey conducted in November 2017 across ten countries. The report showed that while the majority of respondents were prepared to share their data, 74% reported being ‘concerned’ about their online privacy, with 83% indicating they want more control over the data they share.

With incidents like the Facebook and Cambridge Analytica scandal occurring earlier this year and the GDPR coming into effect, awareness of online privacy has only increased. A study by Janrain conducted in the US showed that 57% of respondents had increased concerns about their data privacy as a result of the Cambridge Analytica scandal. Additionally, according to a survey by SAS, a quarter of consumers in the UK and Ireland have already exercised their GDPR rights. It is clear that people have woken up to the issue of privacy and will likely grow in their understanding and awareness of online privacy issues. Since trust is the foremost reason that customers are willing to share their data with companies (as reported by the SAS survey), building trust through transparency is key to ensuring customers will continue to share data, even as new legislation and policies give them ever more power to stop doing so.

3. Privacy-conscious browsers are on the rise

On August 31st, Mozilla announced that it would start implementing changes to the Firefox browser to protect their users’ privacy. Future versions of Firefox will block web trackers by default, meaning users won’t need to take any action to prevent companies from following them across the web. Firefox is not unique in offering this do-not-track option, but is unique in making it the default option. While Chrome is still overwhelmingly the most frequently used browser, the focus on privacy by its number-one competitor combined with doubts about its own incognito privacy mode, might cause privacy-conscious individuals to make the jump.

In addition to established browsers like Firefox making changes to ensure user privacy, new browsers with a focus on privacy also appear to be on the rise. TOR has long been a popular choice to avoid tracking, but other browsers like Epic and Brave have also opted to target the privacy-conscious market. The latter makes use of the anonymous search engine DuckDuckGo and integrates with TOR to make private browsing via its ‘onion network’ easy and fast. On August 28th, Brave announced surpassing 10 million downloads on Android (up from 1.5 million in April).

While most internet users likely won’t make the switch from Chrome to a different browser soon, data analysts and marketers would do well to look at how other browsers are implementing privacy measures, if only to know what to expect when Google is put under pressure to make similar changes.

4. More advanced predictive analytics

The way we collect data may be changing, but so are the things we can do with our data. Predictive analytics using artificial intelligence, deep learning and machine learning are increasingly finding their place in marketing, allowing marketers to not only look to the past, but also into the future when using data. Technologically savvy companies use these advanced techniques to predict customer behaviour, identify potential leads and target customers at the right time with the right products. Research suggests that the investment is worth it: companies using predictive analytics are twice as likely to identify high-value customers, according to a study by Aberdeen Group.

Under the GDPR, many companies are faced with a push towards (partial) anonymisation of data. Technologies such as machine learning might grow in popularity as less data contains personal identifiers. This is because, aside from predicting the value or behaviour of a single prospect, these technologies also work well on aggregated datasets without personal identifiers. In this way, they are useful for predicting the behaviour of groups, making them invaluable for anticipating various types of market trends. Under the GDPR, then, predictive modelling using machine learning might prove instrumental in helping companies deal with larger amounts of (pseudo)anonymous data.
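
As a small sketch of this idea (in Python with scikit-learn, using made-up aggregate figures; a real model would need far more data), a simple regression learns group-level demand from weekly aggregates that contain no personal identifiers at all:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Aggregated, anonymised weekly data: no personal identifiers anywhere.
# Features per week: [site visits, ad impressions, average discount %]
X = np.array([
    [12000, 40000, 5.0],
    [15000, 52000, 7.5],
    [11000, 39000, 2.5],
    [18000, 61000, 10.0],
])
y = np.array([310, 420, 275, 530])  # orders placed that week

model = LinearRegression().fit(X, y)

# Predict next week's group-level demand from planned marketing activity.
next_week = np.array([[16000, 55000, 7.5]])
print(f"expected orders: {model.predict(next_week)[0]:.0f}")
```

Because the model only ever sees aggregates, the prediction concerns the behaviour of a group rather than any identifiable individual.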

When touting the possibilities of technologies such as machine learning, we would be remiss not to remark upon a possible tension between machine learning and the GDPR. Under the GDPR, customers have the right to be informed of how their data will be used and to opt out of automated decision-making practices. Some experts have suggested that this is difficult to square with machine learning, as machine learning models are generally not concerned with why specific choices are made. According to critics, constantly adapting ‘black box’ machine learning models cannot be adequately explained to data subjects, making informed consent for data collection impossible. We believe that predictive analytics, including machine learning, is still very much possible under the GDPR, but anyone working with these technologies should be mindful of issues like these and handle them with appropriate care.

5. Focus on data quality, not quantity

Despite the seemingly limitless promises of so-called ‘big data’, many marketers are still drowning in data. Partially to blame for this is the misconception that ‘more data is better’. While this is true to the extent that most predictive models work best with large amounts of data, the importance of data quality should not be underestimated. Both scientists and business experts are now stressing the importance of quality data, the latter noting that bad data has many negative effects, like wasting time and increasing costs.

Quality data is more important than ever as machine learning becomes more popular. For a model to accurately learn and predict customer behaviour, the data set used to train the model needs to be as accurate as possible. Many companies currently have access to fairly large datasets, but many of these are messy and unorganised. As the use of advanced data mining techniques in marketing and analytics increases, it will be the companies that focus on data quality, instead of just quantity, that will have access to the most reliable models and all the valuable insights that come with them.

Change can be scary, but it can also be good. It’s the organisations that anticipate changes and plan ahead that thrive in data-driven industries. What changes do you anticipate and how are you preparing for them?