Dreams of the future: Healthcare

As a leading expert in the field of (big) data solutions, we help develop the technologies that are going to shape our future. In this blog we discuss the future of healthcare, which will become more customer-friendly and efficient as we learn to leverage the power of data streaming.

A peek into the future

A patient walks into a clinic or a hospital. He suddenly started feeling ill during his holiday, and an online consultation with his own physician resulted in the advice to see a GP locally. He goes to the GP, who has already received his patient file and the information from the consultation with his own physician. Upon inspection, the GP refers our patient to the hospital for further examination. When he arrives, the hospital already knows who he is, what his symptoms are and any other conditions they should be aware of. This means the hospital already knows about the patient’s allergy to penicillin, even if he forgets to mention it himself. After being examined (using an MRI scan, amongst other equipment), the patient is swiftly diagnosed. Luckily, his condition does not require a stay in the hospital. He receives his treatment and is sent on his way with a prescription and a healthcare wearable to monitor his condition remotely. When he arrives back home, his practitioner is aware of everything and takes over care from the hospital. Additionally, a refill of his medication is ready at his local pharmacy for pickup. Even though different institutions and people were involved in the care of this patient, he experienced a seamless medical journey.

Enabled by data streaming

The seamless healthcare journey our patient experiences is made possible by data streaming solutions enabling data-driven collaboration. Take a look behind the scenes:

  • Solutions for sharing patient records in a secure manner ensure that healthcare practitioners already know vital information about patients and their symptoms when they arrive. This makes care more personal, efficient and safe. It also makes it easy for patients to switch between healthcare practitioners.
  • Our patient doesn’t know it, but the MRI machine used to examine him is part of a healthcare equipment sharing programme. This means that the MRI machine is also used by other healthcare providers and the MRI images are securely streamed to the appropriate institution. This makes healthcare equipment more affordable for smaller institutions. Solutions for handling this type of MRI imaging data responsibly are required to bring such equipment sharing programmes to life.
  • The healthcare wearable our patient wears continuously monitors his biometric data and streams this data to his healthcare practitioner. A solution dedicated to processing and responding to this real-time data allows healthcare practitioners to respond to any warning signals before our patient even notices them (see the sketch after this list).
  • During this visit, our patient has caused plenty of treatment and administrative data to be generated. After leaving the hospital, this data is stored anonymously and used to improve care in the future. Solutions for processing and analysing large amounts of data are instrumental for doing this in an efficient manner. Using a solution like the Datastreams platform, this data can also be shared with other healthcare institutions to improve R&D and enable faster healthcare innovation.
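To make the wearable scenario concrete, here is a minimal sketch of how a stream of biometric readings might be screened for warning signals. The Reading shape, the heart-rate thresholds and the notify callback are illustrative assumptions, not a real device or platform API.

```typescript
// Minimal sketch: screen a stream of wearable readings for warning signals.
// The Reading shape and the thresholds are illustrative assumptions.
interface Reading {
  patientId: string;
  heartRateBpm: number;
  takenAt: Date;
}

// Alert the practitioner when a reading leaves the expected range.
function onReading(reading: Reading, notify: (msg: string) => void): void {
  const { patientId, heartRateBpm } = reading;
  if (heartRateBpm < 40 || heartRateBpm > 140) {
    notify(`Warning: patient ${patientId} heart rate at ${heartRateBpm} bpm`);
  }
}

// Example: one incoming reading from the stream.
onReading(
  { patientId: "p-42", heartRateBpm: 150, takenAt: new Date() },
  (msg) => console.log(msg) // in practice: notify the practitioner's system
);
```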

The future is not here yet, but we’re moving towards it every day. Ready to move towards the future with us? Read more about how our Datastreams Platform can help healthcare institutions here.

Dreams of the future: Finance

As a leading expert in the field of (big) data solutions, we help develop the technologies that are going to shape our future. In this blog we discuss the future of financial service institutions, which will only become more data-driven as pressure on these institutions increases. Today we present a snapshot of the financial institution of the future and the data streaming technologies that power it.

A view of the future

The financial institution of the future is faster and more customer-friendly than the financial institutions of our current time. When interacting with their bank online, customers have a seamless experience that is tailored to their preferences. Additionally, they are proactively approached about possible issues or opportunities. For instance, a customer might be approached about the possibility of scheduling a weekly payment when the system notices weekly manual transfers, or be asked if they have made an error when exhibiting abnormal behaviour (e.g. transferring €1050 instead of €10,50). These suggestions are frequently made at relevant times while the customer is actively interacting with the system, making banking more personal, engaging and meaningful.

Customers in the future also enjoy a more open and seamless financial experience across platforms. Open banking regulations mean they can choose from a plethora of third-party applications for managing their finances in a seamless and easy manner. The financial customer of the future has no problem consenting to their data being shared for this purpose, trusting the high security and privacy standards upheld by financial organisations. 

Finally, aside from offering a better banking experience for customers, the financial institution of the future is also much safer and less likely to facilitate fraud or money laundering. Banks know their customers and can instantly detect behaviours that are indicative of (future) fraudulent behaviour or that suggest an account has been compromised. When warning signs are detected, immediate action can be taken and financial crimes such as fraud and money laundering can be prevented in real-time.   

Enabled by data streaming technology

The future of the financial industry described above will be enabled for a large part by (real-time) data integration, streaming data analytics and event stream processing.

  • More personal customer experiences are achieved by utilising some of the over 2.5 exabytes of data that customers are generating daily. Bringing together customer data from different sources and performing analytics on this integrated source of customer data helps institutions know their customers and engage them in a more personal manner with marketing or customer relation messages.
  • Real-time analytics and event-stream processing, combined with historical customer data, are at the core of enabling real-time personalised engagement with customers. They allow financial institutions to respond to customer behaviour as it is happening, enabling timely, relevant cross-selling and immediate responses to uncharacteristic behaviour. They also enable issues to be detected and resolved proactively before they hinder a customer’s online experience.
  • Sharing customer data with certified third parties upon request, while also adhering to the strict GDPR regulations regarding data sharing and customer consent, requires a solution capable of performing a delicate balancing act. A platform like the Datastreams Platform, dedicated to both data governance and data collaboration, allows financial institutions to build a system where financial data can be shared safely, securely and seamlessly.
  • Real-time analytics and event-stream processing enable organisations to monitor ongoing transactions and immediately recognise suspicious, fraudulent patterns. This allows banks to detect fraud within seconds and shut down transactions before any damage is done. Comparing ongoing transaction data with historic transaction data also allows identification of uncharacteristic customer behaviour, potentially safeguarding customers from losing money in cases of hacked or stolen accounts.
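As an illustration of the last point, the sketch below compares a live transaction against a customer's historical baseline and flags strong outliers. The Transaction and CustomerBaseline shapes, the z-score rule and the threshold are illustrative assumptions, not a description of any particular bank's detection logic.

```typescript
// Minimal sketch: flag a live transaction that deviates strongly from a
// customer's historical spending baseline. Shapes and threshold are
// illustrative assumptions.
interface Transaction {
  customerId: string;
  amount: number; // in euros
  timestamp: Date;
}

interface CustomerBaseline {
  meanAmount: number;   // mean of historical transaction amounts
  stdDevAmount: number; // standard deviation of historical amounts
}

// Flag transactions more than `threshold` standard deviations above the mean.
function isSuspicious(
  tx: Transaction,
  baseline: CustomerBaseline,
  threshold = 3
): boolean {
  if (baseline.stdDevAmount === 0) return tx.amount !== baseline.meanAmount;
  return (tx.amount - baseline.meanAmount) / baseline.stdDevAmount > threshold;
}

// Example: a €1050 transfer from a customer who usually moves €10 to €50.
const baseline: CustomerBaseline = { meanAmount: 30, stdDevAmount: 15 };
const tx: Transaction = { customerId: "c-123", amount: 1050, timestamp: new Date() };
console.log(isSuspicious(tx, baseline)); // true -> hold for review
```

In a production system the baseline would be computed continuously from the historical transaction stream and combined with many more signals, but the principle of comparing live events against history remains the same.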

The future is not here yet, but we’re moving towards it every day. Ready to move towards the future with us? Read more about how our Datastreams Platform can help financial institutions here.

Dreams of the future: Education & research

As a leading expert in the field of (big) data solutions, we help develop the technologies that are going to shape our future. In this blog we discuss the future of education and research, in which institutions will become more efficient, collaborative and student-friendly through data streaming technologies.

The future of education

In the research and education institution of the future, collaboration is central: collaboration between research institutions and the corporate world, between researchers from different institutions, and between students and researchers. Data-driven research is no longer largely limited to a single institution. Instead, each research institution has data collaboration solutions set up that allow researchers to easily combine data collected in their experiments with open source data and data from previous researchers at other institutions. The data collected and generated by their own research is subsequently shared with other research institutions upon request. This improves the data available to researchers in the network of research institutions, improving research outcomes and replicability.

The data-driven collaboration in the research institutions of the future also extends to students. As part of their studies, students are invited to participate in research projects from their own university or from external research organisations. This allows students to gain experience in working with real data, improving the level and practical applicability of their education. It also enables research institutions to conduct faster and more efficient research by tapping into the passion and skill of students learning about state-of-the-art analytics and research methods.

Enabled by data streaming

The future of education and research we envision will require research and education organisations to share (research) data with each other in a safe and secure manner. It will also require these organisations to be able to use data from other third-party sources and integrate it with their own in-house data easily and reliably. Platforms designed for data collaboration will enable organisations to do so seamlessly, providing a controlled and monitored, yet easily accessible, infrastructure for secure data sharing between organisations. These platforms will also allow researchers and students to easily integrate data from different sources for their research projects without requiring extensive technological skills.

The future is not here yet, but we’re moving towards it every day. Ready to move towards the future with us? Read more about how our Datastreams Platform can help education and research institutions here.

Solutions for 360° customer journey data under the GDPR

In 2016, Gartner recognised the market need for improved customer journey analytics. In that same year, the GDPR was adopted by the European Parliament, drastically limiting the data collection activities at the core of these analytics. Since the regulation went into effect last year, we have been keeping an eye on current discussions in data-driven organisations. We have noticed that many companies desire to implement a 360° customer journey data strategy but are not sure how to do this in an effective and GDPR-compliant manner. We discuss the current landscape of 360° customer journey analytics under the GDPR and examine what data-driven businesses look for in relevant solutions.

A challenge in data wrangling

360° customer journeys require organisations to handle the ever-increasing volume, variety, velocity and veracity of all the ‘Big Data’ present in their databases. Many organisations struggle to do so, often because their data is distributed across many silos and owned by different stakeholders and teams. This causes marketing analysts to use many different tools to leverage this data. Web and mobile analytics, social analytics, media analytics, customer journey analytics and voice-of-the-customer analytics are all performed on different datasets with different tools.

Using many different tools on different datasets is not only inefficient, but also hinders attempts to optimise customer journey analytics. The key problem in 360° customer journey analytics is, in the words of Gartner: “Today’s marketing analyst uses too many tools and all of these are looking at bits and pieces of what to a person is – whether called a target or audience or prospect or customer, it’s still a person, people – a single relationship”. Paramount in 360° customer journey analytics is therefore bringing these different sources of information together into an integrated customer profile. 360° customer journey analytics thus requires effective solutions for streaming the right data, to the right place, in the right format, at the right time, and doing so any time, all the time.

A challenge in GDPR compliance

With the GDPR in effect, platforms capable of facilitating integrated data streaming are no longer enough to ensure compliant customer journey analytics. Platforms also need to incorporate the functions and restrictions required to perform customer journey analytics in a GDPR-compliant manner. This involves asking customers for informed consent before collecting their data, storing it securely according to GDPR standards, processing it according to Privacy by Design principles and deleting all traces of a customer when they request it. To ensure this can be accomplished, a solution including a strong data governance layer is paramount.
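As a minimal sketch of what consent-gated collection can look like in practice, consider the following. The ConsentStore class, the purpose names and the collectEvent function are illustrative assumptions, not the Datastreams Platform API.

```typescript
// Minimal sketch of consent-gated data collection and erasure.
// Class, purposes and function names are illustrative assumptions.
type Purpose = "analytics" | "marketing" | "personalisation";

class ConsentStore {
  private consents = new Map<string, Set<Purpose>>();

  grant(userId: string, purpose: Purpose): void {
    if (!this.consents.has(userId)) this.consents.set(userId, new Set());
    this.consents.get(userId)!.add(purpose);
  }

  hasConsent(userId: string, purpose: Purpose): boolean {
    return this.consents.get(userId)?.has(purpose) ?? false;
  }

  // "Right to erasure": drop every trace of the user that we hold here.
  erase(userId: string): void {
    this.consents.delete(userId);
  }
}

// Collect an event only when the user has consented to the given purpose.
function collectEvent(
  store: ConsentStore,
  userId: string,
  purpose: Purpose,
  payload: Record<string, unknown>
): void {
  if (!store.hasConsent(userId, purpose)) return; // no consent, no collection
  console.log(`collected for ${userId}:`, payload); // forward to the pipeline
}

const store = new ConsentStore();
store.grant("u-1", "analytics");
collectEvent(store, "u-1", "analytics", { page: "/pricing" }); // collected
collectEvent(store, "u-1", "marketing", { page: "/pricing" }); // dropped
store.erase("u-1"); // customer requested deletion
```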

Additional industry demands

We have established that 360° customer journey analytics under the GDPR requires a solution that is capable of not only integrating data from a variety of sources in a variety of formats, but also of doing so in a fully GDPR-compliant manner. The industry places additional requirements and demands on solutions for 360° customer journey data management. We have observed the following main themes in what DPOs and Chief Privacy, Revenue, Data and Marketing Officers look for in solutions:

  • While some organisations seek complete end-to-end GDPR solutions from a single provider, this is not always the case. Demand for solutions addressing specific areas within the legislative requirements is prevalent, especially universal consent management and data stream governance.
  • Organisations want enhanced data security, governance, auditing and user access controls built in. Key users of GDPR solutions are going to be DPOs who, as part of their enhanced role under the GDPR, should be able to advise, view, take control and sign off on data streaming from one place to another. They also need to be able to report on and audit this when it comes to fulfilling GDPR data subject rights.
  • We are seeing a lot of traction from enquiries about building collaborative partnerships that combine smart, specialist technology from multiple suppliers.
  • The majority of enquiries we receive are cloud/SaaS-based, but easy-to-deploy on-premise implementation is also something the market asks solutions for.
  • Customers want to be able to quickly start (or improve) their GDPR data journey within one business area, data source and destination. This means that a solution which scales quickly to multiple areas, sources and destinations is highly desirable.

We have combined the two-sided challenge of 360° customer journey management under the GDPR with these industry demands to create the Datastreams Platform. This platform allows companies to collect, integrate and process data from a variety of sources in a GDPR-compliant manner. Modules such as our Consent Management Solution can be included in the full infrastructure or implemented separately if desired. Additionally, the system can easily be linked with technologies from many other suppliers and can be implemented rapidly. On-premise implementation is also available. Finally, the system is highly user-friendly, allowing DPOs to take control of their data streams and easily create data-streaming reports when required. Interested in learning more? Request a personal demo.

The future of an advanced Smart City

As a leading expert in the field of (big) data solutions, we help develop the technologies that are going to shape our future. In this blog we discuss the future of smart cities, which are about to become not only a lot smarter, but also a lot more citizen-friendly.

A glimpse of the future

The idea of a smart city generally conjures up images of a technological metropolis. However, in the future, it won’t just be big capital cities that are powered by smart city technology. The increasing presence and availability of the technologies required to make a city smart will lead every city to become a “smart city”. The smart city of the future runs like clockwork. Traffic lights won’t be needed, as the systems of driverless cars and the city’s IoT systems route traffic intelligently and prevent traffic jams from occurring. Passengers rarely have to wait longer than a few minutes for a car to take them to their destination, as all cars are shared and continuously move along the most efficient routes. The city is also clean: smart waste solutions ensure that trash cans around the city are emptied frequently and smartly. On top of that, air quality is constantly monitored, and actions are taken to prevent pollution levels from rising above acceptable levels.

The city is not just a smoothly running machine, however. The smart city of the future is, above everything, a people-oriented city. Citizens are always invited to give their opinion on current developments or improvements for the future through an accessible survey system. Additionally, the future needs and opportunities of citizens are anticipated ahead of time: elderly citizens are informed about possibilities for assistive technology and citizens in need are proactively approached about support programs.

The effect of data streaming technology

Planning the smart city infrastructure to be set up in a city requires a thorough understanding of the way a city functions, where it struggles and how it can be improved. Explorative analytics on an integrated set of urban data helps city planners set up smart cities in an optimal manner. Technologies for collecting, combining and running analytics on large amounts of disparate data will be essential for accomplishing this.

Enabling seamless traffic management in and around cities requires cars to communicate not only with each other, but also to draw on current and past traffic and congestion data. This will require an infrastructure capable of rapid real-time data streaming, integration and processing on a large, distributed scale. Such an infrastructure lies at the root of many other smart city initiatives, such as smart waste solutions and air-quality monitoring programs.

Running a continuous system for citizen participation requires a survey system that can handle streams of possibly sensitive response data in a secure manner and integrate them to create understandable and accurate reports that reflect citizen feedback. A data streaming platform is the perfect foundation to build such a system on.

A smart city relies to a large extent on citizens’ willingness to share their data and to engage with smart city technology. This is especially important for proactively supporting citizens in need. Because of this, citizens need to be confident that their data is handled in a transparent, responsible, secure and compliant manner. A platform that has proven to handle sensitive data in this manner is crucial for earning citizen trust and support for smart city initiatives.

What can you do in the development of a Smart City? Learn how you can collaborate with your partners with our Smart City Solution.

The future of advanced operational excellence in Logistics

As a leading expert in the field of (big) data solutions, we develop the technologies that are going to shape our future. To prepare for that future we analyse the industries we operate in. In this blog we share our image of the future of logistics and how technological innovations will push operational excellence to its very limit.

A glimpse of the future

Imagine a warehouse just outside a highly technologically advanced smart city. This small warehouse doesn’t stand out in size, but in its diversity: the warehouse is completely stocked with goods from a variety of clients. Just as a truck is leaving the warehouse, another truck arrives with goods to fill up the space that just opened up. This is not a coincidence, it’s planned; no storage space goes unused. As the cargo leaves the warehouse, clients are informed about the current location and condition of their goods, and billing is handled automatically. If a problem occurs when the goods are loaded onto the trucks or in the warehouse, it will instantly be documented and all parties involved will be directly informed.

Outside the warehouse, the driverless truck that has just left sets off towards its destinations. It is completely full of packages from a variety of clients with a variety of destinations. Some packages need to be delivered to other warehouses in the distributed warehouse network, others to customers’ homes. Upon leaving the warehouse, the ideal route to deliver all the packages is computed. The driverless vehicle drives from destination to destination, using real-time traffic information and weather data to expertly avoid any traffic jams and deliver packages to their intended location at the promised time.

After all deliveries have been completed, the truck picks up a new load of goods, some of which were ordered as recently as minutes ago. Newly filled with goods destined for customers and other warehouses in the network, it is ready to make a new set of deliveries. A seamless logistics system running 24 hours a day, 7 days a week, around the globe.

The impact of data streaming technology

This glimpse of the future might seem an idealistic or futuristic one, but it’s not as far away as you may think. Many of the techniques required to make this dream come true are already being developed and implemented in streaming solutions such as our Datastreams Platform:

  • Ensuring optimal warehouse capacity by instantly utilising newly freed space requires a system to continuously monitor future incoming and outgoing deliveries from a variety of clients. Technologies enabling extensive data collaboration and business-to-business data streaming are an essential part of enabling this kind of dynamic shared warehousing.
  • Optimally handling issues with broken or incomplete goods requires an automated system for recognising, documenting and photographing any issues as they arise. This information then needs to be disseminated to the relevant parties and integrated into the relevant software systems. We built a system capable of doing this by combining our Datastreams Platform architecture with a custom-made app. Read more about it here.
  • Monitoring each client’s storage space currently in use and handling the pay-per-shelf billing for this requires a seamlessly integrated data architecture that is capable of handling incoming and outgoing deliveries in real-time. Technologies for integrating different streams of data in a structured manner will be required to accomplish this.
  • Utilising transport capacity sharing will require a collaboration platform where different clients can pool their data on plans for incoming and outgoing deliveries. A technology platform where organisations feel secure in sharing potentially sensitive data will play an important part in this.
  • Allowing a (driverless) vehicle to deliver packages on its own will require a lot of data to be collected, streamed and processed in real-time. This is not only required to make such a vehicle functional and safe, but also to allow the routing systems to make decisions based on weather and traffic data and continuously find the optimal route to take. Additionally, integrating the routing system data with the delivery database and customer database will allow customers to continuously be up to date on the status and arrival time of their packages.

The future is not here yet, but we’re building it brick by brick. Want a piece of the future we are building? Our Datastreams Platform empowers logistics providers to bring part of that future into the present, enabling more efficient processes with big data in every part of the supply chain. Want to know more about what it can do? View our page on logistics here.

ITP 2.1: What is changing and how do we deal with it?

Apple has announced plans to sharpen the ITP (Intelligent Tracking Prevention) rules for its Safari browser. ITP version 2.1 is now live and instantly has a major impact on digital marketing and analytics due to its handling of third-party cookies. Firefox has announced similar tracking prevention measures, also cracking down on first-party cookies in addition to third-party ones. In this blog we bring you up to speed on what these tracking preventions mean for organisations and how we have resolved this for the users of our Datastreams Platform.

What is ITP?

ITP stands for Intelligent Tracking Prevention. It represents Apple’s stand against online tracking and has been causing concerns for companies applying personalised marketing since its first incarnation. The first version started by limiting the possibilities for placing third-party cookies, with later releases increasingly limiting the potential for workarounds and alternatives. The previous version 2.0 blocked the placement of third-party cookies altogether. First-party cookies were largely unaffected by ITP. Until now, with the release of ITP 2.1.

What is changing?

The most important change for organisations engaging in digital marketing in ITP version 2.1 is the way that both first- and third-party cookies are handled. After the update, first-party client-side cookies created through JavaScript’s document.cookie expire after seven days. Third-party cookies created by domains other than the current website continue to be blocked, as was the case in ITP 2.0.
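To make the distinction concrete, here is a minimal sketch of the kind of client-side cookie ITP 2.1 now limits. The cookie name and value are illustrative.

```typescript
// A first-party cookie set client-side via document.cookie.
// Even though we request a one-year lifetime, Safari's ITP 2.1 caps
// cookies created this way at seven days.
function setClientSideCookie(name: string, value: string): void {
  const oneYearSeconds = 365 * 24 * 60 * 60; // requested lifetime
  document.cookie =
    `${name}=${encodeURIComponent(value)}; max-age=${oneYearSeconds}; path=/`;
}

setClientSideCookie("visitor_id", "abc-123");
// On Safari with ITP 2.1, this cookie is deleted after seven days,
// regardless of the max-age we asked for.
```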

Where the blocking of third-party cookies had severe consequences for marketeers, the expiry of client-side first-party cookies has the potential to significantly impact analytics. Since site visitors who return after seven days will no longer be counted as returning visitors, current solutions for conversion tracking based on these cookies risk breaking down.

What are we doing about it?

Currently, the solutions to ITP 2.1 are twofold. The first is to drastically limit reliance on third-party cookies. DimML, the language at the core of the Datastreams Platform, already enables our users to do this by allowing a script to be delivered through the same domain as the webpage from which it was loaded. The second is to place first-party cookies through a server-side method instead of through the client-side document.cookie implementation.
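A minimal sketch of the server-side approach, assuming a plain Node.js HTTP server; the cookie name, lifetime and flags are illustrative, and this is not the Datastreams implementation:

```typescript
// Minimal sketch: set the first-party cookie in an HTTP response header
// instead of via client-side JavaScript. ITP 2.1's seven-day cap applies
// to cookies created through document.cookie, not to cookies set with a
// Set-Cookie header from the site's own domain.
import { createServer } from "node:http";

const ONE_YEAR_SECONDS = 365 * 24 * 60 * 60;

createServer((_req, res) => {
  res.setHeader(
    "Set-Cookie",
    `visitor_id=abc-123; Max-Age=${ONE_YEAR_SECONDS}; Path=/; Secure; HttpOnly; SameSite=Lax`
  );
  res.end("cookie set server-side");
}).listen(8080);
```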

We’ve released a new component within our platform that allows our customers to integrate the complete Datastreams Platform, with all its capabilities, within their own domain. This means that the Datastreams Platform is part of your IT architecture and not a third-party application. Data ownership and compliant data management are at the core of our architecture, so it will not be affected by ITP 2.1. This is a core differentiator from many SaaS marketing technology and consent management providers: we give you full control over how you manage your first-party data, with accurate and compliant data ownership driven by our state-of-the-art data architecture.

As the data and privacy landscape continues to change, we will continue to ensure the users of our Datastreams Platform can perform data analysis in an easy, secure and compliant manner. Do you want more information about how we are dealing with the ITP 2.1 update? Contact us!

Why you should (not have to) clean your company database

Spring is here, which means it’s time for a thorough spring cleaning. Aside from clearing the unnecessary papers out of those clogged filing cabinets, consider turning your attention to your company database this year, because according to recent studies of the data practices of contemporary organisations, you probably need to clean your database.

In a world where companies are growing increasingly data-driven, business success depends more and more on analytics based on large quantities of high-quality, trusted data. While many organisations are succeeding in acquiring large amounts of data and applying analytics to them, data quality often leaves a lot to be desired. In a study conducted by Experian, 95% of organisations indicated experiencing wasted resources and unnecessary costs due to poor quality data. This is not surprising, since organisations on average believe 29% of their data to be inaccurate, and as is often said in the field of data science: ‘garbage in, garbage out’.

It is clear from the percentages above that, statistically, it is highly likely that your company can benefit from a good spring cleaning of its database. Ensuring that data is valid, complete, stored in the right places and accurate across the organisation empowers you to trust your data again. This means you won’t have to waste time and money on marketing campaigns that are based on unreliable analytics. However, cleaning your data can be very time-consuming, especially if your data infrastructure is not designed to be managed easily by business professionals. Additionally, data will need to be cleaned regularly to keep your data environment healthy and usable. Luckily, a good data quality monitoring & assurance solution can make your life a lot easier by preventing dirty data from entering your database in the first place and making the cleaning that remains a lot quicker.

Data professionals know that data cleaning is a key part of any database management strategy. However, just cleaning your data periodically is not enough. If you don’t ensure data quality at the source, polluted data will continue to build up between cleaning sessions, potentially throwing off your analytics. That is why a strategy for validating data at the source, before it is analysed or enters your database, is crucial. Our Data Quality and Assurance module increases the overall quality of your data ecosystem by ensuring that only quality data enters your database, and it continuously monitors your data streams to ensure they continue to supply complete, high-quality data. This, together with the streamlining and seamless integration of data streams in your company by the main Datastreams platform, ensures companies have a clean and orderly environment to manage their data in.
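As a minimal sketch of validation at the source, the snippet below rejects malformed records before they ever reach the database. The CustomerRecord shape and the validation rules are illustrative assumptions, not our module's actual checks.

```typescript
// Minimal sketch: validate records at the source so only clean data
// reaches the database. Record shape and rules are illustrative.
interface CustomerRecord {
  email: string;
  country: string;
  signupDate: string; // ISO 8601, e.g. "2019-04-01"
}

function isValid(record: CustomerRecord): boolean {
  const emailOk = /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(record.email);
  const countryOk = record.country.trim().length > 0;
  const dateOk = !Number.isNaN(Date.parse(record.signupDate));
  return emailOk && countryOk && dateOk;
}

// Gate incoming records; reject and count the dirty ones at the source.
function ingest(records: CustomerRecord[]): CustomerRecord[] {
  const clean = records.filter(isValid);
  const rejected = records.length - clean.length;
  if (rejected > 0) console.warn(`${rejected} record(s) rejected at source`);
  return clean; // only validated records move on to the database
}

console.log(
  ingest([
    { email: "a@example.com", country: "NL", signupDate: "2019-04-01" },
    { email: "not-an-email", country: "", signupDate: "yesterday" },
  ])
); // -> only the first record survives
```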

Implementing our solution does not mean you won’t ever have to clean your data (cleaning is imperative for keeping your data up to date and removing data you no longer need), but it makes these periodical cleanings a lot less time-consuming. Want to know more about our Data Quality and Assurance module and how it works? Visit our page about it.

Martijn Lamers: a talk about company involvement in contemporary education

As a young, innovative company we love to invite students into our office and have them work with us on one of our projects. This is one of the many reasons we work together with Fontys University of Applied Sciences. Martijn Lamers is a lecturer there and director of the Data Science minor. In the context of ‘company case’ assignments, he has supervised many of the students during their time working with us. We invited him for a good talk on data science, student projects and the future of education.

It is clear that Martijn Lamers has a heart for data science, people and teaching. With a background in both psychology and IT, he has a solid understanding of both people and data science. As a company always concerned with keeping the ‘human touch’ in big data alive, we instantly feel a kind of kinship with this well-spoken lecturer.

The role of companies in education
We believe that companies have an important role to play in education, now and in the future. Lamers agrees with us, telling us about the so-called ‘proftaak’: a practical project in which a group of six students puts their skills into practice. “In this project, students need to run a project from beginning to end, from data collection to reporting. Everything comes together, it’s no longer fragmented and theoretical,” Lamers explains. He tells us that these projects are often done in collaboration with companies, for good reasons.

Lamers indicates four reasons for collaboration between education and companies. The first is that finding enough big, interesting and available datasets for several groups of students is not easy without involving companies. Secondly, he explains that companies know what is happening in the industry, making student projects completed for a company a better fit with contemporary industry practice. Thirdly, working with companies makes students feel like they are contributing something to society. “The project doesn’t just disappear in the bin after it’s finished,” he jokes. “It’s much more fun for students to work with real data in a real company, solving real problems.”

Additionally, working for a company is an important part of a student’s personal development. “At some point you need to break through the passive mentality of a student waiting for an assignment.” Working in a company is a good way to teach students a new ‘working’ mentality that they will need when they start their careers.

Finally, Lamers explains why companies are motivated to work with Fontys: “Both parties benefit from the project: companies get access to young, motivated students to help them with their projects at limited cost, and it allows students to learn from companies and work with real data.”

The future of education
Regarding the future of education, Lamers clearly envisions the role that companies will continue to play. “I think the role of companies will become bigger in the education of the future,” he explains. “At Fontys, we now start inviting companies into the classroom earlier than we did before. It gives students access to the knowledge held by industry professionals.”

It does affect the role of the teacher. “Students are sometimes left to ‘figure out’ a lot for themselves, especially when companies are involved. That is not necessarily bad, but it is still important for students to be taught theory and be guided by a teacher.”

Working with Datastreams
We are proud to be one of the companies that Fontys can turn to when a group of students needs a good dataset or an exciting project to work on. In fact, five groups have worked with us since last year, with positive results. When asked for feedback on his and the students’ experience of working with us, Lamers praises the fact that we provide sizeable sample sets within a short timeframe, allowing students to get to work quickly. Some students have chosen to continue working with us after their project ended, which is a clear sign that we are doing the right thing!

We understand that, now more than ever, students need to work with real data on real projects at innovative companies to get the most out of their talents. Why not start a collaboration today?

Young & Talented: 5 reasons why we love working with students

We are Datastreams, and we love data. We help companies to collaborate with data and create new opportunities. We are always looking for talented students to join our team of data scientists. Why students? Because we love working with people who are like us: smart, talented & ambitious. Want to know more? Here are five more reasons why we love welcoming students to our office.

Students are willing to learn

We are always looking for young talent to share our knowledge with. In our experience, students are open to learning new things and less likely to get bogged down in their presuppositions about how things should work. A willingness to learn and the ability to adapt to new situations are the most valuable skills you can have. You can gain experience over time, after all.

Students are passionate

Students are passionate about the work they do and highly motivated to use their skills to solve actual challenges in a real business. Student passion is not just a great contribution to our projects; it also keeps reminding us of why we love what we do.

Students are technologically savvy

As a data science company, we are no strangers to new, innovative technology. In fact, we have developed our fair share ourselves. However, many current students have grown up in a world permeated by IT in every facet of life, making navigating websites and applications second nature. A natural affinity with technology combined with quality education about current and future trends, means students are more equipped than ever to work with complex IT-applications; both now and in the future.

Students are willing to take risks

More so than the generations preceding them, students are willing to take risks. They are willing to go abroad to make memories or to give up a comfortable place to live to find their own footing. It is this willingness to take a gamble that we like in students: instead of working on tried-and-true projects that are industry standards, students are willing to (and often want to) try new, innovative solutions. This makes students perfect candidates to work on more unorthodox, experimental projects. Sometimes all it takes is someone willing to take that leap of faith to get amazing results.

Students are fun!

It’s not all business. The final reason why we like working with students is that they always bring life and energy into our office. Most of our team consists of young adults who still know what it was like to be a student and enjoy interacting with them. Whether they play in our office foosball tournament, organise ice-skating trips or just regale us with stories about student life, it’s always more fun when we have a student (or two) in our office.

Who will be our next student colleague?