Data and AI - how to use them to create value now

01 | 2023 Ken Ikeda, Hands-on Azure / Solution Architect and Senior Consultant


Understanding the needs of the customer and end-users is everything. Don't start big implementation projects or IT solutions until you know what the benefits are for your users. Take into account that achieving real business benefits may require not only customising systems but also streamlining your own processes. Then you can start working together on value-added projects.

Just 5 years ago, companies were dreaming of using artificial intelligence and real-time data. Terms like deep learning, machine learning and IoT platforms sat near the peak of the 2017 Gartner Hype Cycle for Emerging Technologies, and companies were conducting all kinds of studies and experiments on these topics.

Many succeeded in developing solutions that made users' daily lives easier, but for many others, AI and IoT remained experiments.

Today, AI and real-time data are widely used and part of our everyday lives at work and at home. Many of us have a robotic vacuum cleaner at home, working autonomously on real-time data. Many optimise their own electricity consumption by monitoring electricity price fluctuations via the Fingrid mobile app.

Ken is a seasoned data expert and Hands-on Azure/solution architect with over 20 years of experience.

Speaking of AI, you can't help but love ChatGPT. It's clear that AI-based chatbots like ChatGPT are going to revolutionise the everyday life of a software developer like me. They already help us with design, coding and technical documentation. ChatGPT has only just been released, so I can only imagine what it will be capable of in a few years.

When data steers users in the right direction, you are at the heart of value creation. An organisation may, for example, want to steer its end-users towards:

  • saving on costs

  • centralising procurement in one place

  • choosing a specific recommended product

  • anticipating when a product should be serviced (see the sketch below).
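To make the last point concrete, here is a minimal sketch of a usage-based service reminder. Everything in it is illustrative: the telemetry fields, the service interval and the usage profile are assumptions, and a real solution would estimate them from actual data rather than hard-code them.

```python
# Minimal sketch, assuming hypothetical telemetry fields: steer a user towards
# servicing a product before it fails, based on a simple usage-based rule.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class DeviceTelemetry:
    device_id: str
    running_hours: float   # hours of use since the last service (assumed field)
    last_service: date

SERVICE_INTERVAL_HOURS = 2000.0   # assumed manufacturer recommendation
AVG_HOURS_PER_DAY = 6.0           # assumed usage profile

def next_service_due(t: DeviceTelemetry) -> date:
    """Estimate the date by which the device should be serviced."""
    hours_left = max(SERVICE_INTERVAL_HOURS - t.running_hours, 0.0)
    return date.today() + timedelta(days=hours_left / AVG_HOURS_PER_DAY)

def service_recommendation(t: DeviceTelemetry) -> str:
    """Turn the estimate into a concrete recommendation for the end-user."""
    due = next_service_due(t)
    if due <= date.today() + timedelta(days=14):
        return f"Book a service for {t.device_id} now (due {due})."
    return f"{t.device_id}: next service estimated around {due}."

# Usage example with made-up telemetry:
print(service_recommendation(
    DeviceTelemetry("pump-42", running_hours=1900.0,
                    last_service=date(2022, 6, 1))))
```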

Data can also be used to guide organisational activities and streamline operational models by providing members of the organisation with a real-time snapshot of the business and operational environment.

Such requirements demand a lot from the technology platform. It must be able to receive large amounts of real-time data and combine it with static data. In addition, the platform must be able to analyse the data and even apply artificial intelligence. And finally, it must serve the data to its users in near real time.
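As a rough sketch of what this can look like in practice, the snippet below uses Spark Structured Streaming (the engine the Databricks platform discussed later is built on) to read a real-time feed, enrich it with static reference data and serve aggregates in near real time. The broker address, topic name, table path and column names are all hypothetical.

```python
# Minimal sketch, not production code: enrich a real-time stream with static
# data and serve aggregates in near real time using Spark Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col, window, avg
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("realtime-enrichment").getOrCreate()

schema = (StructType()
          .add("device_id", StringType())
          .add("reading", DoubleType())
          .add("event_time", TimestampType()))

# Large amounts of real-time data in (assumed Kafka broker and topic)...
stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "meter-readings")
          .load()
          .select(from_json(col("value").cast("string"), schema).alias("m"))
          .select("m.*"))

# ...combined with static reference data (assumed columns: device_id, site)...
devices = spark.read.parquet("/data/devices")

# ...analysed, and served onwards in near real time.
enriched = (stream.join(devices, "device_id")
            .withWatermark("event_time", "10 minutes")
            .groupBy(window("event_time", "5 minutes"), "site")
            .agg(avg("reading").alias("avg_reading")))

query = (enriched.writeStream
         .outputMode("append")
         .format("console")   # in practice: a serving layer, e.g. a Delta table
         .start())
```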

Traditionally, Big Data architecture has been challenging to create, manage and maintain. Modern data architectures can be implemented in at least three different ways:

  • streaming systems

  • data lake

  • data warehouse.

Each approach has its strengths, but implementing a coherent dataflow has always been challenging and complex. For example, combining real-time and batch data using the Lambda architecture means effectively maintaining two different systems.

Today, tools exist for this purpose and can be applied using different architectural models. Here too, there has been tremendous development in recent years, and today's architectures look quite different from those of 5-10 years ago.

However, good tools or architecture do not necessarily guarantee that a great technical solution will deliver value to the end user. Understanding user needs is everything. Don't start big implementation projects until you know what the benefits are for users.

Make sure that every developer understands what the business needs are and the real benefits for end users. Take care of security and data protection so that users can use the service with confidence.

Take into account that achieving real business benefits may require not only tailoring systems, but also streamlining your own processes.

Please get in touch if you need sparring support or developers for demanding projects where serving data is at the heart of the user experience. At Kipinä, we specialise in demanding software development projects, in leveraging data and AI, and in understanding and helping our customers. And finally, a little more about one of the author's favourite topics to talk and spar about:

One popular tool in the analytics world is Delta Lake on top of Databricks. Databricks is an analytics platform built on open source Apache Spark, available on all major public clouds: Azure, AWS and GCP. Delta Lake is an open source storage layer that runs on top of Spark and adds warehouse-like features such as ACID data integrity guarantees and data history management (time travel). Delta Lake on Databricks makes it possible to combine batch data with real-time streaming data on a single platform, alongside AI and machine learning tooling.
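A minimal sketch of that combination, with assumed table paths and a built-in demo source standing in for real events: a stream is appended to a Delta table, and the very same table can then be read as ordinary batch data, including earlier versions via time travel.

```python
# Minimal sketch, assuming a Databricks/Spark environment with Delta Lake and
# illustrative paths. The "rate" demo source stands in for a real event feed.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-unified").getOrCreate()

# Streaming side: append incoming events to a Delta table.
(spark.readStream
 .format("rate").option("rowsPerSecond", 10).load()
 .writeStream
 .format("delta")
 .option("checkpointLocation", "/delta/events/_checkpoint")
 .start("/delta/events"))

# Batch side (e.g. a separate job): the same Delta table is queryable as
# ordinary data, with ACID guarantees on concurrent reads and writes.
events = spark.read.format("delta").load("/delta/events")
print(events.count())

# Data history management: time travel back to an earlier table version.
snapshot = (spark.read.format("delta")
            .option("versionAsOf", 0)
            .load("/delta/events"))
snapshot.show(5)
```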

Ken Ikeda

p.s. See also our debate on data and AI
