15 October 2020
1 min read

From big data to edge technologies

Neon sign reading "Data has a better idea", with a skyline in the background

The global market for IoT and AI is expected to reach USD 16.2 billion by 2024 (an annual growth rate of 26%), according to a report by Markets and Markets. This growth is mainly driven by the fact that AI can significantly reduce the cost of prediction. Prediction is needed to assess the value, price and outcome of, for example, manufacturing processes, medical research and analyses of human behaviour. AI provides a nearly consistent picture of reality, far faster and more cost-effectively than any traditional Excel calculation ever could.

And now that we know this and want to benefit from it, we may be facing the biggest challenge: how do we implement it?

Staying connected and relying on powerful data processing in cloud services will not be stable enough. Data is heavy, and more data creates latency and, in the worst case, downtime, which is devastating for a self-driving car, a medical device or a clinical decision support system.

The latest development in AI, and what we work with, is called "edge technologies": smaller, smarter systems connected to a larger cloud AI. The small, smart micro AI is very good at deciding which data sources and bits should be sent to the larger cloud AI for evaluation and which should not. With the help of the micro AI, the system can run without the risk of latency or downtime.
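To make the idea concrete, here is a minimal, hypothetical sketch in Python of what such edge-side filtering could look like: a tiny local model scores each sensor reading and only forwards the readings that look informative to the cloud. The scoring rule, the threshold and the send_to_cloud stub are illustrative assumptions, not a description of any specific product.

```python
# Hypothetical sketch: a "micro AI" at the edge scores each sensor reading
# locally and only forwards the surprising ones to the larger cloud AI.
# The z-score rule and the threshold below are illustrative assumptions.

from collections import deque
from statistics import mean, stdev


class EdgeFilter:
    def __init__(self, window: int = 50, threshold: float = 3.0):
        self.history = deque(maxlen=window)  # recent readings kept on-device
        self.threshold = threshold           # how "surprising" a reading must be

    def is_informative(self, value: float) -> bool:
        """Return True if the reading deviates enough from the local baseline."""
        if len(self.history) < 10:
            self.history.append(value)
            return True  # not enough local context yet: forward to the cloud
        baseline = mean(self.history)
        spread = stdev(self.history) or 1e-9
        self.history.append(value)
        return abs(value - baseline) / spread > self.threshold


def send_to_cloud(value: float) -> None:
    # Placeholder for the uplink to the larger cloud AI (assumed, not a real API).
    print(f"forwarding {value:.2f} to cloud")


if __name__ == "__main__":
    edge = EdgeFilter()
    readings = [20.1, 20.3, 19.9, 20.0] * 5 + [35.7]  # a spike at the end
    for reading in readings:
        if edge.is_informative(reading):
            send_to_cloud(reading)
        # otherwise the reading is handled (or dropped) locally, saving bandwidth
```

The point of the sketch is the division of labour: routine data stays and is handled at the edge, and only the readings worth a closer look travel to the cloud, which keeps bandwidth use and latency low.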

Soon, many processes, components, tools and products, as well as human bodies, will be chipped and equipped with this kind of micro edge AI. Exciting, isn't it? I don't know about you, but I am looking forward to following this development.