Big data has many advantages. Both business giants and small businesses are investing in big data management. But how should it be managed, and which techniques should be implemented? Is artificial intelligence (AI) a panacea for today’s challenges? These and other questions were discussed with Wannes Meert, a lecturer at the Kaunas University of Technology (KTU) Big Data School, which will take place on September 30–October 2, 2020.
Wannes Meert is a Research Manager in the machine learning research group Declarative Languages and Artificial Intelligence (DTAI) at KU Leuven in Belgium; he works in the areas of AI, machine learning (ML) and anomaly detection.
Mr Meert, today the world creates an enormous volume of data. Are we capable of managing it? What does the future hold?
If we are talking about digital big data, for example, online shopping data or social media information, the systems we have today are impressive. Think of Twitter, which generates millions of likes and messages.
However, when we talk about the analogue world, it is different. Although a great deal of data is generated every day, we have two major problems: there is only a limited number of sensors, and the metadata, i.e. information about the structure and meaning of the data, is often missing.
Imagine that we are looking at a working mechanism without knowing whether it emits exhaust gas or what its driving force is. In the same way, we often lack accurate data.
Let’s say that a machine stops functioning. It is important to find out the cause: whether it stopped because of a fault or, for example, for a planned technical inspection. Usually, we do not have this information, or it is only indirect, for example in the machine’s technical data sheet. I see enormous challenges here.
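(Editorial illustration, not part of the interview: the sketch below shows why the missing metadata matters. Using scikit-learn’s IsolationForest on invented sensor readings, an unsupervised model can flag unusual behaviour, but it cannot say whether an anomaly is a fault or a planned inspection; all names and numbers are hypothetical.)

```python
# Minimal sketch: unsupervised anomaly detection on unlabeled sensor data.
# Without metadata or labels we can flag unusual readings, but not explain them.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Hypothetical sensor readings: [temperature, vibration] over 1000 time steps
normal = rng.normal(loc=[60.0, 0.2], scale=[2.0, 0.05], size=(990, 2))
unusual = rng.normal(loc=[90.0, 1.5], scale=[5.0, 0.3], size=(10, 2))
readings = np.vstack([normal, unusual])

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(readings)
flags = model.predict(readings)  # -1 = anomalous, 1 = normal

print(f"{(flags == -1).sum()} readings flagged as anomalous")
# The flag alone cannot distinguish a fault from, say, a planned inspection --
# that distinction needs the missing metadata described above.
```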
Data has become the world’s most valuable resource. Are companies that do not invest in data management and analysis at risk of becoming uncompetitive?
Companies that want to manage and control big data seek to implement AI. However, they face two challenges. First, there is no simple way to find the right AI solution, so long-term collaboration between science and business partners is crucial.
Second, problems that seem similar may require different existing AI methods, or methods that do not even exist yet. Only a team of experts from different disciplines can deliver this, and there are not many such teams.
If these challenges are not addressed, the potential of AI remains disappointing and unfulfilled. This observation is not based only on my own experience; many companies that use AI in their processes say the same.
Still, many companies plan to increase their investments in AI, not only because they expect a lot in return, but also because competitors who have successfully adopted AI might come to dominate the market.
It is difficult to predict whether a given task will be easy to automate. But remember companies like Amazon, Booking or Uber, which have gained a large share of the market by automating their services and are making huge profits as a result.
It seems that business understands the benefits of big data and AI, and there is no need to encourage companies to invest. But do they have enough professionals to implement and support these solutions?
I think we don’t have enough qualified professionals, especially those who understand the essence of the methods. Many AI tools are now accessible to non-experts, but it is important to understand the limits of the methods being used.
In my opinion, Earl Wiener’s law for aviation and human errors applies to AI as well: “Digital devices tune out minor errors while creating opportunities for large errors.”
Of course, you will not avoid mistakes if you never try. Recent years have shown that big data and its management have had a great impact on business models and culture. What does the future hold? In which areas will big data have the greatest potential?
As we can see from today’s examples, digital services have the greatest potential. The more you can measure and test, the easier it is to automate. Everything that happens online is easy to observe, while what happens in the analogue world is more difficult to capture.
Applications in medicine are also developing rapidly, and I am optimistic, even though the human body is very complex.
I would say that smart factories, the internet of things and Industry 4.0 are the main factors enabling the implementation of big data and AI in industry.
Big data impacts many sectors. There is a lot of potential, but what about the risks? Or are there none? What are the negative consequences, and what are the risks of using big data in decision making?
Risks? Misunderstanding the limits of model application and the stochastic nature of data. Again, it is worth remembering Wiener’s law. To reduce the risk, models should be able to answer questions and thus enable counterfactual thinking.
For example: what should I do to get a loan? Research shows that the field of explainable AI methods is promising here. The notion of causality is also intriguing in this context; J. Pearl and D. Mackenzie’s The Book of Why is very interesting in this sense.
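(Editorial illustration, not part of the interview: the sketch below renders the loan question as a counterfactual. A toy logistic-regression model is trained on invented data, and a brute-force search finds roughly how much higher an applicant’s income would have to be for the decision to flip; every feature, threshold and number is hypothetical.)

```python
# Minimal sketch of a counterfactual explanation for a toy loan model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Hypothetical training data: [income in kEUR, existing debt in kEUR]
X = rng.uniform(low=[10, 0], high=[120, 60], size=(500, 2))
y = (X[:, 0] - 1.5 * X[:, 1] > 20).astype(int)   # 1 = loan approved

model = LogisticRegression().fit(X, y)

applicant = np.array([[25.0, 20.0]])             # a (likely rejected) applicant
decision = "approved" if model.predict(applicant)[0] == 1 else "rejected"
print("Model decision:", decision)

# Brute-force counterfactual search: how much more income is needed?
for extra_income in np.arange(0, 100, 0.5):
    candidate = applicant + [[extra_income, 0.0]]
    if model.predict(candidate)[0] == 1:
        print(f"Counterfactual: an income higher by ~{extra_income} kEUR "
              f"would flip the decision to 'approved'.")
        break
```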
KTU Big Data School 2020, an event for researchers and practitioners who work with big data, will take place from 30 September to 2 October. Topics covered include data classification and clustering, the application of recurrent artificial neural networks, and the application of data science and internet of things methods in banking and finance.
More information at https://bigdataschool.ktu.edu