AI is experiencing a renaissance. If you listen to the salespeople, you will hear FOMO messaging along the lines of ‘Artificial Intelligence isn’t going to have another winter, so you need to get on board before everyone else’. The reality is more complex and subtle than that, and you should be wary of people giving you glib, superficial messaging.
It is important to be data-driven and inspired by insights. We live in a world with an unwieldy amount of data, plenty of information, some knowledge, and a little wisdom. When you talk to people, you are competing with more data sources than you can imagine: news outlets, celebrities, fake news, old news, people in our inner and wider circles. We have to find a way of communicating so that we are heard and understood above all that. Data is chaotic, and consumers of data behave as active elements of a network, not as passive recipients.
There will be a touch of frost in AI adoption if businesses do not understand or trust AI, and do not know how to test it.
AI is also subject to its data universe: part of the process is selecting the right data so that it can work. Businesses will also need to know how to test it rigorously.
If the future is already here, then how can people make sure that their environment doesn’t get chilly when it comes to AI?
How can organisations move to successful implementation with AI and avoid their own AI Winter?
Data is the language for AI-based solutions. Key factors to be considered when testing AI-based solutions include:
Changes in Input Data
AI solutions need to be tested against changes in input data if the system is to remain effective, so it is important to conduct an analysis of data dependencies.
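One lightweight way to test for changes in input data is a drift check that compares a new batch of inputs against a baseline. The sketch below is illustrative only: the function name, the sample values, and the choice of a two-standard-deviation threshold are all assumptions, not a prescribed method, and production systems would typically use a proper statistical test (such as a Kolmogorov–Smirnov test) instead.

```python
import statistics

def detect_drift(baseline, current, threshold=2.0):
    """Flag drift when the current batch's mean moves more than
    `threshold` baseline standard deviations away from the
    baseline mean. A deliberately simple, illustrative check."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    shift = abs(statistics.mean(current) - mu)
    return shift > threshold * sigma

# Hypothetical sensor readings: a stable batch and a shifted batch.
baseline = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2]
stable = [10.0, 10.1, 9.9]
shifted = [14.2, 13.8, 14.5]

print(detect_drift(baseline, stable))   # → False
print(detect_drift(baseline, shifted))  # → True
```

A check like this can run on every incoming batch, turning “analyse your data dependencies” into a concrete, automatable test.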
Developing curated datasets
The input and the expected output together constitute the training dataset. The test dataset contains data covering all relevant permutations and blends, so that the efficiency of the trained models can be assessed as the model goes through an iterative training process. Test validation suites are developed based on the algorithms and the test datasets. Confidence scores can be more meaningful than raw outcomes, and testers can think in terms of success thresholds and ranges.
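Thinking in thresholds and ranges rather than raw outcomes can be made concrete with a small triage rule over confidence scores. The sketch below is a minimal illustration: the function name, the labels, and the 0.9/0.6 cut-offs are assumptions chosen for the example, and real thresholds would be calibrated against the test dataset.

```python
def triage(prediction, confidence, accept_at=0.9, review_at=0.6):
    """Route a model output by its confidence score:
    auto-accept above `accept_at`, send to human review above
    `review_at`, otherwise reject the prediction outright."""
    if confidence >= accept_at:
        return ("accept", prediction)
    if confidence >= review_at:
        return ("review", prediction)
    return ("reject", None)

print(triage("cat", 0.95))  # → ('accept', 'cat')
print(triage("cat", 0.72))  # → ('review', 'cat')
print(triage("cat", 0.41))  # → ('reject', None)
```

Testers can then assert that, over the curated test dataset, the share of predictions falling into each band stays within an agreed range.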
Data Labelling and Bias
Supervised learning techniques mean that data labelling introduces human judgement, and therefore bias, into the dataset. As organisations become more mindful of diversity and inclusion, data bias is becoming a more important topic. However, like the concept of an AI winter, there is additional complexity. If we do not imbue labelled data with human experience, we miss out on experiential knowledge; if we do, then data biases can creep in. A priori testing of the labelled input data identifies concealed patterns, spurious correlations, heteroscedasticity and other potential issues in the dataset. Otherwise, the resulting model will be biased toward the characteristics that have more extensive representation in the dataset.
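A first step in that a priori testing is simply measuring how well each class is represented in the labelled data. The sketch below is illustrative: the function name, the sample labels, and the minimum-share cut-off are assumptions for the example, and it checks only class balance, not the subtler issues (spurious correlations, heteroscedasticity) mentioned above.

```python
from collections import Counter

def representation_report(labels, min_share=0.25):
    """Compute each class's share of the labelled dataset and
    flag classes whose representation falls below `min_share`."""
    counts = Counter(labels)
    total = len(labels)
    shares = {cls: n / total for cls, n in counts.items()}
    flagged = [cls for cls, s in shares.items() if s < min_share]
    return shares, flagged

# Hypothetical labelled loan decisions, skewed toward approvals.
labels = ["approved"] * 8 + ["rejected"] * 2
shares, flagged = representation_report(labels)
print(shares)   # → {'approved': 0.8, 'rejected': 0.2}
print(flagged)  # → ['rejected']
```

Running a report like this before training makes under-represented classes visible early, before the model quietly learns to favour the majority class.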
The role of the AI Translator
AI translators can help users to think more widely about the AI problems at hand. Often, business users have a narrow focus on the problem set, shaped by their understanding of the solution. In itself, this produces bias, and it can skew the solution towards what a user wants to see rather than towards more complicated or unfamiliar solutions.
The ‘AI Winter’ concept is often used as a FOMO tactic to push organisations towards AI. It’s one thing to say that ‘AI is everywhere’ and quite another to say that ‘AI will solve all your problems if you buy this tech’. If you have to ask yourself whether your organisation is ready for an AI implementation, then you are probably not ready for it and it is time to get data-savvy.
If your organisation can’t test or understand the AI output, then you may not have what you need to be successful with the technology. Start with the end in mind and the data throughout, and be mindful that the problem isn’t always the tech; the people are crucial to success, too.
To summarise, while AI may not be left out in the cold completely, there could be a touch of frost in its adoption if organisations do not know how to test the output of artificial intelligence solutions or trust the numbers. That output needs to be translated into something meaningful that business people can understand and trust.