Cognilytica Research’s January 2018 briefing note on simMachines highlights the “black box” challenge of today’s AI and machine learning technologies and the fundamental problems caused by a lack of explainability.
Today’s digitally empowered customers have high expectations for relevant, highly personalized customer experiences. Companies must keep pace with these growing demands or risk being left behind by more perceptive competitors.
Similarity is a machine learning method that uses a nearest neighbor approach to identify the similarity of two or more objects to each other based on algorithmic distance functions.
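To make the nearest-neighbor idea concrete, here is a minimal sketch in plain Python. This is not simMachines' actual implementation; the customer feature vectors and the choice of Euclidean distance are illustrative assumptions. The key point is that the neighbors themselves serve as the explanation: a prediction can be justified by pointing to the specific similar examples that drove it.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def k_nearest(query, points, k=2, distance=euclidean):
    """Return the k points closest to `query` under the given distance function."""
    return sorted(points, key=lambda p: distance(query, p))[:k]

# Hypothetical customer feature vectors (age, monthly spend) for illustration.
customers = [(25, 40.0), (30, 55.0), (62, 10.0), (28, 48.0)]

# The two most similar customers to a new (27, 50.0) profile.
neighbors = k_nearest((27, 50.0), customers, k=2)
```

Because the distance function is a parameter, a different notion of similarity (cosine, weighted, learned) can be swapped in without changing the retrieval logic.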
Market segmentation is one of the most basic arms of business strategy. Firms bundle customers to understand their preferences, manage relationships with them, improve product and service offerings, and assess risk.
Most marketers are just beginning to explore machine learning applications, yet machine learning is already delivering substantial gains in analytic efficiency and precision.
Demand for explainable AI over the last year has started to ramp up from a variety of perspectives. DARPA, the Defense Advanced Research Projects Agency, for example, has been calling for more explainable machine learning models that human users can understand and trust.
The EU’s General Data Protection Regulation (GDPR), which takes effect in 2018, requires that algorithms used to make credit or insurance decisions be explainable to consumers.
For financial services firms, the fact that artificial intelligence algorithms are “black boxes” that cannot explain their decisions is a major problem due to legislative requirements.
The number one trend identified in the 2017 Retail Banking Trends and Predictions was a renewed focus on the customer experience. The report discusses personalizing customer communications at scale and leveraging AI to maintain a competitive edge while realizing significant efficiency gains.