Marketers Need Granular Transparency Behind Machine Learning Predictions
Demand for explainable AI has started to ramp up over the last year from a variety of perspectives. DARPA, the Defense Advanced Research Projects Agency, for example, has been calling for more explainable machine learning models that human users can understand and trust. Likewise, the General Data Protection Regulation going into effect in 2018 requires European financial institutions to be able to explain why a consumer was denied credit. And then there are the ethical considerations of ensuring machine learning algorithms aren't inadvertently excluding certain consumers based on ethnicity, gender, or race. Without methods that provide transparency, these are significant issues.
Explainability is, of course, critical to marketing as well. If you are predicting that someone is likely to churn, for example, how would you know what to offer them to stay, or even whether they are worth retaining, if you don't know why they are leaving? How would you know what specific offer to make if you don't know why an algorithm is predicting that a customer is ready to buy rather than just researching? If you don't know the context within which a customer is engaging with your brand, how can you be relevant in these precious moments? A simple sketch of this idea follows below.
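To make that point concrete, here is a minimal, hypothetical sketch of how a per-customer churn prediction, together with the factor driving it, could translate into a specific retention action. The factor names, offers, and threshold are illustrative assumptions, not a prescribed implementation.

```python
# A hypothetical sketch of acting on a known churn driver.
# The driver names and offers below are illustrative assumptions, not a real product mapping.
from dataclasses import dataclass

@dataclass
class ChurnPrediction:
    customer_id: str
    churn_probability: float
    top_driver: str  # the factor the model says contributes most to THIS prediction

# Assumed mapping from churn driver to a retention action
OFFER_BY_DRIVER = {
    "price_sensitivity": "targeted discount",
    "support_friction": "priority support outreach",
    "low_engagement": "re-engagement content series",
}

def next_best_action(pred: ChurnPrediction, retention_threshold: float = 0.5) -> str:
    """Choose an offer only when churn risk is high and the driver is actionable."""
    if pred.churn_probability < retention_threshold:
        return "no action"
    return OFFER_BY_DRIVER.get(pred.top_driver, "generic retention offer")

print(next_best_action(ChurnPrediction("cust-42", 0.78, "support_friction")))
# -> "priority support outreach"
```

Without the per-customer driver, the best a marketer can do in this scenario is the generic retention offer; with it, the action can match the actual reason the customer is leaving.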
Early adopters that learn how to apply machine learning technologies that reveal rich new insights through granular-level transparency will inherently create competitive advantage.
Currently, most machine learning explainability is global in nature: only a broad explanation of which data inputs the models weight overall is provided, along with expert perspectives from data scientists who are accustomed to working with the specific data sets involved. That is better than nothing, but marketers can engage with far more precision when the prediction factors are known at an individual, local prediction level.
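As an illustration of the difference, the following Python sketch contrasts a global view (overall feature importances) with a local, per-customer explanation, here using the open-source SHAP library on a small synthetic churn data set. The feature names, synthetic data, and model choice are assumptions made for the example, not a specific vendor's approach.

```python
# A minimal sketch of global vs. local (per-prediction) explainability using SHAP.
# Feature names and data are synthetic assumptions for illustration only.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical customer features
rng = np.random.default_rng(0)
X = pd.DataFrame({
    "days_since_last_purchase": rng.integers(1, 365, 500),
    "support_tickets_90d": rng.poisson(1.5, 500),
    "avg_order_value": rng.uniform(20, 300, 500),
    "email_click_rate": rng.uniform(0, 1, 500),
})
# Synthetic churn labels, loosely tied to recency and support friction
y = ((X["days_since_last_purchase"] > 180) | (X["support_tickets_90d"] > 3)).astype(int)

model = GradientBoostingClassifier().fit(X, y)

# Global view: which features the model weights overall, across all customers
print("Global feature importances:", dict(zip(X.columns, model.feature_importances_)))

# Local view: why the model scored ONE specific customer the way it did
explainer = shap.TreeExplainer(model)
customer = X.iloc[[0]]
shap_values = explainer.shap_values(customer)
for feature, contribution in zip(X.columns, np.ravel(shap_values)):
    print(f"{feature}: {contribution:+.3f}")
```

The global importances answer "what does the model care about in general?"; the local contributions answer "why did the model score this particular customer this way?", which is the level of detail a marketer needs to tailor the next interaction.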
Transparency is possible today. Without question, machine learning is far more efficient than traditional statistical modeling, but the lack of transparency has become a real cost for marketers. From a campaign automation perspective, the performance gains from machine learning without explainability are already significant. From a customer perspective, however, where the goal is relevant, in-context dialogs and specific recommendations and offers that match what a customer wants in that moment, there are still significant gains to be made.
Additionally, transparency into the predictive value of specific data elements, correlations, and aggregations in the context of every customer can yield even greater gains in customer intelligence and competitive advantage, as well as new and innovative market opportunities. Machine learning with individual prediction transparency is possible now and offers marketers a powerful weapon. Early adopters that learn how to apply machine learning technologies that reveal rich new insights through granular, local prediction-level transparency will inherently create competitive advantage.