Much has been written about the uncertainty caused by the coronavirus and how businesses are tapping into big data and advanced analytics to find clear direction for the future. Predictive analytics is helping companies plan for and assess business questions such as potential revenue impacts (where demand is likely to increase or decrease) and competitive impacts (which other products may affect sales), as well as the reasons why.
Growing at a CAGR of 21.7%, the market for predictive analytics reflects the massive amount of data now being created, new statistical and modeling techniques, and, of course, the advances in AI that make predictive analytics possible at scale.
But not all predictive analytics platforms are created equal…
What to look for in choosing a predictive analytics platform
Number of data sets. Most predictive analytics platforms look only at historical sales data and other internal sources. Enriching that data with contextualized external data sets that reflect the voice of the consumer, key opinion leaders, product reviews, product ratings, competitive indicators and other signals of innovation can lift accuracy from around 35% to more than double that.
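In practice, this enrichment is often just a join between internal sales history and an external signal feed keyed on the same product and time period. Here is a minimal sketch, assuming both are available as CSV extracts; the file names and column names are hypothetical.

```python
import pandas as pd

# Hypothetical inputs: internal sales history plus an external
# "voice of the consumer" feed (reviews, ratings, KOL mentions).
sales = pd.read_csv("internal_sales.csv")      # product_id, week, units_sold
signals = pd.read_csv("external_signals.csv")  # product_id, week, avg_rating, review_volume, kol_mentions

# Enrich the internal data set with external signals so the model
# can learn from consumer context as well as past sales.
enriched = sales.merge(signals, on=["product_id", "week"], how="left")

features = enriched[["units_sold", "avg_rating", "review_volume", "kol_mentions"]].fillna(0)
```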
Timeliness and stability of data sets. Time does not stand still, and predictions must be continually refreshed. But this is not so simple. When a predictive model is created, it is trained on a particular data set, and the characteristics of that data can change, as they often do in the unstructured wild west of the internet. The system therefore needs to recognize these shifts and continually refresh the model.
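One common way to recognize that a data set's characteristics have shifted is to compare the distribution of incoming data against the data the model was trained on and trigger a refresh when they diverge. The sketch below uses a two-sample Kolmogorov-Smirnov test for this; the column names and threshold are illustrative assumptions, not a prescribed setup.

```python
import pandas as pd
from scipy.stats import ks_2samp

def needs_refresh(train_df, live_df, columns, p_threshold=0.01):
    """Flag a model for retraining when any monitored feature's
    distribution has drifted away from the training data."""
    for col in columns:
        stat, p_value = ks_2samp(train_df[col].dropna(), live_df[col].dropna())
        if p_value < p_threshold:  # distributions differ significantly
            return True
    return False

# Hypothetical usage: compare last week's data to the training snapshot.
# if needs_refresh(train_df, live_df, ["review_volume", "avg_rating"]):
#     retrain_model()
```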
Strength of the NLP and taxonomies. As noted above, a key element of accurate predictive analytics is the ability to connect multiple external, unstructured data sources to the model. The problem is that these sources are typically filled with hyperbole, incorrect grammar and syntax, emojis, word ambiguity and even multiple languages. For this reason, sentiment analysis has mostly hovered at around an 80% accuracy rate; however, new developments in machine learning and AI have produced breakthroughs that push accuracy above 90%. Applying taxonomies based on subject-matter expertise also helps ensure that the right data is being fed into the model.
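To make both ideas concrete, here is a minimal sketch: an off-the-shelf transformer sentiment model combined with a simple subject-matter taxonomy applied as keyword tags. The taxonomy terms and the example review are illustrative only; a production system would use far richer, expert-built taxonomies.

```python
from transformers import pipeline

# Off-the-shelf sentiment model; transformer-based models are what
# pushed accuracy on noisy, informal text past older approaches.
sentiment = pipeline("sentiment-analysis")

# A toy subject-matter taxonomy: map raw mentions to the concepts
# the predictive model actually cares about (terms are illustrative).
taxonomy = {
    "packaging": ["box", "bottle", "wrapper"],
    "ingredients": ["sugar-free", "organic", "vegan"],
}

def tag_review(text):
    concepts = [concept for concept, terms in taxonomy.items()
                if any(term in text.lower() for term in terms)]
    return {"sentiment": sentiment(text)[0], "concepts": concepts}

print(tag_review("Love that it's sugar-free, but the bottle leaks 😠"))
```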
Flexibility of models and algorithms. Multiple models and algorithms should be available, depending on the business question being asked. A clearly defined business question determines the data sources that are required, the taxonomies to be applied and, finally, the type of analysis to be conducted (e.g., linear regression, k-means clustering, recurrent neural networks).
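The sketch below, using scikit-learn on made-up data, shows how two different business questions map to two different techniques; it is an illustration of the principle, not a recommendation of specific models.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Illustrative data only: 100 products, 4 enriched features each.
X = rng.normal(size=(100, 4))
units_sold = X @ np.array([3.0, 1.5, -2.0, 0.5]) + rng.normal(size=100)

# "How many units will we sell?"   -> a regression question.
demand_model = LinearRegression().fit(X, units_sold)

# "Which consumer segments exist?" -> a clustering question.
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
```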
Visualizations and integrations. Finally, it is important for a predictive analytics platform to generate visualizations that allow the output to be understood and integrated into the organization's decision-making process. This may be as simple as generating different prediction scenarios or highlighting indicators of growth drivers, or as comprehensive as forecasts built into business intelligence systems.
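Even the simplest of these, comparing prediction scenarios side by side, goes a long way. A minimal sketch with matplotlib is below; the scenario figures are made up purely for illustration.

```python
import matplotlib.pyplot as plt

quarters = ["Q1", "Q2", "Q3", "Q4"]
# Made-up forecast scenarios for illustration only.
scenarios = {
    "baseline":   [100, 105, 112, 118],
    "optimistic": [100, 110, 124, 140],
    "downside":   [100,  98,  95,  93],
}

for name, values in scenarios.items():
    plt.plot(quarters, values, marker="o", label=name)

plt.title("Forecast scenarios (illustrative)")
plt.ylabel("Projected units (indexed)")
plt.legend()
plt.show()
```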
————————————–
*This blog post originally appeared on Signals-Analytics.com. Kenshoo acquired Signals-Analytics in December 2020. Read the press release.