The three most important emerging AI trends in data analytics

AI is getting a lot of hype now, but where is it going? Here are three trends to keep watching.

It’s been a crazy year for data analytics, and three major trends have emerged that will change the practice forever. Until this year, we had concepts like “big data” that encouraged a lot of storage, but were very light on what the heck to do with the massive unstructured data being collected.

Then came data analytics, which was all about analyzing that data but didn't address the big problem: you needed data scientists (who were in short supply) with the unique ability to talk to business executives (a skill that was almost non-existent). Then came digital transformation, a much broader term that focuses on what you need to do but isn't clear about why you need to do it. Now we're all working with AI, which can finally make everything that came before it work.

Let’s take a look at the top three trends in data analytics.

Trend #1: Increased emphasis on conversational AI and large language models

This trend directly points to the need to bypass data scientists and create a better interface so that business managers can ask the system directly for what they need. When properly trained and implemented, these tools can provide reports and detailed answers in minutes, something that would normally take weeks or even months.
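To make the idea concrete, here is a toy sketch of such a conversational interface: a manager asks a question in plain English and gets results back without routing the request through a data scientist. In a real system a large language model would translate the question into a query; the keyword-based translate_question() below is a hypothetical stand-in for that step, and the table and question are invented for illustration.

```python
# Toy sketch of a conversational analytics interface. In production an
# LLM would translate the question into SQL; translate_question() is a
# hypothetical keyword-based stand-in used only for illustration.
import sqlite3


def translate_question(question: str) -> str:
    """Hypothetical stand-in for an LLM: maps a plain-English question to SQL."""
    q = question.lower()
    if "revenue" in q and "region" in q:
        return "SELECT region, SUM(revenue) FROM sales GROUP BY region"
    raise ValueError("Question not understood")


def ask(conn: sqlite3.Connection, question: str) -> list:
    """Answer a business question directly against the data warehouse."""
    return conn.execute(translate_question(question)).fetchall()


# Minimal in-memory stand-in for the company's data warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("East", 100.0), ("West", 250.0), ("East", 50.0)])

print(ask(conn, "What is revenue by region?"))
```

The point of the sketch is the shape of the workflow, not the translation logic: the manager supplies a question, the system supplies both the query and the answer, and the turnaround is seconds rather than weeks.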

While the future will bring a greater ability for AI to learn from and adapt to the user, current systems are primarily inference machines: they treat the user more generically and require a higher level of training. Even so, the overall result is still faster than most, though not all, implementations that require a data scientist.

Trend #2: Enterprises need assurance of indemnified data sources

As AI-enabled applications performing analytics evolve, it is increasingly critical that training and production data sets be unbiased and uncorrupted. Training or production data sets that are biased or simply out of date can cause the system to make bad recommendations and worse decisions. Ensuring the safety of the data involves both a legal process (requiring the company to ensure that the data in the repository is not owned by someone else who could take exception to its use) and some form of indemnification.

However, the use of indemnification is not consistent, as some of the more mature firms indemnify their clients and some of the other firms seek indemnification from their clients. The latter has proven somewhat problematic, as not everyone who agrees to this indemnity gets the proper approval from their legal department or outside counsel.

Trend #3: Hybrid AI is growing in popularity

AI is very expensive to run in the cloud because it consumes significant processing and storage resources. Shifting the load to the client frees up those resources and allows for faster results, with some loss of learnability and customization, because clients typically use a compressed data set and inference models more limited than those of a cloud deployment. However, most cloud deployments already use limited inference models to reduce operational costs, so the lack of flexibility (at least with current technologies and customer implementations) is more a theoretical problem than an actual one.
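The routing decision behind hybrid AI can be sketched in a few lines: run a request on the local, client-side model when it is capable enough, and fall back to the cloud only when the request exceeds the compressed local model's limits. Everything here, the Request type, the token threshold, the needs_fresh_data flag, is an illustrative assumption, not any vendor's actual policy.

```python
# Toy sketch of hybrid AI routing: prefer the cheap, fast on-device
# model (running on an NPU) and escalate to the cloud only when the
# local model can't handle the request. All names and thresholds are
# illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Request:
    prompt: str
    needs_fresh_data: bool = False   # e.g. live dashboards the local copy lacks


LOCAL_MAX_PROMPT_TOKENS = 512        # assumed limit of the compressed local model


def route(req: Request) -> str:
    tokens = len(req.prompt.split())          # crude token estimate
    if req.needs_fresh_data or tokens > LOCAL_MAX_PROMPT_TOKENS:
        return "cloud"                        # full model, higher operating cost
    return "local"                            # NPU: cheaper, faster, offline-capable


print(route(Request("summarize last quarter's sales")))    # local
print(route(Request("x " * 600)))                          # cloud: too long
```

The economic argument in the paragraph above falls out of this split: every request answered on the "local" branch is cloud compute the provider no longer has to pay for.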

This is driving processor and platform companies, and even AI companies like OpenAI, to develop neural processing units (NPUs) that can be put into desktop systems to better optimize them for this new hybrid reality. This hybrid trend started in 2023, but it won't really reach its potential until after 2024, when new and much more powerful NPUs are expected to hit the market.

Final thoughts

When they reach maturity (probably later this decade), all three of these trends will deliver the kind of benefits, in terms of actionable information and recommendations, that data analytics has always promised but rarely delivered. The problem that still needs to be overcome is getting people to trust AI and not be afraid to use it. That will take additional time, especially given that much of what these systems currently produce comes from corrupted datasets, poorly trained models, or incomplete implementations.

I expect most of these initial teething problems to disappear by the end of the decade. Until then, given how rapidly this technology is developing, make sure you have a vendor or consultant who is an expert in the technology and in what you need; that expertise should make a huge difference in the success of your AI-driven data analytics solution.

About the author

Rob Enderle is president and principal analyst at Enderle Group, where he provides guidance to regional and global companies on how to create a reliable dialogue with the market, target customer needs, create new business opportunities, anticipate technological changes, select suppliers and products, and practice zero dollar marketing. You can contact the author by email.
