The Challenges for Companies

GenAI has dominated the consumer market at a record pace. In 2023, consumers spent over a billion dollars on paid AI services.

Will this year belong to business? Probably yes. Estimates suggest several billion dollars in investments. Last year, many companies began experimenting with generative AI, but it was limited to a few use cases.

It was quite common to see products launched on the market that were essentially just ChatGPT wrapped up in different packaging. I've observed this among the startups that received grants from OpenAI.

A few years ago, when generative AI hadn't yet become mainstream, I studied these topics under Andrew Ng. Among many courses, generative text and graphic models held a significant place. They were interesting and promising. However, I didn't think they would spark a revolution on such a scale.

2024: AI in Business

Last year, most enterprise AI spending came from "innovation" funds and other special budgets. In 2024, these are being allocated as part of IT work.

According to a study by Andreessen Horowitz, the share of spending from innovation budgets will drop to about 25%, even as absolute spending increases. With these increases, the real challenge is, and will remain, building the financial assumptions (business cases) for projects. A similar situation occurred with big-data and data-science projects, and with attempts to monetize data, so this is a continuation, albeit in a slightly different, new area.

Implementation and Deployment

When a company gets through the business case stage, it faces the challenge of project implementation. As I wrote in the book "Data-Driven Transformation" https://books.chiefdataofficer.pl/, certain conditions are essential for the successful implementation of data-based projects. With AI, it’s not any easier.

Implementation and scaling require the right team, which is often lacking. Having access to an API is not enough to build and deploy solutions. Implementation accounts for three-quarters of the cost and complexity of a project; the cost of models and technology is the smaller part. Often, a solution must be assembled from several algorithms and a significant amount of data. For this reason, models acquired from multiple providers matter, as does open source, which is increasingly dominant.

Some people are surprised by this, but open source's winning streak has long been visible in the data ecosystem. That's why well-known commercial platforms are often marginalized by open-source alternatives.

The cost difference between open-source and commercial solutions mattered at the beginning of this wave. Beyond cost, open components provide flexibility, faster access to new features, and support across deployment environments.

Data Security in the Context of AI

Using external services during implementations carries the risk of information leaks. Relying on automation adds operational and reputational risks. Companies still don’t feel comfortable sharing their own data with closed model providers due to regulatory or data security concerns. Entities whose intellectual value is key to their business model are particularly conservative.

Common use of RAG

Companies are customizing models rather than building them from scratch. In 2023, there was a discussion about building custom models, like BloombergGPT. In 2024, many solutions are using the retrieval-augmented generation (RAG) approach or customizing an open-source model to meet their specific needs.
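The RAG approach mentioned above can be sketched in a few lines: retrieve the passages most relevant to a question, then feed them to the model inside the prompt. The sketch below is a toy illustration under stated assumptions — it uses a simple word-overlap score as a stand-in for vector similarity, and the prompt would be sent to whatever LLM provider the company uses.

```python
# Minimal RAG sketch: retrieve relevant passages, then build an augmented prompt.
# Word-overlap scoring stands in for the embedding similarity a real system would use.
import math
from collections import Counter

def tokenize(text):
    return [w.strip(".,?!;").lower() for w in text.split()]

def score(query, passage):
    # Bag-of-words overlap, length-normalized: a crude proxy for vector similarity.
    q, p = Counter(tokenize(query)), Counter(tokenize(passage))
    return sum((q & p).values()) / math.sqrt(len(p) + 1)

def retrieve(query, corpus, k=2):
    # Return the k passages most similar to the query.
    return sorted(corpus, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query, corpus):
    # Ground the model's answer in retrieved context only.
    context = "\n".join(retrieve(query, corpus))
    return (f"Answer using only the context below.\n\n"
            f"Context:\n{context}\n\nQuestion: {query}")

corpus = [
    "Our refund policy allows returns within 30 days of purchase.",
    "The warehouse ships orders within 24 hours.",
    "Customer support is available Monday to Friday, 9-17 CET.",
]
prompt = build_prompt("What is the refund policy?", corpus)
```

In production, the toy scorer is replaced by an embedding model plus a vector store, but the shape of the pipeline — retrieve, assemble context, generate — stays the same.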

Many decisions involve the cloud. There is a correlation between the cloud provider used and the preferred model: Azure users typically prefer OpenAI, while Amazon users choose Anthropic or Cohere.

A fascinating trend is emerging—developers are designing their applications so that switching between LLM providers happens quickly. This ensures additional reliability and security.
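One common way to make provider switching quick is a thin abstraction layer: call sites depend on a shared interface, and the concrete backend is chosen by configuration. The provider classes below are illustrative placeholders, not real SDK calls.

```python
# Sketch of a provider-abstraction layer: swapping LLM backends becomes a
# config change rather than a code change. Provider classes are hypothetical.
from abc import ABC, abstractmethod

class LLMClient(ABC):
    @abstractmethod
    def complete(self, prompt: str) -> str:
        ...

class ProviderA(LLMClient):
    def complete(self, prompt: str) -> str:
        # A real implementation would call this provider's API here.
        return f"[provider-a] {prompt}"

class ProviderB(LLMClient):
    def complete(self, prompt: str) -> str:
        return f"[provider-b] {prompt}"

REGISTRY = {"a": ProviderA, "b": ProviderB}

def get_client(name: str) -> LLMClient:
    # The backend name typically comes from configuration or an env variable.
    return REGISTRY[name]()

client = get_client("a")
answer = client.complete("Summarize the quarterly report.")
```

Because the application only ever sees `LLMClient`, a provider outage or price change can be handled by routing traffic to another backend without touching business logic.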

Operationalization

The most challenging aspect (similar to the deployment of previous complex data science solutions) seems to be operationalization.

Only the small group of companies that make it past the experimentation stage knows this. It is in production that we learn about all the problems with our solution. This is the real proving ground, which, in a successful scenario, has the potential to build a team that can deploy similar projects.

Model industrialization is currently happening for internal use cases, i.e., for employees. Applying it to external customers seems risky and more difficult.

Emerging Applications

AI is not just about conversations and creating messages. Applications are emerging that aim to unlock unstructured data. Thanks to AI, we gain access to information locked in data collected over many years. Now it can be extracted, quantified, and used for analysis.
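Unlocking unstructured data usually means turning free text into records a database or BI tool can consume. The sketch below uses a regex stand-in to play the role an LLM extraction prompt would play in a real pipeline; the note text and field names are invented for illustration.

```python
# Sketch: converting an unstructured inspection note into an analyzable record.
# In a real pipeline an LLM would do the extraction; regexes stand in here.
import re

NOTE = "Inspected batch 4312 on 2024-03-05; moisture 12.4%, minor packaging defects."

def extract(note: str) -> dict:
    # Pull out the structured fields hidden in the free text.
    return {
        "batch": re.search(r"batch (\d+)", note).group(1),
        "date": re.search(r"\d{4}-\d{2}-\d{2}", note).group(0),
        "moisture_pct": float(re.search(r"moisture ([\d.]+)%", note).group(1)),
    }

record = extract(NOTE)
```

Once notes like this become rows with typed fields, years of accumulated text can feed ordinary analytics and data-quality checks.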

AI also supports data quality. I’ve successfully applied these methods in the food-tech industry.

All these changes amount to a revolution. I was waiting for it and preparing for it, but its dynamism still surprised me.

This is a huge opportunity for startups that, by understanding business and problems, can successfully assist corporations in AI initiatives.

Such cooperation is good for the client, enabling a shift from a service-based approach to building scalable products.

We’re all learning this.

Author: Mariusz Jażdżyk

FirstScore