Saturday, March 14, 2026

Boston Consulting Group: To unlock enterprise AI, start with the data you ignore





When building enterprise AI, some companies find that the hardest part is often deciding what to build and how to address the many processes involved.

At VentureBeat's Transform 2025, data quality and management were front and center, as companies look beyond the experimental phase of artificial intelligence and study how to productize and scale agents and other applications.

>> See all our Transform 2025 coverage here

Braden Holstege, managing director and partner at Boston Consulting Group, said organizations are wrestling with how technology intersects with people, processes and design. He added that companies must think through a series of complexities around data exposure, per-person AI budgets, access permissions, and methods for managing external and internal risk.

Sometimes new solutions involve putting previously unusable data to work. Speaking onstage Tuesday afternoon, Holstege gave the example of one customer that used large language models (LLMs) to analyze millions of pieces of customer feedback, product complaints and positive reviews alike, and discovered insights that would not have been possible a few years ago with natural language processing (NLP).
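The article does not describe the client's actual pipeline. As a rough illustration of the general pattern, here is a minimal sketch that batches feedback records into prompts and hands them to a pluggable, hypothetical `complete` function standing in for any LLM API (all function names here are assumptions, not the client's code):

```python
from typing import Callable, Iterable, Iterator


def batch(items: Iterable[str], size: int) -> Iterator[list[str]]:
    """Yield fixed-size batches so millions of records fit into many small prompts."""
    buf: list[str] = []
    for item in items:
        buf.append(item)
        if len(buf) == size:
            yield buf
            buf = []
    if buf:
        yield buf


def build_prompt(feedback: list[str]) -> str:
    """Ask the model to label each record as complaint, praise, or suggestion."""
    numbered = "\n".join(f"{i + 1}. {text}" for i, text in enumerate(feedback))
    return (
        "Label each customer comment below as COMPLAINT, PRAISE, or SUGGESTION, "
        "one label per line:\n" + numbered
    )


def analyze(feedback: Iterable[str],
            complete: Callable[[str], str],
            batch_size: int = 50) -> list[str]:
    """Run every batch through the (hypothetical) LLM call and collect the labels."""
    labels: list[str] = []
    for group in batch(feedback, batch_size):
        reply = complete(build_prompt(group))
        labels.extend(line.strip() for line in reply.splitlines() if line.strip())
    return labels
```

In practice `complete` would wrap a hosted model endpoint; the batching and prompt-building shown here are the parts that stay the same regardless of which provider is used.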

“The broader lesson is that data is not monolithic,” said Holstege. “You have everything from transactional records to documents, customer feedback, data created in the course of building applications, and a million other data types.”

Also on the panel was Susan Etlinger, senior director of strategy and thought leadership for Azure AI.

“When you're in it, you start to get a sense of the art of the possible,” said Etlinger. “It's a balance between that and coming in with a clear sense of what you're trying to solve. Suppose you're trying to solve for customer experience. That may not be the right use case, but you don't always know. You can find something else in the process.”

Why AI-ready data is key to enterprise adoption

AI-ready data is a key step toward adopting AI projects. In a separate Gartner survey, more than half of the 500 enterprise CIOs and technology leaders polled said they expected that adopting AI-ready infrastructure would enable faster and more versatile data processes.

It can be a slow process. Gartner predicts that by 2026, organizations will abandon 60% of AI projects that are not supported by AI-ready data. When the research firm surveyed data management leaders last summer, 63% of respondents said their organizations either did not have adequate data management practices or were unsure whether they did.

As deployments mature, it is important to consider how to address ongoing challenges such as AI model drift over time, said Awais Sher Bajwa, head of data and AI banking at Bank of America. He added that enterprises do not always have to rush: end users are already quite sophisticated in how they think about potential chat-based applications.
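Model drift is only mentioned in passing, so as one concrete illustration (not Bank of America's method), here is a minimal sketch of the population stability index (PSI), a common check that compares a model's recent score distribution against a baseline:

```python
import math


def histogram(scores: list[float], edges: list[float]) -> list[float]:
    """Fraction of scores falling into each bucket defined by the bin edges."""
    counts = [0] * (len(edges) - 1)
    for s in scores:
        for i in range(len(edges) - 1):
            # Last bucket is closed on the right so the max value is counted.
            if edges[i] <= s < edges[i + 1] or (i == len(edges) - 2 and s == edges[-1]):
                counts[i] += 1
                break
    total = max(len(scores), 1)
    return [c / total for c in counts]


def psi(baseline: list[float], recent: list[float], edges: list[float]) -> float:
    """Population stability index; > 0.25 is a common 'significant drift' rule of thumb."""
    eps = 1e-6  # avoid log(0) on empty buckets
    b = histogram(baseline, edges)
    r = histogram(recent, edges)
    return sum((ri - bi) * math.log((ri + eps) / (bi + eps)) for bi, ri in zip(b, r))
```

Run periodically over fresh production scores, a rising PSI is a cheap early signal that the data feeding a model has shifted away from what it was trained on.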

“We are all everyday users of chat applications,” said Sher Bajwa. “Users have become quite sophisticated. When it comes to training, you don't have to push it to end users, but it also means it becomes a very collaborative process. You have to work out the elements of implementation and scaling, and those become the challenge.”

Growing pains and the complexity of AI compute

Companies must also weigh the possibilities and challenges of cloud, on-premises and hybrid deployments. Sher Bajwa said cloud AI deployments allow testing different technologies and scaling in a more abstracted way. He added, however, that companies must consider various infrastructure issues, such as security and cost, and that suppliers such as Nvidia and AMD are making it easier for companies to test different models and different deployment approaches.

Holstege said that decisions about cloud providers have become more complicated than they were a few years ago. While newer options such as neoclouds (providers offering GPU-backed servers and virtual machines) can sometimes be cheaper alternatives to traditional hyperscalers, he noted that many customers will likely deploy AI where their data already lives, which will temper major infrastructure changes. But even with cheaper alternatives, Holstege sees trade-offs among compute, cost and optimization. For example, he pointed out that open-source models such as Llama and Mistral may have higher compute requirements.

“Does the compute cost make it worth taking on the headache of using open-source models and migrating data?” Holstege asked. “It's just that the range of choices people are confronted with is much wider than it was three years ago.”
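Holstege's trade-off can be made concrete with back-of-the-envelope arithmetic. Every figure below is an invented placeholder, not a quoted price; the point is only the shape of the calculation, weighing a one-time migration cost against a recurring per-token saving:

```python
def breakeven_months(migration_cost: float,
                     tokens_per_month: float,
                     closed_price_per_mtok: float,
                     open_hosting_price_per_mtok: float) -> float:
    """Months until a one-time migration cost is repaid by per-token savings.

    Returns infinity if the open-source option is not actually cheaper per token.
    """
    saving_per_mtok = closed_price_per_mtok - open_hosting_price_per_mtok
    if saving_per_mtok <= 0:
        return float("inf")
    monthly_saving = tokens_per_month / 1_000_000 * saving_per_mtok
    return migration_cost / monthly_saving


# Hypothetical numbers for illustration only.
months = breakeven_months(
    migration_cost=250_000,           # engineering + data migration, one-time
    tokens_per_month=2_000_000_000,   # 2B tokens/month workload
    closed_price_per_mtok=8.00,       # $ per million tokens, hosted closed model
    open_hosting_price_per_mtok=3.00, # $ per million tokens, self-hosted open model
)  # → 25.0 months
```

At low volumes the break-even horizon stretches toward infinity, which is one simple way to see why smaller workloads often stay on hosted closed models while only the largest ones justify the migration headache.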

