This article is part of VentureBeat's special issue, “The Real Cost of AI: Performance, efficiency and ROI at scale.” Read more from this special issue.
Over the past two decades, enterprises have had a choice between open-source and closed proprietary technologies.
The original choice for enterprises was primarily about operating systems, with Linux offering an open-source alternative to Microsoft Windows. In the developer realm, open languages such as Python and JavaScript dominate, while open-source technologies, including Kubernetes, have become standards in the cloud.
The same type of open vs. closed choice now faces enterprises for AI, with multiple options for both types of models. On the proprietary closed-model front are some of the biggest, most widely used models on the planet, including those from OpenAI and Anthropic. On the open-source side are models such as Meta's Llama, IBM Granite, Alibaba's Qwen and DeepSeek.
Understanding when to use an open or closed model is a critical choice for enterprise AI decision makers in 2025 and beyond. The choice has both financial and customization implications that enterprises must understand and consider.
Understanding the difference between open and closed licenses
There is no shortage of hyperbole around the decades-old competition between open and closed licenses. But what does it all actually mean for enterprise users?
A closed-source proprietary technology, like OpenAI's GPT-4o for example, does not have its model code or weights openly available for anyone to see. The model is not easily available to be fine-tuned and, generally speaking, it is only available for real enterprise use at a cost (to be sure, ChatGPT has a free tier, but that won't cut it for a real enterprise workload).
An open technology, like Meta's Llama, IBM Granite or DeepSeek, has openly available code. Enterprises can use the models freely, generally without restriction, including fine-tuning and customization.
Rohan Gupta, a principal with Deloitte, told VentureBeat that the open vs. closed source debate isn't unique or native to AI, nor is it likely to be resolved anytime soon.
Gupta explained that closed-source providers typically offer several wrappers around their model that enable ease of use, simplified scaling, more seamless upgrades and downgrades and a steady stream of improvements. They also provide significant developer support, including documentation as well as hands-on advice, and often deliver tighter integrations with both infrastructure and applications. In exchange, an enterprise pays a premium for these services.
“Open-source models, on the other hand, can provide greater control, flexibility and customization options, and are supported by a vibrant, enthusiastic developer ecosystem,” Gupta said. “These models are increasingly accessible via fully managed APIs across cloud vendors, broadening their distribution.”
Making a choice between open and closed models for enterprise AI
The question that many enterprise users might ask is which is better: an open or a closed model? The answer is not necessarily one or the other.
“We don't view this as a binary choice,” David Guarrera, generative AI leader at EY Americas, told VentureBeat. “Open vs. closed is increasingly a fluid design space, where models are selected, or even automatically orchestrated, based on tradeoffs between accuracy, latency, cost, interpretability and security at different points in a workflow.”
Guarrera noted that closed models limit how deeply organizations can optimize or adapt behavior. Proprietary model vendors often restrict fine-tuning, charge premium rates or hide the process in black boxes. While API-based tools simplify integration, they abstract away much of the control, making it harder to build highly specific or interpretable systems.
In contrast, open-source models allow for targeted fine-tuning, guardrail design and optimization for specific use cases. This matters more in an agentic future, where models are no longer monolithic general-purpose tools, but interchangeable components within dynamic workflows. The ability to finely shape model behavior, at low cost and with full transparency, becomes a major competitive advantage when deploying task-specific agents or tightly regulated solutions.
“In practice, we anticipate an agentic future where model selection is abstracted away,” Guarrera said.
For example, a user might draft an email with one AI tool, summarize legal documents with another, search company documents with a fine-tuned open-source model and interact with AI locally through an on-device LLM, all without ever knowing which model is doing what.
“The real question becomes: what blend of models best suits your workflow's specific demands?” Guarrera said.
Considering the total cost of ownership
With open models, the basic idea is that the model is freely available to use. In contrast, enterprises always pay for closed models.
The reality when it comes to considering total cost of ownership (TCO) is more nuanced.
Praveen Akkiraju, managing director at Insight Partners, explained to VentureBeat that TCO has many different layers. A few key considerations include infrastructure hosting costs and engineering: are the open-source models self-hosted by the enterprise or by a cloud vendor? How much engineering, including fine-tuning, guardrails and security testing, is needed to operationalize the model safely?
Akkiraju noted that fine-tuning an open-weights model can also sometimes be a very complex undertaking. Closed frontier model companies expend enormous engineering effort to ensure performance across multiple tasks. In his view, unless enterprises deploy similar engineering expertise, they will face a complex balancing act when fine-tuning open-source models. This creates cost implications when organizations choose their model deployment strategy; for example, an enterprise can fine-tune multiple model versions for different tasks or use one API for multiple tasks.
Ryan Gross, head of data and applications at cloud native services provider Caylent, told VentureBeat that from his perspective, licensing terms don't matter much, except in edge-case scenarios. The largest restrictions often pertain to model availability when data-residency requirements are in place. In this case, deploying an open model on infrastructure like Amazon SageMaker may be the only way to get a state-of-the-art model that still complies. As for TCO, Gross noted that the tradeoff lies between per-token costs and hosting and maintenance costs.
“There is a clear breakeven point where the economics switch from closed to open models,” Gross said.
In his view, for most organizations, closed models, with the hosting and scaling handled on the organization's behalf, will have a lower TCO. However, for large enterprises and SaaS companies with very high demand on their LLMs but simpler use cases that don't require frontier performance, hosting distilled open-source models can be more cost-effective.
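Gross's breakeven point can be illustrated with a back-of-the-envelope calculation. The sketch below is hypothetical, not vendor pricing: it compares a closed model billed per token against a self-hosted open model with fixed monthly hosting and engineering costs, all figures invented for illustration.

```python
# Back-of-the-envelope TCO comparison (all figures are hypothetical).

def monthly_cost_closed(tokens_per_month: float, price_per_million: float) -> float:
    """Closed model: pure usage-based API pricing."""
    return tokens_per_month / 1_000_000 * price_per_million

def monthly_cost_open(hosting_per_month: float, engineering_per_month: float) -> float:
    """Self-hosted open model: fixed infrastructure plus engineering overhead."""
    return hosting_per_month + engineering_per_month

def breakeven_tokens(price_per_million: float, hosting: float, engineering: float) -> float:
    """Monthly token volume above which self-hosting becomes cheaper."""
    return (hosting + engineering) / price_per_million * 1_000_000

# Illustrative inputs: $5 per million tokens vs. $8,000/month hosting
# plus $12,000/month of engineering time for tuning and maintenance.
volume = breakeven_tokens(price_per_million=5.0, hosting=8_000, engineering=12_000)
print(f"Breakeven at {volume / 1e9:.1f}B tokens/month")
```

Below the breakeven volume, the usage-based closed model is cheaper; above it, the fixed self-hosting costs amortize and the open model wins, which matches Gross's point that the economics switch only at high demand.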
How one enterprise software developer evaluated open vs. closed models
Josh Bosquez, CTO at Second Front Systems, is at one of the many firms that have had to consider and evaluate open vs. closed models.
“We use both open and closed AI models, depending on the specific use case, security requirements and strategic objectives,” Bosquez told VentureBeat.
Bosquez explained that open models allow his firm to integrate the latest capabilities without the time or cost of training models from scratch. For internal experimentation or rapid prototyping, open models help his company iterate quickly and benefit from community-driven advances.
“Closed models, on the other hand, are our choice when data sovereignty, enterprise-grade support and security guarantees are essential, particularly for customer-facing applications or deployments involving sensitive or regulated environments,” he said. “These models often come from trusted vendors who offer strong performance, compliance support and self-hosting options.”
Bosquez said the model selection process is cross-functional and risk-informed, evaluating not only technical fit, but also data-handling policies, integration requirements and long-term scalability.
Looking at TCO, he noted that it varies significantly between open and closed models, and that neither approach is universally cheaper.
“It depends on the deployment scope and organizational maturity,” Bosquez said. “Ultimately, we evaluate TCO not just in dollars spent, but in delivery speed, compliance risk and the ability to scale securely.”
What this means for enterprise AI strategy
For savvy technical decision-makers evaluating AI investments in 2025, the open vs. closed debate isn't about picking sides. It's about building a strategic portfolio approach that optimizes for different use cases across the organization.
The immediate action items are straightforward. First, audit your current AI workloads and map them against the decision framework outlined by the experts, considering accuracy requirements, latency needs, cost constraints, security demands and compliance obligations for each use case. Second, honestly assess your organization's engineering capabilities for model fine-tuning, hosting and maintenance, as this directly impacts your true total cost of ownership.
Third, begin experimenting with model orchestration platforms that can automatically route tasks to the most appropriate model, whether open or closed. This positions your organization for the agentic future that industry leaders like EY's Guarrera predict, where model selection becomes invisible to end users.
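The routing logic such an orchestration layer applies can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; the model categories and policy rules are invented to mirror the tradeoffs the experts describe (data sovereignty, frontier capability, latency, cost).

```python
# Minimal sketch of rule-based model routing. The model categories and
# policy thresholds are illustrative assumptions, not recommendations.
from dataclasses import dataclass

@dataclass
class Task:
    sensitive_data: bool    # regulated or customer data involved?
    needs_frontier: bool    # requires frontier-level capability?
    latency_critical: bool  # must run locally with minimal latency?

def route(task: Task) -> str:
    """Pick an open or closed model category based on simple policy rules."""
    if task.sensitive_data:
        return "self-hosted open model"    # keep data inside your own infrastructure
    if task.needs_frontier:
        return "closed frontier API"       # pay a premium for top-end capability
    if task.latency_critical:
        return "on-device open model"      # small local model, no network round trip
    return "managed open model API"        # default: flexible, usage-priced middle ground

# A regulated workload is routed to self-hosting even if it also
# wants frontier capability, since the sovereignty rule fires first.
print(route(Task(sensitive_data=True, needs_frontier=True, latency_critical=False)))
```

Real orchestration platforms replace these hand-written rules with learned or configurable policies, but the core idea is the same: the workflow, not the end user, decides which model serves each task.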
