I first learned about the clause from Microsoft CEO Satya Nadella. During an interview with him in May 2023, I asked about the contract between Microsoft and OpenAI, which granted his company exclusive access to the AI startup's groundbreaking technology. I knew the contract established how much profit Microsoft could earn from the agreement, and I asked him what would happen if and when that point was reached. His answer was a bit mysterious.
“Basically, their long-term bet is that we get to superintelligence,” he told me. “If that happens, I think all bets are off, right?” He seemed almost cheerful about the possibility, which made me wonder how seriously he took it. “If this is the last invention of humankind, then all bets are off,” he continued. “Different people will have different judgments about what that is, and when that is.”
I did not realize how consequential that determination would be until a few weeks later. While reporting on OpenAI, I learned that the contract essentially stated that if OpenAI's models achieved artificial general intelligence, Microsoft would no longer have access to its newest models. The terms of the contract, which would otherwise extend to 2030, would be void. Although I wrote about it in my story, and the clause was never exactly a state secret, it did not spark much discussion.
That is no longer the case. The clause now sits at the center of an increasingly frayed relationship between Microsoft and OpenAI and is a sticking point in their renegotiations. It has been the subject of investigative stories by The Information, The Wall Street Journal, the Financial Times, and, yes, Wired.
But the importance of the clause goes beyond the fate of the two companies that agreed to it. The contingencies in this contract cut to the heart of a raging debate over how world-changing, and how lucrative, AGI might be if it is ever achieved, and what it means for a profit-driven company to control a technology that makes Sauron's Ring of Power look like a plastic trinket. If you want to understand what is happening in artificial intelligence, almost all of it can be explained by this clause.
Let's dig into the details. Although the precise language has not been made public, sources with knowledge of the contract confirm that the clause has three parts, each with its own implications.
There are two conditions that must be met for OpenAI to deny Microsoft its technology. First, the OpenAI board would have to determine that its newest models have reached AGI, defined in the OpenAI charter as a highly autonomous system that “outperforms humans at most economically valuable work.” Sufficiently vague for you? It is no surprise that Microsoft worries OpenAI will make this determination prematurely. Its only way to contest a declaration by the OpenAI board would be to sue.
But that's not all. The OpenAI board would also have to determine that the new models have reached “sufficient AGI.” This is defined as a model capable of generating enough profits to deliver the returns owed to Microsoft and OpenAI's other investors, a figure on the order of $100 billion. OpenAI does not actually have to earn those profits; it simply has to provide evidence that its new models will generate that reward. Unlike the first determination, this one requires Microsoft to agree that OpenAI has met the standard, though it cannot withhold that agreement unreasonably. (Again, in the event of a dispute, a court might ultimately decide.)
Altman himself admitted to me in 2023 that the standards were unclear. “That gives our board a lot of control to decide what happens when we get there,” he said. In any event, if OpenAI determines that it has achieved sufficient AGI, it no longer has to share those models with Microsoft, which would be stuck with the now-outdated earlier versions. It would not even have to use Microsoft's cloud servers; currently, Microsoft holds a right of first refusal on that work.
