Thursday, March 12, 2026

Developers lose focus 1,200 times a day – how MCP can change that


Software developers spend most of their time NOT writing code; recent industry studies have shown that actual coding accounts for as little as 16% of programmers' working hours, with the rest consumed by operational and supporting tasks. As engineering teams are pushed to "do more with less," and CEOs boast about how much of their codebase is written by AI, the question remains: what is being done to optimize the other 84% of the work engineers do?

Keeping programmers where they are most productive

The main culprit behind lost programmer productivity is context switching: the constant jumping between an ever-growing array of tools and platforms needed to build and ship software. Harvard Business Review found that the average digital worker toggles between applications and websites nearly 1,200 times a day. And every interruption matters. University of California research found that it takes about 23 minutes to fully regain focus after a single interruption, and sometimes worse: almost 30% of interrupted tasks are never resumed. Context switching also sits at the center of DORA, one of the most popular frameworks for measuring engineering performance.

In an era in which AI-driven companies are trying to get more from their employees with less, beyond "just" giving them access to large language models (LLMs), some trends are emerging. For example, Jarrod Ruhland, principal engineer at Brex, hypothesizes that "developers deliver their highest value when they stay focused in the integrated development environment (IDE)." With this in mind, he set out to find new ways to make that happen, and Anthropic's new protocol may be one of the keys.

MCP: A protocol for bringing context into IDEs

LLM-powered coding assistants such as Cursor, Copilot and Windsurf sit at the center of a developer renaissance, and their speed of adoption is staggering. Cursor became the fastest-growing SaaS product in history, reaching $100 million ARR within 12 months of launch, and 70% of Fortune 500 companies use Microsoft Copilot.




But these coding assistants were limited to the context of the codebase: they could help programmers write code faster, but they could not help them avoid switching context. A new protocol addresses this problem: the Model Context Protocol (MCP). Published in November 2024 by Anthropic, it is an open standard designed to facilitate integration between AI systems, particularly LLM-based tools, and external tools and data sources. The protocol has proven so popular that the number of new MCP servers grew 500% in the last six months, with about 7 million downloads in June.
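To make the integration concrete, here is a minimal, stdlib-only sketch of the two core requests an MCP server answers: `tools/list` (advertise available tools) and `tools/call` (invoke one). Real servers use the official SDKs and speak JSON-RPC over stdio or HTTP; the ticket-lookup tool and its response here are invented for illustration.

```python
# Toy model of an MCP server's tool advertisement and dispatch.
# The "get_ticket" tool and its canned reply are hypothetical.

TOOLS = {
    "get_ticket": {
        "description": "Fetch a ticket from the issue tracker by ID.",
        "inputSchema": {
            "type": "object",
            "properties": {"ticket_id": {"type": "string"}},
            "required": ["ticket_id"],
        },
        "handler": lambda args: f"Ticket {args['ticket_id']}: add rate limiting to /login",
    },
}

def handle(request: dict) -> dict:
    """Dispatch a simplified MCP request to the matching handler."""
    if request["method"] == "tools/list":
        return {"tools": [
            {"name": n, "description": t["description"], "inputSchema": t["inputSchema"]}
            for n, t in TOOLS.items()
        ]}
    if request["method"] == "tools/call":
        tool = TOOLS[request["params"]["name"]]
        text = tool["handler"](request["params"]["arguments"])
        return {"content": [{"type": "text", "text": text}]}
    raise ValueError(f"unknown method: {request['method']}")

listing = handle({"method": "tools/list"})
result = handle({"method": "tools/call",
                 "params": {"name": "get_ticket",
                            "arguments": {"ticket_id": "ENG-42"}}})
```

The key idea is that the model never needs to know how the tracker works; it only sees the advertised tool descriptions and decides when to call them.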

One of the most impactful applications of MCP is its ability to connect AI coding assistants directly to the tools developers rely on every day, streamlining workflows and dramatically reducing context switching.

Take feature development as an example. Traditionally, it involves jumping between several systems: reading a ticket in the project tracker, reviewing a conversation with a teammate for clarification, searching the API documentation and, finally, opening the IDE to code. Each step lives in a different tab, demanding a mental shift that slows programmers down.

With MCP and modern AI assistants, such as Anthropic's Claude, the whole process can take place in the editor.

For example, feature implementation with an MCP-connected coding assistant might look like this:
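A hypothetical trace of that flow, with each former browser tab replaced by an MCP tool call the assistant makes from inside the IDE. The three tool names and their canned responses are invented for illustration.

```python
def call_tool(name: str, **args) -> str:
    """Stand-in for an MCP tools/call round trip to a connected server."""
    fake_servers = {
        "tracker.get_ticket": "ENG-42: add rate limiting to /login",
        "chat.get_thread": "Teammate: cap it at 10 requests/min per IP",
        "docs.search": "RateLimiter(max_calls, period_seconds) -- token bucket",
    }
    return fake_servers[name]

# The assistant gathers all the context without leaving the editor...
context = [
    call_tool("tracker.get_ticket", ticket_id="ENG-42"),
    call_tool("chat.get_thread", channel="#eng-auth"),
    call_tool("docs.search", query="rate limiter API"),
]
# ...then drafts the change with the full picture already in its prompt.
prompt = "Implement the ticket using this context:\n" + "\n".join(context)
```

The ticket, the teammate's clarification and the API docs all arrive as tool results, so the developer never leaves the IDE to collect them.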

The same principle can apply to many other engineering workflows. Incident response for SREs, for example, might look like this:
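One hedged sketch of such an incident-response flow, expressed as the sequence of MCP tool calls an assistant could make from the editor. All server and tool names are invented for illustration.

```python
# Hypothetical MCP tool-call sequence for an SRE working an incident
# without leaving the IDE; each step replaces a separate dashboard tab.

incident_steps = [
    ("pager.get_alert",  "pull the firing alert and its runbook link"),
    ("metrics.query",    "fetch error-rate and latency graphs for the service"),
    ("logs.search",      "search recent logs around the alert timestamp"),
    ("deploys.last",     "check whether a deploy preceded the spike"),
    ("chat.post_update", "post a status update to the incident channel"),
]

summary = " -> ".join(tool for tool, _ in incident_steps)
```

Each tuple pairs a tool with its purpose; an assistant would chain these calls and summarize the findings inline.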

Nothing new under the sun

We have seen this pattern before. Over the past decade, Slack transformed workplace productivity by becoming a hub for hundreds of applications, enabling employees to manage a wide range of tasks without leaving the chat window. The Slack platform reduced context switching in everyday workflows.

For example, Riot Games connected about 1,000 Slack applications, and its engineers saw a 27% reduction in the time needed to test and iterate on code, a 22% faster time to identify new errors and a 24% increase in feature-launch frequency; all were attributed to streamlined workflows and reduced tool-switching friction.

Now a similar transformation is taking place in software development, with AI assistants and their MCP integrations serving as the bridge to all these external tools. As a result, the IDE can become a new command center for engineers, just as Slack became one for knowledge workers.

MCP may not be ready

MCP is still a relatively young standard, and that shows in its security posture. For example, MCP does not have a built-in authentication or permissions model; it relies on external implementations that are still evolving. There is also ambiguity around identity and access control: the protocol does not clearly distinguish whether an action was triggered by the user or by the AI. Lori MacVittie, a distinguished engineer and chief evangelist in the office of the CTO at F5, says that MCP "breaks fundamental security assumptions we have held for a long time."

Another practical limitation arises when too many MCP tools or servers are used at once, for example in a coding assistant. Each MCP server advertises a list of tools with descriptions and parameters that the AI model must take into account. Flooding the model with dozens of available tools can overwhelm the context window, and performance degrades noticeably as the tool count grows. Some integrations have imposed hard limits (about 40 tools in the Cursor IDE, or ~20 tools for the OpenAI agent) to keep the tool list from bloating beyond what the model can handle.
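One way to reason about those caps is as a token budget on tool descriptions. The sketch below greedily keeps tools until a rough budget is exhausted; the 4-characters-per-token heuristic, the per-tool schema overhead and the tool catalog are all assumptions for illustration.

```python
# Sketch of one mitigation for tool-list bloat: keep only the tools
# whose descriptions fit a rough token budget, mimicking the hard
# caps some IDE integrations impose.

def trim_tools(tools: list[dict], budget_tokens: int = 2000) -> list[dict]:
    """Greedily keep tools until their descriptions exhaust the budget."""
    kept, used = [], 0
    for tool in tools:
        # ~4 chars per token, plus a flat guess for schema overhead.
        cost = len(tool["description"]) // 4 + 20
        if used + cost > budget_tokens:
            break
        kept.append(tool)
        used += cost
    return kept

# 60 advertised tools, each with a 400-character description.
catalog = [{"name": f"tool_{i}", "description": "x" * 400} for i in range(60)]
trimmed = trim_tools(catalog)
```

A production assistant would rank tools by relevance to the current task before trimming, rather than keeping them in catalog order.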

Finally, there is no sophisticated way for tools to be discovered automatically or suggested contextually beyond their advertised list, so developers often have to manually enable and disable tools to work efficiently. Recalling the example of Riot Games installing 1,000 Slack applications, it is easy to see how this approach would not scale for a company of that size.

Less toggling, more software

The past decade has taught us the value of bringing work to the worker, from Slack channels that surface updates, to "inbox zero" email methodologies, to unified dashboards in platform engineering. Now, with AI in our toolset, we have the opportunity to make programmers far more productive. Just as Slack became the hub for business communication, coding assistants are well positioned to become the hub for software creation: not only where code is written, but where all context and collaborators come together. By keeping programmers in their flow, we remove the constant shifting of mental gears that has plagued engineering productivity.

For any organization that depends on shipping software, look at how your developers spend their day; you may be surprised by what you find.

Sylvain Kalache runs AI Labs at Rally.
