Last week, I shared a post about reasoning over cloud financial data using the Model Context Protocol (MCP). I wrapped it up with a bit of a casual challenge to the universe: "I'll wait for the open-source community to create something—or I’ll just build it myself."
Well, four days later, while browsing for MCP server implementations (as one does on a quiet afternoon), I stumbled onto a comprehensive list of projects. And right there, sitting calmly among the links, was exactly the server I had imagined.
The Protocol Has Landed
It was an exciting moment. MCP is beginning to unlock a standardized way for large language models to access real-time data. That’s a big deal. It opens the floodgates for community-driven servers capable of tapping into a wide variety of tools and platforms.
But then reality set in.
As I combed through the list of available servers, something jumped out at me: they weren't from vendors like Slack or Atlassian. Most were community-built. And while I absolutely admire the energy and innovation coming from open-source contributors, granting remote-execution access to production services to servers the original vendors don't stand behind? That's a level of risk most organizations, mine included, aren't ready to embrace.
Which naturally led me to ask: Will Slack (or any major SaaS provider) actually release an official MCP server? And more broadly, how will SaaS vendors react when customers can start pulling their data into external LLMs, bypassing traditional upsells and packaged "AI agents"?
To answer that, let’s rewind a bit.
The AI Agent Gold Rush
When generative AI took off, it felt like every enterprise software vendor had a sudden epiphany. And then their entire sales org called me.
The pitch was always the same: “For an additional fee, you can use our AI agent to work with your data more efficiently.” Multiply that across a dozen platforms, and you can see where this is going.
I won't quote figures, but it wasn't exactly "freemium." Meanwhile, beyond the branding, these AI agents were largely built on the same architecture:
- Move enterprise data into a proprietary knowledge base
- Connect it to an LLM
- Add prompt tuning and guardrails
- Package it up in a way that looks sleek and intelligent
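That four-step pattern can be sketched in a few dozen lines. Everything below is illustrative: the toy `KnowledgeBase` and its keyword-overlap scoring stand in for the proprietary vector stores and embedding models real vendors use, and the guardrail string stands in for their prompt tuning.

```python
from collections import Counter

class KnowledgeBase:
    """Toy stand-in for a vendor's proprietary knowledge base:
    documents are indexed as bags of lowercase words."""

    def __init__(self):
        self.docs = []

    def ingest(self, text):
        # Step 1: move enterprise data into the knowledge base.
        self.docs.append((text, Counter(text.lower().split())))

    def retrieve(self, query, k=1):
        # Keyword overlap stands in for embedding similarity.
        q = Counter(query.lower().split())
        scored = sorted(self.docs,
                        key=lambda d: sum((q & d[1]).values()),
                        reverse=True)
        return [text for text, _ in scored[:k]]

# Step 3: prompt tuning and guardrails, reduced to a template.
GUARDRAILS = "Answer only from the provided context. If unsure, say so."

def build_prompt(kb, question):
    # Step 2 would send this prompt to an LLM; here we just build it.
    context = "\n".join(kb.retrieve(question))
    return f"{GUARDRAILS}\n\nContext:\n{context}\n\nQuestion: {question}"

kb = KnowledgeBase()
kb.ingest("Q3 cloud spend rose 12 percent, driven by storage costs.")
kb.ingest("Headcount stayed flat across the quarter.")
prompt = build_prompt(kb, "Why did cloud spend rise?")
print(prompt)
```

Step 4, the sleek packaging, is the part you actually pay for; the machinery underneath rarely gets more exotic than this.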
Did it deliver value? Sometimes, yes. Was it repeatable across platforms? Sort of. But the important bit: most vendors kept the implementation details opaque, and all of them wanted a new line item on your budget.
Still, the C-suite wanted progress, investors wanted to hear "AI is in our roadmap," and for many orgs—mine included—the best path forward was to prioritize use cases first, technology second.
MCP Changes the Game
And now, just as we’ve started to understand the AI agent landscape, along comes MCP.
SaaS platforms have spent years building APIs to help customers and partners integrate and extend their tools. These APIs are central to product ecosystems. But now, MCP servers offer a new way to connect those same APIs to external LLMs—without needing proprietary AI agents or steep platform fees.
The implications are big:
- With MCP, it becomes much easier to pull structured, real-time data into models like Claude or GPT, dynamically and on demand.
- The infrastructure is open. The protocol is flexible.
- And critically, the control returns to the enterprise, rather than the vendor.
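Concretely, MCP is JSON-RPC 2.0 over a transport such as stdio or HTTP: a client asks a server for data by invoking a named tool via a `tools/call` request. The sketch below assembles such a request by hand using only the standard library; the `get_cost_report` tool name and its arguments are invented for illustration and are not part of any real server.

```python
import json

def mcp_request(req_id, method, params):
    # MCP messages are plain JSON-RPC 2.0 objects.
    return {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}

# Hypothetical tool call: ask an MCP server for last month's cloud costs.
call = mcp_request(
    req_id=1,
    method="tools/call",
    params={
        "name": "get_cost_report",          # invented tool name
        "arguments": {"period": "2025-03"}  # invented argument
    },
)

wire = json.dumps(call)
print(wire)
```

The point is how little ceremony there is: any client that can speak this envelope can reach any server that exposes a tool, which is exactly what makes the protocol hard for vendors to wall off.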
Yes, this was always technically possible. But MCP makes it clearer and more accessible—and that changes the economics of SaaS AI integrations in a meaningful way.
The SaaS Dilemma
So, SaaS vendors now face a few choices:
- Restrict API access – which would risk harming their ecosystem and developer goodwill.
- Charge for API usage – which might backfire if customers perceive it as double-dipping.
- Embrace MCP and lead the way – by offering secure, official MCP-compatible services.
Which brings me to a move I found particularly interesting: Atlassian’s decision to make their Rovo AI agent available for free.
Strategically, this is a smart response to the MCP wave. Rather than fight the protocol’s momentum, they’ve positioned themselves as the trusted MCP-compatible layer between enterprise data and LLMs.
Rovo integrates with Slack, Miro, Asana, GitLab, and more. And for companies that don’t want to manage their own MCP infrastructure (or worry about security risks from unknown servers), Atlassian is now offering a ready-made solution that already works with the tools they’re using.
Will it be free forever? Probably not. But the message is clear: Atlassian is playing long-term. And others will follow—or be forced to.
What Happens Next?
We’re at the beginning of another shift. Protocols like MCP signal a broader move away from closed, proprietary AI workflows and toward more open, composable, and cost-effective architectures.
This shift challenges the current "AI agent as a service" model and puts real pressure on SaaS vendors to rethink how they deliver AI functionality, and how they price it.
The next 6–12 months will be telling. Will vendors start throttling API access? Launch official MCP servers? Or rethink their monetization strategies altogether?
One thing is certain: the organizations that adapt quickly—either by enabling customers through open protocols or by providing trusted managed options—will be the ones leading the conversation.
As for the rest? They'll be watching from the shoreline as the next wave rolls in.