A year from now, GitHub will run a single public GitHub MCP server that you connect to via OAuth - you won't need to install it locally or faff around with tokens or environment variables at all.
You can get a taste of this already.
While they still call it a prototype/beta, Sentry's MCP server [0] is a model for others to follow when it comes to convenience and usefulness.
Remote-first with OAuth. The biggest hurdle to using it as-is at the moment is that most clients don't natively support OAuth yet, so you'll often rely on a local proxy server, like mcp-remote [1], to handle auth. Clients will catch up.
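For anyone who hasn't tried the workaround: a minimal sketch of wiring a remote server through mcp-remote, assuming a client that reads the usual `mcpServers` JSON config (Claude Desktop style). The server name and URL below are placeholders, not the real endpoint:

```json
{
  "mcpServers": {
    "sentry": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://example-remote-mcp.dev/sse"]
    }
  }
}
```

The proxy handles the OAuth dance locally and exposes the remote server to the client as if it were a plain stdio server.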
That said, I think there might be a market for MCP servers that do more than the first-party one; it will really depend on what first-party support looks like. Did they implement all of their existing API in MCP, or just a few parts?
However, my experience with MCP servers so far (and it’s super early days, I know) has taught me that in a lot of cases it’s better/easier to write your own MCP server/tools. A lot of MCP servers out there are sloppy and/or hard to run/debug. Since most tools are a thin layer over existing API/SDK calls, it’s not hard to write (or LLM-generate) the needed code, which has the added bonus of giving you full control.
Even when an MCP server works 100% and is easy to run, it doesn’t always map 1:1 with the API, so I’ve run into “yes, you can retrieve data object X, but you can’t filter by Y because they didn’t implement that filter in the tool call”.
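A minimal sketch of that “write your own thin tool” approach, using the official MCP Python SDK's FastMCP helper. The API base URL, token variable, and the `status` filter parameter are hypothetical stand-ins for whatever API (and missing filter) you actually care about:

```python
import os

import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("my-thin-wrapper")

API_BASE = "https://api.example.com"                 # hypothetical API
API_TOKEN = os.environ.get("EXAMPLE_API_TOKEN", "")  # hypothetical token

@mcp.tool()
def search_issues(project: str, status: str = "open", query: str = "") -> list[dict]:
    """Search issues in a project, exposing the status filter that the
    off-the-shelf server left out of its tool call."""
    resp = httpx.get(
        f"{API_BASE}/projects/{project}/issues",
        params={"status": status, "query": query},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    mcp.run()  # stdio transport by default
```

It really is mostly boilerplate around the API call, which is why hand-rolling (or LLM-generating) the handful of tools you need is often less work than debugging someone else's server.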
> A year from now, GitHub will run a single public GitHub MCP server that you connect to via OAuth - you won't need to install it locally or faff around with tokens or environment variables at all.
That sounds horrific. GitHub is known for its unreliability, and centralizing everything on GitHub isn't a good idea.
Combining two bad standards (MCP and OAuth) doesn't make remote MCP servers secure either.
- https://github.com/wong2/awesome-mcp-servers
- https://cursor.directory/mcp
But as mentioned above, there is an ongoing discussion around the Anthropic registry: https://github.com/modelcontextprotocol/registry
> The MCP Registry service provides a centralized repository for MCP server entries. It allows discovery and management of various MCP implementations with their associated metadata, configurations, and capabilities.
As an example, today I re-implemented Google's AlphaEvolve with <7 tools (https://toolkami.com/alphaevolve-toolkami-style/).
Next steps are auto-generated or auto-mashup tools (a couple of projects are doing this) and small, reusable agents that only have access to the handful of tools they need.
“Auto-mashup” is a concept I just made up: chaining existing tools with a bit of logic so that, instead of round-tripping to the LLM for common cases, you can roll “get the load, the last N log lines, procstat the top 10 procs, …” into a single “check_server_status” call. Some systems already let the LLM write and reuse its own tools; this would be the same thing, just composing other/existing MCP tools. Maybe “auto-composition” is a better name.
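A rough sketch of what that composition could look like, again with FastMCP. The helpers here are hypothetical local stand-ins for what would ideally be calls out to other MCP servers' tools, and the syslog path and `ps` flags assume a Linux box:

```python
import os
import subprocess

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("server-status")

def load_average() -> tuple[float, float, float]:
    """1/5/15-minute load averages (Unix only)."""
    return os.getloadavg()

def tail_syslog(n: int = 20) -> list[str]:
    """Last n lines of the syslog (path is a placeholder)."""
    with open("/var/log/syslog") as f:
        return f.readlines()[-n:]

def top_processes(n: int = 10) -> str:
    """Top n processes by CPU, via GNU ps."""
    out = subprocess.run(
        ["ps", "-eo", "pid,comm,%cpu", "--sort=-%cpu"],
        capture_output=True, text=True, check=True,
    ).stdout
    return "\n".join(out.splitlines()[: n + 1])

@mcp.tool()
def check_server_status(log_lines: int = 20) -> dict:
    """One composite call instead of three separate round trips to the model."""
    return {
        "load": load_average(),
        "recent_logs": tail_syslog(log_lines),
        "top_processes": top_processes(),
    }

if __name__ == "__main__":
    mcp.run()
```

The win is fewer tool-call round trips for a pattern you already know you want, while the model still gets all the raw pieces back in one result.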
Seems silly in retrospect, no?
What is hard is integrating across SaaS solutions that haven't done this yet, in a way that is secure and easy. Most MCP things out there so far are about exposing things that have very low value. All the high-value stuff is locked up behind APIs, authorization, secure networking (i.e. typically not publicly accessible), etc.
Bridging that stuff is going to generate a lot of work over the next few years, and more importantly, companies are going to spend large amounts of money on it because it can deliver a lot of value to them.
People who believe this is going to be a done deal in six months are dreaming. It's more like ten years. But that just means there is good money to be made by people who can do this stuff and who can navigate the decades of byzantine digital cruft in the corporate world. You can already see the usual suspects (big consultancy companies) sniffing around this topic. There will be lots of such companies doing a brisk business by the end of this year.
You might be underestimating how fast the current ETL/integration companies can pivot to provide reliable MCP servers, as the lift is pretty small.
Am I getting this right? Based on the architecture/flow diagram of MCP, every SaaS app out there can build an MCP server. But you'll need an "MCP host" to make it work, right? Right now, I'm only seeing a handful of hosts: Claude Desktop and Windsurf. Who will be building these "hosts"? I'm only seeing use cases revolving around these hosts. Are there any real-life production use cases? How will this pan out?
Once the authN/authZ stuff is fully codified and baked, we’ll see first-party MCP gateways and the ability to connect to those tools with the chatbot of your choice.
Consider what we see now as a developer preview…
Being able to offer a helpful API to the world and just getting paid whenever someone uses it would be really nice.
At the moment you have to process the payment "yourself" (even if you use a third party for that), issue an API key, etc.
I don't think you can make money on the servers themselves (they are too simple to clone), but you can make money charging for the API. If you have a per-usage license, making an MCP server is a very obvious choice; if you charge per seat, it is mostly a question of how sticky you are versus the competition.
With some kind of MCP tip jar, they could extract the data they need and pay $0.02 for the service.
It would remove a lot of friction in the system, and could generate revenues for content & data creators.
https://mintlify.com/blog/why-we-sunsetted-mcpt
Nice story of startup focus.
I know Han and he's a smart guy, but this is very, very wrong lol. There's significant SUPPLY for what he built, because everyone is just trying to self-promote by putting MCP wrappers of their stuff out. The hard part is the demand.
(and also the fact that Anthropic is putting up an official registry, so it'll be steamrolled)
If you’re interested in the next layer beyond just discovering MCP servers, I’ve been working on https://ninja.ai — an app store for AI assistants to connect to tools via MCP, without needing to touch the command line. Think one-click installs for pipes that let agents actually do things like triage email or book Ubers.
Would love feedback if you’re experimenting in this space too!
can't really fix this
interesting
Suppose you want your agent to use Postgres or git or even file modification. You write your code to use MCP, and the backend is already available. It's code you don't have to write.
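A minimal sketch of what that looks like from the agent side, using the MCP Python SDK's client API against the reference filesystem server. The npm package name, the `/tmp` directory argument, and the `list_directory` tool name are assumptions based on that server's docs, so treat them as placeholders:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch an existing filesystem MCP server over stdio; none of this
# backend code is ours to write or maintain.
params = StdioServerParameters(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
)

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])
            # Call one of the server's tools instead of implementing
            # file handling ourselves (tool name per the server's docs).
            result = await session.call_tool("list_directory", {"path": "/tmp"})
            print(result.content)

asyncio.run(main())
```

Swap the launch parameters for a Postgres or git server and the agent-side code barely changes; that's the portability argument in practice.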
Why should I be self-hosting ANY local MCP server for accessing an external service?
e.g. I wrote a server for water.gov to pull the nearby river height prediction for the next 24 hours. This helps the campground welcome-message writing tool craft a better welcome message.
Sure, that could be a plain tool call, but why not make it portable to any AI service?
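In the same spirit as the earlier wrapper sketch, the function behind that plain tool call only needs a decorator to become an MCP tool any client can use. The forecast endpoint and `gauge_id` parameter below are hypothetical placeholders, not the actual water.gov API:

```python
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("river-height")

@mcp.tool()
def river_height_forecast(gauge_id: str, hours: int = 24) -> dict:
    """Predicted river heights for the next `hours` at a gauge (placeholder API)."""
    resp = httpx.get(
        "https://api.example.gov/forecast",  # hypothetical endpoint
        params={"gauge": gauge_id, "hours": hours},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    mcp.run()  # any MCP-capable client can now call this tool
```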