For years, the best argument for centralizing on Github was that this was where the developers were. This is where you can have pull requests managed quickly and easily between developers and teams that otherwise weren't related. Getting random PRs from the community had very little friction. Most of the other features were `git` specific (branches, merges, post-commit hooks, etc), but pull requests, code review, and CI actions were very much Github specific.
However, with Copilot et al. increasingly getting pushed through GitHub (and the now-reverted Actions pricing changes), having so much code in one place might not be enough of a benefit anymore. There is nothing about Git repositories that inherently requires GitHub, so it will be interesting to see how Gentoo fares.
I don't know if it's a one-off or not. Gentoo has always been happy to do their own thing, so it might just be them, but it's a trend I'm hearing talked about more frequently.
I'm watching this pretty closely. I've been mirroring my GitHub repos to my own Forgejo instance for a few weeks, but am waiting for more federation before I reverse the mirrors.
Also will plug this tool for configuring mirrors: https://github.com/PatNei/GITHUB2FORGEJO
Note that Forgejo's API has a bug right now and you need to manually re-configure the mirror credentials for the mirrors to continue to receive updates.
Once the protocols are in place, one hopes that other forges could participate as well, though the history of the internet is littered with instances where federation APIs just became spam firehoses (see especially pingback/trackback on blog platforms).
I wonder if federation will also bring more diversity into the actual process. Maybe there will be hosts that let you use that Phabricator model.
I also wonder how this all gets paid for. Does it take pockets as deep as Microsoft's to keep npm/GitHub afloat? Will there be a free, open-source commons on other forges?
You can push any ref, not necessarily HEAD. So as long as you send the commits in order from a rebase on main, it should be OK, unless I got something wrong from the docs?
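A minimal local sketch of that: the bare repo below stands in for the real remote, and all paths and identities are throwaway. The point is that the refspec's source side can be any commit, not just HEAD:

```shell
set -e
tmp=$(mktemp -d)
git init -q --bare "$tmp/remote.git"          # stand-in for the real remote
git init -q "$tmp/work" && cd "$tmp/work"
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m "first"
first=$(git rev-parse HEAD)
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m "second"
git remote add origin "$tmp/remote.git"
# Push only the first commit, even though local HEAD points at the second:
git push -q origin "$first:refs/heads/main"
git ls-remote origin refs/heads/main          # shows $first, not HEAD
```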
Can't you branch off from their head and cherry-pick your commits?
I can see merit in your suggestion, but it does require some discipline in practice. I’m not sure I could do it.
GitHub could have been project hosting and patch submission. It's the natural model for both the style of contributions that you see most on GitHub today and how it's used by Linux. (Pull requests are meant for a small circle of trusted collaborators that you're regularly pulling from and have already paid the one-time cost to set up in your remotes—someone is meant to literally invoke git-pull to get a whole slew of changes that have already been vetted by someone within the circle of frequent collaborators—since it is, after all, already in their tree—and anyone else submits patches.) Allowing simple patch submission poses a problem, though, in that even if Alice chooses to host projects on GitHub, then Bob might decide Gitorious is better and host stuff there even while remaining friendly and sending patches to Alice's GitHub-hosted project. By going with a different, proprietary pull request system and forcing a clunky forking workflow on Alice and Bob, on the other hand, you can enforce where the source of the changes are coming from (i.e. another repo hosted on GitHub). And that's what they did.
I don’t know how Agit-flow stores the commits, but I assume they would have to be in the main repo, which I’d rather not see used for random PRs.
Requiring forks makes it more convoluted for simple quick pushes, but I can see why it would be done this way.
I suspect the real answer is that’s the way Linux is developed. Traditionally, the main developers all kept their own separate branches that would be used to track changes. When it was time for a new release, the appropriate commits would then be merged into the main repository. For large scale changes, having separate forks makes sense — there is a lot to track. But it does make the simple use-case more difficult. Git was designed to make the complex use-cases possible, sometimes at the expense of usability for simpler use cases.
Sure, the world has pretty much decided it hates the protocol. However, people _were_ doing all of that.
There's not much point in observing "but you could have done those things with email!". We could have done them with tarballs before git existed, too, if we built sufficient additional tooling atop them. That doesn't mean we have the functionality of current forges in a federated model, yet.
Those exist (badly and not integrated) as part of additional tools such as email, or as tasks done manually, or as part of forge software.
I don't think there's much point in splitting this hair further. I stand by the original statement that I'd love to see federated pull requests between forges, with all the capabilities people expect of a modern forge.
Give me “email” PR process anytime. Can review on a flight. Offline. Distraction free. On my federated email server and have it work with your federated email server.
And the clients were pretty decent at running locally. And it still works great for established projects like the Linux kernel, etc.
It’s just a pain to set up for a new project, compared to pushing to some forge. But not impossible. And it returns the intentionality of email, with powerful clients doing threading, sorting, syncing, etc., locally.
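For anyone who hasn't seen that flow, a self-contained sketch of the format-patch/am round trip. Paths and identities here are throwaway; in real use the patch would travel through `git send-email` and a mailing list rather than the local filesystem:

```shell
set -e
tmp=$(mktemp -d)
git init -q "$tmp/upstream" && cd "$tmp/upstream"
git -c user.email=m@ex -c user.name=maint commit -q --allow-empty -m "base"
# Contributor clones, commits, and exports the change as a mail-formatted patch:
git clone -q "$tmp/upstream" "$tmp/contrib" && cd "$tmp/contrib"
echo "fix" > fix.txt && git add fix.txt
git -c user.email=c@ex -c user.name=contrib commit -q -m "add fix"
git format-patch -1 -o "$tmp/out"
# Maintainer applies it from the "mailbox" with git-am, preserving authorship:
cd "$tmp/upstream"
git -c user.email=m@ex -c user.name=maint am -q "$tmp/out"/0001-*.patch
```

The nice property the comment above is pointing at: every step after `format-patch` works offline, and the patch file itself is plain text you can review in any mail client.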
The advantage of old-school was partially that the user agents were, in fact, user agents. Greasemonkey tried to bridge the gap a bit, but the Web does not lend itself to much user-side customization: the protocol is too low level, too generic, offering a lot of creative space to website creators but making it harder to customize those creations to users’ wants.
Yes, it takes time to learn, but that is true for anything worthwhile.
I think it's dwm that explicitly advertises a small and elitist userbase as a feature/design goal. I feel like mailing lists as a workflow serve a similar purpose, even if unintentionally.
With the advent of AI slop as pull request I think I'm gravitating to platforms with a higher barrier to entry, not lower.
There is code or a repository, and there is a diff or patch. Everything else you're labeling as a pull request is unknown, not part of the original design, debatable.
The GitHub-style pull request is not part of the original design. What aspects and features do you want to keep, and what exactly do you say many others are interested in?
We don't even know what a forge is. Let alone a modern one.
Pretty sure several of these distros started doing this with cvs or svn way back before git became popular even.
The first hit I could find of a git repository hosted on `archlinux.org` is from 2007: https://web.archive.org/web/20070512063341/http://projects.a...
---
> If you're a code forge competing with GitHub and you look anything like GitHub then you've already lost. GitHub was the best solution for 2010. [0]
> Using GitHub as an example but all forges are similar so not singling them out here This page is mostly useless. [1]
> The default source view ... should be something like this: https://haskellforall.com/2026/02/browse-code-by-meaning [2]
[0] https://x.com/mitchellh/status/2023502586440282256#m
And besides that - have you tried/tested "the amount of inference required for semantic grouping is small enough to run locally."?
While you can definitely run local inference on GPUs [even ~6 years old GPUs and it would not be slow]. Using normal CPUs it's pretty annoyingly slow (and takes up 100% of all CPU cores). Supposedly unified memory (Strix Halo and such) make it faster than ordinary CPU - but it's still (much) slower than GPU.
I don't have Strix Halo or that type of unified memory Mac to test that specifically, so that part is an inference I got from an LLM, and what the Internet/benchmarks are saying.
I'm also quite used to the GitHub layout and so have a very easy time using Codeberg and such.
I am definitely willing to believe that there are better ways to do this stuff, but it'll be hard to win over detractors if it causes friction, and unfamiliarity causes friction.
I say this as someone who does browse the web view for repos a lot, so I get the niceness of browsing online... but even then sometimes I'm just checking out a repo cuz ripgrep locally works better.
when he's working on his own project, obviously he never uses the about section or releases
but if you're exploring projects, you do
(though I agree for the tree view is bad for everyone)
I also look for releases if it's a program I want to install... much easier to download a processed artifact than pull the project and build it myself.
But, I think I'm coming around to the idea that we might need to rethink what the point of the repository is for outside users. There's a big difference in the needs of internal and external users, and perhaps it's time for some new ideas.
(I mean, it's been 18 years since Github was founded, we're due for a shakeup)
"This new thing that hasn't been shipped, tested, proven, in a public capacity on real projects should be the default experience going forwards" is a bit much.
I for one wouldn't prefer a pre-chewed machine analysis. That sounds like an interesting feature to explore, but why does it need to be forced into the spotlight?
For us Europeans it has more to do with being local than with reliability or Copilot.
I hope so. When Microsoft embraced GitHub there was a sizeable migration away from it. A lot of it went to Gitlab which, if I recall correctly, tanked due to the volume.
But it didn't stick. And it always irked me, having Microsoft in control of the "default" Git service, given their history of hostility towards Free software.
Now in 2026 things look different. While the fears that Microsoft would revert to 90s Embrace, Extend, Extinguish mostly haven't come to pass, their products are instead all plagued by declining quality and stability, and a product direction that seems to willfully ignore most of the user base.
Yet programmers in 2026 for some reason are still using it when signing their git tags and commits. Unless they are using a sha256 git repository.
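For reference, SHA-256 repositories have been a stock (if still experimental) git feature since 2.29; here's a quick sketch in a throwaway repo, with signing left as a comment since it needs a configured key:

```shell
set -e
tmp=$(mktemp -d)
git init -q --object-format=sha256 "$tmp/repo" && cd "$tmp/repo"
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m "first"
git rev-parse HEAD    # a 64-hex-char SHA-256 id, not the usual 40
# Signing itself is unchanged once a key is configured, e.g.:
#   git tag -s v1.0 -m "release"
```

The main caveat, as of recent git versions, is limited interoperability: a SHA-256 repo can't easily push to or pull from SHA-1 remotes, which is a big part of why adoption has been slow.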
Find a project, find out if it's the original or a fork, and either way, find all the other possibly more relevant forks. Maybe the original is actually derelict but 2 others are current. Or just forks with significant different features, etc. Find all the oddball individual small fixes or hacks, so even if you don't want to use someone's fork you may still like to pluck the one change they made to theirs.
I was going to also say the search but probably that can be had about the same just in regular google, at least for searching project names and docs to find the simple existence of projects. But maybe the code search is still only within github.
So absolutely not the start of the movement, but it seems to be accelerating more and more.
My guess is it's driven by very poor user experience coupled with worse product.
Technical people who care about privacy/surveillance at least a little bit need to take one look at the current state of tech and the US govt to see how fucking fast dystopia is becoming reality. See the Discord/OpenAI writeup that came out, ads literally everywhere, Flock and Ring cameras wide open and passively performing recon, routers doing the same... it's like Snow Crash out here
Makes perfect sense that those who know would say fuck this, im out. Convenience isn't worth it anymore
But the implementation of Gerrit seems rather unloved, it just seems to get the minimal maintenance to keep Go/Android chooching along, and nothing more.
There are lots of people who are very fond of Gerrit, and if anything, upstream development has picked up recently. Google has an increasing amount of production code that lives outside of their monorepo, and all of those teams use Gerrit. They have a few internal plugins, but most improvements are released as part of the upstream project.
My company has been using it for years, it's a big and sustained productivity win once everyone is past the learning curve.
Gerritforge[0] offers commercial support and runs a very stable public instance, GerritHub. I'm not affiliated with them and not a customer, but I talk to them a lot on the Gerrit Discord server and they're awesome.
Gitlab CI is good but we use local (k8s-hosted) runners so I have to imagine there's a bunch of options that provide a similar experience.
Just like with plain git - in GitLab you can merge a branch that has multiple separate commits in it. And you can also merge (e.g. topical/feature) branches into one branch - and then merge that "combined" branch into main/master.
Though most teams/projects prefer you don't stretch that route to the extreme, simply because it's a PITA to maintain/sync several branches for a long period of time, resolving merge conflicts between branches that have been separate for a long time isn't fun, and people don't like to review huge diffs.
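That branch topology is plain git, so here's a runnable sketch (throwaway repo, made-up branch names): two topic branches get merged into an integration branch, which is then merged into main.

```shell
set -e
tmp=$(mktemp -d)
git init -q -b main "$tmp/repo" && cd "$tmp/repo"
ci() { git -c user.email=a@b -c user.name=a "$@"; }   # git with a throwaway identity
ci commit -q --allow-empty -m "base"
git branch topic-a && git branch topic-b              # two feature branches off main
git checkout -q topic-a && echo a > a.txt && git add a.txt && ci commit -q -m "feature a"
git checkout -q topic-b && echo b > b.txt && git add b.txt && ci commit -q -m "feature b"
# Combine the topics on an integration branch first...
git checkout -q -b combined main
ci merge -q --no-edit topic-a
ci merge -q --no-edit topic-b
# ...then merge the combined branch into main:
git checkout -q main
ci merge -q --no-edit combined
```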
This is how Gerrit operates "natively" - the commit message and everything is part of the artifact under review exactly like the diff.
If the model is to squash an MR into a single commit before merging it, I'd then want to be able to have MRs that depend on each other.
Hell even if you don't use VSCode there are much better options than messing around with patch files.
Patch files are excellent for small diffs at a glance. Sure, I can also `git remote add coworker ssh://fork.url` and `git diff origin/main..coworker/branch`, and that would even let me use Difftastic (!), but the patch is entirely reasonable for quick glances of small branch diffs.
> No need for that.
I generally expect a less complex solution; it seems like yours is more complex (easiness is arguable, though).
Just like Microsoft Windows. I wonder if they are the same company. /s
The original AGit blog post is no longer available, but it is archived: https://web.archive.org/web/20260114065059/https://git-repo....
From there, I found a dedicated Git subcommand for this workflow: https://github.com/alibaba/git-repo-go
I really like what I've read about AGit as a slightly improved version of the Gerrit workflow. In particular, I like that you can just use a self-defined session ID rather than relying on a commit hook to generate a Gerrit ChangeId. I would love to see Gerrit support this session token in place of ChangeIds.
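For anyone curious what the workflow looks like on the wire: with AGit-flow you push to a magic `refs/for/<branch>` ref with push options, no fork needed, and the server creates the PR from them. The sketch below uses a plain bare repo as a stand-in (it just stores the ref; only a real Gitea/Forgejo server interprets it), and the topic name is made up:

```shell
set -e
tmp=$(mktemp -d)
git init -q --bare "$tmp/server.git"
# A plain bare repo must advertise push options for the push to be accepted;
# a real AGit-capable server does this for you:
git -C "$tmp/server.git" config receive.advertisePushOptions true
git init -q -b main "$tmp/work" && cd "$tmp/work"
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m "fix"
git remote add origin "$tmp/server.git"
# The AGit-flow push: the session/topic id travels as a push option:
git push -q origin HEAD:refs/for/main -o topic=my-fix
git ls-remote origin 'refs/for/*'
```

On a real server, repeated pushes with the same `topic` update the same PR, which is what plays the role of Gerrit's ChangeId.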
On the downside, I don’t see them yet taking ops seriously. They are getting a lot of attention, but not yet establishing SLAs (at least not publicly). And their donations don’t seem to be scaling to the continued and expected demand.
This “Great Uncoupling” is well underway and will take us toward a less monocultural Internet.
Gentoo's GitHub mirrors have only ever been there to make contributing easier for (I expect) newbies. The official repos have (AFAIK) always been hosted by the Gentoo folks. FTFA:
This [work] is part of the gradual mirror migration away from GitHub, as already mentioned in the 2025 end-of-year review.
These [Codeberg] mirrors are for convenience for contribution and we continue to host our own repositories, just like we did while using GitHub mirrors for ease of contribution too.
And from the end-of-year review mentioned in TFA [0] Mostly because of the continuous attempts to force Copilot usage for our repositories, Gentoo currently considers and plans the migration of our repository mirrors and pull request contributions to Codeberg. ... Gentoo continues to host its own primary git, bugs, etc infrastructure and has no plans to change that.
we learn that the primary reason for moving is GitHub attempting to force its shitty LLM onto folks who don't want to use it. So yeah, the Gentoo project has long been "decoupled" or "showing it can be done" or whatever.
Steam proved gaming doesn't depend on Windows, Linux can do it too.
Countries in Europe, fed up with Windows, are moving to Linux
LibreOffice is eating Microsoft 365's lunch
Microsoft buying GitHub caused a mass-exodus, its AI push is causing another mass-exodus.
Big open-source projects are moving away from GitHub; we only need a big player to make the move, and followers will come.
No way.
I love LibreOffice. It's fantastic. I rolled it out to a prior employer where everyone needed a word processor but we certainly didn't need to pay Office prices when we didn't have requirements that only Office could satisfy. A high point was when we were having trouble collaborating on a DOCX file with a customer, then they sheepishly told us that they weren't using Office, but this other LibreOffice (OO.org at the time) thing. We laughed and told them we were, too. That day we started swapping ODT files instead and everything worked 100x better.
And all that said, I haven't seen LibreOffice in person in years. Mac shops use Pages & friends for internal stuff, but really, almost everyone not using Office 365 or whatever they're calling it now is using Google Docs. Google is eating Microsoft's lunch in this space, and my gut estimate is that they split 95+% of the office software market between them.
I do wish that weren't the case, but my personal experience tells me it is. I wish it were more common, and also that there was a virtuous cycle where more Mac users made it get more attention, and more attention made it feel more like a "Mac-assed Mac app", and feeling more like a Mac-assed Mac app got it more users, etc. I just don't see that playing out.
For the rest - yes.
This one misses the point entirely, I'm sorry to say. Microsoft 365's "lunch" is that a majority of US businesses, schools, and governments are reliant on 365 for anything in their organization to function.
Basically, a substantial share of (non-software) engineering and financial work is done in Microsoft's proprietary formats, occasionally involving VBA.
Many businesses cannot even pay salaries without macro-ridden Excel documents.
With 365, Microsoft has an even stronger moat: cloud-integrated co-editing using desktop apps. No browser app will exceed the C++/C# Office apps running directly on the PC. Not even proprietary competitors have an equivalent experience.
On top of that, add Azure, SharePoint, etc. All big companies, without exception, use those and put a significant portion of their business knowledge on Microsoft platforms.
US can literally kill Europe by just forcing Microsoft to shutdown its operations. It is fucking scary.
Both PowerPoint and Excel are well ahead of the competition.
Aren’t most games built on Windows and for Windows?
If you haven't seen it already, Codeberg is seeking donations here: <https://docs.codeberg.org/improving-codeberg/donate/>. A good way to support a product you like rather than becoming the product yourself.
It’s a great hobby app, and the Forgejo software seems well assembled, but Codeberg needs to be a bit more forthcoming about its capacity before more large projects move over.
I want to see larger projects like Gentoo migrate, but everything will come grinding to a halt if codeberg doesn’t come up with scalable resourcing (money & capacity)
In my opinion, for a Git forge, switching to any other main source of income would almost certainly lead to some form of enshittification.
Surviving on online donations is extremely rare. In their case, they are not getting that many impressions, so $1-$5 donations won't cover their expenses. Think of how expensive even a single customer like Gentoo would be if they fully migrated -- how many Gentoo contributors would have to contribute money to cover Gentoo's hosting.
I'm sure they have a sales pipeline planned, and similar to Facebook, they don't want to kill the buzz with sales before they build their audience.
- It's slow for git command-line tasks: while the site UX is much faster, git operations are really slow compared to GitHub.
- It doesn't have full feature parity with GitHub Actions. Their CI doesn't run a full pkgcheck, I guess, so it's still safer for a new Gentoo contributor to submit PRs to GitHub until that gets addressed.
I stand corrected (thanks Sam): This was previously the case before they made the announcement, it’s fully working now. Feel free to contribute on Codeberg.
Honestly I don't understand all the GitHub hate recently. Honestly seems like a fashionable trend. Virtue signaling.
There was a decade where they barely innovated. Maybe people forgot about that? Or maybe they are too young to remember? I'll gladly take all the advances over the past 8-ish years for the odd issue they have. GH actions has evolved A LOT and I'm a heavy Copilot user at the org/enterprise level..
Overall, though, it's ... fine. That's all. A little worse than it used to be, which is frustrating, but certainly nowhere near unusable. I stood up my own forge and mirror some repos to it. The performance is almost comically better. I know it's not a fair comparison: I have only one user. On the other hand, I'm on a 9-year-old Xeon located geographically farther from me than GitHub's servers.
The alliance any up-and-comers can make with the ecosystem is to develop more of what they host as open source. In return for starting much closer to the finish line, we only ask that they also bring the line closer for those that come after them.
That's a bit of an indirect idea for today's Joe Internet. Joe Internet is going to hold out waiting for such services to be offered entirely for free, by a magical Github competitor who exists purely to serve in the public interest. Ah yes, Joe Internet means government-funded, but of course government solutions are not solutions for narrow-interest problems like "host my code" that affect only a tiny minority. And so Joe Internet will be waiting for quite some time.
Personally I wouldn't mind paying for access but I doubt there is a critical mass of users that can be weaned off of free access. Competing with free networks is hard. Codeberg, as far as I can tell, basically has a donation model where you can volunteer to pay and be a "member", but 0.5% of users choose that option, that is, they made a one time payment of 10 euros. That's enough to fund how many months of bandwidth and a couple of recycled servers. For cloud infrastructure standards are pretty high, you want replication, backup, anti-DDOS, monitoring, etc. All of that costs money. It would also help if they made it easier to donate with a paypal link instead of a SEPA QR code that requires an international bank transfer.
Well we could imagine where users give part of their laptop to join a pool of workers for builds.
If there are 3 builds that achieve the same output, one of them "wins" and is chosen.
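That selection rule could be as simple as a majority vote over the artifact hashes the workers report. A toy sketch (the "reported hashes" file is a stand-in for real worker results):

```shell
set -e
tmp=$(mktemp -d)
# Three workers report the content hash of the artifact they built;
# two honest workers agree, one outlier differs:
printf '%s\n' abc123 abc123 deadbf > "$tmp/reported"
# Count agreeing reports and accept the digest only if it reaches quorum (2):
winner=$(sort "$tmp/reported" | uniq -c | sort -rn | head -1)
count=$(echo "$winner" | awk '{print $1}')
digest=$(echo "$winner" | awk '{print $2}')
if [ "$count" -ge 2 ]; then
    echo "accepted: $digest"    # prints "accepted: abc123"
else
    echo "no quorum" && exit 1
fi
```

A real system would also need to make the builds reproducible in the first place, since otherwise honest workers produce differing hashes and quorum is never reached.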
Not a wikipedia banner. No guilt verbiage. No unrelatable total site/year numbers like "2.6M out of 5M goal" etc.
Just like some little bit of UI in a corner somewhere that passively just sits there and shows its state, like a red/yellow/green light or a battery meter or something. And what it shows is some at-a-glance representation of what you are costing the service, positive or negative.
If the org is open and low profit or even non profit, or even reasonable profit but organized as a co-op, this can be a totally honest number, which will probably be surprisingly small.
(and if any full-profit type services don't like having that kind of info made quite so public because it makes it hard to explain their own prices, well golly that sure sounds awful)
This will obviously have no effect on some people.
But I know that something like that will absolutely eat at some people until they decide they will feel better if they make that dot turn green.
And everyone else who just wants to take something for free and doesn't like being reminded of it, has no basis for complaining or claiming to be outraged at being nagged or browbeaten. It's a totally passive out of the way bit of display making no demands at all and not even hindering or speedbumping anything.
Even when you click on it for more info and the links to how to donate etc, the verbiage is careful not to make kids or drive-by laypeople or anyone else without real means feel bad or feel obligated. We don't need your soup money, don't sweat it.
Maybe even include some stories about how we all wound up in our high paying IT jobs because of the availability of stuff other people wrote and let us use for free when we were kids or former truck drivers etc, and so that's how you can understand and believe we really are ok with you now using this for free.
Can't possibly get any lighter touch than that.
And yet the fact that the little thing is just there all the time in view, that alone will make it like a voluntary itch that if you know you can afford it, you should make that light green. It's like a totally wholesome use of gamification psychology.
I guess it will also have to somehow show not just what you cost yourself, but also what all the non-paying users are costing and what your fraction of that would be to cover those. At least some payers would need to pay significantly more than what they cost.
But I'd be real curious to see just how bad that skew is after a while if a lot of individuals do end up paying at least for themselves, where today most of them pay nothing.
That may make the need for whales much reduced and really no whales, just a bunch that only pay like twice what they cost. Or even less, a heavy user that costs more might be able to totally cover the entire cost of 10 other light users with only 10% more than their own cost. It could eventually smooth out to being no real burden at all even for the biggest payers.
That's getting to be a bit much info to display all in a single colored dot or something without text or some complicated graphic, but I think this much could be shown and still be simple and elegant. Even a simple dot can have several dimensions all at once: size, hue, saturation, brightness, let alone any more detail like an outline or a more complex shape.
About the only thing I can see that is a bad thing is I bet this is a recipe for unfairly taxing women more than men. You just know that far more women will make that light green even if it's not easy, and far more men will happily let it ride forever even though they could afford it effortlessly, just to spend that $3 on a half of a coffee instead.
I REALLY recommend it
Codeberg does suffer from the occasional DDOS attack—it doesn't have the resources that GH has to mitigate these sorts of things. Also, if you're across the pond, then latency can be a bit of an issue. That said, the pages are lighter weight, and on stable but low-bandwidth connections, Codeberg loads really quickly and all the regular operations are super zippy.
I don't think that it would take great contortions to implement a HTML + JS frontend that's an order of magnitude faster than the current crapola, but in practice it... just doesn't seem to happen.
I hear you and you're right that Codeberg has some struggles. If anyone needs to host critical infra, you're better off self-hosting a Forgejo instance. For personal stuff? Codeberg is more than good enough.
If you familiarize yourself with the way people use it for Linux kernel development, you'll see that it doesn't have to be this way.
I have not used Codeberg that much myself. I have known about it, but the UI is a bit ... scary. Gitlab also has a horrible UI. It is soooo strange that github is the only one that got UI right. Why can't the others learn from KEEPING THINGS SIMPLE?
I now run a local Gitea. It's rather more performant, and uptime is rather better too!
I have no idea why on earth I even considered using GH in the first place. Laziness I suppose.