> [2025-11-27T02:10:07Z] it’s abundantly clear that the talented folks who used to work on the product have moved on to bigger and better things, with the remaining losers eager to inflict some kind of bloated, buggy JavaScript framework on us in the name of progress [1]
> [2025-11-27T14:04:47Z] it’s abundantly clear that the talented folks who used to work on the product have moved on to bigger and better things, with the remaining rookies eager to inflict some kind of bloated, buggy JavaScript framework on us in the name of progress [2]
> [2025-11-28T09:21:12Z] it’s abundantly clear that the engineering excellence that created GitHub’s success is no longer driving it [3]
---
1: https://web.archive.org/web/20251127021007/https://ziglang.o...
2: https://web.archive.org/web/20251127140447/https://ziglang.o...
3: https://web.archive.org/web/20251128092112/https://ziglang.o...
It would appear they listened to that feedback, swallowed their ego/pride and did what was best for the Zig community with these edits. I commend them for their actions in doing what's best for the community at the cost of some personal mea culpa edits.
If everyone is always bad regardless of whether they're trying to change, what incentive would they have to change at all? It doesn't make any sense.
The reason the latter stance is often popularized and cheered is that it is often harder to do, especially in adverse conditions, when not changing your opinion has a direct cost in money, time, sanity, or in rare cases even freedom. Usually it involves a small group or an individual against a faceless corporation, making it even harder. Of course we should respect people standing up to a corporation.
PS: this is not applicable if they are "clearly wrong" of course.
If no one hates what you are doing, chances are you're not really doing anything.
Because this plays into a weird flaw in cognition that people have. When people become leaders because they are assholes and they are wrong, then after the wind blows the other way they see the light and do a mea culpa, there is always a certain segment that says that they're even more worthy to be a leader because they have the ability to change. They yell at the people who were always right that they are dogmatic and ask "why should people change their minds if they will be treated like this?"
If one can't see what's wrong with this toy scenario that I've strawmanned here, that's a problem. The only reason we ever cared about this person is because they were loud and wrong about everything. Now, we are expected to be proud of them because they are right, and make sure that they don't lose any status or position for admitting that. This becomes a new reason for the people who were previously attacking the people who were right to continue to attack the people who were right, who are also now officially dogmatic puritans whose problem is that they weren't being right correctly.
This is a social phenomenon, not a personality flaw in these leaders. People can be wrong and then right. People can not care either way and latch onto a trend for attention or profit, and follow it where it goes. I don't think either of these things are in and of themselves morally problematic. The problem is that there are people who are simply following individual personalities and repeating what they say, change their minds when that personality changes their mind, and whose primary aim is to attack anyone who is criticizing that personality. They don't really care about the issue in question (and usually don't know much about it), they're simply protecting that personality like a family member.
This, again, doesn't matter when the subject is stupid, like some aesthetic or consumer thing. He used to hate the new Batman movies but now he says that he misunderstood them; who cares. But when the subject is a real life-or-death thing, or involves serious damage to people's lives and careers, it's poisonous when a vocal minority becomes dedicated to this personality worship.
It's so common that there now seems to be a pipeline of born-agains in front of everything, giving their opinion. Sir, you were a satanist until three years ago.
(He did post a kind of vague apology in https://ziggit.dev/t/migrating-from-github-to-codeberg-zig-p....)
If they would own up to it and say sorry, then your point stands. But that's not what happened here.
Indeed. The article even links to it.
https://ziggit.dev/t/migrating-from-github-to-codeberg-zig-p...
https://news.ycombinator.com/item?id=46114083
"I think that writing style is more LinkedIn than LLM, the style of people who might get slapped down if they wrote something individual.
Much of the world has agreed to sound like machines."
E.g., metaphors that make no sense or fail to contribute any meaningful insight, or extremely cliched phrases ("it was a dark and stormy night...") used seriously rather than for self-deprecating humor.
My favorite example of an AI tell was a YouTube video about serial killers I was listening to for background noise, which started one of its sentences with "but what at first seemed to be an innocent night of harmless serial murder quickly turned to something sinister."
It's just much easier now for "laypeople" to also adjust their style to this. My prediction is people will get quickly tired of it (as evidenced by your comment)
I was saddened to see how they ganged up to bully the author of the Zig book. The book author, as far as I could tell, seems like a possibly immature teenager. But to have a whole community gang up on you with pitchforks because they have a suspicion you might use AI... that was gross to watch.
I was already turned off by the constant Zig spam approach to marketing. But now that we're getting pitchfork mobs and ranty anti-AI diatribes it just seems like a community sustaining itself on negative energy. I think they can possibly still turn it around but it might involve cleaning house or instituting better rules for contributors.
I do think that it was weird to focus on the AI aspect so much. AI is going to pollute everything going forward whether you like it or not. And honestly, who cares; either it is a good resource for learning or it's not. You have to decide that for yourself and not based on whether AI helped write it.
However I think some of the critique was because he stole the code for the interactive editor and claimed he made it himself, which of course you shouldn’t do.
Does that count as substantial? I'm not sure because I'm not a lawyer, but this was really an issue about definitions in an attribution clause over less code than people regularly copy from stack overflow without a second thought. By the time this accusation was made, the Zigbook author was already under attack from the community which put them in a defensive posture.
Now, just to be clear, I think the book author behaved poorly in response. But the internet is full of young software engineers who would behave poorly if they wrote a book for a community and the community turned around and vilified them for it. I try not to judge individuals by the way they behave on their worst days. But I do think something like a community has a behavior and culture of its own and that does need to be guided with intention.
Without including proper credit, it is classic infringement. I wouldn't personally call copyright infringement "theft", though.
Imagine for a moment, the generosity of the MIT license: 'you can pretty much do anything you want with this code, I gift it to the world, all you have to do is give proper credit'. And so you read that, and take and take and take, and can't even give credit.
> Now, just to be clear, I think the book author behaved poorly in response
Precisely: maybe it was just a mistake? So, the author politely and professionally asks, not for the infringer to stop using the author's code, but just to give proper credit. And hey, here's a PR, so doing the right thing just requires an approval!
The infringer's response to the offer of help seemed to confirm that this was not a mistake, but rather someone acting in bad faith. IMO, people should learn early on in their life to say "I was wrong, I'm sorry, I'll make it right, it won't happen again". Say that when you're wrong, and the respect floods in.
> By the time this accusation was made, the Zigbook author was already under attack
This is not quite accurate, from my recollection of events (which could be mistaken!) The community didn't even know about it until after the author respectfully, directly contacted the infringer with an offer to help, and the infringer responded with hostility and what looked like a case of Oppositional Defiant Disorder.
> More importantly, Actions is created by monkeys ...
vs
> Most importantly, Actions has inexcusable bugs ...
I commend the author for correcting their mistakes. However, IMHO, an acknowledgement instead of just a silent edit would have been better.
Anyway, each to their own, and I'm happy for the Zig community.
we need less self-censorship, not more.
Linux seems to be doing fine.
I wouldn't personally care either way but it is non-obvious to me that the first version would actually hurt the community.
In this case, the unnecessary insults detract from the otherwise important message, and reflect poorly on Zig. They were right to edit it.
Not at all, but this reads like childishness rather than political correctness.
The Zig comments come off as highly immature, maybe because they are comments made to unknown people; calling folks losers or monkeys just crosses a line for me. Telling someone to STFU is not great, but calling groups of people monkeys feels worse.
Eager to do what? If it sucks it sucks, but that's a very childish way to frame it, no one did anything on purpose or out of spite. That kind of silliness hurts the image of the project. But bad translation I suppose.
I've spent more than a month trying to delete my account on GitHub, still couldn't do it
Product is useless, you move along. Save your compassion for those actually needing it.
> how hard it is for top performers to make change
then you're not a top performer anymore?
seems pretty straightforward
> they must be stupid
one can be not stupid and still not competent
Because that one person is the judge of the "terribleness" that supposedly entitles them to flame, changes to that person influence their ability to make that assessment objectively.
Say, when their project becomes popular, they gain more power and fame, and suddenly their self-image is different.
Hence it usually being a more community-encouraging approach to keep discussions technical without vitriol.
Flaming is unnecessarily disruptive, not least because it gives other (probably not as talented) folks a license to also put their worst impulses to text.
My argument against how he handles things has always been that while it may seem effective, we do not know how much more effective he would be if he did not curse people out for being dumb fucks.
And it doesn’t seem like this is a requirement for the job: lots of other project leaders treat others with courtesy and respect and it doesn’t seem to cause issues.
The reality is that it is easy to wish more people were verbally abusive to others when it isn't directed at you. But as soon as you are on the receiving end of it, especially as a volunteer, there is a better-than-even chance that you will be less likely to want to continue contributing.
> This week people in our community confronted me about my lifetime of not understanding emotions. My flippant attacks in emails have been both unprofessional and uncalled for.
> Especially at times when I made it personal. In my quest for a better patch, this made sense to me. I know now this was not OK and I am truly sorry. The above is basically a long-winded way to get to the somewhat painful personal admission that hey, I need to change some of my behavior, and I want to apologize to the people that my personal behavior hurt and possibly drove away from kernel development entirely.
> I am going to take time off and get some assistance on how to understand people's emotions and respond appropriately.
He took time off and he’s better now. What you call “political correctness” is what I and others call “basic professionalism”. It took Linus 25 years to understand that. I can only hope that the people who hero worshipped him and adopted a similar attitude can also mature.
https://lore.kernel.org/lkml/CAHk-=wjLCqUUWd8DzG+xsOn-yVL0Q=...
The normalization, in fact, has been quite successful. The whole of Silicon Valley has tacitly approved of it.
You act like people aren't being rewarded for this type of behavior.
[0] https://www.whitehouse.gov/articles/2025/03/yes-biden-spent-...
Especially since it was created just by hammering people with repeated exposure to biased media over ~5 years.
If someone would take a beat, even from that biased copy, they might think that studying the effects of hormone treatment in animal models would be scientifically productive, regardless of how one feels about human transgender rights.
All in all, that's why developers like it so much. The obsession with AI makes me nervous, but for me, an average developer, the advantages still outweigh the drawbacks. For now.
Things like number of stars on a repository, number of forks, number of issues answered, number of followers for an account. All these things are powerful indicators of quality, and like it or not are now part of modern software engineering. Developers are more likely to use a repo that has more stars than its alternatives.
I know that the code should speak for itself and one should audit their dependencies and not depend on Github stars, but in practice this is not what happens, we rely on the community.
I have no idea what the parent comment is talking about a "well-formed CI system." GitHub Actions is easily the worst CI tool I've ever used. There are no core features of GitHub that haven't been replicated by GitLab at this point, and in my estimation GitLab did all of it better. But, if I put something on GitLab, nobody sees it.
As for me, this does not negate the convenient things that I originally wrote about.
The previous popular free code host was SourceForge, which eventually entered what's now called its "enshittification" phase. GitHub was simply in the right place at the right time to replace SourceForge, and the rest is history.
1. Free hosting with decent UX
2. Social features
3. Lifecycle automation features
In this vein, it doing new stuff with AI isn't out of keeping with its development path, but I do think they need to pick a lane and decide if they want to boost professional developer productivity or be a platform for vibe coding. And probably, if the latter, fork that off into a different platform with a new name. (Microsoft loves naming things! Call it 'Codespaces 365 Live!')
And for those who don't remember SourceForge, it had two major problems in DevEx: first, you couldn't just get your open source project published. It had to be approved. And once it was, you had an ugly URL. GitHub had pretty URLs.
I remember putting up my very first open source project back before GitHub and going through this huge checklist of what a good open source project must have. Then seeing that people just tossed code onto GitHub as is: no man pages, no or little documentation, build instructions that resulted in errors, no curated changelog, and realizing that things are changing.
But VCS has always been a standard-preferring space, because its primary point is collaboration, so using something different creates a lot of pain.
And the good ship SS Linux Kernel was a lot of mass for any non-git solution to compete with.
I hate that this is perceived as generally true. Stars can be farmed and gamed; and the value of a star does not decay over time. Issues can be automatically closed, or answered with a non-response and closed. Numbers of followers is a networking/platform thing (flag your significance by following people with significant follower numbers).
> Developers are more likely to use a repo that has more stars than its alternatives.
If anything, star numbers reflect first mover advantage rather than code quality. People choosing which one of a number of competing packages to use in their product should consider a lot more than just the star number. Sadly, time pressures on decision makers (and their assumptions) means that detailed consideration rarely happens and star count remains the major factor in choosing whether to include a repo in a project.
The metrics you want are mostly ones they don't and can't have. Number of dependent projects for instance.
The metrics they keep are just, as people have said, a way to gamify things and keep people interested.
All these things are a proxy for popularity, and that is a valuable metric. I have seen projects with amazing code quality, but if they are not maintained, eventually they stop working due to updates to dependencies, external APIs, runtime environment, etc. And I have seen projects with meh code quality but so popular that every quirk and weird issue had a known workaround. Take ffmpeg for example: its code is... arcane. But would you choose a random video transcoder written in JavaScript, last updated in 2012, just because of its beautiful code?
??? Seriously?
> All these things are powerful indicators of quality
Not in my experience....
People don't just share their stargazing plots "for fun", but because it has meaning for them.
Hahahahahahahahahahahaha...
Having used Gerrit 10 years ago, there's nothing about GitHub's PRs that I like more, even today.
> code navigation simply in a web browser
this is nice indeed, true.
> You write code, and almost everything works effortlessly.
if only. GHA are a hot mess because somehow we've landed in a local minimum of pretend-YAML-but-actually-shell-js-jinja-python and they have a smaller or bigger outage every other week, for years now.
> why developers like it so much
Most everything else is much worse in at least one area, and the most important thing is that it's what everyone uses. No one got fired for using GitHub.
I used Gerrit years ago, so I wasn't totally unfamiliar with it, but it was still awkward to use when Go was using it for PRs. Notably, that project ended up giving up on it because of the friction for users - and they were probably one of the projects most likely to stick to their guns and use something unusual.
That's not accurate. They more or less only use Gerrit still. They started accepting Github PRs, but not really, see https://go.dev/doc/contribute#sending_a_change_github
> You will need a Gerrit account to respond to your reviewers, including to mark feedback as 'Done' if implemented as suggested
The comments are still gerrit, you really shouldn't use Github.
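For anyone who hasn't touched Gerrit: the core of the flow is just pushing to a magic ref, which Go wraps in its git-codereview tool. A minimal sketch of the generic Gerrit convention (the target branch name is whatever the project uses):

$ git push origin HEAD:refs/for/master   # each pushed commit becomes (or, via its Change-Id footer, updates) a Gerrit change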
The Go reviewers are also more likely than usual to assume you're incompetent if your PR comes from Github, and the review will accordingly be slower and more likely to be rejected, and none of the go core contributors use the weird github PR flow.
I've always done it that way, and never got that feeling.
A competent developer would be more likely to send a PR using the tool with zero friction than to dedicate a few additional hours of his life to creating an account and figuring out how to use some obscure tool.
Most likely; dedication says little about competence, and vice versa. But if you do not want to use the tools available to get something done, and would rather not do the task at all, what does that say about your competence?
I'm not in a position to know or judge this, but I could see how dedication could be a useful proxy for the expected quality of a PR and the interaction that will go with it, which could be useful for popular open source projects. Not saying that's necessarily true, just that it's worth considering that some maintainers might have anecdotal experiences along those lines.
I'm not saying a competent developer should be proficient in using Gerrit, but they should know that it isn't an obscure tool - it's a Google-sponsored project handling millions of lines of code inside Google and externally. It's like calling golang an obscure language when all you ever did is Java or TypeScript.
Is there some kind of Google-centrism at work here? Most devs don’t work at Google or contribute to Google projects, so there is no reason for them to know anything about Gerrit.
Most devs have never worked on Solaris, but if I ask you about solaris and you don't even know what it is, that's a bad sign for how competent a developer you are.
Most devs have never used prolog or haskell or smalltalk seriously, but if they don't know what they are, that means they don't have curiosity about programming language paradigms, and that's a bad sign.
Most competent professional developers do code review and will run into issues with their code review tooling, and so they'll have some curiosity and look into what's out there.
There's no reason for most developers to know random trivia outside of their area of expertise "what compression format does png use by default", but text editors and code review software are fundamental developer tools, so fundamental that every competent developer I know has enough curiosity to know what's out there. Same for programming languages, shells, and operating systems.
codeberg supports logging in with GitHub accounts, and the PR interface is exactly the same
you have nothing new to learn!
I love patch stack review systems. I understand why they're not more popular, they can be a bit harder to understand and more work to craft, but it's just a wonderful experience once you get them. Making my reviews work in phabricator made my patchsets in general so much better, and making my patchsets better have improved my communication skills.
Man :| no. I genuinely understand the convenience of using Actions, but it's a horrible product.
Am I missing a truly better alternative or CI systems simply are all kind of a pita?
Is this not what you want?
https://docs.github.com/en/actions/how-tos/write-workflows/c...
> You just have github-runner-1 user and you need to manually check out repository, do your build and clean up after you're done with it. Very dirty and unpredictable. That's for self-hosted runner.
Yeah, checking out every time is a slight papercut I guess, but it gives you control, since sometimes you don't need to check anything out, or you want a shallow/full clone. If it checked out for you, then there would be other papercuts.
I use their runners, so I never need to do any cleanup and get a fresh slate every time.
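For the self-hosted case, the manual dance being described is roughly the following sketch (the repo URL, directory, and build command are placeholders):

$ rm -rf work && git clone --depth 1 https://example.com/org/repo.git work   # shallow clone; drop --depth 1 for full history
$ (cd work && ./build.sh)                                                    # whatever your actual build step is
$ rm -rf work                                                                # cleanup is on you, unlike hosted runners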
It feels to me like people have become way too reliant on this (in particular, forcing things into CI that could easily be done locally) and too trusting of those runners (ISTR some reports of malware).
>In addition, it is best to use code navigation simply in a web browser.
I've always found their navigation quite clunky and glitchy.
How do you define "code navigation"? It might've got a bit easier with automatic highlighting of selected symbols, but in return source code viewer got way too laggy and, for a couple of years now, it has this weird bug with misplaced cursors if code is scrolled horizontally. I actually find myself using the "raw" button more and more often, or cloning repo even for some quick ad-hoc lookups.
Edit: not to mention the blame view that actively fights with browser's built in search functionality.
They have zero fees for individuals too which is amazing. Thanks to it I gained my first sponsor when one of my projects was posted here. Made me wish sponsorships could pay the bills.
IMHO the vanilla Github UI sucks for code browsing since it's incredibly slow, and the search is also useless (the integrated web-vscode works much better - e.g. press '.' inside a Github project).
> as well as a well-formed CI system with many developed actions and free runners
The only good thing about the Github CI system are the free runners (including free Mac runners), for everything else it's objectively worse than the alternatives (like Gitlab CI).
Having used Forgejo with AGit now, IMO the PR experience on GitHub is not great when trying to contribute to a new project. It's just unnecessarily convoluted.
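For context, the AGit flow mentioned above lets you open a PR on a Forgejo instance straight from a push, no fork needed; a sketch (branch and topic names are placeholders):

$ git push origin HEAD:refs/for/main -o topic=my-fix   # creates or updates a pull request targeting the "main" branch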
GitHub's evolution as a good open source hosting platform stalled many years ago. Its advantages are its social network effects, not as technical infrastructure.
But from a technology and UX POV it's got growing issues because of this emphasis, and that's why the Zig people have moved, from what I can see.
I moved my projects (https://codeberg.org/timbran/) recently and have been so far impressed enough. Beyond ideological alignment (free software, distaste for Microsoft, want to get my stuff off US infrastructure [elbows up], etc.) the two chief advantages are that I could create my own "organization" without shelling over cash, and run my own actions with my own machines.
And I haven't noticed any drop in engagement or in new people noticing the project since moving. GitHub "stars" are a shite way of measuring project success.
Forgejo that's behind Codeberg is similar enough to GitHub that most people will barely notice anyways.
I'm personally not a fan of the code review tools in any of them (GitLab, Forgejo, or GitHub) because they don't support proper tracking of review commits like e.g. Gerrit does, but oh well. At least Forgejo / Codeberg are open to community contribution.
That's not a Victorinox you're looking at, it's a cheap poorly made enshittified clone using a decades old playbook (e-e-e).
The focus on "Sponsorship buttons" and feature instead of fixing is just a waste of my time.
Nov 22, 2025 https://blog.codeberg.org/letter-from-codeberg-onwards-and-u...
Quotes from their website:
Infrastructure status [...] We are running on 3 servers, one Gigabyte and 2 Dell servers (R730 and R740).
Here's their current hardware: https://codeberg.org/Codeberg-Infrastructure/meta/src/branch...
[...] Although aged, the performance (and even energy efficiency) is often not much worse than with new hardware that we could afford. In the interest of saving embodied carbon emissions from hardware manufacturing, we believe that used hardware is the more sustainable path.
[...] We are investigating how broken Apple laptops could be repurposed into CI runners. After all, automated CI usage doesn't depend on the same factors that human beings depend on when using a computer (functioning screen, speakers, keyboard, battery, etc.). If you own a broken M1/M2 device or know someone who does, and believe that it is not worth a conventional repair, we would be happy to receive your hardware donation and give it a try!
[...] While it usually holds up nicely, we see sudden drop in performance every few days. It can usually be "fixed" with a simple restart of Forgejo to clear the backlog of queries.
Gives both early-Google as well as hackerspace vibes, which can or can not be a good thing.
Their reliability is not great unfortunately. Currently their 24h uptime is 89% for the main site. They are partially degraded right now.
The 14 day uptime is 98% but I think that’s actually because some of their auxiliary systems have great uptime, the main site is never that great it seems.
I would even consider that moving everything from one single point of failure to another is not the brightest move.
Github does offer a self hosted product: GitHub Enterprise Server
It is easy to administer even for 15k users, and mostly it takes care of itself if you give it enough RAM and CPU for all the activity.
Downloading the virtual hard drive image from GitHub is easy and decrypting the code inside is borderline trivial, but I'm not going to help anyone do that. I've never had a need to do it.
As a server product it is good. I recommend it if you can afford it. It is not intended for private individuals or non-profits, though. It's for corporations who want their code on-premise, and for that it is quite good.
1. use git or jj
2. pull-request like data lives on the network
3. They have a UI, but anyone can also build one and the ecosystem is shared
I've been considering Gerrit for git-codereview, and tangled will be interesting when private data / repos are a thing. Not trying to have multiple git hosts while I wait
There are a few things that keep me on Gitlab, but the main one is the quality of the CI/CD system and the gitlab runners.
I looked at Woodpecker, but it seems so docker-centric and we are, uh, not.
The other big gulf is issues and issue management. Gitlab CE is terrible; weird limitations (no epics unless you pay), broken features, UX nightmares, but from the looks of it Forgejo is even more lacking in this area? Despite this seeming disdain, the other feature we regularly use is referencing issue numbers in commits to tie work together easily. On this one, I can see the answer as "be the change - contribute this to Forgejo" and I'm certainly willing. Still, it's currently a blocker.
But my hopes in putting this comment out there is that perhaps others have suggestions or insight I'm missing?
I've been self-hosting it for a few years now and can definitely recommend. It has been very reliable. I even have a runner running. Full tutorial at https://huijzer.xyz/posts/55/installing-forgejo-with-a-separ....
I believe the correct question is "Why are they getting DDoSed this much if they are not something important?"
For anyone who wants to follow: https://social.anoxinon.de/@Codeberg
Even their status page is under attack. Sorry for my French, but WTF?
> in the past 48 hours, code.qt.io has been under a persistent DDoS attack. The attackers utilize a highly distributed network of IP addresses, attempting to obstruct services and network bandwidth.
https://lists.qt-project.org/pipermail/development/2025-Nove...
- the internet's a lot bigger nowadays
- there are a lot of crappily secured iot devices
- the average household internet connection has gotten a lot faster, especially on upload bandwidth.
- there's a pile of amplification techniques which can multiply the bandwidth of an attack by using poorly-configured services.
An issue with comments, linked to a PR with review comments, the commit stack implementing the feature, and further commits addressing comments is probably valuable data to train a coding agent.
Serving all that data is not just a matter of cloning the repo. It means hitting their (public, documented) API endpoints, which are likely more costly to run.
And if they rate limit the scrapers, the unscrupulous bunch will start spreading requests across the whole internet.
I think it's not malice, but stupidity. IoT made even a script kiddie capable of running a huge botnet that can DDoS anything but Cloudflare.
That's not how threat analysis works. That's a conspiracy theory. You need to consider the difficulty of achieving it.
Otherwise I could start speculating which large NAS provider is trying to DDoS me, when in fact it's a script kiddie.
As for who would have the most incentives? Unscrupulous AI scrapers. Every unprotected site experiences a flood of AI scrapers/bots.
Story time:
I remember that back in the day I had a domain name for a pretty hot keyword with a great, organic position in Google rankings. Then one day it all of a sudden got a serious boost from black-hat SEO, with a bazillion links from all kinds of unrelated websites. My domain got penalized and dropped off the front page.
For each potential adversary, you list the risk strategy; that's threat analysis 101.
E.g. you have a locked door, some valuables, and your opponent is a state-level actor. Risk strategy: ignore; no door you can afford will be able to stop a state-level actor.
Just research the Office formats' ISO standardization process.
I'm not insinuating Microsoft will buy Codeberg, but I just wanted to say that they are not strangers to the process itself.
I don't think your comparison works out.
That said, I believe my comparison checks out. Having ~800 members is a useful moat and will deter actors from harming Codeberg.
OTOH, the mechanism can still theoretically work. Of course Microsoft won't try something that blatant, but if the e.V loses this moat, there are mechanisms which Microsoft can and would like to use as Codeberg gets more popular.
which is basically the same thing
Sure, they could try to bribe the Codeberg e.V. active members into changing its mission or disbanding the association entirely, but they would need to get a 2/3 majority at a general assembly while only the people actively involved in the e.V. and/or one of its projects can get voting rights. I find that highly unlikely to succeed.
Now, none of the servers I run have public SSH ports anymore. This is also why I don't expose home servers to the internet. I don't want that chaos at my doorstep.
To avoid needing SSH just send your logs and metrics out and do something to autodeploy securely then you rarely need to be in. Or use k8s :)
Kubernetes for personal infrastructure is akin to getting an aircraft carrier for fishing trips.
For simple systems snapshots and backups are good enough. If you're managing a thousand machine fleet, then things are of course different.
I manage both so, I don't yearn to use big-stack-software on my small hosts. :D
No, it's just opsec.
> Sure, scanners will keep pinging it, but nobody is ever going to burn an ssh 0day on your home server.
I wouldn't be so sure about it, considering the things I have seen.
I'd better be safe than sorry. You can expose your SSH if you prefer to do so. Just don't connect your server to my network.
1. Never tell everything you know and seen.
2.
For what I do, you can refer to my profile. All my services are always exposed for convenience, but never on a standard port (except HTTP).
After managing a fleet for a long time, I'd never do that. Tailscale or any other VPN is mandatory for me to be able to access "login" ports.
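One way to implement that, as a sketch assuming ufw and a Tailscale interface named tailscale0 (adapt to whatever firewall and VPN you actually run):

$ sudo ufw allow in on tailscale0 to any port 22 proto tcp   # SSH allowed only on the VPN interface
$ sudo ufw deny 22/tcp                                        # and dropped from everywhere else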
Now that zig is fairly well known and trusted, it makes sense that this is less of a concern for them when migrating away.
At any rate, the feed is still available and you can reach it via browser autocomplete. I open GitHub by typing "not" in my URL bar and landing on the notifications page.
The people at Zig should use proper CI tools and not something that a big service provider is offering as an afterthought.
One of the nice things about switching to Forgejo Actions is that the runner is lightweight, fast, and reliable - none of which I can say for the GitHub Actions runner. But even then, it's still more bloated than we'd ideally like; we don't need all the complexity of the YAML workflow syntax and Node.js-based actions. It'd also be cool for the CI system to integrate with https://codeberg.org/mlugg/robust-jobserver which the Zig compiler and build system will soon start speaking.
So if anything, we're likely to just roll our own runner in the future and make it talk to the Forgejo Actions endpoints.
And there exist a lot of specialized solutions out there, where the business model is purely CI.
I guarantee that in ~24 months, most AI features will still remain in some form or another on most apps, but the marketing language of AI-first will have evaporated entirely.
Where have you been the last 15 years? However, I agree with your prediction. Coke making AI advertisements may have had cachet a couple of years ago, but now it would be a doofus move.
https://en.wikipedia.org/wiki/AI_winter
The early 2010s had a lot of neural-network AI stuff going on, and it certainly became a minor hype cycle as well, though that kind of resulted in the current LLM wave.
Night and day compared to something like Linear.
$ time curl -L 'https://codeberg.org/'
real 0m3.063s
user 0m0.060s
sys 0m0.044s
$ time curl -L 'https://github.com/'
real 0m1.357s
user 0m0.077s
sys 0m0.096s

Github
158 requests
15.56 MB (11.28 MB transferred)
Finish in 8.39s
Dom loaded in 2.46s
Load 6.95s
Codeberg
9 requests
1.94 MB (533.85 KB transferred)
Finish in 3.58s
Dom loaded in 3.21s
Load 3.31s

Here are my results for what it's worth
$ time curl -o /dev/null -s -L 'https://codeberg.org'
real 0m0.907s
user 0m0.027s
sys 0m0.009s
$ time curl -o /dev/null -s -L 'https://github.com/'
real 0m0.514s
user 0m0.028s
sys 0m0.016s

On Github any page loads gradually and you don't see a blank page even initially.
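If anyone wants to rerun these with less noise, curl's built-in timing variables make the comparison a bit cleaner; a sketch (run counts and URLs are just what the earlier posts used):

$ for url in https://codeberg.org/ https://github.com/; do for i in 1 2 3; do curl -o /dev/null -s -w "$url %{time_total}s\n" -L "$url"; done; done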
Compared to then this product is downright mature now. And also, there always were people at GitHub who delivered crappy products outside the core that most people working on FOSS got to see. Enterprise Cloud has a ton of sharp edges and things that make you ask "WHY" out loud. Notifications with SAML enabled are broken in countless ways (I have 0 out of 12 notifications right now), newly onboarded users are encouraged to click a "request copilot" button that sends emails to admins you can't disable, policy toggles that used to do one thing get split up and not defaulted properly. The last two in particular are just dark pattern hacks to get people to use Copilot. In an enterprise product.
I haven't used GHES, but I imagine it's worse.
Right now github is great for discovering and tracking projects, reflecting growth via the star and fork system (although a bit broken in the last few years).
If a federated layer is applied to these github alternatives, you could have an account in Codeberg, and be able to track lots of projects wherever people want to host them. Right now, I see a lot of forgejo servers, but I don't want to register in all of them.
The primary issue with SO was that it was disconnected from the actual communities maintaining the project. A federated solution would be able to have the same network effects while handing ownership to the original community (rather than a separate SO branch of the community)
Migrating the main Zig repository from GitHub to Codeberg - 883 comments
You can also run a Forgejo instance (the software that powers Codeberg) locally - it is just a single binary that takes a minute to set up - and set up a local mirror of your Codeberg repo with code, issues, etc. so you have access to your issues, wiki, etc. in Forgejo until Codeberg is back up (though you'll have to update them manually later).
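A minimal sketch of the local-instance part, assuming you've already downloaded the forgejo binary and are fine with its defaults (the port is arbitrary):

$ ./forgejo web --port 3000   # serves the web UI on http://localhost:3000; the mirror itself is then created via the repository migration page in the UI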
> it is just a single binary that takes a minute to setup - and setup a local mirror of your Codeberg repo with code, issues, etc so you have access to your issues, wiki, etc
is really cool! Having a local mirror also presumably gives you the means to build tools on top, to group and navigate and view them as best works for you, which could make that side of the process so much easier.
> you'll have to update them manually later
What does the manually part mean here? Just that you'll have to remember to do a `forgejo fetch` (or whatever equivalent) to sync it up?
It makes me sad to see that GitHub is now going through the same shit, and people are using other random half-assed alternatives; it's not easy to keep track of your favourite open-source projects across many source forges. We need someone to buy GitHub from Microsoft and remove all the crap they have added to it.
Most public forge instances and web presence for open source projects have RSS feeds.
https://github.com/torvalds/linux/pulls?q=is%3Apr+is%3Aclose...
Most open source projects talk about reducing GitHub dependency but never actually do it because the switching costs are brutal. Issues, PRs, CI integrations, contributor muscle memory - it all adds up. Codeberg is solid but the network effects aren't there yet.
Curious whether this pushes other projects to at least have contingency plans. The AI training concerns are real, but I suspect the bigger long-term risk is just platform enshittification in general - feature bloat, performance degradation, mandatory upsells.
done.
They didn't; that's poor wording on The Register's part. The pull request was closed for inactivity by a bot.
We had technical problems that GitHub had no interest in solving, and lots of small frustrations with the platform built up over years.
Jumping from one enshittified profit-driven platform to another profit-driven platform would just mean we'd set ourselves up for another enshittification -> migration cycle later down the line.
No stunt here.
https://ziglang.org/news/migrating-from-github-to-codeberg/
> Putting aside GitHub’s relationship with ICE
That was the extent of it. Six words.
Furthermore, this submission is an independent post, not from Zig, reporting on the original and adding more context.
> To use a great quote "either do or don't, but I got places to be".
What exactly is your complaint? The move had already been completed at the time of the original Zig post. They did do it.
There’s no incongruence between posts. The nature of your discontent or how it could possibly affect you isn’t clear in the slightest.
Like, reading and posting on Hacker News?
Every single application or webpage having its own AI integration is seriously one of the dumbest ideas in computing history (instead of separating applications and AI tools and let them work together via standardized interfaces).
The same effect is at play watching all the top-tier AI corps, still under heavy competitive fire, trying hard to keep the audience attached while battling to stay on top of (or keep up with) the competition. This mainly (for now) benefits the user. If OpenAI were to trailblaze on their own, we'd all be paying through the roof for even the most basic GPT by now.
"top-tier" is not a term I would use to describe Microsoft
Github is great. It barely changes at all and yet it's still too much for this originalist crowd.
Seriously though the big problem to solve will be squatters, when there are three logical places for a module to be hosted. That could create issues if you want to migrate.
I would rather have this happen after a contender to git has surfaced - something, for instance, with more project tracking built in, so migrations would be simpler.
I suspect Codeberg, which is focused on free software, will frown on them. They already disallow mirroring.
In which direction? (I'd check myself but they're down...). That doesn't sound very open to me.
From their FAQ:
> Why can't I mirror repositories from other code-hosting websites?
> Mirrors that pull content from other code hosting services were problematic for Codeberg. They ended up consuming a vast amount of resources (traffic, disk space) over time, as users that were experimenting with Codeberg would not delete those mirrors when leaving.
> A detailed explanation can be found in this blog post.[1]
[1]: https://blog.codeberg.org/mirror-repos-easily-created-consum...
This sounds a bit like an oxymoron. More diversity will only help the ecosystem IMHO.
I think it would've been far easier to build a decent GUI around that flow, with some email integration + a patch preview tool, rather than adding activitypub, but oh well.
Check out Sourcehut (https://sourcehut.org/). It uses a mailing list-based workflow so contributing code or bug reports is relatively effortless and doesn't require a Sourcehut account.
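Concretely, sending a patch to such a list is a one-off config plus git send-email; a sketch with a made-up list address (assumes your SMTP settings for git send-email are already configured):

$ git config sendemail.to "~someone/someproject-devel@lists.sr.ht"   # hypothetical list address for the project
$ git send-email -1                                                   # mails the latest commit as a patch, no forge account needed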
MS needs a stint of focusing on what users want rather than ramming stuff down their throats unasked and hoping they'll swallow.
This one is almost a one-line change (technically they need an extra flag in the YAML but that's hardly difficult): https://github.com/orgs/community/discussions/12882#discussi...
That said, I still think Github is fine, and you can't argue with free CI - especially on Windows/Mac. If they ever stop that I'll definitely consider Codeberg. Or if Codeberg gets support for stacked PRs (i.e. dependencies between PRs), then I'm there! So frustrating that Github doesn't support such an obvious workflow.
Not spending on maintenance and spending gobs on something many people don’t want is far worse. It says we have the money, we just don’t give a fuck.
I think this is the natural outcome of "chasing points" mechanic inside Microsoft.
It kind of does.
I used this a lot in several jobs to work on dependent tickets in advance. Just make another branch on top of the previous one (a PR to the other PR's branch).
People could review the child PR before the parent was merged. And it requires some less than trivial git knowledge to manage effectively, but nothing extraordinary. Any solution for stacked PRs based on git would also require it (or a custom tool).
I think I'm on their side on this one. From git perspective, it works just as I expect. Something else probably belongs to JIRA or project management instead.
You first send PR#1, then PR#2 on top of the first one.
The diff for PR#1 will show dough stuff. The diff for PR#2 will show toppings in relation to dough.
People can review them asynchronously. If you merge PR#1, PR#2 will automatically target main (that's where dough went) now.
In this arrangement, I usually cross-mention the PRs by number (a link will exist in both). I also like to keep the second one a draft, but that depends on the team's practices.
I don't understand why you would close the second PR when the first gets merged. It should lose the dependency automagically, which is exactly what happens if you branch correctly.
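In plain git, the flow described above is just branching off the first PR's branch (branch names follow the dough/toppings example):

$ git switch -c dough main        # work for PR#1
$ git push -u origin dough        # then open PR#1: dough -> main
$ git switch -c toppings dough    # work for PR#2 starts from PR#1's branch
$ git push -u origin toppings     # then open PR#2: toppings -> dough, so its diff shows only the toppings

Once PR#1 is merged and its branch deleted, GitHub retargets PR#2 at main automatically, though a rebase may be needed if PR#1 was squash-merged.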
MS in particular _really_ seems to be sacrificing itself on the altar of Roko's Basilisk; they appear totally incapable of doing _anything_ that isn't AI-branded anymore.
Bun is made with Zig, and they just got acquired by Anthropic.
Ghostty is another notable piece of software made with Zig.
I assume the Bun acquisition is fueling most of this Zig news. There are about 4 articles on the front page about Zig.
Edit: Scrolling comments I see something called Codeberg but why am I getting connection refused?
Another edit: Oh because Codeberg is down. I had to look at another thread on the frontpage to find that out...
The Zig attitude towards AI usage is a bit odd in my view. I don't think it's that widely shared. But good for them if they feel strongly about that.
I'm kind of intrigued by Codeberg. I had never heard of it until a few days ago, and it seems it's based in Berlin, where I live. I don't think I would want to use it for commercial projects, but it looks fine for open source things. Though I do have questions about the funding model: running all this on donations seems like it could have some issues long term for more serious projects. Moving OSS communities around can be kind of disruptive. And it probably rules out commercial usage.
This whole "GitHub is evil" anti-capitalist stance is IMHO a bit out of place. I'm fine with diversity and having more players in the market though; that's a good thing. But many of the replacements are also for-profit companies, which is probably why many people are a bit disillusioned with e.g. Gitlab. Codeberg seems structured to be more resilient against that.
Otherwise, Github remains good value and I'm getting a lot of value out of for profit AI companies providing me with stuff that was clearly trained on the body of work stored inside of it. I'm even paying for that. I think it's cool that this is now possible.
MIT license requires attribution, which AI algorithms don’t provide AFAIK. So either (a) it’s fair use and MS can do that regardless of the license or (b) MS can’t do that. In any case, yeah, that’s not the issue Zig folks have with GitHub.
MS training AIs on Zig isn't their complaint here. They're saying that Github has become a worse service because MS aren't working on the fundamentals any more and just chasing the AI dream, and trying to get AI to write code for it is having bad results.