The democratization ends at your router. Unless you are willing to lay down your own wires, which for legal reasons you most likely won't be able to do, you will be hopelessly dependent on the ISP. (Radio on free frequencies is possible and there are valiant attempts, but they will ultimately remain niche and have severe bandwidth limitations.)
For decades ISPs have throttled upload speeds: they don't want you to run services over their lines. When DSL was around in Germany (I guess it still is), there was a mandatory 24h disconnect. ISPs control what you can see and how fast you can see it. They should be subject to heavy regulation to ensure a free internet.
The large networks, trans-atlantic, trans-pacific cables, all that stuff is beyond the control of individuals and even countries. If they don't like your HTTP(S) traffic, the rest of the world won't see it.
So what you can own is your local network, using hardware that is free of back-doors and remote control. There's no guarantee of that. If you are being targeted, even the Raspberry Pi you just ordered might be compromised. We should demand from our legislators that hardware like this be free of back-doors.
As to content creation: there are so, so many tools available that allow non-technical users to write and publish. There's no crisis here other than picking the best tool for the job.
In short: there's no hope of getting a worldwide, free, uncensored, unlimited IPv4/6 network back. We never had it in the first place.
We can build such a society. I am not sure why you think this is never possible.
People can work for a better world. That sometimes works, too.
Maybe we can, but it is A) a far bigger, older, and more difficult problem than how to structure a computer network, and B) fundamentally not solvable through technological means.
No matter how much technologists love the idea of technology as a liberating force, our worst instincts and dynamics always reassert themselves and soon figure out how to use that same technology to destroy liberty.
However, large institutions are also slow to move, grow, and change. At the leading edge of technological adoption, small groups and individuals can use the amplified power to resist suppression.
The trick is to remain at the leading edge and to remind early adopters of the power they wield. If enough of us fight for liberty, many institutions will follow.
Where does such informed political and economic interest and power exist? With whom do we construct such a society? Do they have the power and will to fight for it?
Normies live with normie standards and, with increasing social media exposure, with world views that are ever more emotional, animal-like, and manipulated. They are either ignorant or ambivalent.
Will tech people gather on a piece of land and declare independence? Most of my tech worker colleagues are also quite pro-social media and they heavily use it to boost their apparent social status. We cannot even trust our kind.
Similar examples of new technology being used to motivate and mobilize the masses have always ended in devastating wars and genocides. Previously, the speed at which information propagated gave advantages to statespeople like FDR in putting an end to rising racism/Nazism/violent tendencies (not everywhere, of course; left to its own devices, new technology is almost perfect for constructing dictatorships). Now everybody has equal access to misinformation.
Except we can't agree exactly HOW the new utopia should be, and we end up splintering into two groups at loggerheads, fighting each other and back to square one, talking about how if we just followed someone else's idea of a utopia we wouldn't have to fight all the time. Dream on.
Capitalism is not the only way of life, and FYGM is a mental illness outside of the US
Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.
Comments should get more thoughtful and substantive, not less, as a topic gets more divisive.
When disagreeing, please reply to the argument instead of calling names. "That is idiotic; 1 + 1 is 2, not 3" can be shortened to "1 + 1 is 2, not 3."
Please don't fulminate. Please don't sneer, including at the rest of the community.
Eschew flamebait. Avoid generic tangents. Omit internet tropes.
Please don't use Hacker News for political or ideological battle. It tramples curiosity.
You* maybe cannot, but that certainly isn't true everywhere.
We plebs are just driftwood floating in massive waves of nation-state decision making. I don't doubt there are people who literally work at ISPs who are depressed at the state of things, depressed that they're not allowed to take action on certain things, depressed that they see first-hand what kind of control mechanisms they're forced to implement or disallowed from implementing, and more. It's got to be a trove of BS in an age of misinformation, which has always been an information-systems problem that humanity has implemented *checks notes* zero solutions for. And at the end of the day they, probably like all of us, just want to live a good and meaningful life.
That's not to say just... give up on ideals. But instead to acknowledge the realities of ideals not being enough on their own. Have some real conversations on what it would actually take to embed these types of fundamentals into a society; get comfortable with the uncomfortable realities. So much work needs to be done before new ideals can even be shared. Outreach alone to spread ideals is a massive uphill battle at this point due to conglomerate control of broadcast media and concentrated ownership of social media apps. A lot of these particular ideals require a decent understanding of technology in general, which most people don't have, making these things an incredibly unlikely basis for a society where they are well-enough understood.

So the circus trick here is: how do you make it a digestible topic that touches the souls of many and galvanizes them to take the correct stance, so that these things become embodied in the set of ideals a society values, so that legislators and whatever other proxies are tasked with decision making give these things the resourcing or policy-making attention they deserve? That's the mega hard part, which is then additionally compounded in difficulty by most households in our societies just never having these types of discussions make it to their TV/computer screens.

Hacker News types like to call these people "normies" and tack the blame on them, but they can't seem to wrap their minds around the fact that not everyone could or should have a deep compsci background. We should be coexisting with people of a variety of backgrounds, and we should be looking at their "normie"-ness as a thing to account for, not blame. It would be absurd for a "normie" to expect us to be exceptional at rebuilding car engines or any other broad subset of knowledge that we haven't committed our own lives/spare time to.
So that leaves the other route, which is just... renegade, fine-we'll-do-it-ourselves. Which can succeed, but has its own set of challenges. Fronting infrastructure for a lot of stuff is expensive, so donors are needed, sometimes on vast scales. To another commenter's point: ain't none of us on the renegade front laying undersea cables any time soon; those are multi-billion-dollar projects to cross the Pacific. Oftentimes we see these underground efforts fail in their infancy simply because the UX just flat out sucks, and we're up against entities who can giga-scale all their infrastructure/resources and ultimately capitalize on making whatever app thing fast & pleasant for users. It feels like we're drowning against titans sometimes; it's overwhelming.
One problem you face is high-profile leaders apparently being "replaced" with ones that are a lot more "conformist." So yesteryear's Bezos might've said yes; today's Bezos: no. See here: https://www.youtube.com/watch?v=WqlEtPBNgLc (members only)
This is similar to Bill Joy and his unlimited music device in about 2000.
Having something local means superfast response times, even if there's a delay in freshness.
The key is moderation. I suppose it could end up like a magazine. Maybe charge micropayments to have content included with that fee changing according to a slop-scale?
It's a good idea to couple this with something like the upcoming Linux handheld, Mecha. Multi-TB SSDs are needed; maybe AI can help navigate? Have different versions according to how much storage the user has available.
Maybe there's demand for something local that new AIs can train from. Something other than Internet Archive.
Are there any products that can minimize online-time by building a backlog of jobs to run when online, but also enabling a pause for offline-time?
Also, websites could queue requests, eliminate the AI thumping, and notify the client when their request is allowed. It's like queuing up for a nightclub and eventually getting past the droid-hostile bouncer.
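One low-tech version of that bouncer already exists in HTTP itself: answer 503 with a Retry-After header and let well-behaved clients back off and retry. A sketch (Python stdlib; the capacity and window numbers are invented):

    # over capacity -> 503 + Retry-After, so polite clients queue up and retry
    from http.server import ThreadingHTTPServer, BaseHTTPRequestHandler
    import time

    CAPACITY = 5   # made-up numbers: 5 admissions
    WINDOW = 60    # per 60-second window
    admitted = []  # timestamps of recent admissions (no locking; toy only)

    class Bouncer(BaseHTTPRequestHandler):
        def do_GET(self):
            now = time.time()
            admitted[:] = [t for t in admitted if now - t < WINDOW]
            if len(admitted) >= CAPACITY:
                self.send_response(503)
                self.send_header("Retry-After", "30")  # "come back in 30 seconds"
                self.end_headers()
                return
            admitted.append(now)
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"welcome in\n")

    ThreadingHTTPServer(("", 8000), Bouncer).serve_forever()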
Not when people make arguments based on dreams, hope, and optimism.
If somebody tells me that we can build a shed, I want them to talk about wood, nails and concrete, or to stop talking.
If everyone was a normal (as far as anyone is normal) law abiding citizen perhaps I would agree, but sadly that is far from the case. I think history has quite clearly shown that there is a minority of people out there that will take advantage and ruin things for everyone else. It's the same reason we have militaries, police forces, government checks and balances, etc. The internet is no exception to this.
I don't think the world is simple enough where anyone could be absolutist about freedom, it's all grey areas and complicated lines drawn.
There will be more efforts like this: https://yggdrasil-network.github.io
Those who want control over other people's mouths and eyes and ears, and rely on it to maintain their undeserved authority and prosperity, are going to have a bad time.
I think the main barrier is still the complexity of running your own service; it's a full-time job to keep on top of the bad actors.
For example, if you have your own domain it's perfectly possible to run your own email server; however, it's quite a lot of ongoing effort. It's not just set-and-forget.
I have seen those kinds of opinions on the internet a few times already. No, it is not that complicated. Yes, you need to buy a server. Yes, you need to set up the DNS. Yes, you need to maintain and update the server and its software. But that is the case with everything you self-host.
Besides that, you mostly need one-time operations like:

- set up domain entries
- set up SPF
- set up DKIM
- set up certs
- install the server (of course)
- test that it works
- set up a Google Postmaster account, because they do not like new domains sending them emails
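You can sanity-check the SPF and DKIM entries from that list in a few lines; a sketch using the third-party dnspython package (the domain and selector are placeholders; the selector depends on your mail server's config):

    import dns.resolver  # pip install dnspython

    domain = "example.com"  # your sending domain (placeholder)
    selector = "mail"       # DKIM selector (placeholder)

    for name in (domain, f"{selector}._domainkey.{domain}"):
        try:
            for rdata in dns.resolver.resolve(name, "TXT"):
                print(name, "->", b"".join(rdata.strings).decode())
        except Exception as e:
            print(name, "-> lookup failed:", e)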
I do not remember anything else besides some administrative tweaks here and there. But!
I never attempted to run the postfix + dovecot combo myself. I was aiming to run the whole thing in Docker and forget about configuring dozens of config files on the Linux host. With Docker you can just migrate the whole set of volumes to a new machine and that is it. I am running Mailcow, BTW.
Recently I moved the whole thing to a new machine by just running one script: https://docs.mailcow.email/backup_restore/b_n_r-coldstandby/...
On the other hand, you do need some technical knowledge, but I do not think this is harder than any other containerized software.
Don't you need to keep on top of DMARC reports?
e.g. https://www.duocircle.com/dmarc/how-to-fix-spf-records-by-an...
As for my own emails, they were rejected maybe just a few times. But I do not use email much, just some personal communication. I am not running marketing campaigns that would place my IP on some blacklist.
I think I ended up on a spam list just once. I emailed them or applied for removal via some web form and they removed the entry pretty much instantly. I do not remember other problems.
In some countries that may be possible (if only for now). Where chips are produced makes that an impossibility for most. That is, you can have certain guarantees if you run the chip fab, although if you are downstream of that, it can be a tall order to guarantee your chips are sovereign. So, while I like the sentiment that you have some sort of control behind your router, I'm really unsure how true that is given the complexity of producing modern day chips. Disclaimer, not an expert, just an opinion.
I'd settle for a maximally private, totally uncensored IPv4 like there used to be. Broadband turned out to be overrated in some ways.
One of the good things about dial-up was the way it was built on a peer-to-peer network that "everybody" already had, their land-line telephone service.
Way before actual "networking", anybody with a modem could connect privately with anybody else who had one.
An ISP could be formed by taking incoming calls from all active digital users simultaneously, and that was where the networking was done, plus connection to other networks around the world.
You could still contact any one computer user privately if you wanted to, without going through an ISP, just like it was before the web.
Also connect one network with another distant one, such as one office building to another, without ISP.
If anybody wanted to form their own working ISP, they could do it privately anytime as an interested group and not even tell anybody about it if they didn't want to. It might not be a commercial ISP but there was no mainstream to begin with where it was assumed that an ISP must be commercial or make any money at all.
These connections were intended to be "totally" private by law. It was well established that a court order was required to do a wiretap, and the penalty for violation was based on the concept that spying on Americans was one of the worst crimes, severe enough to deter those who acted to compromise the privacy & freedom that America cherished so deeply, and to preserve the citizen rights the country was chartered to uphold, no differently than before the telephone was invented.
There's nothing like this any more; land-line copper is in miserable disuse, so the only remaining wire, if any, is TV cable. But the only way to do peer-to-peer contact over cable is through an ISP. How private is that, and why is a court order not necessary before privacy can be compromised and very select Americans subjected to espionage?
Cell phones won't help you now, they can be tapped without wires.
The options are far fewer than the possibilities offered when dial-up first got popular.
Mostly because we have allowed the ISPs to collapse into monopolies.
People have forgotten that the US used to have competition in broadband back at the point where the Internet was rolling out to everybody.
What do you mean "back"? It was never free, as in zero-cost. It was also not very unlimited; I remember times when I had to pay not only for the modem time online, but also for the kilobytes transferred. Uncensored, yes, because basically nobody cared, and the number of users was relatively minuscule.
The utopia was never in the past, and it remains in the future. I still think that staying irrelevant for large crowds and big money is key.
Not really having a plan here, so if nothing else this is out of curiosity, but I'd like to know who actually owns that stuff.
For something that seems so ubiquitous and familiar like the internet, it would probably be good to understand who owns most of its infrastructure.
It’s Verizon.
Is that like owning a car and having some third party (who?) remotely cut the brake cable because the manufacturer fitted that "feature" in stealth? Or maybe they did something more benign (i.e. disabling the car speakers) just as a reminder of "who is the boss"?
We can have other protocols on top of TCP/IP and build a new Internet over the existing one, much like TOR/I2P/Hyphanet/Lokinet but without many of the disadvantages of those.
The next comment will be "but they can have short orbits," but that betrays the fact that they can collide with other objects, and if it's so cheap we will launch thousands for bandwidth.
As always: technical solutions to political problems are a band-aid and make everything worse; let's beat our politicians to death (metaphorically) instead.
This is a fine aspiration but not at all reflected in reality.
This seems to be changing somewhat; even cable ISPs are adopting full duplex via DOCSIS 3.1 (now rebranded as 4.0?) and later.
Basically, you can as a private individual set up a wireless node, talk with your nearest node that you have a visual line of sight to, and get connected to a completely separate network from the internet, where there is a ton of interesting stuff going on, and it's mostly volunteer run.
I don't know - the rate of adoption of MeshCore and similar technologies is quite astonishing.
A whole post about not needing big corporations to publish things online, and then they use Microsoft to publish this thing online...
- A Raspberry Pi 3B+ with a 3 gigabyte hard drive set up as a "server" (makes this site available on my home network[9])
- I publish this site via GitHub Pages service for public Internet access (I have the least expensive subscription for this)
...
[9] I can view my personal web on my home network from my phone, tablet and computers. So can the rest of my family.

What's needed is a lot of work on the software front to make it much easier, with interoperable standards: self-hosted WYSIWYG options as easy to use as the social media tools for photos and writing and social posts; the ability to run distributed chatroom-style instances with tracker-like discoverability to replace Discord; built-in backup options with easy offsite backup replication.
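For perspective on how low the floor already is: serving a folder over the home network takes a few lines of stdlib Python (a sketch; "site" is a placeholder folder of HTML files). All the hard work is in the layers described above:

    # serve ./site on the LAN; browse to http://<machine-ip>:8000 from any device
    from functools import partial
    from http.server import ThreadingHTTPServer, SimpleHTTPRequestHandler

    handler = partial(SimpleHTTPRequestHandler, directory="site")
    ThreadingHTTPServer(("0.0.0.0", 8000), handler).serve_forever()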
But just saying people should homelab more is totally cromulent.
it embiggens even the smallest LAN
You cannot "not participate in society" in a meaningful way. But you can self-host a blog. Especially for someone tech-savvy enough to talk about web technology.
You also might not want your IP to be indexed by crawlers. Offloading the burden of hosting onto an already public site mitigates a lot of those issues.
If it wasn't Github it would be Gitlab, if not Gitlab, somewhere else I'm sure.
The point being you shouldn't need to soapbox from your doorstep, but use the commons as intended
For a static read-only website? No, that's not necessary.
Hosting on GitHub is merely a convenience; they can up and leave anytime.
It is possible through what he says. I've made fx [1] exactly for the author's purpose: WordPress, but written in Rust for efficiency and with loads of unneeded features omitted. Publishing is not via a static site generator, because the time between an edit and seeing the result was too long for me. It does use Markdown for the posts and has built-in backup-to-Git functionality. I'm using it for my blog and like it a lot, since I can quickly use it to jot down notes [2].
[1]: https://github.com/rikhuijzer/fx
[2]: https://huijzer.xyz/
The nice part here is that you can update your site via git version control, and have easy rollback etc... assuming you can deal with git.
I don't think it's such a problem to have big corporations involved in your publishing efforts; the problem is when they lock users in with proprietary technology, and create barriers to entry like high costs and technical complexity.
I don't have a need to mix anticapitalist fervor with the desire for an easy way to make durable, portable web sites. The internet has always involved paying some kind of piper. Corporations are too big & monopolization is a problem, but one thing at a time...
Then the URL was http://www.<hostname.domain>/~<username>
I haven't seen a URL with a tilde ('~') in it in a long time.
Why did ISPs stop with this service? Was it to curb illegal file sharing?
https://www.cs.cmu.edu/~btitzer/riff.html
I actually haven't had a homepage for a long time because of the lack of the easy "put my home directory on the web", but I'd like to go back now to doing that.
There used to be lots and lots of ISPs, and so they were small enough to have a single webserver with all their customers set up as users and Apache serving content. They'd also set up FTP on the same server so you could get your HTML files into your www folder. Software like Dreamweaver had an FTP client built in, so you'd click a "publish" button and it would log in to FTP and transfer your files.
I would imagine this went away because it got expensive as the customer base grew and ISPs consolidated, and it made no money. Other options with PHP, MySQL, and other services cropped up that could offer more and charge for it, so I think ISPs just preferred to concentrate on network access and not hosting websites.
It is also possible to add .htaccess and other things there, like a username/password challenge (WWW-Authenticate), on a per-user basis.
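The ~username convention was just a path translation done by Apache's mod_userdir. A toy sketch of the same idea (Python stdlib; no access control or ".." sanitization, so don't expose it):

    # /~alice/pics/cat.jpg -> /home/alice/public_html/pics/cat.jpg
    import os
    from http.server import ThreadingHTTPServer, SimpleHTTPRequestHandler

    class UserDirHandler(SimpleHTTPRequestHandler):
        def translate_path(self, path):
            if path.startswith("/~"):
                user, _, rest = path[2:].partition("/")
                return os.path.join("/home", user, "public_html", rest)
            return super().translate_path(path)

    ThreadingHTTPServer(("", 8080), UserDirHandler).serve_forever()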
Mostly universities had hosting set up the same way. ISPs would also offer a similar thing for an additional fee on your internet subscription. They mostly provided FTP to upload files. Nowadays, if anyone tries to, it will be SFTP rather than FTP.
They gave you a sub-URL, like mypage.hostingsite.com, and FTP access where you could upload like 50 MB of data.
I remember running my website in high school, and so did a ton of people, a significant number of whom I wouldn't describe as particularly technical.
I think my ISP offered a similar service, but it was both less generous and much harder to use. I don't think a lot of people used those.
If you needed/wanted more, you could use your old crappy PC to host a webserver: just install a LAMP stack, phone your ISP to get a static IP, and buy a domain with your pocket money.
One of my friends used to run a somewhat popular phpBB-based Counter-Strike forum, and even caused some noteworthy drama online when his parents unplugged the PC because they said it was using too much power, and the forum was down for like a week.
I used to have my choice of dozens of ISPs. Now, if I am lucky, I might have 2 or 3 from very large companies that did the math on keeping that going. It mostly happened when ADSL and cable took over; in most areas that meant only 2 or 3 companies could actually provide anything at speeds their customers wanted. I think at the time they always said it was cost cutting.
Unfortunately, reality is such that those are closed systems with historically abhorrent security, and ISPs usually forbid the user from properly providing their own choice of router.
why don't you replace it with petabytes of free hosting for people, along with the bandwidth to serve it? money's obviously no obstacle to you
Another thought I had is that local AI could most definitely play a part in helping non-technical users create the kind of content they want. If your CMS gives you a GPT-like chat window that allows a non-technical user to restyle the page as they like, or do things like make mass edits - then I think that is something that could help some of the issues mentioned here.
Just FYI, Word still has Save As -> Web Page (.htm). For a blog post or newsletter-type thing, I bet it works just fine.
And for people that actually want to learn a bit of HTML, CSS and JavaScript, Mastro JS is as simple a static site generator as I could make it.
But I do think we’re reaching a turning point on the software side. The barrier to building custom, personalized apps is trending toward 0. I’m not naive enough to think every grandma will suddenly start asking ChatGPT to “build me an app to do XYZ,” but with the right UX it can be implicit. Imagine you tell an assistant: “My doctor says my blood sugar is high. Research tips to reduce it.” -> it not only replies with tips, it also proactively builds a custom app (that you own and control) for tracking your blood sugar (measurements, meals, reminders, charts, etc.). You can edit it by describing changes (“add a weekly trend graph,” “don’t nag me after 8pm,” etc.).
This doesn’t fully solve your Big Co control issue (they own the flagship models today), but open-weight + local options keep improving. I'm hopeful we have a chance to tip the scales back toward co-owner and participant.
Even this is hard. Most people don't know what they want, and/or they don't know how to describe it/imagine it. They don't even know what a trend graph is.
They just want someone else to do the mental effort of creating a nice product. Hence iOS > android for most people. They don't want to customise basically anything other than colours.
That's why I predict Lovable/Replit etc. will not go mainstream, and why ChatGPT will mainly just offer you their UIs. Artifacts weren't a big hit.
...And how much brainpower goes into understanding what people like this are getting at when they speak about things. There's a lot of context and human element to this; I'm skeptical AI will be any good at it in the near future.
This weekend I vibe coded (don't shoot me) a homelab platform that hosts a bunch of useful services on a Mac Mini and lets me deploy my own apps on top of it. Using Tailscale I can access the apps from my phone. I have multiple users with their own SSO to control access. I even have a Pi as part of the network that hosts public-facing content. All done with Claude Code and OpenClaw (as a kind of devops tool), hardly any code written by me. It's been a seriously fun experiment that I will try to progress somehow... if only because I love the dream of "digital sovereignty," even if the reality is it's unlikely to happen again. It got me thinking, though: if I could get inference hardware and a good enough open LLM to work with my setup, it might just be possible. The OP advocates a form of basic computing that is understandable, but when we are able to host our own LLMs we could end up in a very different but more capable paradigm.
The repo for the homelab, for anyone who has an interest: https://github.com/briancunningham6/homelab
Does it deploy it as well?
People post photos on Instagram and status updates on Facebook because their friends will see it there and give it a thumbs up.
A couple of decades ago, I spent a lot of time laboriously building a website from scratch for my photography. It was objectively a really nice site. I had my own domain, hosted it on a VPS, and put a ton of work into the layout and design.
But none of my friends ever thought to go there. I could see from my web stats that every now and then a random stranger would find the site... but they had no easy way of connecting with me and acknowledging that they saw it. If they put a lot of effort in, they could find my email address and email me, but that's a hell of a lot harder than just clicking a little thumbs-up button next to a Facebook post or filling in the comment box.
Uploading photos to my site was about as rewarding as printing them out and throwing them in the trash. I thought about adding support for that to my site, but then it opens the whole can of worms around user-generated content, abuse, moderation, etc.
Eventually, I moved to Flickr, which at the time was an actual community that gave me that connection. Then Flickr fizzled out. Now, on the rare times I bother to process a photo... I just upload it to Facebook because that's where (a dwindling subset of) my friends are.
It's not about the content. It's about the human connection. A CMS won't fix that.
Feedback maybe, but blogging didn't start for attention. That's something that got bolted on by a nasty virus we as humans tend to be carriers of. I don't think feedback was even an inspiration for the initial bloggers.
Certainly, it's fundamental to human nature that if we work hard to create something, we want some way to tell that another human was moved by it.
But, certainly, I think creating things, sharing them with people, and establishing a connection in return can be one of the most meaningful, joyful parts of the human experience.
That always-on device? To get critical mass beyond just the nerds, you'd need it to ship with devices which are always on, like routers/gateways and smart TVs. Then you're back to being at the mercy of centralized companies who also don't love patching their security vulnerabilities.
(1) Security. An always-on, externally accessible device will always be a target for break-ins. You want the device to be bulletproof, and to have defense in depth, so that breaking into one service does not affect anything else. Something like Proxmox that works on low-end hardware and is as easy to administer as a mobile phone would do. We are still far from this. A very limited thing like a static site may be made both easy and bulletproof, though.
(2) Connectivity providers should allow that. Most home routers don't get a static IP, or even a globally routable IPv4 at all. Or even a stable IPv6. This complicates the DNS setup, and without DNS such resources are basically invisible.
From the pure resilience POV, it seems more important to keep control of your domain, and have an automated way to deploy your site / app on whatever new host, which is regularly tested. Then use free or cheap DNS and VM hosting of convenience. It takes some technical chops, but can likely be simplified and made relatively error-proof with a concerted effort.
Every phone/device is its own server; they connect with a WebSocket to the preferred station, which is typically a server online serving as a bridge.
There is no need to be always connected to a server; you can also connect locally over WiFi, BLE or even USB-C cables (discovery is automatic).
From there, there are internal apps for sharing static websites, chat, blogs, files and so forth.
At least we still have DDNS, which solves the static IP problem. I've been using it for at least 10-15 years and my home network has always been resolvable over DNS. I guess I'm lucky that I've always had an ISP that handed out publicly routable IPv4 addresses. I think if I joined an ISP where I got some internal node on the ISP's 10.x.x.x network, I'd immediately cancel my service.
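For anyone curious what a DDNS update actually is: many providers and router firmwares speak some clone of the old dyndns2 protocol, which is a single authenticated HTTP request. A sketch (assumes the requests package; the endpoint, hostname and credentials are all placeholders; check your provider's docs):

    import requests  # pip install requests

    r = requests.get(
        "https://ddns.example.net/nic/update",      # hypothetical endpoint
        params={"hostname": "home.example.com"},    # name to point at your IP
        auth=("username", "update-token"),          # provider credentials
        timeout=10,
    )
    print(r.text)  # most clones answer "good <ip>" or "nochg <ip>"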
With IPv6 it would theoretically be possible, but currently, with IPv4 and NATs everywhere, your website would almost never be reachable, even with fancy workarounds like DynDNS.
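You can check your own situation with a crude sketch like this (assumes the requests package; api.ipify.org is a public plain-text IP echo service):

    # compare the address your OS routes with what the world sees
    import socket
    import requests  # pip install requests

    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.connect(("8.8.8.8", 80))  # no packet is sent; this just selects a route
    local_ip = s.getsockname()[0]
    public_ip = requests.get("https://api.ipify.org", timeout=10).text
    print(local_ip, "vs", public_ip)
    # if these differ, you are behind NAT (home router and/or carrier-grade),
    # and inbound connections will not reach you without port forwarding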
I'm almost curious enough to try for myself.
I absolutely agree with the concept, but people have to be ready to do their own work rather than delegating it to other parties. Consolidation has happened because these massive conglomerates absorb operational complexity on the cheap, and that's attractive. Moving away from them means we take on the responsibility of doing it ourselves.
And yes, I get the practicality of it. However, when people are actually doing shit like this[1] in the real world, writers of manifestos might consider practicing what they preach a tad more.
> Today the Web and Internet is owned and controlled by large for profit corporations and a few governments. Corporate ownership combined with government policies has left us as tenant and product. It has given us a surveillance economy and enshittification
> What if I do not wish to be tenant and product?
> What can I do to change the equation?
> Those two questions lead me to a bigger question.
> What happens when ownership and control of hardware and software shifts from the domain of corporations to a world where a significant percentage are owned by individual people and cooperatives?
HTTP requires always-on + always-discoverable infrastructure
It's all over the place.
The current wave of AI agents is diminishing the value of identity as a DDOS or content-moderation signal. The formula until now included bot = bad, but unless your service wants to exclude everyone using OpenClaw and friends, that's no longer a valid heuristic.
If identity is no longer a strong signal, then the internet must move away from CAPTCHAs and logins and reputation, and focus more on the proposed content or action instead. Which might not be so bad. After all, if I read a thought-provoking, original, enriching comment on HN, do I really care if it was actually written by a dog?
We might finally be getting close to https://xkcd.com/810/.
One more half thought: what if the solution to the Sybil problem is deciding that it's not a problem? Go ahead and spin up your bot network, join the party. If we can design systems that assign zero value to uniqueness and require originality or creativity for a contribution to matter, then successful Sybil "attacks" are no longer attacks, but free work donated by the attacker.
I would rather just read the thought as it was originally expressed by a human somewhere in the AI's training data, rather than a version of it that's been laundered through AI and deployed according to the separate, hidden intent of the AI's operator.
Here's my small contribution to that. https://github.com/micro/mu - an app platform without ads, algorithms or tracking.
You can't run your own email server. All other large email providers will consider your self-hosted emails spam by default. It's understandable why they took this stance (due to actual spam), but it is also awfully convenient that it increases their market power.
We are now at the whim of large corps even if we get a custom domain with them.
You generally need to trade off two things:

1. A global namespace
2. Users being able to choose names without contention/disagreement about who owns which name

The only way to cut that knot is for an authority to arbitrate access to names. That's what we have now with the DNS and its registrars.
I think that's just a property of a naming system. Without something like a centralized threat of force that can know every person participating in the system, there really is no recourse. The approach we are taking is making it difficult to create a speculative market around names, which seems to be the driving force behind squatters.
Happy to discuss it in more detail: hackernews@sepositus.com
(Note: that's an alias that goes to my email address which I avoid putting in public places for obvious reasons).
I have tried to get them to publish markdown sites using GitHub pages, but the pain of having to git commit and do it via desktop was the blocker.
So I recently made them a mobile app called JekyllPress [0], with which they can publish their posts similar to the WordPress mobile app. And now a bunch of them regularly publish on GitHub Pages. I think with more tools to simplify the publishing process, more people will start using GitHub Pages (my app still requires some painful onboarding, like creating a repo, enabling GitHub Pages and getting a PAT; no OAuth as I don't have any server).
When you publish to Facebook, WordPress etc. you can't easily get your stuff out. You will have to process it even if they allow you to download your content as a zip folder. The images will be broken, links between pages won't work, etc.
There's no point in messing with custom hardware, etc. We could host a bunch of redundant P2P access points for everyone, and use portable P2P software for everything.
E.g. I'm building a P2P/F2F social media protocol that is very close to a syndication platform: https://app.radicle.xyz/nodes/radicle.dpc.pw/rad:zzK566qFsZn... I'm not saying that it's exactly the same thing as the author is looking for, but the technical bits and even the functionality are very close.
I guess the author first needs to get some stats on content type, use cases, money flows, controls etc. and then define the problem that applies to most users of the web.
Keep in mind that a system that evolves through feedback loops, shaped by forces, usually doesn't have a major problem, as its evolution has ensured the fit between itself and its context.
You may call the forces which shape that evolution evil. But the forces are part of the context that you need to live with. The forces are also a product of that evolution.
This is not a real roadmap (or even a completed thought) for people shifting to micro-enclave internets/networks. That is already happening in many places, and has been for years, often driven by the lack of internet infra rollout in low-income areas. It used to be community-installed wiring, but the advent of mesh networking has allowed this to blow up.
The challenge, I've always felt, is shared services: if I'm running infra myself, I can depend upon it, but if someone else is running it, I'm never really sure whether I can, which makes external services really hard to rely on and invest in.
Maybe you can get further than expected with individual services? But shared services at some point seem really useful.
I think web2 solved that in an unfortunate way, where you know the corporations operating the services / networks are aligned in some ways but not in others.
But it would be great to have shared services that do have better guarantees. Disclaimer: we're working on something in that direction, but I'm really curious what others have seen or are thinking in this area.
Simple to use software... this would be grand!
> Raspberry Pi OS (a Linux distribution based on Debian GNU Linux)
Is this simple? I would contend that it is not. Why do I tell people "buy apple products" as a matter of course? Because they have decent security, great ease of use, and support is an Apple Store away.
They still manage to screw things up.
Look at the emergence of Docker as an install method for software on Linux. We sing the praises of this as a means of software distribution and installation... and yet it's functionally unusable by normal (read: non-technical) people.
Usability needs to make a comeback.
Apple stuff is a nightmare of dark patterns and user-hostile idiocy.
Maybe it's easy if you have Stockholm syndrome and have internalized all the arcane gestures, icons and bug avoidance patterns.
The average normie has no clue, though. (This is born of experience; I have like 8 iPhones in the immediate family among children and seniors.)
Instead of every single person maintaining and offering a vertical slice through the whole stack, we should make it easier to publish content in the first place.
The real issue is that this pushes the burden of maintenance and infrastructure onto the individual, but this should be a shared responsibility.
Instead we need:
- A federated content/file system
- An open standard for viewer/app definitions (hosted on this system)
This is like talking about how book authors don't need Amazon when you have a printer and glue at home.
I made a content management system (CMS) for some friends years ago which was very easy to use. Its main paradigm was: 1 folder = 1 page. This was very easy for anyone to manage. Files in the folder were rendered in sort order, so you could have an image followed by some markdown, etc.

It was so easy to use that I never got anyone (mostly non-technical artists) asking how to change something on their site. Most ppl understand how to organise their content as files and folders. It is the easiest UI I've ever seen for a CMS, i.e. no UI :P

I was going to expand it to read files from a Dropbox folder so they didn't even need FTP, but life... It was in PHP, which at the time all ISPs supported. Setup was: copy the code, change files/folders in the `/content` dir, and you're away.
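The render step for that paradigm fits in a dozen lines. A sketch in Python rather than the original PHP (the third-party markdown package and the folder name are assumptions):

    # "1 folder = 1 page": concatenate the folder's files in sort order
    from pathlib import Path
    import markdown  # pip install markdown

    def render_page(folder: Path) -> str:
        parts = []
        for f in sorted(folder.iterdir()):
            if f.suffix == ".md":
                parts.append(markdown.markdown(f.read_text()))
            elif f.suffix in (".jpg", ".png", ".gif"):
                parts.append(f'<img src="{f.name}" alt="{f.stem}">')
        return "<!doctype html>\n" + "\n".join(parts)

    print(render_page(Path("content/about")))  # hypothetical page folder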
Yeah, sure we need this. The time for it was back in like 2005 at the latest.
The real issue is more existential. Right now we're about to lose the war over digital connectivity being required to live and use modern services. We're going to lose cash payments. If you're going to fight a fight, that is where the effort matters in 2026.
The idealist in me says we should still build a simple to use publishing and discovery system for hypertext that can be self-hosted and self-networked for the day the next generations realize they need it (authoritarian control of the Internet, collapse of social media, infrastructure instability, climate apocalypse, whatever). I suppose my idealism is still pretty pessimistic, but then it is Monday.
They totally suck, like tiny homes? No, actually they are better than tiny homes. Browsers are the #1 reason why you want a computer that's better than a Pi 500. Wanting to play modern games is #2.
We live in a time where people somehow think they cannot make bread without a $400 (USD) bread making machine. We suffer from learned helplessness, paint-by-number syndrome, follow-the-leader syndrome, and cargo-cult thinking. We use recipes instead of developing skills.
Implementing "a web we own" is a hard and difficult problem. The poster is correct that ISP's are a problem. But if this learned helplessness is the top comment on "HACKER" News then there is something seriously wrong with how HN works.
This is NOT about the commenters. This is about a system of interaction - comments on HN - that seems to promote anything but hacking.
My apologies for the "rant" nature of this post, but there is a point here that I believe is worth stating. Or you know, just unfollow me, vote me down, and I probably misspelld some words along the way.
Host your own website (on a free server for as long as you can), print out some flyers, paste them around town or pass them around (to bypass the Ad Gods), ask for donations to pay the growing costs of bandwidth etc. as you get more users?
Ultimately it comes to down to convincing people, the ickiest task on earth :<
The majority (>50%) of people have no idea what an OS is; you cannot expect them to care about self-hosting.
People (me included) do what is easier/cheaper; maybe further down the line there are principles or ideology in the decision making.
I did not bother to read the complete article; by your actions you are a tenant/product. Better to lead by example.
Also, you don't even have the right to lay those wires wherever you see fit.
Yet your approach is appallingly low on the other side of the spectrum. I've been in IT for the past 25 years. I have yet to see a non-IT person who knows what a dedicated IP is. If you are not publishing it on the internet, then what's the point?
I've seen plenty of companies where the owner just had a read-only shared drive where people could rummage through a pack of PDFs. They were all fine with that.
You have to understand, manage and work with the complexities of the tools, and offer tools adequate to the task. It's alright to offer what you do to an engineer who has a spare Pi and a couple of days to kill. But it's quite useless for anyone else to adopt.
>I publish this site via GitHub Pages
Okay, and that depends on an entire economy and infrastructure of privately owned switching, other network equipment, fiber optics, etc., not to mention that if GitHub did not have, as a private company, a profit motive, they wouldn't even bother to offer the service you're using.
Sure, yes, rebuild the world, but if you want it to be free like open source, you'll also need to make it free like beer, and that means you'll need to work for free, too.
I support the aim. I acknowledge the problems. I'm just so frustrated by these silly oversimplifications of how to solve it.
Of course, I am asking bad faith questions here. I know the author is fully aware that the WWW is free and available to all. I don't assume any maliciousness here, but the author is not being honest about their intent. They know you can just put a website out there, but that's not the problem. The problem is that they won't get 2.5 million views, 100,000 interactions, and 1000 comments. They want a more open web, like it was back in the day. But then you remind them that back then having 1000 views on your entire site (let alone any single page) was considered successful, and that's not what they meant. The problem is that everyone wants to eat their cake and have it, too.
You either get a small web, where page views are counted in the hundreds, or you get locked into the big players and get the views you want. I, for one, choose the former.
The issue with publishing content has always been censorship. Anyone in power has incentives to apply as much censorship as they can. It's never been a technical problem.
If you consider Geocities, Tripod or Angelfire (or your local ISP) "bigcos" I guess. It is true that most people didn't host from their own servers and at one point all of those free hosts forced ad banners on your pages but it still doesn't seem like the same thing.
The problem is in the environment but also in user behavior. Unless you can provide a convincing argument to change both, by presenting an actual improvement, it's farting in the wind.
> Professional influencers with over 500,000 followers fall under the Dutch Media Act (Mediawet, 2008) and are supervised by the media authority CvDM, which applies rules similar to those for on-demand audiovisual media services, including requirements on recognizable advertising and protection of minors.
https://cmpf.eui.eu/influencers-as-news-creators-implication...
> Current Rule: Now, influencers with 100,000+ followers (across YouTube, Instagram, or TikTok) who post at least 24 videos a year and earn money must register with the CvdM and pay annual supervision fees.

> The Burden: You are expected to know your reach. If you cross the 100,000 mark and fail to register, you are technically in violation.
Another example:
https://medium.com/michigan-news/proposed-florida-blogging-l...
And self-hosting personal services makes sense and we're able to do that.
BUT, we don't own the connections. There's always going to be shared infrastructure for connecting these devices worldwide, and without an ideal state of Communism or utopian capitalism we're not going to own them or want to be responsible for them. Any kind of service that depends on a central database is not going to be communally owned.
Ownership is an economic problem, the technical aspect is merely interesting. Bitcoin might be a great example of this.
Like, suppose some really good personal server software existed. Suppose there were an OS-plus-app-repository platform, akin to Linux plus Snapcraft, but aimed solely at people who want to host a blog or email server despite knowing nothing and being willing to learn nothing. It installs onto a Raspberry Pi as easily as Windows. It figures out how to NAT out of your cable modem for you. It does all the disk partitioning and apt-gets and chmods; you just open the companion app on your phone and hit the WordPress button and presto, you've got a blog. You hit the Minecraft button and you've got your own Minecraft server, without having to learn what "-Xms2G -Xmx6G" means. It updates itself automatically, runs server components in sandboxes so they can't compromise each other, and it's crack-proof enough that you can store your bitcoins on it. Etc, etc.
If that existed, we wouldn't have to write essays about freedom and so forth to get people to buy it; they'd buy it just because it's there. I mean, look at those digital picture frames: they cost more than a Raspberry Pi and are way less useful, and half the people I know got or gave them for Christmas. Why? Because they're neat and they cost less than a hundred bucks and they require no knowledge or effort. If a server that can host your blog were that easy, it'd get adopted too, and we'd be on a path to some kind of distributed social media FB replacement. Imagine the software you could write, if you were allowed to assume that every user had a server to host it on!
The problem is, that software doesn't exist and it's not clear how it would ever get made. It'd be a huge effort (possibly "Google building Android" sized) and the extant open source efforts along these lines lack traction, mostly due to the chicken-and-egg problem of any new platform that needs apps to be useful. And until it exists, any kind of neighborhood-internet-collective-power-to-the-people dream has to necessarily begin with hoping that millions of people will spontaneously decide to spend their precious free time doing systems administration.
Not to shit on a fine essay that I mostly agree with. It just seems like, without figuring out the software, this is daydreaming.
There are 2 webs:
- the web site, to serve noscript/basic (x)html, namely basic HTML forms which can be augmented with <video> and <audio> nowadays; in other words, it serves web _pages_. It was made super modular: you have browsers not handling CSS, and it is fine for _pages_ with a semantic 2D table (implicit navigation even for braille browsers). Web engines there are more than reasonable to write an alternative of, even a plain CSS renderer (look at the netsurf browser), text-only (lynx/edbrowse/etc), graphic (links2/elinks/etc). In the end, 'HTML' is not perfect (like CSS), a bit of a mess actually; that's why they tried an XML representation, a failure because it was literally sabotaged by... "Big Co", or in the web realm, the 'WHATWG cartel': I remember their web engines were a pain when using xhtml to develop even a simple page... but not with html... curiously. That said, mistakes were made also on the "w3c" side: the 'semantic web', a real abomination of delirious complexity, which I think is what actually made people jump on the WHATWG train, what a disaster. Now, HTML has been back with its weird (shabby?) parsing, but this was kind of 'cleaned up' and much more rigorously defined.
- the web app: the abomination. Basically, only gigantic and insanely complex software can make a web app work (including their SDKs), aka only the web engines from Big Co, here the 'WHATWG cartel'. It is getting worse: it is said that more and more web apps require only one web engine to 'properly work' (often gogol blink), and suspicions are very strong that this is done _on purpose_ (I remember the day when gmail.com disabled their noscript/basic (x)html web interface... then POP3 not long ago... I guess you all see where this is going). In this realm, there is near ZERO possibility of creating a _real-life_ alternative without a bunch of developers laser-focused on that for one billion years. I have been wishing for an alternative web engine I could build from source with a simple SDK; it does not exist, and even the few attempts here and there are _not_ doing that: they lean towards computer languages with super complex syntax (c++ and similar), hence a failure right from the start.
The 'web3'? A lean javascript engine (for instance quickjs, but there are others), with a small set of basic OS abstraction APIs, and a few 'accelerated' specialized APIs (vector drawing, pixel-blitting, video decoding, glyph drawing, etc). First problem: nobody will agree on those interfaces (they would have to be as simple as possible), and the 'WHATWG cartel' will make sure they are useless...
Or an even simpler "HTML" (probably the same with CSS)? "Markdown", like the article suggests? Would it have enough expressive power? Again, nobody will agree on the format and everyone will want to make their own.
A good middle ground is to work with a 'subset' of HTML: rough around the edges, but it would do a good enough job for nearly all online services out there, whatever the platform. Nearly 100% of online services were running on that a few years back, and with <video> and <audio>, it could be even closer to 100% nowadays.
And there is the danger of the 'mobile app only': there, the only way out is to regulate and enforce the availability of a small, stable-in-time set of as-simple-as-possible protocols and file formats, to allow reasonable efforts at developing an 'app' for an alternative platform (elf/linux, *BSD, fooOS, etc).
This man lives away with the fairies. I need not read a single word more.
Tailscale and similar overlay networks have made the "accessible from anywhere" part way easier than it used to be. The missing piece is still discovery. RSS was the closest we got to decentralized discovery, and we collectively let it rot. Maybe it's time to bring it back properly.
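Part of RSS's charm is how little it demands of the publisher: a complete, valid RSS 2.0 feed is a page of string concatenation. A stdlib-only sketch with placeholder titles/URLs:

    from xml.sax.saxutils import escape

    posts = [("Hello world", "https://example.com/hello.html", "First post")]
    items = "".join(
        f"<item><title>{escape(t)}</title><link>{escape(u)}</link>"
        f"<description>{escape(d)}</description></item>"
        for t, u, d in posts
    )
    print(
        '<?xml version="1.0"?><rss version="2.0"><channel>'
        "<title>My site</title><link>https://example.com/</link>"
        "<description>Posts</description>" + items + "</channel></rss>"
    )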
I think the key issue here is that Attention is a temporal construct, meaning discovery is often tied to "being the first thing that comes to people's minds" which means SEO, reverse engineering the ranking algorithms, and constantly having to manage an "online persona". Note none of those things contribute to the actual work you're doing, just your "marketing department" (and whatever time/financial "budget" you intend to give it).
MrBeast figured out the YouTube algorithm: post early and often. Is that how we exist on the modern Internet, when every website/thumbnail is engineered by a team to maximize clickthrough rates? I agree RSS is useful, but it faces the same scalability issues if everyone starts filling up your RSS feeds. Given the limited amount of time you can devote to a particular task, we'll return to the era of A/B-testing headlines.
It’s still alive. Many sites still use it. Many people still subscribe to those sites. RSS reader apps are still being created to this day.
If P2P file-sharing networks can do distributed search, surely it wouldn't be so hard for self-hosting? Could make a web server plugin to do it?
It would be nice to have a search engine/network that de-prioritizes pages with ads on them. Might be a Google killer ;)