I still get frustrated by WiFi, though, and never use it for my computers unless I have no choice. With so many devices these days, the performance is still subpar. Packet loss on even the best connections causes so many performance degradations.
I don't think that's the case; people don't call mobile internet "WiFi". In their minds "WiFi" probably means "home internet", so it's more like they call the LAN "WiFi", because they have never used a wired connection.
The boomers are hopeless with tech, because they grew up without it. But Gen Z is also hopeless, because they grew up with opaque appliances like iPhones. You use them the way Apple allows you to. The inner workings are hidden behind endless abstractions and shiny (liquid!) UI. Every error message is something like "Oopsie woopsie" with a sad face emoji. When it breaks you toss it in the garbage.
This reads like an "uphill both ways" story, but we used to build our own computers (still do, but I used to too), because that was how you acquired one unless you bought a Dell like a lamer. When your software messed up, it segfaulted or core-dumped, and you had to figure it out. When your hardware broke, you took it apart. People today use Discord because picking a client and specifying a server-and-port combo to connect to is too hard. And so on and so forth, you get the point...
And the point is these damn kids are on my lawn, making TikTok videos.
But if we go back a bit earlier, it was common for people to know much more about house construction, electrical work, flooring, making furniture, etc. IKEA emulates some of this, but it's really a different thing to live around things you truly understand: you participated in building your house and know how and why everything is in it, you can fix your car, you can grow produce in your garden, you eat your chickens' eggs and eventually the chickens themselves, and the whole process from hatching to hen to plate is managed by you.
People have less and less control over, and understanding of, their lives. It's all "coming from somewhere" and wrapped in abstractions and euphemisms. You no longer buy things, you just rent, etc. It really shifts the mentality toward a more child-like way of thinking, at the mercy of some opaque system. With AI we will get the final blow. No skills, no intellectual muscle, just as people no longer remember the driving directions even to places they regularly visit, because GPS gives the instructions, so they are never memorized. It will be the same, but for everything.
Anyways, apart from knowing how a car works on a theoretical level, I have no idea how to fix mine, and getting fucked at the dealership is a part of my life I have begrudgingly accepted.
Thanks for reading my blog.
looks at storage stats
40GB worth of “System Data” and 20GB of Safari “Documents and Data” - zero visibility as to what it is let alone simple controls to get rid of it in a reasonable way. On a real computer, that’s the kind of thing an admin has ultimate authority over. Now, especially on phones, you may as well be a call center tech saying “Well, I guess you can erase the entire phone and don’t restore a backup?” because that’s the only guaranteed fix for most problems.
I didn't even consider that the newest iPhones can connect to satellites directly.
Although I really think it is just used to mean “non-phone based internet”, rather than just home internet.
2003 WiFi was routinely awful, though. Generally unstable, poor compatibility and lousy range. A lot better now, but still could be easier for non-techs.
If I had actually wanted to hack into networks, encrypted WiFi used WEP, which could be cracked in minutes on a typical laptop. Most communication was unencrypted too; pwning entire WiFi networks wasn't even fun considering how easy it was.
Used to be that every cafe and business just had an open wifi network. Now, if they provide it at all, it's password protected.
I reacted to this. Not personal open wifi. In cafes the password is often on the wall, in the menu or otherwise easily obtained.
One of my early IT jobs in the 2000s was at a SME with Wifi: after you connected radio-wise, you had to start the VPN client, because at the time there really wasn't any (effective) encryption of the signal in 802.11 itself.
If a particular category is considered then yes, phones are the biggest chunk. But virtually every device these days comes with WiFi. So wifi is now the default method of connecting something.
Mostly because around here you can have 100GB over 5G for less than 10€ + they mostly don't use computers (a.k.a laptops) except for a) school (where they have free WiFi+Internet) and b) binge-watching the occasional Netflix (and then they use connection sharing)
Their first move upon setting up a new phone is to disable Bluetooth+WiFi to, uh, "save battery" (their cargo-cult answer, every single time)
In the early days of Wi-Fi, the IEEE 802.11 group was still testing spread spectrum and OFDM with 802.11b and 802.11a, respectively. But then it became apparent that the best bandwidth comes from the proper orthogonality of the wireless modulation, aka OFDM [1] (see the small numerical sketch after the links below).
At the time of the OP article back in 2003, the incumbent cellular modulation in 3G was still a spread-spectrum-based CDMA system, but by 4G it was OFDM all-in, and the rest is history. CSIRO became much richer due to the patent [2], and radio-astronomy-based technology generated some hard cash for a research institute that mainly pursues science.
[1] Orthogonal frequency-division multiplexing (OFDM):
https://en.wikipedia.org/wiki/Orthogonal_frequency-division_...
[2] How the Aussie government "invented WiFi" and sued its way to $430 million [PDF]:
https://www.vbllaw.com/wp-content/uploads/2020/11/How-The-Au...
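A minimal numerical sketch of the orthogonality mentioned above (the 3.2 us symbol and 312.5 kHz spacing match 802.11a/g; everything else here is an arbitrary illustration, not something from the linked articles):

    import numpy as np

    # OFDM subcarriers spaced at df = 1/T are orthogonal over one symbol of length T:
    # the normalised inner product of exp(j*2*pi*m*df*t) and exp(j*2*pi*n*df*t)
    # comes out ~1 for m == n and ~0 for m != n, so tightly packed carriers don't interfere.
    T = 3.2e-6            # OFDM symbol duration (3.2 us, as in 802.11a/g)
    df = 1 / T            # subcarrier spacing -> 312.5 kHz
    N = 4096              # samples per symbol (arbitrary, just needs to be dense enough)
    t = np.arange(N) / N * T

    def subcarrier(k):
        return np.exp(2j * np.pi * k * df * t)

    for m, n in [(1, 1), (1, 2), (3, 7)]:
        inner = np.vdot(subcarrier(m), subcarrier(n)) / N
        print(f"<c{m}, c{n}> = {abs(inner):.6f}")   # ~1.0, then ~0.0 for the distinct pairs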
It’s true that, unlike other wireless transmission technologies, Wi-Fi allows any company to make a product that can transmit or receive on all frequency bands authorized by a country, whereas for mobile networks, for example, each operator acquires exclusive rights to a frequency band.
That shows that open standards work well and enable healthy competition.
The cell phone companies will regret their purchase of 3G spectrum! Those fools, they did not realize their 3G cell towers would soon be rendered obsolete, nay, ridiculous, by my mighty wireless router!
It's not consumers buying consumer electronics, no, it's "an authentic grassroots phenomenon."
I was using computers before Wi-Fi and eventually bought a Wi-Fi router for use at home, so it's not that I've never experienced a world without Wi-Fi.
Welcome to every Wired article ever, certainly from its inception well into the mid-2000s at least.
Today it's at best amusing, but those were times just 1-1.5 generations ago when that was truly, genuinely generally enjoyed (by techies & youngsters) as neither Tired nor Expired but (Hot)Wired, and as a needed/welcome breath of air in an ocean of seemingly-immutable last-century whiffs & echoes =)
Either MLO doesn't work correctly or the drivers of the modems (we tested Intel, MediaTek, Qualcomm, etc.) hit the shitter.
For my private stuff I stick to WiFi 6 and wait until WiFi 8 arrives. Finally having "friendly coordinated handovers" between APs is one of the biggest wins for me.
Just like when WiFi 6 came out, OFDMA didn't work well or wasn't even turned on by default. The same thing happened with WiFi 5 MU-MIMO, and now with WiFi 7 MLO. Expect MLO to only work properly with WiFi 8.
And all the latency reduction and reliability upgrades promised for WiFi 8? Expect them to work well in WiFi 9.
I wonder if we would have done the Ethernet again if we had known that Wi-Fi was going to become so common.
Today, if you're wiring up a house, you put ethernet drops everywhere.
PoE is a thing, and it's getting more popular.
Cameras, blinds, MM wave... It's almost to the point where one should be putting a media box in every closet as a mini wiring hookup.
The issue with wiring your house for Ethernet is that the 2003-era Cat5 a random builder or DIYer grabs from Home Depot isn't going to carry nearly as much as the Cat6A you would want if the cable plant is to have a chance of keeping up with network capacity growth. And that, in turn, needs quality installation.
The issue wasn't whether wifi was going to become so common; it was the guaranteed improvement in reliability and speed over wifi.
Anyone could use ethernet, and still can.
An interesting historical document for studying the unit systems used in 2003.
WiFi USB Client: Fingernail, eg: Asus USB-AX56 adapter (25.5 x 16 x 9mm) [8]
WiFi Card Client: Postage stamp, eg: Intel Dual-Band Wireless Adapter AC-7260NGW M.2 2230 (22mm x 30mm x 2.4mm) [10]
802.11n Wireless N range: theoretically 230 ft (70 m) indoors and 820 ft (250 m) outdoors, which is approximately one FIFA soccer field's width (fields are 105 x 68 m) for the indoor range, and roughly 2 soccer fields end-to-end by 3 side-to-side for the outdoor range. Newer protocols do not seem to have extended the range of individual radios but rather rely on repeaters or range-extender devices to provide additional coverage.
https://www.hummingbirdnetworks.com/articles/what-is-the-dis...
https://en.wikipedia.org/wiki/IEEE_802.11#Protocol
https://publications.fifa.com/de/football-stadiums-guideline...
Some of these are slightly older devices dating from around 2020, but the sizes should generally still be approximately the same in 2025.
Router/Access point: Mobile operating systems like WebOS, Android, and iOS have had hotspot capabilities built in since around 2010 (Mobile Hotspot on Palm Pre and Pixi Plus in 2010 [1], WiFi tethering in Android 2.2 Froyo in 2010 [2], and Personal Hotspot in iOS 4.3 for iPhone 4 and 3GS in 2011 [3]). Today you can configure your phone to become a personal WiFi router with one button press [4], [5].
[1] https://www.cnn.com/2010/TECH/01/07/ces.palm.pre.plus.pixi/i...
https://web.archive.org/web/20101125023042/http://articles.c...
[2] https://www.wired.com/2010/05/android-22-froyo-features-usb-...
https://web.archive.org/web/20140707210122/https://www.wired...
[3] https://www.engadget.com/2011-03-09-ios-4-3-spotlight-person...
https://web.archive.org/web/20251018061108/https://www.engad...
[4] https://support.google.com/android/answer/9059108?hl=en
[5] https://support.apple.com/guide/iphone/share-your-internet-c...
For a discrete router device, the TP-Link TL-WR802N (57 x 57 x 18 mm) [6] and GL.iNet GL-MT300N (58 x 58 x 25 mm) [7] are matchbox-sized travel routers that fit into the palm of your hand, cost less than $30 USD, and consume less than 2.75W (they can be run from a basic 5V, 1A USB connection).
[6] https://www.tp-link.com/us/home-networking/wifi-router/tl-wr...
https://web.archive.org/web/20250724184720/https://www.tp-li...
[7] https://www.gl-inet.com/products/gl-mt300n-v2/
https://web.archive.org/web/20250822045920/https://www.gl-in...
Wifi Client: Wifi capabilities are now built into mobile phone, TV, and many camera chipsets, so no separate card is required. If you want a separate WiFi adapter, you can buy a USB adapter that is about the size of a fingernail or 5 small coins stacked together (eg: 5 US pennies, or 5 Euro 1-cent coins); basically take the metal part of a USB-A male connector and extend it a bit. This Asus USB-AX56 adapter [8] is 25.5 x 16 x 9mm and supports WiFi 6 (802.11ax) with theoretical speeds up to 9.6 Gbps (actual real-world speeds appear closer to 800 Mbps; see the back-of-the-envelope rate math after the links below). Streams in Wifi 6 can be up to 160MHz wide via channel bonding of up to 8 adjacent 20 MHz channels, but in practice this is only feasible in locations with low or no interference. Note that wider channels have a reduced effective range and perform poorly at a distance or through obstructions.
[8] https://www.asus.com/networking-iot-servers/adapters/all-ser...
https://web.archive.org/web/20251018061524/https://www.asus....
https://www.wi-fi.org/wi-fi-macphy
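Here is roughly where the 9.6 Gbps headline number comes from; a back-of-the-envelope sketch using the commonly quoted 802.11ax figures (treat the constants as approximate, not measured):

    # Rough 802.11ax (WiFi 6) peak PHY rate, per stream and in total.
    data_subcarriers_160mhz = 1960   # data tones in a 160 MHz HE channel
    bits_per_symbol = 10             # 1024-QAM (MCS 11)
    coding_rate = 5 / 6              # MCS 11 coding rate
    symbol_time = 12.8e-6 + 0.8e-6   # 12.8 us symbol + shortest 0.8 us guard interval

    per_stream = data_subcarriers_160mhz * bits_per_symbol * coding_rate / symbol_time
    total = 8 * per_stream           # spec maximum of 8 spatial streams

    print(f"per stream: {per_stream / 1e6:.0f} Mbps")   # ~1201 Mbps
    print(f"8 streams:  {total / 1e9:.1f} Gbps")        # ~9.6 Gbps
    # Small USB adapters run far fewer streams on far narrower real-world channels,
    # which is a big part of why observed speeds land in the hundreds of Mbps.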
Discrete Wifi cards can still be found in some laptop computers (although many Wifi cards are now soldered down), one of the more recent (introduced around 2013 to 2015) and smaller sizes is M.2 2230 (22mm x 30mm x 2.4mm) which is also the form factor for some M.2 PCI-Express SSDs (solid state storage drives, typically flash devices). These are about the size of a postage stamp, weigh around 2.3 grams, and cost around $20 USD or less.
One example is Intel AX210 [9] which supports Wi-Fi 6E, on 3 bands: 2.4 / 5 / 6GHz in a 2x2 configuration (2 TX transmit and 2 RX receive antennas) at speeds of 2.4 Gbps, and also features Bluetooth 5.3.
[9] https://www.intel.com/content/www/us/en/products/sku/204836/...
A popular previous-generation device is Intel Dual-Band Wireless Adapter AC-7260NGW [10] with good support for GNU/Linux, again with a M.2 2230 (22mm x 30mm x 2.4mm) form-factor approximately the size of a postage stamp, supporting Wifi 5 802.11ac with up to 867 Mbps theoretical bandwidth on dual bands in a 2x2 configuration (2 TX transmit and 2 RX receive radios), and 433 Mbps per stream. Channels can be up to 80MHz wide via channel bonding of 4 adjacent 20 MHz channels.
An alternative, earlier form factor was Half Mini-PCI-Express, for example the Intel Dual-Band Wireless Adapter AC-7260HMW [10] at 26.80 x 30 x 2.4 mm.
[10] https://www.mouser.com/datasheet/2/612/dual-band-wireless-ac...
https://web.archive.org/web/20251018061520/https://www.mouse...
The future is probably just having multiple wifi APs wired up and then just running extremely fast but low range wifi.
Also, your glass door probably has Low-E glass which has a metallic coating.
> The future is probably just having multiple wifi APs wired up and then just running extremely fast but low range wifi.
This is somewhat the case, but it is limited. For example, in 5GHz there are 21 x 20MHz channels available. In a highly dense environment, this can support roughly 30 devices per channel well, and 50 devices per channel with some degradation.
Limiting the TX power on an AP can help, but it's not a panacea since clients always transmit their control frames at their default power (usually ~15dBm). There have been some improvements to this in .11ax, but depending on the spatial organization of the devices, it can only do so much.
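To put rough numbers on that (channel and device counts are the figures from the comment above, not mine; the 10 dBm AP cap below is just an illustrative value, and the dBm-to-mW conversion is the standard formula):

    # Rough dense-deployment capacity using the figures from the comment above.
    channels_5ghz = 21        # 20 MHz channels assumed available in 5 GHz
    devices_ok = 30           # devices per channel that work well
    devices_degraded = 50     # devices per channel with some degradation

    print(f"comfortable: ~{channels_5ghz * devices_ok} devices")        # ~630
    print(f"degraded:    ~{channels_5ghz * devices_degraded} devices")  # ~1050

    # Why capping AP power only half-helps: clients still transmit at their own power.
    def dbm_to_mw(dbm: float) -> float:
        return 10 ** (dbm / 10)

    print(f"client at ~15 dBm   -> {dbm_to_mw(15):.1f} mW")  # ~31.6 mW
    print(f"AP capped at 10 dBm -> {dbm_to_mw(10):.1f} mW")  # 10.0 mW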
Well, what if you want wifi in your garden? Maybe you own a few acres of property and you want wifi for a wedding? Now you actually have to do a minor bit of planning. Supported wifi versions, max EIRP, range, modulation rate, throughput, XPIC, coverage area, frequency, beamwidth, MU-MIMO, does the frequency require prior coordination with any government entity, etc.
I'm just giving a different use-case on the other end of the spectrum, to be fair. I agree 100% with your analysis however, we're going to mmwave frequency ranges with small but many APs. Massive Multi User MIMO, etc.
My guess is people with large houses and only one wifi AP are probably using 2.4 when their devices are out of range for 5. But other people doing this probably doesn't impact you at all since you can just make sure you have good coverage for 5ghz and enjoy uncongested wifi.
Though I would certainly not have complained about 50-100 Mbps throughout in 2003 — 1 Gbps wired networking was not mainstream then.
Pictures for anyone wondering what it looks like:
https://www.civilengineermag.com/chicken-mesh-for-plaster/
Wired wifi mesh or Ethernet all over would be my prescription. You basically have Faraday cages in every room right now!
That was my main Internet uplink for 5 or more years. About half way through I moved to another house and mounted the antenna outside on the roof for more gain, because the distance increased to about 11 miles. Caught some grief from the HOA, but I kept it up.
I'm not a radio engineer, but it doesn't take that many brain cells to raise the question: for a handheld/laptop device, why choose a carrier frequency that is absorbed by the body holding it, and by the metallic electronics sitting beside it? Logically, that's one of the most energy-inefficient frequencies one could choose, and a terrible design choice for personal wireless communication technology. I think a good engineer would want to conserve power and not be blocked by the very body holding the device.
However, as the future unfolded, we now have nearly every household with a bright radiant point light casting human-shaped shadows, trivially reconstructed, detecting not only the body's silhouette but its heartbeat and respiration, too.
And with everything we know, with leaks going back decades about the abuses of government power (surveilling their own citizenry; recording, analyzing, and manipulating the population in subtle ways, to the financial benefit of a handful of billionaires and the power benefit of media-savvy pawns), why are these basic technological choices not being questioned more?
(mastax, I'm replying to you because you're top reply, but felt it important to continue my original point.)
TL;DR because the FCC regulates available frequency bands, and 900MHz, 2.4GHz, and 5GHz were the ones that were 1) the right combination of high enough to be fast and low enough to be energy efficient and easy to generate, and 2) actually available for use at the time.
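To put rough numbers on the "high enough to be fast, low enough to be efficient" trade-off, a quick free-space path-loss comparison of those three bands (standard FSPL formula; the 30 m distance is an arbitrary choice, and real indoor loss from walls and bodies comes on top):

    import math

    def fspl_db(distance_m: float, freq_hz: float) -> float:
        """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
        c = 299_792_458.0
        return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

    d = 30  # metres, an arbitrary "across the house" distance
    for label, f in [("900 MHz", 900e6), ("2.4 GHz", 2.4e9), ("5 GHz", 5e9)]:
        print(f"{label}: {fspl_db(d, f):.1f} dB")
    # ~61 dB at 900 MHz, ~69.6 dB at 2.4 GHz, ~76 dB at 5 GHz: each doubling of
    # frequency costs ~6 dB, before counting absorption by walls and bodies.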
(a) utterly magical (b) his father was the son of someone very high up in one of the Scottish banks and so this was affordable for him and clearly outside the range of normal people
In 2001 I bought a set of Prism 2 based cards that let me run HostAP (https://hostap.epitest.fi/) and was able to build my own network that didn't rely on ad-hoc mode, so everything was better. But the speed at which all of this changed was incredible - we went from infrastructure being out of the reach of normal humans to it being only a small reach, and by 2005 we were in the territory of all laptops having it by default. It was an incredible phase shift.
[1] ad-hoc was a way for wifi cards to talk to each other without there being an access point, and there was a period where operating systems would show ad-hoc devices as if they were access points, and Windows would remember the last ad-hoc network you'd joined and would advertise that if nothing else was available, and this led to "Free Internet Access" being something that would show up because it was an ad-hoc network someone else advertised and obviously you'd join that and then if you had no internet your laptop would broadcast it and someone else would join it and look the internet was actually genuinely worse in the past please stop assuming everything was better