GE has a paper about the power conversion design[1], but it doesn't mention the unit-to-rack electrical and mechanical interface. Liteon is working on that, but the animation is rather vague.[2] They hint at hot plugging but hand-wave how the disconnects work. Delta offers a few more hints.[3] There's a complex hot-plugging control unit to avoid inrush currents on plug-in and arcing on disconnect. This requires active management of the switching silicon carbide MOSFETs.
There ought to be a mechanical disconnect behind this, so that when someone pulls out a rackmount unit, a shutter drops behind it to protect people from 800V. All these papers are kind of hand-wavey about how the electrical safety works.
Plus, all this is liquid-cooled, and that has to hot-plug, too.
[1] https://library.grid.gevernova.com/white-papers-case-studies...
[2] https://www.youtube.com/watch?v=CQOreYMhe-M&
[3] https://filecenter.deltaww.com/Products/download/2510/202510...
> When it is detected that the PDB starts to detach from the interface, the hot-swap controller quickly turns off the MOSFET to block the discharge path from Cin to the system. After the main power path is completely disconnected, the interface is physically detached, and no current flows at this time
> For insertion, long pins (typically for ground and control signals) make contact first to establish a stable reference and enable pre-insertion checks, while short pins (for power or sensitive signals) connect later once conditions are safe; during removal, the sequence is reversed, with short pins disconnecting first to minimize interference.
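The staged mating described in the quote can be sketched as a tiny sequencing model. The pin names, the presence-detect pin, and the enable rule below are my illustrative assumptions, not taken from the Delta paper or any specific connector spec:

```python
# First-make/last-break pin sequencing, as commonly used for hot-plug
# connectors. Pin names and the enable rule are illustrative assumptions.

INSERTION_ORDER = ["ground", "control", "presence_sense", "power"]

def removal_order(insertion_order):
    """On removal the sequence reverses: the short (power) pins break first."""
    return list(reversed(insertion_order))

def power_may_enable(contacted):
    """The hot-swap controller may enable the pass MOSFET only once
    ground and control are stable and the presence pin confirms the
    connector is fully seated."""
    return {"ground", "control", "presence_sense"} <= set(contacted)

assert removal_order(INSERTION_ORDER)[0] == "power"
assert not power_may_enable({"ground", "control"})
assert power_may_enable({"ground", "control", "presence_sense"})
```

The point is simply that power is the last contact to make and the first to break, and the controller only switches the MOSFET on once seating is confirmed.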
Somehow this seems the wrong approach to AI.
Data center workers are gonna need those big yoink sticks and those thick arc-fault bibs that furnace operators wear.
It's not that bad. It's just ordinary industrial protective gear.
[1] https://www.mcmaster.com/products/arc-flash-protection-face-...
[2] https://www.mcmaster.com/products/electrical-protection-glov...
> Somehow this seems the wrong approach to AI.
On a more ironic level, it seems on par with the whole AI approach!
You could consider a robotized approach, but for now robots aren't very good at this type of task. Whether the robotized system is in space is irrelevant. It's not a "next five years" type of solution.
With that sort of voltage you should be able to use a capacitive or inductive sensor to activate a relay.
Copper vapor inhalation is a definite possibility and a horrible way to die.
Wouldn’t the biggest problem be loading sleds into a hot rack with all of the sleds powered down? Which you probably shouldn’t do. The parallel loads on a rack that’s half on should sink a lot of the potential.
Look at NTT Data or SoftBank.
They literally speak of preparing for a future that does not yet exist. I am optimistic it will exist, but that's not the same thing as it already happening and having a track record of reliability and profitability. I could find no mention of robots actually doing anything at that link. The article is about prepping servers, not specifically about robotics, in the same sense that planning hoses for gas pumps at gas stations is not about building cars.
People are tired of off topic speculation masquerading as relevant to real world problem solving. It's not a hate boner.
EDIT: expanded to make it clearer why I believe the speculation to be wildly off topic for this particular thread.
They are about to have fully human free datacenters by the end of this year.
When you are designing long-term goals for datacenters, as I explicitly mentioned, you can't ignore automation.
> They are about to have fully human free datacenters by the end of this year.
What I'm hearing is that they've been trying to build a human-free datacenter for 6 years and they haven't done it yet. What's the betting that that "end of this year" schedule slips further?
See e.g. https://www.dell.com/support/kbdoc/en-us/000221234/wiring-in...
I will say that this is a surprisingly deep and complex domain. The amount of flexibility, variety and scalability you see in DC architectures is mind-boggling. They can span from a 3kW system that fits in 2U all the way to hundreds of kilowatts spanning entire buildings, powered through any combination of grid, solar and/or gas.
Honestly, that was pretty surprising to me when I had to work with some telco equipment a couple of decades ago. To this day, I don't think I've encountered anything else that requires negative voltage relative to ground.
130A, 48V -> 1.2V @ 94% efficiency! Except:
- $100 ea.
- Fixed 1/40 voltage ratio, regulation done by upstream regulator.
- Look at the minimum specs for efficiency…
What's horrific converter performance in numbers?
An isolated flyback (to 12V) should be able to hit >92% and doesn't care if it's fed -48V or +48V or ±24V. TI webench gives me 95% though I'd only believe that if I'd built and measured it. What's the performance of your -48V → +48V?
[with the caveat that these frequently require custom transformers... not an issue with large runs, but finding something that can be done with an existing part for smaller runs is... meh]
Horrific performance by my definition would be 48V to, say, 1V. We only realistically use buck topologies for POL supplies. Such a ratio is really bad for current transients, not to mention issues like minimum on-times for the controller.
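To put numbers on why a single-stage 48V→1V buck fights controller minimum on-times, a quick back-of-envelope (the 1 MHz switching frequency is an assumed typical POL value, not from the thread):

```python
# Why a single-stage 48 V -> 1 V buck is painful: the duty cycle is so
# small that the on-time collides with controller minimum on-times.
# The 1 MHz switching frequency is an assumed typical POL value.

v_in, v_out = 48.0, 1.0
f_sw = 1e6                      # 1 MHz

duty = v_out / v_in             # ideal buck duty cycle, ~2.1%
t_on = duty / f_sw              # ~21 ns on-time per cycle

print(f"duty = {duty:.3%}, t_on = {t_on*1e9:.1f} ns")
# Many controllers specify 20-50 ns minimum on-times, leaving
# little or no margin at this ratio.
```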
(Thanks for the info!)
[1] https://www.analogisnotdead.com/article26/what-is-going-on-w...
Automotive collectors can probably still relate to cars from the 1920s-50s having a "positive ground."
The crucial difference is the direction in which the current is flowing: is it going "in to", or "out of" a hot wire? This becomes rather important when those wires are leaving the building and are buried underground for miles, where they will inevitably develop minor faults.
With +48V corrosion will attack all those individual telephone wires, which will rapidly become a huge maintenance nightmare as you have to chase the precise location of each, dig it up, and patch it.
With -48V corrosion will attack the grounding rod at your exchange. Still not ideal, but monitoring it isn't too bad and replacing a corroded grounding rod isn't that difficult. Telephone wires will still develop minor faults, but it'll just cause some additional load rather than inevitably corroding away.
Does that mean that when you have electronics using multiple DC-DC converters, all the inputs and outputs share the same ground, and it's not just about the values within each pair of wires?
And if I want to use a telephone on an incorrectly wired 48V DC circuit, I could switch the positive and negative wires, as long as the circuit in the telephone is isolated and never touches ground?
Thanks. Somehow I got in my head that all circuits were just about the delta from neutral and therefore nothing outside them mattered.
No, it depends on the converter. There are converters that leave 160V on the DC power rail for a 110V AC input, and 155V on the DC "ground" rail.
They are economical, and you can find them where galvanic isolation is, at least in theory, not important, but they're terribly unsafe when used on PCBs that people might muck with.
If you have some "normal" converters and some of this kind, sharing the ground would be quite dangerous.
I figured any happenstance from the multimeter that the grounds match was transitory and not to be trusted.
I think a circuit should mostly care about the deltas, but when you’re talking about things like phone lines, the earth becomes part of your circuit. You can’t influence its potential (it’s almost exactly neutral because any charge imbalance gets removed by interaction with the interplanetary medium) so everything else is going to end up being determined by what you need for their relative potential to that.
Objects don’t repel because they’re at the same potential. Electrostatic force comes from electric fields due to charge. If two objects truly have zero potential difference and no field between them, there’s no force.
You’re correct that circuits care about voltage differences. After all, all work requires a force gradient of some kind.
The interplanetary medium absolutely exists. I'm not talking about aether. I'm talking about the soup of dust, gas, and particles that fills space in the solar system. It contains a lot of charged particles, which is what keeps Earth extremely close to neutrally charged. Any deviation from neutral starts attracting positively charged particles and repelling negative, or vice versa, which equalizes the charge.
I didn't say objects repel because they're at the same potential. I said that objects at the same potential will still repel each other if that potential isn't zero.
Seriously, what is this reply? Aether? Objects repelling because they're at the same potential? You seem to have read a comment very different from what I wrote.
With DC systems you generally think about the issues - which is why modern cars are negative ground. However, other than cars most people never encounter power systems of any size - inside a computer the voltages and distances are usually small enough that it doesn't matter what ground is. Not to mention most computers don't even have a chassis ground plane (there are circuit board ground planes, but they are conceptually different), and with non-conductive (plastic) cases ground doesn't even make sense.
With AC it's about where the ground is attached along the length of the transformer secondary. In the EU they ground one of the ends of the secondary, in the US we ground the center point.
I don't get to say this very often ... but the US way is objectively safer with no downside: 99% of human shocks are via ground, and it halves the voltage to ground (120V vs 240V). A neutral isn't required if there aren't 120V loads.
- uninsulated metal pins make contact with supply while partially exposed
- much smaller distance between metal pins and the edge of the plug
But there's no inherent power tradeoff: you can have 240V outlets in the US, with the two prongs both 120V to ground. They're just really uncommon in residences.
Yes, but you only get the safety benefit on three phase equipment.
In the US there aren't a lot of 240V plugs, but if you get some installed you can get the safety benefits with plain old consumer goods.
The wye ground is beneficial even if there are no three phase loads, because the line-ground voltage is 1/sqrt(3) and you can use cheaper switchgear.
The 3-phase alternative to a wye ground is a corner ground (grounding a live phase of a delta secondary), which isn't done in modern installations because ground faults are full line-line voltage and you need more expensive switchgear.
Line-line faults are always interrupted by two breakers on a wye grounded 3 phase system. But with a corner ground, one breaker potentially has to break the full line voltage for a line-ground fault (you can't fuse the grounded phase).
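The switchgear argument comes down to what voltage a breaker has to interrupt. A quick sketch, assuming an illustrative 480 V system (a common US industrial level, not stated in the thread):

```python
import math

v_ll = 480.0                       # line-to-line voltage, assumed

# Wye ground: line-to-ground is line-to-line divided by sqrt(3).
v_lg_wye = v_ll / math.sqrt(3)     # ~277 V

# Corner ground: one phase IS ground, so a ground fault on either
# other phase makes a breaker interrupt full line-to-line voltage.
v_fault_corner = v_ll              # 480 V

print(f"wye: {v_lg_wye:.0f} V to ground; corner-ground fault: {v_fault_corner:.0f} V")
```

The cheaper-switchgear point follows directly: on the wye system the ground-fault voltage is about 58% of what the corner-grounded delta demands.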
My point is that in the context of a 230V single phase circuit, your ground is no longer in the middle. The ground is on one end of your single phase. If you want a safer single phase, you need to rebalance it to +115 and -115.
And because the problem of galvanic corrosion the GGP talked about, and its mirror problem of material aggregation, don't happen. It also makes switches more reliable.
Both are less of a concern on telephone lines, but very important on power lines.
1 - It won't break your posts, but can easily short small contacts.
Yes, or something similar[1]:
A few of the more efficient grounding electrodes for buildings and structures are:
- Metal Underground Water Pipe
- Metal In-ground Support Structures
- Concrete-Encased Electrode (also known as “footer ground” or “Ufer ground”).
- Ground Ring
As mentioned this is particularly important for telecom and similar systems which have signal wires going literally through the ground.
[1]: https://www.nfpa.org/news-blogs-and-articles/blogs/2021/09/2...
edit: found it https://www.cnet.com/tech/tech-industry/google-uncloaks-once...
So the grid was always charging up the lead acid batteries, and the phone lines were always draining them? Or was there some kind of power switching going on where when the grid was available the batteries would just get "topped off" occasionally and were only drained when the power went out?
Actually, there was one. Even earlier phones had their own power: a dry-cell battery in each phone, and every 6 months the phone company would come around with a cart and replace everyone's battery. Central battery was found to be more convenient, since phone company employees didn't have to go around to everyone's site. Central offices could realize economies of scale and have actual generators feeding rechargeable batteries.
I was wiring in a phone extension for my grandma once as a boy and grabbed the live cable instead of the extension and stripped the wire with my teeth (as you do). I've been shocked a great number of times by mains AC, but getting hit by that juicy DC was the best one yet. Jumped me 6ft across the room :D
The batteries, the grid/generator-supplied power supplies, and the telephone switch equipment are all connected in parallel -- as if the entire DC power infrastructure consists of only two wires, and everything involved with it connects only to those two wires.
1. In normal operation, the batteries are kept at a constant state of charge. The switches are powered from the same DC bus that keeps the batteries charged.
2. When the power grid goes down, the batteries slowly discharge and keep things running like nothing ever happened (for hours/days/weeks). There is no switchover for this; it's just the normal state, minus the ability to juice-up the batteries. (Remember: It's just one DC bus.)
3. When the grid comes back up (or the generators kick in), the batteries get recharged. There is no switchover for this either; nothing important even notices. (Still just one DC bus.)
4. If the grid stays up long enough, go to 1. Repeat as the external environment dictates. (And as you might guess, it's still one DC bus and there's also no switchover here. Things just continue to work.)
--
You can play with this at home with a capacitor (which loosely acts like a battery does), an LED+resistor combo (which acts as a load), and a small power supply that is appropriate for LED+resistor you've chosen (which acts as the AC-DC converting grid input).
Wire them all 3 parts up in parallel and the light comes on.
Disconnect the power supply, and the light stays on for a bit -- it successfully runs from power stored in the capacitor.
Reconnect the power supply, and the light comes on and the capacitor ("battery") recharges -- concurrently.
Improve staying power by adding more parallel capacitance. Reduce or eliminate it by reducing or eliminating capacitance. Goof around with it; it's fun. (Just don't wire the capacitor backwards. That's less fun.)
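If you'd rather simulate the bench demo than build it, a first-order RC model works. All component values here are assumed for illustration, and a real LED load isn't resistive, so this is only an approximation:

```python
import math

# First-order model of the bench demo: an LED+resistor load running
# from a capacitor after the supply is disconnected. Values assumed.

C = 4700e-6          # 4700 uF electrolytic
R = 330.0            # series resistor, treating the load as roughly resistive
v0 = 5.0             # supply voltage before disconnect
v_led_min = 2.0      # below this the LED visibly goes dark

tau = R * C                               # RC time constant, ~1.55 s
t_dark = -tau * math.log(v_led_min / v0)  # solve v(t) = v0 * exp(-t/tau)

print(f"tau = {tau:.2f} s, LED stays lit ~{t_dark:.1f} s")
```

As the comment says, more parallel capacitance means more staying power: doubling C doubles tau, and the lit time scales with it.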
The batteries are floated at the line voltage; nothing is really charging or discharging, and there is no switchover.
This is similar to your car's 12V DC power system: when the car is running, the alternator provides DC power and the battery floats, doing nothing except buffering large fluctuations and stabilizing the voltage.
Another thing we lost in the age of VoIP landlines, but then again mobile towers also have batteries. Just don't be unlucky and have a power outage with 3% battery on your phone...
Much of the world's mains-voltage electronics run at 240V (historical) and have PFC circuits (which are essentially just boost converters) that run at ~400V DC link voltages. 650V gives you enough headroom to tolerate overshoots and still have an 80% safety margin with a single level topology.
This voltage also coincidentally is a convenient crossover point where silicon MOSFETs start to become inefficient and GaN FETs have recently become feasible and mass-produced.
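The headroom arithmetic behind that pairing, with assumed nominal figures:

```python
import math

# Why ~400 V DC links pair with 650 V FETs. Nominal figures assumed.

v_ac_high = 240.0 * 1.10                 # mains at +10% tolerance
v_peak = v_ac_high * math.sqrt(2)        # ~373 V peak

v_dc_link = 400.0                        # boost PFC output must exceed peak
v_fet = 650.0                            # common GaN / superjunction rating

assert v_dc_link > v_peak                # PFC still regulates at high line
headroom = v_fet - v_dc_link             # 250 V for ringing and overshoot
print(f"peak mains: {v_peak:.0f} V, FET headroom above DC link: {headroom:.0f} V")
```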
Then I started routing ethernet with PoE throughout my house and observed that other than a few large appliances, the majority of powered devices in a typical home in 2026 could be supplied via PoE DC power as well! Lighting, laptops, small/medium televisions. The current PoE spec allows up to 100 W, which covers like 80% of the powered devices in most homes. I think it would make more sense to have fewer AC outlets around the modern house and many more terminals for PoE instead (maybe with a more robust connector than RJ45). I wonder what sort of energy efficiency improvements this would yield. No more power bricks all over the place either.
We installed 120 LED ceiling lights in our home circa 2020, all of which were run with high voltage (romex) and accompanied by 120 little transformer boxes that mount inside the ceiling next to them.
Later ...
We installed outdoor lighting with low voltage, outdoor rated wiring and powered by a 12V transformer[1] and I felt the same way you did: why did we use a mile of romex and install all of those little mini transformers when we could have powered the same lights with 12V and low voltage wire ?
I then learned that the energy draw of running the low-volt transformer all the time - especially one large enough to supply an entire house of lighting - would more than cancel out energy savings from powering lower voltage fixtures.
You don't have this problem with outdoor lighting because the entire transformer is on a switch leg and is off most of the time.
So ... I like the idea of removing a lot of unnecessary high voltage wire but it's not as simple as "just put all of your lights behind a transformer".
[1] https://residential.vistapro.com/lex-cms/product/262396-es-s...
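That tradeoff is easy to rough out. Every number below is an illustrative assumption, not a measurement of any particular supply or driver:

```python
# One big always-on 12 V supply vs. 120 per-fixture drivers that only
# draw power while their light is on. All numbers are assumptions.

hours_per_day = 24
on_hours = 4                     # assumed average on-time per fixture per day

central_idle_w = 15.0            # assumed standby draw of one large supply
central_idle_wh = central_idle_w * hours_per_day            # 360 Wh/day

driver_overhead_w = 0.5          # assumed overhead per small driver, when on
n_fixtures = 120
per_fixture_wh = driver_overhead_w * n_fixtures * on_hours  # 240 Wh/day

print(f"central idle: {central_idle_wh:.0f} Wh/day, "
      f"per-fixture overhead: {per_fixture_wh:.0f} Wh/day")
```

With these assumed figures the always-on central supply loses more energy idling than the per-fixture drivers lose converting, which is the point the comment makes.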
That's not a constraint of physics, you can absolutely build a DC power supply that is efficient in a wide load range. (Worst case it might involve paralleling and switching between multiple PSUs that target different load ranges.) But of course something like that is more expensive...
More expensive than an inefficient unit, but it should still be a lot cheaper than 120 separate units, right?
And I expect one big fat unit to do a better job of smoothing out voltage and avoiding flicker than a bunch of single-light units. Especially because the output capacitors are sized for the entire system, but you'll rarely have all the lights on at the same time.
Though for efficiency I'd think you'd want 48v and not 12v.
With double-conversion, generally yes.
I recently ran across the (patented?) concept of a delta conversion/transformer UPS that seems to eliminate/reduce the inefficiencies:
* https://dc.mynetworkinsights.com/what-are-the-different-type...
* a bit technical: https://www.youtube.com/watch?v=nn_ydJemqCk
* Figures 6 to 8 [pdf]: https://www.totalpowersolutions.ie/wp-content/uploads/WP1-Di...
The double-conversion only occurs when there's a 'hiccup' from utility power, otherwise if power is clean the double-conversion is not done at all so the inefficiencies don't kick in.
I find it a little hard to imagine that those devices outnumber things like stoves, dishwashers, washers/dryers, kettles, hair dryers... by 4:1.
Unsure why PoE would be better for LED lighting than the standard approach of screwing a bulb directly into AC, either. How many lumens do you get out of strip lights these days? And you still have AC-DC conversion for whatever's sourcing power onto the Ethernet link.
In practice PoE will have lower efficiency than mains powered, since it'll usually be at least double conversion, often three converters in series, plus the losses of the thin network wires, and the relatively high idle losses / poor low-load efficiency of the necessarily over-dimensioned PSE.
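A rough sketch of that chain; every stage efficiency and the cable resistance are assumptions for illustration, not measured figures:

```python
# Rough PoE efficiency chain: mains -> PSE supply -> cable I^2R -> PD
# converter. Every figure below is an assumption.

eta_pse = 0.90            # AC-DC supply in the switch, often lightly loaded
eta_pd = 0.90             # DC-DC converter in the powered device

r_eff = 6.0               # assumed effective loop resistance: ~100 m Cat5e,
                          # power pairs in parallel
p_pd_in, v = 60.0, 52.0   # watts arriving at the PD, PSE output voltage
i = p_pd_in / v           # ~1.15 A
p_cable = i * i * r_eff   # ~8 W dissipated in the cable at full load

eta_cable = p_pd_in / (p_pd_in + p_cable)
eta_total = eta_pse * eta_cable * eta_pd
print(f"end-to-end efficiency ~{eta_total:.0%}")   # ~71%
```

Even with generous stage assumptions, the chain lands well below what a single decent mains PSU achieves on its own.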
USB-C could be that connector, using USB-PD instead of PoE. Though I'm not sure I'd want to need that much smarts for every single power outlet.
Even when it's your job the usb are still handled and cycled way more often. You might handle 100 ethernet jacks today, but it won't be the same one 100 times. You plug it in and don't touch that one again for 5 years.
Efficiency isn't as straightforward either. You're still being fed by 120V/230V AC, so you're going to need some kind of centralized rectifier and down converter. It'll need to be specced for peak use, but in practice it'll usually operate at a fraction of that load - which means it'll have a pretty poor efficiency. A per-device PSU can be designed exactly for the expected load, which means it'll operate at its peak efficiency.
We also don't use 5V DC grids because the wire losses would be horrible, so a domestic DC grid should probably operate at pretty close to regular AC voltage as well. In practice this means the most sensible option would be to have a centralized rectifier and a grid operating at whatever voltage it outputs - but what would be the point?
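The wire-loss point is easy to put numbers on (wire resistance is an assumed round figure; at 5 V the "loss" exceeding the load just means such a run couldn't work at all):

```python
# Same power, same wire: compare resistive loss at 5 V vs 120 V.
# The wire resistance is an assumed round number.

p_load = 100.0            # watts to deliver
r_wire = 0.5              # ohms round-trip for an assumed household run

def wire_loss(voltage):
    i = p_load / voltage
    return i * i * r_wire

loss_5v = wire_loss(5.0)      # 20 A -> 200 W lost in the wire
loss_120v = wire_loss(120.0)  # ~0.83 A -> ~0.35 W lost

print(f"5 V: {loss_5v:.0f} W lost, 120 V: {loss_120v:.2f} W lost")
```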
As to PoE: I personally really like the idea, but I don't believe it'll have a bright future. For its traditional use the main issue is that there doesn't seem to be a future for twisted-pair beyond 10Gbps. 25GBASE-T might exist as a standard on paper, but the hardware never took off due to complete disinterest from the datacenter market, and it is too limited to be of use in offices and homes. I fully expect that 25G will arrive in the home and office as some form of fiber-optic interconnect - with fiber+copper hybrid for things like access points.
On the other hand, for a lot of IoT applications PoE seems to be too complicated and too expensive. It makes sense for things like cameras, but individual lights, or things like smoke sensors are probably better served in office/industrial applications by either a regular AC supply or a local DC one, plus something like KNX, X10, CAN, or Modbus for comms: just being able to be wired as a bus rather than a star topology is already a massive advantage. And for domestic use the whole "has a wire" thing is of course a massive drawback - most consumers strongly prefer using Wifi over running a dedicated wire to every single little doodad.
Even if we were to standardize a low (<50V) voltage for DC distribution within homes, we'd still need ~120/240VAC to power big stuff, or we'd instead need even-larger conductors (more copper) than we use today to do the same work with low voltage.
But, sure -- we can play it out. So let's say we have an in-home 48VDC distribution standard and decide that this is the path forward and we enshrine it in law.
We need to convert whatever the solar system has available to 48VDC. Then, we need to distribute that 48VDC using a completely separate network of cabling. Finally, we still need to convert 48VDC to whatever it is that devices can actually use.
That's not representative of a reduction in steps, or an increase in efficiency.
That is instead just an increase in installed infrastructure expense, and a decrease in device compatibility. It takes what we have, which is simply universal (at least within any given geographical area) and adds complexity.
And for what? What's the perceived benefit?
Is the juice worth the squeeze, though? Two sets of home wiring voltages? Substantially bigger copper wire inside the walls instead of existing copper, in order to do the same work? Two sets of appliances (of all sizes) on shelves at the store? More adapters?
Billy now needs to bring 2 wall warts to make sure he can charge his portable gear at a friend's house instead of just 1, because he's never sure until he gets there if they've got a 120 or 240v house like they all used to be, a combination house, or if it's one of those solar-only places that only has the weird plugs.
What we have now is 1 cable plant connecting the rooms of a home, and an increasing number of hybrid solar inverters that -- on a sunny day -- cheerfully convert solar power directly from whatever the panels are outputting to the 120/240 VAC wiring that both existing and future appliances know how to use. At night, these hybrid systems do the same thing from whatever voltage the battery uses and convert that to AC. There's only 1 voltage, and only 1 plug; Billy brings 1 wall wart and knows he can charge his stuff.
To be sure: What we have now is not strictly ideal, but then neither is changing things without a clear positive benefit.
Again: What's the qualitative advantage of changing this, other than change for the sake of change?
DC might feel nice and neat, but in reality it doesn't seem to be shaped that way at all to me.
Also, you'll need wires that are about 5 times thicker. Instead of a reasonable 1mm^2 for a normal 16A line, you'll need 5mm^2 for the same power.
I agree, it's unserious to suggest a cooker or something high-power is going to run off of 48V. But for loads like lights, PCs/laptops/TVs/audio, 16A at 48V is ~770W, which is adequate for these devices.
DC infrastructure makes sense in highly specialised environments... like new gigawatt AI farms.
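The "five times thicker" figure falls straight out of the arithmetic, taking a 230 V / 16 A circuit as the baseline:

```python
# Same power at 48 V instead of 230 V: current, and hence copper cross
# section at constant current density, scales by the voltage ratio.

p = 230.0 * 16            # 3680 W, a normal 16 A / 230 V circuit

i_230 = p / 230.0         # 16 A
i_48 = p / 48.0           # ~77 A

area_230 = 1.0            # mm^2, illustrative baseline
area_48 = area_230 * (i_48 / i_230)   # ~4.8 mm^2

print(f"{i_48:.0f} A, ~{area_48:.1f} mm^2 vs {area_230:.0f} mm^2")
```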
I don't think that much stuff is left which actually needs AC power (usually to run an AC induction motor).
I think it's highly unlikely we'll see mass scale retrofits, but if enough momentum builds up, I can see it as a great bonus feature for new builds.
I got lucky with my house and every room has a dedicated phone line meeting at a distribution panel (a couple of 2x4s with screw terminals) built in the 50s. I'm in the process of converting it to light-duty DC power. The wiring is only good for an amp or two, but at 48V that's still significant power transmission.
I imagine rooftop solar could also source DC for the house directly (or via a battery), before hitting the inverter... ?
The main problem I see is educating consumers. Maybe that starts with a standard for DC outlets and plugs that can't be confused with AC... ?
(Now I'm imagining desktop computers with much simpler power supplies; but you'd presumably have to wire for dozens of amps incoming...)
It's super nice because you only need to put the UPS/ATS at the PoE switch and then you get power redundancy everywhere you have ethernet running (i.e. the phones don't go down).
1. One of these is simplicity. With AC, one single home run of cabling (eg, Romex) can feed a whole room full of stuff, like a bedroom or a living room. At one end of the run is a circuit breaker (a fairly simple electromechanical device) and at the other end is a series of outlets (which are physically daisy-chained, but are functionally just wired in parallel with eachother).
Since one single run of cable can feed many devices, it is easy to accomplish.
2. Another advantage is that it is universal. Anything can plug into these outlets. Whatever a person brings into the home to use, they can plug it into an outlet and it works. It works this same way in every home.
3. And there's quite a lot of power available: A common 20A 120v branch circuit cabled up with 12AWG Romex is stated to supply up to 16A continuously, or 1920W. For intermittent loads, it can supply 20A -- or 2400W. That's tiny by European standards, but it's still quite a lot of power. It's plenty to run a space heater when Grandma visits and she complains about the guest room being cold (even as you start to sweat when you cross the threshold to investigate) and a big TV and a whole world of table lamps, all at once. And you can plug this stuff into any outlets in a room, and it Just Works.
4. But, sure: Lots of devices want DC, not AC. So there's a necessary conversion step that is either integral to the device being plugged in, or in the form of the external wall warts we all know very well.
So let's compare to power-over-ethernet.
1. It's also simple, but only tangentially so. One home-run cable per outlet, whether that outlet is used or not, is something that can be rationalized as being a simple topology. A PoE switch at the head-end instead of a central box with circuit breakers is a simple-enough thing to transition to. And a lot more individual cables are required, but they're relatively small and are generally easier to install.
2. It's standardized, but it's not universal at all. I've got a few PoE widgets around the house, but I'm pretty friggin' weird when it comes to what I do with electricity. I can't go to Wal-Mart and buy more PoE widgets to use at home, and when people visit they aren't bringing PoE adapters to charge their phones and other electronics. My computer monitor doesn't have a PoE input. I can easily imagine a table lamp or a fan that connects to PoE, and also uses it as a network connection for automation, and that sounds pretty sweet in ways that tickle my automation bones in the most filthy of fashions... but that's getting even further into the weeds compared to how regular people expect to do regular things.
3. There isn't a lot of power available. 802.3bt Type 4 is the highest spec. And within that spec: While switch ports can output up to 100W, a device being powered is limited to drawing no more than 71.3W. Now, sure, that's 71.3W per port, but in a room with 10 ports that's still only ~700W -- at most -- in that room. And Grandma's space heater won't run on 71.3W, nor her electric blanket. My laptop wants more than this. The list of useful, portable things that we casually plug into a wall that draw less than 71.3W is pretty short, and most don't benefit from the main advantage of PoE, which is a combination of [some] power alongside high-speed Ethernet data.
4. We still need wall warts since PoE is nominally ~48VDC. For example: Phones use less than 71.3W while charging, but they don't run on 48V. That means 120V AC comes in from the grid, gets shifted to 48VDC for distribution within the dwelling, and then gets shifted yet again to produce the power (5, 9, 15, and 20V are common-enough in USB PD world) that devices actually want. That's more lossy conversion steps, not fewer -- and we still get to keep the extra conversion (wall warts) as punishment for our great ideas. This is not the path towards increased energy efficiency.
---
PoE is great for the things we use it for today. A camera, a wireless access point -- you know, fixed-location stuff that uses networked data as its primary function and also requires power.
Installed PoE light fixtures (like, say, task lights in a kitchen) also sounds neat -- unless they die prematurely and no PoE replacements are to be found. (Now, you have not just one or two problems, but many: The lights aren't working in that space and they can't be replaced with a trip to Lowes because the Romex that would normally have been installed was deliberately deleted from the plan. It could have been a 20-minute DIY fix that costs less than $100, but now it involves drywall and paint and retrofitting new cabling. Or maybe PoE replacements do exist, but it's now 2035 and the new ones don't talk the same network protocols as the old ones did.)
But there are other upsides: I've got an 8-port PoE-powered network switch that works a treat. It's a dandy little thing. And it sure would be neat to plug my streaming box in with PoE and kill two birds with one cable; I would like that very much.
But most people? Most people don't give a damn about ethernet (PoE, or not!) these days, or streaming boxes, and that trend is increasing. They just plug their lamp into the regular outlet on the wall like they always have, and deal with whatever terrible UI is built into their smart TV, and use wifi for anything that needs data.
And when they buy a home that is filled with someone else's smart infrastructure, their first task (more often than not) is to figure out who to call to erase those parts completely and put it back to being normal and boring.
But what about availability? If you ask most of our users whether they’d prefer 4 9s of availability or 10% more money to spend on CPUs, they choose the CPUs. We asked them.
There are a lot of availability-insensitive workloads in the commercial world, as well, like AI training. What matters in those cases is how much computing you get done by the end of the month, and for a fixed budget a UPS reduces this number.
And then every machine has a switching power supply to convert this to low-voltage DC, and then probably random point-of-load converters in various places (DC -> AC -> DC again) for stuff like the CPU / GPU core, RAM, etc. Each of these stages may be ~95% efficient with optimal load, but the losses add up, and get a lot worse outside a narrow envelope.
You could feed your servers off fat 12/24/48 volt supplies but with how much power a modern server can pull you're already converting in bulk even if you don't do that, limiting the potential advantages. For running CPU/GPU/RAM, there is no other option. When you need hundreds of amps at 1-2 volts, you convert that centimeters away if at all possible.
A datacenter using DC distribution is still using high voltages and stepping them down in layers. The hassle it avoids is in other aspects of power delivery.
Unfortunately, the EPO ended up having been installed differently than the plans indicated [0] — there was definitely more than an hour of unexpected downtime figuring out how to "unrig" its decades of faithful misconfiguration.
During this same months-long modernization, a separate facility decided to install a temporary tap off our already-temporary taps (no engineering consultation attempted) — with resulting Megawatts of melting disaster. More hours downtime.
tl;dr: shit happens
[0] EPO's main neutral was tied into a general lighting circuit's j-box, some bullshit jerryrigging from an 80s sparkie; our apprentice was tasked with taking out old fluorescents... ended up taking out entire facility (through no fault of his own).
Yes, of course both of those things are true, and yes, some data centers do engage in those processes for their unique advantages. The issue is that aside from specialty kit designed for that use (like the AWS Outposts with their DC conversion), the rank-and-file kit is still predominantly AC-driven, and that doesn't seem to be changing just yet.
While I'd love to see more DC-flavored kit accessible to the mainstream, it's a chicken-and-egg problem that neither the power vendors (APC, Eaton, etc) nor the kit makers (Dell, Cisco, HP, Supermicro, etc) seem to want to take the plunge on first. Until then, this remains a niche-feature for niche-users deal, I wager.
https://www.nokia.com/bell-labs/publications-and-media/publi...
Every single DC I’ve worked in, from two racks to hundreds, has been AC-driven. It’s just cheaper to go after inefficiencies in consumption first with standard kit than to optimize for AC-DC conversion loss. I’m not saying DC isn’t the future so much as I’ve been hearing it’s the future for about as long as Elmo’s promised FSD is coming “next year”.
https://developer.nvidia.com/blog/nvidia-800-v-hvdc-architec...
https://blogs.nvidia.com/blog/gigawatt-ai-factories-ocp-vera...
almost everybody in the industry is embracing 800V DC mostly because of Vera Rubin and the increased electricity requirements.
It's much cheaper, quicker and easier to use cooling blocks with leak-proof quick connectors to do liquid cooling. It means you can use normal equipment, and you don't need to reinforce the floor.
A lot of "edge" stuff has 12/48v screw terminals, which I suspect is because they are designed to be telco compatible.
For megawatt racks though, I'm still not really sure.
Edit: s/have/had/
DC doesn't have such a killer. There are a decent bunch of benefits, and the main drawback is gear availability. However, the chicken-and-egg problem is being solved by hyperscalers. Like it or not, the rank-and-file of small & medium businesses is dying, and massive deployments like AWS/GCP/Azure/Meta are becoming the norm. Those four already account for 44% of data center capacity! If they switch to DC can you still call it "specialty kit", or would it perhaps be more accurate to call it "industry norm"?
It is becoming increasingly obvious that the rest of the industry is essentially getting Big Tech's leftovers. I wouldn't be surprised if DC became the norm for colocation over the next few decades.
[0]: https://thecoolingreport.com/intel/pfas-two-phase-immersion-...
Fucks sake.
Looking at the manual for the first server line that came to mind: you can buy a Dell PowerEdge R730 today with a first-party supported DC power supply.
- Three conductors vs two, but they can be the next gauge up since the current flows on three conductors
- no significant skin effect at 400Hz -> use speaker wire, lol.
- large voltage/current DC breakers are... gnarly, and expensive. DC does not like to stop flowing
- The 400Hz distribution industry is massive; the entire aerospace industry runs on it. No need for niche or custom parts.
- 3 phase @ 400Hz is x6 = 2.4kHz. Six diodes will rectify it with almost no relevant amount of ripple (Vmin is 87% of Vmax) and very small caps will smooth it.
As an aside, with three (or more) phase you can use multi-tap transformers and get an arbitrary number of poles. 7 phases at 400Hz -> 5.6kHz. Your PSU is now 14 diodes and a ceramic cap.
- you still get to use step up/down transformers, but at 400Hz they're very small.
- merging power sources is a lot easier (but for the phase angle)
- DC-DC converters are great, but you're not going to beat a transformer in efficiency or reliability
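The six-pulse ripple figure quoted above checks out: an n-pulse rectifier's output valley sits at cos(pi/n) of the peak. A quick sketch, assuming ideal diodes:

```python
import math

def ripple_floor(pulses: int) -> float:
    """Valley of an ideal n-pulse rectifier output, as a fraction of peak.

    The output rides the tops of n evenly spaced sinusoids; adjacent
    peaks are 2*pi/n apart, so the valley between them is cos(pi/n).
    """
    return math.cos(math.pi / pulses)

# Six-pulse (3-phase full bridge): Vmin is ~87% of Vmax, as claimed.
print(f"6-pulse:  Vmin/Vmax = {ripple_floor(6):.3f}")
# 14-pulse (7 phases, full bridge): ripple nearly vanishes.
print(f"14-pulse: Vmin/Vmax = {ripple_floor(14):.3f}")
# Ripple frequency is pulses * line frequency: 6 * 400 Hz = 2.4 kHz.
```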
now run that unshielded wire 50 meters past racks of GPUs and enjoy your EMI
> The 400Hz distribution industry is massive; the entire aerospace industry runs on it
nothing in that catalog is rated for 100kW–1MW rack loads at 800Vrms
> 3 phase @ 400Hz is x6 = 2.4kHz... Your PSU is now 14 diodes and a ceramic cap
you still need an inverter-based UPS upstream, which is the exact conversion stage DC eliminates
> large voltage/current DC breakers are.. gnarly, and expensive. DC does not like to stop flowing
SiC solid-state DC breakers are shipping today from every major vendor
> DC-DC converters are great, but you're not going to beat a transformer in efficiency or reliability
wide-bandgap converters are at 95%+ with no moving parts
Multipole expansion scales faster than r^2.
Also, I'm not in the field (clearly), but GPUs can't handle 2.4 kHz? The quarter wavelength is 30km.
"nothing in that catalog is rated for 100kW–1MW rack loads at 800Vrms"
Current-wise, the catalog covers this range just fine. As to the voltages, well, that's the whole point of AC! The voltage you need is but a few loops of wire away.
"you still need an inverter-based UPS upstream, which is the exact conversion stage DC eliminates"
So keep it? To clarify, this is the "we're too good for plebeian power, so we'll transform it AC->DC->AC", right?
"SiC solid-state DC breakers are shipping today from every major vendor"
Of course they do. They're also pricey, have limited current capability, and conduct less well, so they run hotter (both of which are capital costs, and therefore irrelevant when the industry is awash with GCC money).
They're really nice though.
"wide-bandgap converters are at 95%+ with no moving parts"
transformers have no moving parts. Loaded they can do 97%+ efficiency, or 2MW of heat eliminated on a 100MW center.
The skin depth by the way is sqrt(2 1.7e-8 ohm m / (2 pi 400Hz mu0))=~3mm for copper---OK for single rack, but starts to be significant for the type of bus bars that an aisle of racks might want.
As for efficiency, both 400Hz transformers AND fancy DC-DC converters are around 95% efficient, except that AC requires electronics to rectify it to DC, losing another few percent, so the slight advantage goes to DC, actually.
As for merging power, remember that DC DC converter uses an internal AC stage, so it's the same---you can have multiple primary windings, just like for plain AC.
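The ~3mm skin-depth figure above reproduces directly from delta = sqrt(2*rho/(omega*mu)); a sketch using the same copper resistivity as in the comment:

```python
import math

RHO_CU = 1.7e-8            # copper resistivity, ohm*m (as above)
MU0 = 4 * math.pi * 1e-7   # vacuum permeability, H/m

def skin_depth_m(f_hz: float) -> float:
    """Skin depth sqrt(2*rho / (2*pi*f*mu0)) in metres."""
    return math.sqrt(2 * RHO_CU / (2 * math.pi * f_hz * MU0))

for f in (50, 60, 400):
    print(f"{f:>4} Hz: skin depth ~{skin_depth_m(f) * 1000:.1f} mm")
# ~9.3 mm at 50 Hz, ~8.5 mm at 60 Hz, ~3.3 mm at 400 Hz: past a few
# millimetres of busbar thickness, extra copper stops helping at 400 Hz.
```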
I am a recovering audiophool.
I do own a pair of 2m long Monster Cable speaker cables (with locking gold plated banana plugs). I am fairly certain I've used welders with smaller cables.
(In my defence, I bought those as a teenager in the late 80s. I am not so easily marketed to with snake oil these days. I hope.)
(On the other hand, I really like the idea of a reliably stable plus and minus 70V or maybe 100V DC power supply to my house. That'd make audio power amplifiers much easier and lighter...)
What are you talking about? There's a very significant skin effect at 400Hz. Skin effect goes up with frequency. These datacenters use copper busbars, not cable, so skin effect is an important consideration.
You obviously need at least a dozen stands in parallel!!
Clearly skin effect scales with frequency, but 400 Hz is still low: skin depth shrinks only ~2.5x relative to line frequency (it scales with the square root of frequency), so it's still about 3mm. 3mm on each side makes for a pretty hefty rectangular cross-section.
I'm pretty sure you have my delivery address from when I bought sorted Lego from you about 10 years back.
Let me know when to expect the 100,000Amp test equipment!
I shall make sure I wear better PPE than just my reading glasses.
:-)
Ah, that lego project... that was one I always wondered if I should have industrialized it but sourcing enough lego was a real problem.
That's low voltage lightning :)
Many datacenters I'd been to at that point were already DC.
Didn't think this was that new of a trend in 2026, but also acknowledge I did not visit more than a handful of datacenters since 2007.
It just seemed like an undeniably logical thing to do.
800 volts DC, at the megawatt power supply levels, implies fault impulses of more than a megajoule. Google tells me that's about 2 hand grenades worth of boom. That's an optimistic lower bound.
The resulting copper plasma cloud is a burn and inhalation hazard, along with the overpressure.
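A rough version of that grenade arithmetic, with the fault duration as an explicit assumption (1 second of a megawatt-class fault; real protection should clear far faster):

```python
P_FAULT_W = 1e6       # megawatt-class rack fed at 800 V DC
T_CLEAR_S = 1.0       # assumed time before the fault is interrupted
TNT_J_PER_G = 4184.0  # standard TNT equivalence

energy_j = P_FAULT_W * T_CLEAR_S
tnt_g = energy_j / TNT_J_PER_G
print(f"{energy_j / 1e6:.0f} MJ ~= {tnt_g:.0f} g TNT equivalent")
# ~239 g of TNT equivalent: roughly the explosive fill of a couple of
# hand grenades, consistent with the estimate above.
```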
Let's say you get a 10 kiloamp fault current, this will then induce voltages everywhere you don't want it to go. If all the interconnects are fiber, that's really not a problem, but you have to have everything EMP shielded if you don't want boards popping randomly after such an event.
The "efficiency" of removing the extra power conversions also removes filtering and surge suppression. It's entirely possible that one power supply over-voltage takes out half of your racks. The MOSFETs used tend to fail closed instead of open, making failures far worse than a simple outage.
Very smart people are making very smart mistakes.
However, higher DC voltage is riskier, and it's not at all standard for electrical and building code reasons. In particular, breaking DC circuits is more difficult because there's no zero-crossing point to naturally extinguish an arc, and 170V (US/120VAC) or 340V (Europe/240VAC) is enough to start a substantial arc under the right circumstances.
Unfortunately for your lighting, it's also both simple and efficient to stack enough LEDs together such that their forward voltage drop is approximately the rectified peak (i.e. targeting that 170/340V peak). That means that the bulb needs only one serial string of LEDs without parallel balancing, making the rest of the circuitry (including voltage regulation, which would still be necessary in DC world) simpler.
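A sketch of that string-sizing arithmetic (the 3.0 V forward drop is an assumed typical white-LED figure, not from the comment):

```python
import math

def led_string(v_rms: float, v_forward: float = 3.0) -> int:
    """How many LEDs fit in series under the rectified mains peak."""
    v_peak = v_rms * math.sqrt(2)
    return int(v_peak // v_forward)

print(led_string(120))  # US: ~170 V peak -> 56 LEDs in one string
print(led_string(240))  # EU: ~340 V peak -> 113 LEDs
```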
The part that would genuinely be cheaper is avoiding problematic flicker. It takes a reasonably high quality LED driver to avoid 120Hz flicker, but a DC-supplied driver could be simpler and cheaper.
IEEE 802.3bt can deliver up to 71W at the destination: just pull Cat 5/6 everywhere.
* https://en.wikipedia.org/wiki/Power_over_Ethernet#Standard_i...
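For context, the IEEE PoE tiers and their budgets (power sourced at the switch vs. power guaranteed at the device after cable losses):

```python
# (standard, watts at the PSE, watts guaranteed at the powered device)
POE_TIERS = [
    ("802.3af (Type 1)", 15.4, 12.95),
    ("802.3at (Type 2)", 30.0, 25.5),
    ("802.3bt (Type 3)", 60.0, 51.0),
    ("802.3bt (Type 4)", 90.0, 71.3),
]
for name, pse_w, pd_w in POE_TIERS:
    print(f"{name}: {pse_w:5.1f} W sourced, {pd_w:5.2f} W delivered")
```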
In the commercial/industrial space this may be worth it: how long do these bulbs last? how much (per hour (equivalent)) do you pay your facilities folks? how much time does it take for employees or tenants to report an outage and for your folks to get a ladder (or scissor lift) to change the bulb?
The gain from DC-DC converters is small, and DC devices are a small part of usage compared to appliances. There is no way it will pay back the cost of replacing all the appliances.
(Am I just showing my age here? How many of you have ever bought incandescent globes for house lighting? I vaguely recall it may be illegal to sell them here in .au these days. I really like quartz halogen globes, and use them in 4 or 5 desk lamps I have, but these days I need to get globes for em out of China instead of being able to pick them up from the supermarket like I could 10 or 20 years ago.)
It is silly to have AC to DC converters in all of my wall connected electronics ( LED bulbs, home controller, computer equipment etc )
You could wire your house for 12, 24 or 48V DC tomorrow and some off-grid dwellers have done just that. But since inverters have become cheap enough such installations are becoming more and more rare. The only place where you still see that is in cars, trucks and vessels.
And if you thought cooking water in a camper on an inverter is tricky wait until you start running things like washing machines and other large appliances off low voltage DC. You'll be using massive cables the cost of which will outweigh any savings.
...There was some kind of switch involved, I hope?
Gratuitous disclaimer, don't try this at home kids...
In all likely not worth the trouble. When I moved to Canada I gave away most of my power tools for that reason and when I moved back I had to do that all over again.
If you ever have to do it again, you can probably get a transformer rated high enough for power-tools for cheaper than replacing all of your power tools.
Killed a few tapes with a transformer on a US tape deck before buying a 220V 50Hz unit. No, I don’t remember if the pitch was grossly off, but I’m guessing it wasn’t.
I think the answer to your question is that it mostly doesn't matter for personal mug size quantities of hot water and if it does matter to you there are readily available competing options such as dedicated taps for your kitchen sink.
Perhaps the biggest reason is that a traditional kettle on any half decent electric range will match if not exceed the power output of any imported electric kettle. Many even go well beyond that with one burner marked "quick boil" or similar.
I’m surprised that American exceptionalism can tolerate half powered sockets.
How expensive would a proper AC->DC->AC brick for that power level be?
A pure sinewave inverter for that kind of power is maybe 600 to 1000 bucks or so, then you'd still need the other side and maybe a smallish battery in the middle to stabilize the whole thing. Or you could use one of those single-phase inverters they use for motors.
No one in the USA drinks hot tea. The choice (and it tends to be regionally based) is sweet or unsweet tea. No need to boil a kettle quickly for that.
There are dozens of us.
Perplexingly I was traveling in one of the iced tea regions of the country in need of a cup of hot tea, and they had no way to make it. Like, you have a commercial coffee maker and hot cups, the coffee maker has a hot(ish) water tap. All you need is a $4 box of teabags that’ll last until the heat death of the universe. Nope.
Still though, I don't seem to see most of those people seriously clamoring for the electric kettle to go a bit faster. The cost for the wiring difference and dealing with odd imported kettles just isn't worth it generally.
... Unless you're buying it pre-made, does this not still start with making hot tea the regular way? Or what exactly are you doing with the tea bags and loose tea from the supermarket?
I've heard of Americans getting EU plugs for things like this, but it's extremely uncommon. The simple answer is: Americans just don't drink tea very much, and the ones who do aren't going to go to this much trouble and expense just to be able to boil water for their tea twice as quickly.
It would be relatively easy for the US to go to 240V: swap out single-pole breakers for double-pole, and change your NEMA 5 plugs for NEMA 6.
For a transition period you could easily have 240V and 120V plugs right next to each other (because of split phase you can 'splice in' 120V easily: just run cable like you would for a NEMA 14 plug: L1/L2/N/G).
What would be the real challenge would be going from 50 to 60Hz.
Other way around, no? The US is already 60Hz.
Edit: I mostly remember this because the SNES games I used to buy in the US and brought back to Europe ran noticeably slower.
I can watch 1080p video on YouTube and it runs in an up-to-date web browser using less than 50% CPU on 12-year-old hardware with 8GB of RAM and a graphics card that was a budget option at the time (my searches indicate it draws at most 80W, though it expects a 500W PSU for some reason).
I end up converting stuff anyhow, because all my loads run at different voltages. Even though I had my lights, vent fan, and heater fans running on 12V, I still ended up having to change voltages for most of the loads I wanted to run, or generate AC to charge my computer and run a rice cooker.
Not to mention that running anything that draws any real power quickly needs a much thicker wire at 12V. So you're either needing to run higher voltage DC than all your loads for distribution and then lowering the voltage when it gets to the device, or you simply can't draw much power.
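The thick-wire problem is just Ohm's law. A sketch with assumed numbers (1500 W load, 10 m run of 12 AWG copper at ~5.2 mohm/m):

```python
R_LOOP_OHM = 2 * 10 * 5.2e-3   # 10 m out and back in 12 AWG copper

def line_loss(power_w: float, volts: float) -> tuple[float, float]:
    """Current the wire carries and the I^2*R loss in the run."""
    amps = power_w / volts
    return amps, amps * amps * R_LOOP_OHM

for v in (12, 48, 120, 350):
    amps, loss_w = line_loss(1500, v)
    print(f"{v:>4} V: {amps:6.1f} A, {loss_w:7.1f} W lost in the wire")
# At 12 V the wire would have to carry 125 A (far beyond 12 AWG's
# rating) and would dissipate more than the load itself; at 350 V the
# same run loses about 2 W.
```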
Not that you can't have higher voltage DC; with my newer system the line from my solar panels to my charger controller is around 350VDC and I can use 10awg for that... but none of the loads I own that draw much power (saws, instapot, rice cooker, hammond organ, tube guitar amp) take DC :D
4KW of panels, a 400W 48V EG4 6000XP charge controller/inverter, 3x EG4 LifePower4 48V batteries, and a Raspberry Pi running Solar Assistant.
It feels like a bit of overkill, and there is still a whole MPPT unused on the 6000XP so I could still double my panel input. Also, Solar Assistant tells me that I rarely go below 75% battery storage. If I just wanted to run my fridge and assorted convenience loads (and ran things like table saws off a generator) then I could get away with a lot less of a system.
But I'm operating a recording studio, and there were a couple days this winter where I had a full-band session and a couple days of storms and got down to below 50%.
My inverter-charger is connected to my batteries with 4/0 cable. That wasn't fun to run.
Thus, even if you had DC in the walls, it would be 100+ volts, and you'd still have conversion down to the lower voltages that electronics use. If you look at the comments in this thread from people who work in telco, they talk about how voltage enters equipment at -48V and is then further lowered.
We have some old ceiling and exhaust fans, but I know those can be replaced. Our refrigerator is AC, but extended family with an off-grid home has a DC refrigerator that cycles way less, probably due to multiple design factors but I’m sure the lack of transformer heat is part of it. I’m not as sure about laundry machine or oven/cooktop options but I believe those are also running on DC in the off-grid home without inverters.
Most of these AC appliances also have transformers in them anyway for the control boards. It seems kind of insane to me that we are still doing things this way.
AC motors use way more power than the piddly control boards in most home appliances. So you lose a little efficiency on conversion, but being 80% efficient doesn't matter much when it's 1-5% of the device's energy budget. You generally gain way more than that from similarly priced AC motors being more efficient.
I know that a long time ago DC-to-DC voltage converters were very large in size, which meant AC would win on space efficiency. But unless I’m mistaken, that’s no longer the case. Wouldn’t a DC refrigerator with equivalent insulation and interior volume have nearly identical exterior dimensions as an AC refrigerator?
Sure, but it’s important to separate what could be built from what is being built based on consumer preferences and buying habits. The average refrigerator could be significantly quieter, but how often do people actually listen to what they are buying? People buying Teslas didn’t test drive the actual car they were buying, so the company deprioritized panel gaps. And so forth; companies optimize in ways that maximize their profits, not arbitrary metrics.
A DC household would have to choose a trade-off between multiple lines with different voltages or fewer voltages that need to be adapted to the appliances. And we're right back at the AC situation, but worse since DC voltages are more difficult to change.
But consumers like datacenters can very well plan ahead and standardize on a single DC voltage. They already need beefy equipment to deal with interruptions, power surges, non-sinusoidal components, and brownouts, which already involves transformers, capacitors, and DC conversion for battery storage. Therefore almost no additional equipment is required.
The trade-off between, say, one (relatively) high voltage DC bus throughout the home vs many branches with lower discrete voltages is indeed a problem. With AC, we took the bus approach, running 120v everywhere (in the U.S., higher elsewhere). I’m inclined to say we should keep doing that for flexibility and predictability. But it’s a trade off, like you said. It would obviously help if regulatory and standards bodies came out with official recommendations.
Everything else I can think of in a typical household is basically a mere heater that in principle works equally well with AC and DC of the correct voltage. Even computers can be said to mostly care about the correct voltage since AC->DC conversion is vastly easier than voltage conversion.
No, they don't, not at all.
Most modern appliances have variable-speed motors these days. You can't do that by just connecting AC to a motor; you need a control board to generate the waveforms necessary to make the motor turn at the speed and direction you want. That control board has to be fed with DC power. (Source: I used to design BLDC motor control systems)
Only really simple appliances, like old-fashioned horribly-inefficient clothes dryers, still use AC induction motors, and those are mostly being phased out. (Bathroom fans also need AC; they're usually cheap synchronous reluctance motors.)
So it really doesn't matter much whether the incoming power is AC or DC these days, unless you have a bunch of ancient appliances that still use induction motors. If it's AC, it's going to be rectified and fed into a DC-to-DC converter to create the lower DC voltages needed. If it's a higher DC voltage, we can skip the rectification step and not worry much about ripple.
Indeed. And that’s quite normal. Our electrical system should serve our modern needs.
> but you can't run that through a home
5V might be too low for that length of wire. But you could most definitely have a low voltage line in your house that we could design around, maybe 12V. Electric vehicles are moving towards 48V for accessories. It seems like lack of a standard is holding us back more than anything else.
Or we could just keep doing 120V in the walls, with a DC supply. Modern DC-to-DC voltage converters are very efficient and small. But maybe I’m wrong. A lot of people seem to believe they are still not good enough yet for such a change to make sense.
> If you're going to have AC and DC then you might as well just have AC.
I arrive at the opposite conclusion. Most things are natively DC. So therefore, power in the walls should be DC and we should covert it to AC near the endpoint where necessary.
For 800V DC, a simple UPS could interface with the main supply using just a pair of (large) diodes, and a more complex and more efficient one could use some fancy solid state switches, but there’s no need for anything as complex as a line-interactive AC UPS.
Installing a ceiling fan used to be treacherous and so heavy. Also loud and buzzy after installed. Now the fans in these things are so lightweight and easy.
seeing the same in many more areas (lighting, etc)
The irony is all the recessed lights I picked out are DC, they all have little AC-DC boxes hanging off them using a proprietary connector. If I hadn't needed to pass a rough-in inspection going all DC would've been trivial.
Hard as a rock!
Well, it's harder than a rock! If your house gets 800V DC you're still gonna need "bricks" to convert that to the 5VDC or 12VDC (or maybe 19VDC) that most of the things that currently have "bricks" need.
And if your house gets lower voltage DC, you're gonna have the problem of worth-stealing sized wiring to run your stove, water heater, or car charger.
I reckon it'd be nice to have USB C PD ports everywhere I have a 220VAC power point, but 5 years ago that'd have been a USB type A port - and even now those'd be getting close to useless. We use a Type I (AS/NZS 3112) power point plug here - and that hasn't needed to change in probably a century. I doubt there's ever been a low voltage DC plug/socket standard that's lasted in use for anything like that long - probably the old "car cigarette lighter" 12VDC thing? I'm glad I don't have a house full of those.
My understanding is that DC breakers are somewhat prone to fires for this reason, too.
The electricians I was working with also told me stories about how with the really big breakers, you don't stand in front of it when you throw it, because sometimes it can turn into a cloud of molten metal vapor. And that's just using them as intended.
Allegedly
While on "work experience" from high school I was put on washing power lines coming straight out of the local power station near the ocean - lots of salt buildups to clear.
Same deal, flashover suits and occasional arcs .. and much laughter from the ground operators who drifted the work bucket close.
Another story in the same line is that I heard that a horse got killed by contact with a lantern battery, but I don't have any reference for that, just a story by a family member that collected coaches.
Any kid could do it. Probably. But not a drunk one and instead of waiting long enough for both wheels to stay synchronized indicating there would be only a small jump in phase he just said f* it and connected the two anyway. Predictably, this led to a serious disturbance to the grid which in turn caused a whole lot of other stuff to disengage. Since his chances for re-employment on account of his new-found fame were somewhat minimal he decided to emigrate instead :)
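The hazard in that story is easy to quantify: two equal sources at the same frequency but offset in phase differ instantaneously by up to 2*Vpeak*sin(phi/2), and that difference is what drives the circulating current when the contacts close. A sketch assuming 120 Vrms machines:

```python
import math

def max_voltage_delta(v_rms: float, phase_deg: float) -> float:
    """Peak instantaneous difference between two equal-amplitude,
    same-frequency sinusoids offset by phase_deg."""
    v_peak = v_rms * math.sqrt(2)
    return 2 * v_peak * math.sin(math.radians(phase_deg) / 2)

for phase in (5, 30, 90, 180):
    print(f"{phase:>3} deg out: up to {max_voltage_delta(120, phase):5.0f} V "
          "across the closing contacts")
```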
It would have self-extinguished if you waited long enough for the probe to vaporize.
I think it's that DC breakers are more expensive, so people use AC-rated breakers instead. They are both rated for 400V @ 10 amps, it's the same thing, right?
It turns out they are not, and most people, even electronics types, rarely play with 200V+ DC.
AC arcs are easier to extinguish than DC arcs, but DC will creep much easier than AC and so on.
From a personal point of view: I've worked enough with both, up to about 1KV at appreciable power levels and much higher than that at reduced power. Up to 50V or so I'd rather work with DC than AC, but they're not much different. From there up to 400V or so I'd much rather have AC, and above 400V the answer is 'neither', because you're in some kind of gray zone where creep is still low so you won't know something is amiss until it is too late. And above 1KV in normal settings (say, picture tubes in old small b&w TVs, and higher up when they're color and larger) it will throw you right across the room, but you'll likely live because the currents are low.
HF HV... now that's a different matter and I'm very respectful of anything in that domain, and still have a burn from a Tronser trimmer more than 45 years after it happened. Note to self: keep eye on SWR meter/Spectrum analyzer and finger position while trimming large end stages.
Can you say more about "creep"? Is the resistance changing? Or is material actually migrating?
Also curious why it's worse using DC.
Electromagnets don't work for DC, so your breaker will never trip. For thermal protection you need current, so that checks out, and it would make sense for it to be rated under 50V as that's considered the highest voltage that's not life-threatening on touch.
PV batteries in general have a very high current (100s of A) at ~50ish volts, so I don't think there's a major use case for using household breakers for them.
I'm still not getting your point, BTW; switches and breakers are two separate things, with different workings, and household (and datacenter) DC would be, I think, around 400ish V, which is a bit higher than the peak voltage of AC but still within the arc limits of household wiring (at least in 230V countries).
The advantage of DC is that you use your wiring more efficiently, as the mean and peak wattage are the same at all times. Going with 48V would mean high resistive losses.
If electromagnets don't work for DC then what am I supposed to do with this pile of DC solenoids and relays? ;)
> PV Batteries in general have a very high current (100s of A) at ~50Vish volts, so I dont think there's a major usecase for using household breakers for them.
That's what the SCCR rating is for. When there's a fault you're going to have a LOT of current flowing until your safety kicks in. Something like the grid or a battery bank will happily provide thousands of amps almost instantaneously. Breakers designed for protecting building wiring are rated for this. Now, most household breakers aren't dual DC/AC rated, but you can actually buy DC rated breakers that fit in a home panel (Square D QO series).
> Im still not getting your point BTW, switches and breakers are two separate things, with different workings, and household (and datacenter) DC would be I think around 400ish V, which is a bit higher than the peak voltage of AC, but still within the arc limits of household wiring (at least in 230V countries).
My point is that there isn't any material reason why DC can't be as safe as AC, all the proper safety equipment already exists. Extinguishing a DC arc during a fault is a solved problem for equipment at household scale.
> The advantage of DC is that you use your wiring more efficiently as the mean and peak wattage is the same at all times. Going with 48V would mean high resistive losses.
I just mentioned 48V because it's a common equipment voltage for household DC systems. 400V would be good for big motors and resistive heating loads.
Regarding DC vs AC and wiring efficiency, talking about mean vs peak wattage just confuses the issue. 1 volt DC is 1 volt RMS. It is an apples-to-apples comparison. If you want to say "we can use 170VDC or 120VAC with the same insulation withstand rating, and at lower current for the same power", then that is absolutely true. But your common 600V THHN building wire won't care if you're using 400V AC or DC, so it's mostly immaterial.
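That apples-to-apples point in numbers: into a resistive load, volts DC equal volts RMS, while the insulation has to withstand the AC peak. A minimal sketch:

```python
import math

V_RMS = 120.0
V_PEAK = V_RMS * math.sqrt(2)   # ~170 V: what the insulation sees

# Same resistor, same average power: 120 V DC == 120 V RMS AC. But run
# DC at the AC *peak* voltage (same withstand rating) and you push
# (sqrt(2))^2 = 2x the power through the same conductor.
power_ratio = (V_PEAK / V_RMS) ** 2
print(f"peak = {V_PEAK:.0f} V, DC-at-peak power ratio = {power_ratio:.2f}")
```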
(My stand mixer is the lone sad exception)
I spent a few years getting flown out around the world to service gear at different datacenters. I learned to pack an IEC 60320 C14 to NEMA 5-15R adapter cable and a dumb, un-protected* NEMA 5-15R power strip. While on-site at the datacenters, an empty PDU receptacle was often easy to find. At hotels, I'd bring home a native cable borrowed from or given to me by the native datacenter staff or I'd ask the hotel front desk to borrow a "computer power cable," (more often, I'd just show them a photo) and they generally were able to lend me one. It worked great. I never found a power supply that wasn't content with 208 or 240V.
Example adapters: https://www.amazon.com/dp/B0FD7PHB7Y or https://www.amazon.com/dp/B01IBIC1XG
*: Some fancier power strips with surge suppression have a MOV over-voltage varistor that may burn up if given 200V+, rendering the power strip useless. Hence, unprotected strips are necessary.
Thinking about the failure modes gave me the heebie jeebies, but the gas had been disconnected ages prior.
Once you get into higher power (laptops and up), switching and distribution get harder, so the advantages fade.
For bigger appliances (fridge, etc), AC is fine + practical.
However, there's also PoE (24 or 48V!), so maybe that's the right approach. It's not like each outlet is going to run a heater anyway.
Unless you mean running AC and installing inverters in the wall? What is this even for? All my electronics are DC but critically they all require different voltages. The only thing I might benefit from would be higher voltage service because there are times that 15 A at 120 V doesn't cut it.
For PoE, I thought it was standardized at 48 V, but I see lots of cameras run at 24 V, and I think I've even seen 12 V. Seems a bit of a mess.
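A quick sketch of why the voltage choice matters for PoE: for the same delivered power, halving the supply voltage doubles the current and quadruples the I²R heat lost in the cable. The cable resistance below is an assumed round-trip figure for roughly 100 m of Cat5e; the exact number doesn't change the ratio.

```python
def cable_loss_fraction(power_w: float, volts: float, r_cable: float) -> float:
    """First-order fraction of delivered power lost as heat in the cable."""
    i = power_w / volts            # load current, ignoring the loss itself
    return i ** 2 * r_cable / power_w

R = 12.5  # ohms round-trip, assumed figure for ~100 m of cable
loss_48 = cable_loss_fraction(25.5, 48.0, R)  # a 25.5 W load at 48 V
loss_24 = cable_loss_fraction(25.5, 24.0, R)  # the same load at passive 24 V
# loss scales as 1/V^2: the 24 V run wastes 4x the fraction in the cable
```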
The irony...
I always thought AC’s primary benefit was its transmission efficiency??
Would love to learn if anyone knows more about this
To expand on this, a given power line has a maximum current (set by conductor heating) and a maximum voltage (set by insulation, which must withstand the peak). DC can sit at that peak voltage constantly, while AC spends most of each cycle below it (its RMS value is only peak/√2), so at the same peak voltage and current, AC delivers less power over the same line.
The transmission efficiency of AC comes from the fact that you can pretty trivially make a 1 megavolt AC line. The higher the voltage, the lower the current has to be to deliver the same amount of power. And lower current means less resistive line loss, since that loss scales as I²R.
But that really is the only advantage of AC. DC at the same voltage as AC will ultimately be more efficient, especially if it's humid or the line is underwater. A changing current induces currents in nearby conductive materials, so a portion of AC power is drained simply by the fact that the current on the line is constantly alternating. DC doesn't alternate, so it never loses power that way.
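The I²R argument above can be checked with a two-line calculation (the 100 MW load and 10 Ω line resistance are illustrative numbers, not real line parameters):

```python
def line_loss_w(power_w: float, volts: float, r_line: float) -> float:
    """Resistive loss in a line delivering power_w at transmission voltage volts."""
    i = power_w / volts        # current needed to carry the power
    return i ** 2 * r_line     # I^2 * R heating in the conductors

P = 100e6   # 100 MW to deliver (illustrative)
R = 10.0    # ohms of line resistance (illustrative)
loss_100kV = line_loss_w(P, 100e3, R)  # 1000 A -> 10 MW lost
loss_1MV = line_loss_w(P, 1e6, R)      # 100 A  -> 0.1 MW lost
# 10x the voltage -> 1/100th the loss for the same delivered power
```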
Another key benefit of DC is that it can bridge grids. The obstacle to interconnecting grids is entirely the nature of AC power. AC has a frequency and a phase. If two grids don't share a frequency (happens in the EU) or a phase (happens everywhere, particularly between the grids in the US), they cannot be connected directly. Otherwise the generators end up fighting each other rather than providing power to a load.
In short, AC won because it was cheap and easy to make high-voltage AC. DC is coming back because it has only somewhat recently become affordable to make the equivalent high-to-low and low-to-high voltage conversions with DC. And DC carries further benefits that AC does not.
BTW, megavolt DC-DC converters are a sight to behold: https://en.wikipedia.org/wiki/File:Pole_2_Thyristor_Valve.jp...
There are many factors involved, and "efficiency" is only one. Cost is the real driver, as with everything.
AC is effective when you need to step voltage down frequently. Think of the transformers on poles everywhere. Stepping down AC with transformers means you can use smaller, cheaper conductors to get from high-voltage transmission to lower-voltage distribution and, finally, to consumers. Without this, you need massive conductors and/or high voltages and all the costs that go with them.
AC is less effective, for instance, when transmitting high power over long, uninterrupted distances or feeding high density DC loads. Here, the reactive[1] power penalty of AC begins to dominate. This is a far less common problem, and so "Tesla won" is the widely held mental shortcut. Physics doesn't care, however; the DC case remains and is applied when necessary to reduce cost.
Other people, of course, have other definitions of high voltage:
"This resonant tower is known as a Tesla coil. This particular one is just over 17 feet tall and it can generate about a million volts at 60,000 cycles per second."
and:
"This pulse forming network can deliver a shaped pulse of over 50,000 amps with a total energy of about 1,057 times the tower primary energy"
If there had been anything like a high-power transistor back then, he would have used that. High-power transistors robust enough to handle the grid were developed only recently, over 100 years after the Tesla/Edison AC/DC argument.
This!
The sooner people realize these facts, the better. Pervasive high-rise buildings did not happen before the invention of modern cranes.
Exactly twenty years ago I was doing novel research on GaN characterization, and my supervisors made a lot of money consulting around the world and successfully founded a government-funded start-up company around the technology. Together with SiC, these are the two game-changing wide-bandgap power semiconductor technologies that are only maturing recently.
Heck, even the Nobel Prize-winning blue LED discovery was only made feasible by GaN. Watch the excellent video Veritasium made about this back story [1].
[1] Why It Was Almost Impossible to Make the Blue LED:
Gallium accompanies aluminum and zinc ores in very low concentrations, so it is extracted only in aluminum or zinc mines, as a byproduct.
However, the abundance of gallium is similar to that of lithium, and gallium is used in smaller amounts, so there is no risk of running short of gallium in the near future.
On the other hand, all semiconductor devices with gallium also use some indium. Indium is used in even greater quantities in all LCD or OLED displays, to make transparent electrodes.
Indium is an extremely rare element in the entire universe, comparable with gold, so for indium there is a much greater risk that its reserves will become insufficient.
This could be mitigated by extracting such critical elements from the dumped electronic devices, but this is very expensive, because only small amounts of indium are used per device, so very large amounts of garbage would have to be processed in order to extract a sizable amount of it.
(Gallium is a byproduct of aluminum production. We aren't going to run out.)
Unless by “make from something” else you mean extract the element from existing chemical compounds found in Earth, in which case we’re still just using existing deposits on Earth.
The earth's crust is 8% aluminum.
We will have bigger problems before hitting this one.
We will not run out of gallium, but it is impossible to scale gallium production beyond the level supported by the current production of aluminum and zinc.
So there is a maximum level of gallium that can be used per year and it would not be possible to increase the production of blue and white LEDs and of power transistors above that level.
Fortunately, the amount of gallium used per device is very small, so it is not likely that we will hit that level soon. A much more serious problem is the associated consumption of indium, for which the resources are much less.
I'm inclined to think we've lost that gold.
The metal isn't going to disappear, but it won't be concentrated enough to be as easily retrievable.
Such predictions have an abysmal historic track record, because we tend to find workarounds both on the supply side (=> previously undiscovered reserves) as well as flexibility on the demand side (using substitutes).
This applies historically for oil, lithium, rare earth metals and basically everything else.
edit: I'm not saying we're never gonna run out of anything-- I'm just saying to not expect sudden, cataclysmic shortages in general, but instead steadily rising prices and a somewhat smoothish transition to alternatives.
But non-sustainable pricing is very different from "cataclysmic collapse", and too many people expect the latter for too many things, which is just not realistic in my view (and historical precedent makes a strong case against that assumption, too).
A society where water prices gradually increases to "reverse-osmosis only" (instead of "pump-from-the-ground-everywhere") levels is very different from a society where water suddenly runs out.
The people that rush to tell you that reserves are running out tend to omit what price they are talking about. That way of expressing oneself is normally called "a lie".
That's a classic example of the "preparedness paradox" [1]. When no one raises the alarm in time or it is being ignored, resources can go (effectively) exhausted before alternatives can be found, or countries either need to pay extraordinary amounts of money or go to war outright - this has happened in the past with guano [2], which was used for fertilizer and gunpowder production for well over a century until the Haber-Bosch ammonia process was developed at the start of the 20th century.
And we're actually seeing a repeat of that as well happening right now. Economists and scientists have sounded the alarm for decades that oil and gas are finite resources and that geopolitical tensions may impact everyone... no one gave too much of a fuck because one could always "drill baby drill", and now look where we are - Iran has blasted about 20% of Qatar's LNG capacity alone to pieces and blocked off the Strait of Hormuz, sending oil prices skyrocketing.
If you had made predictions/scenarios in 1850 based on guano deposits running out within a decade or two, you would have mispredicted completely, because much of the industry simply transitioned to sodium nitrate (before synthetic fertilisers took over). Today's media landscape would gladly have made such doom-and-gloom predictions for global agriculture back then.
I completely agree that quickly depleting reserves often indicate non-sustainable pricing for resources (which is obviously bad long term), but that is very different from sudden collapse.
Yes we can run out of oil, but nobody really knows if or even when that will happen. Right now I'm guessing we won't run out because wind and solar is so much cheaper for most purposes everyone is shifting anyway - this will take decades to play out.
We can run out of cheap and accessible oil very, very fast if the shitshow in MENA continues to escalate. Qatar already lost 20% of their LNG capacity in a single strike.
The US may have enough domestic oil production to sate its domestic demand, but the prices would still skyrocket even for them. Europe meanwhile, we're straight fucked here. Technically the oil hasn't run out, it's still in the oil fields of the journalist-butcher country and other sheikdoms, but that doesn't matter if it cannot be pumped out any more because the wells got blasted to pieces or if it cannot be transported thanks to Iranian mines, Europe is still running out of oil in practice.
It is easy to get infected by the media narratives that are notoriously biased towards maximum drama, but I firmly believe that we are not gonna escalate into such a scenario.
There are always options: sorting priorities by price, radical electrification of transport, or, at the extreme end, picking up coal hydrogenation again (it worked well enough to keep the Nazi war machine running for quite a while, with much worse access to crude).
For comparison: copper prices have increased by 500% since 2000, but people barely even care, and that's how I would expect "shortages" to typically go.
yyy! if we're going to wander off-topic :-) then I should mention elevators, water pumps, fire suppression including fire truck ladders and more! :-)
the podcaster Sebastian Major from "Our Fake History" did a looonnngg patreon episode on tesla and debunked most of the weird myths around tesla. Sebastian doesn't have a vendetta or anything, it's just amazing how much of the Tesla stuff is just nonsense or is viewed through a very weird bias nowadays. Major also briefly touches on the weird Edison stuff and how the internet has twisted Edison into a villain.
The funniest part is that The Oatmeal comic didn't invent this concept, but drew on pre-Internet narratives put forward by The Tesla Society, who were mailing busts of Tesla to universities around the country since the 70s at least. And that organization is explicitly nationalistic and religious, tied to other Serbian-American heritage organizations, and doing events with the Orthodox church.
So are many Serbs (more so if emigrants from atheist-socialist Yugoslavia, or descendants of folks who moved before WW2) as well as many other nations and organizations (America itself lol). So are many Something-Or-Other-American individuals and communities.
I presume that the organization(s) sending Tesla busts, being American-rooted, have had no illusions about which matters will forever remain impossible to communicate to Americans. (Such as anything not reducible to paperclip optimization.)
Instead, I consider it more likely that the point of promoting Tesla was not to impress anyone in America, but to uplift Serbia and generally the South Slavs of the Balkans who'd only gained national sovereignty in Tesla's day: "look, our heritage has already produced an honest-to-god American inventor half a fucking century ago, so you guys have zero excuse to act as if you're stuck in the middle ages - do join the cargo cult of mordorn civilization instead, will ya - we got value to extract from ya!"
>They've basically projected the Jobs/Woz divide back onto two historical figures who, in reality, barely interacted.
I'd rather say this has been projected for them, but by whom is anyone's guess; not like there's a shadowy cabal operating. Besides said Serbian-American heritage promoters and whatever their game is, I guess - but here we're not talking mid-XX century Serbian diaspora any more, but a "culturally nonspecific" audience.
Much safer to call it "a hivemind situation" when nobody knows where some idea comes from, and nobody is accountable for rebroadcasting it either, since it comes pre-tagged as Good and True and Useful and it is wrongthink to doubt those. Especially when the idea is so obviously Useful for excusing nonaction. ("I can't be bothered to learn the first thing about electricity, even the history of why I have access to it in the first place - but now that Tesla guy I've vaguely heard of, he was the great genius of the people! What better reason to Experience a Positive Emotion!")
Aside from all the cult classics Keanu is part of, like John Wick and The Matrix, and even discounting those, he is a genuinely humble, good person in his own right and might be one of the best people in Hollywood.
What I feel pissed about is that people like Andrew Tate and others like him took the concept of The Matrix and the contributions Keanu made to that movie and tried to capitalize on that cult classic decades later in the most toxic form; that might be the issue if we are talking about an era.
To be honest, Nikola Tesla was also a great person within the context of his time. GGP's comment is still true, but Tesla's contributions can hardly be overstated, and I'd much rather people believe these (Keanu/Tesla) to be the heroes rather than Tate/Musk etc.
If I take anything from Keanu, I would like to take his humility/humbleness.
It always seems to me that the far right are bereft of original ideas and always co-opt pre-existing concepts. There are exceptions, but I find that right-wing works are typically lacking humour or irony (cf. Ayn Rand's works).
That's not unique to them: Good artists copy; great artists steal.
- Oscar Wilde
If I was in his position I'm not sure I'd have taken it as well as he did.
enough Edison bashing!
Look, Tesla was a weirdo, but, he was a very good inventor who actually invented shit.
Edison was an industrialist, who knew the price of everything, and wasn't above spending a lot of money to destroy a rival.
Do I idolise Tesla? no, but I respect his understanding of high frequency electronics with really primitive tooling.
Do I despise Edison? Also no, but he was a massive prick. Excellent businessman, but an abrasive prick nevertheless.
IMHO, his vision of universal free electricity (transmitted wirelessly) was his dumbest. It was a novel idea, and he invested a lot (his time and other people's money) in it. The problem with the idea is that there was no way to monetize it and profit from it. (There were also the technical issues: power loss over distance (1/R²), harm to the environment, and interference with radio communications.)
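The 1/R² loss mentioned above can be illustrated with a toy free-space calculation (the 1 m² receiver and 1 km distance are made-up numbers; real systems would use directional antennas, which Tesla's scheme did not):

```python
import math

def received_fraction(area_m2: float, r_m: float) -> float:
    """Fraction of isotropically radiated power captured by a receiver
    of area area_m2 at distance r_m (free space, no directionality)."""
    return area_m2 / (4 * math.pi * r_m ** 2)

# A hypothetical 1 m^2 receiver 1 km from the transmitter captures
# only about 8e-8 of the radiated power -- almost all of it is wasted,
# quite apart from there being no way to bill the customers.
frac = received_fraction(1.0, 1000.0)
```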
Edison was quite a villain. He stole many of his "inventions", and orchestrated a PR campaign against Tesla touting the "evils" of AC power. AFAIK, the electric chair was either invented or inspired by him.
I know these things because I've read many books on various topics related to Tesla, and all of this knowledge predates the Internet.
Every so often, I see or hear a new narrative of history that does not align with reality. I used to wonder how this could happen, but one of my sons explained to me that in his college history courses (in multiple accredited universities), the professors would teach their version of history, using their notes as the course material. They circularly cite other like-minded revisionist material, and most of their students just accept what the professor says as fact. He has seen this again and again in both lower and upper division courses.
This is a disturbing trend, and aside from "woke culture" indoctrination, I don't know what's behind it, or why these professors are not held to basic academic standards.
https://geekhistory.com/content/george-westinghouse-used-tes...
Thank you for quashing the gross misinformation. I was going to post this, but searched and found your comment. `\m/`
(I learned of the "Current War" in the 70's, since the Edison Museum was in my "backyard" -- and was a common destination of local school field trips.)
https://www.energy.gov/articles/war-currents-ac-vs-dc-power
https://www.discovermagazine.com/the-cruel-animal-testing-be...
I only found Edison in the headline, I didn't find it anywhere in the body, nor did I find Tesla. Glancing through the article it almost seems like someone tried to make a catchy headline to get clicks.
You can have the best idea in the world, but if you can't manufacture it, you're SOL.
Mercury arc rectifiers were used long before his death.
can we stop vibe generating headlines?