As an example, way back when this was a very lucrative business, we placed the servers for a premium-number erotic call service in an industrial park on the border of two districts in Budapest, because that's where we could get two independent power feeds without running our own lines. Internet-wise, one link was a simple leased line; the other was a microwave connection to somewhere far away. Short of bombing the entire site it was nearly impossible for the installation to go offline, and for the six years I knew about it, it never did. Note the site served German callers; that's where the big bucks came from.
If you're going to come from a different circuit, see if you can at least pull it from a different phase, if we're talking a 120v circuit on 240v service (typical US home service). It's a small improvement, but I'd say 10%+ of the power outages I've seen have been just a single phase going out.
On a commercial 3 phase service, yes, connect redundant PSUs to separate phases, since each phase on the panel actually corresponds to each phase of the grid.
I believe that's a USA peculiarity. Where I live, the usual residential and commercial wiring is from 13.8 kV to 127V/220V through a three-phase delta-wye transformer, in which the primary connects between each pair of phases, and the secondary connects between one phase and the neutral (the high-voltage primary side does not have a neutral). When one phase of the high voltage side is lost (very common, since each high-voltage phase is a separate wire and has an independent fuse upstream of the transformer), what happens is that one phase of the low voltage side stays normal (the one between the two intact high voltage phases), and the other two have a lower voltage which varies depending on their relative load.
Yep. Sounds like you have true 3-phase service, whereas most places in the US just get split-phase.
All depends on how it failed.
I don't have a power outage more than once a year, but we manage to blow a breaker more than a few times a decade (vacuum + water boiler was one).
It wouldn't surprise me at all if some utilities have started installing smart meter upgrades or inline compressors or computer-controlled valves that don't have generators attached to them.
I hope it goes down a couple hours later?
Around here (Germany) your phone and internet also depend on a junction box with routers somewhere within half a mile or so of your home having power. But those boxes have four hours of battery backup, and on normal-sized outages the operator sends people out with diesel generators when the batteries start running low (prioritizing business customers). Having it go down from power loss is a decision made in triage, not something that just happens.
Asking because I'm in Canada. Power outages aren't uncommon here (by which I mean: expect at least one per year in 'shoulder season' that makes you get out the generator), because power lines (not talking transmission lines) are mostly above ground, except in large cities of course. But as soon as you get out of the "center" (which is larger or smaller depending on the city) it's good old wooden poles that carry power on top and cable/phone on the lower level.
A bucket transformer[1] blowing up on a pole near you is a favourite, but the lines themselves are usually fine, including your cable/DSL. The last time we also lost internet, it took about 24 hours after the power went out for the internet to go down as well. Cell phone service from the same company was still fine. The entire metro area was out of power for days, and I guess they prioritized topping up the diesel/LP for the cell sites.
[1] These guys https://en.wikipedia.org/wiki/Distribution_transformer
What does happen are smaller scale outages. Power lines are mostly buried along streets and under the sidewalk, just like telephone lines. That doesn't stop the occasional excavator digging too deep and taking a street off the grid. At an individual level it's extremely uncommon, maybe once a decade. But deploy thousands of boxes with networking equipment all around the country and it happens to your equipment all the time.
The 2006 one I had read about before. I love reading timelines of such disasters. Shows how hard this actually is and how much work it is to keep it all running.
Here's another one: https://en.wikipedia.org/wiki/Northeast_blackout_of_2003#Tim... And speaking of Canada and power lines (this time it does include transmission lines): https://en.wikipedia.org/wiki/January_1998_North_American_ic... While not as severe, this is basically the kind of thing I was referring to as happening in "shoulder season". There's usually at least one ice storm or very wet snow event at the start and/or end of winter now, and in our wooded area it's very likely we get trees into power lines and boom, the buckets go. When we're lucky it's localized and crews are available to come out and fix it in a few hours. If it's all over the place then it's gonna take a while, and they'll have crews from other provinces and the US come in to help as well.
I'd be interested in your outlook on the future of the grid in Germany and Europe, though. Of course, when France takes a nuke offline that's usually planned; even when it's done for a "river water temperature emergency" it takes a while, and you can bring that coal plant online like you mention. But doesn't Germany want to reach the climate goals it set itself? How does coal make sense there? And how is shutting down their own nukes a thing when it's OK to use French nuclear power?
Or some natural gas, which is quicker. If you have the gas. Re: Russia.
Across all of the European (NATO) countries together, is there enough generation capacity if you assume zero Russian inputs (save for, say, untraceable third-party transit or resources) and half of France's nukes going offline? Especially when the sun doesn't shine because of bad weather, and the winds are so high that you have to shut down your wind turbines?
That's where a large, sparsely populated country is at a real disadvantage. "Trees into power lines" isn't really an event in Germany, since high-voltage lines are 150 feet up in the air and kept clear of trees (they'll just cut a line through a forest for them). And everything smaller is generally buried. But that would be very hard to do in Canada. And of course we don't have to fight ice on our transmission lines.
> How does coal make sense there?
Lobbying. And saving the jobs of hard-working coal miners is more romantic and appealing in election campaigns than saving the jobs of wind turbine manufacturers.
It used to make economic sense, in that coal plants were cheaper to run, but that has changed in recent years, so what you see now is mostly inertia.
> And how is shutting down their own nukes a thing when it's OK to use French nuke power?
Oh, there are lots of protests against French nuclear plants too, especially those in the border regions. We just can't do much about them. But the people who don't like nuclear plants aren't the ones running the energy markets.
On the future: Before 2022 the idea was to transition all coal capacity to gas. This was mostly happening on its own anyways due to gas outcompeting coal on price, and new pipelines like Nordstream were going to accelerate that economic pressure to transition. The Ukraine war was a big setback for that.
In the end I believe we are still moving to a future where a lot of power comes from solar and offshore wind, with natural gas peaker plants to cover windless periods until grid-scale battery technology moves along a bit (molten salt, hot sand, pumped hydro in abandoned mines, etc.), in addition to the obvious hydro and pumped hydro from the Scandinavian countries.
We are far enough into economies of scale that the generation side is mostly going to sort itself out on economics alone. Solar is becoming dirt cheap, offshore wind is becoming profitable, natural gas is cleaner and cheaper than coal. The bigger issue is transmission lines. Building transmission lines takes decades because every NIMBY fights them. But the existing lines were built around a somewhat even spread of supply and demand, versus the new situation where we want offshore farms in the North Sea to supply lots of electricity to the South when there's good wind, and the solar panels in the South to help power the North. And politicians from certain parties love to side with "their" NIMBYs for easy political points.
Something is only really a backup if the actual fuel is on site, at this point
But if you need enough backup capacity to survive something like a multi-state, multi-day blackout [1], that probably gets expensive.
You wouldn't need that for a premium erotic call processor, but a 911 call exchange might, for the portion of their workload they can't pass off to another exchange.
[1] https://en.wikipedia.org/wiki/Northeast_blackout_of_2003
If you can't get gasoline within a day's drive, then there are bigger problems in the world.
For example, some generators can run on gasoline, propane, or natural gas.
my dude i cannot tell you how much i love these stories of non-faang, real world engineering for extreme reliability.
thank you for posting that.
Back then, running leased lines would've been way too expensive. So what did the kids running the show do? They got wind of a central office in an older part of Budapest having excess capacity, so they rented an apartment in the next building and drilled through the wall :D No expensive trenching, no expensive equipment to demultiplex landlines, nothing, just a bunch of wires running straight from the CO equipment into retail modems... We had no idea what we were doing, mind you. I was already doing Linux at the time, so among the few installers I was the lucky guy who got to install the Internet at a small business that wanted it done on a Solaris workstation. That was a fun challenge... Other installs were Trumpet Winsock. The ISP itself ran a custom Linux app; you dialed in and landed on a text app, or maybe it was Lynx? Can't quite remember, it's been 30 years...
In that sense, it is not incorrect about the configuration of utility power because it doesn't say anything about that subject.
The two different power sources are different shapes.
To someone who’s set up things in datacentres, it seems pretty reasonable that those could be 2 different circuits.
Of course it should be 2 separate plugs with the 2 different shapes in them.
As for batteries you can just get a UPS system that supports adding extra batteries to it.
I don't think this is an issue with double-conversion UPSs, since their input power is fixed by their rectifier size and they'll simply charge more slowly instead, but with standby type, it's very much a concern.
The monitors both had the same general issue - they would fail to find a signal every few weeks, and I'd find that waiting a few hours with them turned off would help.
I have pages of notes here. Another way to get them to work would be booting up the windows computer, which would seem to 'trick' the monitor into getting a signal on hdmi, and then I could switch to display port for the mac laptops to be used.
Anyways it's all crazy rambling notes, with copious timestamps, looking for patterns. I have an IR heat thermometer from the kitchen and my monitor vents would regularly have air over 160F coming out the tops when the monitors would not even post the vendor logo after a hard power reset.
I removed the battery backup and it's been months now with zero blips. So the only obvious takeaway I have is that overrunning a battery is a completely worthless endeavor.
Well over 100A for a few seconds
Lithium can’t even do that in a single cell (but in turn is infinitely better for continuous current)
It sort of baffles me intuitively that an extremely simple lead battery can for a short while compete with the grid
The Cold Cranking Amps (pretty much the peak startup current) that they can provide is rather insane. Even my small motorcycle battery provides over 250 CCA.
Supercaps are how "fast" charging works in most smartphones as well, since they can soak up current faster than the battery itself and also mitigate cycling the battery too hard.
I ran a lot of gear through a decent separate PDU, and the rack of datacenter gear refused to die. No blown transistors or the other failures that can be the norm afterwards.
My comment was definitely based on this: providing the cleanest electricity possible doesn't hurt, and can only help.
Even for the cheaper UPSes I wonder if the issue isn't the cycle rate, but the cycle shape? My understanding is that they tend to be able to hit 60Hz pretty easily, but the cheaper ones are a ~square wave instead of a ~sine wave. Maybe the digital clock just glitches on that more.
Double conversion takes power from AC, converts it to DC to charge batteries, takes battery output, and inverts it back to AC. All power drawn from the UPS goes through the battery and inverter stack, and there is no transient/power loss when AC mains are lost. They tend to be more expensive, louder, run hotter, etc.
Line interactive UPSs, on the other hand, tend to be cheaper and are in most cheap consumer products. They take AC mains, convert it to DC, and charge batteries. But AC mains is also connected directly to the output device through switch circuitry that will quickly switch the power source from AC mains to batteries/inverter if power loss is detected.
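As a rough illustration of the switching logic described above, here's a toy sketch of a line-interactive UPS's transfer decision. All voltage thresholds are invented for illustration, not taken from any real product:

```python
# Toy model of a line-interactive UPS's transfer decision. The thresholds
# below are made-up illustrative numbers, not from any real product.
LOW_CUTOFF, HIGH_CUTOFF = 96, 144    # beyond these, AVR can't help: go to battery
BOOST_BELOW, TRIM_ABOVE = 108, 132   # autotransformer (AVR) correction range

def transfer_decision(mains_v: float) -> str:
    if mains_v < LOW_CUTOFF or mains_v > HIGH_CUTOFF:
        return "battery"       # out of range: switch the load to the inverter
    if mains_v < BOOST_BELOW:
        return "avr-boost"     # autotransformer steps the voltage up
    if mains_v > TRIM_ABOVE:
        return "avr-trim"      # autotransformer steps the voltage down
    return "passthrough"       # mains feeds the load directly

print(transfer_decision(118), transfer_decision(100), transfer_decision(60))
# passthrough avr-boost battery
```

The point is that the load only sees the inverter when mains drifts outside the window; the rest of the time it's essentially on raw (or lightly corrected) mains.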
Reputable UPSs will use pure sine wave inverters for converting DC battery back to AC. Modified sine waves are indeed a lot cheaper but are not suitable for some sensitive equipment.
Surge protectors are usually made of components that cause a short when a surge happens, protecting the equipment downstream. It usually pairs with some kind of overcurrent protection (breaker, fuse, sometimes GFCI) to protect against the short the surge protector itself caused.
Having chained surge protectors is actually quite common. You may have a surge protector in your breaker panel, then in your power strip, then in the power supply of the device you have plugged in. Most good quality ATX power supplies have built-in surge protection, for instance. They also all tend to have overcurrent protection too. The breaker panel has breakers (duh), the power strip may have a simple breaker too, and the device may have a fuse. In the UK, the plug itself may have a fuse, plus the breaker from the utility company.
The risk of chaining surge protectors is that it increases the chance of false triggers if one of them is defective. But it may also provide better protection. All in all, I wouldn't worry too much about it. Just don't overload the power bar or whatever it is plugged into.
This has been my experience. There was a corridor between 2 FL counties known for heavy lightning strikes. I serviced small sites there that had their IT equipment fried (exploded, melted) once or twice a year.
Putting it behind 4-6 decent, consumer-grade surge protectors turned out to be really effective. I was a bit surprised given how lightning can jump over protection during a strike.
To illustrate the area: an XO's home was hit. Char marks lined the walls wherever wiring ran. Pipes burst all over. Nothing plugged in or wired survived. The front door was blown into the street.
His grade school kids were home at the time; they were physically fine.
I really wish someone would come up with surge suppressors that have a string of field-replaceable suppressor modules. Periodic maintenance: replace the modules.
Only buy surge suppressors from known brands; there have been teardowns where the surge components were missing or fake.
Since surge components are passive, chaining them is not a problem.
Although I do think I might have mixed some things up between regular power strips and those outdoor/industrial ones with a long (tens or hundreds of meters) rollable cable, which my dad used a lot back when he worked in construction. When I was little he told me never to plug power tools into a rolled-up “power wheel”, and when I later heard you shouldn’t daisy chain power strips I must have made that (wrong) connection.
Either way, if your house has had a surge and equipment that wasn't surge protected has died, it's probably a good time to replace all the surge protectors in the house; they're not really meant to survive multiple large surges. They shunt the power destructively, just somewhere you don't care about.
Is this a concern when buying used rackmount power conditioners (like used for live music setups?), to protect home IT gear? Can they be worn out without a sign that they are?
You probably won't have issues charging an iPhone from this thing or powering something for a few seconds, so no need to go crazy about it, just something to keep in mind.
It’s not safe, and it’s expressly forbidden by the NEC, see 11.1.5 below:
> 11.1.5 Extension Cords
> 11.1.5.1
> Extension cords shall be plugged directly into an approved receptacle, power tap, or multiplug adapter and shall, except for approved multiplug extension cords, serve only one portable appliance.
Daisy chaining extension cords is unsafe and not recommended. Only use extension cords that you’ve inspected and that are properly rated for the environment (don’t use indoor cords outside, and don’t use a cord outdoors unless it’s GFCI protected) and for the power usage of the device you are powering.
Any time electricity has to flow through a splice or mechanical connection, the possibility of a loose connection causing an arc and subsequent fire exists.
It’s unlikely to happen to you specifically, but it does happen and avoiding electrical fires is a good thing if it can be avoided.
Daisy chaining power strips is also forbidden by the NEC:
> 11.1.4.2
> The relocatable power taps shall be directly connected to a permanently installed receptacle.
For anecdotal experience, I've had both extension cords and wall plugs fail (nothing serious thankfully, but they did get a bit melted), but in those cases it had nothing to do with my extension cord chains, but rather an internal connection failure.
As a general rule, I wouldn’t run tools past 50 feet on anything smaller than 12 AWG (and really, 14 AWG is the smallest I’d go for any length; anything smaller isn’t safe for most loads).
> 11.1.4.2
> The relocatable power taps shall be directly connected to a permanently installed receptacle.
A surge protector is a ‘relocatable power tap’ and must be plugged into a permanent receptacle.
So it's fine as long as you control the strip and keep track of loads (e.g. you know your spouse will never plug a vacuum into that handy receptacle you have there), but at work your EHS team will mark you down for it.
Daisy chaining a power bar with its own circuit breaker can be ideal if it prevents someone from making the mistake of using a circuit in a way that trips a panel breaker, i.e., preventing your spouse from plugging a vacuum into a circuit shared by several rooms.
If I plug in a heater pulling 10A then sure, the 5A fuse will blow.
Daisy chaining multiways will increase the resistance in the earth wire, which could mean you end up with a Class 1 device with a fault connecting live to earth that only puts, say, 8 A to earth due to the high resistance (but then your circuit's RCD would trip on that). Is that a major problem, though?
With the US system, do you not have cords rated for only 3 A (say 24 AWG) that you can connect to a normal socket which also takes a 10 A vacuum?
If that lamp has a fault where it pulls, say, 10 A, what protects the 3 A wire? The lamp is plugged into a circuit with a 15 A breaker, so drawing 10 A wouldn't trip the breaker, and that nice thin 3 A lamp cord would melt.
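That reasoning can be sketched with a few hypothetical numbers (the ampere values below are illustrative, not code-book figures):

```python
# Hypothetical fault in a lamp on a thin cord, behind a US-style branch breaker.
breaker_amps = 15        # branch breaker protects the building wiring
cord_rating_amps = 3     # thin lamp cord
fuse_in_plug_amps = 3    # the UK approach: a fuse sized for the cord itself
fault_amps = 10          # faulted lamp drawing well under the breaker rating

breaker_trips = fault_amps > breaker_amps          # False: breaker never sees a problem
plug_fuse_blows = fault_amps > fuse_in_plug_amps   # True: a plug fuse would protect the cord
cord_overloaded = fault_amps > cord_rating_amps    # True: without one, the cord cooks

print(breaker_trips, plug_fuse_blows, cord_overloaded)  # False True True
```

The branch breaker only protects the fixed wiring; anything thinner hanging off the receptacle needs its own protection, which is exactly what the UK's fused plug provides.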
The UK only needed to introduce that because of its ring main architecture, which is fused at levels above what a single plug can handle.
Also cf. extension reels and pre-battery vacuums: both needed their cables fully unspooled to reach their full load rating, yet they typically lack any technical enforcement mechanism and rely on users being literate and willing enough to RTFM.
Something like 1.5 mm² wire (only about 1.4 mm in diameter) is able to handle 12 A if the insulation survives heating to 60 degrees, and 18 A if 70 degrees is acceptable. The whole circuit would have a 16 A fuse at the fusebox, so you're not going to get to 70 degrees.
Far from ideal, but also very, very unlikely, because a short would draw over 16 A and blow the fuse. So we're talking about a situation that's far from a normal load (any device that comes close to such a load would need a different cable to be certified), while still remaining just under the maximum load of the fuse covering the circuit.
Homes aren't burning down all over the rest of Europe all the time, while fuses in plugs aren't a thing here.
If you have said combination of electrical devices, and you're assuming we're using an undersized UPS A plus that combo of devices, why does UPS B matter?
If you're going to overload UPS A, you're going to overload UPS A regardless of UPS B, no? Daisy chaining or not, it doesn't seem to be the actual problem, despite the knee-jerk reaction.
The danger is overloading. Back in the days when the main things you plugged in were incandescent lights and space heaters, this was probably a big issue. With computer equipment and LED lights you have to have a lot more stuff - many outlets' worth - to reach the circuit's maximum capacity.
If the circuit and "surge protectors" are rated for 1800 W (15 amps × 120 V), officially you should limit yourself to 80% of that for continuous loads, which is 1440 W. So you can supply 14 laptops or small desktops that use 100 W each, or over 200 Raspberry Pis on USB chargers that use 5 W each; either way you're going to need a lot of outlets before you come anywhere close to that limit.
At least that's a rough estimate. Power factor could decrease that number by up to 50% and you can use the full rating for intermittent loads; I'm not certified to know the fine print. Point is that 10 computers can easily use less power than a single space heater.
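A quick sanity check of that arithmetic, using the same numbers from the comment above:

```python
# 15 A branch circuit at 120 V, with the 80% continuous-load rule applied.
circuit_va = 15 * 120                 # 1800 VA
continuous_limit = 0.8 * circuit_va   # 1440 VA

laptops = int(continuous_limit // 100)         # ~100 W laptops
pis = int(continuous_limit // 5)               # ~5 W Raspberry Pis
space_heaters = int(continuous_limit // 1500)  # a typical 1500 W heater

print(laptops, pis, space_heaters)  # 14 288 0
```

Note the last figure: a single 1500 W space heater already exceeds the 80% continuous limit on its own, which is the point about old-style loads versus modern electronics.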
Not necessarily. There are “power strips” which turn one receptacle into several. Then there are surge protectors, which are typically built into power strips. So not all power strips are surge protectors, but almost all surge protectors are also power strips.
There are panel-mounted surge suppressors (which can protect all circuits coming from the panel), and also inline surge suppressors with a single output like this one: https://www.lojaclamper.com.br/dps-iclamper-pocket-2pinos-10...
Technically different, but often combined functions. The splitting bit is a "power strip", or sometimes a "power bar". The surge protection is switching off when there's a short or overvoltage in the supply, or other larger than expected power draw.
surge protector = a device with electrical circuitry to help protect against surges and spikes.
Those little switcher bricks are horribly inefficient: the 15 W one I just pulled out of a drawer draws 0.8 A on the primary. Realistically you're going to max out a 15 A circuit at around 20-30 of those, not 96 (1440/15).
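Putting numbers on that: the two bounds below bracket the realistic 20-30 estimate. The 0.8 A nameplate input and the 80% continuous-load rule come from the comments above; everything else is arithmetic:

```python
# Bounds on how many 15 W wall bricks fit on a 15 A / 120 V circuit.
usable_amps = 15 * 0.8                 # 12 A after the 80% continuous rule
nameplate_input_amps = 0.8             # from the brick's label (worst case)
brick_output_watts = 15

by_nameplate = round(usable_amps / nameplate_input_amps)         # pessimistic bound
by_output_watts = round(usable_amps * 120 / brick_output_watts)  # optimistic bound

print(by_nameplate, by_output_watts)  # 15 96; real draw lands between these
```

Nameplate input current is usually a conservative maximum (covering poor power factor and inrush), so actual steady-state draw sits well below it, hence the 20-30 figure.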
Why? Two reasons: You have to ensure the wire gauge on every link can handle the current, and at every junction (plug) the resistance is higher than in the wire itself. When electrical fires start they usually start at these plug junctions because they overheat.
The surge protectors themselves don't mind being daisy chained.
I believe some UPS brands might also void parts of your warranty if you use them with a surge protector plugged in.
It would be great if Eaton went to the trouble of explaining exactly what conditions two UPSes need to fulfill to be successfully daisy chained; that would probably put people off doing it anyway.
But it would also make for a much more informative article, positively framed, which is always a better read.
I mean, look for any mention of "slow-roast it for 12 hours" in your favourite web search engine (I notice a couple of those say "overnight", yikes).
The DC will have its own UPS system, but in the event it all goes wrong (which happens more often than I'd like) you probably want something in your cab that can give your equipment notification and time to shut down safely, and provide surge suppression and maybe some level of isolation from the inevitable back-emf caused by the rest of the hall going dark.
Explaining the conditions that need to be met for one UPS not to cause issues for another daisy chained off of it would help the reader understand why it's a bad idea.
Then there's the issue of output waveform. Cheaper UPS models put out a hideous square wave or modified square wave. This is adequate to feed a PC's PSU, which turns it right back into DC, but most UPS line-status detectors will not be happy being fed such a bad waveform and will consider the grid faulted. Thus, as soon as you have an outage, both UPSes will go into battery mode, and you will not be able to use the "upstream" UPS's capacity. I suspect that even nicer UPSes may have issues accepting the power generated by another UPS, as its voltage and frequency regulation may not be good enough to satisfy the grid stability tests.
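To make the waveform point concrete: a textbook property (not from this thread) is that an ideal square wave carries roughly 48% total harmonic distortion, far outside what any mains-quality check would accept as grid-like:

```python
import math

# Fourier series of an ideal square wave: only odd harmonics are present,
# and the n-th harmonic's amplitude is (4/pi)/n relative to the wave amplitude.
fundamental = 4 / math.pi
harmonic_amps = [(4 / math.pi) / n for n in range(3, 200, 2)]

# THD = RMS of all harmonics above the fundamental, divided by the fundamental.
thd = math.sqrt(sum(a * a for a in harmonic_amps)) / fundamental
print(f"THD of an ideal square wave: {thd:.1%}")  # roughly 48%
```

Utility power is typically specified at single-digit THD, so a line-status detector tuned for mains quality will flag a square or "modified sine" input as a fault and drop to battery, exactly the behavior described above.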
But the larger part of my reasoning for why you shouldn't daisy chain UPSes is that it's attacking the problem of reliability from the wrong angle. By daisy chaining UPSes, you have added an additional single-point-of-failure to your system. If instead you used a redundant PSU, you would have decreased the number of single-points-of-failure.
In my personal experience, I have seen far more power supplies die than I've seen UPSes, so the first thing I would do if I wanted a more reliable power setup is go for the redundant power supply. Only then would I consider attaching multiple UPSes (but I probably wouldn't under normal circumstances).
It really depends on the model. The UPS I have uses barely any watts to recharge, over a period of multiple hours.
I mean, it's possible that I don't know enough about UPSes to understand the finer details and all this would be obvious to someone who does. But presumably they already know why it's a bad idea, and don't need this particular article to explain them.
The amount of scare words and the lack of detail honestly make me believe it probably IS okay to do, and Eaton just wants to scam some extra money out of people. It's just the posts in this thread that make me take it slightly more seriously.
Seems quite improbable that that would be the purpose of the article. The article is basically saying "buy a single UPS instead of buying two to daisy chain expecting better results". If anything, such advice would lead to people buying fewer devices.
Of course, if one had two UPSes lying around doing nothing, then I guess daisy chaining might come to mind as a way to get better capacity for no extra cost. However, people don't generally have UPSes lying around for no reason, so it would make sense to buy one higher-capacity UPS rather than two lower-capacity UPSes to daisy chain (and probably at quite a similar total cost of ownership).
Yeah, as someone who ran their own micro-ISP for a while and worked for larger ISPs: no, it's not OK to do at all. The APC brand was the absolute worst about immediately tripping when plugged into another UPS. Eaton online UPSes actually handled it OK in comparison, but they are typically pretty expensive units.
This is definitely a case of the vendor attempting to save you money. Get an ATS on anything that doesn't accept dual power.
At least from my experience of owning multiple APC UPS devices, they have a customizable acceptable power quality setting. In such cases setting them to accept the absolute worst quality of power could probably stop them from tripping on bad power input.
No idea if this affects the end devices; however, there's probably a reason other than extra profit for the default power quality tolerance setting on those devices. Generally they are set to a rather tight tolerance threshold, as the expected usage scenario is servers and other relatively sensitive equipment.
The models I own aren't really the most expensive either, some of the lower end tower models and they still have configurable acceptable power input settings available. Regarding them tripping on bad input and being "absolute worst", I consider this tripping a feature more than an anti-feature, especially as it is user configurable.
EDIT: Also wanted to add that it is actually _preferable_ for UPS to trip as immediately as possible on bad power input. That is the only purpose of such product after all: to protect the devices attached to it from bad or otherwise inadequate power input.
I do understand the logic that chaining UPSes will cause interesting things to happen when power is restored: the downstream UPS can then overload the upstream UPS, because it incurs additional load while charging its battery, which could trigger overload protection in the upstream device.
Also, not particularly caring about efficiency currently as we always have more energy than we can use, and I’m currently looking at filling a shipping container with sand as a dump.
Oh, that makes a lot of sense. In that case, daisy chaining two sounds like no big deal.
>filling a shipping container with sand as a dump
That's pretty cool. I didn't know private folks were doing that.
Overall, the discussion tangentially reminds me of a common theme in aviation - twin vs single engine aircraft. With two engines, the chance of having an engine failure at a critical time is doubled.
There is a saying about light piston twins: the other engine will take you all the way to the scene of the crash. If you are willing to fly your light piston twin half full, you have great safety margins, but then you are paying twice as much to move less payload than a comparable single-engine aircraft.
I should say that some piston twins are much better than others in this regard, but for many it seems the extra engine mainly hauls the weight of the extra fuel and airframe you had to add to carry it. Asymmetrical thrust adds a lot of drag.
Moving to turbocharged engines or turbines tends to improve things a lot, but even some big twins can’t climb on one engine (the C-130 in many configurations, for example).
I’m pretty sure modern designs are much better as a rule. Composite construction and power to weight ratios of modern power plants have pushed the numbers safely to the left of the tipping point.
This is a lot harder to achieve using 1930s-technology engines and construction, as found in many light aircraft built up into the 1980s. There’s only so much you can get out of an air-cooled, low-RPM, carbureted gasoline engine, and complex shapes in metal construction cost a lot in labor and weight.
Many cheap inverters are not pure sine, and a UPS seeing this waveform may decide it needs to go to battery also.
Practically, a UPS also adds to the current draw, and many people may accidentally exceed the circuit limit because they only count the useful load, not the UPS's charging load after a power failure ends.
When daisy-chaining one UPS to another, the downstream UPS can easily overload and trip the upstream UPS because of that.
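A quick back-of-the-envelope sketch of that failure mode, with made-up but plausible numbers (none of them from the thread):

```python
# Downstream UPS carries a 700 W load; after the outage ends, its charger
# pulls an extra ~200 W. Upstream is a 1000 VA unit good for about 900 W.
# All figures here are assumed for illustration only.
load_w = 700
charger_w = 200              # recharge draw after the power failure ends
downstream_efficiency = 0.9  # assumed conversion losses in the downstream UPS

upstream_draw_w = (load_w + charger_w) / downstream_efficiency
upstream_capacity_w = 900

print(round(upstream_draw_w), upstream_draw_w > upstream_capacity_w)
# A load that fit fine during the outage now trips the upstream unit's
# overload protection once the downstream battery starts recharging.
```

The charging load and conversion losses are invisible when you size the chain by the useful load alone, which is exactly the trap described above.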
It has not happened yet. If the downstream UPS likes the source, it is not going to switch to battery, and battery charging will not be happening because the downstream UPS finished charging long ago.
------------
I bought three UPSs, all of them used independently, but I had to stop using all of them. They all had a capacitor whine that was driving me crazy.
Your run-of-the-mill UPS is likely to be either offline (standby), which forwards the input to the output, or line-interactive, which can compensate to an extent for under- or over-voltage conditions with a regulator. If the input power characteristics are outside allowable tolerances, they can't compensate and must switch the load to battery to continue powering it.
This one surely should "like" a simulated sine wave as a source and not drop to battery?
But why would it require a grid-like sine wave rather than going along with whatever the source provides, as long as the source can still charge its batteries? I've seen no answers yet, and this "why" is the very key to the discussion.
For example, a common topology in offline UPS is that the inverter and the charger are the same circuit driven differently[0] - so you can't charge the battery while the inverter is carrying load. This is popular at the low-end because you have literally half as much UPS, but makes what you're describing impossible.
Another common issue at the low end is that the inverter isn't thermally sized to run non-stop, they know they can cut a corner because your battery presents a finite and known duty cycle.
There are ways around this, but at some point you end up fixing the wrong problem: e.g., it's cheaper, safer, and more resilient to buy a transfer switch than to uprate two UPSes to be capable of daisy-chaining.
This is the key to this whole discussion. I guess it boils down to existing line-interactive designs and why they can't work with a simulated sine wave as a source.
A dual-conversion (online) UPS is almost certainly more robust as far as what kinds of inputs it can accept (though as GP noted, they're more expensive, and also less efficient due to the additional rectification and inversion stage).
Edit: derp. Wouldn’t matter. I was thinking putting them in parallel.