As we all know, AC won the "War of the Currents". The main reason is that AC voltage is easy to step up and down with just a ring of iron and two coils, and high voltage lets us transmit power over long distances with less loss.

Now, the War of the Currents happened around the turn of the 20th century, and our technology has improved a lot since then. We have useful diodes and transistors now, we have microcontrollers and buck/boost converters. We can convert DC voltages efficiently today.

Additionally, photovoltaics naturally produces DC, whereas a traditional generator has an easier time producing AC. A photovoltaic plant would have to convert its power to AC, which, if I understand correctly, comes with a massive loss.

And then there's the issue of stabilizing the frequency. When you have one big producer (one big hydroelectric dam or coal power plant), stabilizing the frequency is trivial, because you only have to coordinate with yourself. When you have 100,000 small producers (assume everyone in a larger area has photovoltaics on their roof), stabilizing the frequency suddenly becomes more challenging, because everybody has to work in exactly the same rhythm.

I wonder: would it make sense to change our power grid from AC to DC today? I know it would obviously be a lot of work, since every consuming device would have to change what power it accepts from the grid. But in the long run, could it be worth it? Also, what about insular networks? Would it make sense there? Thanks for taking the time to read this. I'm also willing to go into the maths, if that's relevant to the discussion.

[–] SomeoneSomewhere@lemmy.nz 22 points 2 months ago (2 children)

PV inverters often have around 1-2% losses. This is not very significant. You also need to convert the voltage anyway because PV output voltage varies with light level.
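To illustrate that last point: the inverter has to track the panel's maximum power point as the light changes. Here's a minimal perturb-and-observe MPPT sketch in Python against a toy panel model (the I-V curve and all numbers are made up for illustration):

```python
import math

def pv_current(v, irradiance=1.0):
    """Toy PV I-V curve: near-constant current that collapses as the
    voltage approaches the open-circuit voltage (made-up numbers)."""
    i_sc = 8.0 * irradiance                                # short-circuit current
    v_oc = 40.0 + 2.0 * math.log(max(irradiance, 1e-6))    # V_oc shifts with light
    return max(0.0, i_sc * (1.0 - math.exp((v - v_oc) / 2.0)))

def perturb_and_observe(v=20.0, step=0.5, iterations=200, irradiance=1.0):
    """Classic P&O MPPT: nudge the operating voltage, keep going if
    power rose, reverse direction if it fell."""
    last_p = v * pv_current(v, irradiance)
    direction = 1.0
    for _ in range(iterations):
        v += direction * step
        p = v * pv_current(v, irradiance)
        if p < last_p:
            direction = -direction    # overshot the peak: turn around
        last_p = p
    return round(v, 1), round(last_p, 1)

print(perturb_and_observe(irradiance=1.0))  # MPP under full sun
print(perturb_and_observe(irradiance=0.4))  # MPP moves as the light drops
```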

Buck/boost converters work by converting the DC to (messy) AC and then back to DC. If you want an isolating converter (necessary for most applications for safety reasons), that converter needs to handle the full power. If it's non-isolating, the power the conversion stage has to handle is roughly proportional to the voltage step.
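As a rough illustration of that scaling (idealized, numbers invented):

```python
def stage_power(v_in, v_out, p_load, isolating):
    """Very rough sizing intuition: an isolating converter's switching
    stage processes the full load power, while an ideal non-isolating
    (partial-power) stage only processes the share corresponding to
    the voltage step. Real designs vary a lot; this is just the trend."""
    if isolating:
        return p_load
    return p_load * abs(v_in - v_out) / v_in

# Trimming a 420 V PV string down to a 400 V DC bus at 10 kW:
print(stage_power(420, 400, 10_000, isolating=True))   # 10000.0 W
print(stage_power(420, 400, 10_000, isolating=False))  # ~476 W
```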

Frequency provides a somewhat convenient way for all parties to know, on a sub-second basis, whether the grid is over- or under-supplied. Operating solely on voltage is more prone to oscillation and requires compensation for voltage drop, and the information is typically lost at buck/boost sites. A DC grid would likely require much more robust and faster real-time comms.
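For context, the way this works today is droop control: every machine measures the frequency locally and adjusts its output in proportion, with no communication beyond the frequency itself. A sketch with illustrative numbers:

```python
def droop_response(f_measured, f_nominal=50.0, p_rated=100e6, droop=0.04):
    """Frequency droop: a 4% droop setting means a 4% frequency
    deviation swings the unit across its full rated power. Each
    generator runs this rule locally; frequency is the shared signal."""
    deviation = (f_nominal - f_measured) / f_nominal  # per-unit deviation
    return (deviation / droop) * p_rated              # extra watts to inject

print(droop_response(49.90))  # grid undersupplied -> +5.0 MW
print(droop_response(50.05))  # grid oversupplied  -> -2.5 MW
```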

The AC grid relies on significant (>10x overcurrent) short-term (<5 s) overload capability. Inrush and motor starting require smaller/shorter overloads (though still significant). Faults are detected and cleared primarily through the excess current drawn.

Fuses/breakers in series will all see the same current from the same fault, but we want only the device closest to the fault to operate, to minimise disruption. That's achieved (it's called discrimination, coordination, or selectivity) by having each device take progressively more time to trip on a fault of a given size, and progressively higher fault current, so that the devices upstream still rapidly detect a fault.
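To see the coordination numerically, here's the IEC "standard inverse" overcurrent curve, t = TMS × 0.14 / ((I/I_pickup)^0.02 − 1), with two hypothetical devices in series seeing the same fault (settings invented):

```python
def trip_time(i_fault, i_pickup, tms):
    """IEC 60255 'standard inverse' curve: t = TMS * 0.14 / ((I/Is)^0.02 - 1)."""
    return tms * 0.14 / ((i_fault / i_pickup) ** 0.02 - 1.0)

# Both devices see the same 2 kA fault:
downstream = trip_time(2_000, i_pickup=100, tms=0.05)  # small final circuit
upstream   = trip_time(2_000, i_pickup=400, tms=0.20)  # feeder above it
print(f"downstream: {downstream:.2f} s, upstream: {upstream:.2f} s")
# downstream: ~0.11 s, upstream: ~0.86 s -> only the nearest device trips
```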

RCDs/GFCIs don't coordinate well because there isn't enough room between the smallest fault required to be detected and the maximum disconnection time to fit increasingly less sensitive devices.

Generators are perfectly able to provide this extra fault current through short-term temperature rise and inertia. Inverters cannot provide five-fold overcurrent without being significantly oversized. We even install synchronous condensers (a generator without any actual energy source) in areas far from actual generators to provide local inertia and fault current.
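Back-of-the-envelope comparison (typical textbook values, not a specific machine):

```python
def generator_fault_current(i_rated, x_subtransient=0.15):
    """A synchronous machine initially feeds a fault with roughly
    V / X''d; a typical X''d of 0.1-0.2 p.u. gives 5-10x rated current."""
    return i_rated / x_subtransient

def inverter_fault_current(i_rated, current_limit=1.2):
    """A grid-tie inverter is current-limited just above its rating
    unless deliberately oversized (limit value is illustrative)."""
    return i_rated * current_limit

print(generator_fault_current(1_000))  # ~6700 A from a 1000 A machine
print(inverter_fault_current(1_000))   # ~1200 A from a 1000 A inverter
```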

AC arcs inherently self-extinguish in most cases, because the current passes through zero twice every cycle; DC arcs do not, since the current never crosses zero.

This means that breakers and expulsion-type fuses have to be significantly, significantly larger and more expensive. It also means more protection is needed against arcs caused by poor connections, cable clashes, and insulation damage.

Solid state breakers alleviate this somewhat, but it's going to take 20+ years to improve cost, size, and power loss to acceptable levels.

I expect that any 'next generation' system is likely to demand a step increase in safety, not merely matching the existing performance. I suspect that's going to require a 100% coverage fibre comms network parallel to the power conductors, and in accessible areas possibly fully screened cable and isolated supply.

EVs and PV arrays get away with DC networks because they're willing to shut down the whole system in the event of a fault. You don't want a whole neighbourhood to go dark because your neighbour's cat gnawed on a laptop charger.

[–] BearOfaTime@lemm.ee 4 points 2 months ago (1 children)

Oh wow, thanks for the detailed writeup. It's a little above my pay grade (condensers used as localized generators? Wow, what an idea. They must be huge).

Guess it's time to find an Intro to Powergrids from The Teaching Company

[–] gandalf_der_12te@lemmy.blahaj.zone 1 points 2 months ago (1 children)

I'll give you a short introduction to the power grid (by the way, it's called "Stromnetz" (electricity network) in German). The power grid has many "levels", where each level is a network of cables that carry current at a specific voltage. For example, you might have a 220 kV level, then a 5 kV level, and a 230 V end-consumer level.

Between these levels there have to be conversions. Today these are transformers, which turn high-level AC into lower-level AC or the other way around; for AC networks, they are basically a ring of iron and a few coils. For DC networks, other converters exist, such as buck/boost converters.
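In idealized form, both conversions are simple ratios: the transformer via its turns ratio, the buck converter via its duty cycle. A sketch (numbers are just for illustration; a real multi-kV DC step-down wouldn't be a single buck stage):

```python
def transformer_out(v_primary, n_primary, n_secondary):
    """Ideal AC transformer: Vs = Vp * (Ns / Np)."""
    return v_primary * n_secondary / n_primary

def buck_out(v_in, duty_cycle):
    """Ideal buck converter in continuous conduction: Vout = Vin * D."""
    return v_in * duty_cycle

# Stepping a 5 kV level down to the 230 V consumer level either way:
print(transformer_out(5_000, n_primary=1000, n_secondary=46))  # 230.0 V
print(buck_out(5_000, duty_cycle=0.046))                       # 230.0 V
```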

My question basically is: can anyone give me experimental data on how well DC networks work in practice? Personal experience is enough; it doesn't have to be a super-detailed report.

[–] SomeoneSomewhere@lemmy.nz 2 points 2 months ago* (last edited 2 months ago)

I'm not sure there are any power grids past the tens-of-megawatt range that aren't just a 2/3/4 terminal HVDC link.

Railway DC supplies usually just have fat rectifiers and transformers from the AC mains to supply fault current/clearing and stability.

Ships, or perhaps aircraft, are where I would expect to see them arrive first.

Almost all land-based standalone DC networks (again, not few-terminal HVDC links) are heavily battery backed and run at battery voltage - that's not practical once you leave one property.

I'm sure there are some pretty detailed reports and simulations, though. A reduction in cost of multi-kV converters and DC circuit breakers is essential.

[–] gandalf_der_12te@lemmy.blahaj.zone 1 points 2 months ago (1 children)

Thank you for this well-thought-out and balanced viewpoint. It took me 19 days to process all the information.

So basically, I was wrong when I assumed that inverters had an efficiency of around 50%. That misunderstanding came from the phrase that "filters in the inverter eliminate high-frequency components in the PWM's output"; I thought they discarded that power, but that's apparently not the case. So the efficiency is more like >95%. That's good.

[–] SomeoneSomewhere@lemmy.nz 2 points 2 months ago

Even 95% is on the low side. Most residential-grade PV grid-tie inverters are listed at something like 97.5%, and higher-voltage versions tend to do better.

Yeah, filters essentially store power during one part of the cycle and release it during another. The net power lost is fairly minimal, though not zero. DC needs filtering too: all those switch-mode power supplies are very choppy.
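The reason the filter is nearly lossless is that it's built from reactive parts (inductors and capacitors), which store energy and give it back rather than burning it. A quick sketch with assumed component values:

```python
import math

def lc_cutoff_hz(inductance, capacitance):
    """Corner frequency of an ideal LC low-pass: f_c = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance * capacitance))

# Hypothetical inverter output filter: 2 mH series L, 10 uF shunt C
fc = lc_cutoff_hz(2e-3, 10e-6)
print(f"cutoff ~ {fc:.0f} Hz")  # ~1125 Hz: passes 50 Hz, strongly
                                # attenuates e.g. 20 kHz PWM switching noise
```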