this post was submitted on 01 Aug 2023
323 points (97.1% liked)
Technology
Superconductors have no electrical resistance, and resistance is basically electricity's friction.
Superconducting materials make the strongest electromagnets, they have big applications in quantum computing (which I don't understand well enough to explain), and they're used in the tokamak, a specific kind of fusion reactor. More generally, they're useful anywhere electrical resistance is a problem: when current meets resistance, we lose some of the energy as heat, so a superconducting wire loses none and never heats up from resistance.
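That "lost as heat" point is just the textbook Joule-heating formula, P = I²R. A tiny sketch, with all numbers made up purely for illustration:

```python
# Joule heating (P = I^2 * R) in a resistive wire versus an ideal
# superconductor. The current and resistance values are made up.

def joule_loss_watts(current_amps: float, resistance_ohms: float) -> float:
    """Power dissipated as heat in a conductor carrying a current."""
    return current_amps ** 2 * resistance_ohms

current = 100.0          # amps through the wire
copper_resistance = 0.5  # ohms for some length of copper cable
superconductor_resistance = 0.0  # zero by definition

print(joule_loss_watts(current, copper_resistance))          # 5000.0 W wasted as heat
print(joule_loss_watts(current, superconductor_resistance))  # 0.0 W: no heat, no loss
```

Because the loss goes with the *square* of the current, this matters most exactly where currents are huge: electromagnets, power lines, and so on.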
Superconductors traditionally have to be super freaking cold. A lot of these applications only work with liquid nitrogen, or even colder coolants, keeping the material superconducting. You can trade some of that temperature requirement away with pressure, but the point is that it's not easy to keep a material superconducting. That effort translates to costs, often prohibitive ones, because you have to actively keep these materials from picking up any heat.
If this research pans out, though, this kind of superconductor would just work at standard temperature and pressure. It could go into ordinary circuits and sit around without bleeding money on upkeep. They're very cool.
People are comparing them to transistors partly because, before transistors, we had vacuum tubes. A vacuum tube does the same job as a transistor, but it's effectively a lightbulb: it burns out, it produces heat, and it didn't miniaturize. Transistors felt like magic at the time because we could do so much more with them than with tubes, and a room-temperature superconductor could be the same kind of leap.
Thanks for the explanation. So this means we're another step closer to quantum computers, for example?
I'm trying to grasp this concept and how we could see it in our daily lives. Better batteries? I thought of that because they get hot when charging, but I'm not sure if that's because of resistance. Does going into standard circuits mean we'll have better SoCs? Better integrated circuits? Faster computers or phones?
I'm trying to think of a daily-life application, but maybe it won't have a direct impact on that area; maybe it's more about facilitating research that will eventually turn into daily-life stuff?
Better batteries, yeah, though that's down the line. Devices would still generate heat during actual use, just less of it.
It could also make for the most efficient commercial energy storage, though I expect the cost would be prohibitive. Transmitting electricity always incurs losses, but not through a superconductor, so these would have a lot of uses at power generation sites, both reducing heat and storing electricity losslessly (until it enters the traditional grid).
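To put a rough number on transmission loss: the fraction of power burned off in a line follows from I = P/V and P_loss = I²R. A back-of-the-envelope sketch, where every value is an illustrative assumption rather than real grid data:

```python
# Fraction of transmitted power lost as heat in a resistive line,
# versus a superconducting one. All numbers are illustrative.

def loss_fraction(power_watts: float, line_volts: float, resistance_ohms: float) -> float:
    current = power_watts / line_volts     # I = P / V
    lost = current ** 2 * resistance_ohms  # P_loss = I^2 * R
    return lost / power_watts

# Say 100 MW sent over a high-voltage line at 200 kV with 10 ohms of resistance.
print(loss_fraction(100e6, 200e3, 10.0))  # 0.025 -> 2.5% lost as heat
print(loss_fraction(100e6, 200e3, 0.0))   # 0.0 with a superconducting line
```

This is also why grids use such high voltages in the first place: raising V cuts the current, which cuts the I²R loss. A superconducting line sidesteps the whole tradeoff.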
It won't directly translate to faster tech or anything like that, but by helping quantum computing it could do so indirectly.
It's definitely more of a facilitating-research kind of thing. Right now you can't experiment with superconductors in a lab cost-efficiently, but this could let you.
Also, maglevs and MRIs directly use superconductors today, so there's a direct use: lower-cost MRIs and incredibly fast trains.
Heat is a huge barrier to increasing clock speeds, so a room temperature and pressure superconductor would actually fairly directly translate to major performance gains in computing.
While true, that would only apply to a superconducting CPU. I doubt this material can both superconduct and act as a transistor, and even if it can, I highly doubt you could pack transistors in at anywhere near the density of a standard CPU. So while we might replace a standard power supply with a superconducting one, and reduce heat that way, I don't see any direct computing boosts from this. We could superconduct everything around a CPU and use superconducting wires, but the heat from a CPU is generated in the silicon itself.
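The "heat is generated in the silicon" point follows from the standard CMOS dynamic-power relation, P ≈ a·C·V²·f: switching power comes from charging and discharging transistor gates, not from wire resistance. A minimal sketch with made-up chip values:

```python
# CMOS dynamic (switching) power: P = a * C * V^2 * f. This heat comes
# from the transistors in the silicon itself, not wiring resistance,
# which is why superconducting interconnects alone wouldn't cool a CPU.
# All chip values below are illustrative assumptions.

def dynamic_power_watts(activity: float, capacitance_farads: float,
                        volts: float, freq_hz: float) -> float:
    """Average switching power of a CMOS chip."""
    return activity * capacitance_farads * volts ** 2 * freq_hz

# Hypothetical chip: 20% activity, 20 nF effective switched capacitance, 1.2 V core.
base = dynamic_power_watts(0.2, 2e-8, 1.2, 4e9)         # at 4 GHz
overclocked = dynamic_power_watts(0.2, 2e-8, 1.2, 5e9)  # at 5 GHz
print(base, overclocked)  # switching heat grows linearly with clock frequency
```

So even with lossless wiring everywhere around the die, pushing the clock higher still produces proportionally more heat inside it.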
It'll be pretty nice to have 100% efficient PSUs, though. Definitely some gains there, just not the same revolutionary ones seen elsewhere.
This is where my mind went. I wondered if the reduction in heat would allow further overclocking, or higher default clocks, on both CPUs and GPUs. I don't know that much about the actual hardware and how it works, though.
Not really. First, standard equipment is limited by cost, not technology. There's nothing stopping a power user from cooling a desktop with liquid nitrogen; it's just expensive. Superconductor tech would be bleeding edge, so it wouldn't cost any less for a long time. Supercomputing, on the other hand, has long had access to more esoteric cooling systems and can already use them, including the extreme-cold superconductors that already exist.
The real issue is that the CPU makes the heat, and this tech isn't a transistor. We can't replace the silicon chips with superconducting ones, at least not in a form dense enough to be a CPU. There are lots of small improvements we can make around the CPU, but those aren't at the "wow, this will revolutionize technology" level. They're cool, but it's the other stuff that's going to get the focus.
Managing heat is a large part of circuit design, and superconductors could fundamentally change everything about it, meaning designs that are far smaller, much faster, and more capable in every way. As an example, 95%+ of a modern CPU or GPU assembly is cooling-related; the actual chips are tiny compared to the whole component.