this post was submitted on 31 Oct 2024
173 points (96.3% liked)


As a reminder, current estimates are that quantum cracking of a single 2048-bit RSA key would require a computer with 20 million qubits running in superposition for about eight hours. For context, quantum computers maxed out at 433 qubits in 2022 and 1,000 qubits last year. (A qubit is a basic unit of quantum computing, analogous to the binary bit in classical computing. Comparisons between qubits in true quantum systems and quantum annealers aren't uniform.) So even when quantum computing matures sufficiently to break vulnerable algorithms, it could take decades or longer before the majority of keys are cracked.
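For scale, here's a toy extrapolation from those numbers. It assumes qubit counts keep doubling yearly, which the 433-to-1,000 jump roughly matches but which nothing guarantees; treat it as illustration, not prediction:

```python
# Toy extrapolation: ~1,000 qubits in 2023 vs. ~20 million needed to
# crack one RSA-2048 key. Yearly doubling is an assumption, not a law.
qubits, year = 1_000, 2023
while qubits < 20_000_000:
    qubits *= 2
    year += 1
print(year, qubits)  # 2038, ~32.8M under naive yearly doubling
```

Slower growth, or the engineering obstacles discussed below, push that date out considerably.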

The upshot of this latest episode is that while quantum computing will almost undoubtedly topple many of today's most widely used forms of encryption, that calamitous event won't happen anytime soon. It's important that industries and researchers move swiftly to devise quantum-resistant algorithms and implement them widely. At the same time, people should take steps not to get steamrolled by the PQC hype train.

[–] WolfLink@sh.itjust.works 4 points 3 weeks ago (1 children)

Error correction does fix that problem, but at the cost of increasing the number of qubits needed by a factor of roughly 10 to 100.

[–] humblebun@sh.itjust.works -2 points 3 weeks ago (1 children)

But who guarantees that EC will overcome the decoherence introduced by that many qubits? That's not a trivial question, and nobody can answer it for certain.

[–] WolfLink@sh.itjust.works 3 points 3 weeks ago (1 children)

I mean the known theory of quantum error correction already guarantees that as long as your physical qubits are of sufficient quality, you can overcome decoherence by trading quantity for quality.

It’s true that we’re not yet at the point where we can mass-produce qubits of sufficient quality, but claiming that EC is not known to work is a weird way to phrase it at best.

[–] humblebun@sh.itjust.works -2 points 3 weeks ago (1 children)

It was shown this year for what, 47 qubits? How can you be certain this will hold for millions and billions?

[–] WolfLink@sh.itjust.works 2 points 3 weeks ago (1 children)

Because the math checks out.

For a high level description, QEC works a bit like this:

10 qubits with a 1% error rate become 1 EC qubit with a 0.01% error rate.

You can scale this in two ways. First, you can simply have more and more EC qubits working together. Second, you can nest the error-correcting codes.

10 EC qubits with a 0.01% error rate become one double-EC qubit with a 0.0001% error rate.

You can repeat this indefinitely. The math works out.
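In code, that toy model looks something like this (the 10x-qubits / 100x-suppression numbers are the illustrative ones above, not any particular code's):

```python
# Toy model of concatenated error correction: each level of encoding
# costs 10x the qubits and buys ~100x error suppression.
def concatenate(physical_error: float, levels: int):
    """Return (logical error rate, physical qubits per logical qubit)."""
    error, qubits = physical_error, 1
    for _ in range(levels):
        error /= 100   # one more layer of EC: ~100x suppression
        qubits *= 10   # ...paid for with 10x more physical qubits
    return error, qubits

for levels in range(4):
    error, qubits = concatenate(0.01, levels)  # start from a 1% error rate
    print(f"{levels} level(s): {qubits:>5} physical qubits -> logical error {error:.0e}")
```

Each added level multiplies the qubit cost by 10 and divides the error rate by 100, which is why the overhead stays polynomial while the error rate falls exponentially.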

The remaining difficulty is mass-producing qubits with a sufficiently low error rate to get the EC party started.

Meanwhile, research continues on finding more efficient error-correcting codes.

[–] humblebun@sh.itjust.works 2 points 3 weeks ago (1 children)

You describe the way error correction works, but there are other factors you fail to account for.

It is widely known that a physical qubit's T2 time decreases when you place it among others. The ultimate question here is: when you add qubits, can you overcome this added decoherence with EC or not?

Say you want to build a QC with 1,000 logical qubits, and you want to be sure that the error rate doesn't exceed 0.01% after 1 second. You assemble it, and it turns out you have 0.1%. You choose some simple code, say [[7,1]], and now you have to assemble a 7,000-qubit chip to implement the logic of 1,000 qubits. You assemble it again, and now the error rate is higher (due to decoherence and crosstalk). But the question is: how much higher? If the increase is smaller than your EC gain, then you just add a few more qubits, use a [[15,2]] code, and you're good to go. But what if it isn't?
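To make the dilemma concrete, here is a sketch of that race, with every number hypothetical:

```python
# Hypothetical race between EC gain and crosstalk: each EC level
# suppresses errors ~100x but needs a 10x larger chip, and the larger
# chip inflates the physical error rate by some crosstalk factor.
TARGET = 1e-4       # want logical error <= 0.01%
BASE_ERROR = 1e-3   # measured 0.1% on the bare 1,000-qubit machine
CROSSTALK = 3.0     # hypothetical error inflation per 10x chip growth

for levels in range(1, 5):
    physical = BASE_ERROR * CROSSTALK ** levels  # noisier qubits on the bigger chip
    logical = physical / 100 ** levels           # EC suppression from concatenation
    print(f"{1_000 * 10**levels:>9,} qubits: logical error {logical:.0e}",
          "ok" if logical <= TARGET else "not yet")
```

As long as the crosstalk factor stays below the ~100x suppression per level, adding levels eventually wins; if crosstalk ever grows as fast as the suppression, no amount of concatenation helps, and that is exactly the part nobody can promise in advance.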

[–] WolfLink@sh.itjust.works 2 points 3 weeks ago

That’s a good point, and it's part of why there is a lot of active research into quantum networking. Once you can connect two otherwise independent quantum computers, you no longer face the increasing crosstalk and other difficulties of producing larger individual quantum chips. Instead, you can produce multiple copies of the same chip and connect them together.