[HN Gopher] More on whether useful quantum computing is "imminent"
___________________________________________________________________
More on whether useful quantum computing is "imminent"
Author : A_D_E_P_T
Score : 82 points
Date : 2025-12-21 20:53 UTC (12 hours ago)
HTML web link (scottaaronson.blog)
TEXT w3m dump (scottaaronson.blog)
| prof-dr-ir wrote:
| I am confused, since even factoring 21 is apparently so difficult
| that it "isn't yet a good benchmark for tracking the progress of
| quantum computers." [0]
|
| So the "useful quantum computing" that is "imminent" is not the
| kind of quantum computing that involves the factorization of
| nearly prime numbers?
|
| [0] https://algassert.com/post/2500
| bawolff wrote:
| I always find this argument a little silly.
|
| Like if you were building one of the first normal computers,
| how big a number you can multiply would be a terrible
| benchmark, since once you have figured out how to multiply
| small numbers it's fairly trivial to multiply big numbers. The
| challenge is making the computer multiply numbers at all.
|
| This isn't a perfect metaphor, as scaling is harder in a
| quantum setting, but we are mostly at the stage where we are
| trying to get the things to work at all. Once we reach the
| stage where we can factor small numbers reliably, the time to
| go from smaller numbers to bigger numbers will probably be
| relatively short.
| jvanderbot wrote:
| From my limited understanding, that's actually the opposite
| of the truth.
|
| In QC systems, the engineering "difficulty" scales very badly
| with the number of gates or steps of the algorithm.
|
| It's not like addition, where you can repeat a process in
| parallel and bam-ALU. From what I understand as a layperson,
| the size of the inputs is absolutely part of the scaling.
| dmurray wrote:
| But the reason factoring numbers is used as the quantum
| benchmark is exactly that we have a quantum algorithm for
| that problem which is meant to scale better than any known
| algorithm on a classical computer.
|
| So it seems like it takes an exponentially bigger device to
| factor 21 than 15, then 35 than 21, and so on, but if I
| understand right, at some point this levels out and it's
| only relatively speaking a little harder to factor say
| 10^30 than 10^29.
|
| Why are we so confident this is true given all of the
| experience so far trying to scale up from factoring 15 to
| factoring 21?
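|
| (A rough, constants-free sketch of that asymptotic gap, for
| intuition only, assuming the textbook cost models: Shor's
| algorithm needs on the order of n^3 elementary operations for
| an n-bit modulus, while the general number field sieve, the
| best known classical algorithm, is sub-exponential in n.)
|
|     from math import exp, log
|
|     def shor_ops(n_bits):
|         # Modular exponentiation dominates Shor's circuit: ~n^3
|         # gates, with constants and polylog factors omitted.
|         return n_bits ** 3
|
|     def gnfs_ops(n_bits):
|         # GNFS heuristic cost:
|         # exp((64/9)^(1/3) (ln N)^(1/3) (ln ln N)^(2/3)).
|         ln_n = n_bits * log(2)
|         return exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3)
|                    * log(ln_n) ** (2 / 3))
|
|     for bits in (16, 64, 256, 1024, 2048):
|         print(bits, shor_ops(bits), f"{gnfs_ops(bits):.3e}")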
| sfpotter wrote:
| The fact that it does appear to be so difficult to scale
| things up would suggest that the argument isn't silly.
| thrance wrote:
| Actually yes, how many numbers you can crunch per second and
| how big they are were among the first benchmarks for actual
| computers. Also, these prototypes were almost always
| immediately useful. (Think of the computer that cracked
| Enigma).
|
| In comparison, there is no realistic path forward for scaling
| quantum computers. Anyone serious who is not trying to sell
| you QC will tell you that quantum systems become
| exponentially less stable the bigger they are and the longer
| they live. That is a fundamental physical truth. And since
| they're still struggling to do _anything at all_ with a
| quantum computer, don't get your hopes up too much.
| dboreham wrote:
| This is quite fallacious and wrong. The first computers were
| built in order to solve problems immediately that were
| already being solved slowly by manual methods. There never
| was a period where people built computers so slow that they
| were slower than adding machines and slide rules, just
| because they seemed cool and might one day be much faster.
| upofadown wrote:
| Perhaps? The sort of quantum computers that people are talking
| about now are not general purpose. So you might be able to make
| a useful quantum computer that doesn't run Shor's algorithm.
| gsf_emergency_6 wrote:
| Simulating the Hubbard model for superconductors at large
| scales is significantly more likely to happen sooner than
| factoring RSA-2048 with Shor's algorithm.
|
| Google have been working on this for years
|
| Don't ask me if they've the top supercomputers beat, ask
| Gemini :)
| bahmboo wrote:
| I particularly like the end of the post where he compares the
| history of nuclear fission to the progress on quantum computing.
| Traditional encryption might already be broken but we have not
| been told.
| bawolff wrote:
| In a world where spying on civilian communication of
| adversaries (and preventing spying on your own civilians) is
| becoming more critical for national security interests, I
| suspect that national governments would be lighting more of a
| fire if they believed their opponents had one.
| the8472 wrote:
| NSA _is_ pushing for PQ algos.
| mvkel wrote:
| They absolutely are. NSA is obsessed with post-quantum
| projects atm
| tyre wrote:
| But is this because they are already needed or because they
| want past and present encrypted data to stay protected in a
| post-quantum future?
| mvkel wrote:
| The latter
| GrilledChips wrote:
| tbh they could just be pushing for people to adopt newer,
| less-tested, weaker algorithms: switch from something
| battle-hardened to the QuantResist2000 algorithm, which
| they've figured out how to break with lattice reduction and
| a couple of GPUs, like those Minecraft guys did.
| littlestymaar wrote:
| I really doubt we are anywhere close to this when there has
| been no published legitimate prime factorization beyond _21_:
| https://eprint.iacr.org/2025/1237.pdf
|
| Surely if someone managed to factorize a 3- or 4-digit number,
| they would have published it, as it's far enough from
| weaponization to be worth publishing. To be used to break
| cryptosystems, you need to be able to factor at least
| 2048-bit numbers. Even assuming progress is linear with
| respect to the number of bits in the public key (this is the
| theoretical lower bound, and it assumes hardware scaling is
| itself linear, which doesn't seem to be the case), there's a
| pretty big gap between 5 and 2048 bits, and the fact that
| no one has ever published any significant result showing
| progress in that direction (that is, not a magic trick of
| choosing the number in a way that makes the calculation
| trivial, see my link above) suggests we're not under any kind
| of immediate threat.
|
| The reality is that quantum computing is still very, very
| hard, and very, very far from delivering what is
| theoretically possible.
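|
| (For a rough sense of that gap, a minimal sketch in Python,
| assuming a Beauregard-style textbook circuit for Shor's
| algorithm, which needs roughly 2n + 3 logical qubits for an
| n-bit modulus, before any error-correction overhead:)
|
|     # The largest honestly factored number, 21, is 5 bits wide;
|     # an RSA-2048 modulus is 2048 bits wide.
|     for n_bits, label in [(5, "N = 21"), (2048, "RSA-2048")]:
|         print(label, "~", 2 * n_bits + 3, "logical qubits")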
| ktallett wrote:
| As someone who works in quantum computing research, both
| academic and private, no, it isn't imminent in my understanding
| of the word, but it will happen. We are still at a point
| comparable to 1960s general computing development: many
| different platforms, and we have sort of decided on the best
| next step, but we still have many issues to solve. A lot of the
| key issues have solutions; the problem is more getting everyone
| to focus in the right direction, which will also mean funding
| starts to focus in the right direction. There are snake oil
| sellers right now, and life will be immensely easier when they
| are removed.
| andsoitis wrote:
| > it will happen.
|
| If you were to guess what reasons there might be that it WON'T
| happen, what would some of those reasons be?
| ktallett wrote:
| So in my view, the issues I think about now are:
|
| - Too few researchers. In my area of quantum computing, I
| would say there is only one other group that has any academic
| rigour and is actually making significant and important
| progress. The two other groups are using non-reproducible
| results to win credit and funding from private companies. You
| also have FAANG-style companies doing research, and the
| research that comes out is still clearly for funding. It
| doesn't stand up under scrutiny of method (there usually
| isn't one, although that will soon change, as I am in the
| process of producing a recipe to get to the point we are
| currently at, which is as far as anyone is) or repeatability.
|
| - Too little progress. This is due to the research focus
| being spread too thin. We currently have the classic digital
| (qubit) vs analogue (photonic) quantum computing fight, and
| even within each we have such broad variation in where to
| focus. So each category is still really just at the start,
| because we are going in so many different directions. We
| aren't pooling our resources and trying to make progress
| together. This is also where a lack of openness regarding
| results and methods harms us, as does a lack of automation.
| Most significant research is done by human hand, which means
| building on it at a different research facility often
| requires learning from the person who developed the method in
| person if possible, or at worst developing a method all over
| again, which is a waste of time. If we don't see results, the
| funding won't be there. Classical computing eventually found
| a use case and then became useful to the public, but I fear
| we may not get to that stage because we may take too long.
|
| As an aside, we may also get to a stage where it is useful,
| but only in a military/security setting. I have worked on a
| security project (surprisingly, I was not bound by any NDA,
| but I'm still wary) featuring a quantum setup that could, of
| sorts, be compared to a single-board computer (say an ESP32),
| although much larger. There is some value to it, and that
| particular project could be implemented into security right
| now (I do not believe it has been or will be; I believe it
| was a viability study) and isn't that far off. But that
| particular project has no uses outside of military/security.
| ecshafer wrote:
| Wouldn't the comparison be more like the 1920s for computing?
| We had useful working computers in the 1940s working on real
| problems, doing what was not possible beforehand. By the
| 1950s we had computers doing nuclear bomb simulations, and by
| the 1960s we had computers in banks doing accounting and
| inventory. So we had computers by then, not in homes, but we
| had them. In the 1920s we had mechanical calculators and
| theories of computation emerging, but not a general-purpose
| computer. Until we have a quantum computer doing work at
| least at the level of a digital computer, I can't really
| believe it's the 1960s.
| ktallett wrote:
| I'm not going to pretend that I am that knowledgeable about
| classical computing history from that time period. I was
| primarily going off the fact that the transistor was built in
| the late 1940s, and I would say we have the quantum version of
| that in both qubit- and photonic-based computing; they work,
| and we have been developing them for some time now. The key
| difference is that there are many more steps to get to the
| stage of making them useful. The transistor became useful
| extremely quickly, whereas in quantum computing, these just
| haven't quite yet.
| tokai wrote:
| Not to be snarky, but how is it comparable to 60's computing?
| There was a commercial market for computers and private and
| public sector adoption and use in the 60s.
| ktallett wrote:
| There is private-sector adoption and planning now of specific
| single-purpose quantum devices in military and security
| settings. They work and exist, although I do not believe they
| are installed. I may be wrong on the exact date, as my
| classical computing knowledge isn't spot on. The point I was
| trying to make was that we have all the bits we need. We have
| the ability to make the photonic quantum version of a
| transistor (which, spoiler alert, is where the focus needs to
| move, rather than the qubit method of quantum computing), so
| we have hit the 1950s at least. The fundamentals at this
| point won't change. What will change is how they are put
| together and how they are made durable.
| layer8 wrote:
| What makes you confident that it will happen?
| mvkel wrote:
| The people who are inside the machine are usually the least
| qualified to predict the machine's future
| uejfiweun wrote:
| What an idiotic take. The LEAST qualified? Should I go ask
| some random junkie off the street where quantum computing
| will be in 5 years?
| charcircuit wrote:
| Yes. Apply wisdom of the crowd.
| GrilledChips wrote:
| In the '60s we actually had extremely capable, fully developed
| computers: advanced systems like the IBM System/360 and CDC
| 6600.
|
| Quantum computing is currently stuck somewhere in the 1800's,
| when a lot of the theory was still being worked out and few
| functional devices had even been constructed.
| eightysixfour wrote:
| Did anyone else read the last two paragraphs as "I AM NOT ALLOWED
| TO TELL YOU THINGS YOU SHOULD BE VERY CONCERNED ABOUT" in bright
| flashing warning lights or is it just me?
| William_BB wrote:
| Just you
| ktallett wrote:
| It is more that many companies can't do what they claim to
| do, or they have done it once at best with no consistency
| since. I sense most companies in the quantum computing space
| right now are of this ilk. As someone who works in academic
| and private quantum computing research, repeatability and
| methodology are severely lacking, which always rings alarm
| bells. Some companies are funded off the back of one very
| poor quality research paper, reviewed by people who are not
| experts, which then leads to a company that looks
| professional but which, behind the scenes, I would imagine is
| saying "Oh shit, now we actually have to do this thing we
| said we could do."
| bahmboo wrote:
| I don't think he is saying that. As I said in my other comment
| here, I think he is just drawing a potential parallel to other
| historic work that was done in a private (secret) domain. The
| larger point is that we simply don't know, so it's best to act
| as if it will be broken, even if it hasn't been already. Hence
| the move to post-quantum cryptography is probably a good idea!
| griffzhowl wrote:
| Aaronson says:
|
| > This is the clearest warning that I can offer in public
| right now about the urgency of migrating to post-quantum
| cryptosystems...
|
| That has a clear implication that he knows something he
| doesn't want to say publicly.
| machinationu wrote:
| A cryptosystem is expected to resist attack for 30 years.
|
| It doesn't need to be imminent for people to start moving to
| post-quantum now.
|
| If he thinks we are 10 years away from QC, we need to start
| moving now.
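|
| (This is essentially Mosca's rule of thumb; a minimal Python
| sketch, where the 5-year migration figure is an illustrative
| assumption rather than a number from the thread:)
|
|     # At risk whenever secrecy lifetime + migration time exceeds
|     # the years until a cryptographically relevant quantum computer.
|     def at_risk(secrecy_years, migration_years, years_to_qc):
|         return secrecy_years + migration_years > years_to_qc
|
|     print(at_risk(secrecy_years=30, migration_years=5, years_to_qc=10))  # True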
| andrewflnr wrote:
| Very much so. But the specificity and severity of what he
| knows is not clear just from this. Not necessarily to the
| point of "bright flashing warning lights" as the top-level
| comment put it. Anyway, I certainly am glad that people are
| (as far as I can tell?) more or less on top of the post-
| quantum transition.
| belter wrote:
| I ran it through ROT13, base64, reversed the bits, and then
| observed it... The act of decoding collapsed it into... not
| imminent...
| svara wrote:
| He's making it sound that way, although he might plausibly deny
| that by claiming he just doesn't want to speculate publicly.
|
| Either way he must have known people would read it like you did
| when he wrote that; so we can safely assume it's boasting at
| the very least.
| tromp wrote:
| I realize this is a minority opinion, and goes against all
| theories of how quantum computing works, but I just cannot
| believe that nature will allow us to reliably compute with
| amplitudes as small as 2^-256. I still suspect something will
| break down as we approach and move below the Planck scale.
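|
| (For scale, a minimal sketch of where such amplitudes arise,
| assuming a uniform superposition: over n qubits, each of the
| 2^n basis states has amplitude 2^(-n/2), so 512 qubits already
| puts individual amplitudes at 2^-256:)
|
|     # Per-basis-state amplitude of a uniform superposition over
|     # n qubits; 2048 qubits already lands in subnormal doubles.
|     for n in (10, 512, 2048):
|         print(n, 2.0 ** (-n / 2))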
| semi-extrinsic wrote:
| Fun fact: the Planck mass is about 22 micrograms, about the
| amount of Vitamin D in a typical multivitamin supplement, and
| the corresponding derived Planck momentum is 6.5 kg m/s, which
| is around how hard a child kicks a soccer ball. Nothing
| inherently special or limiting about these.
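|
| (A quick check of those figures from the defining constants,
| as a minimal sketch:)
|
|     from math import sqrt
|
|     hbar = 1.054571817e-34  # J s
|     c = 2.99792458e8        # m/s
|     G = 6.67430e-11         # m^3 kg^-1 s^-2
|
|     m_planck = sqrt(hbar * c / G)  # ~2.18e-8 kg, i.e. ~22 micrograms
|     p_planck = m_planck * c        # ~6.5 kg m/s
|     print(m_planck, p_planck)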
| ttoinou wrote:
| If you look at Planck units or any dimensionless set of
| physical units, you will see that mass stands apart from the
| other units. There's like a factor of 10^15 or something like
| that, i.e. we can't scale all physical units to be around the
| same values; something is going on with mass and gravity that
| makes it different from the others.
| adrian_b wrote:
| After computing in 1899, for the first time, the value of
| what is now named "Planck's constant" (despite the fact that
| Planck computed both of the constants that are now named
| "Boltzmann's constant" and "Planck's constant"), Planck
| immediately realized that this provides an extra value,
| besides those previously known, which can be used to define a
| natural unit of measurement.
|
| Nevertheless, Planck did not understand well enough the
| requirements for a good system of fundamental units of
| measurement (because he was a theoretician, not an
| experimentalist; he had computed his constants by a better
| mathematical treatment of the experimental data provided by
| Paschen), so he did not find any good way to integrate
| Planck's constant into a system of fundamental units. He made
| the same mistake Stoney had made 25 years before him (after
| computing the value of the elementary electric charge): he
| chose the wrong method for defining the unit of mass among
| the two variants previously proposed by Maxwell (deriving the
| unit of mass from the mass of some atom or molecule, or
| deriving it from the Newtonian constant of gravitation).
|
| All dimensionless systems of fundamental units are
| worthless in practice (because they cause huge
| uncertainties in all values of absolute measurements) and
| they do not have any special theoretical significance.
|
| For the number of independently chosen fundamental units of
| measurement there exists an optimum value, and systems with
| either more or fewer fundamental units lead to greater
| uncertainties in the values of the physical quantities and
| to superfluous computations in the mathematical models.
|
| The dimensionless systems of units are not simpler, but
| more complicated, so attempting to eliminate the
| independently chosen fundamental units is the wrong goal
| when searching for the best system of units of measurement.
|
| My point is that the values of the so-called "Planck units"
| have absolutely no physical significance, therefore it is
| extremely wrong to use them in any reasoning about what is
| possible or impossible or about anything else.
|
| In a useful system of fundamental units, for all units
| there are "natural" choices, except for one, which is the
| scale factor of the spatio-temporal units. For this scale
| factor of space-time, in the current state of knowledge
| there is no special value that can be distinguished from
| other arbitrary choices, so it is chosen solely based on
| the practical ease of building standards of frequency and
| wave-number that have adequate reproducibility and
| stability.
| Aardwolf wrote:
| Once quantum computers are possible, are there actually any
| other real-world applications, besides breaking crypto and
| number theory problems, that they can do, and do much better
| than regular computers?
| comicjk wrote:
| Yes, in fact they might be useful for chemistry simulation long
| before they are useful for cryptography. Simulations of quantum
| systems inherently scale better on quantum hardware.
|
| https://en.wikipedia.org/wiki/Quantum_computational_chemistr...
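|
| (A minimal sketch of the scaling argument behind that claim:
| simulating n two-level systems classically with a full
| statevector needs 2^n complex amplitudes, while a quantum
| simulator needs roughly one qubit per simulated two-level
| system.)
|
|     # Memory for a classical statevector over n two-level
|     # systems, at 16 bytes per complex128 amplitude.
|     for n in (10, 30, 50):
|         print(n, "qubits:", (2 ** n) * 16 / 2 ** 30, "GiB")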
| GrilledChips wrote:
| More recently it's turned out that quantum computers are less
| useful for molecular simulation than previously thought. See:
| https://www.youtube.com/watch?v=pDj1QhPOVBo
|
| The video is essentially an argument from the software side
| (ironically she thinks the hardware side is going pretty
| well). Even if the hardware weren't so hard to build or scale,
| there are surprisingly few problems where quantum algorithms
| have turned out to be useful.
| smurda wrote:
| One theoretical use case is "Harvest Now, Decrypt Later" (HNDL)
| attacks, or "Store Now, Decrypt Later" (SNDL). If an oppressive
| regime saves encrypted messages now, they can decrypt later
| when QCs can break RSA and ECC.
|
| It's a good reason to implement post-quantum cryptography.
|
| Wasn't sure if you meant crypto (btc) or cryptography :)
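|
| (A minimal sketch of the usual defense against harvest-now,
| decrypt-later: derive the session key from both a classical
| and a post-quantum shared secret, so recorded traffic stays
| safe unless an attacker can eventually recover both. The two
| input secrets below are hypothetical placeholders, standing in
| for e.g. an X25519 output and an ML-KEM output, not real
| key-exchange code:)
|
|     import hashlib
|
|     def hybrid_session_key(classical_secret, pq_secret, context):
|         # Hash-based combiner: recovering the session key
|         # requires recovering BOTH input secrets.
|         return hashlib.sha3_256(
|             classical_secret + pq_secret + context).digest()
|
|     key = hybrid_session_key(b"\x01" * 32, b"\x02" * 32,
|                              b"example-protocol-v1")
|     print(key.hex())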
| GrilledChips wrote:
| I will never get used to ECC meaning "Error Correcting Code"
| or "Elliptic Curve Cryptography." That said, this isn't
| unique to quantum expectations. Faster classical computers or
| better classical techniques could make various problems
| easier in the future.
| OJFord wrote:
| What do you want it to mean?
| layer8 wrote:
| From TFA: 'One more time for those in the back: the main
| _known_ applications of quantum computers remain (1) the
| simulation of quantum physics and chemistry themselves, (2)
| breaking a lot of currently deployed cryptography, and (3)
| eventually, achieving some _modest_ benefits for optimization,
| machine learning, and other areas (but it will probably be a
| while before those modest benefits win out in practice). To be
| sure, the detailed list of quantum speedups expands over time
| (as new quantum algorithms get discovered) and also contracts
| over time (as some of the quantum algorithms get dequantized).
| But the list of known applications "from 30,000 feet" remains
| fairly close to what it was a quarter century ago, after you
| hack away the dense thickets of obfuscation and hype.'
| GrilledChips wrote:
| It turns out they're not so useful for chemistry.
| https://www.youtube.com/watch?v=pDj1QhPOVBo
| hattmall wrote:
| I believe the most practical use would be compression. Devices
| could have quantum decoder chips that give us massive
| compression gains, which could also massively expand storage
| capacity. Even modest chips far before the realization of the
| scale necessary for cryptography breaking could give
| compression gains on the order of 100 to 1000x. IMO that's the
| real game changer. The theoretical modeling and cryptography
| breaking that you see papers being published on is much further
| out. The real work that isn't being publicized because of the
| importance of trade secrets is on storage / compression.
| tgi42 wrote:
| I worked in this field for years and helped build one of the
| recognizable companies. It has been disappointing to see, once
| again, promising science done in earnest be taken over by
| grifters. We knew many years ago that it was going to take FAR
| fewer qubits to crack encryption than pundits (and even experts)
| believed.
| jasonmarks_ wrote:
| Zero money take: quantum computing looks like a bunch of
| refrigerator companies.
|
| The fact that error correction seems to be struggling implies
| unaccounted-for noise that is not heat. Who knows, maybe
| gravitational waves wreck your setup no matter what you do!
| willmadden wrote:
| We'll know when all of the old Bitcoin P2PK addresses, and
| addresses that have been transacted from, are swept.
| GrilledChips wrote:
| The funny thing is that nobody will ever do that. The moment
| someone uses quantum computing or any other technology to
| crack Bitcoin in a visible way, the coins they just gave to
| themselves become worthless because confidence collapses.
| Traubenfuchs wrote:
| Cloud providers will love it when we need to buy more compute
| and memory for post-quantum TLS.
| nacozarina wrote:
| Another late signal will be a funding spike.
|
| Once someone makes a widget that extracts an RSA payload,
| their government will seize, spend & scale.
|
| They will try to keep it quiet, but they will start a
| spending spree that will be visible from space.
___________________________________________________________________
(page generated 2025-12-22 09:00 UTC)