Given that a reader here is a professor of this quantum stuff

Can we get an opinion as to whether this is true or not?

Australian researchers have demonstrated that a quantum version of computer code can be written on a silicon microchip with the highest level of accuracy ever recorded.

A quantum computer uses atoms rather than transistors as its processing unit, allowing it to conduct multiple complex calculations at once and at high speed. In the race to build the first functional quantum computer, scientists around the world have been trying to write quantum code in a range of materials such as caesium, aluminium, niobium titanium nitride and diamond.

But researchers at the University of NSW have long been basing their research around silicon, because silicon is the building block of all modern electronic devices, which would make quantum code in a silicon microchip easier, more cost-effective and highly scalable.

For the first time, they managed to entangle a pair of quantum bits – units of quantum information also known as qubits – in silicon. Qubits allow computers to access code vastly richer than the digital codes used in normal computers, which gives quantum computers their superior power.

??

27 thoughts on “Given that a reader here is a professor of this quantum stuff”

  1. It’s not code writing; it is establishing the basic building blocks in hardware. Silicon is good because we are very good at making very small, very accurate stuff in it.

    I don’t know nearly enough about qubit hardware design (just the subsequent usage) to authoritatively comment but there is no theoretical reason why you couldn’t use silicon.

    Quite a lot of the journalistic interpretation is nonsense of course, so no wonder you are having issues.

  2. The actual factual statements look pretty plausible. The hype is, well, hype.

    Incremental advance, one of many, of interest to us specialists but hardly a fundamental breakthrough.

  3. Ditto. Ever since I was an undergraduate in computer science, functional quantum computers have been five years away. That was about twenty years ago.

    The same for AI. Definitely more science fiction than reality.

  4. Oh quantum computers are real, and have been for almost 20 years now. What will take the time is making them useful rather than toys.

    Personally I would say we’re now 10+ years away. A breakthrough in 5 years or less strikes me as unlikely.

    What’s not often talked about is the problem of a shortage of quantum algorithms, and there has been little or no progress in that area. Interesting that the article is putting its hope in quantum simulation (which is what the last few paragraphs are obliquely referring to). That’s probably right, but represents quite a ratcheting down in the rhetoric.

  5. So Much For Subtlety

    But researchers at the University of NSW have long been basing their research around silicon

    I know someone at the University of NSW. It is not like the Nigerian university mentioned elsewhere. It is part of the Australian equivalent of the Russell Group.[1] So it might be wrong but ought to be taken seriously.

    [1] That is, they train vets as well as teach sheep dipping.

  6. I don’t know whether I’m the intended “professor of quantum stuff” (I’m only a humble “doctor of quantum stuff”) and it’s not exactly my field, but I guess I have slightly more knowledge than the average member of the public. Unfortunately, we have a lot of above-average members of the public commenting here 🙂 and everyone’s said what I’d say already:
    1. Yep, what the researchers have done is entirely plausible; they’ve just shown that you can set up the required mechanisms. The novelty is doing so in a substance that is already very well-known in regular computing.
    2. Sir Thomas Beecham could have been talking about the Guardian’s relationship with science when he quipped, “They don’t understand it, but they love the noise it makes.”
    3. UNSW is definitely a reputable research institution.
    4. Quantum computing is not likely to be something that radically alters our everyday experience of computation. I can expand, but that would be a truly expansive comment.

  7. Thank you for that, although no, you weren’t quite the professor I had in mind. Jonathan up above is the Oxford professor of something or other quantum…

  8. “Given that a reader here is a professor of this quantum stuff”

    That me. I make own quantum puter with hammer. But hurts my hed.

  9. The number of qubits you need to entangle to be useful: 1,000. The maximum number anybody has been able to entangle, after decades of trying: 14.

    I got flamed at Samizdata for pointing this out when they were getting excited about the D-Wave “quantum computer”. Barring a miraculous breakthrough, quantum computers are science fiction.

  10. “the same for AI.” I can assure you that AI has been just around the corner since I was an undergraduate (figuratively; and literally too, as it happens). That gets you back into the sixties. Back then the field was dominated by prima donnas: is it still?

  11. The same for translation and writing in general. Predictions of our demise (usually within 3 years) have been circulating for the >15 years I’ve spent in these games. We’ve now got some just-about useable automated text generation stuff, but it tends to make us more accurate, rather than reducing manpower.

    I expect to retire with similar rumours still frightening the ‘cruits.

  12. When I started in computing in 1972 I was told computers writing their own programs was “just around the corner”. Don’t hear that one much these days.

  13. Roue le jour,

    I remember “The Last One” from the early 80s that was hyped as putting programmers out of business. Then 4GLs in the early 90s. I’m sure there’s someone out there with a “build web solutions without programmers” advert.

  14. Bloke in Costa Rica

    When, 25 years ago, I was doing 3rd-year solid state physics at uni, there were three areas that everyone was buzzing about: a) blue LEDs and laser diodes, b) high electron-mobility transistors (HEMTs) and c) quantum dots (QDs). Of these, blue optical semiconductors have become ubiquitous and are a game-changer in a way that a lot of people do not appreciate; HEMTs have found wide application in RF, from mobile phones to radar; and quantum dots are going to be useful “real soon now”. QDs have a whole range of gee-whiz vapourware applications, qubits being one of them. You might plausibly be able to do some useful stuff in e.g. chemistry with a few dozen qubits, but the one that has everyone hyped up is factoring RSA moduli using Shor’s algorithm, and to break RSA-2048 you probably need in the region of 3,000 qubits. The current record for quantum factoring is 56153 = 233 × 241, which you can find yourself by trial division on an envelope in a couple of minutes (or less, if you look close to √56153 first).
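A minimal sketch (illustrative code, not from the comment) of just how envelope-sized that classical factoring job is, using plain trial division:

```python
import math

# Factor the quantum-factoring record 56153 by plain trial division.
def smallest_factor(n):
    """Return the smallest nontrivial factor of composite n."""
    for f in range(2, math.isqrt(n) + 1):
        if n % f == 0:
            return f
    raise ValueError("n is prime")

n = 56153
p = smallest_factor(n)
print(f"{n} = {p} x {n // p}")  # 56153 = 233 x 241
```

Searching downwards from ⌊√56153⌋ = 236 instead finds 233 within four steps, which is the comment’s point about looking close to the square root first.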

  15. You don’t need atomic switches to do parallel processing. For instance, a processor’s AND instruction feeds all the bits of each operand into each side of loads of AND gates; it doesn’t AND bit 0 with bit 0, then bit 1 with bit 1, and so on. With the right wiring, almost any CPU operation can be done by feeding the whole of the source into the wiring at the same time and getting the whole of the result out of the other end, all at the same time.
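The word-level parallelism that comment describes can be made concrete with a small (purely illustrative) Python sketch, comparing one whole-word AND against the equivalent bit-serial loop:

```python
# One word-level AND processes every bit position at once; the loop
# below does the same job one bit at a time, for comparison.
a, b = 0b1100_1010, 0b1010_0110

word_result = a & b  # all eight bit positions ANDed simultaneously

serial_result = 0
for i in range(8):                        # bit 0, then bit 1, then ...
    bit = ((a >> i) & 1) & ((b >> i) & 1)
    serial_result |= bit << i

assert word_result == serial_result
print(bin(word_result))  # 0b10000010
```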

  16. “You don’t need atomic switches to do parallel processing.”

    True. But if you have 1000 qubits you can do 2^1000 operations simultaneously. You’d need a fair bit of wiring to do that with parallel processing…

  17. I thought the point of this was using existing tool sets, which greatly decreases the costs and, hopefully, the development time. Given that one of the methods for producing chips already is Atomic Layer Deposition, it seems they are just fine-tuning.
    My son, who was thinking of engineering at the time, spent the first week of his study-your-career thing with a plasma/ion-beam team developing nanoscale manufacture, in the clean room and design office; the second week he spent soldering wiring looms.

  18. The Stigler,
    Yeah, I remember The Last One. And I went to the VB launch at Wembley in ’91. Join the dots. What could be easier? Programming for the masses!

    BiCR
    When I got interested in QC in 2001, I wrote a QC simulator. Not difficult if you can do 3D; it’s just a bunch of rotations. There are several commercial solutions. A good graphics card can simulate a QC bigger than any that has actually been built, and, given that real QCs have a non-trivial set-up time, faster too.
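For flavour, a toy state-vector simulator of the kind described takes only a few lines. This sketch (illustrative code, not the commenter’s) applies a Hadamard gate to one qubit of a two-qubit register:

```python
import math

# Hadamard gate: a single-qubit unitary (a rotation/reflection).
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply_single_qubit_gate(state, gate, target):
    """Apply a 2x2 gate to the `target` qubit of a state vector."""
    new_state = [0j] * len(state)
    for i, amp in enumerate(state):
        if amp == 0:
            continue
        bit = (i >> target) & 1          # target qubit's value in basis state i
        for new_bit in (0, 1):
            j = (i & ~(1 << target)) | (new_bit << target)
            new_state[j] += gate[new_bit][bit] * amp
    return new_state

# |00> through a Hadamard on qubit 0 becomes (|00> + |01>)/sqrt(2).
state = [1 + 0j, 0j, 0j, 0j]
state = apply_single_qubit_gate(state, H, target=0)
print(state)
```

Each extra qubit doubles the length of the state vector, which is exactly the scaling problem the next comment describes.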

  19. Roue le Jour,

    It is indeed straightforward to simulate small QCs; the problems come when you start to scale things up.

    Representing the general state of an n-qubit QC (assuming for simplicity that you can confine yourself to pure-state systems) needs 2^n complex numbers, and if you are using gates outside the Clifford group you basically have to represent the full state and do explicit operations on it. Just representing the general state of a 40-qubit device would need about 10TB of memory.

    If you’re interested in the Clifford group then look up the Gottesman–Knill theorem. But note that although all sorts of interesting things can be done within the Clifford group none of the really interesting things can be done that way.
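The 2^n memory wall is easy to tabulate. A sketch assuming one single-precision complex amplitude (8 bytes) per basis state, which is roughly consistent with the 10TB figure above:

```python
# Memory for the full state vector of an n-qubit register, assuming
# 8 bytes (single-precision complex) per amplitude.
def statevector_bytes(n_qubits, bytes_per_amplitude=8):
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (20, 30, 40, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 1024**3:,.3f} GiB")
```

Every extra qubit doubles the requirement: 40 qubits comes to 8 TiB, in line with the “about 10TB” above, and 50 qubits is already 8,192 TiB.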
