I've been writing a 'book' (more of an extended blog post that I'd like to put out for free) attempting to explain quantum computing to a layman-ish audience.
I sort of got inspired to do this after seeing so many QC PR posts on HN, and finding the educational material in this space to be either too academic, too narrow in scope, or totally facile. I think given the incredible hype (and potential promise) of this industry, there should be on-ramps for technically minded people to get an understanding of what's going on. I don't think you should need to be a quantum physicist to be able to follow the field (I am only an electrical engineer).
My book tries to cover the computational theory, the actual hardware implementations, and the potential applications of quantum computers. More than that, I want to be unbiased and steer clear of what I feel is misleading hype. It's been a work in progress for about 6 months now, with a lot of time spent gaining fluency in the field. But the end is in sight! :)
One thing I would like to see addressed is the misconception that QC can help turn NP problems into P. I see this floating around from time to time.
Yes, totally. I feel like the computational complexity part of quantum computing is actually pretty well explained to the 'layman' by some of Scott Aaronson's work, but unfortunately it's not well placed in context (i.e. it very much focuses on the theoretical CS, and not the whole QC picture). You have to sort of start digging for material about computational complexity theory/quantum and stumble into his output.
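To put rough numbers on it (a back-of-the-envelope sketch, not taken from any particular source): for unstructured search, the best known quantum speedup is Grover's algorithm, which is quadratic, so the work is still exponential in the problem size. That's why brute-forcing NP problems doesn't suddenly become tractable:

```python
import math

# Rough query counts for unstructured search over N = 2^n items:
# classical brute force needs ~N checks, Grover's algorithm ~(pi/4)*sqrt(N).
# A quadratic speedup helps, but the cost stays exponential in n,
# so NP-hard problems do not collapse into P.
for n in (20, 40, 60):
    N = 2 ** n
    classical = N
    grover = math.ceil((math.pi / 4) * math.sqrt(N))
    print(f"n={n}: classical ~{classical:.1e} queries, Grover ~{grover:.1e}")
```

Even at n=60, Grover still needs on the order of 10^8 queries, versus ~10^18 classically: a big constant-factor-style win in practice, but nothing like a polynomial-time algorithm.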
Awesome! Anywhere we can look for updates, like a website?
FWIW, my shallow understanding of quantum computing as a programmer, in case you wanted perspectives from your potential audience:
- I thought quantum physics was a sham? Like on par with string theory. But apparently that's not true
- I hear QC only breaks certain kinds of cryptography algorithms (involving factoring big primes?), and that we can upgrade to more foolproof algorithms.
- I hear that one of the main challenges is improving error bounds? I'm not sure how error is involved and how it can be wrangled to get a deterministic or useful result
- Idk what a qubit is or how you make one or how you put several together
Planning on starting a substack/blog soon!
Your questions are a helpful bar-setter for me, and more or less align with the questions I had when starting out on this project (sans the skepticism of quantum mechanics itself, which I take as a given). Going down your list:
- Yeah, there's a distinction between asymmetric and symmetric encryption schemes. Asymmetric schemes are typically used to establish a shared secret key, which is then used for the ensuing symmetrically encrypted communications. Those asymmetric schemes are broadly vulnerable to quantum-based attacks, hence the push to upgrade to 'post-quantum cryptography' (PQC). PQC approaches have been developed and are slowly being rolled out, even though it's unclear when the threat of quantum-enabled cracking will become real.
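To make the asymmetric step concrete, here's a toy Diffie-Hellman key exchange (tiny, illustrative numbers only; real deployments use ~2048-bit moduli or elliptic curves). It's this class of math problem, discrete logs and integer factoring, that Shor's algorithm would break on a large enough quantum computer:

```python
# Toy Diffie-Hellman with tiny numbers (illustrative values only;
# real systems use ~2048-bit moduli or elliptic curves).
p, g = 23, 5          # public modulus and generator
a, b = 6, 15          # Alice's and Bob's secret exponents
A = pow(g, a, p)      # Alice publishes A = g^a mod p
B = pow(g, b, p)      # Bob publishes B = g^b mod p
shared_alice = pow(B, a, p)   # Alice computes B^a mod p
shared_bob = pow(A, b, p)     # Bob computes A^b mod p
assert shared_alice == shared_bob   # both derive the same shared secret
print(shared_alice)
# Security rests on recovering a from A = g^a mod p being hard
# (the discrete log problem) -- exactly the kind of problem Shor's
# algorithm solves efficiently on a large fault-tolerant quantum computer.
```

Once both sides hold the shared secret, all the bulk traffic is encrypted symmetrically (e.g. with AES), which is only mildly affected by quantum attacks; it's this initial exchange that PQC schemes replace.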
- Yes, I cover this extensively. It actually relates to your last question as well, since error depends in part on which qubit platform you're working with. A superconducting qubit naturally 'decoheres' (loses its unique state) over time, at a semipredictable rate, whereas photonic qubits sometimes just get lost! All platforms have some built-in error because the gates you apply to them are essentially analog, and that imprecision can build up over millions of operations. I'd characterize the challenges as A) reducing error, and B) correcting the errors that inevitably occur.
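A quick back-of-the-envelope on why those per-gate errors matter so much (the error rates below are illustrative round numbers, not measured figures for any real device): if each gate fails independently with probability p, a circuit of g gates runs cleanly with probability roughly (1 - p)^g.

```python
# Illustrative only: assume each gate fails independently with probability p,
# so a g-gate circuit succeeds with probability (1 - p)^g.
for p in (1e-3, 1e-4):
    for gates in (1_000, 100_000, 1_000_000):
        success = (1 - p) ** gates
        print(f"p={p}, gates={gates}: success ~{success:.3g}")
# Even a 0.01% per-gate error rate leaves essentially no chance that a
# million-gate circuit runs cleanly -- hence the push for quantum error
# correction rather than relying on hardware improvements alone.
```

This is why the field talks about "logical" qubits built out of many physical qubits: error correction trades qubit count for a lower effective error rate.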
- This was one of my sticking points too. The short answer is that there are a few different modalities all competing to be 'the one', and no one really knows what's going to win out. They all have their own (dis)advantages.
No, quantum physics is not a sham; lasers, for example, are an application of it. Using quantum physics principles in non-scientific contexts ('thoughts are entangled!') or in arbitrary macroscopic ones (since electrons can tunnel through a barrier, a human can pass through a wall) is an entirely different thing.
I would be very interested to know where you got some of these misconceptions and half-truths.
I think I watched some educational TV program like 15 years ago that did a poor job explaining quantum physics, or overhyped it and set off my BS detector. Idk. A weird mix of poor memory, miscommunication, and outdated information, I think. EDIT: oh, and Schrödinger's cat! Doesn't make sense to me.
The latter points were things I gathered from skimming recent headlines and articles. I should read more thoroughly.
As a comparison, take a look at this book -> https://nostarch.com/quantum-computing. I found it very accessible!
Thanks for sharing this. I think there are a few books in this vein: intros to quantum algorithms, basically. I particularly liked Thomas Wong's “Introduction to Classical and Quantum Computing”. But I'm not as interested in the math details, and I want my book to go further. I haven't found any (layman-friendly) text that breaks down the actual mechanisms of operation behind the qubit modalities, considers the implications of successful QC development, etc.
I'd like to see that post when it's ready!
Would like to check this out when you're ready to share.
Following!