• Escapado 2 days ago

    My master's thesis was on using machine learning techniques to synthesise quantum circuits. Since any operation on a QC can be represented as a unitary matrix, my research topic was: given a set of gates, how many of them, and in what arrangement, would let you generate or at least approximate this matrix using ML. Another aspect was, given a unitary matrix, could a neural network predict the number of gates needed to simulate that matrix as a QC and thereby give us a measure of complexity. It was a lot of fun to test different algorithms, from genetic algorithms to neural network architectures. Back then NNs were a lot smaller and I trained them mostly on one GPU, and since the matrices grow exponentially with the number of qubits in the circuit it was only possible for me to investigate small circuits with fewer than a dozen qubits, but it was still nice to see that in principle this worked quite well.
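
    To make this concrete, here is a toy sketch of the search problem (not my actual thesis code; the H/T gate set, the distance measure, and plain random search in place of a GA or NN are illustrative assumptions):

      import numpy as np

      # Toy discrete gate set: H and T generate a dense subgroup of SU(2)
      H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
      T = np.array([[1, 0], [0, np.exp(1j * np.pi / 4)]], dtype=complex)
      GATES = {"H": H, "T": T}

      def compose(sequence):
          # Multiply the gate matrices in circuit order
          U = np.eye(2, dtype=complex)
          for name in sequence:
              U = GATES[name] @ U
          return U

      def distance(U, V):
          # Global-phase-invariant distance between two unitaries
          d = U.shape[0]
          return np.sqrt(abs(1.0 - abs(np.trace(U.conj().T @ V)) / d))

      # Target: an arbitrary single-qubit rotation to approximate
      theta = 0.3
      target = np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]], dtype=complex)

      rng = np.random.default_rng(0)
      best_seq, best_dist = None, np.inf
      for _ in range(20000):
          seq = list(rng.choice(list(GATES), size=rng.integers(1, 20)))
          dist = distance(compose(seq), target)
          if dist < best_dist:
              best_seq, best_dist = seq, dist
      print(f"best distance {best_dist:.4f} with {len(best_seq)} gates: {best_seq}")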

    • Xcelerate 2 days ago

      > given a unitary matrix, could a neural network predict a number of gates needed to simulate that matrix as a QC and thereby give us a measure of complexity

      That's really interesting. I'm curious—did you explore whether the predictivity of the neural network was influenced by any hidden subgroup structure in the unitary matrix? Seems like the matrix symmetries could play a significant role in determining the gate complexity.

      • Escapado 2 days ago

        I didn't include this in my thesis, but from what I remember from looking at hundreds if not thousands of matrices and their QC solutions, some symmetries would immediately mean that far fewer gates were needed, but it could also be deceptive, and it completely depended on which gate set you have available. Take the matrix for the quantum Fourier transform, for example: with a gate set of phase gates and a Hadamard gate, the exact solution needs n Hadamard gates and n(n-1)/2 controlled-phase gates for an n-qubit circuit, even though the matrix is highly symmetrical. If your gate set was Clifford + Toffoli you would be able to do it with n*log(n). And then depending on how closely you want to approximate it you could get away with even less. But I have not gone into further analysis of which symmetries would have which effect on which gate set and whether they would influence predictivity. It would be fun to investigate for sure!
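
        To make the count concrete, here is the textbook Hadamard + controlled-phase construction (a sketch in Qiskit, not my thesis code; the final qubit swaps are omitted), whose gate counts match the n and n(n-1)/2 figures:

          from qiskit import QuantumCircuit
          import numpy as np

          def qft(n):
              # Textbook QFT: a Hadamard on each qubit, followed by
              # controlled phase rotations from every later qubit
              qc = QuantumCircuit(n)
              for j in range(n):
                  qc.h(j)
                  for k in range(j + 1, n):
                      qc.cp(np.pi / 2 ** (k - j), k, j)
              return qc

          for n in (3, 5, 8):
              print(n, dict(qft(n).count_ops()))  # {'h': n, 'cp': n*(n-1)//2}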

      • IIAOPSW 2 days ago

        How did it do compared to the baseline of the Solovay-Kitaev algorithm (and the more advanced algorithms that experts have come up with since)?

        How (if at all) can the ML approach come up with a circuit when the unitary is too big to represent explicitly as all the terms in the matrix, but its general form is known? E.g. it is known what the quantum Fourier transform on n qubits is defined to be, and consequently any particular element within its matrix is easy to calculate, but you don't need (and shouldn't try) to write it out as a 2^n x 2^n matrix to figure out its implementation.

        • Escapado 2 days ago

          I have not investigated this as much as I should have, but if I remember correctly there were cases where the NN approach yielded smaller solutions. It would be a great follow-up to the thesis.

          The way this was turned into an optimization problem was to have the NN take an identity matrix as input and then include a custom layer to generate SU(2) unitaries (in exponential form), for which the phase parameters are the learned parameters of the NN. Similarly for global XX gates. From this the final unitary can be computed and compared against the desired unitary, and a loss function can be derived. I remember it was a little fiddly to implement these custom layers in TensorFlow since many of the functions didn't work with complex numbers. In short, a circuit structure was assumed (alternating between a global XX gate and single-qubit operations), for which the phases of their generating matrices were the learned parameters of the network. Then multiple topologies (how many steps in the QC) could be validated. I think the coolest result was that NNs consistently outperformed other optimisation strategies for learning those parameters.
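
          A rough sketch of that optimization setup (using scipy's BFGS in place of the TensorFlow network; the alternating structure and learned phases are as described above, but the exact parametrization is my assumption):

            import numpy as np
            from scipy.linalg import expm
            from scipy.optimize import minimize

            n, layers = 2, 3
            dim = 2 ** n
            I2 = np.eye(2, dtype=complex)
            X = np.array([[0, 1], [1, 0]], dtype=complex)
            Z = np.array([[1, 0], [0, -1]], dtype=complex)

            def kron_all(ops):
                out = np.array([[1.0 + 0j]])
                for op in ops:
                    out = np.kron(out, op)
                return out

            # Generator of the global XX gate: sum of X_j X_k over all qubit pairs
            H_xx = sum(kron_all([X if q in (j, k) else I2 for q in range(n)])
                       for j in range(n) for k in range(j + 1, n))

            def circuit_unitary(params):
                # Alternate a layer of single-qubit SU(2) rotations (exponential
                # form) with one global XX gate; the phases are the parameters
                U = np.eye(dim, dtype=complex)
                p = iter(params)
                for _ in range(layers):
                    U = kron_all([expm(-1j * (next(p) * X + next(p) * Z))
                                  for _ in range(n)]) @ U
                    U = expm(-1j * next(p) * H_xx) @ U
                return U

            def infidelity(params, target):
                # 1 - |Tr(V^dag U)| / d, insensitive to global phase
                return 1.0 - abs(np.trace(target.conj().T @ circuit_unitary(params))) / dim

            # Random 2-qubit target unitary via QR decomposition
            rng = np.random.default_rng(1)
            target, _ = np.linalg.qr(rng.normal(size=(dim, dim))
                                     + 1j * rng.normal(size=(dim, dim)))
            x0 = rng.normal(size=layers * (2 * n + 1))
            res = minimize(infidelity, x0, args=(target,), method="BFGS")
            print("final infidelity:", res.fun)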

          • westurner 2 days ago

            Solovay-Kitaev theorem: https://en.wikipedia.org/wiki/Solovay%E2%80%93Kitaev_theorem

            /? Solovay-Kitaev theorem Cirq Qiskit: https://www.google.com/search?q=Solvay-Kiteav+theorem+cirq+q...

            qiskit/transpiler/passes/synthesis/solovay_kitaev_synthesis.py: https://github.com/Qiskit/qiskit/blob/main/qiskit/transpiler...

            qiskit/synthesis/discrete_basis/solovay_kitaev.py: https://github.com/Qiskit/qiskit/blob/stable/1.2/qiskit/synt...

            SolovayKitaevDecomposition: https://docs.quantum.ibm.com/api/qiskit/qiskit.synthesis.Sol...
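
            Example usage of the documented pass (the recursion_degree and rx example follow the Qiskit docs; treat as a sketch):

              from qiskit import QuantumCircuit
              from qiskit.transpiler.passes.synthesis import SolovayKitaev

              # A continuous rotation to discretize into a finite gate set
              circuit = QuantumCircuit(1)
              circuit.rx(0.8, 0)

              skd = SolovayKitaev(recursion_degree=2)
              discretized = skd(circuit)
              print(discretized.count_ops())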

            What are more current alternatives to the Solovay-Kitaev theorem for gate-based quantum computing?

          • sigmoid10 2 days ago

            At which university? I literally know a guy who did exactly the same thing for his master's thesis. I'm wondering whether the world is that small or this specific topic is that common.

            • Escapado 2 days ago

              University of Hamburg in Germany.

            • OscarCunningham 2 days ago

              It would be interesting if finding quantum circuits were one of the things that quantum computers were good at.

            • aithrowawaycomm 2 days ago

              Outside of a few odd citations in the intro (which is innocuous), I don't see anything shady or dishonest in this paper after a cursory skim. But I do worry about 60% of the authors being NVIDIA employees, where the corresponding author is not a staff scientist but rather a "technical marketing engineer" whose job "focuses on inspiring developers by sharing examples of how NVIDIA quantum technologies can accelerate research."[1] So I wonder what articles didn't make it into this review, where ANN/GPU-accelerated QC research didn't work very well or was too expensive to justify. It makes me suspect I am being advertised to, and that this paper must be read with an unusually high degree of scrutiny.

              Like I said: there's nothing obviously shady or dishonest here. But this is also an arXiv paper, so there's no need for the authors to disclose conflicts of interest. (OpenAI has pioneered the technique of using arXiv to distribute academic infomercials with zero scientific value.) And I worry about the distorting effect this stuff has on honest scientific work. Bell Labs was always an independent subsidiary: Bell's marketing staff did not take ownership of Bell Labs' research. The fact that NVIDIA's marketer has a PhD and an "engineer" title doesn't actually mitigate any of this.

              [1] https://developer.nvidia.com/blog/author/mawolf/

              • alw4s 2 days ago

                meanwhile the $5M prize for the first real application of quantum computing remains open...

                quantum computing, like string theory, remains a scratch waiting for an itch - i.e., a boondoggle.

                to be fair though, it is worth looking at something that has the potential to break all of the encryption on the internet.

              • dr_dshiv 2 days ago

                The most remarkable thing is using frontier models to write and then run code on quantum computers (e.g. IBM's). It's amazing — you can even teach with a "creative coding" approach (without multiple courses in quantum physics).

                It's not even the code that is the hardest part. In general, it is very difficult to frame a given problem domain (i.e., any other field of science or an optimization problem) as a problem addressable with a quantum computer. Quantum computer scientists have not made this easy—with LLMs it still isn't trivial, but it is a huge leap in accessibility.

                Warning: LLMs hallucinate a ton within this area. Many things can go wrong. But the fact that sometimes they are correct is amazing to me.

                We've run some studies showing the importance of expertise in this area: participants with a quantum background and coding skills were much more effective at solving quantum problems with LLMs than novices, for instance.

                • medo-bear 2 days ago

                  My next paper will be titled

                  Artificial Intelligence for Quantum Computing with Applications to Blockchain ... in Rust

                  • dghughes 2 days ago

                    Used to find a cure for cancer, but only on a Friday.

                  • bastloing 2 days ago

                    Throw in some crypto and you've got a recipe for some great word salad!

                    • kaycebasques 2 days ago

                      Blockchain and AI algorithms yield novel insights in superconductor quantum computing

                      • dartos 2 days ago

                        I really believe that the confluence of these paradigm-shifting technologies will enable even greater gains in productivity.

                        With our new AI enabled smart algorithms, we can meta-analyze transactions in systems such as blockchain networks to identify areas where resources can be more effectively aggregated.

                        We also have tight integration with Salesforce and Jira, enabling next-gen teams to conduct research and ship products with a much flatter management structure.

                        I wish word salad artist were a profession.

                    • JanisErdmanis 2 days ago

                      One thing that comes to mind looking at the examples is the fragility of existing implementation processes for quantum computers. Pulse optimisation, frequency allocation to minimise cross-talk between qubits, and the development of custom error correction codes tailored to specific quantum computers, circuits, or computations are unpredictable in their successes and, hence, inherently unscalable. It looks like the primary use for AI here is to optimise benchmark results for publishing rather than to enable practical quantum computations.

                      • vivzkestrel 2 days ago

                        My friend and I used to joke a few years ago about how to get VC funding, and at this rate it's about to come true soon. VC: what are you building? Me: decentralized deep learning powered vertical farming using quantum computing on the blockchain, for onions.

                        • dartos 2 days ago

                          That’s always how it works with tech investors

                        • itchyjunk 2 days ago

                          Petition to add AI to HN so it lists out the gist of papers and articles posted xD. I also always vaguely thought AI would overlap with quantum annealing in some way, because both are optimization processes. But maybe this is where QC helps AI and not the other way around.

                          • l33tbro 2 days ago

                            People can deploy that personally if they wish. The abstract captures the gist in most cases.

                          • mg 2 days ago

                            Is there a metric which lets us track how far/close we are to doing something useful with quantum computers?

                            And when we look at that metric and extrapolate its progression over the last few years, when do we expect QCs to hit datacenters?

                            • wzeng 2 days ago

                              You can track resource estimates for different problems against QC performance here: https://metriq.info/progress

                              New submissions to the chart are crowdsourced, so if you see any missing, just open a PR.

                              • foresttony 2 days ago

                                QCs are in all major data centers already. There is not much you can do with them yet, but you can use them. IonQ is considered the leader in QC right now. Look them up, very interesting stuff they are doing. They just partnered with AstraZeneca to start developing useful applications. They will also demonstrate hybrid algorithms next week where they offload some work to QPUs to speed up models. I would say within two years QCs will start being useful.

                                • almostgotcaught 2 days ago

                                  > QCs are in all major data centers already

                                  Citation needed

                                • bawolff 2 days ago

                                  I think https://xkcd.com/678/ is relevant.

                                  Basically, we are far enough out from "useful" QC that it's anyone's guess (except for the people trying to convince you to invest in their startup). That said, relevant metrics include the number of qubits and their error rate (once you get past a certain number with a low enough error rate you can start to use error correction, which is when things really start to become interesting).
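
                                  To illustrate why the threshold matters, a toy calculation (p_L ~ 0.1 * (p/p_th)^((d+1)/2) is a common surface-code heuristic; the constants are illustrative assumptions):

                                    # Below threshold, logical error falls off exponentially in code distance d
                                    p_th = 1e-2                   # rough surface-code threshold
                                    for p in (5e-3, 1e-3):        # physical error rates
                                        for d in (3, 11, 25):     # code distances
                                            p_L = 0.1 * (p / p_th) ** ((d + 1) / 2)
                                            print(f"p={p:g} d={d}: logical error ~ {p_L:.1e}")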

                                  It does depend a bit on what you mean by "useful" though; see also https://en.wikipedia.org/wiki/Noisy_intermediate-scale_quant...

                                  • vtomole 2 days ago

                                    Yes, QC is far enough out that it's "anyone's guess", but the field is actively working on sliding the answer from "anyone's guess" to "a bit more certain". It will never be 100% certain until useful QC appears, but we can decrease the probability of our predictions being pure guesswork. As an example, DARPA is funding a project to find the first high-impact QC applications https://www.darpa.mil/work-with-us/publications-highlighting... along with finding out when the first hardware that can run those applications can be built https://www.darpa.mil/work-with-us/quantum-benchmarking-init....

                                    QC startups should be funded because industry is a crucial component of QC progress, and large-scale QC labs (Google, IBM, etc.) can't work on all the ideas. The ideas that come from startups do accelerate QC development.

                                    • foresttony 2 days ago

                                      Good job commenting on something you know nothing about...

                                      • bawolff a day ago

                                        If you think I said something incorrect or misleading, by all means feel free to correct the record.

                                    • hulitu 8 hours ago

                                      I think there was a comparison with the Commodore 64 some time ago.

                                    • Jabbs 2 days ago

                                      Aliens for Time Travel (is how I read this)

                                      • bee_rider 2 days ago

                                        Well, of course aliens are for time travel. The universe is far too long; we don't get to see anything interesting if we show up after everybody is dead.

                                      • hulitu a day ago

                                        > Artificial Intelligence for Quantum Computing

                                        They forgot the blockchain. /s