Number of qubits needed to simulate the universe

  • #1
HighPhy
The other day my computer science professor gave an advanced course on the importance of quantum computers.

At one point, he justified the claim that quantum computers have practically unlimited capabilities by saying that with ##800## qubits we could (he did not say "we can") represent the entire universe.

What are the arguments in favor of that thesis? Where did the number ##800## come from?

What I don't understand is the deeper meaning of the number of qubits needed to simulate/describe the universe. AFAIK, the idea is that the observable universe can only have ##2^n## possible states, and a quantum computer with ##n## qubits can have ##2^n## possible states. So a quantum computer with ##n## qubits must have a state that describes the entire observable universe.
Googling suggests that QCs with 1000 qubits have now been built, so what's the point? I'm really struggling.

Moreover, the number ##800## does not convince me. If ##2^n## is the number of representable states, ##3.57 \times 10^{80} \ \mathrm{m^3}## is the volume of the observable universe, and ##(1.616 \times 10^{-35} \ \mathrm{m})^3## is the Planck length cubed, the number of qubits would be given by $$\log_2 \left(\dfrac{3.57 \times 10^{80}}{(1.616 \times 10^{-35})^3}\right) \approx 615 \ \mathrm{qubits}$$ Not ##800##.
Another point of confusion.
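For what it's worth, the estimate above can be reproduced in a few lines of Python, using the same figures (the ##615## comes from rounding up the logarithm):

```python
import math

# Reproduce the back-of-envelope estimate: bits needed to label every
# Planck-volume cell in the observable universe.
V_universe = 3.57e80      # volume of the observable universe, m^3
l_planck = 1.616e-35      # Planck length, m

n_cells = V_universe / l_planck**3   # number of Planck-volume cells
n_qubits = math.log2(n_cells)        # ~614.3

print(math.ceil(n_qubits))  # -> 615, not 800
```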
 
  • #2
I'm no expert, but it sounds like an exaggeration of the importance of representing states. It takes very little information, for example, to represent an arbitrary chess position. That hardly helps you simulate a game, let alone solve the game.
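To put rough numbers on the chess analogy (a sketch only: the 13 states per square ignore castling rights, en passant, and side to move):

```python
import math

# A chess *position* is cheap to represent: 64 squares, each holding one
# of 13 states (empty, or 6 piece types in 2 colours).
bits_per_position = math.ceil(64 * math.log2(13))

# Shannon's classic estimate puts the game-tree complexity near 10**120,
# so representing a state says nothing about searching the game.
shannon_number = 10**120

print(bits_per_position)   # -> 237 bits, i.e. a position fits in ~30 bytes
```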
 
  • #3
The other issue is that our theories (e.g. QFT and GR) may involve an infinite amount of information that can only be approximated by a finite simulation. E.g. particle scattering using perturbation theory generates an infinite series of integrals (Dyson series), which can be represented by an infinite number of Feynman diagrams. Even if there are only two particles, the theory is not contained by finite mathematics.
 
  • #4
PeroK said:
The other issue is that our theories (e.g. QFT and GR) may involve an infinite amount of information that can only be approximated by a finite simulation. E.g. particle scattering using perturbation theory generates an infinite series of integrals (Dyson series), which can be represented by an infinite number of Feynman diagrams. Even if there are only two particles, the theory is not contained by finite mathematics.
I now have greater clarity on the topic. But I am still left with a shaky understanding of one point: what does it mean to "simulate/describe/represent the universe" with a certain number of qubits sampled at the Planck length? What is the standard view on this?

Articles like this share a similar view, but I'm missing the point. If we already have quantum computers with more than 1000 qubits, what is the point of talking about representing the observable universe with 800 qubits? Maybe there is something I don't know?
 
  • #5
HighPhy said:
Articles like this share a similar view
You're unlikely to increase your understanding by reading science journalism. That link is not technically a valid reference for this forum. It sounds like the usual hyperbole and is not trustworthy.
 
  • #6
HighPhy said:
I now have greater clarity on the topic. But I am still left with a shaky understanding of one point: what does it mean to "simulate/describe/represent the universe" with a certain number of qubits sampled at the Planck length? What is the standard view on this?

Articles like this share a similar view, but I'm missing the point. If we already have quantum computers with more than 1000 qubits, what is the point of talking about representing the observable universe with 800 qubits? Maybe there is something I don't know?
The qubits in current quantum computers are not very good. If you had a quantum computer with 1000 perfect qubits you could indeed do a LOT of interesting calculations. However, today's qubits are far too noisy, and if you try to run a calculation using more than, say, ~50 qubits (at most) you are unlikely to get a useful answer.
One way around this is to use error correction codes that combine many "bad" physical qubits into one "good" logical qubit (which is what is then used to run calculations). The number of physical qubits you need per "good" qubit depends on the scheme used, but it is very possible that we will need at least ~a million physical qubits to create 1000 logical qubits. We are certainly not there yet!
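As an illustrative sketch of that overhead, assuming surface-code-style scaling in which a logical qubit at code distance ##d## costs roughly ##2d^2## physical qubits (the distance ##d = 27## below is a hypothetical choice, not a quoted design):

```python
# Illustrative error-correction bookkeeping (assumed surface-code-style
# scaling: ~2*d**2 physical qubits per logical qubit at code distance d).
def physical_per_logical(d: int) -> int:
    return 2 * d * d

d = 27                     # hypothetical code distance
logical_needed = 1000

print(physical_per_logical(d))                   # -> 1458
print(logical_needed * physical_per_logical(d))  # -> 1458000, ~a million
```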
 
  • #7
HighPhy said:
Articles like this share a similar view
PeroK said:
You're unlikely to increase your understanding by reading science journalism. That link is not technically a valid reference for this forum.
The link is neither technically a valid reference, nor factually correct:
Using superdense coding a qubit can hold up to two bits.
A classical bit can be either 0 or 1. A quantum bit, or qubit, is a superposition of 0 and 1.
A single qubit therefore takes 2 classical values at once.
The statement about superdense coding is correct, but the suggested connection to the superposition principle is simply nonsense.
We can see the pattern. One qubit can take the value of two bits. Two qubits can take the values of four bits. In general, n qubits can take the values of 2 to the power of n.
Superdense coding cannot be used to justify this claim. With superdense coding, you can encode ##2n## classical bits in ##n## qubits, which is perfectly consistent with the alleged pattern for ##n=1## and ##n=2##, but completely inconsistent with the conclusion that ##n## qubits could encode ##2^n## classical bits.
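The mix-up is easy to see numerically: superdense coding is linear (##2n## bits from ##n## qubits), while the state-space dimension is exponential (##2^n##), and the two agree only at ##n = 1## and ##n = 2##:

```python
# Compare what superdense coding actually gives (2n bits) with the
# article's claimed pattern (2**n bits). They coincide only at n = 1, 2.
for n in range(1, 6):
    superdense_bits = 2 * n
    claimed_bits = 2 ** n
    print(n, superdense_bits, claimed_bits, superdense_bits == claimed_bits)
# -> agreement at n=1 and n=2, divergence from n=3 onwards
```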
 
  • #8
HighPhy said:
The other day my computer science professor gave an advanced course on the importance of quantum computers.

At one point, he justified the claim that quantum computers have practically unlimited capabilities by saying that with 800 qubits we could (he did not say "we can") represent the entire universe.
It is probably not a good idea to try to argue with your professor by proxy. Both of you would profit more if you tried to establish a working communication, where you can ask when something he said confuses you, independent of whether his statement was mistaken, you misunderstood what he tried to say, or he communicated it in a very confusing way.

Maybe his thinking was the other way round: if you wanted to simulate 800 qubits on a classical computer, then you would need more classical bits than the entire universe could provide. (Based on our current understanding of the situation.)
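A quick sketch of why that direction is hopeless: a general state of ##n## qubits has ##2^n## complex amplitudes, and for ##n = 800## that already dwarfs the roughly ##10^{80}## atoms in the observable universe:

```python
# A general 800-qubit state has 2**800 complex amplitudes. Python's
# arbitrary-precision integers let us count them exactly.
n = 800
amplitudes = 2 ** n
digits = len(str(amplitudes))    # number of decimal digits of 2**800

print(digits)                    # -> 241, i.e. 2**800 ~ 10**240
print(amplitudes > 10 ** 80)     # -> True: far more amplitudes than atoms
```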
 
  • #9
gentzen said:
It is probably not a good idea to try to argue with your professor by proxy. Both of you would profit more if you tried to establish a working communication, where you can ask when something he said confuses you, independent of whether his statement was mistaken, you misunderstood what he tried to say, or he communicated it in a very confusing way.

Maybe his thinking was the other way round: if you wanted to simulate 800 qubits on a classical computer, then you would need more classical bits than the entire universe could provide. (Based on our current understanding of the situation.)
You are absolutely right about the first part. 20/20.

However, my professor explicitly said that "you can represent the universe with 800 qubits, and this is a testament to the fact that quantum computers have unlimited and not restricted functions, at least potentially."

Is that true?

In addition, he is a firm believer in the impact of quantum computers and claims that "they are the future and will be fully integrated on a large scale in 20-30 years."

Although there is no evidence against the advent of quantum computers, I don't know what to think. Doesn't this seem too strong a claim? Especially when motivated in this way? I am asking because I obviously have no experience in this area.
 
  • #10
It is impossible to know. If you want a comparison to classical computers, it is a bit like trying to extrapolate the power of a modern supercomputer from the performance of a vacuum-tube computer from the 1940s. It is possible to demonstrate the basic functionality, but making predictions about where the technology will go in 30 years is obviously very hard.
 
  • #11
HighPhy said:
You are absolutely right about the first part. 20/20.

However, my professor explicitly said that "you can represent the universe with 800 qubits, and this is a testament to the fact that quantum computers have unlimited and not restricted functions, at least potentially."

Is that true?

In addition, he is a firm believer in the impact of quantum computers and claims that "they are the future and will be fully integrated on a large scale in 20-30 years."
I agree we shouldn't be arguing by proxy. What you are saying, and claiming your professor said, is hearsay. It is of no value to debate it here. We have no way of corroborating that he actually said that or anything like it.
 
  • #12
PeroK said:
I agree we shouldn't be arguing by proxy. What you are saying, and claiming your professor said, is hearsay. It is of no value to debate it here. We have no way of corroborating that he actually said that or anything like it.
Yes, you are absolutely right. I just wanted to make sure that my doubts about that were at least reasonable.
 
  • #13
Qubits are all part of the Universe. I wonder how many qubits would be needed to characterise a real qubit circuit.
 
  • #14
sophiecentaur said:
Qubits are all part of the Universe. I wonder how many qubits would be needed to characterise a real qubit circuit.
Could you expand on this?
 
  • #15
HighPhy said:
However, my professor explicitly said that "you can represent the universe with 800 qubits, and this is a testament to the fact that quantum computers have unlimited and not restricted functions, at least potentially."

Is that true?
Is quantum computing part of what your computer science professor tries to teach you? If not, then this could be a nice topic for a discussion over a glass of wine, but nothing to worry about. Maybe he just used quantum computing as an example of computation which goes beyond classical computation, as far as computational complexity is concerned.

Of course it is not true, if taken literally. You don't even need experience in this area to see this:
sophiecentaur said:
Qubits are all part of the Universe. I wonder how many qubits would be needed to characterise a real qubit circuit.
But it could still make a nice discussion in a private setting.

HighPhy said:
In addition, he is a firm believer in the impact of quantum computers and claims that "they are the future and will be fully integrated on a large scale in 20-30 years."
My personal belief is that Craig Gidney is spot-on that "the mundane uses of quantum computers" will turn out to be the important part:
https://algassert.com/post/2300

My personal belief is also that quantum computers won't crack RSA, not even in 20-30 years. Or if they indeed crack RSA, then there is also a polynomial or quasi-polynomial classical algorithm doing the same.

HighPhy said:
Although there is no evidence against the advent of quantum computers, I don't know what to think. Doesn't this seem too strong a claim? Especially when motivated in this way? I am asking because I obviously have no experience in this area.
Perhaps the claim is just there for entertainment. No need to get upset over it, like calling it "too strong a claim". Maybe you are both just amateurs in this area?
 
  • #16
gentzen said:
Is quantum computing part of what your computer science professor tries to teach you? If not, then this could be a nice topic for a discussion over a glass of wine, but nothing to worry about. Maybe he just used quantum computing as an example of computation which goes beyond classical computation, as far as computational complexity is concerned.
No, quantum computing is not part of what my professor teaches me. This was a one-off in-depth lecture for those interested (outside the regular program). I chose to participate because these topics fascinate me. It was not part of any normal course.

gentzen said:
But it could still make a nice discussion in a private setting.
Sorry, I didn't understand what you mean here.

gentzen said:
My personal belief is that Craig Gidney is spot-on that the "The mundane uses of quantum computers" will turn out to be the important part:
https://algassert.com/post/2300

My personal belief is also that quantum computers won't crack RSA, not even in 20-30 years. Or if they indeed crack RSA, then there is also a polynomial or quasi-polynomial classical algorithm doing the same.
Really interesting. I had heard of RSA before, but I never delved into it on my own.

gentzen said:
Perhaps the claim is just there for entertainment. No need to get upset over it, like calling it "too strong a claim". Maybe you are both just amateurs in this area?
It is possible. But I would like to say that I am not upset or angry about this. I am just slightly confused, because my professor is a very good professor and quantum computing is within his background expertise. Although it is not the subject he teaches me, he should not be an amateur (as I am). I asked on the forum because many of my colleagues have taken these claims literally, and in an internal debate only I and one other colleague were not so sure about these arguments. So I wanted to understand more.
 
  • #17
HighPhy said:
It is possible. But I would like to say that I am not upset or angry about this. I am just slightly confused, because my professor is a very good professor and quantum computing is within his background expertise. Although it is not the subject he teaches me, he should not be an amateur (as I am). I asked on the forum because many of my colleagues have taken these claims literally, and in an internal debate only I and one other colleague were not so sure about these arguments. So I wanted to understand more.
Your professor is not "wrong" as such; there are lots of respected scientists working in the field who would agree with him. While much of it is just hype, there IS a reason why billions of dollars are being invested in quantum computing R&D.
To some extent it just depends on how optimistic one is about solving the various problems people are currently working on. There is no known big "fundamental" problem as such in QC, but there are a lot of different challenges (mainly engineering challenges) that need to be solved before one can build a fully error-corrected large-scale quantum computer.

Also, 20-30 years is quite a long time; most companies and nations have roadmaps that are much more aggressive than that. Having a "useful" quantum computer by 2035 is a common goal (this is e.g. the official goal of the UK, and I think the US has something similar).
 
  • #18
gentzen said:
It is probably not a good idea to try to argue with your professor by proxy.
And it is also a good idea to ask him what he means, rather than some random folks on the internet. After all, you're paying good money for that.

I suspect his claim has more than a few caveats - for example, does the universe the quantum computer is trying to simulate contain a quantum computer that is trying to simulate it? :smile:
 
  • #19
Vanadium 50 said:
I suspect his claim has more than a few caveats - for example, does the universe the quantum computer is trying to simulate contain a quantum computer that is trying to simulate it? :smile:
I had been thinking about this very thing, and I was thinking of motivating it by resorting to the principles of self-similarity and fractal structures.

My confusion comes from the fact that my professor used the words "simulate", "describe" and "represent" interchangeably. Could you kindly explain to me what is the deeper meaning of each of these in relation to the original question? Are they synonyms in this case?
 
  • #20
HighPhy said:
I had been thinking about this very thing, and I was thinking of motivating it by resorting to the principles of self-similarity and fractal structures.

My confusion comes from the fact that my professor used the words "simulate", "describe" and "represent" interchangeably. Could you kindly explain to me what is the deeper meaning of each of these in relation to the original question? Are they synonyms in this case?
I threw you the example of chess. What does it take to represent a chess position or game? What does it take to encapsulate the rules? And what does it take to simulate a realistic or high-quality game?

It's my understanding that QC will provide some specific computational capabilities, but will not replace classical computers.
 
  • #21
You must have some knowledge of what it takes to program a computer in order to understand QC, and some knowledge of what data is, what functional/processing requirements are, and what an algorithm or process that attempts to meet them is.
 
  • #22
PeroK said:
You must have some knowledge of what it takes to program a computer in order to understand QC, and some knowledge of what data is, what functional/processing requirements are, and what an algorithm or process that attempts to meet them is.
I understand your point and agree with it.

However, I have another question - different from this one - about Quantum Mechanics. Could I open a new thread to discuss it?
 
  • #23
HighPhy said:
I have another question - different from this one - about Quantum Mechanics. Could I open a new thread to discuss it?
A different question should be posted in a new thread, yes.
 
