Measurement, information and entropy

In summary, the conversation discusses the concept of a mixed state in quantum systems and its relation to entropy. It is clarified that a mixed state differs from a pure state and can only be expressed using a density matrix (or an equivalent description). The question of whether measuring a qubit in a mixed state yields a pure state is addressed, with the answer depending on the type of measurement and on whether the result is learned. It is also noted that the von Neumann entropy of a system in a pure state stays zero when a measurement gives a definite result. The conversation also touches on information entropy and how it differs from the von Neumann entropy.
  • #1
jk22
If we have a qubit in a mixed state, say ½|+><+| + ½|-><-|, and we measure it, is the result then the pure state |+> or |->? If this is the case, then the entropy of the system decreases.

Now the question the other way round:

Suppose we measure a quantum system without gaining information from it. Does the entropy of that quantum system then increase?
 
  • #2
I am surprised by your description of a "mixed state".
This:

½|+><+| + ½|-><-|

looks to me more like a density matrix, a statistical mix of states, which is different from

½|+> + ½|->

which I am used to calling a mixed state.

It could be that my vocabulary is not up-to-date.
Could you clarify?
 
  • #3
maajdl said:
½|+> + ½|->

which I am used to calling a mixed state.

That one is actually a pure state. Anything which can be written as a normalized linear combination of eigenstates is a pure state. Equivalently, you can say that everything you can express as a single ket is a pure state.

Now mixed states are statistical ensembles of pure states. You cannot express them just using a ket. You need to use a density matrix (or equivalent description) to describe them. The difference becomes clear when you have a look at interferences or similar stuff. Loosely speaking, for pure states, you need to calculate the square of your whole linear combination in order to get expectation values. For mixed states, you need to calculate the square of each pure state in your density matrix and you will get a weighted sum of those.

Of course you can also express pure states in terms of a density matrix. However, the distinction is pretty clear: if the trace of the squared density matrix is 1, you have a pure state. Otherwise it is a mixed state.
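A quick numerical illustration of that purity criterion (a minimal numpy sketch; representing ##|+\rangle## and ##|-\rangle## as an orthonormal basis of ##\mathbb{C}^2## is an assumption made here for concreteness):

```python
import numpy as np

# Represent |+> and |-> as an orthonormal basis of C^2
# (an assumption made here for concreteness).
plus = np.array([1.0, 0.0])
minus = np.array([0.0, 1.0])

# Pure state: the normalized superposition (|+> + |->)/sqrt(2), as |u><u|.
u = (plus + minus) / np.sqrt(2)
rho_pure = np.outer(u, u.conj())

# Mixed state: the statistical mixture 1/2 |+><+| + 1/2 |-><-|.
rho_mixed = 0.5 * np.outer(plus, plus) + 0.5 * np.outer(minus, minus)

# Purity test: Tr(rho^2) = 1 for a pure state, < 1 for a mixed one.
print(np.trace(rho_pure @ rho_pure).real)    # 1.0  -> pure
print(np.trace(rho_mixed @ rho_mixed).real)  # 0.5  -> mixed
```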

jk22 said:
If we have a qubit in a mixed state, say ½|+><+| + ½|-><-|, and we measure it, is the result then the pure state |+> or |->?

What exactly do you intend to do? Do you want to perform a single measurement, or do you want to repeat it many times? In the latter case, you will get a mixture of results, 50% being + and 50% being -. In each run, you will only get one of the results, of course.
 
  • #4
jk22 said:
If we have a qubit in a mixed state, say ½|+><+| + ½|-><-|, and we measure it, is the result then the pure state |+> or |->? If this is the case, then the entropy of the system decreases.

Now the question the other way round:

Suppose we measure a quantum system without gaining information from it. Does the entropy of that quantum system then increase?

How can you measure something without gaining information?
 
  • #5
jk22 said:
If we have a qubit in a mixed state, say ½|+><+| + ½|-><-|, and we measure it, is the result then the pure state |+> or |->? If this is the case, then the entropy of the system decreases.
Which entropy do you have in mind? The von Neumann entropy
$$
S_{vN} = -\operatorname{Tr}(\rho \ln \rho) = -\sum_{k,l} \rho_{kl} (\ln \rho)_{lk}
$$
does not change if the measurement on a system in a pure state gave a definite result, because then the new state is again a pure state.
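As a concrete check (a minimal sketch, computing ##S_{vN}## from the eigenvalues of ##\rho## with the usual convention ##0 \ln 0 = 0##): the 50/50 mixture from the question has ##S_{vN} = \ln 2 \approx 0.693##, while a state collapsed to a definite outcome has ##S_{vN} = 0##.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]   # drop zeros: 0 ln 0 is taken as 0
    return float(-np.sum(evals * np.log(evals)))

plus = np.array([1.0, 0.0])
minus = np.array([0.0, 1.0])

# The mixture 1/2 |+><+| + 1/2 |-><-| from the question.
rho_mixed = 0.5 * np.outer(plus, plus) + 0.5 * np.outer(minus, minus)
# State after a projective measurement whose result "+" was learned.
rho_after = np.outer(plus, plus)

print(von_neumann_entropy(rho_mixed))  # ln 2 ~ 0.693
print(von_neumann_entropy(rho_after))  # 0.0 -> pure state
```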

jk22 said:
Now the question the other way round:

Suppose we measure a quantum system without gaining information from it. Does the entropy of that quantum system then increase?

If the system began in a pure state and you measure it and learn the result, it ends up in a pure state: the von Neumann entropy stays 0. If you do not learn the result, the system may end up in a mixed state with non-zero von Neumann entropy.
 
  • #6
Thanks to Cthugha and Jano, this gave me many useful clues.

Jano said:
If the system began in a pure state and you measure it and learn the result, it ends up in a pure state: the von Neumann entropy stays 0. If you do not learn the result, the system may end up in a mixed state with non-zero von Neumann entropy.

This is what I supposed. Hence if I consider, for example, measuring the unit operator 1, which reveals nothing about the system, does the entropy increase? This would indicate that an operator giving non-maximal information (hence with degenerate eigenvalues) should in fact increase the entropy?


In fact I could specify a mixed state either by a density matrix or by a probability density. Here is something:

Suppose I consider a mixed state with a given probability density: rho(theta) uniform. Then its von Neumann entropy is the same as that of the mixture ½|+><+| + ½|-><-|. This seems a bit non-intuitive, since a mixture of all states would seem to have more entropy?
 
  • #7
jk22 said:
Suppose I consider a mixed state with a given probability density: rho(theta) uniform. Then its von Neumann entropy is the same as that of the mixture ½|+><+| + ½|-><-|. This seems a bit non-intuitive, since a mixture of all states would seem to have more entropy?
What is theta? Angle of the ket on the Bloch sphere?

If so, you just have to realize that the von Neumann entropy is just a functional of the density operator ##\rho##, characterizing the lack of certainty about the results of measurements on the system or ensemble of systems. It is not the information entropy nor the thermodynamic entropy, so do not rely on your intuition too much. If two density operators are the same, their von Neumann entropy is the same, irrespective of what else you know about the system/ensemble.

The lack of information about the state of the system or an ensemble of systems can be characterized using the information entropy: you can divide the interval ##\langle 0,\pi\rangle## into ##N## chunks and assign probability ##1/N## to each of them. The information entropy will then be

$$
S_{inf.} = -\sum_k p_k \ln p_k = \ln N.
$$

As you can see, the von Neumann entropy and the information entropy are different things and can have different values.
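This can be checked numerically (a minimal sketch; the parametrization ##|\theta\rangle = \cos\theta\,|+\rangle + \sin\theta\,|-\rangle## for ##\theta \in \langle 0,\pi\rangle## is an assumption about what ##\theta## means). The uniform mixture averages to the density operator ##I/2##, exactly the same operator as the 50/50 mixture, so its von Neumann entropy is ##\ln 2##, while the information entropy of the ##N##-chunk labeling grows as ##\ln N##:

```python
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

# Uniform mixture of the real kets |theta> = cos(theta)|+> + sin(theta)|->
# over theta in [0, pi), approximated by N equally weighted samples.
N = 1000
thetas = np.linspace(0.0, np.pi, N, endpoint=False)
rho = np.zeros((2, 2))
for t in thetas:
    ket = np.array([np.cos(t), np.sin(t)])
    rho += np.outer(ket, ket) / N

print(rho)                       # ~ [[0.5, 0], [0, 0.5]], i.e. I/2
print(von_neumann_entropy(rho))  # ~ ln 2, same as the 50/50 mixture
print(np.log(N))                 # information entropy ln N of the labels
```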
 
  • #8
I am thinking again about the terminology.
Is it logical to call this: ½|+><+| + ½|-><-| a "mixed state"?
After all, this is not a state of a system.
It is rather a mix of states, representing purely statistical uncertainty added on top of the quantum statistics.
Wouldn't it be better to call it a "mix of states" instead of a "mixed state"?
By the way, I checked here and there, and indeed "mixed state" is the accepted terminology.
 
  • #9
Yes, the terminology is quite bad. When you hear "state", what people often really mean is just the density matrix.
 
  • #10
maajdl said:
After all, this is not a state of a system.

Actually it is.

See chapter 2 - Ballentine - Quantum Mechanics - A Modern Development.

A state is a positive operator of unit trace. Pure states are of the form |u><u|. Mixed states are convex sums of pure states. It can be shown that every state is either pure or mixed.
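A small numerical illustration of that definition (a minimal sketch; the random ensemble is just an arbitrary example, not anything from the thread): any convex sum of pure projectors is a positive operator of unit trace.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random convex sum of pure states: sum_k p_k |u_k><u_k|.
K = 5
p = rng.random(K)
p /= p.sum()                    # convex weights: p_k >= 0, sum = 1
rho = np.zeros((2, 2), dtype=complex)
for k in range(K):
    u = rng.normal(size=2) + 1j * rng.normal(size=2)
    u /= np.linalg.norm(u)      # normalized ket |u_k>
    rho += p[k] * np.outer(u, u.conj())

# The defining properties: unit trace and a non-negative spectrum.
print(np.trace(rho).real)       # 1.0
print(np.linalg.eigvalsh(rho))  # all eigenvalues >= 0
```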

In this connection, Gleason's theorem is important:
http://kof.physto.se/theses/helena-master.pdf

Thanks
Bill
 

Related to Measurement, information and entropy

1. What is measurement?

Measurement is the process of determining the numerical value of a physical quantity. It involves comparing an unknown quantity to a known standard unit, such as a meter or a gram.

2. What is information?

Information is a measure of the amount of knowledge or data that is conveyed by a message. It can be represented by symbols or signals, and is often quantified using measures such as bits or bytes.

3. What is entropy?

Entropy is a measure of the disorder or randomness in a system. In physics, it is often used to describe the amount of energy that is unavailable to do work in a thermodynamic system. In information theory, it is a measure of the uncertainty or unpredictability in a message.

4. How are measurement, information, and entropy related?

Measurement, information, and entropy are all closely related concepts. Measurement involves quantifying physical quantities, which can provide information about a system. This information can then be used to calculate the entropy of the system, which describes the degree of disorder or randomness.

5. Why is entropy important?

Entropy is important because it is a fundamental concept in both physics and information theory. In thermodynamics, it helps us understand how energy flows and how systems reach equilibrium. In information theory, it allows us to quantify the amount of information in a message and to design efficient communication systems.
