I have been going through the following interesting paper on the foundations of Quantum Mechanics:
http://arxiv.org/pdf/0911.0695v1.pdf
'We define the state of a system as that mathematical object from which one can determine the probability for any conceivable measurement. Physical theories can have enough structure that it is not necessary to give an exhaustive list of all probabilities for all possible measurements, but only a list of probabilities for some minimal subset of them. We refer to this subset as fiducial set. Therefore, the state is specified by a list of d (where d depends on dimension N) probabilities for a set of fiducial measurements: p = (p1, . . . , pd). The state is pure if it is not a (convex) mixture of other states. The state is mixed if it is not pure. For example, the mixed state p generated by preparing state p1 with probability λ and p2 with probability 1 − λ, is p = λp1 + (1 − λ)p2. When we refer to an N-dimensional system, we assume that there are N states each of which identifies a different outcome of some measurement setting, in the sense that they return probability one for the outcome. We call this set a set of basis or orthogonal states. Basis states can be chosen to be pure. To see this assume that some mixed state identifies one outcome. We can decompose the state into a mixture of pure states, each of which has to return probability one, and thus we can use one of them to be a basis state. We will show later that each pure state corresponds to a unique measurement outcome.'
After thinking about it, there seems to be an assumption being made here - namely, in associating a mixed state with an ensemble of other states, they are assuming that if the list of probabilities they call a state is exactly the same as that of such an ensemble, then it is to be interpreted that way. I can't see how this follows from the definitions they make; rather, it seems to be an assumption following from their first axiom, 'All systems of the same information carrying capacity are equivalent'. However, if that assumption is at the very foundations of QM, then decoherence solves the measurement problem. For it means the improper mixed state that decoherence transforms a superposition into must be interpreted as an actual ensemble - it's assumed in the foundations.
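To make the point concrete, here is a small numerical sketch (my own illustration, not from the paper) of why the question even arises: the improper mixed state obtained by tracing out the environment after decoherence is the same mathematical object as the proper mixture λp1 + (1 − λ)p2, so no measurement on the system alone can tell them apart. The state vector and the value of λ below are arbitrary choices for illustration.

```python
import numpy as np

lam = 0.3  # arbitrary mixing probability for the example

# System entangled with an environment after decoherence:
# sqrt(lam)|0>|E0> + sqrt(1-lam)|1>|E1>, with orthogonal environment states.
psi = (np.sqrt(lam) * np.kron([1, 0], [1, 0])
       + np.sqrt(1 - lam) * np.kron([0, 1], [0, 1]))
rho_total = np.outer(psi, psi)  # 4x4 pure-state density matrix

# Partial trace over the environment gives the "improper" mixed state:
# rho_sys[s, s'] = sum_e rho_total[s, e, s', e]
rho_improper = np.zeros((2, 2))
blocks = rho_total.reshape(2, 2, 2, 2)
for e in range(2):
    rho_improper += blocks[:, e, :, e]

# "Proper" mixture: prepare |0> with probability lam, |1> with 1-lam.
rho_proper = (lam * np.outer([1, 0], [1, 0])
              + (1 - lam) * np.outer([0, 1], [0, 1]))

# The two density matrices are numerically identical.
print(np.allclose(rho_improper, rho_proper))  # True
```

Since the two matrices coincide, whether the improper one "is" an ensemble is exactly the interpretive step the paper's axiom seems to be doing the work for.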
What do others think?
Thanks
Bill