- #1
nomadreid
Gold Member
- 1,677
- 210
I am attempting to get a good definition of the word "information", as in "information can be transferred at most at the speed of light".
Here is my attempt so far. Please indicate if this seems adequate.
Starting from the definition in information theory:
Given a probability distribution p over a random variable x, the Shannon entropy is
[itex]H(p) = -\sum_x p(x)\,\log_2 p(x)[/itex].
(I presume that for continuous variables, the sum turns into an integral.)
Information in Event E = the change in entropy upon measurement of E.
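To make the definition concrete, here is a minimal Python sketch of my own (the fair-coin example is purely illustrative): it computes the Shannon entropy of a distribution and then the information gained from a measurement, taken as the drop in entropy from the prior distribution to the posterior.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H(p) = -sum_x p(x) * log2 p(x).
    Terms with p(x) = 0 contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative example: a fair coin before measurement,
# then a definite outcome ("heads") after measurement.
prior = [0.5, 0.5]       # maximal uncertainty: H = 1 bit
posterior = [1.0, 0.0]   # no uncertainty left: H = 0 bits

info_gained = shannon_entropy(prior) - shannon_entropy(posterior)
print(info_gained)  # 1.0 bit of information from the measurement
```

On this reading, "information in event E" is just how much the measurement of E reduces the entropy of the observer's distribution.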
Therefore, for information to be transferred, there must be measurement of the beginning probability distribution and the final probability distribution.
A carrier of information is an entity that changes measurements of the probability distribution.
Hence, all fields (or force carriers) are carriers of information (even though photons seem to be the force carrier of choice).