FallenApple
Here is an animation I created in R.
I built this Markov chain of order 50 by correlating the information in one of the coordinates while randomly varying the rest. Is there an explanation for the clustering and the flattening-out as the dimension of the vector space increases? Is it because the data becomes more spread out in higher dimensions?
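The construction can be sketched roughly as follows, in Python rather than the original R (which isn't shown). The update rule, noise scale, and parameter names (`order`, `rho`) are my assumptions, illustrating "one correlated coordinate, the rest varying randomly":

```python
import numpy as np

def simulate_chain(n_steps=500, dim=10, order=50, rho=0.9, seed=0):
    """Sketch of the setup described above: coordinate 0 depends on its
    last `order` values (an order-50 chain in that coordinate), while the
    remaining coordinates are redrawn independently at every step.
    The exact update rule is a guess, not the poster's actual code."""
    rng = np.random.default_rng(seed)
    x = np.zeros((n_steps, dim))
    for t in range(1, n_steps):
        past = x[max(0, t - order):t, 0]      # up to `order` previous values
        # correlated coordinate: pulled toward its recent history, plus noise
        x[t, 0] = rho * past.mean() + rng.normal(scale=0.1)
        # remaining coordinates vary randomly, independent of the past
        x[t, 1:] = rng.normal(size=dim - 1)
    return x
```

Under this rule the correlated coordinate stays tightly bound to its history while every other coordinate wanders freely, which is one way to reproduce the qualitative behavior described.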
But that doesn't explain why the clusters themselves do not spread out, or why other clusters condense. I've run this for much larger dimensions and it appears to reach a steady state.
The plot is of the incremental changes in a Euclidean metric versus the input, so I don't know whether viewing the data as extremely spread out in high-dimensional space would translate to this plot.
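If the plotted quantity is the step-to-step Euclidean distance d_t = ||x_t − x_{t−1}|| (my reading of "incremental changes in a Euclidean metric"), then the flattening has a standard explanation: the norm of a high-dimensional random increment concentrates around its mean. A minimal illustration, in Python rather than the original R:

```python
import numpy as np

def incremental_distances(x):
    """Euclidean distance between consecutive rows of a (steps, dim) array."""
    return np.linalg.norm(np.diff(x, axis=0), axis=1)

# Concentration of measure: for i.i.d. standard-normal increments the norm
# grows like sqrt(2 * dim), but its *relative* spread shrinks as dim grows --
# consistent with clusters that flatten out instead of spreading.
rng = np.random.default_rng(1)
for dim in (2, 50, 500):
    d = incremental_distances(rng.normal(size=(2000, dim)))
    print(f"dim={dim:4d}  mean={d.mean():7.2f}  relative spread={d.std() / d.mean():.3f}")
```

So even though the points themselves occupy a vastly larger volume as the dimension grows, the plotted increments become more uniform, not less.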
Does this mean that the correlation I induced in that coordinate is strong enough to keep the cluster together regardless of how high the dimension is?