I want to consider the space of NxN real matrices as a vector space, in which any given NxN matrix can be written as a real weighted sum of at most N^2 basis matrices. I already know how this works if I assume a particular form for the inner product (e.g. [itex]\frac{1}{N}\mathrm{Tr}\left(AB\right)[/itex]).
However, here's the catch: I want to be able to first specify the N^2 basis matrices arbitrarily, and then define the inner product that will make this basis orthogonal. (I don't care how the inner product normalizes them.) I'm thinking that this would be analogous to, for example, starting with the 2-D basis [itex]\left\{\hat{X},\hat{X}+\hat{Y}\right\}[/itex], and then defining a dot-product such that [itex]\hat{X}\cdot\left(\hat{X}+\hat{Y}\right)=0[/itex].
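To make that 2-D analogy concrete, here is a quick numerical sketch (my own illustration, nothing more): if you collect the new basis vectors as the columns of a matrix B, then the bilinear form [itex]g\left(u,v\right)=u^{T}\left(B^{-1}\right)^{T}B^{-1}v[/itex] makes those columns orthonormal by construction.

```python
import numpy as np

# 2-D sketch (illustration only): basis {x_hat, x_hat + y_hat},
# collected as the columns of B.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Bilinear form that makes the columns of B orthonormal:
# g(u, v) = (B^-1 u) . (B^-1 v) = u^T (B^-1)^T (B^-1) v
Binv = np.linalg.inv(B)
M = Binv.T @ Binv

def g(u, v):
    return float(u @ M @ v)

x_hat = B[:, 0]
xy = B[:, 1]            # x_hat + y_hat
print(g(x_hat, xy))     # 0.0
print(g(x_hat, x_hat))  # 1.0
```

So the "dot product" that declares an arbitrary (invertible) basis orthonormal is just the ordinary dot product taken in coordinates with respect to that basis.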
I'm not sure how many restrictions I can allow on the basis, but there are at least two restrictions that I'm indifferent to: 1) no two distinct basis matrices are scalar multiples of each other, and 2) every basis matrix has a nonvanishing square. If either or both of these restrictions make the problem easier, then great; but I also don't mind if the basis violates them.
I do need to make one condition on the inner product, however. I don't know if this is always true of inner products, but I need mine to be bilinear. The reason for this is, of course, that I want to use it to pull out "components" in the basis directions.
Now I will try to say this as mathematically as I know how:
Given a set of m matrices,
[tex]
S_A=\left\{A_1,A_2,\ldots,A_m\right\}
[/tex]
s.t.
[tex]
A_j\neq{}rA_i
[/tex]
for any
[tex]
r\in\mathbb{R}
[/tex]
unless i=j, and
[tex]
A_i^2\neq0\forall{}i
[/tex]
I want to find a function
[tex]
g:\mathrm{span}\left(S_A\right)\times\mathrm{span}\left(S_A\right)\rightarrow\mathbb{R}
[/tex]
s.t.
[tex]
g\left(aA_i+A_j,bA_k+A_l\right)=ar_i\left(b\delta_{i,k}+\delta_{i,l}\right)+r_j\left(b\delta_{j,k}+\delta_{j,l}\right)
[/tex]
for any real a and b, and for some fixed nonzero real r_i and r_j.
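For what it's worth, here is a sketch of one candidate construction (an assumption on my part, not a claim that it's the only one): take coordinates with respect to the chosen matrices, and let g be the ordinary dot product of those coordinate vectors. This is bilinear on the span and gives r_i = 1 for every i.

```python
import numpy as np

# Sketch: given linearly independent matrices A_1..A_m, build a bilinear
# form g on their span that makes them orthonormal (so every r_i = 1).
def make_g(basis):
    # Stack vec(A_i) as columns of V; for X in span(basis), its coordinate
    # vector is pinv(V) @ vec(X). Let g be the dot product of coordinates.
    V = np.column_stack([A.ravel() for A in basis])
    P = np.linalg.pinv(V)
    def g(X, Y):
        return float((P @ X.ravel()) @ (P @ Y.ravel()))
    return g

# Hypothetical 2x2 example (these matrices are assumptions for illustration):
A1 = np.eye(2)
A2 = np.array([[1.0, 1.0],
               [0.0, 1.0]])   # not a scalar multiple of A1
g = make_g([A1, A2])
print(g(A1, A2))  # ~0.0 (orthogonal by construction)
print(g(A1, A1))  # ~1.0
```

Bilinearity holds because the coordinate map is linear and the dot product is bilinear; the delta-function condition above then follows term by term.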
The specific situation that sparked my interest in this was the recursion relations for the powers of the rotation generators. The specific example that I have in mind would have a basis set of four 4x4 matrices, where
[tex]
A_m=A_1^m\;\;\left(1\leq m\leq4\right)
[/tex]
and
(EDIT: I knew I had it right the first time. Removed square root from the denominator.)
[tex]
A_1^m=\sum_{n=1}^4\frac{g\left(A_1^m,A_1^n\right)}{g\left(A_1^n,A_1^n\right)}A_1^n\;\;\left(m>4\right)
[/tex]
(So this is actually for a 4-D subspace of the 16-D space of all 4x4 real matrices.)
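As a sanity check on this 4x4 picture, here is a numerical sketch using an assumed generator (a generic 4x4 antisymmetric matrix with two distinct rotation rates, chosen only for illustration). Its first four powers are linearly independent, and by Cayley-Hamilton [itex]A_1^5[/itex] lies in their span, so the expansion formula above can be verified directly.

```python
import numpy as np

# Assumed 4x4 generator (illustration only): antisymmetric, with two
# distinct rotation rates, so A, A^2, A^3, A^4 are linearly independent.
A = np.zeros((4, 4))
A[0, 1], A[1, 0] = -1.0, 1.0   # rate 1 in one plane
A[2, 3], A[3, 2] = -2.0, 2.0   # rate 2 in the other plane

powers = [np.linalg.matrix_power(A, n) for n in range(1, 5)]

# A bilinear form that makes these four powers orthonormal:
# coordinates in span{A, ..., A^4} via the pseudoinverse, then a dot product.
V = np.column_stack([P.ravel() for P in powers])
Vpinv = np.linalg.pinv(V)

def g(X, Y):
    return float((Vpinv @ X.ravel()) @ (Vpinv @ Y.ravel()))

# Expand A^5 in the basis of lower powers using the expansion formula.
A5 = np.linalg.matrix_power(A, 5)
coeffs = [g(A5, P) / g(P, P) for P in powers]
recon = sum(c * P for c, P in zip(coeffs, powers))
print(np.allclose(recon, A5))  # True
```

For this particular generator the expansion comes out as [itex]A_1^5=-4A_1-5A_1^3[/itex], which is exactly what Cayley-Hamilton predicts for its characteristic polynomial.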
I reiterate that I am not asking about the Gram-Schmidt procedure, which assumes a known g and finds a new basis of A_i's. I want to keep my original A_i's, and make them orthogonal by choosing the definition of the inner product.