Common supplementary subspaces

  • Thread starter geoffrey159
  • Start date
  • Tags
    Subspaces
In summary: Then ##D = \text{span}(f_p)##, but any vectorial line is a supplementary space of any hyperplane, so you have a supplementary space of ##H## which is ##D##.1 - If ##A = B ##, since every subspace of a finite dimensional vector space has at least one supplementary space, ##A## and ##B## share this supplementary space, which can be any vectorial line of ##E##.2 - In general, a finite dimensional vector space ##W## can be expressed as the direct sum of an hyperplane ##H## and a vectorial line ##D##. If you consider a basis ##(f_1,...,f_p)## of that
  • #1
geoffrey159

Homework Statement


Let ##E## be a finite dimensional vector space, ##A## and ##B## two subspaces with the same dimension.
Show there is a subspace ##S## of ##E## such that ##E = A \bigoplus S = B \bigoplus S ##

Homework Equations


##\text{dim}(E) = n##
##\text{dim}(A) = \text{dim}(B) = m \le n ##

## {\cal B} = (e_1,...,e_n) ## is a basis of ##E##
## A = \text{span}(e_{i_1},...,e_{i_m}) ##
## B = \text{span}(e_{j_1},...,e_{j_m}) ##

## S_A = \text{span}((e_i)_{i \neq i_1,...,i_m } )##
## S_B = \text{span}((e_i)_{i \neq j_1,...,j_m } )##

## E = A \bigoplus S_A = B \bigoplus S_B ##
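To fix ideas, here is a small example of what is being asked (my own illustration, not part of the problem statement): in ##E = \mathbb{R}^2## with the standard basis, take ##A = \text{span}(e_1)## and ##B = \text{span}(e_2)##. Then ##S_A = \text{span}(e_2)## and ##S_B = \text{span}(e_1)## are supplementary spaces of ##A## and ##B## separately, but neither one works for both, whereas the diagonal line
## S = \text{span}(e_1 + e_2) ##
satisfies ##E = A \bigoplus S = B \bigoplus S##.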

The Attempt at a Solution



I find this exercise hard and I can't finish it. Could you help please? The case ##A = B## is easy: ##S = S_A = S_B##.

I assume now that ##A\neq B ##. I want to show the result by downward induction on the common dimension ##m## of ##A## and ##B##.
1 - If ## m = n ## then ## A = B = E ## and ## S = \{0\} ## works
2 - If ##A## and ##B## are hyperplanes (##m = n-1##), then ## S_A = \text{span}(e_k) ## and ## S_B = \text{span}(e_\ell) ##, with ##k\neq \ell##.
Put ## S = \text{span}(e_k + e_\ell) ##.
- Then for any ##x \in A\cap S##, there are scalars ##(\lambda_i)_{i\neq k}## and ##\mu ## such that ## x = \sum_{i\neq k } \lambda_i e_{i} = \mu (e_k + e_\ell ) ##, hence ## 0 = \mu (e_k + e_\ell ) - \sum_{i\neq k } \lambda_i e_{i} = \mu e_k + ( \mu - \lambda_\ell) e_\ell - \sum_{i\neq k,\ell } \lambda_i e_{i} ##. Since ##{\cal B}## is a basis of ##E##, ##\mu = 0## and ##x = 0##. So ##A \cap S =\{0\}##. The same argument applies to ##B##, so ##B\cap S=\{0\}##.
- For any ##x \in E##, there are scalars ##(\lambda_i)_{i = 1...n}## such that ## x = \sum_{i= 1}^n \lambda_i e_i ##. Reordering the terms,
## x = \lambda_k (e_k + e_\ell) + ((\lambda_\ell - \lambda_k) e_\ell + \sum_{i\neq k,\ell}^n \lambda_i e_i) \Rightarrow x \in S + A \Rightarrow S + A = E ##
## x = \lambda_\ell (e_k + e_\ell) + ( (\lambda_k - \lambda_\ell) e_k + \sum_{i\neq k,\ell}^n \lambda_i e_i) \Rightarrow x \in S + B \Rightarrow S + B = E ##
The two points above show that ##E = A \bigoplus S = B \bigoplus S ##
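As a quick numerical check of this construction (my own choice of numbers): take ##n = 3##, ##A = \text{span}(e_1, e_2)## (so ##k = 3##), ##B = \text{span}(e_1, e_3)## (so ##\ell = 2##), and ##S = \text{span}(e_2 + e_3)##. For ##x = e_1 + 2 e_2 + 5 e_3##, the two decompositions above read
## x = 5(e_2 + e_3) + (e_1 - 3 e_2) \in S + A ##
## x = 2(e_2 + e_3) + (e_1 + 3 e_3) \in S + B ##
in agreement with the formulas in ##\lambda_k## and ##\lambda_\ell##.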
3 - Assume that it works for ## m = n,n-1,...,r+1 ##. I want to show it works for ##m = r ##
- If ##S_A \cap S_B \neq \emptyset ##, then there is a vector ##e_k \in S_A \cap S_B ## such that ## S_A = \text{span}(e_k)\bigoplus S_A'##, and ##S_B = \text{span}(e_k)\bigoplus S_B'##, where ## S'_A = \text{span}((e_i)_{i \neq k,i_1,...,i_m } )##
and ## S'_B = \text{span}((e_i)_{i \neq k,j_1,...,j_m } )##
So ## E = (A \bigoplus \text{span}(e_k)) \bigoplus S_A' = (B \bigoplus \text{span}(e_k)) \bigoplus S_B' ##. By induction hypothesis, there is a subspace ##S'## such that :
## E = (A \bigoplus \text{span}(e_k)) \bigoplus S' = (B \bigoplus \text{span}(e_k)) \bigoplus S' ##. So ## S = \text{span}(e_k) \bigoplus S' ## works.
- If ##S_A \cap S_B = \emptyset ##, I don't know how to finish that part!
 
  • #2
Point 3.1 does not convince me anymore, I need to rework it. Sorry.
Also, ##S_A \cap S_B \neq \emptyset ## always holds, because the intersection of two subspaces is a subspace, so it contains ##0##.
 
  • #3
I think I might have more. Point 2 was more important than I thought.

Any finite dimensional vector space can be broken into the direct sum of a hyperplane and a line (a one-dimensional subspace): ## E = H \bigoplus D ##

Point 2 of the OP shows that if ##E = H_1 \bigoplus D_1 = H_2 \bigoplus D_2##, then there exists a line ##D## such that ## E = H_1 \bigoplus D = H_2 \bigoplus D ##.

So now assume that ##S_A## and ##S_B## both have dimension 2. They can both be broken into a direct sum of a hyperplane and a line, so:
## E = A \bigoplus S_A = A \bigoplus ( H_1 \bigoplus D_1) = (A \bigoplus H_1) \bigoplus D_1 ##,
## E = B \bigoplus S_B = B \bigoplus ( H_2 \bigoplus D_2) =(B \bigoplus H_2) \bigoplus D_2 ##.
We have that ##A \bigoplus H_1## and ##B \bigoplus H_2## are hyperplanes of ##E##, so there is a line ##D## such that
##E = (A \bigoplus H_1) \bigoplus D = (B \bigoplus H_2)\bigoplus D ##. Commuting ##H_1## with ##D##, and ##H_2## with ##D##, gives ## E = (A \bigoplus D) \bigoplus H_1 = (B \bigoplus D) \bigoplus H_2 ##. Applying the hyperplane result again to the hyperplanes ##A \bigoplus D## and ##B \bigoplus D## gives a common line ##D'##, so in the end ## E = A \bigoplus (D \bigoplus D') = B \bigoplus (D \bigoplus D') ##

Repeat this operation by induction. Is this idea good?
 
  • #4
If ##A## is a subspace of ##E## then, given any basis ##b = \{v_1, v_2, ..., v_m\}## of ##A##, there exists a basis of ##E## that contains ##b##.
 
  • #5
Thank you for your reply. I really need some feedback on this exercise, which is horrible!
I understand your comment: in the general case, I should have completed bases of ##A## and ##B## with vectors of the basis ##{\cal B}## of ##E##. To tell you the truth, I was completely helpless at first, so I took a less general case in order to see more clearly. Please allow me to restate the argument, taking your comment into account.

Let ##{\cal B} = \{ e_1, ..., e_n\}## be a basis of ##E##

0 - If ##A = B ##, since every subspace of a finite dimensional vector space has at least one supplementary space, ##A## and ##B## share this supplementary space.

1 - If ##A## and ##B## have dimension ##n##, then ##A = B = E##, and with remark 0, they have a common supplementary space which is ##S=\{0\}##

2 - In general, a finite dimensional vector space ##W## can be expressed as the direct sum of a hyperplane ##H## and a line ##D##. If you consider a basis ##(f_1,...,f_p)## of that vector space, put ##H = \text{span}(f_1,...,f_{p-1})## and ## D = \text{span}(f_p)##. It is easy to see that ## W = H + D## and ## H \cap D = \{0\} ##, so ##W = H \bigoplus D##.

3 - If ##A## and ##B## have dimension ##n-1##, they are hyperplanes of E. One can complete a basis of ##A## (respectively ##B##) with one vector ##e_A \in {\cal B}## (respectively ##e_B##) in order to form a new basis of ##E##. Setting ##D_A = \text{span}(e_A)##, ##D_B = \text{span}(e_B)##, and following remark 2, then ## E = A \bigoplus D_A = B \bigoplus D_B##.

4 - If ## D_A = D_B##, or more generally if one of ##D_A##, ##D_B## happens to be a supplementary space of both ##A## and ##B##, then it is a common supplementary space and we are done.

5 - Assume now that neither ##D_A## nor ##D_B## is a supplementary space of both ##A## and ##B##. Since ##B## is a hyperplane, either ##e_A \in B## or ##E = B \bigoplus D_A##; the second option is excluded, so ##e_A \in B##, and likewise ##e_B \in A##. Set ##D = \text{span}(e_A + e_B) ##. I want to show that ##E = A \bigoplus D## (respectively ##E = B \bigoplus D##); a small example follows the three points below.
  • We already know that ##A + D \subset E##. Any ##x\in E = A \bigoplus D_A## can be written uniquely as ##x = a + \lambda e_A## with ##a\in A## and ##\lambda## a scalar. Since ##e_B \in A##, we get ## x = (a - \lambda e_B) + \lambda (e_A + e_B) \in A + D ##, so ## E \subset A + D ##. That shows ## E = A + D ##.
  • Now we need to show that ##A\cap D = \{0\}##. If ##x \in A \cap D##, then it has the form ## x = \sum_{i = 1}^{n-1} \lambda_i a_i = \mu (e_A + e_B) ##, where ##(a_1,...,a_{n-1})## is a basis of ##A##. Furthermore ##e_B \in A ##, so ## e_B = \sum_{i = 1}^{n-1} \alpha_i a_i##. It follows that ## \mu e_A + \sum_{i=1}^{n-1} (\mu \alpha_i - \lambda_i) a_i = 0##. Since ##(a_1,...,a_{n-1}, e_A)## is a basis of ##E##, ##\mu = 0## and ## x = 0##. That shows ##A\cap D = \{0\}##.
  • Doing the same thing for ##B##, I get what I want.
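A small example of this step (my own, in ##\mathbb{R}^3##): take ##A = \text{span}(e_1, e_2)## and ##B = \text{span}(e_2, e_3)##, completed by ##e_A = e_3## and ##e_B = e_1##. Here ##e_A \in B## and ##e_B \in A##, so neither ##D_A = \text{span}(e_3)## nor ##D_B = \text{span}(e_1)## is a supplementary space of the other hyperplane, while
## D = \text{span}(e_1 + e_3) ##
satisfies ##E = A \bigoplus D = B \bigoplus D##.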

6 - Assume that any two subspaces of ##E## with the same dimension greater than or equal to ##r + 1## have a common supplementary space.
Now assume ##A## and ##B## have dimension ##r##. Let ##S_A## and ##S_B## be supplementary spaces of ##A## and ##B##, and split each of them, as in remark 2, into a line and a hyperplane of itself: ##S_A = D_A \bigoplus H_A## and ##S_B = D_B \bigoplus H_B##. Then
## E = A \bigoplus S_A = A \bigoplus (D_A \bigoplus H_A) = (A \bigoplus D_A) \bigoplus H_A ##
## E = B \bigoplus S_B = B \bigoplus (D_B \bigoplus H_B) = (B \bigoplus D_B) \bigoplus H_B ##
Both ##A \bigoplus D_A## and ##B \bigoplus D_B## have dimension ##r+1##, so by the induction hypothesis they have a common supplementary space ##H## in ##E##.
Commuting the terms, ## E = ( A \bigoplus H ) \bigoplus D_A = (B\bigoplus H) \bigoplus D_B ##. Now ##A \bigoplus H## and ##B \bigoplus H## are hyperplanes of ##E##, so by points 3-5 they have a common supplementary line ##D##.
It follows that ## E = A \bigoplus (H \bigoplus D) = B \bigoplus (H \bigoplus D)##, so ##S = H \bigoplus D## is a common supplementary space of ##A## and ##B##. This completes the induction.
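To see the induction step once in coordinates (my own example, in ##\mathbb{R}^4## with ##r = 2##): take ##A = \text{span}(e_1, e_2)## and ##B = \text{span}(e_2, e_3)##, with ##S_A = \text{span}(e_3, e_4)## and ##S_B = \text{span}(e_1, e_4)##. Splitting ##S_A = \text{span}(e_3) \bigoplus \text{span}(e_4)## and ##S_B = \text{span}(e_1) \bigoplus \text{span}(e_4)##, the subspaces ##A \bigoplus D_A = \text{span}(e_1,e_2,e_3)## and ##B \bigoplus D_B = \text{span}(e_1,e_2,e_3)## coincide, so ##H = \text{span}(e_4)## is a common supplementary space for them. The hyperplanes ##A \bigoplus H = \text{span}(e_1,e_2,e_4)## and ##B \bigoplus H = \text{span}(e_2,e_3,e_4)## then admit the common supplementary line ##D = \text{span}(e_1 + e_3)## by point 5, and indeed
## S = H \bigoplus D = \text{span}(e_4, \, e_1 + e_3) ##
is a common supplementary space of ##A## and ##B## in ##\mathbb{R}^4##.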
 
  • #6
I apologize for putting my thread on top of the list like that, I know it's not very polite :sorry:, but I would like to know if post #5 answers the question.
 

Related to Common supplementary subspaces

1. What are common supplementary subspaces?

A subspace ##S## is a supplementary space (a complement) of a subspace ##A## of ##E## when ##A \cap S = \{0\}## and ##A + S = E##, that is, ##E = A \bigoplus S##. A common supplementary subspace of two subspaces ##A## and ##B## is a single subspace ##S## that is supplementary to both at once: ##E = A \bigoplus S = B \bigoplus S##.

2. How are common supplementary subspaces related to linear independence?

They are related because ##E = A \bigoplus S## holds exactly when the union of a basis of ##A## and a basis of ##S## is a linearly independent family, and hence a basis of ##E##. Conversely, splitting a basis of ##E## into two disjoint parts produces two supplementary subspaces.

3. Can there be more than two common supplementary subspaces?

Yes. When a common supplementary subspace exists, it is generally not unique. For example, in a three-dimensional vector space, two distinct planes through the origin are both complemented by any line that lies in neither of them, and there are infinitely many such lines.

4. How can common supplementary subspaces be used in applications?

Common supplementary subspaces can be used in applications such as data compression and image processing. By finding common supplementary subspaces in a set of data or images, redundant information can be removed, resulting in a more efficient representation of the data.

5. What is the difference between common supplementary subspaces and orthogonal subspaces?

Being supplementary is a purely linear-algebraic condition: it only asks that ##E = A \bigoplus S##. Orthogonality, on the other hand, requires an inner product. In an inner-product space the orthogonal complement ##A^{\perp}## is one particular supplementary space of ##A##, but a supplementary space of ##A## need not be orthogonal to it.
