A question on PSD (positive semidefinite) completion

  • Thread starter NaturePaper
In summary, the thread asks whether there is a choice of c that makes a given matrix A positive semidefinite (PSD). It is noted that if c=a_1\bar{a}_2, then A is PSD, and it is conjectured that no other choice of c works. The discussion mentions the additional normalization Tr(A)=1 and the possibility of searching for a counterexample by setting some of the a_i to zero. Ultimately, the thread concludes with a proof that the PSD completion of this type of matrix is unique.
  • #1
NaturePaper
Please help me to solve this question:

Let [tex] A=\left[\begin{array}{ccccccc}
|a_0|^2 &a_0\bar{a}_1 &a_0\bar{a}_2 &0 &a_0\bar{a}_3 &0&0\\
\bar{a}_0a_1 &|a_1|^2 & c &0 &a_1\bar{a}_3 &0&0\\
\bar{a}_0a_2 &\bar{c} &|a_2|^2 &0 &a_2\bar{a}_3 &0&0\\
0 &0 &0 &0 &0 &0&0\\
\bar{a}_0a_3&\bar{a}_1a_3 &\bar{a}_2a_3 &0 &|a_3|^2 &0&0\\
0 &0 &0 &0 &0 &0 &0 \\
0 &0 &0 &0 &0 &0 &0
\end{array}\right][/tex].

We note that if we choose [tex]c=a_1\bar{a}_2[/tex], then [tex] A=bb^\dag,\mbox{ with } b=(a_0~ a_1~ a_2~ 0~ a_3~ 0 ~0)^T[/tex], and hence A is PSD. Now my guess is that there does not exist any other choice of c that makes A PSD, i.e., A is PSD iff [tex]c=a_1\bar{a}_2[/tex]. Can anyone help me prove this (or give a counterexample)?

One can assume the additional condition Tr(A)=1. To look for a counterexample, one may set some of the [tex]a_i,\quad i=0,1,2,3[/tex] to zero. Please help me.
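A quick numerical sanity check (not a proof) supports the conjecture. The sketch below, assuming NumPy and arbitrary sample values for the a_i, builds A, verifies that c = a_1\bar{a}_2 gives a PSD matrix, and shows that perturbing c produces a negative eigenvalue:

```python
# Numerical sanity check (not a proof): with c = a_1 * conj(a_2) the matrix A
# equals b b^dagger and is PSD; perturbing c makes an eigenvalue negative.
import numpy as np

a = np.array([1 + 0.5j, 0.8 - 0.3j, -0.6 + 0.7j, 0.4 + 0.1j])  # arbitrary sample values

def build_A(c):
    b = np.array([a[0], a[1], a[2], 0, a[3], 0, 0])
    A = np.outer(b, b.conj())   # rank-one b b^dagger; its (2,3) entry is a_1 * conj(a_2)
    A[1, 2] = c                 # overwrite the free entry and its conjugate
    A[2, 1] = np.conj(c)
    return A

c0 = a[1] * np.conj(a[2])
min_eig_good = np.linalg.eigvalsh(build_A(c0)).min()       # ~0 up to rounding: PSD
min_eig_bad = np.linalg.eigvalsh(build_A(c0 + 0.1)).min()  # strictly negative: not PSD
print(min_eig_good, min_eig_bad)
```

Of course, this only tests one choice of the a_i; it is evidence, not an argument.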
 
  • #2
Yes, I got a proof. It's a straightforward calculation, and my guess is correct: no other choice of c can make A PSD. Can anybody give me a more illuminating proof (e.g., one based on a known theorem)?
 
  • #3
Oh... amazingly, I found a nice proof which applies not only to this problem but to all such matrices (with at least three of the a_i nonzero). The PSD completion is unique. The key to the proof is the fact that every principal minor of a PSD matrix must be nonnegative.
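For reference, one way this principal-minor argument can be made concrete (a sketch, assuming [tex]a_0\neq 0[/tex]): expanding the leading 3×3 principal minor of A gives

[tex]\det\left[\begin{array}{ccc}
|a_0|^2 & a_0\bar{a}_1 & a_0\bar{a}_2\\
\bar{a}_0a_1 & |a_1|^2 & c\\
\bar{a}_0a_2 & \bar{c} & |a_2|^2
\end{array}\right]=-|a_0|^2\,\left|c-a_1\bar{a}_2\right|^2.[/tex]

Since every principal minor of a PSD matrix is nonnegative, this determinant must be [tex]\geq 0[/tex], which forces [tex]c=a_1\bar{a}_2[/tex]. If instead [tex]a_0=0[/tex] but [tex]a_3\neq 0[/tex], the analogous principal minor on rows and columns 2, 3, 5 yields [tex]-|a_3|^2|c-a_1\bar{a}_2|^2[/tex] and the same conclusion.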
 

Related to A question on PSD (positive semidefinite) completion

1. What is positive semidefinite (PSD) completion?

Positive semidefinite (PSD) completion is the problem of filling in the unspecified entries of a partially specified matrix so that the resulting matrix is positive semidefinite, meaning it is Hermitian and all of its eigenvalues are non-negative.
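As a minimal illustration (assuming NumPy), consider the partial matrix [[1, ?], [?, 1]]: any real completion c with |c| ≤ 1 is PSD, since the eigenvalues of [[1, c], [c, 1]] are 1 + c and 1 − c.

```python
# Checking positive semidefiniteness via the smallest eigenvalue.
import numpy as np

def is_psd(M, tol=1e-10):
    """True if the Hermitian matrix M has no eigenvalue below -tol."""
    return bool(np.linalg.eigvalsh(M).min() >= -tol)

psd_ok = is_psd(np.array([[1.0, 0.5], [0.5, 1.0]]))   # |c| <= 1: PSD
psd_bad = is_psd(np.array([[1.0, 2.0], [2.0, 1.0]]))  # |c| > 1: not PSD
print(psd_ok, psd_bad)
```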

2. Why is PSD completion important?

PSD completion is important because positive semidefinite matrices arise throughout statistics, optimization, and control theory; for example, covariance matrices and kernel matrices must be PSD. Completing a partially specified matrix to a PSD one lets partially observed data or constraints be used in methods that require a full PSD matrix.

3. How is PSD completion used in practice?

In practice, PSD completion is used in a variety of ways depending on the specific problem at hand. It can be used to find feasible solutions to optimization problems, to generate covariance matrices for multivariate data analysis, or to construct positive semidefinite kernels for machine learning algorithms. It can also be used as a tool for proving the existence of solutions to certain mathematical problems.

4. What are the limitations of PSD completion?

One limitation of PSD completion is that it can be computationally expensive, especially for large matrices. Additionally, a PSD completion need not exist: a necessary condition is that every fully specified principal submatrix is itself PSD, and whether that condition is also sufficient depends on the pattern of specified entries (for chordal patterns it is). When no completion exists, other techniques must be used.

5. What are some examples of problems where PSD completion is used?

PSD completion has a wide range of applications. Examples include finding the maximum likelihood estimate of a covariance matrix from partially observed multivariate data, constructing positive semidefinite kernels for support vector machines in machine learning, and solving semidefinite programming problems in optimization. It also appears in graph theory, control theory, and signal processing.
