# [SOLVED] q(x) is a strictly convex function, show that G is positive definite

#### numbersense

##### New member
Consider the quadratic function $\displaystyle q(\textbf{x}) = \frac{1}{2} \textbf{x}^T G \textbf{x} + \textbf{d}^T \textbf{x} + c$ on $\mathbb{R}^n$, where $\textbf{d}$ is a constant $n \times 1$ vector, $G$ is a constant $n \times n$ symmetric matrix and $c$ is a scalar.

The gradient is $\nabla q(\textbf{x}) = G \textbf{x} + \textbf{d}$ and the Hessian is $\nabla^2 q(\textbf{x}) = G$.

If $q(\textbf{x})$ is a strictly convex function then show that $G$ is positive definite.

I am not sure whether I should start with the convex function definition or start by considering the gradient or the Hessian.

I tried expanding the inequality in the convex function definition but didn't get anywhere.

There is a proposition that says if $f$ is strictly convex on $\mathbb{R}^n$, then any stationary point is the unique global minimizer. (I can't even prove that a stationary point exists.) Another theorem says that positive definiteness of the Hessian is a sufficient condition for a stationary point to be the unique global minimizer, and positive semidefiniteness is a necessary condition for a local minimizer. I can't see how to use these statements to prove what the question is asking.

#### Opalg

##### MHB Oldtimer
Staff member

I don't think you need to use the gradient or the Hessian to show that $G$ is positive definite. The fact that $q$ is strictly convex tells you that $q(\textbf{0}) <\frac12\bigl(q(\textbf{x}) + q(-\textbf{x})\bigr)$, for any nonzero vector $\textbf{x}.$ It follows very easily that $0 < \textbf{x}^T G \textbf{x}$.

#### numbersense

##### New member
Thanks. It follows like this (the linear terms $\textbf{d}^T \textbf{x}$ cancel when averaging $q(\textbf{x})$ and $q(-\textbf{x})$):
\begin{align*}
q\left(\frac{1}{2} \textbf{x} + \left(1 - \frac{1}{2}\right) (-\textbf{x})\right) = q(\textbf{0}) = c &< \frac{1}{2} q(\textbf{x}) + \left(1 - \frac{1}{2}\right) q(-\textbf{x}) = \frac{1}{2}\textbf{x}^T G \textbf{x} + c\\
\textbf{x}^T G\textbf{x} &> 0
\end{align*}

This is quite clever. I didn't think of obtaining an inequality by substituting specific points into the convex function definition.
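The algebra above can also be checked numerically. Below is a small sketch (not part of the original thread) using NumPy; the matrix $G$, vector $\textbf{d}$, and scalar $c$ are arbitrary sample values, with $G$ made symmetric positive definite by construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary sample data: A @ A.T is symmetric positive semidefinite,
# and adding 4*I makes G strictly positive definite.
A = rng.standard_normal((4, 4))
G = A @ A.T + 4 * np.eye(4)
d = rng.standard_normal(4)
c = 1.5

def q(x):
    """Quadratic q(x) = 1/2 x^T G x + d^T x + c."""
    return 0.5 * x @ G @ x + d @ x + c

for _ in range(100):
    x = rng.standard_normal(4)
    lhs = q(np.zeros(4))            # q(0) = c
    rhs = 0.5 * (q(x) + q(-x))      # the d^T x terms cancel in the average
    # The gap rhs - lhs is exactly 1/2 x^T G x, so the strict midpoint
    # inequality q(0) < (q(x) + q(-x))/2 holds precisely when x^T G x > 0.
    assert np.isclose(rhs - lhs, 0.5 * x @ G @ x)
    assert lhs < rhs
```

Running this for a $G$ that is *not* positive definite would make the second assertion fail for some $\textbf{x}$, which mirrors the equivalence used in the proof.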