How can I find the optimal vector x for a constrained optimization problem?

In summary, the thread discusses the problem of finding the vector x_{n\times 1} that minimizes the function f(x) = \sum_{i=1}^{n}x_{i}^{2} subject to a linear equality constraint Ax = b. The minimum of f(x) alone is 0 at x = 0, but the rank of the m\times n matrix A may be less than m, so the standard Lagrange multiplier approach does not apply directly. Suggested approaches include extracting a set of independent constraints numerically, explicitly eliminating variables from the problem, and solving the Lagrange multiplier equation system with a singular value decomposition.
  • #1
dilbert2011
Hi all,
I am working on a project and stuck at the following problem.

Find vector [tex]x_{n\times 1}[/tex] which minimizes the function
[tex]f(x) = \sum_{i=1}^{n}x_{i}^{2}[/tex]

subject to the linear equality constraint
[tex][A]_{m\times n}\, x_{n \times 1}=b_{m\times 1}[/tex] with [tex]m\leq n[/tex]

The function f(x) trivially has a minimum value of 0 for x = 0. The equality constraint is an underdetermined system with multiple solutions, one of which is certain to minimize f(x), so I am relatively certain that a solution to this problem exists. The additional issue is that the rank of matrix A could be less than m, so the standard Lagrange multipliers approach does not work in this case.

Any suggestions on how to approach this problem?
 
  • #2
If the rank of A is less than m, then your m constraint equations are not all independent. If you convert A to a smaller set of independent constraints, you can then use Lagrange multipliers.

The "best" way to do that would probably be to figure out why the redundant constraints have got into your math model, and remove them from it. Alternatively, you can find the rank of A and a set of independent basis vectors (i.e. independent constraints) numerically.

Note that the set of independent constraints is not unique, but it doesn't matter which independent set you find: any set will give you the same solution for x.
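As a minimal sketch of that numerical route (my own illustration, not from the thread, assuming NumPy): the SVD of A reveals its rank, and the leading rows of the factorization form an equivalent constraint set with full row rank.

```python
import numpy as np

def independent_constraints(A, b, tol=1e-10):
    # SVD: A = U @ diag(s) @ Vt. If A x = b is consistent, the first
    # r = rank(A) rows of diag(s) @ Vt give an equivalent constraint
    # set (s[:r] * Vt[:r]) x = U[:, :r].T @ b with full row rank.
    U, s, Vt = np.linalg.svd(A)
    r = int(np.sum(s > tol * s[0]))        # numerical rank
    return s[:r, None] * Vt[:r], U[:, :r].T @ b
```

Any method that assumes independent constraints, such as standard Lagrange multipliers, can then be applied to the reduced system.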
 
  • #3
AlephZero,

Unfortunately, there is no escaping redundant constraints in the problem I am looking at. The approach you suggest is something I have already thought of and is certainly doable. Indeed, if the equality constraints had full row rank, the problem would be much easier to deal with.

But I would still like to know if this is the *only* way to do it before I go ahead and code this thing up.
 
  • #4
Another way is to explicitly eliminate variables from the problem. Find the largest coefficient [itex]|a_{jk}|[/itex] in [itex]A[/itex], and rewrite the jth constraint equation in the form

[itex]x_k =[/itex] (a linear expression in the other variables).

Eliminate [itex]x_k[/itex] from the other constraint equations and from the quadratic form you are minimizing.

Repeat until the remaining constraints have all their coefficients equal to zero (to within rounding error) - i.e. they were redundant.
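The constraint-reduction half of that loop might be sketched like this (my own illustration, assuming NumPy; `reduce_constraints` is a hypothetical name, and the substitution into the quadratic form being minimized is omitted):

```python
import numpy as np

def reduce_constraints(A, b, tol=1e-10):
    """Repeatedly pick the largest remaining coefficient |a_jk|, use
    constraint j to eliminate x_k from every other row, and stop when
    what is left is zero to within rounding error (i.e. redundant)."""
    A = np.asarray(A, float).copy()
    b = np.asarray(b, float).copy()
    m = A.shape[0]
    active = list(range(m))     # constraints not yet used as a pivot
    kept = []
    while active:
        sub = np.abs(A[active])
        jl, k = np.unravel_index(np.argmax(sub), sub.shape)
        if sub[jl, k] <= tol:
            break               # remaining constraints were redundant
        j = active[jl]
        for i in range(m):      # eliminate x_k from all other rows
            if i != j:
                f = A[i, k] / A[j, k]
                A[i] -= f * A[j]
                b[i] -= f * b[j]
        kept.append(j)
        active.remove(j)
    return A[kept], b[kept]     # an equivalent full-row-rank system
```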

Mathematically, this is very much the same as the previous method, but it may be simpler to implement depending on the size of the problem relative to the number of constraints, and/or the sparseness of the matrix [itex]A[/itex].

A third idea which I haven't investigated, but which might work: set up the Lagrange multiplier equation system for all the constraints, including the redundant ones, then solve it by doing a singular value decomposition (SVD) and ignoring the singular values that are zero. That certainly works for solving unconstrained least-squares problems that are underdetermined because the data points are not sufficient to fix the values of all the variables in the model. See any good numerical methods text under "SVD" or "least squares" for more details.
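A sketch of that third idea (my own illustration, assuming NumPy, whose `pinv` is SVD-based and discards near-zero singular values): build the Lagrange multiplier system for min x·x subject to Ax = b with a rank-deficient A, solve it with the pseudoinverse, and compare against the minimum-norm least-squares solution.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 6))
A = np.vstack([A, A[0] + A[1], 2.0 * A[0]])  # rank 2, two redundant rows
b = A @ rng.standard_normal(6)               # consistent right-hand side

# Lagrange multiplier (KKT) system for min x.x subject to A x = b:
#   2 x + A^T lam = 0,   A x = b
m, n = A.shape
K = np.block([[2.0 * np.eye(n), A.T],
              [A, np.zeros((m, m))]])
rhs = np.concatenate([np.zeros(n), b])

# pinv ignores the (near-)zero singular values, so the singular KKT
# matrix is handled gracefully; the x-block of the solution agrees
# with the minimum-norm least-squares solution of A x = b.
x = (np.linalg.pinv(K) @ rhs)[:n]
x_ls = np.linalg.lstsq(A, b, rcond=None)[0]
```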
 
  • #5
Let me read up on the SVD literature and see if it can work.
 

Related to How can I find the optimal vector x for a constrained optimization problem?

What is constrained optimization?

Constrained optimization is a mathematical optimization technique that involves finding the maximum or minimum of a function while satisfying a set of constraints. The constraints can be in the form of mathematical equations or inequalities.

What is the difference between constrained and unconstrained optimization?

The main difference between constrained and unconstrained optimization is that constrained optimization takes into account additional constraints that must be satisfied, while unconstrained optimization does not have any constraints.

What are some common techniques used in constrained optimization?

Some common techniques used in constrained optimization include the method of Lagrange multipliers, linear programming, nonlinear programming, and quadratic programming.
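As a tiny worked example of the method of Lagrange multipliers (my own illustration, not from the thread):

```latex
\text{Minimize } f(x,y)=x^2+y^2 \text{ subject to } g(x,y)=x+y-1=0.\\
\nabla f = \lambda \nabla g \;\Rightarrow\; 2x=\lambda,\; 2y=\lambda
\;\Rightarrow\; x=y.\\
\text{Substituting into } x+y=1 \text{ gives } x=y=\tfrac{1}{2},
\text{ so } f_{\min}=\tfrac{1}{2}.
```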

What are the applications of constrained optimization?

Constrained optimization has a wide range of applications in various fields such as engineering, economics, operations research, and machine learning. It can be used to optimize processes, design efficient systems, and make optimal decisions.

What are the challenges of solving constrained optimization problems?

One of the main challenges of solving constrained optimization problems is finding an optimal solution that satisfies all the constraints. Another challenge is that the solution space can be complex and high-dimensional, making it difficult to find the global optimum.
