Constrained Optimization in Engineering Design

Main.ConstrainedOptimization History

June 21, 2020, at 04:35 AM by 136.36.211.159 -

Added lines 29-41:

Interior Point Example

A refinery must produce 100 gallons of gasoline and 160 gallons of diesel to meet customer demands. The refinery would like to minimize the cost of crude, and two crude options exist. The less expensive crude costs $80 USD per barrel, and each barrel produces 10 gallons of gasoline and 20 gallons of diesel. The more expensive crude costs $95 USD per barrel, and each barrel produces 15 gallons of gasoline and 15 gallons of diesel. Find the number of barrels of each crude that minimizes the refinery cost while satisfying the customer demands.

  • Solve Refinery Optimization Problem with Continuous Variables
  • Solve Refinery Optimization Problem with Integer Variables

(:html:) <iframe width="560" height="315" src="//www.youtube.com/embed/M_mpRrGKKMo?rel=0" frameborder="0" allowfullscreen></iframe> (:htmlend:)
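
The continuous version of this problem is a small linear program. Below is a minimal sketch using SciPy's linprog (an assumed tool choice, not part of the original exercise), with the demands treated as minimum requirements:

    # Refinery crude selection as a linear program (illustrative sketch)
    from scipy.optimize import linprog

    # x[0] = barrels of less expensive crude ($80/bbl)
    # x[1] = barrels of more expensive crude ($95/bbl)
    c = [80, 95]  # minimize total crude cost

    # Demands as minimum requirements, rewritten in A_ub @ x <= b_ub form:
    #   10*x0 + 15*x1 >= 100  (gallons of gasoline)
    #   20*x0 + 15*x1 >= 160  (gallons of diesel)
    A_ub = [[-10, -15],
            [-20, -15]]
    b_ub = [-100, -160]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, res.fun)  # barrels of each crude and minimum cost

With these numbers both demand constraints are active at the solution, roughly 6 barrels of the less expensive crude and 2.7 barrels of the more expensive crude at a cost near $733; the integer-variable version adds the requirement that the barrel counts be whole numbers.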

March 14, 2013, at 09:29 PM by 66.178.116.12 -
Added lines 28-29:
  1. A. Wächter and L. T. Biegler, On the Implementation of an Interior-Point Filter Line-Search Algorithm for Large-Scale Nonlinear Programming, Mathematical Programming 106(1), pp. 25-57, 2006. Download PDF

December 22, 2012, at 04:28 PM by 69.169.188.188 -
Changed lines 1-3 from:

(:title Genetic Algorithms in Engineering Design:) (:keywords genetic algorithms, continuous optimization, mathematical modeling, discrete optimization, nonlinear, optimization, engineering optimization, interior point, active set, differential, algebraic, modeling language, university course:) (:description One often encounters problems in which design variables must be selected from among a set of discrete values:)

to:

(:title Constrained Optimization in Engineering Design:) (:keywords constrained optimization, GRG, SQP, genetic algorithms, continuous optimization, mathematical modeling, discrete optimization, nonlinear, optimization, engineering optimization, interior point, active set, differential, algebraic, modeling language, university course:) (:description Theoretical and numerical fundamentals of constrained optimization for engineering design:)

December 22, 2012, at 04:25 PM by 69.169.188.188 -
Added lines 1-26:

(:title Genetic Algorithms in Engineering Design:) (:keywords genetic algorithms, continuous optimization, mathematical modeling, discrete optimization, nonlinear, optimization, engineering optimization, interior point, active set, differential, algebraic, modeling language, university course:) (:description One often encounters problems in which design variables must be selected from among a set of discrete values:)

Chapter 6: Constrained Optimization, Part I

We now begin our discussion of gradient-based constrained optimization. Recall that we looked at gradient-based unconstrained optimization and learned about the necessary and sufficient conditions for an unconstrained optimum, various search directions, conducting a line search, and quasi-Newton methods. We will build on that foundation as we extend the theory to problems with constraints.

At an unconstrained local optimum, there is no direction in which we can move to improve the objective function. We can state the necessary conditions mathematically as ∇f = 0. At a constrained local optimum, there is no feasible direction in which we can move to improve the objective. That is, there may be directions from the current point that will improve the objective, but these directions point into infeasible space.

The necessary conditions for a constrained local optimum are called the Kuhn-Tucker Conditions, and these conditions play a very important role in constrained optimization theory and algorithm development.
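
For reference, a common statement of these conditions (sign conventions vary between texts) for minimizing f(x) subject to inequality constraints g_i(x) ≤ 0 and equality constraints h_j(x) = 0 is:

    \nabla f(x^*) + \sum_i \lambda_i \nabla g_i(x^*) + \sum_j \mu_j \nabla h_j(x^*) = 0
    g_i(x^*) \le 0,  h_j(x^*) = 0      (feasibility)
    \lambda_i \ge 0                    (nonnegative multipliers on inequalities)
    \lambda_i g_i(x^*) = 0             (complementary slackness)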


Chapter 7: Constrained Optimization, Part II

In the previous section we examined the necessary and sufficient conditions for a constrained optimum. We did not, however, discuss any algorithms for constrained optimization. The purpose of this section is to review three popular techniques for constrained optimization:

  1. Sequential Quadratic Programming (SQP)
  2. Generalized Reduced Gradient (GRG)
  3. Interior Point Methods (IP)

SQP works by solving for a point where the KT equations are satisfied. SQP is a very efficient algorithm in terms of the number of function calls needed to reach the optimum. It converges to the optimum by simultaneously improving the objective and reducing infeasibility of the constraints. Only the optimal design is guaranteed to be feasible; intermediate designs may be infeasible.
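
As a small illustration, here is a sketch using SciPy's SLSQP method (an SQP-type solver); the toy problem below is illustrative and not from the text:

    # Minimize x0^2 + x1^2 subject to x0 + x1 >= 1 with an SQP-type solver
    import numpy as np
    from scipy.optimize import minimize

    f = lambda x: x[0]**2 + x[1]**2
    cons = [{'type': 'ineq', 'fun': lambda x: x[0] + x[1] - 1.0}]  # 'ineq' means fun(x) >= 0

    res = minimize(f, x0=np.array([2.0, 0.0]), method='SLSQP', constraints=cons)
    print(res.x)  # approximately [0.5, 0.5]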

The GRG algorithm works by computing search directions which improve the objective and satisfy the constraints, and then conducting line searches in a very similar fashion to the algorithms we studied previously. GRG requires more function evaluations than SQP, but it has the desirable property that it stays feasible once a feasible point is found. If the optimization process is halted before the optimum is reached, the designer is guaranteed to have in hand a better design than the starting design. GRG also appears to be more robust (able to solve a wider variety of problems) than SQP, so for engineering problems it is often the algorithm tried first.

Interior point methods are best suited for very large-scale problems with many degrees of freedom (design variables). Interior point methods are also among the simplest of the three to implement in code. We will work with interior point methods to investigate the algorithmic details of constrained optimization.
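
As a rough illustration of the barrier idea behind interior point methods, here is a minimal log-barrier sketch on a toy problem; the problem, helper names, and parameter values are illustrative assumptions, not from the text:

    # Log-barrier (interior point) sketch: solve a sequence of unconstrained
    # subproblems while shrinking the barrier parameter mu.
    import numpy as np
    from scipy.optimize import minimize

    def f(x):
        # objective: squared distance to the point (2, 1)
        return (x[0] - 2.0)**2 + (x[1] - 1.0)**2

    def g(x):
        # inequality constraint in g(x) <= 0 form:  x0 + x1 - 2 <= 0
        return x[0] + x[1] - 2.0

    def barrier_solve(x0, mu=1.0, shrink=0.1, outer_iters=6):
        x = np.array(x0, dtype=float)
        for _ in range(outer_iters):
            def phi(z, mu=mu):
                gz = g(z)
                if gz >= 0:
                    return np.inf  # outside the interior: reject this point
                return f(z) - mu * np.log(-gz)
            # unconstrained subproblem (derivative-free here for simplicity)
            x = minimize(phi, x, method='Nelder-Mead').x
            mu *= shrink  # weaker barrier lets iterates approach the boundary
        return x

    print(barrier_solve([0.0, 0.0]))  # approaches the constrained optimum near (1.5, 0.5)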