Main Content
Student Learning Objectives
- SLO#1: Discuss/Analyze and Implement in software — backtracking linesearch using the (1) steepest descent and (2) Newton search directions; carefully quantify convergence rates, and the total amount of work required to optimize the two-dimensional Rosenbrock function from carefully selected initial approximations.
- Assessment#1: Homework, with Practical Software Implementation, and Theoretical analysis/discussion
- Activity: Lecture
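The objective above can be sketched in a few dozen lines. The following is a minimal illustration, not a reference solution: it assumes the standard 2D Rosenbrock function, an Armijo sufficient-decrease constant of 1e-4, and a halving factor of 0.5 — all common textbook choices, not values prescribed by this course.

```python
import numpy as np

def rosenbrock(x):
    """Standard 2D Rosenbrock: f(x, y) = 100 (y - x^2)^2 + (1 - x)^2."""
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosen_grad(x):
    return np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                     200.0 * (x[1] - x[0]**2)])

def rosen_hess(x):
    return np.array([[1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
                     [-400.0 * x[0], 200.0]])

def backtracking(f, x, p, g, alpha0=1.0, rho=0.5, c=1e-4, max_halves=60):
    """Armijo backtracking: halve alpha until sufficient decrease holds."""
    alpha = alpha0
    for _ in range(max_halves):
        if f(x + alpha * p) <= f(x) + c * alpha * (g @ p):
            break
        alpha *= rho
    return alpha

def minimize(x0, direction="newton", tol=1e-8, maxit=10000):
    """Linesearch descent with either steepest-descent or Newton directions."""
    x = np.array(x0, dtype=float)
    for k in range(maxit):
        g = rosen_grad(x)
        if np.linalg.norm(g) < tol:
            return x, k
        if direction == "newton":
            p = np.linalg.solve(rosen_hess(x), -g)
            if g @ p >= 0.0:   # Hessian not positive definite here;
                p = -g         # fall back to steepest descent
        else:
            p = -g
        x = x + backtracking(rosenbrock, x, p, g) * p
    return x, maxit

x_star, iters = minimize([-1.2, 1.0], direction="newton")
```

Comparing `iters` for `direction="newton"` against `direction="sd"` (and counting function/gradient evaluations inside the backtracking loop) gives the work comparison the objective asks for; steepest descent typically needs orders of magnitude more iterations on this function.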
- SLO#2: Discuss/Analyze and Implement in software — a polynomial-interpolation-based improvement of the backtracking linesearch strategy; carefully quantify and discuss the improvement (or lack thereof) in convergence properties when applied to the (1) steepest descent and (2) Newton search directions for the two-dimensional Rosenbrock function.
- Assessment#2: Homework, with Practical Software Implementation, and Theoretical analysis/discussion
- Activity: Lecture
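One common instance of this idea — a sketch under assumptions, since the objective does not fix the polynomial degree — replaces the blind halving of plain backtracking with the minimizer of the quadratic that interpolates phi(0), phi'(0), and phi(alpha), safeguarded so the new trial step stays inside [0.1 alpha, 0.5 alpha] (the standard safeguard from the textbook treatment):

```python
import numpy as np

def rosenbrock(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosen_grad(x):
    return np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                     200.0 * (x[1] - x[0]**2)])

def interp_backtracking(f, x, p, g, alpha0=1.0, c=1e-4, max_tries=60):
    """Backtracking with quadratic interpolation: model phi(a) = f(x + a p)
    by the quadratic matching phi(0), phi'(0), phi(alpha), and jump to its
    minimizer instead of blindly halving."""
    phi0, dphi0 = f(x), g @ p
    alpha = alpha0
    for _ in range(max_tries):
        phia = f(x + alpha * p)
        if phia <= phi0 + c * alpha * dphi0:
            break
        # minimizer of the interpolating quadratic (denominator > 0
        # whenever the Armijo test fails on a descent direction)
        a_new = -dphi0 * alpha**2 / (2.0 * (phia - phi0 - dphi0 * alpha))
        # safeguard: keep the trial inside [0.1 alpha, 0.5 alpha]
        alpha = min(max(a_new, 0.1 * alpha), 0.5 * alpha)
    return alpha

x = np.array([-1.2, 1.0])
g = rosen_grad(x)
p = -g                                   # steepest-descent direction
alpha = interp_backtracking(rosenbrock, x, p, g)
```

The quantitative comparison the objective asks for amounts to counting function evaluations per accepted step for both linesearches, for each of the two search directions.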
- SLO#3: Discuss/Analyze and Implement in software — the solution of the trust-region sub-problem, using a strategy more sophisticated than the Dogleg method; visualize the approximate optimal path by applying the algorithm to a range of trust-region radii.
- Assessment#3: Homework, with Practical Software Implementation, and Theoretical analysis/discussion
- Activity: Lecture
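One strategy beyond Dogleg is the "nearly exact" solver that applies Newton's method to the secular equation 1/||p(lambda)|| = 1/Delta, solving (B + lambda I) p = -g via a Cholesky factorization at each step. The sketch below follows that textbook scheme under simplifying assumptions: it ignores the hard case, and the safeguarding of lambda is minimal.

```python
import numpy as np

def tr_subproblem(B, g, delta, maxit=100, tol=1e-10):
    """Nearly exact trust-region subproblem solver:
    minimize  g^T p + 0.5 p^T B p  subject to  ||p|| <= delta,
    by Newton iteration on 1/||p(lambda)|| - 1/delta = 0.
    Simplifying assumption: the hard case is not handled."""
    n = len(g)
    eigmin = np.linalg.eigvalsh(B)[0]
    if eigmin > 0.0:
        p = np.linalg.solve(B, -g)
        if np.linalg.norm(p) <= delta:
            return p                      # interior (unconstrained) minimizer
        lam = 0.0
    else:
        lam = -eigmin + 1e-6              # shift so B + lam I is positive definite
    for _ in range(maxit):
        A = B + lam * np.eye(n)
        L = np.linalg.cholesky(A)         # A = L L^T
        p = np.linalg.solve(A, -g)
        q = np.linalg.solve(L, p)         # lower-triangular solve L q = p
        pn = np.linalg.norm(p)
        if abs(pn - delta) < tol * delta:
            break
        lam += (pn / np.linalg.norm(q))**2 * (pn - delta) / delta
        lam = max(lam, max(0.0, -eigmin) + 1e-12)   # keep A positive definite
    return p

# Example: a boundary-constrained subproblem
B = np.array([[2.0, 0.0], [0.0, 10.0]])
g = np.array([1.0, 1.0])
p = tr_subproblem(B, g, delta=0.1)
```

Sweeping `delta` over a grid of radii and collecting the returned `p` vectors traces out the (approximate) optimal path that the objective asks to visualize.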
- SLO#4: Discuss/Analyze and Implement in software — the standard Conjugate Gradient method, and use it to solve (1) discretizations (using as many points as the current computational environment allows) of the Poisson Equation in one, two, and three dimensions using the Helical Coordinate Preconditioner for the Laplacian; and (2) linear systems involving the Hilbert matrix of sizes 5-by-5 to 20-by-20. Discuss the strengths and limitations of the algorithm.
- Assessment#4: Homework, with Practical Software Implementation, and Theoretical analysis/discussion.
- Activity: Lecture
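As a starting point, the following is a minimal sketch of the standard (unpreconditioned) Conjugate Gradient iteration, applied to a small 1D Poisson discretization with homogeneous Dirichlet boundary conditions. The helical-coordinate preconditioner, the 2D/3D cases, and the large problem sizes the objective calls for are deliberately omitted here; the dense tridiagonal matrix below is for illustration only (a real implementation would use a matrix-free or sparse operator).

```python
import numpy as np

def cg(A, b, tol=1e-10, maxit=None):
    """Standard conjugate gradient for a symmetric positive definite A.
    Returns the approximate solution and the iteration count."""
    n = len(b)
    maxit = maxit or 10 * n
    x = np.zeros(n)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for k in range(maxit):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            return x, k + 1
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x, maxit

# 1D Poisson: -u'' = 1 on (0, 1), u(0) = u(1) = 0, n interior points
n = 50
h = 1.0 / (n + 1)
A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
u, iters = cg(A, np.ones(n))

# for f = 1 the discrete solution matches the exact u(x) = x(1 - x)/2
xs = np.linspace(h, 1.0 - h, n)
err = np.max(np.abs(u - xs * (1.0 - xs) / 2.0))
```

The same `cg` routine applied to the Hilbert matrix `H[i, j] = 1/(i + j + 1)` for sizes 5 through 20 illustrates the limitation side of the discussion: its condition number grows so fast that the iteration stagnates well before reaching the tolerance.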
- SLO#5: Discuss/Analyze and Implement in software — the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm; use it to find the optimum of an 18- (or higher-) dimensional version of the Rosenbrock function; compare convergence against full Newton optimization; and compute the sequence corresponding to the Quasi-Newton convergence criteria.
- Assessment#5: Homework, with Practical Software Implementation, and Theoretical analysis/discussion.
- Activity: Lecture
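A minimal BFGS sketch, under assumptions not fixed by the objective: it maintains the inverse-Hessian approximation H, uses an Armijo-only backtracking linesearch (rather than a full Wolfe linesearch), skips the update when the curvature condition s^T y > 0 fails, and starts the 18-dimensional Rosenbrock run near the minimizer to keep the illustration short.

```python
import numpy as np

def rosen(x):
    """n-dimensional Rosenbrock: sum of 100 (x_{i+1} - x_i^2)^2 + (1 - x_i)^2."""
    return np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2)

def rosen_grad(x):
    g = np.zeros_like(x)
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1]**2) - 2.0 * (1.0 - x[:-1])
    g[1:] += 200.0 * (x[1:] - x[:-1]**2)
    return g

def bfgs(f, grad, x0, tol=1e-8, maxit=5000):
    """BFGS with Armijo backtracking; H approximates the inverse Hessian."""
    x = np.array(x0, dtype=float)
    n = len(x)
    H = np.eye(n)
    g = grad(x)
    I = np.eye(n)
    for k in range(maxit):
        if np.linalg.norm(g) < tol:
            return x, k
        p = -H @ g
        alpha = 1.0
        for _ in range(60):                       # Armijo backtracking
            if f(x + alpha * p) <= f(x) + 1e-4 * alpha * (g @ p):
                break
            alpha *= 0.5
        s = alpha * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                            # curvature condition: update H
            rho = 1.0 / sy
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x, maxit

x_opt, iters = bfgs(rosen, rosen_grad, np.full(18, 0.9))
```

For the comparison against full Newton, the Hessian of the n-dimensional Rosenbrock is tridiagonal and cheap to form; the Quasi-Newton convergence sequence the objective mentions can then be monitored as, e.g., the ratio ||(B_k - Hess(x*)) p_k|| / ||p_k||, which should tend to zero along the iterates.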
- SLO#6: Demonstrate, through an interpretive dance, how the modern computational optimization methods discussed in the class can be applied to a research problem of your interest.
- Assessment#6: Project presentation covering theoretical and practical aspects of the research problem
- Activity: Lecture, Independent Research