Optimization Problems (Maxima/Minima) in Multivariable Functions
Nerd Cafe
Goal
To find the maximum or minimum values of a multivariable function, sometimes subject to conditions (e.g., restricted to a region or constraint). The worked example below treats the unconstrained case.
Step 1: Understand the Problem
For a function f(x,y), you want to find points where the function reaches:
Maximum value (peak)
Minimum value (valley)
Saddle point (neither max nor min, like a horse saddle)
These occur where the gradient vector is zero:

∇f(x, y) = (fx, fy) = (0, 0)
Step 2: Mathematical Steps for Finding Critical Points
Let’s say we have:

f(x, y) = x² + y² - 4x - 6y
Step A: Compute First-Order Partial Derivatives
fx = ∂f/∂x = 2x - 4

and

fy = ∂f/∂y = 2y - 6
Step B: Set the Partial Derivatives to Zero
2x - 4 = 0 → x = 2

2y - 6 = 0 → y = 3

So the critical point is (2, 3).
Step C: Use Second Derivative Test
Let’s define the second-order partial derivatives:

fxx = ∂²f/∂x², fyy = ∂²f/∂y², fxy = ∂²f/∂x∂y
Now compute the discriminant:

D = fxx·fyy - (fxy)²
Interpretation:
If D > 0 and fxx > 0 → local minimum
If D > 0 and fxx < 0 → local maximum
If D < 0 → saddle point
If D = 0 → the test is inconclusive
In our case:

fxx = 2, fyy = 2, fxy = 0, so D = (2)(2) - 0² = 4
Since D=4>0 and fxx=2>0, it's a local minimum at (2,3).
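For contrast, here is a quick look at the D < 0 branch using an extra illustrative function (not part of the worked problem above): g(x, y) = x² - y² has gx = 2x and gy = -2y, so its only critical point is (0, 0). There gxx = 2, gyy = -2, gxy = 0, giving D = (2)(-2) - 0² = -4 < 0, so (0, 0) is a saddle point.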
Python Code Using SymPy
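Below is a minimal SymPy sketch of Steps A, B, and C, assuming the example function f(x, y) = x² + y² - 4x - 6y used above; the variable names and printed output are illustrative choices.

```python
import sympy as sp

# Define symbols and the example function
x, y = sp.symbols('x y', real=True)
f = x**2 + y**2 - 4*x - 6*y

# Step A: first-order partial derivatives
fx = sp.diff(f, x)
fy = sp.diff(f, y)

# Step B: solve fx = 0 and fy = 0 for the critical points
critical_points = sp.solve([fx, fy], [x, y], dict=True)

# Step C: second-order partials and the discriminant D
fxx = sp.diff(f, x, 2)
fyy = sp.diff(f, y, 2)
fxy = sp.diff(f, x, y)
D = fxx * fyy - fxy**2

# Classify each critical point with the second derivative test
for point in critical_points:
    D_val = D.subs(point)
    fxx_val = fxx.subs(point)
    if D_val > 0 and fxx_val > 0:
        kind = "local minimum"
    elif D_val > 0 and fxx_val < 0:
        kind = "local maximum"
    elif D_val < 0:
        kind = "saddle point"
    else:
        kind = "test inconclusive"
    print(point, "->", kind, "| D =", D_val, ", fxx =", fxx_val)
```

For this function the script reports the single critical point {x: 2, y: 3} as a local minimum with D = 4 and fxx = 2, matching the hand computation above.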
Visualize the Function in 3D
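One way to visualize the surface and mark the local minimum is with Matplotlib's 3D plotting; the grid range, colors, and figure size below are arbitrary choices for this sketch.

```python
import numpy as np
import matplotlib.pyplot as plt

# Evaluate f(x, y) = x^2 + y^2 - 4x - 6y on a grid around the critical point (2, 3)
X, Y = np.meshgrid(np.linspace(-1, 5, 100), np.linspace(0, 6, 100))
Z = X**2 + Y**2 - 4*X - 6*Y

fig = plt.figure(figsize=(8, 6))
ax = fig.add_subplot(111, projection='3d')
ax.plot_surface(X, Y, Z, cmap='viridis', alpha=0.8)

# Mark the local minimum at (2, 3), where f(2, 3) = -13
ax.scatter(2, 3, -13, color='red', s=50)

ax.set_xlabel('x')
ax.set_ylabel('y')
ax.set_zlabel('f(x, y)')
ax.set_title('f(x, y) = x^2 + y^2 - 4x - 6y')
plt.show()
```

The bowl-shaped surface shows (2, 3) sitting at the bottom of a valley, consistent with the local minimum found above.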
Summary Table
| Step | Action |
| --- | --- |
| 1 | Compute fx and fy |
| 2 | Solve fx = 0, fy = 0 |
| 3 | Compute the second-order partials fxx, fyy, fxy |
| 4 | Find the discriminant D = fxx·fyy - (fxy)² |
| 5 | Determine max/min/saddle based on D and fxx |
Keywords
optimization, multivariable calculus, critical points, maxima, minima, saddle point, gradient, partial derivatives, second derivative test, discriminant, local minimum, local maximum, unconstrained optimization, sympy, Python math, calculus with Python, function analysis, 3D plotting, mathematical optimization, surface plot, nerd cafe