Gradient Vectors

Nerd Cafe

What is a Gradient?

The gradient of a scalar-valued function is a vector that points in the direction of the steepest increase of the function. It’s one of the most fundamental ideas in multivariable calculus.

For a function f(x,y), the gradient is defined as:

\nabla f(x,y)=\left( \frac{\partial f}{\partial x},\frac{\partial f}{\partial y} \right)

For a function f(x, y, z):

\nabla f(x,y,z)=\left( \frac{\partial f}{\partial x},\frac{\partial f}{\partial y},\frac{\partial f}{\partial z} \right)

Why is the Gradient Important?

  • It tells us the direction of the fastest increase of the function.

  • It is perpendicular (normal) to level curves (in 2D) or level surfaces (in 3D).

  • It is used in optimization problems like gradient descent.
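To illustrate that last point, here is a minimal gradient-descent sketch on f(x, y) = x² + y², repeatedly stepping against the gradient (2x, 2y). The starting point and learning rate are arbitrary choices for illustration:

```python
# Minimal gradient descent on f(x, y) = x**2 + y**2.
# The gradient is (2x, 2y); stepping against it moves toward the minimum.
x, y = 1.5, -1.0            # arbitrary starting point
lr = 0.1                    # arbitrary learning rate
for _ in range(100):
    gx, gy = 2 * x, 2 * y   # gradient of f at the current point
    x, y = x - lr * gx, y - lr * gy

print(x, y)  # both very close to 0, the minimum of f
```

Each step shrinks both coordinates by the factor (1 − 2·lr), so the iterates converge geometrically to the minimum at the origin.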

Step-by-Step Mathematical Explanation

Let’s say we have a scalar function:

f(x,y)=x^{2}+y^{2}

Step 1: Compute Partial Derivatives

\frac{\partial f}{\partial x}=2x \quad \text{and} \quad \frac{\partial f}{\partial y}=2y

Step 2: Write the Gradient Vector

\nabla f(x,y)=(2x,2y)

At the point (1, 1), the gradient becomes:

\nabla f(1,1)=(2,2)

This vector points in the direction of steepest ascent of the function.
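We can sanity-check this result numerically with a central finite-difference approximation (a sketch; the step size h is an arbitrary small value):

```python
# Numerical check of the gradient of f(x, y) = x**2 + y**2 at (1, 1)
# using central finite differences.
def f(x, y):
    return x**2 + y**2

def numerical_gradient(f, x, y, h=1e-6):
    df_dx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    df_dy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return df_dx, df_dy

gx, gy = numerical_gradient(f, 1.0, 1.0)
print(gx, gy)  # close to (2, 2), matching the analytic gradient
```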

Example 1

f(x,y)=x^{2}+y^{2}

Let’s compute and visualize gradient vectors on a 2D surface.

import numpy as np
import matplotlib.pyplot as plt

# Define the function
def f(x, y):
    return x**2 + y**2

# Define partial derivatives (gradient components)
def grad_f(x, y):
    df_dx = 2 * x
    df_dy = 2 * y
    return df_dx, df_dy

# Generate a grid
x = np.linspace(-2, 2, 20)
y = np.linspace(-2, 2, 20)
X, Y = np.meshgrid(x, y)

# Compute gradient on the grid
U, V = grad_f(X, Y)

# Plot the vector field (gradient)
plt.figure(figsize=(8, 6))
plt.quiver(X, Y, U, V, color='blue')
plt.title("Gradient Vector Field of f(x, y) = x² + y²")
plt.xlabel('x')
plt.ylabel('y')
plt.grid()
plt.axis('equal')
plt.show()

Output

Example 2

Let:

f(x,y)=x\,e^{-x^{2}-y^{2}}

Using product and chain rules:

\frac{\partial f}{\partial x}=e^{-x^{2}-y^{2}}-2x^{2}\,e^{-x^{2}-y^{2}}=(1-2x^{2})\,e^{-x^{2}-y^{2}}

and

\frac{\partial f}{\partial y}=-2xy\,e^{-x^{2}-y^{2}}
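Before plotting, these two derivatives can be verified symbolically (a sketch using SymPy, assuming it is installed):

```python
# Symbolic check of the partial derivatives of f(x, y) = x * exp(-x**2 - y**2)
import sympy as sp

x, y = sp.symbols('x y')
f = x * sp.exp(-x**2 - y**2)

df_dx = sp.diff(f, x)
df_dy = sp.diff(f, y)

# Both simplify to the expressions derived above
print(sp.simplify(df_dx - (1 - 2*x**2) * sp.exp(-x**2 - y**2)))  # 0
print(sp.simplify(df_dy - (-2*x*y) * sp.exp(-x**2 - y**2)))      # 0
```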

Python Visualization:

import numpy as np
import matplotlib.pyplot as plt

# Define the function
def f2(x, y):
    return x * np.exp(-x**2 - y**2)

def grad_f2(x, y):
    df_dx = (1 - 2*x**2) * np.exp(-x**2 - y**2)
    df_dy = -2 * x * y * np.exp(-x**2 - y**2)
    return df_dx, df_dy

# Generate grid
x = np.linspace(-2, 2, 20)
y = np.linspace(-2, 2, 20)
X, Y = np.meshgrid(x, y)

# Gradient values
U2, V2 = grad_f2(X, Y)

# Plot
plt.figure(figsize=(8, 6))
plt.quiver(X, Y, U2, V2, color='green')
plt.title("Gradient Vector Field of f(x, y) = x * exp(-x² - y²)")
plt.xlabel('x')
plt.ylabel('y')
plt.grid()
plt.axis('equal')
plt.show()

Output

Summary

| Concept      | Meaning                                 |
| ------------ | --------------------------------------- |
| Gradient     | Vector of partial derivatives           |
| Direction    | Steepest increase                       |
| Magnitude    | Rate of increase                        |
| Applications | Optimization, physics, machine learning |

Keywords

gradient vector, gradient descent, partial derivatives, optimization, scalar field, vector calculus, electric field, gravitational field, backpropagation, machine learning, neural networks, image processing, edge detection, sobel filter, heat equation, diffusion, potential function, motion planning, terrain slope, multivariable calculus, nerd cafe

