Eigenvalue Calculator

Compute eigenvalues, eigenvectors, and the characteristic polynomial for any square matrix up to 10×10. Paste your matrix, get step-by-step derivations, and verify with trace and determinant checks.


Understanding Eigenvalues and Eigenvectors

Eigenvalues and eigenvectors are central concepts in linear algebra that describe how a square matrix transforms vectors. When a matrix A multiplies an eigenvector v, the result is simply the eigenvector scaled by a constant λ (the eigenvalue): Av = λv. This equation reveals the fundamental directions of a linear transformation and the magnitude of stretching or compression along each direction.
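The defining relation Av = λv can be checked numerically. A minimal sketch using NumPy (the example matrix is hypothetical, not taken from the calculator itself):

```python
import numpy as np

# A small symmetric example matrix (hypothetical values)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns eigenvalues and column eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify A v = lambda v for every eigenpair
for i in range(len(eigenvalues)):
    v = eigenvectors[:, i]
    lam = eigenvalues[i]
    assert np.allclose(A @ v, lam * v)
```

For this matrix the characteristic polynomial is λ² − 4λ + 3, so the eigenvalues are 3 and 1.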

Definition

For an n×n matrix A, a scalar λ is an eigenvalue if there exists a nonzero vector v such that Av = λv. The vector v is called the eigenvector corresponding to λ. Every square matrix has exactly n eigenvalues (counted with multiplicity), which may be real or complex numbers.

Geometric Interpretation

Geometrically, eigenvectors point in directions that remain unchanged by the linear transformation — they may stretch, shrink, or flip, but they do not rotate. The eigenvalue gives the scaling factor: |λ| > 1 stretches the direction and |λ| < 1 compresses it; a positive eigenvalue preserves the direction, a negative one reverses it, and a complex-conjugate pair of eigenvalues indicates rotation in a plane.

Characteristic Equation

Eigenvalues are found by solving the characteristic equation det(A − λI) = 0. This produces a polynomial of degree n whose roots are the eigenvalues. For a 2×2 matrix, you solve a quadratic; for a 3×3 matrix, a cubic. Our calculator handles this computation automatically and shows every step.

Relation to Determinants & Trace

The sum of all eigenvalues equals the trace (sum of diagonal entries) of the matrix, and the product of all eigenvalues equals the determinant. These relationships provide quick sanity checks: if the determinant is zero, at least one eigenvalue must be zero, meaning the matrix is singular.
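These identities are easy to confirm in code. A short NumPy check (the matrix values are illustrative assumptions):

```python
import numpy as np

# Hypothetical 3x3 example
A = np.array([[4.0, 1.0, 0.0],
              [2.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])

eigs = np.linalg.eigvals(A)

# Sum of eigenvalues equals the trace
assert np.isclose(eigs.sum(), np.trace(A))
# Product of eigenvalues equals the determinant
assert np.isclose(eigs.prod(), np.linalg.det(A))
```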

How to Find Eigenvalues

Finding eigenvalues involves computing the characteristic polynomial and solving for its roots. The method varies by matrix size — small matrices can be solved analytically, while larger matrices require numerical algorithms. Our eigenvalue calculator handles all cases and shows the derivation at each step.

2×2 Matrices

λ² − tr(A)λ + det(A) = 0

For a 2×2 matrix [[a, b], [c, d]], the characteristic polynomial simplifies to a quadratic equation. The eigenvalues are found directly using the quadratic formula: λ = (tr ± √(tr² − 4·det)) / 2, where tr = a + d and det = ad − bc.
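The quadratic formula above translates directly into code. A sketch with assumed entry values (for a negative discriminant you would switch to the `cmath` module to get the complex-conjugate pair):

```python
import math

# Hypothetical 2x2 matrix [[a, b], [c, d]]
a, b, c, d = 4.0, 2.0, 1.0, 3.0

tr = a + d            # trace = 7
det = a * d - b * c   # determinant = 10

disc = tr**2 - 4 * det          # discriminant = 9
lam1 = (tr + math.sqrt(disc)) / 2
lam2 = (tr - math.sqrt(disc)) / 2
```

Here λ₁ = 5 and λ₂ = 2; note that λ₁ + λ₂ = 7 = tr and λ₁·λ₂ = 10 = det, as the trace and determinant identities require.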

3×3 Matrices

det(A − λI) = 0 → cubic

The characteristic polynomial of a 3×3 matrix is a cubic equation. It can be solved using Cardano's formula or by finding one rational root through inspection, then factoring to a quadratic. Symmetric 3×3 matrices always have three real eigenvalues.
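One way to sketch this numerically is to build the characteristic polynomial's coefficients and hand them to a polynomial root finder. NumPy's `np.poly` and `np.roots` make the pipeline explicit (note that `np.poly` itself works via the eigenvalues, so this is an illustration of the polynomial route rather than an independent algorithm):

```python
import numpy as np

# Symmetric 3x3 example; symmetric matrices have three real eigenvalues
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])

# Coefficients of the monic characteristic polynomial det(lambda*I - A)
coeffs = np.poly(A)
# Its roots are the eigenvalues: 2 - sqrt(2), 2, 2 + sqrt(2)
roots = np.roots(coeffs)
```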

Characteristic Polynomial

p(λ) = det(A − λI)

The characteristic polynomial encodes all eigenvalue information. Its degree equals the matrix dimension, and its roots are the eigenvalues. The coefficients relate to matrix invariants: the trace, sums of 2×2 minors, and the determinant appear as coefficients.
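In the monic convention p(λ) = det(λI − A), the coefficient of λ^(n−1) is −tr(A) and (for a 2×2 matrix) the constant term is det(A). A quick NumPy check with an assumed example matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# np.poly returns [1, -tr(A), det(A)] for a 2x2 matrix
coeffs = np.poly(A)

assert np.isclose(coeffs[1], -np.trace(A))       # -5
assert np.isclose(coeffs[2], np.linalg.det(A))   # -2
```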

Numerical Methods

For matrices larger than 4×4, closed-form root formulas do not exist (a consequence of the Abel–Ruffini theorem), so numerical algorithms such as the QR algorithm, power iteration, and inverse iteration are used instead. The QR algorithm repeatedly factors Aₖ = QR and forms Aₖ₊₁ = RQ; the iterates converge to a (quasi-)triangular Schur form whose diagonal entries (or 2×2 diagonal blocks, for complex-conjugate pairs) give the eigenvalues.
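The unshifted iteration can be sketched in a few lines. This is a teaching sketch, not the calculator's implementation: production code first reduces A to Hessenberg form and adds shifts for speed and robustness.

```python
import numpy as np

def qr_eigenvalues(A, iters=500):
    """Unshifted QR iteration (minimal sketch; assumes real,
    distinct eigenvalue magnitudes so the iterates converge)."""
    Ak = np.array(A, dtype=float)
    for _ in range(iters):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q            # similar to A, drifts toward triangular
    return np.diag(Ak)        # diagonal approximates the eigenvalues

A = np.array([[5.0, 2.0],
              [2.0, 1.0]])

# Agrees with the library eigenvalue routine
assert np.allclose(sorted(qr_eigenvalues(A)),
                   sorted(np.linalg.eigvals(A).real))
```

Each step Aₖ₊₁ = RQ = Qᵀ Aₖ Q is a similarity transform, so every iterate has the same eigenvalues as A.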

Special Matrices

Symmetric matrices always have real eigenvalues and orthogonal eigenvectors. Triangular matrices have eigenvalues equal to their diagonal entries. Orthogonal matrices have eigenvalues with absolute value 1. Knowing these properties can simplify computation and verify results.
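Each of these properties can be spot-checked with small assumed examples:

```python
import numpy as np

# Symmetric: real eigenvalues, orthonormal eigenvectors
S = np.array([[2.0, 1.0], [1.0, 3.0]])
w, V = np.linalg.eigh(S)                  # eigh exploits symmetry
assert np.allclose(V.T @ V, np.eye(2))    # eigenvectors orthonormal

# Triangular: eigenvalues are the diagonal entries
T = np.array([[4.0, 7.0], [0.0, 9.0]])
assert np.allclose(sorted(np.linalg.eigvals(T)), [4.0, 9.0])

# Orthogonal (here a rotation): eigenvalues satisfy |lambda| = 1
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0)
```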

Verification

Always verify eigenvalue results: the sum of eigenvalues must equal the trace of the matrix, and the product must equal the determinant. Additionally, substituting each eigenvalue back into (A − λI)v = 0 should yield a nontrivial solution for the eigenvector.
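The residual check can be automated: for every computed eigenpair, (A − λI)v should be numerically zero. A sketch with an assumed example matrix:

```python
import numpy as np

A = np.array([[6.0, 2.0, 1.0],
              [2.0, 3.0, 1.0],
              [1.0, 1.0, 1.0]])

w, V = np.linalg.eig(A)

# Residual check: (A - lambda*I) v should be ~0 for each eigenpair
for lam, v in zip(w, V.T):
    residual = (A - lam * np.eye(3)) @ v
    assert np.linalg.norm(residual) < 1e-10
```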

Applications of Eigenvalues

Eigenvalue analysis is one of the most widely used techniques in applied mathematics, appearing in fields from machine learning and data science to physics and engineering. Understanding eigenvalues unlocks powerful tools for dimensionality reduction, stability analysis, and signal processing.

PCA in Data Science

  • Principal Component Analysis uses eigenvalues of the covariance matrix
  • Eigenvalues indicate the variance explained by each principal component
  • Enables dimensionality reduction while preserving maximum information
  • Widely used in image recognition, genomics, and recommendation systems

Vibration Analysis

  • Eigenvalues of stiffness and mass matrices give natural frequencies
  • Critical for designing bridges, buildings, and mechanical structures
  • Prevents resonance failures by identifying dangerous frequency modes
  • Used in automotive, aerospace, and civil engineering simulations

Quantum Mechanics

  • Observable quantities are eigenvalues of Hermitian operators
  • Energy levels of atoms are eigenvalues of the Hamiltonian matrix
  • Eigenstates represent stable quantum states of a system
  • Foundation of spectroscopy and quantum computing algorithms

Google PageRank

  • PageRank is the dominant eigenvector of the web link matrix
  • The corresponding eigenvalue (always 1) ensures a stable ranking
  • Power iteration was used to compute rankings for billions of pages
  • Demonstrates how eigenvalue theory powers modern search engines

Frequently Asked Questions

What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are fundamental concepts in linear algebra. For a square matrix A, an eigenvector v is a nonzero vector that, when multiplied by A, only gets scaled by a constant factor. That scaling factor is the eigenvalue λ. Mathematically, Av = λv. Eigenvectors reveal the directions along which a linear transformation acts by simple stretching or compression, and eigenvalues tell you the magnitude of that scaling.

How do you find eigenvalues of a matrix?

To find eigenvalues of a matrix A, you solve the characteristic equation det(A - λI) = 0, where I is the identity matrix and λ is the unknown eigenvalue. For a 2×2 matrix, this produces a quadratic equation. For a 3×3 matrix, you get a cubic polynomial. The roots of this polynomial are the eigenvalues. Once you have the eigenvalues, you substitute each one back into (A - λI)v = 0 and solve for the eigenvector v.

What is the characteristic polynomial?

The characteristic polynomial of a square matrix A is defined as p(λ) = det(A − λI). It is a polynomial of degree n for an n×n matrix, and its roots are the eigenvalues of A. Its coefficients encode important matrix invariants: the constant term p(0) equals the determinant of A, and in the monic convention det(λI − A) the coefficient of λ^(n−1) equals the negative trace −tr(A) (the trace is the sum of the diagonal entries). The characteristic polynomial is the foundation for computing eigenvalues analytically.

What does it mean if an eigenvalue is zero?

If a matrix has a zero eigenvalue, it means the matrix is singular (non-invertible) and its determinant is zero. Geometrically, the transformation collapses at least one dimension — some nonzero vectors are mapped to the zero vector. The eigenvectors corresponding to eigenvalue zero form the null space (kernel) of the matrix. The number of zero eigenvalues equals the dimension of the null space, which determines the rank deficiency of the matrix.
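A small NumPy sketch makes this concrete, using an assumed rank-deficient example:

```python
import numpy as np

# Rank-deficient matrix: the second row is 2x the first, so det = 0
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

w, V = np.linalg.eig(A)
assert np.isclose(np.linalg.det(A), 0.0)

# One eigenvalue is zero; its eigenvector spans the null space
zero_idx = np.argmin(np.abs(w))
v = V[:, zero_idx]
assert np.allclose(A @ v, 0.0)   # A maps v to the zero vector
```

Here the eigenvalues are 0 and 5 (they sum to the trace, 5, and multiply to the determinant, 0).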

How are eigenvalues used in real life?

Eigenvalues have widespread applications across science and engineering. In data science, Principal Component Analysis (PCA) uses eigenvalues to identify the most important directions of variance in datasets. In structural engineering, eigenvalues determine natural vibration frequencies of bridges and buildings. Google’s original PageRank algorithm used the dominant eigenvector of a web link matrix to rank pages. In quantum mechanics, eigenvalues of operators represent measurable physical quantities like energy levels.

Can a matrix have complex eigenvalues?

Yes, real matrices can have complex eigenvalues. Complex eigenvalues always come in conjugate pairs (a + bi and a - bi) for real-valued matrices. They arise when the characteristic polynomial has no real roots, which commonly happens in rotation matrices and oscillatory systems. For example, the 2×2 rotation matrix [[cosθ, -sinθ], [sinθ, cosθ]] has eigenvalues e^(iθ) and e^(-iθ). Complex eigenvalues indicate the transformation involves rotation rather than pure stretching.
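The rotation-matrix example can be verified directly:

```python
import numpy as np

theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

w = np.linalg.eigvals(R)
# Eigenvalues of a 2D rotation are the conjugate pair e^(±i*theta)
expected = np.array([np.exp(1j * theta), np.exp(-1j * theta)])
assert np.allclose(sorted(w, key=lambda z: z.imag),
                   sorted(expected, key=lambda z: z.imag))
```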
