Eigenvalues & Eigenvectors Calculator
Compute eigenvalues, eigenvectors, and the characteristic polynomial of any square matrix. Enter your matrix and get step-by-step solutions with diagonalization and spectral analysis.
Key Concepts:
- Eigenvalue λ: a scalar satisfying Av = λv
- Eigenvector v: a nonzero vector whose direction is unchanged by A
- Characteristic equation: det(A - λI) = 0
- Only square matrices have eigenvalues
Applications:
- Principal Component Analysis (PCA)
- Quantum mechanics
- Stability analysis
- Google PageRank algorithm
Understanding Eigenvalues and Eigenvectors
Eigenvalues and eigenvectors are fundamental concepts in linear algebra that reveal the intrinsic properties of a linear transformation. Given a square matrix A, an eigenvector v is a nonzero vector whose direction remains unchanged when A is applied: Av = λv. The scalar λ is the corresponding eigenvalue. Together, they decompose complex transformations into simple scalings along specific directions.
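The defining identity Av = λv is easy to verify numerically. The sketch below uses NumPy with a small 2×2 matrix chosen for illustration (the matrix values are hypothetical, not from the calculator itself):

```python
import numpy as np

# A small example matrix (hypothetical values for illustration).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and unit-norm eigenvectors (as columns).
eigenvalues, eigenvectors = np.linalg.eig(A)

# Check the defining identity Av = lambda*v for each pair.
for i in range(len(eigenvalues)):
    v = eigenvectors[:, i]
    lam = eigenvalues[i]
    assert np.allclose(A @ v, lam * v)

print(eigenvalues)
```

For this matrix the eigenvalues work out to 5 and 2, and each column of `eigenvectors` satisfies the identity to floating-point precision.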
Geometric Meaning
Geometrically, eigenvectors point along the directions that are merely stretched or compressed by the transformation, not rotated. The eigenvalue is the stretch factor: λ > 1 means stretching, 0 < λ < 1 means compression, λ < 0 means the direction is reversed (and scaled by |λ|), and λ = 0 means vectors along that direction are mapped to zero.
Characteristic Equation
Eigenvalues are found by solving the characteristic equation det(A - λI) = 0, where I is the identity matrix. This produces a polynomial of degree n for an n×n matrix. The roots of this characteristic polynomial are the eigenvalues. For 2×2 matrices, this reduces to a simple quadratic equation.
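For a small matrix, the characteristic polynomial can be formed and solved symbolically. A minimal sketch with SymPy, using a hypothetical 2×2 example:

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[4, 1],
               [2, 3]])

# det(A - lambda*I) expands to the characteristic polynomial.
char_poly = (A - lam * sp.eye(2)).det().expand()
print(char_poly)                  # lambda**2 - 7*lambda + 10
print(sp.solve(char_poly, lam))   # the eigenvalues: 2 and 5
```

As the text says, for a 2×2 matrix this is just a quadratic, and its two roots are the eigenvalues.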
Algebraic Multiplicity
The algebraic multiplicity of an eigenvalue is the number of times it appears as a root of the characteristic polynomial. For instance, if the characteristic polynomial factors as (λ - 2)³(λ - 5), then λ = 2 has algebraic multiplicity 3 and λ = 5 has algebraic multiplicity 1. The sum of all algebraic multiplicities equals n.
Geometric Multiplicity
The geometric multiplicity of an eigenvalue is the dimension of its eigenspace, which equals n minus the rank of (A - λI). Geometric multiplicity is always between 1 and the algebraic multiplicity. When geometric multiplicity equals algebraic multiplicity for every eigenvalue, the matrix is diagonalizable.
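The formula "n minus the rank of (A - λI)" can be applied directly. Below is a sketch with a hypothetical 3×3 matrix in which λ = 2 has algebraic multiplicity 2 but a one-dimensional eigenspace:

```python
import numpy as np

# Hypothetical example: lambda = 2 appears twice on the diagonal,
# but the off-diagonal 1 leaves only a one-dimensional eigenspace.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
lam = 2.0
n = A.shape[0]

# geometric multiplicity = n - rank(A - lambda*I)
geo_mult = n - np.linalg.matrix_rank(A - lam * np.eye(n))
print(geo_mult)  # 1, which is less than the algebraic multiplicity 2
```

Since 1 < 2 here, this matrix is not diagonalizable.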
Methods for Finding Eigenvalues
Several algorithms exist for computing eigenvalues, each suited to different matrix sizes and structures. For small matrices, analytical methods work well; for large matrices, iterative numerical methods are essential.
Characteristic Polynomial
The direct method: form det(A - λI) = 0 and solve the resulting polynomial. Practical for 2×2 and 3×3 matrices where the polynomial can be solved analytically using the quadratic or cubic formula. For larger matrices, this method is numerically unstable and rarely used in practice.
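For the 2×2 case the quadratic is λ² - (a + d)λ + (ad - bc) = 0, solvable with the quadratic formula. A minimal sketch (the function name and example values are my own, for illustration; it assumes real eigenvalues):

```python
import math

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the characteristic quadratic
    lambda^2 - (a + d)*lambda + (a*d - b*c) = 0."""
    tr = a + d
    det = a * d - b * c
    disc = tr * tr - 4 * det  # discriminant; negative would mean a complex pair
    root = math.sqrt(disc)    # assumes real eigenvalues for simplicity
    return (tr + root) / 2, (tr - root) / 2

print(eigenvalues_2x2(4, 1, 2, 3))  # (5.0, 2.0)
```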
QR Algorithm
The workhorse of numerical eigenvalue computation. The QR algorithm repeatedly decomposes the matrix into an orthogonal matrix Q and an upper triangular matrix R, then forms RQ. With shifts and deflation, it converges rapidly and is the basis of eigenvalue routines in MATLAB, NumPy, and LAPACK.
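The bare iteration described above — decompose into QR, recombine as RQ — can be sketched in a few lines. This is a teaching sketch without the shifts and deflation that production libraries use, so it converges slowly and is not robust for all matrices:

```python
import numpy as np

def qr_eigenvalues(A, iterations=200):
    """Basic unshifted QR iteration (no shifts or deflation).
    Each step A_k+1 = R @ Q is similar to A_k, so eigenvalues are preserved;
    for many matrices A_k approaches upper triangular form, with the
    eigenvalues appearing on the diagonal."""
    Ak = np.array(A, dtype=float)
    for _ in range(iterations):
        Q, R = np.linalg.qr(Ak)  # decompose into orthogonal Q, triangular R
        Ak = R @ Q               # recombine in reverse order
    return np.diag(Ak)

A = [[4.0, 1.0], [2.0, 3.0]]  # hypothetical example with eigenvalues 5 and 2
print(qr_eigenvalues(A))
```

In practice you would call `np.linalg.eig`, which wraps the shifted, deflated LAPACK implementation.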
Power Method
An iterative method that finds the dominant eigenvalue (largest in absolute value) and its eigenvector. Start with a random vector, then repeatedly multiply by A and normalize. The convergence rate depends on the ratio |λ₂|/|λ₁| of the two largest eigenvalue magnitudes: the smaller the ratio, the faster the convergence. Used in Google's PageRank algorithm.
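The multiply-and-normalize loop is short enough to write out. A minimal sketch (function name and example matrix are my own):

```python
import numpy as np

def power_method(A, iterations=100, seed=0):
    """Power iteration: estimates the dominant eigenvalue and a unit
    eigenvector. Converges when one eigenvalue strictly dominates
    in absolute value."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])
    for _ in range(iterations):
        v = A @ v
        v = v / np.linalg.norm(v)  # normalize to prevent overflow
    # Rayleigh quotient gives the eigenvalue estimate for the current vector.
    lam = v @ A @ v
    return lam, v

A = np.array([[4.0, 1.0], [2.0, 3.0]])  # dominant eigenvalue is 5
lam, v = power_method(A)
print(lam)
```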
Inverse Iteration
A variant of the power method that finds the eigenvalue closest to a given shift σ. Instead of multiplying by A, you solve (A - σI)x = b at each step. This converges to the eigenvalue nearest to σ and is useful when you have an approximate eigenvalue from another method.
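Replacing the multiplication by A with a linear solve against (A - σI) gives inverse iteration. A sketch under the same hypothetical example matrix, with a shift chosen near the smaller eigenvalue:

```python
import numpy as np

def inverse_iteration(A, sigma, iterations=50):
    """Inverse iteration: converges to the eigenvalue of A closest to the
    shift sigma. Solves (A - sigma*I)x = v each step rather than forming
    the inverse explicitly."""
    n = A.shape[0]
    M = A - sigma * np.eye(n)
    v = np.array([1.0] + [0.0] * (n - 1))  # simple starting vector
    for _ in range(iterations):
        v = np.linalg.solve(M, v)
        v = v / np.linalg.norm(v)
    return v @ A @ v  # Rayleigh quotient: the eigenvalue estimate

A = np.array([[4.0, 1.0], [2.0, 3.0]])   # eigenvalues 5 and 2
print(inverse_iteration(A, sigma=1.8))   # converges toward 2, nearest to 1.8
```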
Jacobi Method
Specifically designed for real symmetric matrices. The Jacobi method applies a sequence of orthogonal rotations to zero out off-diagonal elements, converging to a diagonal matrix of eigenvalues. It is highly parallelizable and guarantees real eigenvalues for symmetric inputs.
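A cyclic Jacobi sweep can be sketched directly: for each off-diagonal entry, apply the Givens rotation that zeros it. This is a teaching sketch (no convergence threshold or eigenvector accumulation), assuming a real symmetric input:

```python
import numpy as np

def jacobi_eigenvalues(A, sweeps=10):
    """Cyclic Jacobi method for a real symmetric matrix: rotate away each
    off-diagonal element in turn until the matrix is (nearly) diagonal."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    for _ in range(sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p, q]) < 1e-15:
                    continue
                # Rotation angle that zeros out A[p, q]: tan(2*theta) = 2*a_pq / (a_pp - a_qq)
                theta = 0.5 * np.arctan2(2 * A[p, q], A[p, p] - A[q, q])
                c, s = np.cos(theta), np.sin(theta)
                J = np.eye(n)
                J[p, p] = J[q, q] = c
                J[p, q], J[q, p] = -s, s  # Givens rotation in the (p, q) plane
                A = J.T @ A @ J           # similarity transform preserves eigenvalues
    return np.sort(np.diag(A))

A = [[2.0, 1.0], [1.0, 2.0]]  # symmetric example with eigenvalues 1 and 3
print(jacobi_eigenvalues(A))
```

For symmetric matrices in practice, `np.linalg.eigh` is the standard call.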
Which Method to Use?
For small matrices (up to 4×4), the characteristic polynomial approach gives exact symbolic results. For larger or numerical matrices, the QR algorithm is the standard choice. Use the power method when you only need the largest eigenvalue, and inverse iteration when refining an approximate eigenvalue.
Real-World Applications of Eigenvalues
Eigenvalue analysis is one of the most widely applied tools in mathematics, with critical uses across engineering, data science, physics, and beyond. Understanding eigenvalues unlocks powerful techniques for analyzing systems and reducing complexity.
Structural Engineering
- Natural frequency analysis of buildings and bridges
- Vibration mode shapes for mechanical structures
- Buckling load prediction in columns and beams
- Stability analysis of dynamical systems
Machine Learning (PCA)
- Principal Component Analysis for dimensionality reduction
- Covariance matrix eigenvalues for feature extraction
- Spectral clustering for unsupervised learning
- Google PageRank as a dominant eigenvector problem
Quantum Mechanics
- Energy levels as eigenvalues of the Hamiltonian operator
- Quantum states as eigenvectors of observable operators
- Spin measurements and Pauli matrix eigenvalues
- Molecular orbital theory and electronic structure
Image Compression (SVD)
- Singular Value Decomposition based on eigenvalue theory
- Low-rank matrix approximation for image storage
- Facial recognition using eigenfaces (PCA on images)
- Signal denoising by discarding small eigenvalues
Frequently Asked Questions
What is the difference between eigenvalues and eigenvectors?
Eigenvalues and eigenvectors come in pairs. An eigenvector is a nonzero vector v that, when multiplied by a matrix A, only gets scaled (not rotated). The eigenvalue λ is the scalar factor by which the eigenvector is stretched or compressed: Av = λv. For example, if Av = 3v, then v is an eigenvector with eigenvalue 3. A matrix can have multiple eigenvalue–eigenvector pairs, and together they reveal the fundamental directions and magnitudes of the linear transformation.
How do you calculate eigenvectors from eigenvalues?
Once you have an eigenvalue λ, you find its eigenvectors by solving the homogeneous system (A - λI)v = 0, where I is the identity matrix. Subtract λ from each diagonal entry of A, then row-reduce the resulting matrix to find the null space. Every nonzero vector in that null space is an eigenvector associated with λ. For repeated eigenvalues, the eigenspace may be one-dimensional or multi-dimensional depending on the rank of (A - λI).
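The null-space computation described above can be done symbolically. A sketch with SymPy, using a hypothetical 2×2 matrix whose eigenvalues are 2 and 5:

```python
import sympy as sp

A = sp.Matrix([[4, 1],
               [2, 3]])
lam = 2  # one eigenvalue of A (the other is 5)

# The eigenspace for lambda is the null space of (A - lambda*I).
eigenspace = (A - lam * sp.eye(2)).nullspace()
v = eigenspace[0]   # a basis vector for the eigenspace
print(v.T)

# Verify the defining identity Av = lambda*v.
assert A * v == lam * v
```

Any nonzero scalar multiple of `v` is equally valid, since eigenvectors are only determined up to scale.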
What is matrix diagonalization?
A square matrix A is diagonalizable if it can be written as A = PDP⁻¹, where D is a diagonal matrix of eigenvalues and P is a matrix whose columns are the corresponding eigenvectors. Diagonalization simplifies matrix computations: raising A to a power becomes Aⁿ = PDⁿP⁻¹, which is trivial since Dⁿ just raises each diagonal entry to the nth power. A matrix is diagonalizable if and only if it has n linearly independent eigenvectors.
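The power shortcut Aⁿ = PDⁿP⁻¹ is easy to check numerically. A sketch with NumPy and a hypothetical example matrix:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of P are the eigenvectors; D holds the eigenvalues on its diagonal.
eigenvalues, P = np.linalg.eig(A)

# A^5 = P D^5 P^-1: just raise each diagonal entry to the 5th power.
A5 = P @ np.diag(eigenvalues ** 5) @ np.linalg.inv(P)
assert np.allclose(A5, np.linalg.matrix_power(A, 5))
print("A^5 via diagonalization matches direct computation")
```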
How many eigenvalues can an n×n matrix have?
An n×n matrix has exactly n eigenvalues when counted with algebraic multiplicity, since the characteristic polynomial det(A - λI) = 0 is a degree-n polynomial. However, the number of distinct eigenvalues can range from 1 to n. For example, the 3×3 identity matrix has only one distinct eigenvalue (λ = 1) with algebraic multiplicity 3, while a general 3×3 matrix typically has three distinct eigenvalues.
What are repeated eigenvalues?
Repeated eigenvalues occur when the characteristic polynomial has roots with multiplicity greater than one. The algebraic multiplicity is how many times the eigenvalue appears as a root, while the geometric multiplicity is the dimension of its eigenspace (the number of linearly independent eigenvectors). When geometric multiplicity is less than algebraic multiplicity, the matrix is not diagonalizable and requires Jordan normal form instead.
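The classic defective example is a Jordan block, where the repeated eigenvalue has only one independent eigenvector. A sketch with SymPy (the matrix is the standard 2×2 illustration, not from the calculator):

```python
import sympy as sp

# lambda = 2 is a repeated root (algebraic multiplicity 2), but the
# eigenspace is only one-dimensional, so the matrix is defective.
A = sp.Matrix([[2, 1],
               [0, 2]])

print(A.eigenvals())              # {2: 2}: eigenvalue 2 with multiplicity 2
print(len(A.eigenvects()[0][2]))  # 1: geometric multiplicity is only 1
print(A.is_diagonalizable())      # False
```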
How are eigenvalues related to the trace and determinant?
The trace of a matrix (sum of diagonal entries) equals the sum of all eigenvalues, and the determinant equals the product of all eigenvalues. For a 2×2 matrix with eigenvalues λ₁ and λ₂: trace = λ₁ + λ₂ and det = λ₁λ₂. These relationships provide quick checks for your calculations and are fundamental identities in linear algebra. A matrix is singular (non-invertible) if and only if at least one eigenvalue is zero.
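These identities make a quick sanity check for any eigenvalue computation. A sketch with NumPy and a hypothetical example matrix (trace 7, determinant 10, eigenvalues 5 and 2):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigenvalues = np.linalg.eigvals(A)

# Sum of eigenvalues equals the trace; product equals the determinant.
assert np.isclose(eigenvalues.sum(), np.trace(A))        # 5 + 2 == 7
assert np.isclose(eigenvalues.prod(), np.linalg.det(A))  # 5 * 2 == 10
print("trace and determinant checks passed")
```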