Interactive Linear Algebra

Welcome to this interactive exploration of Linear Algebra, the branch of mathematics concerning vector spaces and linear mappings between them. This module provides hands-on tools to help you build intuition for concepts such as Gaussian elimination, linear transformations, and eigenvectors. By directly manipulating these mathematical objects, you can see for yourself how the core principles of linear algebra work in practice.

AI Tutor Setup

To use the AI tutor, you need a Google AI API key.

  1. Go to Google AI Studio.
  2. Click "Create API key" to get your key.
  3. Copy the key and paste it into the field above.

Learning Objectives

  • Represent structured data and linear transformations using matrices.
  • Solve systems of linear equations using matrix methods, including Gaussian elimination and the method of least squares.
  • Understand fundamental concepts of vector spaces, such as span, linear independence, and coordinate systems.
  • Visualize the geometric effects of matrix operations, including transformations, decomposition, and multiplication.
  • Identify eigenvectors and eigenvalues to understand their geometric and algebraic significance.

Matrix

A QR code is a real-world example of a matrix. It's a grid of black and white squares, where each square's color can be represented by a number (1 for black, 0 for white). Hover over the QR code below to see the underlying matrix values. Enter a URL to generate a new QR code matrix, then scan it with your smartphone!
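If you want to poke at the same idea in code, here is a minimal sketch in Python with NumPy; the 5×5 grid of 0s and 1s is invented for illustration and is not a real, scannable QR code.

```python
import numpy as np

# A tiny 0/1 grid standing in for a QR code: 1 = black square, 0 = white square.
# (Illustrative values only; a real QR code is at least 21x21 modules.)
qr_like = np.array([
    [1, 1, 1, 0, 1],
    [1, 0, 1, 0, 0],
    [1, 1, 1, 0, 1],
    [0, 0, 0, 0, 1],
    [1, 0, 1, 1, 1],
])

print(qr_like.shape)   # (5, 5): 5 rows, 5 columns
print(qr_like[0, 2])   # entry in row 0, column 2 -> 1, i.e. a black square
```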

Need help with Matrices?

Stuck on this topic? Start a session with the AI tutor for a hint or guidance.

Solving Linear Systems

A system of linear equations can be represented as an augmented matrix. We then use Gauss-Jordan elimination to transform the matrix into Reduced Row Echelon Form (RREF). This tool demonstrates that process. Click "Next Step" to apply the next row operation.
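For reference, a rough sketch of Gauss-Jordan elimination in Python with NumPy is shown below; the two-equation system is an assumed example, not one taken from the tool.

```python
import numpy as np

def rref(M, tol=1e-12):
    """Reduce a matrix to Reduced Row Echelon Form via Gauss-Jordan elimination."""
    A = M.astype(float).copy()
    rows, cols = A.shape
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # Pick the row with the largest entry in this column (partial pivoting).
        pivot = np.argmax(np.abs(A[pivot_row:, col])) + pivot_row
        if abs(A[pivot, col]) < tol:
            continue                                      # no pivot in this column
        A[[pivot_row, pivot]] = A[[pivot, pivot_row]]     # swap rows
        A[pivot_row] /= A[pivot_row, col]                 # scale the pivot to 1
        for r in range(rows):                             # eliminate the other rows
            if r != pivot_row:
                A[r] -= A[r, col] * A[pivot_row]
        pivot_row += 1
    return A

# x + 2y = 5,  3x + 4y = 11  ->  augmented matrix [A | b]
aug = np.array([[1, 2, 5],
                [3, 4, 11]])
print(rref(aug))   # [[1, 0, 1], [0, 1, 2]]  ->  x = 1, y = 2
```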

Need help with Linear Systems?

Stuck on this topic? Start a session with the AI tutor for a hint or guidance.

Linear Space

The "span" of a set of vectors is the set of all possible linear combinations of those vectors. It represents all the points you can reach in 3D space. Use the dropdown to select different sets of vectors and see if they span a line, a plane, or all of R³.

Need help with Linear Space?

Stuck on this topic? Start a session with the AI tutor for a hint or guidance.

Coordinate System

A vector's coordinates are just a description of how to reach it using a set of basis vectors. Change the basis vectors (b₁, b₂, b₃) or the target vector (v), and see how the coordinates of v relative to that basis change. The visualization shows how many 'steps' along each basis vector are needed to reach the target.
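Finding the coordinates c amounts to solving B c = v, where the columns of B are the basis vectors. A quick sketch of that computation in Python with NumPy follows; the basis vectors and target vector are assumed example values.

```python
import numpy as np

# Basis vectors b1, b2, b3 as the columns of B (assumed example values).
B = np.array([[1, 1, 0],
              [0, 1, 0],
              [0, 0, 2]], dtype=float)
v = np.array([3, 1, 4], dtype=float)

# The coordinates c satisfy B @ c = v, i.e. c1*b1 + c2*b2 + c3*b3 = v.
c = np.linalg.solve(B, v)
print(c)        # [2. 1. 2.] -> 2 steps along b1, 1 along b2, 2 along b3
print(B @ c)    # reconstructs v -> [3. 1. 4.]
```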


Need help with Coordinate Systems?

Stuck on this topic? Start a session with the AI tutor for a hint or guidance.

Linear Transformations

A matrix can transform the geometry (pixel positions) or the color space of an image. Choose a transformation type and a specific matrix to see its effect. Note: For external images, a CORS-friendly source is needed.

Original

Transformed
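As a rough illustration of a color-space transformation in Python with NumPy: the tiny 2×2 image and the sepia-style coefficients below are assumptions for demonstration, not the tool's exact matrices.

```python
import numpy as np

# A made-up 2x2 RGB "image": shape (height, width, 3), values in [0, 1].
img = np.array([[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]],
                [[0.0, 0.0, 1.0], [1.0, 1.0, 1.0]]])

# A 3x3 matrix acting on color space: a sepia-style mix of the RGB channels.
sepia = np.array([[0.393, 0.769, 0.189],
                  [0.349, 0.686, 0.168],
                  [0.272, 0.534, 0.131]])

# Apply the matrix to every pixel's (r, g, b) vector and clip back to [0, 1].
transformed = np.clip(img @ sepia.T, 0.0, 1.0)
print(transformed.shape)   # (2, 2, 3): same image layout, new colors
```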

Need help with Transformations?

Stuck on this topic? Start a session with the AI tutor for a hint or guidance.

Matrix Decomposition

Any linear transformation can be decomposed into a sequence of simpler ones. Here, we use Singular Value Decomposition (SVD) to break a matrix A into A = U * S * Vᵀ, representing a rotation (Vᵀ), a scaling (S), and another rotation (U). Edit the matrix A to see how its components and the intermediate transformations change.

A =

Rotation (U)

Scaling (S)

Rotation (Vᵀ)

Original

1. Apply Vᵀ

2. Apply S

3. Apply U (Final)
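A minimal sketch of the same decomposition in Python with NumPy; the matrix A below is an assumed example, so edit it in the tool if you want to compare.

```python
import numpy as np

# An assumed example matrix.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])

U, s, Vt = np.linalg.svd(A)        # s holds the singular values (diagonal of S)
S = np.diag(s)

print(U)                           # rotation/reflection
print(S)                           # axis-aligned scaling
print(Vt)                          # rotation/reflection
print(np.allclose(U @ S @ Vt, A))  # True: A = U * S * V^T
```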

Need help with Matrix Decomposition?

Stuck on this topic? Start a session with the AI tutor for a hint or guidance.

Eigenvectors & Eigenvalues

An eigenvector of a matrix is a special nonzero vector whose direction is unchanged when the corresponding linear transformation is applied; it is only scaled by a factor called the eigenvalue. Below, the matrix transformation scales the red eigenvector but changes the direction of the blue vector.

Matrix A:

[[3, 1], [0, 2]]

Eigenvector v₁ (λ=3):

[1, 0]
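You can verify these numbers with a few lines of Python/NumPy; the extra vector w is an assumed example of a non-eigenvector.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)         # 3 and 2 (order may vary); each column of
print(eigenvectors)        # `eigenvectors` matches the eigenvalue in the same position

v1 = np.array([1.0, 0.0])  # the eigenvector shown above
print(A @ v1)              # [3. 0.] = 3 * v1: same direction, scaled by lambda = 3

w = np.array([1.0, 1.0])   # an assumed non-eigenvector
print(A @ w)               # [4. 2.]: not a multiple of w, so its direction changes
```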

Need help with Eigenvectors?

Stuck on this topic? Start a session with the AI tutor for a hint or guidance.

Method of Least Squares

When a system of linear equations has no solution, we can find a "best-fit" approximate solution using the method of least squares. This is most commonly used in data analysis to find the line of best fit for a set of data points. Click on the chart to add points, then click "Calculate" to find the line that minimizes the total squared error.
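A minimal sketch of the computation in Python with NumPy; the data points are assumed examples, whereas in the tool you add them by clicking the chart.

```python
import numpy as np

# Assumed example data points.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 2.0, 5.0])

# Best-fit line y = m*x + b: build the design matrix [x | 1] and solve the
# overdetermined system in the least-squares sense.
A = np.column_stack([x, np.ones_like(x)])
(m, b), residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)
print(m, b)   # slope 1.1, intercept 1.1 for this data
```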

Need help with Least Squares?

Stuck on this topic? Start a session with the AI tutor for a hint or guidance.