1. The Foundations: What is a Vector?
In linear algebra, we start with vectors. A vector is an object that has both magnitude (length) and direction. We can visualize them as arrows starting from an origin.
In 2D: A vector can be represented as coordinates: (2 units right, 3 units up).
In 3D: (1 unit on x-axis, 4 on y-axis, -2 on z-axis).
The collection of all possible 2D vectors is called $\mathbb{R}^2$. The collection of all 3D vectors is called $\mathbb{R}^3$.
Transclude of 2D-Vector-Space.excalidraw
Example
Vector Addition: $\vec{u} + \vec{v} = \begin{pmatrix} u_1 + v_1 \\ u_2 + v_2 \end{pmatrix}$
(Geometrically: tip-to-tail)
Example
Scalar Multiplication: $c\vec{v} = \begin{pmatrix} cv_1 \\ cv_2 \end{pmatrix}$
(Stretching or shrinking)
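These two operations can be checked numerically. A minimal NumPy sketch (the specific vectors here are assumed for illustration, not taken from the note):

```python
import numpy as np

# Vectors as NumPy arrays (illustrative values)
u = np.array([2, 3])   # 2 units right, 3 units up
v = np.array([1, -1])

w = u + v              # component-wise addition ("tip-to-tail")
s = 2 * u              # scalar multiplication stretches the vector

print(w)  # [3 2]
print(s)  # [4 6]
```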
2. Organizing Data: What is a Matrix?
A matrix is a rectangular grid of numbers, arranged in rows and columns. If a vector represents a point, a matrix can represent a transformation of space itself.
Dimensions: A matrix with $m$ rows and $n$ columns is called an "$m \times n$" matrix.
Example (2x3 Matrix): $A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}$
Matrices are the primary tool for representing systems of equations and linear transformations.
Note
Notation: The entry in the $i$-th row and $j$-th column is written as $a_{ij}$. For matrix $A$ above, $a_{22}$ is 5.
Warning
Order Matters! Matrix multiplication (which we’ll see later) is generally not commutative: $AB \neq BA$.
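A quick numerical check of non-commutativity (the two matrices are arbitrary examples):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

AB = A @ B   # multiply in one order...
BA = B @ A   # ...and in the other

print(np.array_equal(AB, BA))  # False: order matters
```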
3. Linear Combinations & Span
This is a core concept. A linear combination is the result of adding vectors that have been scaled by scalars.
Given vectors $\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n$ and scalars $c_1, c_2, \dots, c_n$, a linear combination is: $c_1\vec{v}_1 + c_2\vec{v}_2 + \dots + c_n\vec{v}_n$
The span of a set of vectors is the set of all possible linear combinations you can make from them.
Transclude of Span-of-two-vectors.excalidraw
Info
Span in $\mathbb{R}^2$:
The span of a single non-zero vector (e.g., $\begin{pmatrix} 1 \\ 2 \end{pmatrix}$) is a line through the origin. The span of two non-parallel vectors (e.g., $\hat{\imath}$ and $\hat{\jmath}$) is the entire 2D plane ($\mathbb{R}^2$).
Info
Span in $\mathbb{R}^3$:
The span of two non-parallel vectors in $\mathbb{R}^3$ is a plane. To span all of $\mathbb{R}^3$, you need three vectors that don’t all lie in the same plane.
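One way to test span membership numerically is least squares: if the best linear combination reproduces the target exactly, the target lies in the span. A sketch with assumed example vectors:

```python
import numpy as np

# The spanning vectors become the columns of V (illustrative values)
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
V = np.column_stack([v1, v2])

b = np.array([2.0, 3.0, 5.0])   # equals 2*v1 + 3*v2, so it is in the span

# Least squares finds the best coefficients; an exact fit means b is in span{v1, v2}
coeffs, _res, _rank, _ = np.linalg.lstsq(V, b, rcond=None)
in_span = np.allclose(V @ coeffs, b)
print(coeffs, in_span)
```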
4. Systems of Linear Equations
Linear algebra is perfectly designed to solve systems of equations.
Example System:
$$\begin{aligned} x + y &= 3 \\ x - y &= 1 \end{aligned}$$
We can write this in the compact matrix form $A\vec{x} = \vec{b}$:
$$\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 3 \\ 1 \end{pmatrix}$$
Question
The Big Question:
Does a solution exist? Is it unique? This is equivalent to asking: “Is the vector $\vec{b}$ in the span of the columns of matrix $A$?”
Tip
Geometric View:
Solving this system is finding the intersection point of two lines. The system could have one solution (lines intersect), no solutions (parallel lines), or infinite solutions (same line).
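When the two lines do intersect in exactly one point, NumPy can find that point directly. The coefficient matrix and right-hand side below are illustrative examples:

```python
import numpy as np

# A x = b for a 2x2 system with a unique solution (illustrative values)
A = np.array([[1.0, 1.0],
              [1.0, -1.0]])
b = np.array([3.0, 1.0])

x = np.linalg.solve(A, b)   # works because the lines are not parallel
print(x)  # [2. 1.]
```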
5. Gaussian Elimination
This is the algorithm we use to solve systems of linear equations. The goal is to simplify the system into an “upper triangular” form that is easy to solve.
We use an augmented matrix and perform “elementary row operations”:
Swap two rows.
Multiply a row by a non-zero scalar.
Add a multiple of one row to another row.
Augmented Matrix for $A\vec{x} = \vec{b}$:
$$\left[\begin{array}{cc|c} 1 & 1 & 3 \\ 1 & -1 & 1 \end{array}\right]$$
We manipulate this to get Row-Echelon Form.
Transclude of Gaussian-Elimination-Steps.excalidraw
Note
Reduced Row-Echelon Form (RREF):
A special form where all leading entries (pivots) are 1, and they are the only non-zero entries in their columns. This form directly gives you the solution.
Example
RREF for our system:
$$\left[\begin{array}{cc|c} 1 & 0 & 2 \\ 0 & 1 & 1 \end{array}\right]$$
This reads: $x = 2$ and $y = 1$.
The solution is $(x, y) = (2, 1)$.
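The three elementary row operations can be turned into a tiny RREF routine. This is a minimal sketch with partial pivoting, not a robust production implementation:

```python
import numpy as np

def rref(M):
    """Reduce an augmented matrix to reduced row-echelon form
    using the three elementary row operations (minimal sketch)."""
    M = M.astype(float).copy()
    rows, cols = M.shape
    r = 0
    for c in range(cols - 1):           # last column is the right-hand side
        pivot = np.argmax(np.abs(M[r:, c])) + r   # pick the largest pivot
        if np.isclose(M[pivot, c], 0.0):
            continue                    # no pivot in this column
        M[[r, pivot]] = M[[pivot, r]]   # op 1: swap two rows
        M[r] = M[r] / M[r, c]           # op 2: scale pivot row so pivot is 1
        for i in range(rows):           # op 3: clear the rest of the column
            if i != r:
                M[i] -= M[i, c] * M[r]
        r += 1
        if r == rows:
            break
    return M

aug = np.array([[1, 1, 3],
                [1, -1, 1]])
print(rref(aug))  # [[1. 0. 2.], [0. 1. 1.]]  ->  x = 2, y = 1
```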
6. Linear Independence
This concept is related to “span” and “redundancy.” A set of vectors is linearly independent if no vector in the set can be written as a linear combination of the others.
Informally: None of the vectors are “redundant.” Each vector adds a new direction to the span.
Formally: The set $\{\vec{v}_1, \dots, \vec{v}_n\}$ is linearly independent if the only solution to the equation
$$c_1\vec{v}_1 + c_2\vec{v}_2 + \dots + c_n\vec{v}_n = \vec{0}$$
is the trivial solution: $c_1 = c_2 = \dots = c_n = 0$.
Example
Independent:
$\begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $\begin{pmatrix} 0 \\ 1 \end{pmatrix}$. You can’t write one as a multiple of the other.
Example
Dependent:
$\begin{pmatrix} 1 \\ 2 \end{pmatrix}$ and $\begin{pmatrix} 2 \\ 4 \end{pmatrix}$. The second vector is just twice the first. It’s redundant: $\begin{pmatrix} 2 \\ 4 \end{pmatrix} = 2\begin{pmatrix} 1 \\ 2 \end{pmatrix}$.
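Numerically, independence can be tested by stacking the vectors as columns and checking the rank: full column rank means independent, lower rank means some vector is redundant. A small sketch:

```python
import numpy as np

# Candidate vectors become columns of a matrix
indep = np.column_stack([[1, 0], [0, 1]])
dep   = np.column_stack([[1, 2], [2, 4]])   # second column = 2 * first

print(np.linalg.matrix_rank(indep))  # 2 -> independent (full column rank)
print(np.linalg.matrix_rank(dep))    # 1 -> dependent (a redundant vector)
```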
7. Basis & Dimension
A basis for a vector space (or subspace) is a “goldilocks” set of vectors. It has two properties:
It is linearly independent (no redundant vectors).
It spans the entire space (it has enough vectors to reach everywhere).
The dimension of a space is simply the number of vectors in its basis.
Info
Standard Basis:
The most common basis for $\mathbb{R}^3$ is:
$$\hat{e}_1 = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \quad \hat{e}_2 = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}, \quad \hat{e}_3 = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}$$
Since there are 3 vectors, $\mathbb{R}^3$ is 3-dimensional.
Note
Uniqueness:
A space can have infinitely many different bases, but every basis for that space will always have the same number of vectors.
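A small numerical illustration that a different basis still has the same size: both sets below are independent and span $\mathbb{R}^2$ (the second basis is an assumed example):

```python
import numpy as np

# Two different bases for R^2, stored as columns
standard = np.column_stack([[1, 0], [0, 1]])
skewed   = np.column_stack([[1, 1], [1, -1]])

# Both have full rank 2, so both are bases; each uses exactly 2 vectors,
# which is why dim(R^2) = 2 regardless of which basis you pick.
print(np.linalg.matrix_rank(standard), np.linalg.matrix_rank(skewed))  # 2 2
```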
8. Linear Transformations
This is where matrices get their meaning. A linear transformation is a function that takes a vector as input and produces a vector as output, following two rules:
$T(\vec{u} + \vec{v}) = T(\vec{u}) + T(\vec{v})$ (Preserves addition)
$T(c\vec{v}) = cT(\vec{v})$ (Preserves scalar multiplication)
Key Idea: A linear transformation is a matrix multiplication. Every matrix $A$ corresponds to a linear transformation $T$ defined by $T(\vec{x}) = A\vec{x}$.
Transclude of Linear-Transformation-Grid.excalidraw
Example
Rotation:
A 90° counter-clockwise rotation in $\mathbb{R}^2$ is a linear transformation.
Example
Projection:
Projecting all vectors in $\mathbb{R}^3$ onto the x-y plane is a linear transformation.
Warning
Not Linear:
A translation $T(\vec{x}) = \vec{x} + \vec{a}$ (with $\vec{a} \neq \vec{0}$) is not linear because it doesn’t map the zero vector to the zero vector ($T(\vec{0}) = \vec{a} \neq \vec{0}$).
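The two linearity rules can be verified numerically for the rotation, and shown to fail for a translation (the test vectors and scalar are arbitrary examples):

```python
import numpy as np

R = np.array([[0, -1],
              [1,  0]])            # 90 degree counter-clockwise rotation

u, v, c = np.array([2, 1]), np.array([-1, 3]), 4.0

# Linear: preserves addition and scalar multiplication
add_ok   = np.allclose(R @ (u + v), R @ u + R @ v)
scale_ok = np.allclose(R @ (c * u), c * (R @ u))
print(add_ok, scale_ok)            # True True

# A translation moves the zero vector, so it cannot be linear
a = np.array([1, 0])
T = lambda x: x + a
zero_ok = np.allclose(T(np.zeros(2)), np.zeros(2))
print(zero_ok)                     # False
```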
9. The Matrix of a Transformation
How do you find the matrix $A$ that corresponds to a specific linear transformation $T$?
The “Big Trick”:
You only need to see what $T$ does to the standard basis vectors.
Take the standard basis vectors: $\hat{\imath} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $\hat{\jmath} = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$.
Apply the transformation to each one: $T(\hat{\imath})$ and $T(\hat{\jmath})$.
The resulting vectors are the columns of your matrix $A$.
Transclude of Transformation-Matrix-Columns.excalidraw
Example
Find the 90° Rotation Matrix:
$T(\hat{\imath}) = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$ (1st col)
$T(\hat{\jmath}) = \begin{pmatrix} -1 \\ 0 \end{pmatrix}$ (2nd col)
The matrix is $A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$.
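The same trick works programmatically: feed each standard basis vector through the transformation and collect the outputs as columns. The helper `matrix_of` is hypothetical, written just for this sketch:

```python
import numpy as np

def matrix_of(T, n):
    """Matrix of a linear map T on R^n: its columns are T applied to
    the standard basis vectors (a minimal sketch, assumes T is linear)."""
    return np.column_stack([T(np.eye(n)[:, j]) for j in range(n)])

rotate90 = lambda x: np.array([-x[1], x[0]])   # 90 degree CCW rotation
A = matrix_of(rotate90, 2)
print(A)  # [[ 0. -1.], [ 1.  0.]]
```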
10. Determinants & Inverses
The determinant is a single number, $\det(A)$, that tells you about the “scaling factor” of a transformation.
If $\det(A) = 2$, $A$ scales areas by a factor of 2.
If $\det(A) = 0.5$, $A$ shrinks areas by half.
If $\det(A) = 0$, $A$ “squishes” space into a lower dimension (like a line or a point).
If $\det(A) < 0$, $A$ “flips” the orientation of space (like a mirror reflection).
The inverse of a matrix, $A^{-1}$, “undoes” the transformation $A$.
$A^{-1}A = I$ (the identity matrix).
Warning
Invertibility:
A matrix $A$ has an inverse if and only if $\det(A) \neq 0$.
If the determinant is zero, the transformation squished space, and you can’t “un-squish” it (you lost information).
Example
Determinant (2x2):
$$\det\begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad - bc$$
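A short numerical check tying the determinant formula to invertibility (the matrix is an assumed example with non-zero determinant):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

d = np.linalg.det(A)            # ad - bc = 2*1 - 1*1 = 1
Ainv = np.linalg.inv(A)         # exists because d != 0

print(np.isclose(d, 2*1 - 1*1))            # True: matches the 2x2 formula
print(np.allclose(Ainv @ A, np.eye(2)))    # True: the inverse "undoes" A
```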
11. Eigenvectors & Eigenvalues
This is one of the most important topics in linear algebra.
When a matrix transforms space, most vectors get “knocked” off their span (they change direction).
An eigenvector is a special, non-zero vector that does not change its direction when transformed by $A$. It only gets scaled.
The scaling factor is its eigenvalue, $\lambda$.
The defining equation is: $A\vec{v} = \lambda\vec{v}$
Transclude of Eigenvector-Transformation.excalidraw
Info
What they mean:
Eigenvectors are the “axes of transformation.” They are the most “stable” directions in a system.
Example
Rotation: A 3D rotation’s eigenvector is its axis of rotation. Vectors on this axis don’t change direction (their eigenvalue is 1).
12. How to Find Eigenvalues
We need to solve $A\vec{v} = \lambda\vec{v}$.
Rewrite as: $A\vec{v} - \lambda\vec{v} = \vec{0}$
Insert Identity: $A\vec{v} - \lambda I\vec{v} = \vec{0}$
Factor out $\vec{v}$: $(A - \lambda I)\vec{v} = \vec{0}$
We are looking for a non-zero vector $\vec{v}$ that solves this. This means the matrix $(A - \lambda I)$ must be non-invertible (it must “squish” $\vec{v}$ to zero).
And a matrix is non-invertible if its determinant is zero!
Characteristic Equation: $\det(A - \lambda I) = 0$
Tip
The Process:
Solve the characteristic equation $\det(A - \lambda I) = 0$ to find the eigenvalues $\lambda_1, \lambda_2, \dots$
For each $\lambda$, plug it back into $(A - \lambda I)\vec{v} = \vec{0}$ and solve for the eigenvectors $\vec{v}$.
Note
The set of all eigenvectors for a given $\lambda$ (plus the zero vector) forms a subspace called the eigenspace.
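This whole process is automated by `np.linalg.eig`; we can verify that each returned pair satisfies the defining equation $A\vec{v} = \lambda\vec{v}$ (the matrix below is an assumed example, chosen upper-triangular so the eigenvalues sit on the diagonal):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# Solving det(A - lambda*I) = 0 by hand gives lambda = 3 and lambda = 2
eigvals, eigvecs = np.linalg.eig(A)

# Each column v of eigvecs satisfies A v = lambda v: v only gets scaled
checks = [np.allclose(A @ v, lam * v) for lam, v in zip(eigvals, eigvecs.T)]
print(sorted(eigvals), checks)  # [2.0, 3.0] [True, True]
```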





