Mastering Linear Algebra: An Introduction with Applications
01: Linear Algebra: Powerful Transformations
Discover that linear algebra is a powerful tool that combines the insights of geometry and algebra. Focus on its central idea of linear transformations, which are functions that are algebraically very simple and that change a space geometrically in modest ways, such as taking parallel lines to parallel lines. Survey the diverse linear phenomena that can be analyzed this way.
02: Vectors: Describing Space and Motion
Professor Su poses a handwriting recognition problem as an introduction to vectors, the basic objects of study in linear algebra. Learn how to define vectors, how to add them, and how to multiply them by scalars, both algebraically and geometrically. Also see vectors as more general objects that apply to a wide range of situations that may not, at first, look like arrows or ordered collections of real numbers.
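For readers who want to experiment alongside the lectures, here is a minimal sketch of vector addition and scalar multiplication in Python with NumPy (an illustration only; the course itself requires no programming):

```python
import numpy as np

# Two vectors in R^3, represented as NumPy arrays.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 0.0, -1.0])

print(u + v)    # componentwise addition:   [5. 2. 2.]
print(2.5 * u)  # scalar multiplication:    [2.5 5.  7.5]
```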
03: Linear Geometry: Dots and Crosses
Even at this stage of the course, the concepts you’ve encountered give insight into the strange behavior of matter in the quantum realm. Get a glimpse of this connection by learning two standard operations on vectors: dot products and cross products. The dot product of two vectors is a scalar, with magnitude only. The cross product of two vectors is a vector, with both magnitude and direction.
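As an illustrative sketch (not taken from the lecture), here are the two operations in NumPy:

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])

print(np.dot(u, v))    # dot product, a scalar: 0.0 (u and v are perpendicular)
print(np.cross(u, v))  # cross product, a vector: [0. 0. 1.], perpendicular to both
```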
04: Matrix Operations
Use the problem of creating an error-correcting computer code to explore the versatile language of matrix operations. A matrix is a rectangular array of numbers whose rows and columns can be thought of as vectors. Learn matrix notation and the rules for matrix arithmetic. Then see how these concepts help you determine if a digital signal has been corrupted and, if so, how to fix it.
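To see the idea in miniature, here is a sketch using the classic Hamming(7,4) parity-check matrix; the lecture may well use a different code, so treat these particular numbers as an assumption:

```python
import numpy as np

# Parity-check matrix H for the Hamming(7,4) code:
# column j (numbered from 1) holds the binary digits of j.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

c = np.array([1, 1, 1, 0, 0, 0, 0])  # a valid codeword: H @ c = 0 (mod 2)
print(H @ c % 2)                     # [0 0 0] -> the signal looks clean

r = c.copy()
r[4] = 1 - r[4]                      # corrupt one bit in "transmission"
print(H @ r % 2)                     # [1 0 1] = binary 5 -> flip bit 5 to fix it
```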
05: Linear Transformations
Dig deeper into linear transformations to find out how they are closely tied to matrix multiplication. Computer graphics is a perfect example of the use of linear transformations. Define a linear transformation and study properties that follow from this definition, especially as they relate to matrices. Close by exploring advanced computer graphics techniques for dealing with perspective in images.
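A rotation is a typical graphics transformation; here is a minimal sketch of one as a matrix acting on a point (illustrative numbers, not from the lecture):

```python
import numpy as np

theta = np.pi / 2          # rotate 90 degrees counterclockwise
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

p = np.array([1.0, 0.0])   # a point on the x-axis
print(R @ p)               # approximately [0. 1.]: it lands on the y-axis
```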
06: Systems of Linear Equations
One powerful application of linear algebra is for solving systems of linear equations, which arise in many different disciplines. One example: balancing chemical equations. Study the general features of any system of linear equations, then focus on the Gaussian elimination method of solution, named after the German mathematician Carl Friedrich Gauss, but also discovered in ancient China.
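As a sketch of where this is headed, here is a small system solved by machine; NumPy's solver uses an LU factorization, which is Gaussian elimination in matrix form:

```python
import numpy as np

# The system:  x + 2y + z = 4,   2x - y = 1,   3y + 2z = 7
A = np.array([[1.0,  2.0, 1.0],
              [2.0, -1.0, 0.0],
              [0.0,  3.0, 2.0]])
b = np.array([4.0, 1.0, 7.0])

print(np.linalg.solve(A, b))   # [0.5 0.  3.5], i.e. x = 1/2, y = 0, z = 7/2
```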
07: Reduced Row Echelon Form
Consider how signals from four GPS satellites can be used to calculate a phone’s location, given the positions of the satellites and the times for the four signals to reach the phone. In the process, discover a systematic way to use row operations to put a matrix into reduced row echelon form, a special form that lets you solve any system of linear equations, and tells you a lot about the solutions.
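Here is a brief sketch of reduced row echelon form computed symbolically with SymPy (an illustration, not the lecture's GPS example):

```python
from sympy import Matrix

# Augmented matrix for the system  x + 2y = 5,  3x + 4y = 6
M = Matrix([[1, 2, 5],
            [3, 4, 6]])

rref_M, pivot_columns = M.rref()
print(rref_M)         # Matrix([[1, 0, -4], [0, 1, 9/2]])  ->  x = -4, y = 9/2
print(pivot_columns)  # (0, 1): both variables are pivots, so the solution is unique
```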
08: Span and Linear Dependence
Determine whether eggs and oatmeal alone can satisfy goals for obtaining three types of nutrients. Learn about the span of a set of vectors, which is the set of all linear combinations of those vectors, and about linear dependence, where one vector can be written as a linear combination of the others. Along the way, develop your intuition for seeing possible solutions to problems in linear algebra.
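A quick way to test dependence numerically is to compare the number of vectors with the rank of the matrix they form; a minimal sketch:

```python
import numpy as np

v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([4.0, 5.0, 6.0])
v3 = np.array([5.0, 7.0, 9.0])   # v3 = v1 + v2, so the set is linearly dependent

A = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(A))  # 2 < 3 vectors: they span only a plane
```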
09: Subspaces: Special Subsets to Look For
Delve into three special subspaces associated with a matrix: the null space, row space, and column space. Use these to understand the economics of making croissants and donuts for a specified price, drawing on three ingredients with changing costs. As in the previous lecture, move back and forth between a matrix equation, a system of equations, and a vector equation, which all represent the same thing.
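These subspaces can be computed directly; here is an illustrative SymPy sketch (not the croissant-and-donut data from the lecture):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])   # the second row is twice the first

print(A.nullspace())      # two basis vectors: the null space is a plane in R^3
print(A.rowspace())       # one basis vector:  the row space is a line
```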
10: Bases: Basic Building Blocks
Using the example of digital compression of images, explore the concept of a basis of a vector space: a minimal set of vectors that spans the space. In compression formats like JPEG, a well-chosen basis preserves crucial information while dispensing with extraneous data. Discover how to find a basis for a column space, row space, and null space. Also make geometric observations about these important structures.
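A short companion sketch: SymPy picks out the pivot columns, which form a basis for the column space (illustrative values):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])

print(A.columnspace())   # two independent columns: a basis for the column space
print(A.rank())          # 2, the dimension of that space
```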
11: Invertible Matrices: Undoing What You Did
Now turn to engineering, a fertile field for linear algebra. Put yourself in the shoes of a bridge designer, faced with determining the maximum force that a bridge can take for a given deflection vector. This involves the inverse of a matrix. Explore techniques for determining if an inverse matrix exists and then calculating it. Also learn proofs about properties of matrices and their inverses.
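A minimal sketch of computing and checking an inverse numerically (illustrative matrix, not the bridge model):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])   # determinant = 2*3 - 1*5 = 1, so A is invertible

A_inv = np.linalg.inv(A)
print(A_inv)                 # [[ 3. -1.], [-5.  2.]]
print(A @ A_inv)             # the 2x2 identity matrix (up to rounding)
```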
12: The Invertible Matrix Theorem
Use linear algebra to analyze one of the games on the popular electronic toy Merlin from the 1970s. This leads you deeper into the nature of the inverse of a matrix, showing why invertibility is such an important idea. Learn about the fundamental theorem of invertible matrices, which provides a key to understanding properties you can infer from matrices that either have or don’t have an inverse.
13: Determinants: Numbers That Say a Lot
Study the determinant—the factor by which a region of space increases or decreases after a matrix transformation. If the determinant is negative, then the space has been mirror-reversed. Probe other properties of the determinant, including its use in multivariable calculus for computing the volume of a parallelepiped, which is a three-dimensional figure whose faces are parallelograms.
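A small sketch of the determinant as an area-scaling factor (illustrative numbers):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

print(np.linalg.det(A))   # 6.0: A scales every region's area by a factor of 6

# Check geometrically: A sends the unit square to the parallelogram spanned by
# the columns (3, 0) and (1, 2), which has base 3 and height 2, hence area 6.
```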
14: Eigenstuff: Revealing Hidden Structure
Dive into eigenvectors, which are a special class of vectors that don’t change direction under a given linear transformation. The scaling factor of an eigenvector is the eigenvalue. These seemingly incidental properties turn out to be of enormous importance in linear algebra. Get started with “eigenstuff” by pondering a problem in population modeling, featuring foxes and their prey, rabbits.
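Here is a minimal numerical sketch of the "eigenstuff" (the fox-and-rabbit model itself returns in Lecture 17):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)                     # 3 and 1 (order may vary)

lam, v = eigenvalues[0], eigenvectors[:, 0]
print(A @ v, lam * v)                  # both sides of A v = lam v agree
```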
15: Eigenvectors and Eigenvalues: Geometry
Continue your study from the previous lecture by exploring the geometric properties of eigenvectors and eigenvalues, gaining an intuitive sense of the hidden structure they reveal. Learn how to calculate eigenvalues and eigenvectors, and discover that with a basis of eigenvectors in hand, it is easy to see how a transformation moves every other point.
16: Diagonalizability
In this third lecture on eigenvectors, examine conditions under which a change of basis results in a basis of eigenvectors, which makes computation with matrices very easy. Discover the property called diagonalizability, and prove that being diagonalizable is equivalent to having a basis of eigenvectors. Also explore the connection between the eigenvalues of a matrix and its determinant.
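A sketch of diagonalization in action, including the payoff that matrix powers become easy (illustrative matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, P = np.linalg.eig(A)   # columns of P are a basis of eigenvectors
D = np.diag(eigenvalues)
print(P @ D @ np.linalg.inv(P))     # reconstructs A, so A = P D P^(-1)

# Powers are now cheap: A^10 = P D^10 P^(-1), and D^10 just raises the diagonal.
print(P @ np.diag(eigenvalues**10) @ np.linalg.inv(P))
```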
17: Population Dynamics: Foxes and Rabbits
Return to the problem of modeling the population dynamics of foxes and rabbits from Lecture 14, drawing on your knowledge of eigenvectors to analyze different scenarios. First, express the predation relationship in matrix notation. Then, experiment with different values for the predation factor, looking for the optimum ratio of foxes to rabbits to ensure that both populations remain stable.
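A minimal simulation sketch; the update coefficients below are hypothetical stand-ins, not the lecture's values:

```python
import numpy as np

# Hypothetical yearly update:
#   foxes   <-  0.6*foxes + 0.2*rabbits  (foxes thrive when rabbits are plentiful)
#   rabbits <- -0.1*foxes + 1.1*rabbits  (rabbits multiply but get eaten)
A = np.array([[ 0.6, 0.2],
              [-0.1, 1.1]])

population = np.array([10.0, 100.0])   # [foxes, rabbits]
for year in range(1, 6):
    population = A @ population
    print(year, population.round(1))   # watch whether the ratio settles down
```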
18: Differential Equations: New Applications
Professor Su walks you through the application of matrices in differential equations, assuming for just this lecture that you know a little calculus. The first problem involves the population ratios of rats and mice. Next, investigate the motion of a spring, using linear algebra to convert second-order differential equations into systems of first-order differential equations—a handy simplification.
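The conversion works by treating position and velocity as one vector; here is a crude sketch with hypothetical spring constants (and simple Euler stepping, not necessarily the lecture's method):

```python
import numpy as np

# The spring equation x'' = -(k/m) x becomes first order by introducing
# the velocity v = x':  d/dt [x, v] = A @ [x, v].
k, m = 4.0, 1.0
A = np.array([[ 0.0,   1.0],
              [-k / m, 0.0]])

state = np.array([1.0, 0.0])      # start at x = 1 with zero velocity
dt = 0.01
for _ in range(100):              # crude Euler stepping to time t = 1
    state = state + dt * (A @ state)
print(state.round(3))             # near the exact [cos(2), -2*sin(2)] ~ [-0.42, -1.82]
```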
19: Orthogonality: Squaring Things Up
In mathematics, “orthogonal” means at right angles. Difficult operations become simpler when orthogonal vectors are involved. Learn how to determine if a matrix is orthogonal and survey the properties that result. Among these: an orthogonal transformation preserves dot products, and therefore lengths and angles as well. Also, study the Gram–Schmidt process for producing orthonormal vectors from any linearly independent set.
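A compact sketch of the Gram–Schmidt process (a bare-bones implementation, not necessarily the course's notation):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        for q in basis:
            v = v - np.dot(q, v) * q   # remove the component along q
        basis.append(v / np.linalg.norm(v))
    return basis

q1, q2 = gram_schmidt([np.array([1.0, 1.0]), np.array([1.0, 0.0])])
print(q1, q2, np.dot(q1, q2))          # orthonormal: the dot product is ~0
```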
20: Markov Chains: Hopping Around
The algorithm for the Google search engine is based on viewing websurfing as a Markov chain. So are speech-recognition programs, models for predicting genetic drift, and many other applications. Investigate this practical tool, which employs probabilistic rules to advance from one state to the next. Find that every Markov chain has at least one steady-state vector, which is an eigenvector with eigenvalue 1.
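A sketch of finding a steady state as an eigenvector with eigenvalue 1, using a hypothetical two-state weather chain:

```python
import numpy as np

# Column-stochastic transition matrix: columns are today's state,
# rows are tomorrow's (sunny, rainy). Hypothetical probabilities.
P = np.array([[0.9, 0.5],
              [0.1, 0.5]])

eigenvalues, eigenvectors = np.linalg.eig(P)
steady = eigenvectors[:, np.argmax(eigenvalues)]   # eigenvalue 1 is the largest
steady = steady / steady.sum()                     # rescale to sum to 1
print(steady.round(3))                             # [0.833 0.167]: 5/6 sunny days
```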
21: Multivariable Calculus: Derivative Matrix
Discover that linear algebra plays a key role in multivariable calculus, also called vector calculus. For those new to calculus, Professor Su covers essential concepts. Then, he shows that the derivative of a multivariable function is a matrix, the derivative matrix, representing the linear transformation that best approximates the function near a point—tying calculus to the transformations you have been studying since the beginning. See how other ideas in multivariable calculus also fall into place, thanks to linear algebra.
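A tiny symbolic sketch of a derivative matrix (Jacobian) with SymPy, using an arbitrary example function:

```python
from sympy import symbols, Matrix, sin

x, y = symbols('x y')
f = Matrix([x**2 * y, sin(y)])   # a function from R^2 to R^2

print(f.jacobian([x, y]))        # Matrix([[2*x*y, x**2], [0, cos(y)]])
```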
22: Multilinear Regression: Least Squares
Witness the wizardry of linear algebra for finding a best-fitting line or best-fitting linear model for data—a problem that arises whenever information is being analyzed. The methods include multiple linear regression and least squares approximation, and can also be used to reverse-engineer an unknown formula that has been applied to data, such as U.S. News and World Report’s college rankings.
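Here is a minimal least-squares sketch with made-up data points:

```python
import numpy as np

# Fit y = m*x + b to noisy data by least squares.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.8])

A = np.column_stack([x, np.ones_like(x)])      # design matrix [x | 1]
(m, b), *_ = np.linalg.lstsq(A, y, rcond=None)
print(m, b)   # slope ~0.94, intercept ~1.09: the best-fitting line
```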
23: Singular Value Decomposition: So Cool
Next time you respond to a movie, music, or other online recommendation, think of the singular value decomposition (SVD), which is a matrix factorization method used to match your known preferences to similar products. Learn how SVD works, how to compute it, and how its ability to identify relevant attributes makes it an effective data compression tool for discarding unimportant information.
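A brief sketch of SVD-based compression: keep only the dominant singular value of a nearly rank-one matrix (illustrative numbers):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.1],
              [3.0, 6.0, 9.2]])   # nearly rank 1: rows are almost multiples

U, s, Vt = np.linalg.svd(A)
print(s.round(3))                 # one large singular value, two tiny ones

A1 = s[0] * np.outer(U[:, 0], Vt[0, :])   # rank-1 approximation of A
print(np.round(A1, 2))            # close to A, stored far more compactly
```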
24: General Vector Spaces: More to Explore
Finish the course by seeing how linear algebra applies more generally than just to vectors in the real coordinate space of n dimensions, which is what you have studied so far. Discover that Fibonacci sequences, with their many applications, can be treated as vector spaces, as can Fourier series, used in waveform analysis. Truly, linear algebra pops up in the most unexpected places!
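As a parting sketch, the Fibonacci shift rule is linear, so a matrix power generates the sequence (an illustration of the vector-space viewpoint):

```python
import numpy as np

# The shift rule [F(n+1), F(n)] = A @ [F(n), F(n-1)] is linear.
A = np.array([[1, 1],
              [1, 0]])

state = np.array([1, 0])                        # [F(1), F(0)]
print(np.linalg.matrix_power(A, 10) @ state)    # [89 55] = [F(11), F(10)]
```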