Predicting the future based on the past few steps: if we have the weights $w$ and the previous observations, we can predict the next step. Assuming the error term is zero mean, the prediction error is $e(k) = y(k) - w^T \phi(k)$.
Problem: how do we learn the weights $w$?
Idea: sample training data and formulate an optimization problem: given the training data, estimate the weights.
(My note: what are the vectors here? $\phi(k)$ stacks the previous observations, e.g. $\phi(k) = (y(k-1), \dots, y(k-m))^T$ for some window length $m$.)
Suppose we observed the past history $y(1), y(2), \dots, y(N)$. This is the same as observing the pairs $(\phi(k), y(k))$.
IDEA: formulate it as a least-squares problem: minimize $\|y - \Phi w\|_2^2$ over $w$, where $\Phi$ is the matrix with rows $\phi(1)^T, \dots, \phi(N)^T$ and $y$ is the column vector of observations (see picture in notes).
Note that all we need is that the model is linear in $w$; $y(k)$ itself need *not* be a linear function of the past data.
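A minimal sketch of this fit in NumPy (the window length $m = 3$ and the synthetic data are my own illustrative choices):

```python
import numpy as np

# Synthetic signal: a noisy autoregressive process (coefficients made up).
rng = np.random.default_rng(0)
N, m = 200, 3
y = np.zeros(N)
for k in range(m, N):
    y[k] = 0.5 * y[k-1] - 0.2 * y[k-2] + 0.1 * y[k-3] + 0.1 * rng.standard_normal()

# Rows of Phi are phi(k)^T = [y(k-1), ..., y(k-m)]; targets are y(k).
Phi = np.column_stack([y[m - j : N - j] for j in range(1, m + 1)])
target = y[m:]

# Least squares: minimize ||target - Phi w||^2 over w.
w, *_ = np.linalg.lstsq(Phi, target, rcond=None)

# One-step-ahead predictions and residuals e(k) = y(k) - w^T phi(k).
e = target - Phi @ w
print("estimated weights:", w)
print("residual RMS:", np.sqrt(np.mean(e ** 2)))
```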
Question: what if $y(k)$ is a quadratic function of the previous values $y(k-1)$, $y(k-2)$? The same scheme works: put the quadratic terms into $\phi(k)$; the problem is still linear in $w$ (sketch below).
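Continuing the sketch above, the quadratic case just changes the feature map (the particular monomials chosen are mine); $w$ still enters linearly:

```python
# Quadratic features of the last two values; the model is still linear in w:
# phi(k) = [y(k-1), y(k-2), y(k-1)^2, y(k-2)^2, y(k-1) * y(k-2)]
y1, y2 = y[m - 1 : N - 1], y[m - 2 : N - 2]
Phi_quad = np.column_stack([y1, y2, y1 ** 2, y2 ** 2, y1 * y2])
w_quad, *_ = np.linalg.lstsq(Phi_quad, y[m:], rcond=None)
```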
Linear equations in Engineering. (Friendly Review of Some Examples)
Utility: modeling everything. This was the focus of 16A (applied linear algebra): modeling real-life engineering problems.
Will become useful later for modeling things, reducing optimization models, and so forth.
- Good for modeling constraints in Engineering / Science. A lot of things in nature are linear.
Example 1: Tomography (16A). Key: take the log to make the equations linear. The measured intensity decays multiplicatively along each ray, so the log of the measurement is linear in the absorption values (sketch below).
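A minimal sketch of the log trick on a made-up 4-voxel, 4-ray geometry (the ray pattern, absorptions, and $I_0$ are all invented for illustration):

```python
import numpy as np

# Each ray passes through some voxels; intensity decays multiplicatively:
# I = I0 * exp(-(A x)), so b = log(I0 / I) = A x is LINEAR in x.
A = np.array([[1., 1., 0., 0.],   # ray 1 crosses voxels 0, 1
              [0., 0., 1., 1.],   # ray 2 crosses voxels 2, 3
              [1., 0., 1., 0.],   # ray 3 crosses voxels 0, 2
              [0., 0., 0., 1.]])  # ray 4 crosses voxel 3 only
x_true = np.array([0.2, 0.5, 0.1, 0.4])   # unknown absorptions (made up)
I0 = 1.0
I = I0 * np.exp(-A @ x_true)              # nonlinear raw measurements

b = np.log(I0 / I)                        # the log linearizes the model
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_hat)                              # recovers x_true (A is full rank)
```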
Example 2: Network Flows (Directed Graphs)
Linear Algebra's Role
Linear algebra provides the tools to model and solve these problems. Here's how:
1. Variables: We use variables to represent the flow through each part of the network (e.g., the amount of water in a pipe, the number of cars on a road).
2. Equations: The key principle is conservation of flow:
   - Conservation of flow: the total flow into the network must equal the total flow out.
   - Feasible flow: at any intersection (node), the flow in must equal the flow out.
These principles translate into a system of linear equations.
3. Matrices: We represent the network and the flow equations using matrices. This allows us to use efficient techniques like Gaussian elimination to solve for the unknown flows.
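A small sketch of these equations on a made-up directed graph (graph, supplies, and demands are invented; the incidence-matrix formulation is the standard one):

```python
import numpy as np

# Directed graph on 3 nodes, 3 edges: 0->1, 1->2, 0->2 (made-up example).
# Incidence matrix: A[i, e] = -1 if edge e leaves node i, +1 if it enters.
A = np.array([[-1,  0, -1],
              [ 1, -1,  0],
              [ 0,  1,  1]], dtype=float)

# Conservation at every node: A f = b, with b the required net inflow.
# Node 0 is a source (net -2), node 2 a sink (net +2), node 1 in = out.
b = np.array([-2., 0., 2.])

# A is rank-deficient (its rows sum to zero: total in = total out), so
# lstsq returns the minimum-norm feasible flow.
f, *_ = np.linalg.lstsq(A, b, rcond=None)
print(f)   # approx [0.67, 0.67, 1.33] units on edges 0->1, 1->2, 0->2
```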
Linear equations in optimization
minimize $f(x)$ over $x$ subject to linear constraints, e.g. $Ax = b$ (Transporting Corn example, network flow); a toy instance is sketched below.
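A toy instance of the corn-transport flavor of the problem (all numbers invented; the point is a linear objective with linear constraints):

```python
import numpy as np
from scipy.optimize import linprog

# Ship corn from 2 farms (supplies 30, 20) to 2 silos (demands 25, 25),
# minimizing cost; x = [x00, x01, x10, x11], farm i -> silo j.
cost = np.array([4., 6., 5., 3.])      # per-unit shipping costs (made up)
A_eq = np.array([[1, 1, 0, 0],         # farm 0 ships out its full supply
                 [0, 0, 1, 1],         # farm 1 ships out its full supply
                 [1, 0, 1, 0],         # silo 0 receives its full demand
                 [0, 1, 0, 1]])        # silo 1 receives its full demand
b_eq = np.array([30., 20., 25., 25.])

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print(res.x, res.fun)                  # optimal shipments and total cost
```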
Review of the Linear Algebra Module: big points (what is important, according to Tom).
- Linear Algebra = Language of Optimization
- Basic Objects: vectors, matrices.
- Important Concepts: Subspaces, Bases, Norms, Inner Products, etc.
- Fundamentally Geometric in Nature.
- Some of the most important Tools:
- Projection (everything in this course, in a sense)
  - Given a subspace $S$ and a point $y$, solve $\min_{x \in S} \|y - x\|$.
  - The solution $\hat{x}$ is characterized by orthogonality:
  - meaning $y - \hat{x}$ and $S$ are orthogonal.
  - IMPORTANT: Projection is a linear transformation.
  - Applications:
    - Gram-Schmidt (sequential projection): input = linearly independent vectors, output = an orthonormal basis for their span. Application of Gram-Schmidt → the QR Decomposition. (Sketch below.)
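A minimal sketch of sequential projection (classical Gram-Schmidt; numerically one would use np.linalg.qr, which produces the same basis up to column signs):

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A by sequential projection."""
    Q = np.zeros(A.shape)
    for j in range(A.shape[1]):
        v = A[:, j].copy()
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]   # remove component along q_i
        Q[:, j] = v / np.linalg.norm(v)
    return Q

A = np.array([[1., 1.], [1., 0.], [0., 1.]])
Q = gram_schmidt(A)          # orthonormal basis for S = range(A)

# Projection onto S is the linear map P = Q Q^T; the residual y - P y is
# orthogonal to S, which is exactly the orthogonality characterization.
y = np.array([1., 2., 3.])
y_hat = Q @ (Q.T @ y)
print(A.T @ (y - y_hat))     # ~ [0, 0]: residual is orthogonal to S
```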
- Matrices represent linear transformations
  - Four Fundamental Subspaces (the picture)
  - Pseudo-Inverse
  - The picture can sometimes allow us to better understand / reduce problems.
  - One good example: transform a problem with the linear constraint $Ax = b$ into an unconstrained one.
  - Basically, transform an optimization problem with a linear constraint into a least-squares problem (one reading of this reduction is sketched below).
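My reconstruction of that example (problem data invented): eliminate the constraint $Ax = b$ by parametrizing its solution set, leaving an ordinary least-squares problem:

```python
import numpy as np

# Minimize ||C x - d|| subject to A x = b (all data made up).
A = np.array([[1., 1., 1.]])       # one linear constraint on R^3
b = np.array([1.])
C = np.eye(3)
d = np.array([2., 0., 0.])

# Particular solution x0, plus a basis N for the null space of A
# (rows of Vt past rank(A) span null(A); here A has full row rank).
x0 = np.linalg.pinv(A) @ b
_, s, Vt = np.linalg.svd(A)
N = Vt[len(s):].T

# Every feasible x is x0 + N z; substituting gives unconstrained LS in z.
z, *_ = np.linalg.lstsq(C @ N, d - C @ x0, rcond=None)
x = x0 + N @ z
print(A @ x, x)                    # A x = b; x minimizes ||C x - d||
```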
- Many of the other concepts we saw come from optimization problems:
  - Example: if $A$ is a symmetric $n \times n$ matrix, $\max_{\|x\| = 1} x^T A x$ is attained at the top eigenvector $q_1$, with value $\lambda_1$.
  - Repeat on the orthogonal complement to get the Spectral Decomposition $A = Q \Lambda Q^T = \sum_i \lambda_i q_i q_i^T$.
  - Applications: PCA: just take the spectral decomposition of the empirical covariance matrix (sketch below).
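A small PCA sketch along these lines (synthetic data; the covariance scaling and the choice of 2 components are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3)) * np.array([3., 1., 0.1])   # synthetic
Xc = X - X.mean(axis=0)                  # center the data first

# Spectral decomposition of the symmetric empirical covariance matrix.
cov = Xc.T @ Xc / (len(Xc) - 1)
lam, Q = np.linalg.eigh(cov)             # eigh returns ascending eigenvalues
lam, Q = lam[::-1], Q[:, ::-1]           # sort descending

scores = Xc @ Q[:, :2]                   # project onto the top-2 axes
print(lam)                               # variances along principal axes
```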
- Singular Value Decomposition: comes from an optimization problem
  - For generic $A \in \mathbb{R}^{m \times n}$: $A = U \Sigma V^T$.
  - Applications:
    - Low-Rank Approximation (truncate the SVD).
    - Pseudo-Inverse: solving least squares, mapping between subspaces.
  - For generic $A$: $A^\dagger = V \Sigma^\dagger U^T$ (invert the nonzero singular values). Both applications are sketched below.
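A sketch of both applications (the matrix is an arbitrary small example; truncation rank $k = 1$ is my choice):

```python
import numpy as np

A = np.array([[3., 2., 2.],
              [2., 3., -2.]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Low-rank approximation: keep the top k singular values (Eckart-Young).
k = 1
A_k = (U[:, :k] * s[:k]) @ Vt[:k]
print(np.linalg.norm(A - A_k, 2))        # equals the first dropped value s[1]

# Pseudo-inverse A^+ = V Sigma^+ U^T: gives the least-squares solution.
A_pinv = (Vt.T * (1.0 / s)) @ U.T
b = np.array([1., 0.])
print(A_pinv @ b)
print(np.linalg.lstsq(A, b, rcond=None)[0])   # same solution
```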