Ch 10: The Dot Product and Convolution
This chapter goes over the math and meaning of the dot product and convolution, critical concepts that form the basis of much signal analysis, such as the FFT.
10.1 Dot Product
Convolution is an extension of the dot product, so it is important to learn the dot product first. You likely already know some interpretations of the dot product. Purely mathematically, it is a function that takes 2 vectors of equal length, multiplies their corresponding elements together, and sums those products, returning a single value.
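The element-wise multiply-and-sum rule above can be sketched in a few lines of Python; the vectors here are arbitrary illustrative values:

```python
# Dot product of two equal-length vectors: multiply corresponding
# elements, then sum the products into a single value.
a = [1, 2, 3]
b = [4, 5, 6]

dot = sum(x * y for x, y in zip(a, b))  # 1*4 + 2*5 + 3*6
print(dot)  # 32
```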
In practice, this can be thought of in a few ways:
- Sum of elements in one vector weighted by the elements of another vector - signal-processing interpretation
- A measure of covariance / similarity between 2 vectors - statistical interpretation
- The product of the lengths of 2 vectors scaled by the cosine of the angle between them - geometric interpretation
In the geometric sense, you can think of the dot product as the length of the "shadow" that one vector casts when projected onto another, scaled by the length of that other vector.
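A minimal sketch of the geometric interpretation in 2D, using made-up vectors: the dot product equals the product of the two lengths and the cosine of the angle between them, and dividing the dot product by one vector's length gives the projected "shadow" length of the other.

```python
import math

a = [3.0, 4.0]   # illustrative vector of length 5
b = [1.0, 0.0]   # unit vector along the x-axis

dot = sum(x * y for x, y in zip(a, b))

norm_a = math.sqrt(sum(x * x for x in a))  # length of a
norm_b = math.sqrt(sum(x * x for x in b))  # length of b

# cosine of the angle between a and b
cos_theta = dot / (norm_a * norm_b)

# length of a's "shadow" projected onto b
shadow = dot / norm_b

print(dot, cos_theta, shadow)  # 3.0 0.6 3.0
```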
This example is given in 2D space, but mathematically the dot product works in any number of dimensions. Ergo, you can take a 640-point segment of an EEG time series, treat it as a single vector in 640-dimensional space, and use it for taking dot products.
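To make the high-dimensional point concrete, here is a sketch using NumPy with random noise standing in for EEG data (the 640-sample length and the signals themselves are illustrative, not real recordings):

```python
import numpy as np

rng = np.random.default_rng(0)
signal_a = rng.standard_normal(640)  # stand-in for one 640-sample EEG segment
signal_b = rng.standard_normal(640)  # stand-in for another segment

# Exactly the same rule as in 2D, just applied across 640 dimensions
dot = np.dot(signal_a, signal_b)
print(dot)
```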
Further details about the dot product can be found here.
Be SURE you understand how dot products are computed before continuing, otherwise you WILL get lost!
10.2 Convolution
10.3 How does Convolution work?
10.4 Convolution versus Cross-Covariance