Linear Algebra for AI/ML Part 2 – Dot Product
trybackprop.com
Confusing notation in the second image example.
If both vectors are viewed as column vectors, the dot product is defined as aᵀb: a is transposed, and aᵀb is the matrix product of the two vectors. You can think of vectors as either column or row vectors, but since column vectors are the usual convention, it's better to stick to what is common to avoid confusion.
Thanks for the feedback!
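For what it's worth, here is a minimal NumPy sketch (my own illustration, not from the post) of the column-vector view described above, showing that aᵀb reproduces the dot product:

```python
import numpy as np

# Treat both vectors as 3x1 column vectors, as the comment suggests
a = np.array([[1.0], [2.0], [3.0]])
b = np.array([[4.0], [5.0], [6.0]])

# a^T b is a 1x1 matrix whose single entry is the dot product
result = a.T @ b           # array([[32.]])
print(result.item())       # 32.0, same as np.dot(a.ravel(), b.ravel())
```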
It's nitpicking until someone learns about the Hadamard product and confusion sets in.
(a∘b)ᵢ = aᵢ × bᵢ (Hadamard: elementwise, still a vector)
a·b = aᵀb = Σᵢ aᵢ × bᵢ (dot product: elementwise products summed to a scalar)
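To make the distinction concrete, a small NumPy sketch (again just an illustration, not from the thread) contrasting the two products:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# Hadamard (elementwise) product: a vector of the same length
hadamard = a * b              # array([ 4., 10., 18.])

# Dot product: the sum of those elementwise products, a scalar
dot = np.dot(a, b)            # 32.0

assert dot == hadamard.sum()
```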