In this video, you will learn about:
Even though these linear algebra operations are good to know, they are not frequently applied in DL. Still, they are very easy to implement in PyTorch. For building simple networks, such as fully connected networks with embeddings for categorical features and numerical features, you won't need these operations.
Skim through the comprehensive definitions of the above terms here, or read the one-sentence explanations below. Learn more by watching the video.
Matrix Determinant: "The determinant of a square matrix is a single number that, among other things, can be related to the area or volume of a region. In particular, the determinant of a matrix reflects how the linear transformation associated with the matrix can scale or reflect objects." (mathinsight.org) GIF source: https://www.chilimath.com/
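As a quick sketch of how this looks in PyTorch, `torch.linalg.det` computes the determinant; the matrix below is just an illustrative example (it scales areas by a factor of 2 along one axis):

```python
import torch

# This linear map stretches the x-axis by 2 and leaves the y-axis alone,
# so it scales areas by |det| = 2.
A = torch.tensor([[2.0, 0.0],
                  [0.0, 1.0]])
det = torch.linalg.det(A)
print(det)  # tensor(2.)
```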
Inversion: This work is interesting, as it discusses the importance of matrix inversion in a very specific DL application.
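In PyTorch, a matrix inverse is available via `torch.linalg.inv`; the matrix below is a made-up invertible example, and multiplying it by its inverse recovers (numerically) the identity:

```python
import torch

A = torch.tensor([[4.0, 7.0],
                  [2.0, 6.0]])  # det = 10, so A is invertible
A_inv = torch.linalg.inv(A)

# A @ A_inv should be the identity up to floating-point error
identity_ok = torch.allclose(A @ A_inv, torch.eye(2), atol=1e-5)
print(identity_ok)  # True
```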
Trace: The sum of the diagonal elements of a given matrix.
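A minimal PyTorch illustration with an arbitrary example matrix, using `torch.trace`:

```python
import torch

A = torch.tensor([[1.0, 2.0],
                  [3.0, 4.0]])
# Trace = sum of diagonal elements = 1 + 4 = 5
tr = torch.trace(A)
print(tr)  # tensor(5.)
```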
Eigendecomposition: Important in ML tasks for dimensionality reduction (PCA, SVD, etc.). One application is reconstruction, i.e. estimating a full matrix from a sparse one. You can read more about eigendecomposition in a post by machinelearningmastery.com.
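As a sketch of the reconstruction idea on a toy symmetric matrix (chosen here only for illustration), `torch.linalg.eigh` returns eigenvalues and eigenvectors, and the matrix can be rebuilt as Q diag(λ) Qᵀ:

```python
import torch

# Symmetric matrix -> real eigenvalues; eigh is specialized for this case
A = torch.tensor([[2.0, 1.0],
                  [1.0, 2.0]])
eigenvalues, eigenvectors = torch.linalg.eigh(A)

# Reconstruct A from its eigendecomposition: A = Q diag(lambda) Q^T
A_rec = eigenvectors @ torch.diag(eigenvalues) @ eigenvectors.T
print(torch.allclose(A_rec, A, atol=1e-5))  # True
```

In PCA-style dimensionality reduction, one would keep only the largest eigenvalues and their eigenvectors, giving a low-rank approximation instead of an exact reconstruction.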
Positive Definite matrices: Lead to convex functions, a property that is useful in optimization problems. Look at this paper to see the application in computer vision!
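Two common ways to check positive definiteness in PyTorch, shown on a small example matrix (an assumption for illustration): all eigenvalues must be strictly positive, and equivalently the Cholesky factorization must succeed:

```python
import torch

A = torch.tensor([[2.0, -1.0],
                  [-1.0, 2.0]])  # eigenvalues are 1 and 3, both > 0

# Check 1: a symmetric matrix is positive definite iff all eigenvalues > 0
eigenvalues = torch.linalg.eigvalsh(A)
is_pd = bool((eigenvalues > 0).all())
print(is_pd)  # True

# Check 2: Cholesky factorization A = L L^T exists only for positive
# definite matrices (it raises an error otherwise)
L = torch.linalg.cholesky(A)
cholesky_ok = torch.allclose(L @ L.T, A, atol=1e-5)
print(cholesky_ok)  # True
```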
Orthogonal & Symmetric matrices: Have useful properties in matrix decomposition, e.g. eigendecomposition.
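A short sketch of the two defining properties, using a rotation matrix (orthogonal) and a hand-picked symmetric matrix as illustrative examples:

```python
import math
import torch

# A rotation matrix is orthogonal: Q^T Q = I
theta = 0.5
Q = torch.tensor([[math.cos(theta), -math.sin(theta)],
                  [math.sin(theta),  math.cos(theta)]])
q_orthogonal = torch.allclose(Q.T @ Q, torch.eye(2), atol=1e-6)
print(q_orthogonal)  # True

# A symmetric matrix equals its own transpose: S = S^T
S = torch.tensor([[1.0, 2.0],
                  [2.0, 3.0]])
print(torch.equal(S, S.T))  # True
```

This is exactly why the eigendecomposition example above works so cleanly: a symmetric matrix has an orthogonal eigenvector matrix, so its inverse is just the transpose.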