Vector Notation: aᵀx = b
The expression aᵀx is a linear transformation of x.
Linear algebra is the branch of mathematics that defines such objects and their operations, and it is a fundamental part of geometry.
Why do we need Linear Algebra?
Linear algebra is based on continuous mathematics, which is used throughout engineering. Input vectors are converted into outputs through a series of linear transformations.
Scalars are single numbers. They are mainly real-valued or integer, and are represented by a lower-case italic letter such as x.
Vectors are arrays of numbers arranged in order; each element is identified by an index. Vectors are represented by a lower-case bold letter such as x. A vector can be viewed as a point in space, with each element plotted along one axis.

In machine learning, data representation becomes an essential aspect of data science, and data is usually represented in matrix form. A good understanding of these concepts will help before you step into more complicated or more sophisticated machine learning algorithms.
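As a minimal sketch, a vector can be stored as a NumPy array, with each element accessed by its index (the values below are arbitrary examples):

```python
import numpy as np

# A vector: an ordered array of numbers, each identified by an index
x = np.array([3.0, 1.0, 2.0])

print(x[0])      # first element (NumPy indexes from 0)
print(x.shape)   # (3,) — a 1-D array with three elements
```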
What is a Matrix?
A matrix is a form of organizing data into rows and columns. There are many ways to organize data, and a matrix provides a convenient one. If you are an engineer looking at data for multiple variables at multiple times, a matrix is a helpful format for putting that data together for later use. Matrices can be used to represent samples with multiple attributes in a compact form. A matrix can also be used to represent a system of linear equations compactly and straightforwardly. Linear algebra provides the tools to understand and manipulate matrices to derive useful knowledge from data.
Usually, matrices are used to store and represent data on machines. A matrix is a very natural way to organize data: in general, rows represent the samples, and columns represent the values of the variables.
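A hypothetical example of this rows-as-samples, columns-as-variables layout (the variable names and values are made up for illustration):

```python
import numpy as np

# Hypothetical data matrix: 4 samples (rows) of 3 variables (columns),
# e.g. temperature, pressure, flow measured at four time points
A = np.array([
    [300.0, 1.00, 5.2],
    [310.0, 1.10, 5.5],
    [305.0, 1.05, 5.3],
    [320.0, 1.20, 5.9],
])

n_samples, n_variables = A.shape
print(n_samples, n_variables)   # 4 samples, 3 variables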
If two indices identify each element, the matrix is a 2-D array of numbers. It is denoted by A.
An RGB color image, which has three axes, is an example of a tensor. A tensor is an array of numbers arranged on a regular grid with a variable number of axes. A tensor is denoted by A.
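A small sketch of the RGB-image example: a 3-D array whose axes are height, width, and color channel (the 4×4 size is an arbitrary choice):

```python
import numpy as np

# A hypothetical 4x4 RGB image: axes are (height, width, color channel)
image = np.zeros((4, 4, 3))

print(image.ndim)    # 3 axes, so this is a 3-D tensor
print(image.shape)   # (4, 4, 3)
```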
If two matrices have the same shape, we can add the corresponding elements:
C = A + B
C_{i,j} = A_{i,j} + B_{i,j}
A matrix can also be added to a scalar or multiplied by one:
D_{i,j} = a B_{i,j} + c
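The two operations above can be sketched with NumPy, where element-wise addition and scalar operations follow exactly these formulas (the matrices and scalars are arbitrary examples):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

# Element-wise addition: C[i, j] = A[i, j] + B[i, j] (shapes must match)
C = A + B

# Scalar multiply-and-add: D[i, j] = a * B[i, j] + c
a, c = 2.0, 1.0
D = a * B + c

print(C)   # [[ 6.  8.] [10. 12.]]
print(D)   # [[11. 13.] [15. 17.]]
```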
Machine Learning: Flow of Tensors
Matrix Formulation: Ax = b
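A minimal sketch of this formulation: a system of linear equations written as Ax = b and solved numerically (the coefficients are an arbitrary example):

```python
import numpy as np

# Two equations in matrix form Ax = b:
#   1*x0 + 2*x1 = 5
#   3*x0 + 4*x1 = 11
A = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([5.0, 11.0])

x = np.linalg.solve(A, b)
print(x)   # solution vector, [1. 2.]
```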
Angle between Vectors
The dot product of two vectors can be written in terms of their L2 norms and the angle θ between them: aᵀb = ‖a‖₂ ‖b‖₂ cos θ.
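Rearranging this formula gives the angle between two vectors; a small sketch with two example vectors:

```python
import numpy as np

a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0])

# cos(theta) = (a . b) / (||a||_2 * ||b||_2)
cos_theta = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
theta = np.arccos(cos_theta)

print(np.degrees(theta))   # ≈ 45 degrees
```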
A diagonal matrix is mostly zeros; its non-zero entries appear only on the diagonal.
A symmetric matrix is of the form A = Aᵀ.
A covariance matrix is symmetric.
A unit vector is of the form ‖x‖₂ = 1, i.e., a vector with unit L2 norm.
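A short sketch checking both properties above, using randomly generated example data for the covariance and an arbitrary vector for normalization:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))   # hypothetical data: 100 samples, 3 variables

# The covariance matrix of the data is symmetric: S equals its transpose
S = np.cov(X, rowvar=False)
print(np.allclose(S, S.T))   # True

# Normalizing any non-zero vector by its L2 norm yields a unit vector
v = np.array([3.0, 4.0])
u = v / np.linalg.norm(v)
print(np.linalg.norm(u))     # ≈ 1.0
```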
Once we can represent the data in a matrix format, what questions do we need to ask?
- Are all the attributes in the data matrix relevant?
- Is there any method which can identify if some attributes are related to the other attributes?
- If yes, how do we identify the linear relationship?
- Can we use this to reduce the size of the data matrix?
How does one identify the independent attributes?
- Domain knowledge: for example, knowing that an attribute D is a function of attributes P and T
- This implies that at least one attribute is dependent on the others
- Such a variable can be calculated as a linear combination of the other variables
The rank of a Matrix
Let us assume for now that we have many more samples than attributes. Is there an approach that can identify the number of linear relationships between the attributes purely from data? The rank of a matrix is the number of linearly independent rows or columns of the matrix. The rank is found using the rank command: rank(A)
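A sketch of this idea using NumPy, where `np.linalg.matrix_rank` plays the role of the `rank(A)` command above; the matrix is constructed so one column is a linear combination of the others:

```python
import numpy as np

# Example: the third column is the sum of the first two,
# so only 2 of the 3 columns are linearly independent
A = np.array([
    [1.0, 0.0, 1.0],
    [0.0, 1.0, 1.0],
    [2.0, 3.0, 5.0],
    [1.0, 1.0, 2.0],
])

print(np.linalg.matrix_rank(A))   # 2
```

With 3 attributes and rank 2, the data contains 3 − 2 = 1 linear relationship among the attributes.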
Null Space: The Idea
Notice that if Aβ = 0, every row of A multiplied by β gives zero. Since each row is a sample, the same linear relationship among the variable values holds in every sample. Each basis vector of the null space corresponds to one linear relationship.
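One way to sketch this is to recover a null-space vector β from the SVD (the right singular vector for the smallest singular value), using the same kind of constructed matrix as before:

```python
import numpy as np

# Data matrix whose third column equals the sum of the first two
A = np.array([
    [1.0, 0.0, 1.0],
    [0.0, 1.0, 1.0],
    [2.0, 3.0, 5.0],
])

# The right singular vector for the (near-)zero singular value spans the null space
_, s, Vt = np.linalg.svd(A)
beta = Vt[-1]

print(np.allclose(A @ beta, 0))   # True: A @ beta is (numerically) zero
print(beta / beta[0])             # proportional to [1, 1, -1]: col1 + col2 - col3 = 0
```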
To summarize, available data can be expressed in the form of a data matrix, and a data matrix can be further used to perform different types of operations. The null space is defined as the collection of vectors β that satisfy Aβ = 0. We have taken a look at the importance of linear algebra and its role in machine learning. Matrices can also be used to store the coefficients of several equations, which can then be processed for further use.
The notion of rank gives the number of independent variables or samples. We have touched upon some of the critical features of machine learning development, along with a real-world example that shows the importance and power of machine learning. Start improving your machine learning skills by understanding linear algebra.