Since both $A$ and $B$ are of the same order, it is possible to add them, and the result will be a matrix of the same order, that is, $2 \times 3$. That is,
$A + B = C = [c_{i,j}]_{2\times 3}$
Each element of the resultant matrix $C$ is calculated by adding the corresponding elements of $A$ and $B$, that is, $c_{i,j} = a_{i,j} + b_{i,j}$
$= \begin{bmatrix} k \cdot a_{1, 1} & k \cdot a_{1, 2} & \cdots & k \cdot a_{1, n} \\ k \cdot a_{2, 1} & k \cdot a_{2, 2} & \cdots & k \cdot a_{2, n} \\ \vdots & \vdots & \ddots & \vdots \\ k \cdot a_{m, 1} & k \cdot a_{m, 2} & \cdots & k \cdot a_{m, n} \end{bmatrix}$
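Both operations are element-wise, so they translate directly into code. Below is a minimal sketch using plain Python lists; the helper names (`mat_add`, `scalar_mul`) and the example values are our own, chosen for illustration:

```python
# Element-wise addition and scalar multiplication of matrices,
# represented as nested lists (one inner list per row).

def mat_add(A, B):
    """Add two matrices of the same order, element by element."""
    return [[a + b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(A, B)]

def scalar_mul(k, A):
    """Multiply every element of A by the scalar k."""
    return [[k * a for a in row] for row in A]

A = [[1, 2, 3],
     [4, 5, 6]]
B = [[6, 5, 4],
     [3, 2, 1]]

print(mat_add(A, B))     # [[7, 7, 7], [7, 7, 7]]
print(scalar_mul(2, A))  # [[2, 4, 6], [8, 10, 12]]
```

Note that both results have the same order as the inputs, matching the rule above.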
III. Properties Of Scalar Multiplication
Distributive Over Addition/Subtraction
That is, $k(A_{m,n} \pm B_{m,n}) = kA_{m,n} \pm kB_{m,n}$
Commutative
That is, $kA_{m,n} = A_{m,n} k$
Associative
That is, $p \cdot (q \cdot A_{m,n}) = (p \cdot q) \cdot A_{m,n}$
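The three properties above can be checked numerically. This is a sketch with small example matrices of our own choosing (commutativity, $kA = Ak$, is immediate in this representation, since the scalar multiplies each element either way):

```python
# Checking the distributive and associative properties of scalar
# multiplication on small example matrices.

def scalar_mul(k, A):
    return [[k * a for a in row] for row in A]

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, -2], [3, 4]]
B = [[0, 5], [-1, 2]]
k, p, q = 3, 2, 5

# Distributive over addition: k(A + B) == kA + kB
assert scalar_mul(k, mat_add(A, B)) == mat_add(scalar_mul(k, A),
                                               scalar_mul(k, B))

# Associative: p(qA) == (pq)A
assert scalar_mul(p, scalar_mul(q, A)) == scalar_mul(p * q, A)

print("both properties hold for this example")
```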
-----------book page break-----------
IV. Matrix Multiplication
Now we will see how to multiply two matrices. Two matrices can be multiplied if, and only if, the number of columns of the first matrix is equal to the number of rows in the second matrix. If we have two matrices
$A = \begin{bmatrix}a_{i,j}\end{bmatrix}_{m \times n}$ and $B = \begin{bmatrix}b_{i,j}\end{bmatrix}_{p \times q}$,
it is possible to evaluate $A \times B$ only if $n = p$, and the result will be a matrix $P$ of order $m \times q$. Similarly, it is possible to evaluate $B \times A$ only if $q = m$, and the resultant matrix will be of order $p \times n$.
Since the number of columns in the first matrix is the same as the number of rows in the second matrix, each row of the first matrix will contain the same number of elements as each column of the second matrix. Each element of the resulting matrix is obtained by multiplying the elements of the corresponding row of the first matrix and the corresponding column of the second matrix, and taking the sum of these products.
Let's take an example and understand this better.
Let us consider matrix $A$ as a $3 \times 2$ matrix and matrix $B$ as a $2 \times 4$ matrix. Observe that the number of columns of $A$ is $2$ and the number of rows of $B$ is also $2$; hence we should be able to multiply these two matrices.
$A = \begin{bmatrix}3 & -1 \\ 2 & 4 \\ -3 & 5\end{bmatrix}$ this is a matrix with $3$ rows and $2$ columns
$B = \begin{bmatrix}-2 & 5 & -1 & 3 \\ 3 & 6 & -4 & 1\end{bmatrix}$ this is a matrix with $2$ rows and $4$ columns
-----------book page break-----------
As stated before, the resulting matrix will have $3$ rows and $4$ columns.
So, let us write the complete expression with the resultant matrix filled with variables:
$A \times B = P = \begin{bmatrix} p_{1,1} & p_{1,2} & p_{1,3} & p_{1,4} \\ p_{2,1} & p_{2,2} & p_{2,3} & p_{2,4} \\ p_{3,1} & p_{3,2} & p_{3,3} & p_{3,4} \end{bmatrix}$
Now we will see how to find the values of each element of the result matrix $P$.
To find the value of the top-left corner element $p_{1,1}$, we will need to take the elements of the $1\xasuper{st}$ row of $A$ and the $1\xasuper{st}$ column of $B$ as shown in the step below (the selected row and column are highlighted in blue):
Thus we get $p_{1,1} = (3)\times(-2) + (-1) \times (3) = -6 + (-3) = -9$
Now we can replace $p_{1,1}$ with $-9$ and move on to finding $p_{1,2}$
To find $p_{1,2}$ we need to select the $1\xasuper{st}$ row of $A$ and the $2\xasuper{nd}$ column of $B$, as highlighted below in blue, and perform the same multiplication of corresponding elements and take the sum, as follows:
$p_{1,2} = (3)\times(5) + (-1)\times(6) = 15 + (-6) = 9$
Now that we have understood the process to compute the product of two matrices, let us answer the question:
Why must the first matrix have the same number of columns as the second matrix has rows?
The answer should be clear by now: if the number of columns in the first matrix were different from the number of rows in the second matrix, then the rows of the first matrix would contain a different number of elements than the columns of the second matrix, and it would not be possible to multiply corresponding elements and take their sum.
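The row-by-column rule described above translates directly into code. Below is a minimal sketch using plain Python lists and the matrices $A$ and $B$ from the example; the helper name `mat_mul` is our own:

```python
# Matrix multiplication by the row-by-column rule.

def mat_mul(A, B):
    """Multiply A (m x n) by B (n x q); p[i][j] is the sum of products
    of row i of A with column j of B."""
    m, n, q = len(A), len(B), len(B[0])
    if len(A[0]) != n:
        raise ValueError("columns of A must equal rows of B")
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(q)]
            for i in range(m)]

A = [[3, -1], [2, 4], [-3, 5]]
B = [[-2, 5, -1, 3], [3, 6, -4, 1]]

P = mat_mul(A, B)          # a 3 x 4 matrix
print(P[0][0], P[0][1])    # -9 9, matching the hand computations

try:
    mat_mul(B, A)          # B has 4 columns, A has 3 rows
except ValueError:
    print("BA does not exist")
```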
Similarly, if we want to test the commutative property of matrix multiplication, we can try to multiply $B A$ from our example.
Now we can see that each row of the first matrix contains $4$ elements while each column of the second matrix contains $3$ elements. Hence, it is not possible to perform the row-by-column multiplication; therefore, $BA$ does not exist.
In general, matrix multiplication is not commutative.
Now that you have understood how to multiply two matrices, you can use the widget below to do some rapid practice of matrix multiplication.
--------- Reference to widget: 20bd18ed-be06-447e-aca2-0dca9fd56b6d ---------
-----------book page break-----------
V. Division
The concept of division, in the traditional number division sense, does not exist for matrices.
However, there is the concept of multiplication by the inverse, which yields a similar result to division. For example, if you have two numbers $x$ and $y$ such that $x \cdot y = 1$, then $y = \dfrac{1}{x} = x^{-1}$
Similarly, the inverse of a square matrix is a matrix which, when multiplied with the original matrix, gives the identity matrix.
That is, if,
$A \times B = I$, where $I$ is the identity matrix, then $B$ is called the inverse of $A$ and is denoted by $A^{-1}$
Note, the above multiplication is valid only if $A$ and $B$ are square matrices of the same order.
VI. Commutative Property
Matrix multiplication is not commutative in general. That is $AB \ne BA$.
-----------book page break-----------
However, there are a few special cases where the commutative property holds. We will list some of those cases here.
If both matrices are square matrices, and one of the matrices is a scalar matrix (that is, the identity matrix multiplied by a scalar), then the multiplication is commutative, that is:
$A \times S = S \times A$ where $S$ is a scalar matrix.
In effect, the multiplication reduces to a simple scalar multiplication, and is therefore commutative.
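This special case can be checked numerically. Below is a sketch with an example matrix and scalar of our own choosing, where $S = k \cdot I$:

```python
# A scalar matrix S = k*I commutes with any square matrix A
# of the same order.

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

k = 4
A = [[1, 2], [3, 4]]
S = [[k, 0], [0, k]]   # scalar matrix: the identity multiplied by k

print(mat_mul(A, S))   # [[4, 8], [12, 16]]
print(mat_mul(S, A))   # [[4, 8], [12, 16]] -- the same result
```

Both products equal $k \cdot A$, which is exactly the scalar multiplication $kA$.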
The other exception is multiplication by inverse. That is, if for a square matrix $A$, its inverse $A^{-1}$ exists, then,
$A \times A^{-1} = A^{-1} \times A = I$
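For a $2 \times 2$ matrix, the inverse has a simple closed form (swap the diagonal entries, negate the off-diagonal entries, and divide by the determinant), which lets us verify this property directly. The helper names and example matrix below are our own; the example is chosen with determinant $1$ so the arithmetic stays exact:

```python
# Verifying A * A^{-1} = A^{-1} * A = I for a 2x2 matrix.

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def inverse_2x2(A):
    """Closed-form inverse of a 2x2 matrix."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is not invertible")
    return [[d / det, -b / det],
            [-c / det, a / det]]

A = [[2, 1], [1, 1]]       # determinant = 2*1 - 1*1 = 1
A_inv = inverse_2x2(A)

print(mat_mul(A, A_inv))   # [[1.0, 0.0], [0.0, 1.0]] -- the identity
print(mat_mul(A_inv, A))   # [[1.0, 0.0], [0.0, 1.0]] -- same, either order
```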
-----------book page break-----------
VII. Matrix Exponentiation
We saw in the previous section how to multiply two matrices. In this section we will see how to compute the power of a matrix.
It is possible to calculate the $n\xasuper{th}$ power of a matrix $A$ only if it is a square matrix. We will soon find out why, but first let us understand how.
Just as exponentiation of a scalar value is defined as repeated multiplication of the value by itself, matrix exponentiation is defined as repeated multiplication of the matrix by itself. Now let us take an example of this.
Notice that we have computed the product of the three matrices starting with the leftmost matrix. Matrix multiplication is associative; therefore, if we had multiplied the second and the third matrix first, we would have gotten the same result.
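The associativity claim can be checked by computing $A^3$ both ways. This is a sketch with an example square matrix of our own choosing:

```python
# Computing A^3 by repeated multiplication; because matrix
# multiplication is associative, grouping from the left or the
# right gives the same result.

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 1], [0, 1]]

left  = mat_mul(mat_mul(A, A), A)   # (A * A) * A
right = mat_mul(A, mat_mul(A, A))   # A * (A * A)

print(left)           # [[1, 3], [0, 1]]
print(left == right)  # True
```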
-----------book page break-----------
Now let us understand why we cannot compute the power of a matrix $A_{m \times n}$ which is not a square matrix, that is, $m \ne n$
Here we can see that the number of columns in the first matrix is $3$ and the number of rows in the second matrix is $2$; hence they do not fulfil the requirement that the number of columns in the first matrix be equal to the number of rows in the second matrix. Therefore, $A^2$ does not exist.
Therefore, we can say that a matrix can only be multiplied by itself if it is a square matrix.
VIII. Matrix Transpose
Given a matrix $A_{m \times n}$ of order $m \times n$, if we change all its rows into columns, the resultant matrix will be a matrix of order $n \times m$. This matrix is called the transpose of $A$ and is denoted by $A^T$.
If we take $A = \begin{bmatrix}1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}$, then $A^T = \begin{bmatrix}1 & 4 \\ 2 & 5 \\ 3 & 6 \end{bmatrix}$
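Since transposing turns each row into a column, it is a one-line operation on a list-of-rows representation. A minimal sketch, using the matrix $A$ from the example (the helper name `transpose` is our own):

```python
# Transposing a matrix: rows become columns. zip(*A) groups the
# i-th element of every row together, which is exactly column i.

def transpose(A):
    return [list(col) for col in zip(*A)]

A = [[1, 2, 3],
     [4, 5, 6]]

print(transpose(A))   # [[1, 4], [2, 5], [3, 6]]
```

Note that transposing twice returns the original matrix, i.e. $(A^T)^T = A$.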