(Work in progress)
A Taylor series is a polynomial (finite or infinite) in integer powers of its variable. A square matrix can be multiplied with itself and yields a matrix of the same dimensions, so we should be able to form Taylor series with matrices in place of real or complex values, resulting again in a matrix of the same dimensions.
We therefore want to examine functions that provide us with series of the following form:
$$
\begin{aligned}
f(X) = \sum_{j=0}^\infty C_j \cdot {{X^j} \over j!}
\end{aligned}
$$
where $X$ and the coefficients $C_j$ are square matrices of the same dimensions.
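As a quick sanity check of this definition, here is a minimal pure-Python sketch that evaluates such a series truncated after a fixed number of terms. The function name `matrix_series` and the 30-term cutoff are illustrative choices, not part of the text:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def identity(n):
    return [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

def matrix_series(coeff, X, terms=30):
    """Evaluate sum_{j=0}^{terms-1} C_j * X^j / j! (truncated series).

    `coeff(j)` returns the coefficient matrix C_j for term j."""
    n = len(X)
    power = identity(n)          # X^0 = I
    factorial = 1.0
    total = [[0.0] * n for _ in range(n)]
    for j in range(terms):
        if j > 0:
            power = mat_mul(power, X)
            factorial *= j
        term = mat_mul(coeff(j), power)
        total = [[total[r][c] + term[r][c] / factorial for c in range(n)]
                 for r in range(n)]
    return total

# With C_j = I for all j this is the exponential series; for the 1x1
# matrix X = [[1.0]] it should approximate e.
approx_e = matrix_series(lambda j: identity(1), [[1.0]])[0][0]
```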
Exponential Function
Let's see whether the usual Taylor series of the exponential function also works with matrices:
$$
\begin{aligned}
e^X & = \sum_{j=0}^\infty {{X^j} \over j!}\\
& = I + X + {{X^2} \over 2!} + {{X^3} \over 3!} + {{X^4} \over 4!} + \cdots + {{X^n} \over n!} + \cdots\\
\end{aligned}
$$
Here $I$ is the identity matrix: ones on the main diagonal (top left to bottom right) and zeros everywhere else.
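This series can be sketched in a few lines of pure Python. The helper name `expm_series` and the 30-term cutoff are my own illustrative choices; each term is built incrementally from the previous one as $X^j/j! = \left(X^{j-1}/(j-1)!\right) \cdot X / j$:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm_series(X, terms=30):
    """Approximate e^X = sum_j X^j / j! by truncating after `terms` terms."""
    n = len(X)
    # j = 0 term: X^0 / 0! = I
    term = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    total = [row[:] for row in term]
    for j in range(1, terms):
        term = mat_mul(term, X)                        # X^j / (j-1)!
        term = [[v / j for v in row] for row in term]  # X^j / j!
        total = [[total[r][c] + term[r][c] for c in range(n)]
                 for r in range(n)]
    return total

# A 1x1 matrix reduces to the ordinary exponential: e^[[1]] ≈ [[e]].
approx = expm_series([[1.0]])[0][0]
```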
If we set $X = [x]$ then it reduces to the normal exponential function. If we use $X = \left[ \begin{array}{cc} 0 & -x\\ x & 0 \end{array} \right]$, which represents the imaginary number $xi$,
and if this works out, then we should get $\left[ \begin{array}{cc} \cos(x) & -\sin(x)\\ \sin(x) & \cos(x) \end{array} \right]$ as the result:
$$
\begin{aligned}
I & = \left[ \begin{array}{cc} 1 & 0\\ 0 & 1 \end{array} \right]\\
X & = \left[ \begin{array}{cc} 0 & -x\\ x & 0 \end{array} \right]\\
X^2 & = \left[ \begin{array}{cc} -x^2 & 0\\ 0 & -x^2 \end{array} \right]\\
X^3 & = \left[ \begin{array}{cc} 0 & x^3\\ -x^3 & 0 \end{array} \right]\\
X^4 & = \left[ \begin{array}{cc} x^4 & 0\\ 0 & x^4 \end{array} \right]\\
X^j & = \left[ \begin{array}{cc} 0 & (-1)^{{j+1} \over 2} x^j\\ (-1)^{{j-1} \over 2} x^j & 0 \end{array} \right] \, \mbox{when $j$ is odd}\\
X^j & = \left[ \begin{array}{cc} (-1)^{j \over 2} x^j & 0\\ 0 & (-1)^{j \over 2} x^j \end{array} \right] \, \mbox{when $j$ is even}\\
e^{\left[ \begin{array}{cc} 0 & -x\\ x & 0 \end{array} \right]} & = \left[ \begin{array}{cc} 1 & 0\\ 0 & 1 \end{array} \right]
+ \left[ \begin{array}{cc} 0 & -x\\ x & 0 \end{array} \right] +
{{\left[ \begin{array}{cc} -x^2 & 0\\ 0 & -x^2 \end{array} \right]} \over 2!} +
{{\left[ \begin{array}{cc} 0 & x^3\\ -x^3 & 0 \end{array} \right]} \over 3!} +
{{\left[ \begin{array}{cc} x^4 & 0\\ 0 & x^4 \end{array} \right]} \over 4!} + \cdots + {{X^n} \over n!} + \cdots\\
& = \left[ \begin{array}{cc} 1-{{x^2} \over 2!} +{{x^4}\over 4!}-\cdots & -x+{{x^3} \over 3!} -{{x^5} \over 5!}+ \cdots\\ x-{{x^3} \over 3!} +{{x^5} \over 5!}- \cdots & 1-{{x^2} \over 2!} +{{x^4}\over 4!}-\cdots \end{array} \right]\\
& = \left[ \begin{array}{cc} \cos(x) & -\sin(x) \\ \sin(x) & \cos(x) \end{array} \right]
\end{aligned}
$$
Thus the exponential series also works for $2 \times 2$ matrices, reproducing Euler's formula in matrix form.
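The derivation can also be checked numerically. This sketch evaluates the truncated exponential series (the helper `expm_series` is my own, not a standard library function) for $X = \left[\begin{array}{cc} 0 & -x\\ x & 0 \end{array}\right]$ and compares the result against the rotation matrix entries:

```python
import math

def mat_mul(A, B):
    """Multiply two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm_series(X, terms=30):
    """Approximate e^X = sum_j X^j / j! by truncation."""
    n = len(X)
    term = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    total = [row[:] for row in term]
    for j in range(1, terms):
        term = [[v / j for v in row] for row in mat_mul(term, X)]
        total = [[total[r][c] + term[r][c] for c in range(n)]
                 for r in range(n)]
    return total

x = 0.7
E = expm_series([[0.0, -x], [x, 0.0]])
# Expect E ≈ [[cos x, -sin x], [sin x, cos x]].
max_error = max([abs(E[0][0] - math.cos(x)), abs(E[0][1] + math.sin(x)),
                 abs(E[1][0] - math.sin(x)), abs(E[1][1] - math.cos(x))])
```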
Similarly we can show that we get the matrix functions for $\log(X+I)$, $\cos(X)$, $\sin(X)$, $\arctan(X)$, $\sinh(X)$, $\cosh(X)$, $\operatorname{arctanh}(X)$ etc. More of this will be added over time.
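As one example of the pattern, $\cos(X) = \sum_{j=0}^\infty (-1)^j X^{2j}/(2j)!$ can be evaluated the same way. This pure-Python sketch (the helper name `cosm_series` and the 20-term cutoff are my own choices) checks the $1 \times 1$ case against the ordinary cosine:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def cosm_series(X, terms=20):
    """Approximate cos(X) = sum_j (-1)^j X^(2j) / (2j)! by truncation."""
    n = len(X)
    X2 = mat_mul(X, X)
    # j = 0 term: identity matrix
    term = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    total = [row[:] for row in term]
    for j in range(1, terms):
        # term_j = -term_{j-1} * X^2 / ((2j-1) * 2j)
        term = mat_mul(term, X2)
        scale = -1.0 / ((2 * j - 1) * (2 * j))
        term = [[v * scale for v in row] for row in term]
        total = [[total[r][c] + term[r][c] for c in range(n)]
                 for r in range(n)]
    return total

# The 1x1 case reduces to the ordinary cosine: cos([[1]]) ≈ [[cos(1)]].
approx = cosm_series([[1.0]])[0][0]
```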