Matrices are built from vectors. Vectors, here meaning quantities described by two distinct numbers, have been used to describe forces since the time of Aristotle. The modern development of vectors, however, began with the geometric representation of complex numbers by Argand and Wessel (Knott, 1978). For example, $\begin{bmatrix} a \\ b \end{bmatrix}$ denotes a vector, but it can equally represent the complex number $a+bi$. Such vectors and complex numbers are referred to as "doubles" because each is expressed by two distinct numbers, not necessarily drawn from the same set. Over time, mathematicians sought a three-dimensional analogue of complex numbers, one expressed in triples (values given by three distinct numbers) rather than doubles.
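As a brief illustration in modern notation (not drawn from Argand's or Wessel's own writing), the correspondence is more than a matter of labels: adding two such vectors componentwise mirrors adding the corresponding complex numbers.
\[
\begin{bmatrix} a \\ b \end{bmatrix} + \begin{bmatrix} c \\ d \end{bmatrix}
= \begin{bmatrix} a+c \\ b+d \end{bmatrix}
\qquad \text{just as} \qquad
(a+bi) + (c+di) = (a+c) + (b+d)i .
\]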
W. R. Hamilton (born in 1805) discovered that quadruples, rather than the expected triples, were needed to extend the idea of complex numbers to three dimensions. He also realized that their multiplication, while associative, was not commutative. Struggling to find a notation for the relationships between vectors and scalars, Sylvester coined the word "matrix" and defined it in 1850. Before Sylvester, mathematicians had worked chiefly with determinants, so "matrix" was naturally defined as the array of numbers taken as a whole (Knott, 1978). While determinants and their theory inspired work with matrices, Arthur Cayley (born in 1821) was the first to discuss matrices and their properties without focusing on the determinant. Cayley defined matrix equality, matrix sums, scalar products, and the use of matrices to describe systems of linear equations; he also defined matrix multiplication and inverses (Feldmann, 1962).
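To make these two contributions concrete (an illustrative sketch in modern notation, not taken from Hamilton's or Cayley's original papers): Hamilton's quaternion units $i$, $j$, $k$ satisfy
\[
i^2 = j^2 = k^2 = ijk = -1, \qquad ij = k, \qquad ji = -k,
\]
so the order of multiplication matters; and Cayley's matrix product allows a system of linear equations to be written as a single matrix equation,
\[
\begin{cases} ax + by = e \\ cx + dy = f \end{cases}
\quad\Longleftrightarrow\quad
\begin{bmatrix} a & b \\ c & d \end{bmatrix}
\begin{bmatrix} x \\ y \end{bmatrix}
=
\begin{bmatrix} e \\ f \end{bmatrix}.
\]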