A vector is an ordered list of numbers. It can represent a quantity with both magnitude and direction.
Example of a 2D vector: \(\vec{v} = (3, 4)\)
A matrix is a rectangular array of numbers arranged in rows and columns.
Example: $$ A = \begin{bmatrix}1 & 2 \\ 3 & 4\end{bmatrix} $$
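As a minimal sketch (assuming NumPy, and reusing the values from the examples above), a vector maps to a 1D array and a matrix to a 2D array:

```python
import numpy as np

# The 2D vector v = (3, 4) as a 1D array
v = np.array([3, 4])

# The 2x2 matrix A from the example above
A = np.array([[1, 2],
              [3, 4]])

print(v.shape)  # (2,)   -- one index: a vector
print(A.shape)  # (2, 2) -- two indices: a matrix
print(A @ v)    # matrix-vector product: [11 25]
```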
An eigenvector \(\vec{v}\) of a square matrix \(A\) is a nonzero vector that \(A\) only scales, and the scaling factor \(\lambda\) is the corresponding eigenvalue. Together they satisfy: $$A \vec{v} = \lambda \vec{v}$$
Eigenvalues are central to many algorithms. For example, the Variational Quantum Eigensolver (VQE) estimates the smallest eigenvalue (the ground-state energy) of a Hamiltonian.
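A minimal sketch of computing eigenvalues numerically with NumPy's `np.linalg.eig`, reusing the matrix \(A\) above:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns
# are the corresponding (unit-norm) eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

# Check A v = lambda v for the first eigenpair
lam = eigenvalues[0]
v = eigenvectors[:, 0]
print(np.allclose(A @ v, lam * v))  # True
```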
Tensors generalize vectors (1D) and matrices (2D) to higher dimensions.
Tensors are fundamental to understanding data structures in computing and physics. In quantum computing, they relate to how we represent multiple qubits.
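As a small illustrative sketch (assuming NumPy arrays as the representation), the number of indices needed to address an entry is what grows as we move from vectors to matrices to higher-order tensors; a multi-qubit state can be viewed either as one long vector or as a tensor with one index per qubit:

```python
import numpy as np

vector = np.array([1, 0])        # 1 index
matrix = np.eye(2)               # 2 indices
tensor = np.zeros((2, 2, 2))     # 3 indices

# A 3-qubit state: a length-2**3 vector, or equivalently a rank-3 tensor
state = np.zeros(2 ** 3)
state[0] = 1.0                   # amplitude of |000>
print(state.reshape(2, 2, 2).shape)  # (2, 2, 2)
```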
The tensor (Kronecker) product of two matrices \(A\) and \(B\) is denoted \(A \otimes B\).
Example: $$ \begin{bmatrix}a & b \\ c & d\end{bmatrix} \otimes \begin{bmatrix}e & f \\ g & h\end{bmatrix} = \begin{bmatrix} ae & af & be & bf \\ ag & ah & bg & bh \\ ce & cf & de & df \\ cg & ch & dg & dh \end{bmatrix} $$
Tensor products are used to build larger spaces from smaller components. In quantum computing, they describe how multi-part systems combine.
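NumPy's `np.kron` implements the Kronecker product shown above; here is a minimal sketch with concrete numbers in place of the symbolic entries (the matrices are illustrative choices, not taken from the text):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 5],
              [6, 7]])

# Each entry a_ij of A is replaced by the 2x2 block a_ij * B,
# giving a 4x4 result
print(np.kron(A, B))
# [[ 0  5  0 10]
#  [ 6  7 12 14]
#  [ 0 15  0 20]
#  [18 21 24 28]]
```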
\((6,4)\)
\(1 \cdot 4 + 2 \cdot 5 + 3 \cdot 6 = 32\)
\(2\) and \(3\)
\((0,1,0,0)\)
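Two of the results above are fully specified and can be checked numerically; a minimal sketch, assuming the last answer is the tensor product \(|0\rangle \otimes |1\rangle\) of the basis vectors \((1,0)\) and \((0,1)\):

```python
import numpy as np

# Dot product (1, 2, 3) . (4, 5, 6) = 1*4 + 2*5 + 3*6
print(np.dot([1, 2, 3], [4, 5, 6]))  # 32

# Assumed exercise: |0> tensor |1>, with |0> = (1, 0) and |1> = (0, 1)
print(np.kron([1, 0], [0, 1]))       # [0 1 0 0]
```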