Properties of the Projection Operator

Since chapter 7.2 of the textbook treats the projection operator very concisely, I’d like to show some of the properties of this operator. Hopefully this will help in understanding them a bit better.

Projection Operators are Hermitian

The projection operator \(\mathrm{P}\) onto a normalized state \(\ket{\psi}\) is defined as \[ \mathrm{P} = \ket{\psi} \bra{\psi} \]

Using the rule \((AB)^\dagger = B^\dagger A^\dagger\) we get:

\[ \mathrm{P}^\dagger = (\ket{\psi} \bra{\psi})^\dagger = \bra{\psi}^\dagger \ket{\psi}^\dagger = \ket{\psi} \bra{\psi} = \mathrm{P} \]

Hence \(\mathrm{P}\) is Hermitian.
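
To make this concrete, here is a minimal NumPy check (the specific vector \(\ket{\psi}\) is just an illustrative choice, not from the textbook):

```python
import numpy as np

# A normalized example state |psi> in C^3 (an arbitrary illustrative choice)
psi = np.array([1.0, 1.0j, 1.0])
psi = psi / np.linalg.norm(psi)

# P = |psi><psi| as the outer product of the column vector and its conjugate
P = np.outer(psi, psi.conj())

# Hermitian: P equals its own conjugate transpose
print(np.allclose(P, P.conj().T))  # True
```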

Eigenvectors and Eigenvalues

The vector \(\ket{\psi}\) is an eigenvector of its projection operator with eigenvalue \(1\): \[ \ket{\psi} \bra{\psi} \: \ket{\psi} = \ket{\psi} \underbrace{\braket{\psi | \psi }}_{=1} = 1 \ket{\psi} \]

since \(\ket{\psi}\) is normalized.

Acting with \(\mathrm{P}\) on a vector \(\ket{\phi}\) that is orthogonal to \(\ket{\psi}\) yields: \[ \ket{\psi} \bra{\psi} \: \ket{\phi} = \ket{\psi} \underbrace{\braket{\psi | \phi }}_{=0} = 0 \ket{\psi} \] Hence \(\ket{\phi}\) is an eigenvector with eigenvalue \(0\).
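
Both eigenvalue cases can be verified numerically with the same example vector as above:

```python
import numpy as np

psi = np.array([1.0, 1.0j, 1.0])
psi = psi / np.linalg.norm(psi)
P = np.outer(psi, psi.conj())

# |psi> itself is an eigenvector with eigenvalue 1
print(np.allclose(P @ psi, psi))              # True

# A vector orthogonal to |psi> is sent to the zero vector (eigenvalue 0)
phi = np.array([1.0, 0.0, -1.0]) / np.sqrt(2)
print(np.vdot(psi, phi))                      # 0, i.e. orthogonal to psi
print(np.allclose(P @ phi, np.zeros(3)))      # True

# The full spectrum: one eigenvalue 1, the rest 0
print(np.linalg.eigvalsh(P))                  # approximately [0, 0, 1]
```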

Square of a Projection Operator

If you perform a projection twice, nothing changes compared to performing it once; the projection operator is idempotent. \[ \mathrm{P}^2 = (\ket{\psi} \bra{\psi})(\ket{\psi} \bra{\psi}) = \ket{\psi} \underbrace{\braket{\psi | \psi}}_{=1} \bra{\psi} = \ket{\psi} \bra{\psi} = \mathrm{P} \]
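
A quick numerical check of this property, again with the assumed example vector:

```python
import numpy as np

psi = np.array([1.0, 1.0j, 1.0])
psi = psi / np.linalg.norm(psi)
P = np.outer(psi, psi.conj())

# Projecting twice gives the same operator as projecting once: P^2 = P
print(np.allclose(P @ P, P))  # True
```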

Trace of a Projection Operator

The trace (the sum of the diagonal elements) of such a projection operator is \(1\).

The reason for this is given in the textbook: the trace of a Hermitian operator \(\mathrm{M}\) is the sum of its eigenvalues, which are the diagonal elements of the matrix after performing a basis transformation using the eigenvectors of \(\mathrm{M}\) as basis vectors.

While this is true, you need some background in linear algebra to follow this proof.

The following is a more elementary approach, which may help in understanding this property.

Consider a vector \[ \ket{A} = \begin{pmatrix} a_1 \\ a_2 \\ a_3 \end{pmatrix} \]

The corresponding bra is \[ \bra{A} = \begin{pmatrix} a^*_1 & a^*_2 & a^*_3 \end{pmatrix} \]

Assume \(\ket{A}\) to be normalized: \[ \braket{A|A} = a^*_1 a_1 + a^*_2 a_2 + a^*_3 a_3 = 1 \]

Hence, according to the rules of matrix multiplication, \[ \ket{A} \bra{A} = \begin{pmatrix} a_1 \\ a_2 \\ a_3 \end{pmatrix} \begin{pmatrix} a^*_1 & a^*_2 & a^*_3 \end{pmatrix} = \begin{pmatrix} a_1 a^*_1 & a_1 a^*_2 & a_1 a^*_3\\ a_2 a^*_1 & a_2 a^*_2 & a_2 a^*_3\\ a_3 a^*_1 & a_3 a^*_2 & a_3 a^*_3 \end{pmatrix} \]

\[ \mathrm{Tr} (\ket{A} \bra{A}) = a_1 a^*_1 + a_2 a^*_2 + a_3 a^*_3 = 1 \]

This holds true in any dimension, and since the trace is independent of the chosen basis, in any basis.
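
For example, a short NumPy check with a randomly chosen, normalized example vector:

```python
import numpy as np

rng = np.random.default_rng(0)

# A randomly chosen example vector |A> in C^5, normalized so that <A|A> = 1
A = rng.normal(size=5) + 1j * rng.normal(size=5)
A = A / np.linalg.norm(A)

# The diagonal of |A><A| contains the terms a_k a_k^*, so the trace is <A|A>
P = np.outer(A, A.conj())
print(np.trace(P))  # ~ 1 (up to floating point error)
```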

Sum of Projection Operators

Using an orthonormal basis \(\ket{i}\), a ket vector \(\ket{A}\) can be written as

\[ \ket{A} = \sum_i a_i \ket{i} = \sum_i \braket{i|A} \ket{i} = \sum_i \ket{i} \braket{i|A} \] where the components \(a_i = \braket{i|A}\) are just numbers and can therefore be written on either side of \(\ket{i}\).

Formally we can factor out \(\ket{A}\): \[ \ket{A} = \sum_i \ket{i} \braket{i|A} = \Big( \sum_i \ket{i} \bra{i} \Big) \ket{A} \]

Since this holds for every vector \(\ket{A}\), the sum of the projection operators must be the identity operator \(\mathrm{I}\): \[ \sum_i \ket{i} \bra{i} = \mathrm{I} \]
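
As a numerical sanity check, here is a small NumPy sketch that builds an (arbitrarily chosen) orthonormal basis and verifies that the projectors onto its vectors sum to the identity:

```python
import numpy as np

rng = np.random.default_rng(42)

# Build an orthonormal basis of C^3: the columns of the unitary factor Q
# from a QR decomposition of a random complex matrix.
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
Q, _ = np.linalg.qr(M)

# Sum the projection operators |i><i| over all basis vectors |i>.
resolution = sum(np.outer(Q[:, i], Q[:, i].conj()) for i in range(3))

# The sum is the identity operator.
print(np.allclose(resolution, np.eye(3)))  # True
```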