Roland van der Veen

Basic Notion: Dual of a vector space

Abstract: Some topics we may touch on: Is the dual of the dual the same as the original? How do you raise an index? What is the transpose, and why not just pick a metric/inner product? What is the Dirac delta function? What happens when you tensor with the dual?


Linear algebra is fundamental to the modern description of much of mathematics and physics, especially geometry, relativity and quantum theory. As such it should be a fitting start for a basic notions seminar in mathematical physics, especially because on the surface the notation for vectors and other linear algebraic concepts tends to vary wildly: from linear functionals, to bra-ket notation, to row and column vectors. Combining vectors to make tensors only makes the confusion worse, of course.


Close to the root of the confusing notation seems to be the concept of the dual V* of a vector space V. Dual vectors are real-valued linear functions on our vector space, but how should one describe those? In practice one can always hide the dual vectors by picking a basis and using the corresponding delta functions as a dual basis. This way the dual is mostly hidden, but it will reappear in the form of the transpose of a matrix and the distinction between row vectors and column vectors. If one has an inner product on V, this approach is backed up by viewing a dual vector as the operation of taking the inner product with a fixed vector. This viewpoint explains Dirac's bra-ket notation, where the inner product is written as a bracket.
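To make this concrete, here is a minimal sketch in coordinates, assuming V = R^3 with the standard basis and the standard dot product; the particular numbers are purely illustrative. It shows a dual vector as a row vector, the dual basis picking out coordinates, and the inner product turning a vector ("ket") into a covector ("bra") by transposing.

```python
import numpy as np

# A vector is a column, a dual vector (covector) is a row:
# evaluation f(v) is just matrix multiplication, giving a 1x1 result.
v = np.array([[1.0], [2.0], [3.0]])      # element of V
f = np.array([[4.0, 0.0, -1.0]])         # element of V*, written as a row
print(f @ v)                             # f(v) = 4*1 + 0*2 + (-1)*3 = 1

# The dual basis e^i satisfies e^i(e_j) = delta_ij; in these coordinates
# the dual basis vectors are simply the rows of the identity matrix.
E_dual = np.eye(3)
print(E_dual @ v)                        # picks out the coordinates of v

# With the standard inner product, every vector w determines a covector
# <w, .>; concretely, passing from the "ket" w to its "bra" is transposing.
w = np.array([[0.0], [1.0], [1.0]])
bra_w = w.T
print(bra_w @ v)                         # <w, v> = 0*1 + 1*2 + 1*3 = 5
```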


So far the dual vector space V* sounds much like the original V, and the distinction may seem largely academic: passing from bras to kets or from rows to columns, there should be a simple identification between V and V*. However, this is far from the truth. By a variant of Cantor's diagonal argument one shows that the cardinality of V* is larger than that of V whenever the dimension of V is infinite (as it often is). An example of this phenomenon that appears in my own research is the case where the vector space V is the space Q[x] of polynomials with rational coefficients. In this case V* is the space Q[[x]] of power series in x: a linear function on the space of polynomials is determined by where it sends the monomials 1, x, x^2, x^3, x^4, ..., and all those monomials are independent. Packaging the values of the function on the monomials into a generating function naturally yields a power series. Going a step further, one starts to see Feynman-diagram-type calculations appearing when computing with linear maps between polynomial rings. In this way my research uses ideas from quantum field theory to deal with algebra.
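A minimal sketch of this pairing, assuming V = Q[x] and using sympy for exact rational arithmetic; the helper functional_from_coeffs is a hypothetical name introduced just for illustration. A functional is stored through its values c_0, c_1, c_2, ... on the monomials, i.e. through the coefficients of the power series sum c_n x^n in Q[[x]].

```python
from sympy import Rational, Poly, symbols

x = symbols('x')

# A linear functional on Q[x] is determined by its values c_n on the
# monomials x^n; we model it by the coefficient function n -> c_n of a
# formal power series sum c_n x^n.
def functional_from_coeffs(c):
    """Return the functional p -> sum_n c(n) * (coefficient of x^n in p)."""
    def phi(p):
        coeffs = Poly(p, x).all_coeffs()[::-1]    # p_0, p_1, ..., p_deg
        return sum(c(n) * pn for n, pn in enumerate(coeffs))
    return phi

# Example: evaluation at x = 2 is the functional p -> p(2); its generating
# series is sum 2^n x^n, i.e. c_n = 2^n.
eval_at_2 = functional_from_coeffs(lambda n: Rational(2) ** n)
p = 3 + x + 5 * x**2
print(eval_at_2(p))        # 3 + 2 + 20 = 25
print(p.subs(x, 2))        # agrees with direct evaluation
```

The sum defining the pairing is always finite, because a polynomial has only finitely many nonzero coefficients, while the sequence of values c_n can be completely arbitrary; this is exactly why Q[[x]] is so much bigger than Q[x].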


This blog post was written subsequent to a presentation in the Basic Notions seminar. For more information about this seminar, go to the seminar's webpage.
