We saw in the last update that the generating function Z[J] can be expressed as
Z[J] = e^{\frac{1}{2} J \cdot A^{-1} J}
(at least as long as we've normalized things so that Z[0] = 1). Now the wonderful thing is that this is something we can compute explicitly:
Z[J] = \sum_{n=0}^\infty \frac{\left( \frac{1}{2} A^{-1}_{ij} J^i J^j \right)^n}{n!} = \sum_{n=0}^\infty \frac{\left( A^{-1}_{ij} J^i J^j \right)^n}{2^n n!}
For example, in the one-dimensional case (taking A=1) we get
Z[J] = \sum_{n=0}^\infty \frac{J^{2n}}{2^n n!}
On the other hand, by the definition of the generating function we have
Z[J] = \sum_{n=0}^\infty \frac{\langle x^n \rangle}{n!} J^n
Comparing coefficients, we find
\frac{\langle x^{2n} \rangle}{(2n)!} = \frac{1}{2^n n!}
so that
\langle x^{2n} \rangle = \frac{(2n)!}{2^n n!}.
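We can sanity-check this formula numerically. Here is a quick Python sketch (my own addition; the grid size and integration cutoff are arbitrary choices) comparing the closed form against brute-force quadrature of the normalized Gaussian moments:

```python
import math

def gaussian_moment(n, grid=40000, half_width=10.0):
    """<x^n> for the weight e^{-x^2/2}, normalized so that <1> = 1
    (i.e. divided by sqrt(2*pi)), via the midpoint rule on [-10, 10]."""
    dx = 2 * half_width / grid
    total = 0.0
    for i in range(grid):
        x = -half_width + (i + 0.5) * dx
        total += x ** n * math.exp(-x * x / 2)
    return total * dx / math.sqrt(2 * math.pi)

for n in (1, 2, 3):
    exact = math.factorial(2 * n) / (2 ** n * math.factorial(n))
    print(2 * n, exact, round(gaussian_moment(2 * n), 6))
```

The even moments come out to 1, 3, 15, matching (2n)!/2^n n!, up to tiny quadrature error.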
Let's give a combinatorial description. Given 2n objects, in how many ways can we divide them into pairs? If we care about the order in which we pick the pairs, then we have
{2n \choose 2}{2n - 2 \choose 2} \cdots {2n-(2n-2) \choose 2} = \frac{(2n)!}{2^n}
Of course, there are n! ways of ordering the n pairs, so after dividing by this (to account for the overcounting) we get exactly the expression for \langle x^{2n} \rangle. This is the first case of Wick's theorem.
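The count is also easy to verify by brute force. The following Python sketch (an illustration I'm adding; the recursive enumeration is just one way to do it) lists all pairings of 2n objects and compares the count against (2n)!/2^n n!:

```python
import math

def pairings(items):
    """Recursively enumerate all ways to partition `items` into unordered pairs."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for i in range(len(rest)):
        # Pair the first element with each remaining element in turn,
        # then recurse on what is left over.
        for tail in pairings(rest[:i] + rest[i + 1:]):
            yield [(first, rest[i])] + tail

for n in range(1, 6):
    count = sum(1 for _ in pairings(list(range(2 * n))))
    formula = math.factorial(2 * n) // (2 ** n * math.factorial(n))
    print(2 * n, count, formula)
```

For 2n = 2, 4, 6, 8, 10 this prints 1, 3, 15, 105, 945 in both columns.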
Now consider the general multidimensional case. Given I = (i_1, \cdots, i_{2n}), we define a full contraction to be
\langle x^{j_1} x^{k_1} \rangle \cdots \langle x^{j_n} x^{k_n} \rangle
where j_1, k_1, \cdots, j_n, k_n is a choice of partition of I into pairs.
Theorem (Wick's theorem, Isserlis' theorem) The expectation value
\langle x^{i_1} \cdots x^{i_{2n}} \rangle
is the sum over all full contractions. There are (2n)!/ 2^n n! terms in the sum.
Proof This follows from our formula for the power series of the generating function. The reason is that the coefficient of J^I in (\frac{1}{2} A^{-1}_{ij} J^i J^j)^n is exactly given by summing products of A^{-1}_{ij} over partitions of I into pairs, and the n! in the denominator takes care of the overcounting.
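To see the theorem in action, here is a Python sketch (my own illustration; the matrix A = [[2, 1], [1, 2]] is just a concrete choice I picked) that computes the Wick sum of contractions for a 2-dimensional Gaussian and checks it against brute-force quadrature:

```python
import math

# Concrete choice: A = [[2, 1], [1, 2]], so A^{-1} = (1/3) [[2, -1], [-1, 2]].
Ainv = [[2 / 3, -1 / 3], [-1 / 3, 2 / 3]]

def pairings(items):
    """Enumerate all partitions of `items` into unordered pairs."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for i in range(len(rest)):
        for tail in pairings(rest[:i] + rest[i + 1:]):
            yield [(first, rest[i])] + tail

def wick(indices):
    """Sum over all full contractions of products of two-point functions."""
    return sum(math.prod(Ainv[j][k] for j, k in p)
               for p in pairings(list(indices)))

def moment(indices, grid=240, half_width=6.0):
    """Normalized <x^{i_1} ... x^{i_m}> by brute-force 2D midpoint quadrature."""
    dx = 2 * half_width / grid
    num = den = 0.0
    for a in range(grid):
        x0 = -half_width + (a + 0.5) * dx
        for b in range(grid):
            x1 = -half_width + (b + 0.5) * dx
            x = (x0, x1)
            w = math.exp(-(x0 * x0 + x0 * x1 + x1 * x1))  # e^{-x.Ax/2}
            num += math.prod(x[i] for i in indices) * w
            den += w
    return num / den

print(wick([0, 0, 1, 1]))            # Wick sum: 4/9 + 1/9 + 1/9 = 2/3
print(round(moment([0, 0, 1, 1]), 3))
```

The three contractions of \langle x^0 x^0 x^1 x^1 \rangle give A^{-1}_{00} A^{-1}_{11} + 2 (A^{-1}_{01})^2 = 2/3, and the direct integral agrees.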
Next up: perturbation theory and Feynman diagrams.
Saturday, March 3, 2012
Introduction to Gaussian Integrals
As a warm-up for more serious stuff, I'd like to discuss Gaussian integrals over \mathbb{R}^d. Gaussian integrals are the main tool for perturbative quantum field theory, and I find that understanding Gaussian integrals in finite dimensions is an immense aid to understanding how perturbative QFT works. So let's get started.
The Basics
Let A be some d \times d symmetric positive definite matrix. We are interested in the integral
\int_{-\infty}^\infty \exp(-\frac{x \cdot Ax}{2}) dx.
Out of laziness, I will suppress the limits of integration and just write this as
\int e^{-S(x)} dx.
where S(x) = x \cdot Ax / 2. Now for a function f(x), we define the (unnormalized) expectation value \langle f(x) \rangle_0 to be
\langle f(x) \rangle_0 = \int f(x) e^{-S(x)} dx
Occasionally, we might care about the normalized expectation value
\langle f(x) \rangle = \frac{\langle f(x) \rangle_0}{\langle 1 \rangle_0} = \frac{1}{\langle 1 \rangle_0} \int f(x) e^{-S(x)} dx.
We mostly care about asymptotics, so we will typically think of a function f(x) as being a polynomial (or Taylor series). So what we're really interested in is
\langle x^I \rangle = c \int x^I e^{-S(x)} dx,
where I is a multi-index and c = 1 / \langle 1 \rangle_0 is the normalization constant.
The Partition Function
Let us define Z[J] by
Z[J] = \int e^{-S(x) + J \cdot x} dx.
Now the great thing is that
\langle x^I \rangle = \left. \frac{d^I}{dJ^I} \right|_{J = 0} Z[J],
so that once we know Z[J], we can calculate anything. So let's try to compute it. We have
\begin{align} (Ax - J) \cdot A^{-1} (Ax - J) &= (Ax - J) \cdot (x - A^{-1} J) \\ &= x \cdot Ax - x \cdot J - J \cdot x + J \cdot A^{-1} J \\ &= x \cdot Ax - 2 x \cdot J + J \cdot A^{-1} J. \end{align}
So we see that
-\frac{1}{2} x \cdot A x + J \cdot x = \frac{1}{2} J \cdot A^{-1} J -\frac{1}{2} (x-A^{-1}J) \cdot A(x - A^{-1} J).
So, after a change of variables x \mapsto x - A^{-1} J we find
Z[J] = e^{\frac{1}{2} J \cdot A^{-1} J} Z[0].
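The completed square is easy to check numerically. The following Python sketch (an illustration, with A = [[2, 1], [1, 2]] an arbitrary concrete choice) evaluates both sides of the identity at random points and confirms they agree:

```python
import random

# Concrete example: A = [[2, 1], [1, 2]], with A^{-1} = (1/3) [[2, -1], [-1, 2]].
A = [[2.0, 1.0], [1.0, 2.0]]
Ainv = [[2 / 3, -1 / 3], [-1 / 3, 2 / 3]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(M, v):
    return [dot(row, v) for row in M]

def square_gap(x, J):
    """lhs - rhs of the completed-square identity; should vanish identically."""
    lhs = -0.5 * dot(x, matvec(A, x)) + dot(J, x)
    shift = [xi - si for xi, si in zip(x, matvec(Ainv, J))]  # x - A^{-1} J
    rhs = 0.5 * dot(J, matvec(Ainv, J)) - 0.5 * dot(shift, matvec(A, shift))
    return lhs - rhs

random.seed(0)
gap = max(abs(square_gap([random.uniform(-5, 5) for _ in range(2)],
                         [random.uniform(-5, 5) for _ in range(2)]))
          for _ in range(100))
print(gap)  # zero up to floating-point roundoff
```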
Now the argument in the exponential is
\frac{1}{2} A^{-1}_{ij} J^i J^j
So we find that
\langle x^i x^j \rangle = \left. \frac{d^2}{dJ^i dJ^j} Z[J] \right|_{J = 0} = A^{-1}_{ij}.
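We can check this two-point function directly. Here is a Python sketch (my own addition; I again take the concrete example A = [[2, 1], [1, 2]], so A^{-1} = (1/3) [[2, -1], [-1, 2]]) that computes the normalized second moments by brute-force quadrature:

```python
import math

def second_moment(i, j, grid=240, half_width=6.0):
    """Normalized <x^i x^j> for A = [[2, 1], [1, 2]], i.e. weight
    e^{-x.Ax/2} = e^{-(x0^2 + x0*x1 + x1^2)}, by 2D midpoint quadrature."""
    dx = 2 * half_width / grid
    num = den = 0.0
    for a in range(grid):
        x0 = -half_width + (a + 0.5) * dx
        for b in range(grid):
            x1 = -half_width + (b + 0.5) * dx
            x = (x0, x1)
            w = math.exp(-(x0 * x0 + x0 * x1 + x1 * x1))
            num += x[i] * x[j] * w
            den += w
    return num / den

# Should be close to A^{-1}_{00} = 2/3 and A^{-1}_{01} = -1/3.
print(second_moment(0, 0), second_moment(0, 1))
```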
Now we are ready to prove Wick's theorem and discuss Feynman diagrams, which we'll do in the next post.