I was wondering about the relationship between the ENSO (El Niño/Southern Oscillation) and temperature. I've seen some comparisons, but they always use the raw ENSO data. But I was thinking, what if it wasn't the ENSO data, *per se*, but the *total accumulated* "energy" (for lack of a better word) that drove temperature?

That is, temperature T is (approximately) the integral of the ENSO index:

T(t) ≈ ∫ enso(t) dt

Unfortunately, we don't have the "actual" ENSO data. What we have is the real index multiplied by some arbitrary constant, with some arbitrary offset added. That is:

enso(t) = k₁·enso*(t) + k₂

where enso* is the "real" ENSO index, which has been corrupted with arbitrary scaling and offsets. Combining the formulas and rearranging terms:

T(t) ≈ ∫ (k₁·enso*(t) + k₂) dt

T(t) ≈ k₁·∫ enso*(t) dt + k₂·t + c
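The rearrangement above is just the linearity of integration. A quick numerical sanity check with stand-in random data (not the real ENSO series, and with made-up values for k₁ and k₂) shows the two sides agree when the integral is taken as a cumulative sum:

```python
import numpy as np

# Hypothetical monthly "real" ENSO series: random stand-in data.
rng = np.random.default_rng(0)
enso_star = rng.standard_normal(120)

k1, k2 = 0.5, 0.1  # arbitrary illustration values, not the fitted ones
t = np.arange(len(enso_star))

# Left side: integrate the scaled/offset series directly.
lhs = np.cumsum(k1 * enso_star + k2)

# Right side: scale the integral and add the linear term from k2.
# (cumsum entry n covers samples 0..n, so the linear term is k2*(t+1).)
rhs = k1 * np.cumsum(enso_star) + k2 * (t + 1)

print(np.allclose(lhs, rhs))  # → True
```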

The "+c" is an arbitrary constant of integration, which is valid because, like the ENSO index, the temperature data is zeroed to an arbitrary baseline. Using the ENSO values from NOAA and the temperature data from HadCRUT, the values of k1, k2, and c are easily fitted with a multiple regression, from which I get the following:

k1 = 0.0038899

k2 = 9.3752e-04

c = 0.020067

So basically: download the raw ENSO data, multiply it by k1, and add k2. [To be clear - I don't necessarily regard k1 and k2 as "parameters" - they exist (at least in theory) to remove arbitrary scaling and offsets that already exist in the data.] Then compute the running integral (that is, integral(n) = sum(enso(0..n))). The result should "fit" the temperature up to a constant difference, c. The results are shown below (click for larger):
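As a sketch of the whole procedure, here's how the fit might look in Python with NumPy. The data here are synthetic stand-ins (I fabricate a "temperature" that follows the model and check that the regression recovers the coefficients); the actual NOAA and HadCRUT series would be substituted for `enso` and `temp`. The point is just the shape of the regression, not the numbers:

```python
import numpy as np

# Stand-in data: in the real analysis, `enso` would be the NOAA index
# and `temp` the HadCRUT anomaly series.
rng = np.random.default_rng(1)
n = 600  # e.g. 50 years of monthly data
enso = rng.standard_normal(n)
t = np.arange(n)

# Fabricate a "temperature" that follows the model, plus a little noise,
# purely to demonstrate that the regression recovers the coefficients.
true_k1, true_k2, true_c = 0.004, 0.001, 0.02
temp = (true_k1 * np.cumsum(enso) + true_k2 * t + true_c
        + 0.001 * rng.standard_normal(n))

# Multiple regression: T(t) ≈ k1 * cumsum(enso) + k2 * t + c
X = np.column_stack([np.cumsum(enso), t, np.ones(n)])
k1, k2, c = np.linalg.lstsq(X, temp, rcond=None)[0]
print(k1, k2, c)  # should land close to 0.004, 0.001, 0.02
```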

The temperature has been filtered with a 3-month moving average (which matches the unfortunately-smoothed ENSO series). The fit is pretty darn close. I don't know what it means - it may mean nothing, since the ENSO may simply be computed as the derivative of temperature - but it's interesting nonetheless.
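For reference, the 3-month moving average used for the smoothing amounts to a simple convolution; a minimal sketch (with made-up numbers, not the HadCRUT series) would be:

```python
import numpy as np

# Stand-in monthly temperature anomalies.
temp = np.array([0.1, 0.3, 0.2, 0.4, 0.6, 0.5])

# Centered 3-month moving average; "valid" drops the endpoints,
# where a full 3-month window isn't available.
smoothed = np.convolve(temp, np.ones(3) / 3, mode="valid")
print(smoothed)  # → approximately [0.2 0.3 0.4 0.5]
```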