Monday, June 21, 2010

The Sir Clive Granger Memorial Conference

On Monday 24th and Tuesday 25th May this year the Granger Centre for Time Series Econometrics at the University of Nottingham and the Department of Economics at the University of California San Diego jointly hosted the Sir Clive Granger Memorial Conference at the East Midlands Conference Centre at the University of Nottingham. The event came almost exactly one year after the death of Sir Clive Granger, joint winner of the Nobel Prize in Economics (with Robert Engle) in 2003. I was fortunate enough to be amongst the several dozen people invited to the conference, where I was in the exalted company of Sir David Hendry, Halbert White, Peter Phillips, James Stock, Mark Watson and Hashem Pesaran (to mention just a few of the big names who were there).

The conference began with laudations in honour of Sir Clive Granger provided by David Hendry, Ken Wallis, Peter Phillips, Jim Stock, Cheng Hsiao, Hal White and John Bates. Each speaker revealed when they had first met Sir Clive and how he had influenced both their own work and the wider subject of econometrics. It is, of course, virtually impossible to undertake any time series econometrics without making use of Granger's work. David Hendry remarked upon Granger's ability not only to innovate but also to communicate his ideas clearly (possibly one of the reasons that he had such an impact). Peter Phillips described Granger as a thinker and a leader, emphasizing in particular his leadership qualities. We can learn from him that ideas are paramount and that one should display intellectual courage in promoting new ideas. Phillips also admired the simplicity and elegance of much of Granger's work. He spoke for everyone at the conference in affirming that all who knew Sir Clive loved and revered him. Jim Stock spoke of Sir Clive's willingness to challenge conventional wisdom and of the help and support that Sir Clive had given him early in his career, when Stock was an Associate Professor. He praised Granger's scientific integrity and the importance that he attached to forecasting. Cheng Hsiao said that Sir Clive was a source of inspiration to students and colleagues alike, and Hal White also noted the encouragement that Granger provided to colleagues and visitors during his time at UCSD. White reminded us that Granger's work had both reach and depth: he had recently attended a neuroscience conference and reported that the concept of Granger causality is providing new insights for those attempting to understand the functioning of the brain. John Bates told the conference how he had first met Granger back in 1956, during Granger's early years as a lecturer at Nottingham, when Bates was a student. He noted that Granger was always generous with his time for students.
He also liked the way that Granger mixed theory and applications and recalled how inspiring he was with the ideas that he tossed out. At the end of the session Dame Patricia (Pat) Granger, Sir Clive's widow, told of how she had first met Sir Clive when she was a research assistant working for an economic historian on the subject of smallpox deaths in 18th century England. She had been recommended to go and see him as he would be someone who would be willing to spend time giving advice about statistics in language that could be understood by a non-specialist. The rest, we were told, was history!

We were then treated to the first of ten keynote lectures that had been prepared for the conference, with Sir David Hendry speaking on the subject of Empirical Model Discovery. Hendry began by noting that many features of models are not derivable from theory. Instead, a more data-based approach could help us to discover them. Hendry recalled the dialogue between Granger and himself reported in the special 2005 issue of Econometric Theory on Automated Inference and the Future of Econometrics, in which Hendry responded to a series of questions prepared by Clive Granger on the use of PcGets in data modelling and as a new research tool (see also Hendry and Krolzig, 2001). He went on to explain how the Autometrics software can be used for model formulation, estimation, selection and evaluation. Using the PcGive data set on consumption, income and inflation he showed how, after allowing for up to 20 lags and with impulse indicator saturation, the software could arrive at a congruent, parsimonious and encompassing model, finding a local DGP from the initial General Unrestricted Model (GUM). He emphasized that the approach was not based on repeated testing but on selecting the best model on the basis of squared t values and the notion of "sequential factorization". This approach does not impose theory on the data (other than the initial choice of variables to include) and can deal with structural breaks in a "constantly evolving world". He mentioned that he had used this approach in practical applications with a big US insurance company, and in work with the UK Office for National Statistics to improve the quality of the latest available data. No printed papers were distributed at the conference but interested readers might like to view the video lecture at the University of Aarhus which covers similar ground.
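To give a flavour of what general-to-specific selection does, here is a toy sketch of my own (it is emphatically not Autometrics, which uses multi-path search, diagnostic testing and much more; the data and variable names are made up): start from a deliberately over-specified "GUM" and repeatedly drop the regressor with the smallest squared t-value until every remaining coefficient is significant.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
X = rng.standard_normal((n, 8))           # 8 candidate regressors
y = 1.5 * X[:, 0] - 2.0 * X[:, 1] + rng.standard_normal(n)  # only 2 matter

def ols_t_stats(X, y):
    """OLS coefficients and their t-statistics."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(sigma2 * np.diag(XtX_inv))
    return beta, beta / se

keep = list(range(X.shape[1]))            # the "GUM": all candidate regressors
crit = 1.96 ** 2                          # roughly a 5% two-sided critical value
while True:
    beta, t = ols_t_stats(X[:, keep], y)
    worst = np.argmin(t ** 2)             # least significant remaining regressor
    if t[worst] ** 2 >= crit or len(keep) == 1:
        break
    keep.pop(worst)

# The genuinely relevant regressors (0 and 1) always survive; occasionally a
# spurious one scrapes through too, which is exactly the retention rate a
# serious selection algorithm has to control.
print(sorted(keep))
```

The real algorithm's extra machinery (multiple search paths, encompassing tests, impulse indicator saturation) exists precisely because this naive single-path elimination can get stuck or retain junk.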

The second keynote session had papers by Peter Phillips on "Implicit maps and new unit root limit theory" and Marcus Chambers on "Testing for seasonal unit roots by frequency domain regression". Phillips talked about embedded simulation techniques in estimation, bias correction and the failure of the delta method. In response to a question from James Davidson, Phillips said that these ideas were not yet ready to be unleashed upon the general public via EViews; it was more a matter of raising awareness at this stage. I must admit that at times during this talk I began to appreciate the way that some of my students must feel when they tell me that they can understand all the individual words that I am saying but still can't get what it is all about. As someone who is at the more applied end of the econometrics spectrum I shall have to wait for a future edition of Davidson's textbook before I can comment further on this topic.

In the first afternoon session Hal White gave a paper with the title "Granger causality, exogeneity, cointegration and policy analysis" - a bit of a "Granger kitchen sink" title, as he commented himself. Relating back to the 1998 paper by Ericsson, Hendry and Mizon in the Journal of Business and Economic Statistics, White proposed a slightly different framework in which he emphasized the concept of "structural causality", distinguishing it from Granger causality and EHM's conditional super exogeneity. On this view Granger causality is not an essential property but a consequence of conditional exogeneity. The second paper was by Norman Swanson with the title "Diffusion index based data reduction with shrinkage: new empirical evidence". My notes on this paper seem to have shrunk to almost nothing (!) but I do recall Swanson mentioning that Granger was an expert on ex ante prediction of car parking spaces when he was at UCSD. Swanson also recalled his relief when, having come out of a grocery store into the car park unable to remember exactly where he had left his car, he discovered that Clive Granger admitted to having suffered the same fate on another occasion.
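For readers less steeped in the idea at the centre of all this, a minimal hand-rolled Granger (non-)causality test - my own sketch on simulated data, not any particular package's implementation - simply asks whether lags of x improve the fit of an autoregression for y, via an F-test on the restriction that the x lags can be excluded:

```python
import numpy as np

rng = np.random.default_rng(42)
n, p = 500, 2                             # sample size, lag order
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):                     # x genuinely Granger-causes y here
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.standard_normal()

def rss(X, y):
    """Residual sum of squares from an OLS fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

# Restricted model: lags of y only. Unrestricted: add lags of x.
Y = y[p:]
lags_y = np.column_stack([y[p - j:n - j] for j in range(1, p + 1)])
lags_x = np.column_stack([x[p - j:n - j] for j in range(1, p + 1)])
const = np.ones(len(Y))
X_r = np.column_stack([const, lags_y])
X_u = np.column_stack([const, lags_y, lags_x])

rss_r, rss_u = rss(X_r, Y), rss(X_u, Y)
df = len(Y) - X_u.shape[1]
F = ((rss_r - rss_u) / p) / (rss_u / df)  # large F => reject non-causality
print(F)
```

With the causal channel switched off (set the 0.8 coefficient to zero) the statistic falls back to the neighbourhood of its null distribution.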

During the various coffee, lunch and tea breaks at the conference the younger and less established attendees were able to display their work in poster sessions. It seemed to me that most of these were devoted to unit root tests that were able to deal with various structural breaks, outliers or other abnormalities, but all were very professionally put together and I'm sure that the experience of talking through their work with colleagues as they walked round will have been very valuable.

On the Monday night we were treated to a very nice dinner at Hart's restaurant in Nottingham, during and after which there were some more informal reminiscences of Sir Clive Granger. The following day there were further keynote lectures from James Stock, Mark Watson, Cheng Hsiao, Graham Elliott and Jesus Gonzalo, as well as the 4th Annual Granger Lecture, which was given by Hashem Pesaran. Pesaran talked about "Aggregation in Large Dynamic Panels", referring back to a conjecture that Granger had made on page 237 of his seminal 1980 Journal of Econometrics paper concerning fractionally integrated processes. Pesaran showed that the conjecture was true.
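The aggregation result behind that 1980 paper is easy to illustrate numerically. In the sketch below (my own choice of parameters, not Pesaran's) the squared AR(1) coefficients of many independent units are drawn from a Beta distribution with mass near one; the cross-sectional aggregate then has autocorrelations that decay far more slowly than the geometric decay of any single AR(1) - the long-memory effect Granger identified.

```python
import numpy as np

rng = np.random.default_rng(1)
# rho_i^2 ~ Beta(4, 1.5): plenty of units with coefficients close to 1,
# but moments of 1/(1 - rho^2) still finite.
rho = np.sqrt(rng.beta(4.0, 1.5, size=20000))

def acf(k):
    """Theoretical lag-k autocorrelation of the aggregate of independent
    AR(1) processes with unit innovation variances: the autocovariance of
    unit i at lag k is rho_i^k / (1 - rho_i^2), and these just add up."""
    gamma_k = np.mean(rho ** k / (1 - rho ** 2))
    gamma_0 = np.mean(1 / (1 - rho ** 2))
    return gamma_k / gamma_0

# A single AR(1) matched to the aggregate's lag-1 autocorrelation would
# decay geometrically; the aggregate stays far above that benchmark.
k = 50
print(acf(k), acf(1) ** k)
```

The comparison is not a numerical accident: the aggregate's autocorrelation is a weighted average of rho_i^k, which by Jensen's inequality always dominates (weighted average of rho_i)^k.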

Stock talked about "Forecast for time series with smooth spectral densities". The motivation for his work was the observation that the Akaike Information Criterion (AIC) often seems to point to longer lags being needed in models than the Bayesian Information Criterion (BIC). Another puzzle was that the average of AR(1), AR(2) and AR(3) models (with equal weights) appeared to work better in forecasting than any of the individual models. From simulations it seemed that the BIC was capturing the big effects while the AIC was picking up more local effects. Watson's paper "Estimating turning points using large data sets" started with a review of the various attempts to date the business cycle, going back to the early work of Thorp, and then Burns and Mitchell, through Moore and Zarnowitz and on to the Business Cycle Dating Committee approach. Early approaches were based on eye-balling the graphs looking for clusters of peaks (or troughs) while more recent attempts have been based on more analytical and computer-based methods. We were shown a nice "temperature" colour plot which illustrated very clearly the clustering of series with peaks and troughs.
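The AIC/BIC half of that puzzle can be seen in a few lines (a toy sketch of my own, with the criteria hand-rolled): because BIC's penalty of log n per parameter exceeds AIC's penalty of 2 whenever n > 7, BIC can never select a longer lag length than AIC on the same data, so any systematic gap must go in the direction Stock described.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 300
y = np.zeros(n)
for t in range(2, n):                     # simulate an AR(2)
    y[t] = 0.6 * y[t - 1] - 0.2 * y[t - 2] + rng.standard_normal()

def ic(p, max_p=8):
    """AIC and BIC for an AR(p) fitted by OLS on a common sample."""
    Y = y[max_p:]
    X = np.column_stack([np.ones(len(Y))] +
                        [y[max_p - j:n - j] for j in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    rss_ = np.sum((Y - X @ beta) ** 2)
    T, k = len(Y), p + 1
    return T * np.log(rss_ / T) + 2 * k, T * np.log(rss_ / T) + k * np.log(T)

orders = range(1, 9)
aic_p = min(orders, key=lambda p: ic(p)[0])
bic_p = min(orders, key=lambda p: ic(p)[1])
print(aic_p, bic_p)                       # BIC's choice is never the longer one
```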

Hsiao looked again at the question of whether there is an optimal forecast combination, while Elliott examined a similar issue relating to the averaging and optimal combination of forecasts. He showed that simple forecast averages beat OLS combinations.
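That "forecast combination puzzle" can be sketched in a short Monte Carlo (my own toy design, not Elliott's analysis): when the component forecasts are unbiased and of similar accuracy, the simple average typically beats OLS-estimated combination weights out of sample, because the estimated weights carry their own sampling error.

```python
import numpy as np

rng = np.random.default_rng(3)
reps, train, test = 500, 15, 50
mse_avg = mse_ols = 0.0
for _ in range(reps):
    y = 3.0 * rng.standard_normal(train + test)      # target series
    f1 = y + rng.standard_normal(train + test)       # two unbiased forecasts
    f2 = y + rng.standard_normal(train + test)       # of equal accuracy
    # OLS combination weights estimated on a short training window
    X = np.column_stack([np.ones(train), f1[:train], f2[:train]])
    w, *_ = np.linalg.lstsq(X, y[:train], rcond=None)
    X_test = np.column_stack([np.ones(test), f1[train:], f2[train:]])
    mse_ols += np.mean((y[train:] - X_test @ w) ** 2) / reps
    mse_avg += np.mean((y[train:] - 0.5 * (f1 + f2)[train:]) ** 2) / reps
print(mse_avg, mse_ols)
```

With a much longer training window the estimated weights converge to the (near-equal) optimum and the gap closes, which is one reason the puzzle bites hardest in the short samples typical of macro forecasting.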

In the last paper of the conference Gonzalo introduced the concepts of "summability" and "co-summability" which, he said, are the extensions of the concepts of integration and co-integration to non-linear systems. After two days of concentration on heavy-duty econometrics (not to mention the several glasses of red wine consumed at the dinner on the Monday night) I was struggling to keep up with the detail of this talk, but I was able to appreciate the general idea. If we are dealing with non-linear processes of any kind - maybe x^2(t) where x(t) is I(d) - then we want to ensure that any estimated model is balanced on both sides of the equals sign.
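One back-of-envelope way to see the balance requirement (my own illustration, not Gonzalo's formal definitions) is through orders of magnitude: partial sums of a squared stationary series grow like T, while partial sums of a squared random walk grow like T^2, so an equation putting one type of term on each side of the equals sign cannot balance. The growth rates can be read off as log-log slopes:

```python
import numpy as np

rng = np.random.default_rng(11)
T = 200_000
e = rng.standard_normal(T)            # a stationary, I(0) series
w = np.cumsum(e)                      # a random walk, I(1)

# Evaluate the partial sums of the squared series on a geometric grid of T
grid = np.unique(np.logspace(2, np.log10(T), 30).astype(int))
s_stat = np.array([np.sum(e[:t] ** 2) for t in grid])
s_rw = np.array([np.sum(w[:t] ** 2) for t in grid])

# Slope of log(sum) on log(T): roughly 1 for the stationary case,
# roughly 2 for the squared random walk.
slope_stat = np.polyfit(np.log(grid), np.log(s_stat), 1)[0]
slope_rw = np.polyfit(np.log(grid), np.log(s_rw), 1)[0]
print(slope_stat, slope_rw)
```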

The conference was organised by Rob Taylor and Dave Harvey of the Granger Centre for Time Series Econometrics. I am sure that all attendees will join with me in thanking them for all their hard work in putting together the programme and ensuring that everything ran smoothly over the two days. I appreciate too the efforts of all the presenters. Although I didn't instantly understand everything that I heard at the conference, I am hopeful that with more thought and more reading (when some of these papers come out in printed form) in my head there will be convergence to a point not too far away from the authors' equilibrium position!

[1] Ericsson, N R, Hendry, D F and Mizon, G E (1998) Exogeneity, Cointegration, and Economic Policy Analysis. Journal of Business and Economic Statistics, 16, 370-387.
[2] Granger, C W J (1980) Long memory relationships and the aggregation of dynamic models. Journal of Econometrics, 14, 227-238.
[3] Granger, C W J and Hendry, D F (2005) A dialogue concerning a new instrument for econometric modeling. Econometric Theory, 21, 278-297.
[4] Hendry, D F and Krolzig, H-M (2001) Automatic Econometric Model Selection. Timberlake Consultants Press.