Saturday, October 30, 2010

Seminar 9th November

Just a reminder that next Tuesday, 9th November, at 2:00pm I will be giving a seminar at the Arndt-Corden Division of Economics on the role of energy in long-run economic growth. I'll be happy to send you a copy of my paper if you e-mail me. We hope to get it up on the web in a formal working paper series soon, at which point I will blog about it. Abstract and details are here.

Thursday, October 28, 2010

CCEP Working Paper Series Launches

Yesterday, the Centre for Climate Economics and Policy (CCEP) at ANU was launched. The Centre has around 30 associates; the majority are from outside ANU and several are from outside Australia. The first working papers are already up on the website.

Tuesday, October 26, 2010

EEN Symposium: 22-24 November


The EEN Symposium will take place from 22nd to 24th November at the Crawford School at ANU. It will showcase the results of research carried out by the Environmental Economics Research Hub alongside 14 invited papers from outside researchers. My presentation is at 1:30pm on Monday, 22nd November.

Registration is free! You can find the full program here.

Monday, October 25, 2010

ARC Funding Outcomes Announced

The Australian Research Council announced its decisions on applications for Discovery and Linkage grants today. Congratulations to Frank Jotzo and Peter Wood on the success of their application, and to my colleague in the Arndt-Corden Department of Economics, Prof. Athukorala, whose proposal with Peter Robertson, "Sustaining India's economic transformation: challenges, prospects and implications for Australia and the Pacific region", was also funded. Also of interest given the topics of this blog:

  • Jakob Madsen won an Australian Professorial Fellowship for "The great divergence, long-run growth and unified theories of economic growth."

  • David Pannell, John Rolfe, Michael Burton, and Jessica Meeuwig received funding for their Linkage project: "Do scientist and public preferences diverge? Analysing expert and public preferences for environmental and social outcomes for the Swan River."


Congratulations!

    Sunday, October 24, 2010

    Porsche Develops Hybrid Technology



I blogged about the development of hybrid cars by BMW and Mercedes when I was visiting Munich. My point was that fuel economy standards were forcing luxury car makers to adopt hybrid technology. I saw this as a route to wider adoption of hybrid technology in mass-market cars. So far, non-luxury hybrids seem to appeal only to "green consumers" willing to pay a premium for lower fuel consumption. Anyway, the New York Times has an article on new hybrid systems developed by Porsche.

One is a hybrid version of the Cayenne SUV with a 35 kW electric motor and a 250 kW petrol engine. Urban fuel economy improves from 16 mpg to 21 mpg, while highway fuel economy improves by only 10%. Then there is a racing car that uses a flywheel to store energy from braking, which can then be used to power the electric motors: two 60 kW electric motors drive the front wheels in addition to the 360 kW petrol engine. The third system is a concept car with a 375 kW petrol engine and front and rear electric motors that can produce a total of 164 kW. For comparison, the current standard V6 Ford Falcon has a 195 kW engine.
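To put those figures in perspective, here is a quick back-of-the-envelope sketch (only the 16 and 21 mpg urban figures quoted above are used; the unit conversions are standard):

```python
# Back-of-the-envelope check on the quoted urban fuel economy figures
# (16 mpg -> 21 mpg), converting US mpg to litres per 100 km.
KM_PER_MILE = 1.60934
LITRES_PER_US_GALLON = 3.78541

def mpg_to_l_per_100km(mpg):
    """Convert US miles per gallon to litres per 100 km."""
    km_per_litre = mpg * KM_PER_MILE / LITRES_PER_US_GALLON
    return 100.0 / km_per_litre

base, hybrid = mpg_to_l_per_100km(16), mpg_to_l_per_100km(21)
print(f"urban: {base:.1f} -> {hybrid:.1f} L/100km "
      f"({100 * (base - hybrid) / base:.0f}% less fuel per km)")
```

Because fuel use per kilometre is the reciprocal of fuel economy, the roughly 31% improvement in mpg corresponds to about a 24% reduction in fuel used per kilometre.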

    There is an increasing diversity of body plans out there, which is a positive sign in the development of a new technology.

    Friday, October 22, 2010

    Tests for Non-Linear Cointegration

It took more than 25 years after the discovery of cointegration for someone to come up with general tests for cointegration in nonlinear regression models. Choi and Saikkonen published a paper on the topic in the June issue of Econometric Theory. One place where this might be relevant is, of course, the environmental Kuznets curve, where Martin Wagner argued that standard cointegration methods could not be applied to a model that included powers of the explanatory variables. Wagner has a paper with Seung Hyun Hong on just this topic. But a lot of standard models, such as the translog consumer demand model, involve non-linear functions. So this is a very useful advance.
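To make the setting concrete, here is a minimal sketch of the kind of model involved: a quadratic, EKC-style levels regression with a simulated integrated regressor, followed by a naive residual-based unit root check. This is the standard Engle-Granger-style approach whose validity Wagner questions for models with powers of integrated variables, not the Choi and Saikkonen test itself, and the data and variable names are made up:

```python
# Sketch: quadratic (EKC-style) levels regression on simulated data, followed
# by a naive ADF test on the residuals. This is the conventional residual-based
# approach, NOT the Choi-Saikkonen test; with powers of an I(1) regressor its
# usual critical values are in question, which is exactly the point at issue.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
T = 200
log_income = np.cumsum(rng.normal(0.02, 0.05, T))        # simulated I(1) regressor
log_emissions = (1.0 + 2.0 * log_income - 0.5 * log_income**2
                 + rng.normal(0, 0.1, T))                 # quadratic long-run relation

X = sm.add_constant(np.column_stack([log_income, log_income**2]))
fit = sm.OLS(log_emissions, X).fit()
adf_stat, pvalue, *_ = adfuller(fit.resid)                # naive residual-based check
print(fit.params, adf_stat, pvalue)
```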

    Thursday, October 21, 2010

    Survey Paper on Estimating Consumer Demand Systems

If you are looking for a nice survey paper on estimating static consumer demand systems (I was), Apostolos Serletis and William Barnett put one out a couple of years ago in the Journal of Econometrics. It's a nicely organized paper that should be understandable to anyone who's done the basic graduate-level microeconomics and econometrics courses. In other words, it is really approachable compared to most papers published in the Journal of Econometrics :)

The beginning of the article reviews basic neoclassical consumer demand theory. Following that, parametric approaches to estimating demand systems are covered: the Rotterdam model, flexible functional forms such as the translog and AIDS, and "semi-non-parametric" forms such as the Fourier approach. Next on the agenda are sections on revealed preference and on Engel curves. The final sections cover estimation issues, theoretical regularity (the degree to which estimated demand functions meet the restrictions of neoclassical theory), and econometric regularity (mainly a discussion of non-stationarity, i.e. data with stochastic trends).
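To give a flavour of the parametric approaches the survey covers, here is a minimal sketch of a single budget-share equation from a linear-approximate AIDS model, estimated by OLS on simulated data. A real application would estimate the share equations as a system and impose the adding-up, homogeneity, and symmetry restrictions; all of the data and coefficient values below are made up for illustration:

```python
# Sketch: one budget-share equation of a linear-approximate AIDS model,
#   w_1 = alpha_1 + sum_j gamma_1j * ln p_j + beta_1 * ln(x / P*),
# where ln P* is the Stone price index, estimated by OLS on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
ln_p = rng.normal(0, 0.2, size=(n, 3))              # log prices of three goods
ln_x = rng.normal(5, 0.3, size=n)                   # log total expenditure
mean_shares = np.array([0.5, 0.3, 0.2])             # weights for the Stone index
ln_stone = (ln_p * mean_shares).sum(axis=1)         # Stone price index ln P*

# Simulate the share of good 1 with homogeneity imposed (gammas sum to zero).
w1 = (0.5 + 0.10 * ln_p[:, 0] - 0.05 * ln_p[:, 1] - 0.05 * ln_p[:, 2]
      + 0.02 * (ln_x - ln_stone) + rng.normal(0, 0.01, n))

X = sm.add_constant(np.column_stack([ln_p, ln_x - ln_stone]))
print(sm.OLS(w1, X).fit().params)                   # alpha_1, gamma_1j, beta_1
```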

The only additional issue I would want the paper to cover, of course, is how to interpret the estimated elasticities (are they estimates of short-run or of long-run elasticities?) and how reliable the estimates from any single study are. The former is largely determined by the type (time series, cross-section, etc.) and properties (stationary, non-stationary, etc.) of the data, and the latter by sample size, as I discuss for industrial interfuel substitution elasticities in my forthcoming paper and in my recent paper on estimating the emissions-income elasticity.

    Why You Should Have a Blog

    (if you are an academic)

The vast majority of hits on my website that originate with Google come from people entering keywords closely related to my name. By contrast, Google Analytics shows that almost no-one searching for my name arrives at my blog. So the blog attracts an audience that would be unlikely to arrive at my website and check out my publications on the topics I write about here. In the past I did have a bit more content on my website, but the blog now has more than 300 articles for search engines to hit, on a wide range of topics. So I think it has been pretty successful in getting my opinions and expertise on that range of topics out to people who are interested, and much more successful than I think the website alone would ever have been. There isn't as much crossover of visitors from one site to the other as I would have liked, but then most of the links about my research on both sites go to RePEc etc. rather than to the sister site.

    Wednesday, October 20, 2010

    Index Numbers and Consistency in Aggregation: Part II

    This post gets even more technical than the last. I'm just blogging about what I'm reading in the course of my research. I read a whole bunch more papers on index numbers, which got more and more technical. The bottom line is that for most applications the chain Fisher index is an appropriate index to use.

An index is superlative if it is exact for an aggregator function (e.g. a production function) that can provide a second-order differential approximation to an arbitrary twice-differentiable linearly homogeneous function. A second-order differential approximation is one where the level, first derivatives, and second derivatives of the two functions are equal at the point of approximation.

Diewert (1978) shows that the Vartia I index differentially approximates a superlative index as long as prices and quantities are strictly positive and the approximation is taken around a point where prices and quantities are unchanged between the periods. What this means is that for relatively small changes in prices and quantities the Vartia I index will give very similar results to superlative indices like the Törnqvist index and the Fisher index.

The nature of superlative indices themselves means that “chained” indices are always preferable to non-chained indices. A chain index is one where the index is computed for each year (or whatever is the smallest available gap between data points) and the cumulative product of those annual indices is used as the time series of the index, even if we only want the change over a much longer period.
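As an illustration of chaining, here is a minimal sketch of a chained Fisher price index computed from made-up annual prices and quantities: a Fisher link (the geometric mean of the Laspeyres and Paasche indices) is computed for each pair of adjacent years and the links are cumulated:

```python
# Sketch: chained Fisher price index from made-up annual data.
# Each link is the geometric mean of the Laspeyres and Paasche indices for
# adjacent years; the chained index is the cumulative product of the links.
import numpy as np

prices = np.array([[1.0, 2.0], [1.1, 2.1], [1.3, 2.0], [1.4, 2.2]])        # years x goods
quantities = np.array([[10.0, 5.0], [11.0, 5.0], [11.0, 6.0], [12.0, 6.0]])

def fisher_link(p0, p1, q0, q1):
    laspeyres = (p1 @ q0) / (p0 @ q0)   # new prices at old quantities
    paasche = (p1 @ q1) / (p0 @ q1)     # new prices at new quantities
    return np.sqrt(laspeyres * paasche)

links = [fisher_link(prices[t], prices[t + 1], quantities[t], quantities[t + 1])
         for t in range(len(prices) - 1)]
chained = np.cumprod([1.0] + links)     # index level in each year, first year = 1
print(chained)
```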

Diewert (1978) goes on to show that chained superlative indices will yield close to consistent aggregation for relatively small changes in prices and quantities. He also shows, in an empirical appendix, that chained Vartia, Törnqvist, and Fisher indices produce almost identical results, and that two-stage aggregation produces almost the same results as one-stage aggregation for the Törnqvist and Fisher indices.

An additional advantage of the Fisher index over logarithmic indices, such as the discrete Divisia index (also known as the Törnqvist index), is that it can easily handle the introduction of new goods, as zero values pose no problem for it. One way to deal with new goods in the Törnqvist or Vartia I indices is to compute the price index assuming that the price of a new input was the same before its introduction as in the year of its introduction, and then find the quantity index as the ratio of total value to the price index.

    References
    Diewert, W. E. (1978) Superlative index numbers and consistency in aggregation, Econometrica 46(4): 883-900.

    Tuesday, October 19, 2010

    Index Numbers and Consistency in Aggregation: Part I

There are many formulae for the index numbers used in economics to compute price and quantity indices, such as a consumer price index or a volume index of imports. The Laspeyres, Paasche, Divisia, and Fisher indices are the best known of these. A body of theory examines the criteria that can be used to decide which formula to use in a particular application. One important property is consistent aggregation. Say that consumers purchase the following categories of goods and services: education, health care, food, and clothing. First compute a price index for "goods" from the quantities and prices of food and clothing, and a price index for "services" from the quantities and prices of education and health care, and then compute a consumer price index from the resulting goods and services price indices. If this index is the same as a consumer price index computed directly from the four original commodities, the index formula is said to exhibit consistent aggregation.

Another important property is “Fisher’s factor reversal test”. Compute a price index for the ratio of the prices of a group of commodities relative to a base year, as well as the corresponding quantity index. If the product of these two indices equals the ratio of total value (or cost) in the second period relative to the first, the index formula is said to pass Fisher’s factor reversal test.
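Formally, writing $P$ and $Q$ for the price and quantity index formulas applied between periods 0 and 1, with $p_i^t$ and $x_i^t$ the price and quantity of commodity $i$ in period $t$ (the notation used below), the test requires:

$$P(p^0, p^1, x^0, x^1)\, Q(p^0, p^1, x^0, x^1) = \frac{\sum_i p_i^1 x_i^1}{\sum_i p_i^0 x_i^0}.$$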

Vartia (1976) proposed a formula that passes both these tests, dubbed the Vartia I index. The Vartia I price index for the change in prices between period 0 and period 1 is:

$$\ln P(p^0, p^1, x^0, x^1) = \sum_i \frac{L(p_i^1 x_i^1,\; p_i^0 x_i^0)}{L\!\left(\sum_j p_j^1 x_j^1,\; \sum_j p_j^0 x_j^0\right)}\, \ln\!\left(\frac{p_i^1}{p_i^0}\right)$$

where superscripts refer to the two time periods, the $p_i$ are the prices and the $x_i$ are the quantities of each of the commodities indexed by $i$, and $p$ and $x$ are the vectors of prices and quantities. $L(\cdot\,,\cdot)$ is the logarithmic mean, defined by:

$$L(a, b) = \frac{a - b}{\ln a - \ln b} \quad \text{for } a \neq b, \qquad L(a, a) = a.$$
But Vartia’s index isn’t perfect. Another desirable property is that a quantity index for the ratio of aggregate quantities in the second period relative to the first should equal the ratio of the values of a production function evaluated at the input quantities of the second period and the first.* Such an index is said to be exact for that production function.

Diewert (1978) shows that Vartia’s index is only exact for the Cobb-Douglas production function. This is disappointing, as the Cobb-Douglas function imposes an elasticity of substitution of one between each pair of inputs rather than letting the data speak, which is rather restrictive. So maybe Vartia’s index isn’t as ideal as he thought?
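As a concrete illustration of the formula above, here is a minimal sketch in code with made-up two-period data (the function and variable names are mine):

```python
# Sketch: Vartia I log-change price index for two periods, using the
# logarithmic mean L(a, b) = (a - b) / (ln a - ln b), with L(a, a) = a.
import numpy as np

def log_mean(a, b):
    return a if np.isclose(a, b) else (a - b) / (np.log(a) - np.log(b))

def vartia_i_price_index(p0, p1, x0, x1):
    v0, v1 = p0 * x0, p1 * x1                  # commodity values in each period
    V0, V1 = v0.sum(), v1.sum()                # total values
    weights = np.array([log_mean(a, b) for a, b in zip(v1, v0)]) / log_mean(V1, V0)
    return np.exp(np.sum(weights * np.log(p1 / p0)))

p0, p1 = np.array([1.0, 2.0]), np.array([1.2, 2.1])    # prices in periods 0 and 1
x0, x1 = np.array([10.0, 5.0]), np.array([9.0, 6.0])   # quantities in periods 0 and 1
print(vartia_i_price_index(p0, p1, x0, x1))            # about 1.12
```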

    References
    Diewert, W. E. (1978) Superlative index numbers and consistency in aggregation, Econometrica 46(4): 883-900.
    Vartia, Y. O. (1976) Ideal log-change index numbers, Scandinavian Journal of Statistics 3: 121-126.

    * Similar relationships exist for the price index and the unit cost function and for utility functions etc.

    Monday, October 18, 2010

    Launch of Centre for Climate Economics and Policy

    The launch of the new Centre for Climate Economics and Policy at ANU will be on 27th October following the Asian Climate Change Policy Forum. The new centre will be directed by Frank Jotzo. There will be a working paper series, which will take over from the EERH Working Papers in the area of climate change.

    Writing and Publishing Tips from Nature

    Very good advice (almost all of which I follow myself) from Nature on writing and publishing.

My only caveat is that in economics there is a real trade-off between getting published in reasonable time and getting published in the top journals. The top journals have very slow review processes and very high rejection rates. Not all of them use the "desk reject" system used by top natural science journals like Nature and Science, though some do.* If top journals take a year to review a paper and accept fewer than 10% of submissions, versus three months and acceptance rates of 30-50% at typical lower-ranked journals, it is a real question which it makes sense to submit to. This is especially true for people on the job market who want to get some publications onto their CV quickly, and for authors of policy-oriented articles who need to publish before the issues change significantly. If you just want to get your paper to its audience, then it could make sense to send it to the second-tier journals (those ranked by the ARC as A journals). It will be cataloged in the Web of Science and Scopus and regarded as a reliable paper by most potential readers. But it won't help as much in getting a job or promotion as a paper in a top-ranked journal,** and some readers might think it a less reliable source and, therefore, be less likely to read it and cite it. If you have a paper that you think might be publishable in a top journal, check whether that journal has a desk-reject policy.

* At Ecological Economics we do use the desk-rejection system (and we're a "second-tier" journal). Papers that are either very weak or not related to the topics we are interested in publishing on will likely be desk rejected. At the moment only a minority of papers are rejected without being sent to referees. Often an associate editor like me will decide whether to reject the paper.

** At the most elite departments in the US (and maybe the UK?) articles in a second-tier journal could be a negative, especially for new PhDs. If the only thing on your CV is a second-tier article, search committees are likely to downgrade their evaluation of you; it could be better to have no publications and a PhD from a top program. If you are coming from a low-ranked program, you should get publications on your CV somewhat irrespective of quality.

    Sunday, October 17, 2010

    Causes of the Demographic Transition

Oded Galor has made many contributions to growth theory and population economics and the connections between them. A new working paper examines various economic theories of the causes of the demographic transition. In Galor's terminology, the demographic transition refers specifically to the decline in fertility rates and population growth, which is the third phase in the conventional demographic transition model.



Galor gives the background of each theory, the facts it predicts, and whether those facts match reality. It is a very useful survey. The hypotheses/theories that he rejects based on the evidence are that the fall in fertility was caused by:

    1. Rising income in the early industrial revolution - a theory he associates with Gary Becker.

    2. The decline in infant and child mortality.

3. The development of capital markets, which reduced the need to have children to support oneself in old age (the old-age security hypothesis).

    The ones for which he finds support are (not surprisingly given his previous work on these topics):

    1. The rise in demand for human capital resulting in a trade-off between child quantity and child "quality".

    2. The decline in the gender gap in human capital and wages.

    Energy Efficiency Report Part II

Reading through the report, they seem to come to conclusions similar to mine on Australia's track record on energy efficiency. The main goal is a 30% reduction in Australia's energy intensity by 2020, which implies a reduction of about 2.6% per annum. Since 1980 energy intensity has declined by 1.3% per annum, so the target is fairly ambitious in seeking to double the historical rate.
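As a quick check of the arithmetic, the constant annual rate of decline needed for a 30% cut depends on the baseline year, which is an assumption here:

```python
# Constant annual rate of decline in energy intensity needed for a 30% cut
# by 2020, for a few assumed baseline years (the report's baseline is not
# restated here, so these are illustrative).
for base_year in (2006, 2008, 2010):
    years = 2020 - base_year
    rate = 1 - 0.70 ** (1 / years)
    print(base_year, f"{100 * rate:.1f}% per year")
```

A rate of about 2.6% per annum is consistent with a baseline around the mid-2000s.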

The centrepiece policy recommendation is to broaden the energy efficiency schemes that currently exist in NSW, Victoria, and South Australia into a national energy efficiency certificate scheme. Credits would be generated by investments that increase energy efficiency and could then be sold to energy suppliers, who would be obligated to improve the energy efficiency of their customer base. An interesting feature of this proposal is that it reduces the "split incentives" faced by renters and landlords. Landlords are often reluctant to improve energy efficiency because they won't gain the benefits of the energy cost savings, while renters are ill-informed about the energy costs of alternative rental properties and so don't make choices of where to live and how much rent to pay on that basis. In very tight rental markets there is often little choice anyway about where you can live*. Under the certificate scheme the landlord can sell the credit while the renters gain from the cost savings. Under an energy tax the incentives remain as asymmetric as they are now.

    A problem with such schemes is that they seem to ignore the rebound effect. However, rebound effects are usually much less than 100%.
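A one-line way to see what a sub-100% rebound effect implies for the net savings from such a scheme (the numbers are purely illustrative):

```python
# Net energy saving after the rebound effect: engineering saving x (1 - rebound).
engineering_saving = 100.0   # energy saved before any behavioural response (made up)
for rebound in (0.1, 0.3, 0.5, 1.0):
    print(f"rebound {rebound:.0%}: net saving {engineering_saving * (1 - rebound):.0f}")
```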

    * See search markets

    Saturday, October 16, 2010

    Report of the Prime Minister's Task Force on Energy Efficiency

The report of this group, commissioned during Kevin Rudd's period as prime minister, was released about a week ago. I gave a presentation to some members of the team earlier this year on my work comparing Australian energy efficiency to that of other countries, so I was particularly interested to see what they came up with. One interesting point for academic economists is that the extensive reference list in the report includes hardly any references to the academic energy economics literature. There are no references to journals like Energy Policy or Energy Economics. There are several to a special issue of the journal Energy Efficiency, which dealt with the energy efficiency certificates that the Task Force came out in favor of. I wonder whether this is because what is published in these journals is too esoteric or irrelevant, or whether the group just didn't have the time to look through that material. It's got to be a bit depressing for people who publish on energy policy to see a review that doesn't really engage with most of what has been published academically on the topic. Also, the advisory group to the task force included representatives of industry and NGOs but no academic researchers.

    I'll have more on the report soon. Henry Ergas doesn't like it.

    Wednesday, October 13, 2010

John Ioannidis



For those of you interested in meta-analysis, the Atlantic has an interesting article on John Ioannidis (needs subscription, I think?). I've previously written about him and his paper "Why Most Published Research Findings Are False". This article gives more color about him and his research group.

On a related note, you can now get my article on meta-analysis of interfuel substitution for free at the Journal of Economic Surveys. This seems an odd publication model to me: why give away the paper for free before assigning it a journal issue and page numbers, and then put it behind the paywall? The New York Times used to follow a similar online model, where archived material cost money but the current issue was free online. But in that case people looking for old articles wanted very specific information and were probably more willing to pay for it than someone looking for current news who could just go to another website.

    Monday, October 11, 2010

    Tips for Choosing a Title for a Paper

    I'm having a harder time than usual in deciding on a title for our latest paper. For some reason none of the alternatives I have seem good. So I looked on the web for some ideas and the following seem to be the key useful ones:

1. Make sure the main keywords are in your title. You may think that your abstract will handle that; Google Scholar thinks otherwise.

    2. Shorter is better than longer, subject to condition 1.

3. Active language is better than passive. One site I encountered favored the opposite, resulting in really boring titles.

    4. Get the most important idea first in the title - this was not something I had really thought of. That means I have to choose the most important idea :)

    Now I have even more potential titles than before!

    Sunday, October 10, 2010

    The Story of Climate Change Legislation in the Obama Administration

Peter Wood pointed out this story in the New Yorker, which chronicles the so far unsuccessful attempts by the Obama Administration and various senators to legislate on climate change policy.

    Saturday, October 9, 2010

    Is the Drought Over?



As you can see from the above graph, after a long period of standing at about 50% of capacity, Canberra's dams are now at around 80%. All the dams in the Western catchment in the Brindabella-Namadgi mountains are near capacity. Googong, to the east, which accounts for half the total capacity, is only at 60%, but that too is an improvement. As we flew into Canberra on Thursday we could see that the Eastern half of Lake George is now full of water. That's the most water I've ever seen in it. I suppose it is about 1 m deep. The historic shoreline is at 2 m depth, and the prehistoric shoreline, reached within the last 1,000 years, is at 17 m! So far things seem to be following the prediction in one of my first blogposts.


    Image of Lake George in August from Wikipedia

    Thursday, October 7, 2010

    Back Home

I'm finally back from my trip to Europe (mainly work) and Asia (mainly family visit/vacation). To an Australian, the only country we visited that seemed expensive overall was Denmark. Sweden no longer seems as terribly expensive as it once did, though the Big Mac index doesn't agree. Thailand, our last stop, is of course very cheap, but I noticed that drinks in Starbucks don't cost much less than in the US. In general, restaurant meals ranged from 1/6 (foodcourt in a cheap mall) to 1/3 (waiter-service restaurants, though some are more like 1/5 to 1/4 the price) of Australian prices for the same quality of service. Blogging might continue to be sparse until I am fully up to speed here again. The downside of going on vacation in careers like academia is that the work doesn't go away; it just piles up waiting for you when you get back.