Limits to "Limits to growth"
Published by Bryan · 16 April 2021
Those of us of a certain age with an interest in environmental issues will remember “The Limits to Growth”, the 1972 Pan paperback authored by Donella Meadows and others at MIT. It will also be familiar to students of systems thinking, as it was perhaps the first major use of computer technology to support a large system dynamics model.
The LTG model that the MIT team developed looked at the relationships between population, industrial output, food production, pollution, the availability of non-renewable resources, birth rates, death rates and the level of services that could be provided (health and education), and made projections about how these might interact through to the year 2100. The predictions were not good. Although the team ran the model under a range of conditions, the scenario judged most likely, given the way the world economy was running in the 1970s, had most indicators progressing steadily until about 2030, at which point there would be a collapse as a shortage of natural resources made it harder to deliver services. In 1972, 2030 was literally a lifetime away (mine certainly), but now it looms large.
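To make that kind of feedback loop concrete, here is a toy stock-and-flow sketch in Python. It is emphatically not World3, the model the MIT team actually built; every coefficient and functional form below is invented for illustration. It only shows the qualitative mechanism: industry draws down a finite resource stock, scarcity then throttles output and services, and pollution pushes up death rates.

```python
# A toy stock-and-flow model in the spirit of LTG. This is NOT the real
# World3 model: stocks are index units (1.0 = 1970 level) and every
# coefficient below is invented purely for illustration.

def run(years=130, dt=1.0):
    population, resources, industry, pollution = 1.0, 1.0, 1.0, 0.1
    history = []
    for step in range(int(years / dt)):
        # As the non-renewable stock is drawn down, output per unit of
        # capital falls: this is the scarcity feedback.
        scarcity = resources / (resources + 0.5)

        births = 0.04 * population * scarcity           # services support births
        deaths = 0.02 * population * (1.0 + pollution)  # pollution raises deaths
        extraction = 0.015 * industry                   # industry consumes resources
        growth = 0.04 * industry * scarcity             # reinvested output
        depreciation = 0.02 * industry
        emissions = 0.01 * industry
        absorption = 0.05 * pollution

        population = max(0.0, population + dt * (births - deaths))
        resources = max(0.0, resources - dt * extraction)
        industry = max(0.0, industry + dt * (growth - depreciation))
        pollution = max(0.0, pollution + dt * (emissions - absorption))
        history.append((1970 + step * dt, population, resources, industry))
    return history

for year, pop, res, ind in run()[::10]:
    print(f"{year:4.0f}  population={pop:.2f}  resources={res:.2f}  industry={ind:.2f}")
```

Run it and population and industry climb for a few decades before the resource stock empties and both curves roll over: the same general shape as the scenario described above, however crude the parameters.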
As George Box said, all models are wrong but some are useful, so how useful has the LTG model been? Although it was roundly criticised throughout the 1970s by neoclassical economists, several studies conducted over the last 20 years have shown its predictions to be alarmingly accurate, so if nothing changes we may not be far from catastrophe.
But one of the more interesting aspects of LTG is little known. The report was commissioned by the Club of Rome, a group of intellectuals and professionals, on the basis of a paper written by one of its members, Hasan Ozbekhan, entitled “The Predicament of Mankind”. In it Ozbekhan listed 49 ‘continuous critical problems’ (CCPs) that he thought the world faced at that time. The paper called for the then newly emerging computer technology to be used to explore the relationships between these problems, to develop an understanding of how they were connected and what the implications might be. LTG was the result.
However, Ozbekhan was not happy with the result, because he felt that the MIT team had focused heavily on hard information that was easily quantifiable and amenable to system dynamics modelling. One of the continuous critical problems they did not take into consideration was CCP-18, “Growing irrelevance of traditional values and continuing failure to involve new value systems”. Subsequent research by Alexander Christakis into the 49 CCPs, using different computer analysis techniques, has shown that CCP-18 is a root contributor to all of the other problems, suggesting that a failure to deal with value systems dooms everything else.
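Christakis's actual techniques are considerably more involved than anything I can show here, but the core idea of a ‘root contributor’ can be sketched as a reachability analysis over a directed graph in which an edge means ‘problem A aggravates problem B’. The handful of edges below is entirely hypothetical, not Ozbekhan's data; the point is only the mechanics.

```python
# Minimal sketch of a root-contributor analysis over a directed
# "problem A aggravates problem B" graph. The edges below are
# invented for illustration; they are not Ozbekhan's actual data.

from collections import defaultdict, deque

def reachable(graph, start):
    """All nodes reachable from `start` by following edges."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {start}

# Hypothetical influence edges between a handful of CCPs
aggravates = defaultdict(set)
for src, dst in [
    ("CCP-18", "CCP-1"),  # value failure aggravates population pressure
    ("CCP-18", "CCP-4"),  # value failure aggravates resource overuse
    ("CCP-1", "CCP-4"),
    ("CCP-4", "CCP-9"),   # resource overuse aggravates pollution
    ("CCP-9", "CCP-1"),
]:
    aggravates[src].add(dst)

nodes = set(aggravates) | {d for ds in aggravates.values() for d in ds}
for node in sorted(nodes):
    downstream = reachable(aggravates, node)
    print(f"{node}: aggravates {len(downstream)} of {len(nodes) - 1} others")
```

A node that can reach every other node is a root contributor: relief applied there propagates through the whole system. In this invented graph only CCP-18 reaches everything, which is the shape of the result Christakis reported for the real 49-problem set.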
A cursory reflection on late 20th- and early 21st-century value systems seems to confirm this. By the 1980s the world was dominated by neoliberal thinking and the primacy of the individual and their choices: consumption, lifestyle, the need for new things, the drive to better oneself.
I am not sure that we have learnt much about the possibly fatal nature of our Western value systems. During 2020, at the height of the COVID-19 pandemic, there was talk of ‘building back better’, and movement restrictions may have encouraged people to reflect on what was actually important in their lives and what they wanted for the future.
Will this have triggered a new enlightenment?