Culture of the Commons

Interview with Mayo Fuster Morell

Mayo Fuster Morell, responsible for BarCola, a group working on collaborative economy policies within the Barcelona city council, shares her thoughts on how commons-based forms of collaboration can build a more just society.

What is the commons?

Commons is an ethos and an umbrella term which encompasses many practices and transformative changes. The commons emphasises common interests and needs. It includes collaborative production, open and shared resources, collective ownership, as well as empowering and participative forms of political and economic organizing.

It is, though, a very plural concept with very diverse ‘traditions’ and perspectives. Some commons, for example, are connected to material resources (pastures, fields, fishing, etc.) and others to immaterial ones (knowledge, etc.).

In the area of knowledge commons, the emphasis is on the conditions of access – open access and the possibility to access resources and intervene in their production without requiring the permission of others. It emphasises knowledge as a public good, a patrimony, and a human right.

Why do you believe we need to return to the idea and practices of the commons?

Commons were present prior to capitalism. But somehow this question implies that the commons is something of the past, not of the present. Is the family, the market, or public space something of the past? Would we pose such a question about these ways of organising ourselves? Commons is an integral part of our present society; not something to recover from the past.

Good point. Perhaps then the question is: why is the issue rising up the political agenda?

I think this is partly connected to the democratization of knowledge and engagement that came with the arrival of the Internet. Cyber-scholars such as Yochai Benkler (author of The Wealth of Networks) argue that the commons moved from the margins to the center of many economic systems because of the reduction in the costs of collective action brought about by the widespread adoption of ICT. Others, such as Carol Rose (author of The Comedy of the Commons), have shown how internet commons resources become more valuable the more people access them – which undermines earlier critiques of the commons that argued they were unsustainable and encouraged free-riding. With the internet, the opposite is true!

Another trajectory that has come to the fore is that of the collective governance of natural resources, in which community governance of commonly-owned resources is central. This is the frame of Nobel prize-winner Elinor Ostrom’s school.

Another trajectory we can see is that of defending public services and the resistance to neoliberal enclosure of public institutions, which TNI does a lot of work on.

Yet despite these diverse trajectories, they clearly all point to integral parts of our current system and provide solutions.

Why did digital culture draw so heavily on commons-based approaches – and why does it continue to do so?

There are many reasons. I would highlight two. The first is the fact that digital culture emerged, in many ways, from counterculture. This story is very well told in a book by Turner, From Counterculture to Cyberculture. Hippies and psychedelic subcultures were amongst the first to see the utility of the Internet, developing virtual communities. The values of community were very present in these early forms of the internet. In fact, the Internet itself was and could still be conceived as a commons – even if it has been increasingly enclosed by corporations allied with authoritarian regimes.

The second reason is the decentralised character of the Internet. This enabled the democratisation of access to the means of cultural production. Today, large parts of the population have access to immense resources of knowledge, along with programs to create and remix that knowledge. The internet also facilitated new forms of distribution of that knowledge, which favoured collaborative and open production. It made an economy based on collaborative, open production, distribution and sharing more efficient than the closed, proprietary, product-based modes of the previous industrial era.

What stands out for you as the most interesting concrete projects in the digital commons?

Wikipedia certainly is one. Since its creation in 2001, it has become one of the largest reference websites in the world, with an incredible 70,000 active contributors working on more than 41 million articles in 294 languages. Free and open source software, which enables people to freely use, copy, study and change software in any way, and which underlies many well-known software systems, is another.

The rise of platform cooperatives such as Fairmondo and SMart is also a promising development. Fairmondo is an online marketplace for ethical goods and services that originated in Germany and has expanded to the UK. It is a cooperative alternative to Amazon and eBay. SMart is a cooperative that pools services and skills to make them affordable for creative freelancers.

Crowdfunding platforms like Goteo, which are building alternatives to the current financial system, are also potentially very significant. Goteo has created a community of over 65,000 people, providing civic crowdfunding and collaboration on citizen initiatives and social, cultural, technological and educational projects.

It seems digital commons movements are integrally linked with culture – both in the way they work and the cultural outputs they are producing. What are the lessons for social movements in general?

I think it’s useful to highlight two great conceptions of culture: culture (with a small ‘c’) and Culture (with a big ‘C’). Culture refers to artistic expression; culture refers to our anthropological nature. Every human activity involves culture, and so in this sense, certainly, commons are connected to culture.

It is perhaps not a surprise that the commons emerged as a predominant form for organizing the governance and sustainability of artistic production. These forms of Culture are often based on self-governed modalities, favoring open access, innovation and remix, and putting community needs and creativity first and profitability second.

As I mentioned, one of the first digital areas to favor a more commons-based organizational form of production was software, with the emergence of free and open source projects like Linux or Apache, which became the dominant mode of production (larger than proprietary systems) in certain areas of the software industry. From there, it was an easy step to move to commons-based organization of music and film, and also encyclopedias and other content that could benefit from collaborative production. The term free culture refers to this.

Now we see commons production expanding into almost every area of production, including currencies, city landscapes (like urban gardens and orchards), architecture (FabLabs), and open design of everything from cars (like the Wikispeed car) to toys. The early forms of the digital commons have helped inform these newer forms.

Regarding lessons for social movements, I think they can help us expand the conception and practice of participation, moving from forms of organizing that require high levels of involvement by a few super-activists towards models based on economies of participation. The key is to integrate participation based on diversity – not only strong contributors, but also weak and sporadic involvement, and people who can only follow the process – and to allow for those different types of involvement. Somehow we need to democratize participation in social movements in order to reach and adapt to a larger social base.

What are the key elements that make up a culture of the commons?

Commons are very diverse; that is something that defines them, as they adapt to local and specific circumstances and are embedded within the specific community and commoning to which they belong.

I would say these are the key principles, though not all of them are necessarily present in every family of commons or in every specific commons:

Community organizing (openness to engagement)
Self-governance of the community by the creators of the commonly-held value
Open access to the resources created
Ethics of looking beyond profitability to serve social and environmental needs and inclusion

Inclusion is perhaps one of the weakest elements, particularly in the Free/Libre/Open Source Software (FLOSS) movement. Studies suggest that only 1.5% of contributors in FLOSS communities are women, while in proprietary, closed software production the proportion is closer to 30%. How can FLOSS be a model with such poor gender participation rates? Similarly, communities that manage natural resources, like the fishing commons institutions in Albufera, Valencia, restricted women’s participation until very recently. Additionally, commons theorising tends to be heavily dominated by male authors, who engage very little with feminist theory.

Commons approaches need to do more to address these issues, develop methodologies, and highlight and learn from the cases which perform well in terms of gender inclusion. Barcelona en Comú, in its attempts to reclaim political institutions for the common good, is one example where feminist wisdom is properly engaged with the commons in seeking to bring about gender equality.

Tell us about what you have been involved in through Barcelona en Comu.

I was on the initial list of people that promoted the launch of Barcelona en Comú. This was a citizen platform launched in 2014 in the wake of the popular uprisings that took the squares of many Spanish cities after the financial crisis. The platform for Barcelona en Comú was drawn up in a highly participative way, and has sought to put participative democracy and commons methodologies at the heart of governance.

I am a member of Barcelona en Comú and am responsible for BarCola, a group working on collaborative economy policies within the Barcelona city council. Our group has helped organise the procomuns.net project and conference, which raises popular awareness of commons-based collaborative economic initiatives, provides technical guidelines to communities for building FLOSS technologies, and makes specific policy recommendations for the Barcelona City Council, the European Union and other administrations.

Our first international event, in March 2016, brought together more than 400 participants to develop 120 policy recommendations for governments.

How is the so-called ‘Sharing Economy’ different to the digital commons?

The sharing economy is not different to the digital commons; it just puts more emphasis on the economic dimension of the commons. However, there has been a wikiwashing of the term, with the media inaccurately using the term sharing economy to refer to the on-demand economy dominated by firms like Uber and Airbnb.

These are economies based on collaborative production, but they do not include commons governance, access or an agenda of serving the public interest. A true sharing economy is one that is connected to the community and society and looks to serve the common interest, building more egalitarian relations.

How do we prevent corporations – or other structures of power such as the military – taking over the digital commons?

At this moment there are, in my view, three key strategies and goals:

1) Create public-commons partnerships. Push for political institutions to be led by commons principles and to support commons-based economic production (such as reinventing public services led by citizens’ participation, what I call commonification). Barcelona en Comú is providing a great model for this.

2) Reclaim the economy, and particularly develop an alternative financial system.

3) Confront patriarchy with the commons: in other words, embrace freedom and justice for all, not just for a particular privileged subject (male, white, etc.), and help foster greater diversity in society.

I think increasing the commons as a matrix within our systems has the greatest potential to develop an alternative to the current capitalist system.

— source tni.org

Most Adults Spend More Time on Their Digital Devices Than They Think

A recent national survey conducted by Common Sense Media, which included nearly 1,800 parents of children aged eight to 18, found that parents spend an average of nine hours and 22 minutes every day in front of various screens—including smartphones, tablets, computers and televisions. Of those, nearly eight hours are for personal use, not work.

Perhaps even more surprising is that 78 percent of parents surveyed believe they are good role models for how to use digital technology. Multimedia are designed to be engaging and habit-forming, so we do not even realize how much time we spend when we heed the siren call of our devices, says Catherine Steiner-Adair, a clinical psychologist and author of The Big Disconnect.

— source scientificamerican.com

On culture and power

Interview with Ahdaf Soueif

We seem to be passing through a period of contradictions, of dislocating change and crisis, of raised hopes and terrible realities. We have seen the economic crisis question the very pillars of neoliberalism, yet market fundamentalism has not faltered in its march. We have had an inspiring flourishing of social movements, yet authoritarian leaders everywhere are rising to the fore. How do you understand this moment?

There’s clearly a struggle taking place on a universal scale between a system that has the world in its clutches, and something new that’s trying to be born. In a sense, this is the story of mankind.

But some elements in our situation are particular to this moment. The first is an awareness of the interconnectedness of the world – in both problems and solutions.

Of course people with interests in the matter have always been aware of the opportunities that different parts of the world present; trade, conquest and migration are all based on that. But now there’s a growing general awareness that the world’s problems need to be solved globally.

Environmental issues are the most obvious examples, but there are many more – the exponential increase of both wealth and poverty and the obscene gap between rich and poor, wars, migration and the movement of capital, and more – and they’re all linked.

We can’t pretend that this awareness is shared by everybody, but it’s shared by enough politicized groups of – on the whole young – people across the world (can we call this ‘the Young Global Collective’?) – to make it now unremarkable for us to see Palestinians, say, sending messages of support to Black activists in the US, or to have seen Occupy using Tahrir iconography.

This awareness needs to grow, to coalesce, and to use its potential to generate ideas and action. For this we need forms of global conferring, decision-making, solidarity and action – like the TNI enterprise.

The second element that’s particular to this moment is that the Internet and related technology seem to hold a promise that global conversations and actions are achievable.

The third element is that the fate of the planet itself is in the balance; this lends acute urgency to the struggle.

On the other hand, the existing power system also sees the issues in global terms – and it is in a better position to create and enact its own global solidarity in pursuit of consolidating and furthering its power. Bilderberg, Davos, G8 are all examples of this.

I think we need to recognise that neoliberalism has not failed. It has not failed its proponents. It understands that it has not delivered its promises to ‘the people’ and that, therefore, it is under attack. But its answer is to find reasons outside itself for this non-delivery (immigrants, shirkers, Terror) and to repeat its promises more emphatically every time it changes its front players.

It plays to the fears of the audience, and it breathes life into the demons in their psyches: jingoism, selfishness, racism, a readiness to embrace violence, etc. The Trump campaign was an example of this.

The system – being old and in power – has its ideas, arguments, discourse and justifications in place. And embedded within it are the power structures with which it protects and continuously justifies and consolidates itself: the governments, the intelligence, police, security and military establishments, the legal and financial systems that underpin them – and the media.

One of the traits I find really attractive and encouraging in the Young Global Collective is how unconstrained it is by old ideologies. It has powerful ideas – and has ethics and natural justice on its side – but these ideas are not yet translated – how could they be? – into one overarching idea that can develop into a coherent system for running the world.

It has not yet found a way to coalesce into a global movement – although we often see bits trying to come together as happened in Cancun and Durban. (My sense is that the Green parties are the most suited to embrace and process the impulses and ideas of the Young Global Collective and forge them into a much-needed vision centred somehow around life, sustainability and human rights).

So what we have now is a situation where the Young Global Collective understands that Neoliberalism is lethally bad for most of the world’s population and for the planet itself. It continually challenges various aspects of Neoliberalism in a variety of ways in different parts of the world: activists take on Big Oil, armaments, the dismantling of the NHS in the UK, police brutality in the US and austerity in Greece; the BDS campaign takes on the Israeli occupation and ethnic cleansing of Palestine; and so on. Every one of these challenges raises our hopes.

It used to be a received idea that if millions came out onto the streets and stayed there the existing power structures would collapse and space would be created for something new. Exactly what the new thing would be like no-one knew, but everyone had a good idea what it would not be like. And everyone hoped there would be space and time for forms to evolve. Egypt 2011 proved that this was not true. Syria is proving it in even uglier fashion.

The young of the Young Global Collective are to a large extent averse to the structures and practices of power. They – commendably – want to change the world but not to rule it. In other words, most social movements would find it an impossible contradiction to employ, for example, an armed force to defend themselves and spin doctors and PR firms to propagate their ideas.

In the midst of this conflict there are now the emerging armed actors, like Islamic State. They serve the existing system – by purchasing arms, militarizing struggles, normalizing violence and by providing a Terror Monster for the use of fear-mongering politicians and so a justification for increased surveillance of citizens, increased spending on arms, intelligence and security. In a way, we are witnessing an alliance of Neoliberalism and Terror whipping democracies into fascism.

All these are things we need to be – I’m certain many of us are, constantly – thinking about, trying to imagine and image, to represent and to develop and to counter.

The culture of Tahrir square

You had the experience of being part of the movements in Cairo and Egypt that inspired the world. Can you tell us something about the culture of those resistance movements? Is there something of that time that remains today?

What happened in Egypt in January 2011, and so embedded itself in the world’s imagination, was a moment – a climactic moment – in a process that had started years before, and continues today.

I would particularly like to remind us of two non-governmental organisations (NGOs) that were set up in the 1990s: al-Nadim Centre for the Rehabilitation of Victims of Torture, and the Hisham Mubarak Law Centre (HMLC). Both were set up by committed and charismatic professionals: psychiatrist Aida Seif el-Dawla and lawyer Ahmed Seif, who were able to gather around themselves similarly committed and smart teams. Their focus was human rights, and their work revealed the extent to which the abuse of human rights had become a normalised part of how power structures in Egypt are serviced, consolidated and extended.

Both organizations sought to redress and challenge this abuse; al-Nadim treated people who had been tortured, published reports and statistics, and took huge risks campaigning against the Ministry of the Interior and individual officers; HMLC provided people with legal support against a wide spectrum of human rights violations, published information and research papers and sought to take cases to the Constitutional Court and so to change and develop the law itself.

By taking on the Mubarak regime in this way al-Nadim and HMLC started this latest round of resistance and enabled it to take root. The positions they took formed the “personality” of the 2011 revolution.

Both organizations provided their services free, paid their staff in line with Egyptian rates, and were very careful in choosing non-governmental funding sources whose agendas matched their own; this way they avoided the alienation and de-politicization that blights so much NGO work.

Both were clear that their services were available to everybody regardless of nationality, citizenship, faith, gender, sexual orientation, etc. HMLC, for example, took on the defence of the unpopular and dangerous ‘gay case’: the Queen Boat. Both were welcoming of refugees.

Their audience – their constituency – was the public both at home and abroad. They implemented a vision of human rights and speaking truth to power as international as well as local concerns. In Australia, I once met a Sudanese writer who told me al-Nadim had saved his life and his marriage.

The first decade of the new century saw a number of initiatives appearing in Egypt, all seeking change and challenging power.

HMLC then opened its doors to new initiatives like the Movement for the Support of the Palestinian Intifada (2000), providing these initiatives with a free safe space, resources, information and advice. Lawyers trained in HMLC established their own NGOs supporting freedom of information and expression, workers’ rights, land rights, personal rights, economic rights, housing rights and others.

There were movements for the independence of the universities and for the independence of the judiciary and movements simply for ‘change’. ‘Kefaya’ was an umbrella movement for social and political change which put sudden and imaginative protests on the streets from 2005 to 2011. The 6 April Youth Movement, which worked to establish links with workers’ protests, managed to create a reasonable cross-country presence with activists in every major town.

When – after 28 January 2011 – Tahrir, and other locations in Alexandria and other cities, for a period became liberated spaces, the culture they created was informed – in its basic principles – by the spirit of the work of the previous decade.

One clear principle was the empowerment of people. Activists taught reading and writing to street children who, for the first time, found a safe space on the street. The Mosireen Film Collective trained anybody who came along to shoot and edit film. Some of the trainees were street children who went on to shoot their own footage with Mosireen equipment. Mosireen documented housing struggles, fishing, industry and legal struggles and amplified people’s voices through them.

‘Let’s Write Our Constitution’ was an initiative set up by Alaa Abd el-Fattah (Ahmed Seif’s son and one of the most prominent figures of the revolution, now serving five years in prison for protesting) to elicit a new set of constitutional governing principles from ordinary people across the country.

Freedom of information was another principle, with Mosireen, again, acting not only as producer of footage but as collector, archivist, point of exchange and distributor of footage onto mobile phones. By the end of 2011, when the Supreme Council of the Armed Forces (SCAF) was killing revolutionaries on the streets, this footage was used by activists in the campaign: Kazeboon (‘They Lie’), to expose the lies of the military.

Kazeboon achieved what every grassroots movement aims for: non-ownership. People across the country downloaded footage from mobile phones, acquired or borrowed projectors and set up surprise screenings against walls. This was how everyone came to see the huge gap between the rhetoric of the military and what they were doing on the ground.

Another initiative set up by Alaa Abd el-Fattah – who happens to be a software designer – was ‘Tweet-Nadwa’: a discussion forum run by Twitter rules. This was one of the activities in which technology and game-playing were used to celebrate diversity, tackle difficult topics and bring people together.

The first Tweet-Nadwa brought young people from the Muslim Brotherhood and others who’d left the Brotherhood to talk about their experience with each other and with a wide audience – a meeting that was a ground-breaker for everyone involved. Another was held sitting on the ground in Tahrir, about reforming the police. Applause was not by clapping but by ‘twittering’ your hands in the air. Attracted by the twittering hands more and more people joined the nadwa and the discussion. Tweets were streamed on a large screen.

In Tahrir there was, overall, a rejection of Neoliberal capitalism. There was true altruism, a rooted belief in human rights and a tremendous emphasis on social justice. People understood that their views about how to achieve social justice ranged from centre right to far left, but it was enough that everyone wanted it. For the moment the work of keeping the sit-ins alive, pressing for the removal of Mubarak and then trying to press for transparent, democratic government provided enough common ground for people to work from.

As Tahrir was periodically under attack by security forces, the army and various related thugs, field hospitals very quickly appeared. The organization and efficiency of these cannot be overstated. Entire systems came into being organically to save and treat people. After the attacks they often stayed in place to provide everyone with free medical care. A culture also grew where many physicians who did not come to Tahrir made themselves available to perform emergency surgery – particularly eye surgery – for free at their hospitals and clinics.

And, of course, art was everywhere. From simple pavement drawings by people who suddenly realised they were free to make them, to huge sophisticated murals by graffiti artists that expressed – and created – the collective spirit of Tahrir. Graffiti artists recorded events, made statements, created iconic emblems – The Blue Bra, Nefertiti in a Gas Mask, Angel Ultras, Universal Man – and eventually created massive murals which mined the art of every era of Egypt’s long past to bring its aesthetic and moral force to bear on the present moment.

There’s much more. But I want to close this section with a quote from a young revolutionary, written in December 2011:

Tahrir Square worked because it was inclusive – with every type of Egyptian represented equally. It worked because it was inventive – from the creation of electric and sanitation infrastructure to the daily arrival of new chants and banners. It worked because it was open-source and participatory – so it was unkillable and incorruptible. It worked because it was modern – online communication baffled the government while allowing the revolutionaries to organise efficiently and quickly. It worked because it was peaceful – the first chant that went up when under attack was always ‘Selmeyya! Selmeyya!’. It worked because it was just – not a single attacking (thug) was killed, they were all arrested. It worked because it was communal – everyone in there, to a greater or lesser extent, was putting the good of the people before the individual. It worked because it was unified and focused – Mubarak’s departure was an unbreakable bond. It worked because everyone believed in it. Inclusive, inventive, open-source, modern, peaceful, just, communal, unified and focused. A set of ideals on which to build a national politics.

You ask: is there something of that time that remains today? The answer has to be ‘yes’. Even though thousands of our young people have been killed and thousands have been injured – some without repair; even though tens of thousands are in prison and hundreds of thousands live in trauma.

Even though the country has been through betrayals and massacres, the democratic process has been discredited and the military has established a counter-revolutionary regime more repressive and vicious than any that Egypt has ever known.

Even though there are people who found themselves in the revolution and when it was lost they were lost. Even though there are people who are disillusioned and bitter and people who pretend 2011 was a mass hallucination and people who have gone back to their lives and are trying to forget that the last six years ever happened.

Yet, I would say that everybody who was truly involved in 2011 and who is now working on something – anything, whether they are in Egypt or outside it – is doing work that will one day fuel the next revolutionary wave.

Enough to note the internet news sites like MadaMasr, or al-Badeel, or Yanair and all the people working in them, still providing news, analysis and commentary. The network of legal and practical support for the prisoners – still functioning despite exhaustion. The human rights organizations born of HMLC, still working despite arrests, freezes on assets and smear campaigns. Aida Seif el-Dawla and her colleagues sitting in their office, refusing to close, and facing down 20 security agents just a few weeks ago.

And all of this while arrests, disappearances and deaths in prisons and in police stations continue.

Culture and meaning

From your experience in participating in the Arab Spring and then seeing its hopes and aspirations diverted or crushed, how do you see the role of culture in sustaining and one day delivering on those dreams?

It is through culture that we describe the world and what has happened to it and to us in it. We comment on the present, excavate the past and try to imagine a future – or several.

Culture holds up a mirror, criticizes, tries to synthesize; it puts worlds together, opens up feelings, validates them, provides illumination, ideas, respite. Culture is dreaming the dream, it is also enacting it. Culture provides us with the language, the symbols, the imagery to explore, communicate and propose. Without culture no dream is dreamable.

I’d like to give you one tiny, and to me powerful, example of the role of culture in creating meaning.

We’ve all seen the ancient Egyptian symbol of the scarab with a disc between its front legs. Well, some artist, thousands of years ago, watched a black beetle lay its eggs in a bit of animal dung. The beetle rolled the dung and rolled it and rolled it till the eggs were encased in a ball of dung at least twice its own size. Then the beetle dug a hole. Then, moving backwards and using its hind legs, it rolled the ball of dung deep into the hole. It then came out of the hole, filled it up and went away.

The artist watched the space where the hole had been until one day, struggling out of the earth, there emerged 15, 20 baby beetles. As they found their feet and shook the dung and the earth off their wings and started to take their first tentative flying leaps the artist saw the new little beetles’ luminous bright blue wings catching the light. Iridescent, sparkling little joyous flickers of sapphire blue against the earth. The Scarab is the dung beetle, the Disc is both the ball of dung and the sun that gives light and life to everything.

For the reader, thousands of years ago, the image of the ‘Scarab holding the Disc’ spoke of ‘becoming’, of transformation and emerging. Just think of all the elements that image brought together, and of the power of what it proposed.

“Culture” is often regarded – and may be presented by political and religious authorities – as being both homogeneous and static, yet we know that there are always fault lines and voices that don’t get a hearing or are suppressed. We also know that well-intentioned external efforts to address harmful or gender-restrictive traditional practices can have the effect of further entrenching them. How, then, can changes happen and be sustained?

I think change happens organically within each community/culture group. The change is for the better – in other words towards more freedom, openness, transparency etc – when people are confident and not defensive. ‘Outside’ intervention should only ever be at the request and according to the demands and guidelines of trusted and authentic ‘inside’ groups. So a feminist group from Somalia, say, wanting to work on feminist issues in Norway, would only do so in partnership with credible, rooted Norwegian groups and within their programme.

A culture of transformation

What does a culture of transformation, democracy and justice look like?

The society we dream of would be one in which no child is born disadvantaged, where basic education and healthcare are free, where people don’t have to worry about survival, where everyone has enough time and resources to fulfill their potential as they see it, where people are truly involved in the decisions that will affect them, where we respect the earth and all natural creatures on it.

I believe that the world needs to be engaged with and run as one unit.

I believe that the processes of democracy, unless accompanied by certain safeguards, are merely a tool for the system. When people have to make a decision on an issue they need to have all the information that’s relevant to that issue, be able to understand it, and be free of any need or coercion that affects their decision. Without transparency, freedom of information, education and guaranteed human rights there is no democracy.

Finally, this is a statement that I’ve been making for a while, and I would like to make it again here, as particularly relevant to TNI:

“If I could decree a universal education programme, I would make every child in the world learn a brief history of the entire world that focussed on the common ground. It would examine how people perceive their relationship to each other, to the planet, and to the universe, and it would see human history as an ongoing, joint project, where one lot of people picked up where another had left off.”

— source longreads.tni.org

How to Convince Someone When Facts Fail

Have you ever noticed that when you present people with facts that are contrary to their deepest held beliefs they always change their minds? Me neither. In fact, people seem to double down on their beliefs in the teeth of overwhelming evidence against them. The reason is related to the worldview perceived to be under threat by the conflicting data.

Creationists, for example, dispute the evidence for evolution in fossils and DNA because they are concerned about secular forces encroaching on religious faith. Anti-vaxxers distrust big pharma and think that money corrupts medicine, which leads them to believe that vaccines cause autism despite the inconvenient truth that the one and only study claiming such a link was retracted and its lead author accused of fraud. The 9/11 truthers focus on minutiae like the melting point of steel in the World Trade Center buildings that caused their collapse because they think the government lies and conducts “false flag” operations to create a New World Order. Climate deniers study tree rings, ice cores and the ppm of greenhouse gases because they are passionate about freedom, especially that of markets and industries to operate unencumbered by restrictive government regulations. Obama birthers desperately dissected the president’s long-form birth certificate in search of fraud because they believe that the nation’s first African-American president is a socialist bent on destroying the country.

In these examples, proponents’ deepest held worldviews were perceived to be threatened by skeptics, making facts the enemy to be slayed. This power of belief over evidence is the result of two factors: cognitive dissonance and the backfire effect. In the classic 1956 book When Prophecy Fails, psychologist Leon Festinger and his co-authors described what happened to a UFO cult when the mother ship failed to arrive at the appointed time. Instead of admitting error, “members of the group sought frantically to convince the world of their beliefs,” and they made “a series of desperate attempts to erase their rankling dissonance by making prediction after prediction in the hope that one would come true.” Festinger called this cognitive dissonance, or the uncomfortable tension that comes from holding two conflicting thoughts simultaneously.

Two social psychologists, Carol Tavris and Elliot Aronson (a former student of Festinger), in their 2007 book Mistakes Were Made (But Not by Me), document thousands of experiments demonstrating how people spin-doctor facts to fit preconceived beliefs in order to reduce dissonance. Their metaphor of the “pyramid of choice” places two individuals side by side at the apex of the pyramid and shows how quickly they diverge and end up at opposite bottom corners of the base as they each stake out a position to defend.

In a series of experiments by Dartmouth College professor Brendan Nyhan and University of Exeter professor Jason Reifler, the researchers identify a related factor they call the backfire effect “in which corrections actually increase misperceptions among the group in question.” Why? “Because it threatens their worldview or self-concept.” For example, subjects were given fake newspaper articles that confirmed widespread misconceptions, such as that there were weapons of mass destruction in Iraq. When subjects were then given a corrective article that WMD were never found, liberals who opposed the war accepted the new article and rejected the old, whereas conservatives who supported the war did the opposite … and more: they reported being even more convinced there were WMD after the correction, arguing that this only proved that Saddam Hussein hid or destroyed them. In fact, Nyhan and Reifler note, among many conservatives “the belief that Iraq possessed WMD immediately before the U.S. invasion persisted long after the Bush administration itself concluded otherwise.”

If corrective facts only make matters worse, what can we do to convince people of the error of their beliefs? From my experience, 1. keep emotions out of the exchange, 2. discuss, don’t attack (no ad hominem and no ad Hitlerum), 3. listen carefully and try to articulate the other position accurately, 4. show respect, 5. acknowledge that you understand why someone might hold that opinion, and 6. try to show how changing facts does not necessarily mean changing worldviews. These strategies may not always work to change people’s minds, but now that the nation has just been put through a political fact-check wringer, they may help reduce unnecessary divisiveness.

— source scientificamerican.com By Michael Shermer

1 in 5 young people lose sleep over social media

1 in 5 young people regularly wake up in the night to send or check messages on social media, according to new research published today in the Journal of Youth Studies. This night-time activity makes teenagers three times more likely to feel constantly tired at school than their peers who do not log on at night, and could be affecting their happiness and wellbeing. Girls are much more likely to access their social media accounts during the night than boys.

— source alphagalileo.org

Why Jared Diamond’s ‘The World Until Yesterday’ Is Completely Wrong

I ought to like this book: after all, I have spent decades saying we can learn from tribal peoples, and that is, or so we are told, Jared Diamond’s principal message in his new “popular science” work, The World Until Yesterday. But is it really?

Diamond has been commuting for 50 years between the U.S. and New Guinea to study birds, and he must know the island and some of its peoples well. He has spent time in both halves, Papua New Guinea and Indonesian-occupied West Papua. He is in no doubt that New Guineans are just as intelligent as anyone, and he has clearly thought a lot about the differences between them and societies like his, which he terms Western, educated, industrialized, rich, and democratic (“WEIRD”). He calls the latter “modern.”

Had he left it at that, he would have at least upset only some experts in New Guinea, who think his characterizations miss the point. But he goes further, overreaching considerably by adding a number of other, what he terms “traditional” societies, and then generalizing wildly. His information here is largely gleaned from social scientists, particularly (for those in South America) from the studies of the American anthropologists Napoleon Chagnon and Kim Hill, who crop up several times.

It is true that Diamond does briefly mention, in passing, that all such societies have “been partly modified by contact,” but he has still decided they are best thought about as if they lived more or less as all humankind did until the “earliest origins of agriculture around 11,000 years ago in the Fertile Crescent,” as he puts it. That is his unequivocal message, and the meaning of “yesterday” in his title. This is a common mistake, and Diamond wastes little of his very long book trying to support it. The dust jacket, which he must agree with even if he did not actually write it, makes the astonishingly overweening claim that “tribal societies offer an extraordinary window into how our ancestors lived for millions of years” (my emphasis).

This is nonsense. Many scientists debunk the idea that contemporary tribes reveal anything significantly more about our ancestors, of even a few thousand years ago, than we all do. Obviously, self-sufficiency is and was an important component of the ways of life of both; equally obviously, neither approaches nor approached the heaving and burgeoning populations visible in today’s cities. In these senses, any numerically small and largely self-sufficient society might provide something of a model of ancient life, at least in some respects. Nevertheless, tribal peoples are simply not replicas of our ancestors.

Britain’s foremost expert on prehistoric man, Chris Stringer of London’s Natural History Museum, for example, routinely cautions against seeing modern hunter-gatherers as “living fossils,” and repeatedly emphasizes that, like everyone else, their “genes, cultures and behaviors” have continued to evolve to the present. They must have changed, of course, or they simply would not have survived.

It is important to note that, although Diamond’s thesis is that we were all once “hunter-gatherers” and that this is the main key to them being seen as our window into the past, in fact most New Guineans do little hunting. They live principally from cultivations, as they probably have for millennia. Diamond barely slips in the fact that their main foodstuff, sweet potato, was probably imported from the Americas, perhaps a few hundred or a thousand years ago. No one agrees on how this came about, but it is just one demonstration that “globalization” and change have impacted on Diamond’s “traditional” peoples for just as long as on everyone else. Disturbingly, Diamond knows these things, but he does not allow them to spoil his conclusions.

But he has come up with a list of practices he thinks we should learn from “traditional” societies, and all this is well and good, though little of it appears particularly radical or novel. He believes we (Americans, at least) should make more effort to put criminals on a better track, and try to rehabilitate rather than merely punish. He feels we should carry our babies more, and ensure they’re facing forward when we cart them around (which is slightly odd because most strollers and many baby carriers face forward anyway). He pleads with us to value old people more … and proffers much similar advice. These “self-help manual” sections of the book are pretty unobjectionable, even occasionally thought-provoking, though it is difficult to see what impact they might really have on rich Westerners or governments.

Diamond is certainly in fine fettle when he finally turns to the physiology of our recent excessive salt and sugar intake, and the catastrophic impact it brings to health. His description of how large a proportion of the world is racking up obesity, blindness, limb amputations, kidney failure, and much more, is a vitally important message that cannot be overstressed. Pointing out that the average Yanomami Indian, at home in Amazonia, takes over a year to consume the same amount of salt as can be found in a single dish of a Los Angeles restaurant is a real shocker and should be a wake-up call.

The real problem with Diamond’s book, and it is a very big one, is that he thinks “traditional” societies do nasty things which cry out for the intervention of state governments to stop. His key point is that they kill a lot, be it in “war,” infanticide, or the abandonment, or murder, of the very old. This he repeats endlessly. He is convinced he can explain why they do this, and demonstrates the cold, but necessary, logic behind it. Although he admits to never actually having seen any of this in all his travels, he supports his point both with personal anecdotes from New Guinea and a great deal of “data” about a very few tribes—a good proportion of it originating with the anthropologists mentioned above. Many of his boldly stated “facts” are, at best, questionable.

How much of this actually is fact, and how much just personal opinion? It is of course true that many of the tribes he cites do express violence in various ways; people kill people everywhere, as nobody would deny. But how murderous are they exactly, and how to quantify it? Diamond claims that tribes are considerably more prone to killing than are societies ruled by state governments. He goes much further. Despite acknowledging, rather sotto voce, that there are no reports of any war at all in some societies, he does not let this cloud his principal emphasis: most tribal peoples live in a state of constant war.

He supports this entirely unverifiable and dangerous nonsense (as have others, such as Steven Pinker) by taking the numbers killed in wars and homicides in industrialized states and calculating the proportions of the total populations involved. He then compares the results with figures produced by anthropologists like Chagnon for tribes like the Yanomami. He thinks that the results prove that a much higher proportion of individuals are killed in tribal conflict than in state wars; ergo tribal peoples are more violent than “we” are.

There are of course lies, damned lies, and statistics. Let us first give Diamond the benefit of several highly debatable, not to say controversial, doubts. I will, for example, pass over the likelihood that at least some of these intertribal “wars” are likely to have been exacerbated, if not caused, by land encroachment or other hostilities from colonist societies. I will also leave aside the fact that Chagnon’s data, from his work with the Yanomami in the 1960s, has been discredited for decades: most anthropologists working with the Yanomami simply do not recognize Chagnon’s violent caricature of those he calls the “fierce people.” I will also skate over Kim Hill’s role in denying the genocide of the Aché Indians at the hands of Paraguayan settlers and the Army in the 1960s and early 1970s. (Though there is an interesting pointer to this cited in Diamond’s book: as he says, over half of Aché “violent deaths” were at the hands of nontribals.)

I will also throw only a passing glance at the fact that Diamond refers only to those societies where social scientists have collected data on homicides, and ignores the hundreds where this has not been examined, perhaps because—at least in some cases—there was no such data. After all, scientists seeking to study violence and war are unlikely to spend their precious fieldwork dropping in on tribes with little noticeable tradition of killing. In saying this, I stress once again, I am not denying that people kill people—everywhere. The question is, how much?

Awarding Diamond all the above ‘benefits of doubt’, and restricting my remarks to looking just at “our” side of the story: how many are killed in our wars, and how reasonable is it to cite those numbers as a proportion of the total population of the countries involved?

Is it meaningful, for example, to follow Diamond in calculating deaths in the fighting for Okinawa in 1945 as a percentage of the total populations of all combatant nations—he gives the result as 0.10 percent—and then comparing this with eleven tribal Dani deaths during a conflict in 1961? Diamond reckons the latter as 0.14 percent of the Dani population—more than at Okinawa.

Viewed like this, the Dani violence is worse than the bloodiest Pacific battle of WWII. But of course the largest nation involved in Okinawa was the U.S., which saw no fighting on its mainland at all. Would it not be more sensible to look at, say, the percentage of people killed who were actually in the areas where the war was taking place? No one knows, but estimates of the proportion of Okinawan citizens killed in the battle, for example, range from about 10 percent to 33 percent. Taking the upper figure gives a result of nearly 250 times more deaths than the proportion for the Dani violence, and does not even count any of the military killed in the battle.
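To make the denominator point concrete, here is a minimal back-of-the-envelope sketch in Python. Only the 0.10 percent, 0.14 percent and 10-33 percent figures come from the text; the implied Dani population of roughly 8,000 (derived from the 0.14 percent figure for 11 deaths) is an illustrative assumption, and the script is an aid to the reader, not a calculation from the book or from this article.

```python
# Illustrative only: the percentages quoted in the text, recomputed to show
# how the choice of denominator drives the comparison. The ~8,000 Dani
# population is an assumption implied by Diamond's 0.14% figure for 11 deaths.

def as_percent(deaths, population):
    """Express deaths as a percentage of a chosen reference population."""
    return 100.0 * deaths / population

# Dani conflict, 1961: 11 deaths out of an implied ~8,000 people.
dani_pct = as_percent(11, 8_000)          # ~0.14%

# Battle of Okinawa, 1945, measured against two different denominators:
okinawa_vs_all_combatants_pct = 0.10      # Diamond's basis: every combatant nation
okinawa_vs_okinawans_pct = 33.0           # upper estimate for Okinawan civilians killed

print(f"Dani deaths:    {dani_pct:.2f}% of the Dani population")
print(f"Okinawa deaths: {okinawa_vs_all_combatants_pct:.2f}% of all combatant nations")
print(f"Okinawa deaths: {okinawa_vs_okinawans_pct:.0f}% of Okinawan civilians (upper estimate)")

# Switching the denominator from 'all combatant nations' to 'Okinawans' raises
# the apparent rate to roughly 240 times the Dani figure, on the order of the
# article's 'nearly 250 times'.
print(f"Okinawan basis / Dani basis: {okinawa_vs_okinawans_pct / dani_pct:.0f}x")
```

The same death tolls thus look negligible or catastrophic depending entirely on whose population sits under the line, which is the contrivance the article is objecting to.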

Similarly, Diamond tells us that the proportion of people killed in Hiroshima in August 1945 was a tiny 0.1 percent of the Japanese people. However, what about the much smaller “tribe” of what we might call “Hiroshimans,” whose death toll was nearly 50 percent from a single bomb? Which numbers are more meaningful; which could be seen as a contrivance to support the conceit that tribespeople are the bigger killers? By supposedly “proving” his thesis in this way, to what degree does Diamond’s characterization differ significantly from labeling tribal peoples as “primitive savages,” or at any rate as more savage than “we” are?

If you think I am exaggerating the problem—after all, Diamond does not say “primitive savage” himself—then consider how professional readers of his book see it: his reviewers from the prestigious Sunday Times (U.K.) and The Wall Street Journal (U.S.) both call tribes “primitive,” and Germany’s popular Stern magazine splashed “Wilde” (“savages”) in large letters across its pages when describing the book.

Seek and you shall find statistics to underscore any conceivable position on this. Diamond is no fool and doubtless knows all this—the problem is in what he chooses to present and emphasize, and what he leaves out or skates over.

I do not have the author’s 500 pages to expand, so I will leave aside the problem of infanticide (I have looked at it in other contexts), but I cannot omit a response to the fact that, as he repeatedly tells us, some tribes abandon, or abandoned, their old at the end of their lives, leaving them only with what food or water might be spared, and moving on in the sure knowledge that death would quickly follow, or even hastening it deliberately.

Again, Diamond explains the logic of it, and again he tells us that, because of munificent state governments’ ability to organize “efficient food distribution,” and because it is now illegal to kill people like this, “modern” societies have left such behavior behind.

Really? So let us forget the 40 million or so dead in the Great Chinese Famine of the early 1960s. But what about the widespread, though usually very quiet, medical practice of giving patients strong doses of opiates—really strong doses—when illness and age have reached a threshold? The drugs relieve pain, but they also suppress the respiratory reflex, leading directly to death. Or, what about deliberately withholding food and fluids from patients judged near the end? Specialist nonprofits reckon there are about a million elderly people in the U.K. alone who are malnourished or even starving, many inside hospitals. So how different is what we industrialized folk get up to from some tribal practices? Are we all “savages” too?

Contrasting tribal with industrialized societies has always been more about politics than science, and we should be extremely wary of those who use statistics to “prove” their views. It all depends on what your question is, whom you believe, and most of all, exactly where you are standing when you ask it.

If, for example, you are an Aguaruna Indian in Peru, with a history of occasional revenge raiding stretching back the small handful of generations which comprise living memory (no Aguaruna can really know the extent to which such raiding was going on even a few generations ago, leave alone millennia), and if you have recently been pushed out of the forest interior into riverine villages by encroachment from oil exploration or missionaries, then your chances of being killed by your compatriots might even exceed those caught in Mexican drugs wars, Brazilian favelas, or Chicago’s South Side.

In such circumstances there would undoubtedly be much more homicide in Aguaruna-land than that faced by well-heeled American college professors, but also much less than that confronted by inmates in Soviet gulags, Nazi concentration camps, or those who took up arms against colonial rule in British Kenya, or apartheid South Africa.

If you find yourself born a boy in the Pine Ridge Indian Reservation, in the center of the world’s richest nation, your average lifespan will be shorter than in any country in the world except for some African states and Afghanistan. If you escape being murdered, you may end up dead anyway, from diabetes, alcoholism, drug addiction, or similar. Such misery, not inevitable but likely, would not result from your own choices, but from those made by the state over the last couple of hundred years.

What does any of this really tell us about violence throughout human history? The fanciful assertion that nation states lessen it is unlikely to convince a Russian or Chinese dissident, or Tibetan. It will not be very persuasive either to West Papuan tribes, where the Indonesian invasion and occupation has been responsible for a guessed 100,000 killings at least (no one will ever know the actual number), and where state-sponsored torture can now be viewed on YouTube. The state is responsible for killing more tribespeople in West Papua than anywhere else in the world.

Although his book is rooted in New Guinea, not only does Diamond fail to mention Indonesian atrocities, he actually writes of “the continued low level of violence in Indonesian New Guinea under maintained rigorous government control there.” This is a breathtaking denial of brutal state-sponsored repression waged on lightly armed tribespeople for decades.

The political dimensions concerning how tribal peoples are portrayed by outsiders, and how they are actually treated by them, are intertwined and inescapable: industrialized societies treat tribes well or badly depending on what they think of them, as well as what they want from them. Are they “backward,” from “yesterday”; are they more “savage,” more violent, than we are?

Jared Diamond has powerful and wealthy backers. He is a prestigious academic and author, a Pulitzer Prize winner no less, who sits in a commanding position in two American, and immensely rich, corporate-governmental organizations (they are not really NGOs at all), the World Wildlife Fund (WWF) and Conservation International (CI), whose record on tribal peoples is, to say the least, questionable. He is very much in favor of strong states and leaders, and he believes efforts to minimize inequality are “idealistic,” and have failed anyway. He thinks that governments which assert their “monopoly of force” are rendering a “huge service” because “most small-scale societies [are] trapped in … warfare” (my emphasis). “The biggest advantage of state government,” he waxes, is “the bringing of peace.”

Diamond comes out unequivocally in favor of the same “pacification of the natives,” which was the cornerstone of European colonialism and world domination. Furthermore, he echoes imperial propaganda by claiming tribes welcome it, according to him, “willingly abandon[ing] their jungle lifestyle.”

With this, he in effect attacks decades of work by tribal peoples and their supporters, who have opposed the theft of their land and resources, and asserted their right to live as they choose—often successfully. Diamond backs up his sweeping assault with just two “instances”: Kim Hill’s work with the Aché; and a friend who recounted that he “traveled halfway around the world to meet a recently discovered band of New Guinea forest hunter-gatherers, only to discover that half of them had already chosen to move to an Indonesian village and put on T-shirts, because life there was safer and more comfortable.”

This would be comic were it not tragic. The Aché, for example, had suffered generations of genocidal attacks and slavery. Was Diamond’s disappointed friend in New Guinea unaware of the high probability of carrying infectious diseases? If this really were a recently “discovered” band, which is highly unlikely, such a visit was, to say the least, irresponsible. Or, was it rather a contrived tourist visit, like almost all supposed “first contacts” in New Guinea where a playacting industry has grown up around such deception? In either event, West Papuans are “safer” in Indonesian villages only if they are prepared to accept subjugation to a mainstream society which does not want them around.

As I said, I ought to like this book. It asserts, as I do, that we have much to learn from tribal peoples, but it actually turns out to propose nothing that challenges the status quo.

Diamond adds his voice to a very influential sector of American academia which is, naively or not, striving to bring back out-of-date caricatures of tribal peoples. These erudite and polymath academics claim scientific proof for their damaging theories and political views (as did respected eugenicists once). In my own humbler opinion and experience, this is completely wrong, both factually and morally, and extremely dangerous. The principal cause of the destruction of tribal peoples is the imposition of nation states. This does not save them; it kills them.

Were those of Diamond’s (and Pinker’s) persuasion to be widely believed, they would risk pushing the advancement of human rights for tribal peoples back decades. Yesterday’s world repeated tomorrow? I hope not.

— source thedailybeast.com By Stephen Corry

How Humans Became ‘Consumers’

“Consumption is the sole end and purpose of all production,” Adam Smith confidently announced in The Wealth of Nations in 1776. Smith’s quote is famous, but in reality this was one of the few times he explicitly addressed the topic. Consumption is conspicuous by its absence in The Wealth of Nations, and neither Smith nor his immediate pupils treated it as a separate branch of political economy.

It was in an earlier work, 1759’s The Theory of Moral Sentiments, that Smith put his finger on the social and psychological impulses that push people to accumulate objects and gadgets. People, he observed, were stuffing their pockets with “little conveniences,” and then buying coats with more pockets to carry even more. By themselves, tweezer cases, elaborate snuff boxes, and other “baubles” might not have much use. But, Smith pointed out, what mattered was that people looked at them as “means of happiness.” It was in people’s imagination that these objects became part of a harmonious system and made the pleasures of wealth “grand and beautiful and noble.”

This moral assessment was a giant step towards a more sophisticated understanding of consumption, for it challenged the dominant negative mindset that went back to the ancients. From Plato in ancient Greece to St. Augustine and the Christian fathers to writers in the Italian Renaissance, thinkers routinely condemned the pursuit of things as wicked and dangerous because it corrupted the human soul, destroyed republics, and overthrew the social order. The splendour of luxus, the Latin word for “luxury,” smacked of luxuria—excess and lechery.

The term “consumption” itself entered circulation with a heavy burden. It originally derived from the Latin word consumere and found its way first into French in the 12th century, and from there into English and later into other European languages. It meant the using up of food, candles, and other resources. (The body, too, could be consumed, in this sense—this is why in English, the “wasting disease,” tuberculosis, was called “consumption.”) To complicate matters, there was the similar-sounding Latin word consummare, as in Christ’s last words on the cross: “Consummatum est,” meaning “It is finished.” The word came to mean using up, wasting away, and finishing.

Perhaps those meanings informed the way that many pre-modern governments regulated citizens’ consumption. Between the 14th and 18th centuries, most European states (and their American colonies) rolled out an ever longer list of “sumptuary laws” to try and stem the tide of fashion and fineries. The Venetian senate stipulated in 1512 that no more than six forks and six spoons could be given as wedding gifts; gilded chests and mirrors were completely forbidden. Two centuries later, in German states, women were fined or thrown in jail for sporting a cotton neckerchief.

To rulers and moralists, such a punitive, restrictive view of the world of goods made eminent sense. Their societies lived with limited money and resources in an era before sustained growth. Money spent on a novelty item from afar, such as Indian cotton, was money lost to the local treasury and to local producers; those producers, and the land they owned, were heralded as sources of strength and virtue. Consumers, by contrast, were seen as fickle and a drain on wealth.

Adam Smith’s reappraisal of this group in 1776 came in the midst of a transformation that was as much material as it was cultural. Between the 15th and 18th centuries, the world of goods was expanding in dramatic and unprecedented ways, and it was not a phenomenon confined to Europe. Late Ming China enjoyed a golden age of commerce that brought a profusion of porcelain cups, lacquerware, and books. In Renaissance Italy, it was not only the palazzi of the elite but the homes of artisans that were filling up with more and more clothing, furniture, and tableware, even paintings and musical instruments.

It was in Holland and Britain, though, where the momentum became self-sustaining. In China, goods had been prized for their antiquity; in Italy, a lot of them had circulated as gifts or stored wealth. The Dutch and English, by contrast, put a new premium on novelties such as Indian cottons, exotic goods like tea and coffee, and new products like the gadgets that caught Smith’s attention.

In the 1630s, the Dutch polymath Caspar Barlaeus praised trade for teaching people to appreciate new things, and such secular arguments for the introduction of new consumer products—whether through innovation or importation—were reinforced by religious ones. Would God have created a world rich in minerals and exotic plants, if He had not wanted people to discover and exploit them? The divine had furnished man with a “multiplicity of desires” for a reason, wrote Robert Boyle, the scientist famous for his experiments with gases. Instead of leading people astray from the true Christian path, the pursuit of new objects and desires was now justified as acting out God’s will. In the mid-18th century, Smith’s close friend David Hume completed the defense of moderate luxury. Far from being wasteful or ruining a community, it came to be seen as making nations richer, more civilized, and stronger.

By the late 18th century, then, there were in circulation many of the moral and analytical ingredients for a more positive theory of consumption. But the French Revolution and the subsequent reaction stopped them from coming together. For many radicals and conservatives alike, the revolution was a dangerous warning that excess and high living had eaten away at social virtues and stability. Austerity and a new simple life were held up as answers.

Moreover, economic writers at the time did not dream there could be something like sustained growth. Hence consumption could easily be treated as a destructive act that used up resources or at best redistributed them. Even when writers were feeling their way towards the idea of a higher standard of living for all, they did not yet talk of different groups of people as “consumers.” One reason was that, unlike today, they did not yet single out the goods and services that households purchased, but often also included industrial uses of resources under the rubric of consumption.

The French economist Jean-Baptiste Say—today remembered for Say’s law, which states that supply creates its own demand—was one of the few writers in the early 19th century who considered consumption on its own, according the topic a special section in his Treatise on Political Economy. Interestingly, he included the “reproductive consumption” of coal, wood, metal, and other goods used in factories alongside the private end-use by customers.

Elsewhere, other economists showed little interest in devising a unified theory of consumption. As the leading public moralist in Victorian England and a champion of the weak and vulnerable, John Stuart Mill naturally stood up for the protection of unorganized consumers against the interests of organized monopolies. In his professional writings, however, consumption got short shrift. Mill even denied that it might be a worthy branch of economic analysis: “We know not of any laws of the consumption of wealth as the subject of a distinct science,” he declared in 1844. “They can be no other than the laws of human enjoyment.” Anyone pitching a distinct analysis of consumption was guilty by association of believing in the possibility of “under-consumption,” an idea that to Mill was suspect, wrong, and dangerous.

It fell to a popular French liberal and writer, Frédéric Bastiat, to champion the consumer—supposedly his dying words in 1850 were “We must learn to look at everything from the point of view of the consumer.” That may have sounded prescient but it hardly qualified as a theory, since Bastiat believed that free markets ultimately took care of everything. For someone like Mill with a concern for social justice and situations when markets did not function, such laissez-faire dogma was bad politics just as much as bad economics.

By the middle of the 19th century, then, there was a curious mismatch between material and intellectual trends. Consumer markets had expanded enormously in the previous two centuries. In economics, by contrast, the consumer was still a marginal figure who mainly caught attention in situations of market failure, such as when urban utilities failed or cheated their customers, but rarely for the increasingly important role consumers would play in the expansion of modern economies.

Theory finally caught up in 1871, when William Stanley Jevons published his Theory of Political Economy. “The theory of economics,” he wrote, “must begin with a correct theory of consumption.” Mill and his ilk had it completely wrong, he argued. For them the value of goods was a function of their cost, such as the cloth and sweat that went into making a coat. Jevons looked at the matter from the other end. Value was created by the consumer, not the producer: The value of the coat depended on how much a person desired it.

Further, that desire was not fixed but varied, and depended on a product’s utility function. Goods had a “final (or marginal) utility,” where each additional portion had less utility than the one before, because the final one was less intensely desired, a foundational economic concept that can be understood intuitively through cake: The first slice may taste wonderful, but queasiness tends to come after the third or fourth. Carl Menger in Austria and Léon Walras in Switzerland were developing similar ideas at around the same time. Together, those two and Jevons put the study of consumption and economics on entirely new foundations. Marginalism was born, and the utility of any given good could now be measured as a mathematical function.
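To see what “measured as a mathematical function” means in practice, here is a minimal sketch in modern notation (the logarithmic form below is an assumption chosen purely for illustration, not Jevons’s, Menger’s, or Walras’s own example): write u(x) for the satisfaction gained from consuming a quantity x of some good. Diminishing marginal utility only requires that u rise at a decreasing rate, for instance

\[ u(x) = \ln(1+x), \qquad u'(x) = \frac{1}{1+x} > 0, \qquad u''(x) = -\frac{1}{(1+x)^{2}} < 0, \]

so each additional unit, the fourth slice of cake, adds less utility than the first. That shrinking increment is precisely the property the marginalists turned into a mathematical function.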

It was Alfred Marshall who built on these foundations and, in the 1890s, turned economics into a proper discipline at the University of Cambridge. Jevons, he noted, was absolutely right: The consumer was the “ultimate regulator of demand.” But he considered Jevons’s focus on wants too static. Wants, Marshall wrote, are “the rulers of life among lower animals” and human life was distinguished by “changing forms of efforts and activities”—he contended that needs and desires changed over time, and so did the attempts and means devoted to satisfying them. People, he believed, had a natural urge for self-improvement and, over time, moved from drink and idleness to physical exercise, travel, and an appreciation of the arts.

For Marshall, the history of civilization resembled a ladder on which people climbed towards higher tastes and activities. It was a very Victorian view of human nature. And it reflected a deep ambivalence towards the world of goods that he shared with such critics of mass production as the designer William Morris and the art critic John Ruskin. Marshall believed fervently in social reform and a higher standard of living for all. But at the same time, he was also deeply critical of standardized mass consumption. His hope was that people in the future would learn instead to “buy a few things made well by highly paid labour rather than many made badly by low paid labour.” In this way, the refinement of consumers’ taste would benefit highly skilled workers.

The growing attention to consumption was not limited to liberal England. In imperial Germany, national economists turned to it as an indicator of national strength: Nations with high demand were also the most energetic and powerful, it was argued. The first general account of a high-consumption society, however, came not surprisingly from the country with the highest standard of living: the United States. In 1889, Simon Patten, the chair of the Wharton School of Business, announced that the country had entered a “new order of consumption.” For the first time, there was a society that was no longer fixated on physical survival but that now enjoyed a surplus of wealth and could think about what to do with it. The central question became how Americans spent their money and their time, as well as how much they earned. People, Patten wrote, had a right to leisure. The task ahead was no longer telling people to restrain themselves—to save or to put on a hairshirt—but to develop habits for greater pleasure and welfare.

This was more than an academic viewpoint. It had radical implications for how people should consume and think about money and their future. Patten summarised the new morality of consumption for a congregation in a Philadelphia church in 1913:

I tell my students to spend all that they have and borrow more and spend that … It is no evidence of loose morality when a stenographer, earning eight or ten dollars a week, appears dressed in clothing that takes nearly all of her earnings to buy.

Quite the contrary, he said, it was “a sign of her growing moral development.” It showed her employer that she was ambitious. Patten added that a “well-dressed working girl … is the backbone of many a happy home that is prospering under the influence that she is exerting over the household.” Some members at the Unitarian Church were outraged, insisting, “The generation you’re talking to now is too deep in crime and ignorance … to heed you.” Discipline, not spending on credit, was what they needed. Whether they liked it or not, the future would be with Patten’s more liberal, generous view of consumption.

* * *

Economists were not the only ones who discovered consumption in the late 19th century. They were part of a larger movement that included states, social reformers, and consumers themselves. These were years when steamships, trade, and imperial expansion accelerated globalization and many workers in industrial societies started to benefit from cheaper and more varied food and clothing. Attention now turned to “standard of living,” a new concept that launched thousands of investigations into household budgets from Boston to Berlin and Bombay.

The central idea behind these inquiries was that the welfare and happiness of a household was determined by habits of spending, and not just earnings. A better understanding of how money was spent assisted social reformers in teaching the art of prudent budgeting. In France in the 1840s, Frédéric Le Play compiled 36 volumes on the budgets of European workers. In the next generation, his student Ernst Engel took the method to Saxony and Prussia, where he professionalized the study of social statistics. He fathered Engel’s law, which held that the greater a family’s income, the smaller the proportion of income spent on food. For those of Engel’s contemporaries who worried about revolutions and socialism, there was hope here: Less spending on food translated into more money for personal improvement and social peace.

Above all, it was citizens and subjects who discovered their voice as consumers. Today, the fin-de-siècle is remembered for its cathedrals of consumption, epitomized by the Bon Marché in Paris and Selfridges in London. While they did not invent the art of shopping, these commercial temples were important in widening the public profile and spaces for shoppers, especially for women.

Intriguingly, though, it was not in the glitzy galleries but literally underground, through the new material networks of gas and water, that people first came together collectively as consumers. A Water Consumers’ Association was launched in Sheffield in 1871 in protest against water taxes. In addition, needs and wants themselves were changing, and this expanded notions of entitlements and rights. In England, middle-class residents at this time were becoming accustomed to having a bath and refused to pay “extra” charges for their extra water. A bath was a necessity, not a luxury, they argued, so they organized a consumer boycott.

The years before the First World War turned into the golden years of consumer politics. By 1910, most working-class families, and every fourth household in England, were members of a consumer cooperative. In Germany and France, such groups counted over a million members. In Britain, the Women’s Cooperative Guild was the largest women’s movement at the time. Organizing as consumers gave women a new public voice and visibility; after all, it was the “women with the baskets”, as these working-class housewives were called, who did the shopping.

And it was women who marched in the vanguard of ethical consumerism. Consumer leagues sprang up in New York, Paris, Antwerp, Rome, and Berlin. In the United States, the league grew into a national federation with 15,000 activists, headed by Florence Kelley, whose Quaker aunt had campaigned against slave-grown goods. These middle-class consumers used the power of their purses to target sweatshops and reward businesses that offered decent working conditions and a minimum wage.

“The consumer,” a German activist explained, “is the clock which regulates the relationship between employer and employee.” If the clock was driven by “selfishness, self-interest, thoughtlessness, greed and avarice, thousands of our fellow beings have to live in misery and depression.” If, on the other hand, consumers thought about the workers behind the product, they advanced social welfare and harmony. Consumers, in other words, were asked to be citizens. For women, this new role as civic-minded consumers became a powerful weapon in the battle for the vote. This call on the “citizen-consumer” reached its apotheosis in Britain on the eve of the First World War in the popular campaigns for free trade, when millions rallied in defense of the consumer interest as the public interest.

Even before these movements took shape, many advocates had predicted a steady advance of consumer power throughout the 1900s. “The 19th century has been the century of producers,” Charles Gide, the French political economist and a champion of consumer cooperatives, told his students in 1898. “Let us hope that the 20th century will be that of consumers. May their kingdom come!”

Did Gide’s hope come true? Looking back from the early 21st century, it would be foolish not to recognize the enormous gains in consumer welfare and consumer protection that have taken place in the course of the last century, epitomized by John F. Kennedy’s Consumer Bill of Rights in 1962. Cars no longer explode on impact. Food scandals and frauds continue but are a far cry from the endemic scandals of adulteration that scarred the Victorians.

And consumers have remained a focus of academics. Economists continue to debate whether people adjust their consumption over time to get most out of life, whether they spend depending on what they expect to earn in the future, or whether their spending is determined more by how their income compares to others’. Consumption is still an integral component of college curricula, and not only in economics and business, but in sociology, anthropology, and history, too, although the last few tend to stress culture, social customs, and habits rather than choice and the utility-maximizing individual.

Today, companies and marketers follow consumers as much as direct them. Grand critiques of consumerism as stupefying, dehumanizing, or alienating—still an essential part of the intellectual furniture of the 1960s—have had their wings clipped by a recognition of how products and fashions can provide identities, pleasure, and fodder for entirely new cultural styles. Younger generations in particular have created their own subcultures, from the Mods and rockers in Western Europe in the 1960s to Gothic Lolitas in Japan more recently. Rather than being passive, the consumer is now celebrated for actively adding value and meaning to media and products.

And yet, in other respects, today’s economies are a long way from Gide’s kingdom of consumers. Consumer associations and activism continue, but they have become dispersed between so many issues that they no longer carry the punch of the social-reform campaigns of the early 20th century; today there are, for example, movements for slow food, organic food, local food, fair-trade food—even ethical dog food.

In hard times, like the First and Second World Wars, some countries introduced consumer councils and ministries, but that was because states had a temporary interest in organizing consumers’ purchasing power for the war effort and in recruiting consumers for the fight against profiteering and inflation. During peacetime, markets and vocal business lobbies returned, and such consumer bodies were just as quickly wound up again. Welfare states and social services have taken over many of the causes for which consumer leagues fought a century ago. India has a small Ministry of Consumer Affairs, but its primary role is to raise awareness and fight unfair practices. In many less developed countries, consumers continue to be a vocal political force in battles over access to and prices of water and energy. In the richest societies today, though, consumers have little or no organized political voice, and the great campaigns for direct consumer representation four or five generations ago have come to very little. Markets, choice, and competition are now seen to be the consumer’s best friend—not political representation. Consumers are simultaneously more powerful and more powerless today than Gide had foreseen.

Today, climate change makes the future role of consumption increasingly uncertain. The 1990s gave birth to the idea of sustainable consumption, a commitment championed by the United Nations in Rio de Janeiro in 1992. Price incentives and more-efficient technologies, it was hoped, would enable consumers to lighten the material footprint of their lifestyles. Since then, there have been many prophecies and headlines predicting “peak stuff” and the end of consumerism. People in affluent societies, they say, have become bored with owning lots of stuff. They prefer experiences instead or are happy sharing. Dematerialization will follow.

Such forecasts sound nice but they fail to stand up to the evidence. After all, a lot of consumption in the past was also driven by experiences, such as the delights of pleasure gardens, bazaars, and amusement parks. In the world economy today, services might be growing faster than goods, but that does not mean the number of containers is declining—far from it. And, of course, the service economy is not virtual, and requires material resources too. In France in 2014, people drove 32 billion miles to do their shopping—that involves a lot of rubber, tarmac, and gas. Digital computing and WiFi absorb a growing share of electricity. Sharing platforms like Airbnb have likely increased frequent travel and flights, not reduced them.

Moreover, people may say they feel overwhelmed or depressed by their possessions but in most cases this has not converted them to living more simply. Nor is this a peculiarly American or Anglo-Saxon problem. In 2011, the people of Stockholm bought three times more clothing and appliances than they did 20 years earlier.

How—indeed whether—consumers can adapt to a world of climate change remains the big question for the 21st century. In 1900, many reformers looked for answers to questions about social reform, social responsibility, and consumer representation. Climate change is its own monumental challenge, but there may be lessons that can be learned from that earlier history of the consumer. Consumers were identified as important players in tackling social blight and economic injustice. As buyers, they had some influence over what was produced, its quality as well as quantity. Organizing their interests added an important voice to the arena of public politics. These remain valuable insights: Consumers may not hold the answers for everything, but that does not mean they should be treated as merely individual shoppers in the market.

— source theatlantic.com By Frank Trentmann