We’ve moved into a post-capital world, where attention is the scarcest resource - and therefore the most valuable one. Albert Wenger wrote a book on this topic, The World After Capital, in which he writes:

Technological progress has shifted scarcity for humanity. When we were foragers, food was scarce. During the agrarian age, it was land. Following the industrial revolution, capital became scarce. With digital technologies scarcity is shifting once more. We need to figure out how to live in a World After Capital in which the only scarcity is our attention.

A new book by Shoshana Zuboff called The Age of Surveillance Capitalism discusses how technology powerhouses have created a new, mutant form of capitalism that uses our data as a raw material.

The Guardian provides an in-depth commentary on the book and the concept of surveillance capitalism:

The headline story is that it’s not so much about the nature of digital technology as about a new mutant form of capitalism that has found a way to use tech for its purposes. The name Zuboff has given to the new variant is “surveillance capitalism”. It works by providing free services that billions of people cheerfully use, enabling the providers of those services to monitor the behaviour of those users in astonishing detail – often without their explicit consent.

She also writes about how Internet services use only a fraction of the data they collect to actually improve the service itself. The rest is often fed into machine learning models and sold as predictive analytics software or as part of a data product. In essence, data is treated as a valuable raw material - one that companies can create and manipulate however they choose, with little agency or value transfer to customers.

“Surveillance capitalism,” she writes, “unilaterally claims human experience as free raw material for translation into behavioural data. Although some of these data are applied to service improvement, the rest are declared as a proprietary behavioural surplus, fed into advanced manufacturing processes known as ‘machine intelligence’, and fabricated into prediction products that anticipate what you will do now, soon, and later. Finally, these prediction products are traded in a new kind of marketplace that I call behavioural futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are willing to lay bets on our future behaviour.”
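
To make the mechanics concrete, here is a deliberately toy sketch (in Python, with made-up event names and an invented scoring rule - none of it drawn from any real system) of the pattern Zuboff describes: a small slice of the collected data is enough to run the service, while the surplus is turned into features for a prediction about what the user will do next.

```python
# A hypothetical sketch of a "behavioural surplus" pipeline. Only a small part of
# the collected data is needed to run the service; the rest becomes input to a
# prediction product. All names and the scoring rule are illustrative.

from collections import Counter

# Raw behavioural events captured while someone uses a free service.
events = [
    {"user": "u1", "action": "search", "query": "running shoes"},
    {"user": "u1", "action": "view",   "item": "shoe-42"},
    {"user": "u1", "action": "view",   "item": "shoe-17"},
    {"user": "u1", "action": "share",  "item": "shoe-42"},
]

# The fraction actually needed to provide the service (e.g. answering the search).
service_data = [e for e in events if e["action"] == "search"]

# Everything else is "surplus": turned into features describing the user...
surplus = [e for e in events if e["action"] != "search"]
features = Counter(e["action"] for e in surplus)

# ...and fed into a (here, trivially simple) model that outputs a prediction
# about future behaviour - the thing that is actually sold.
purchase_propensity = min(1.0, 0.2 * features["view"] + 0.4 * features["share"])

print(f"events collected: {len(events)}, used for the service: {len(service_data)}")
print(f"predicted purchase propensity: {purchase_propensity:.2f}")
```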

She also writes about the threat to democracy posed by surveillance capitalists:

These dangerous asymmetries are institutionalised in their monopolies of data science, their dominance of machine intelligence, which is surveillance capitalism’s “means of production”, their ecosystems of suppliers and customers, their lucrative prediction markets, their ability to shape the behaviour of individuals and populations, their ownership and control of our channels for social participation, and their vast capital reserves. We enter the 21st century marked by this stark inequality in the division of learning: they know more about us than we know about ourselves or than we know about them. These new forms of social inequality are inherently antidemocratic.

As someone who has spent many years of my career as a “surveillance capitalist” of sorts, I think it’s really important that we ask ourselves some of these questions. This level of scrutiny - about what data to collect and how to use it - helps us become better as an industry.