‘Surveillance Capitalism’ And The Ownership Of People's Lives


Some people (myself included) have talked about aspects of modern life that seem like feudalism. The nobility received land from the ruling monarch in return for a promise of military aid. Ordinary people, whether free peasants or serfs bound to service, lived on that land with the nobility as their landlords. The peasants and serfs had to turn over significant portions of the food they grew to the landlord and, depending on their exact status, might not have had the right to do anything significant in their own lives without permission.

Those terms came to mind when listening to an interview Harry Shearer did with author and Harvard Business School professor emerita Shoshana Zuboff, who discussed what she calls surveillance capitalism.

Put too simply, Zuboff says that technology-based business (and, with virtually every company now embedding tech and emulating the same practices, that means most business) has found that people are the ultimate product. Instead of simply selling products, more and more companies want to join Google, Facebook, and their ilk in collecting the details of everything in people's lives to create massive predictive engines that know what they will do and how they will feel and think, all to direct and control individuals along paths that benefit the companies and the other parties they sell data to.

The data collection happens everywhere: in your car, everywhere you go with your phone, on city streets where companies use real-time facial recognition to see who is walking around. It's about what you eat, look at, read, and watch. How you move, respond, react, and dress. The concept is that all the tiny choices made and actions taken, once collected, become massively invasive and effective indicators.

The true power of an indicator, however, is not just as a sign of a person's current state of mind and action. It allows business entities to follow people and then nudge them into behavior that is more commercially rewarding for the companies. It's perhaps the ultimate form of feudalism: instead of having your own existence on the planet, you find the contents of your life taken, without your explicit agreement, and turned into products that ultimately control your behavior to one degree or another.

This isn't some paranoid theory. In 2010, as an experiment, Facebook manipulated people into voting at somewhat higher rates than they otherwise would have. Then, in a study published in 2014, the company showed it could deliberately influence people's emotional states.

Companies spend enormous amounts of money to follow people online across platforms and gather all the information they can to improve the chance of selling products. There are many companies in the business of providing those real-time systems that identify people in public spaces. This is by no means an intellectual exercise.

And the more these massive software systems work to influence people in automated ways, the greater the chance that people are affected differently depending on race, religion, class, gender, educational attainment, economic status, and more.

There are also implications on a more personal and targeted level. Businesses regularly charge those without resources more for products, services, and fees. The reason? Those with less wealth have fewer options. They become easy pickings for companies that want to rack up profits.

Samuel Lengen, a research associate at the School of Data Science at the University of Virginia, has noted the effects of big data and social networks on less powerful groups:

"The truth is that datafication, with all its privacy implications, does not affect everyone equally.

Big data's hidden biases and networked discrimination continue to reproduce inequalities around gender, race and class. Women, minorities and the financially poor are most strongly affected. UCLA professor Safiya Umoja Noble, for example, has shown how Google search rankings reinforce negative stereotypes about women of color."

Or one could point to how artificial intelligence healthcare systems can effectively discriminate against groups like African Americans. Not because anyone intended it, but because that outcome is all too easy to produce when consequential decisions are driven by data.
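To make that concrete, here is a minimal, hypothetical sketch (in Python, with entirely made-up numbers) of one way it happens: a model built to predict healthcare costs rather than health needs will quietly under-serve any group whose barriers to care keep its spending low, even though no one wrote a discriminatory rule.

```python
import numpy as np

# Hypothetical illustration only: two groups, A and B, with identical
# distributions of real health need, but group B historically spends
# less on care for the same need (for example, due to barriers to access).
rng = np.random.default_rng(0)
n = 10_000

group = rng.choice(["A", "B"], size=n)            # group membership
need = rng.normal(loc=5.0, scale=1.0, size=n)     # true health need, same for both groups

# Observed spending tracks need, but runs about 20% lower for group B.
access_factor = np.where(group == "B", 0.8, 1.0)
cost = need * access_factor + rng.normal(scale=0.5, size=n)

# A "risk score" that uses past cost as a stand-in for need, enrolling
# the top 10% of predicted-cost patients in an extra-care program.
threshold = np.quantile(cost, 0.90)
enrolled = cost >= threshold

for g in ("A", "B"):
    mask = group == g
    print(f"group {g}: mean need {need[mask].mean():.2f}, "
          f"enrolled in extra care {enrolled[mask].mean():.1%}")

# Despite equal need, group B is enrolled at a far lower rate.
# The disparity comes from the proxy, not from anyone's intent.
```

Nothing in the sketch's decision rule ever mentions race or any other group attribute; the disparity comes entirely from the choice of proxy, which is exactly why effects like this are so easy to miss.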

There is no appeal. People don't even realize what is happening. All they know is that, somehow, life treats them a certain way. Because the end of privacy isn't just others knowing what you do. It's others owning virtually every movement and utterance and then directing your life for their profit.