Opinion

How algorithms are amplifying misinformation and driving a wedge between people

The design of social media platforms is fundamentally flawed. They are built on algorithms that look for, learn from, and reproduce patterns. Algorithms are excellent at finding content similar to your previous interests and delivering more of it to you, but they have no capacity for real discernment and cannot reliably distinguish sarcasm, innuendo, attitude, or point of view.

How social media works—or really, doesn’t work—now

The real root of the problem is that social media algorithms are designed to optimize a single objective: advertising revenue. The platforms are owned by private corporations that now operate as glorified ad agencies, exploiting data about people’s behavior and manipulating their attention to maximize profit.

Social media algorithms feed people content to increase “engagement” and keep them scrolling to see more ads. On average, emotionally provocative content that reinforces what we already believe drives more engagement than factual information does. This creates a feedback loop that traps each of us in our own filter bubble and drives a wedge between people with differing schools of thought or political beliefs. One obvious instance of this was former President Trump’s false claims that the election was stolen. No one can prove the amplification was deliberate, but it offers a prime example of how someone can cynically leverage these algorithms to spread misinformation and polarize people’s political beliefs. Indeed, authoritarian figures elected in the “populist” wave of a few years ago, from Brazil to Indonesia, reportedly employed Cambridge Analytica to manipulate opinion on Facebook. There is a very real and pressing need for social networks that don’t push people toward extreme views and away from rational discourse and debate.
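
To make that feedback loop concrete, here is a toy simulation, not any platform’s actual code: a recommender scores items by similarity to a user’s current taste, plus a bonus for provocative content, and each recommendation pulls the user’s taste toward what was shown. The catalog, weights, and drift rule are all invented for illustration.

```typescript
// Toy model of an engagement-optimized feed (all numbers invented).
interface Item {
  topic: number;       // position on a -1..1 opinion spectrum
  provocative: boolean;
}

// Score = similarity to the user's current taste, plus an engagement
// bonus for provocative items, because they keep people scrolling.
function recommend(catalog: Item[], taste: number): Item {
  const score = (it: Item) =>
    -Math.abs(it.topic - taste) + (it.provocative ? 0.75 : 0);
  return catalog.reduce((best, it) => (score(it) > score(best) ? it : best));
}

function simulate(steps: number): number {
  const catalog: Item[] = [];
  for (let k = 0; k <= 20; k++) {
    const topic = -1 + k / 10;
    catalog.push({ topic, provocative: Math.abs(topic) > 0.75 });
  }
  let taste = 0.1; // the user starts barely right of center
  for (let i = 0; i < steps; i++) {
    const shown = recommend(catalog, taste);
    taste = 0.8 * taste + 0.2 * shown.topic; // belief drifts toward the feed
  }
  return taste;
}

console.log(simulate(50).toFixed(2)); // ≈ 0.80: the feed parks the user near a pole
```

Even under these mild assumptions, a user who starts near the center ends up parked at an extreme, which is the filter-bubble dynamic described above.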

So, what are we to do? It’s time for a platform controlled by humans, not advertising algorithms

We need an online environment that reflects the way a healthy society naturally operates, rather than an algorithm designed to manipulate our attention to make money. Two of the systems society has developed to seek factual understanding are independent editorial boards in professional journalism and the peer-review process in scientific publishing. Both are far from perfect, but both are far better at delivering factual content than existing social media. Social media does represent a legitimate ideal of democratizing information, but that ideal has been hijacked and subverted by the ad-driven model. To fulfill the original aspiration, we need a system that delivers transparency, establishes responsibility based on reputation, and provides “data provenance.”

People need to be able to look at a piece of content and see where it came from and how it traveled to reach them. This is called data provenance: the ability to see and verify, in a clear and traceable way, the path the content took, so that reality can be grounded in reputable, human sources.
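
As a concrete illustration of the idea, and not a description of Tru’s actual implementation, a provenance trail can be modeled as a hash-linked chain of “hops,” each recording who passed the content along; every name and field below is hypothetical. A production system would add cryptographic signatures, but the chain structure is the same.

```typescript
import { createHash } from "crypto";

// One hop in a content's journey: who handled it, when, and a hash that
// links it to the previous hop so tampering anywhere is detectable.
interface ProvenanceHop {
  source: string;    // who published or forwarded the content
  timestamp: string; // when this hop occurred (ISO 8601)
  prevHash: string;  // hash of the previous hop ("" at the origin)
  hash: string;      // hash of this hop's own fields
}

function hashHop(source: string, timestamp: string, prevHash: string): string {
  return createHash("sha256")
    .update(`${source}|${timestamp}|${prevHash}`)
    .digest("hex");
}

// Record another hand the content passed through.
function appendHop(chain: ProvenanceHop[], source: string): ProvenanceHop[] {
  const prevHash = chain.length > 0 ? chain[chain.length - 1].hash : "";
  const timestamp = new Date().toISOString();
  const hash = hashHop(source, timestamp, prevHash);
  return [...chain, { source, timestamp, prevHash, hash }];
}

// Check that every hop matches its contents and links to its predecessor.
function verifyChain(chain: ProvenanceHop[]): boolean {
  return chain.every((hop, i) => {
    const prevHash = i === 0 ? "" : chain[i - 1].hash;
    return (
      hop.prevHash === prevHash &&
      hop.hash === hashHop(hop.source, hop.timestamp, prevHash)
    );
  });
}

// Example: a story originates at a newsroom and is shared twice.
let chain = appendHop([], "newsroom.example.org");
chain = appendHop(chain, "climate-hub");
chain = appendHop(chain, "jane@example.org");
console.log(verifyChain(chain)); // true: the path is intact and traceable
```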

Society is naturally composed of groups of people with differing and often opposing views. This has always been true, even if we feel more divided than ever before. But if the content we publish on our social channels can be disputed, refuted, or amplified by our peers, with some ability to discern where it came from, the reputation of its sources, and thus the probability that it is factually correct, we will be in a better position to begin rebuilding a healthy view of reality.

The specific solution we’re proposing with Tru Social gives the power, and the responsibility, to engage with and evaluate content to real people rather than algorithms. Tru provides groups of people with knowledge and expertise in a field a forum to share, discuss, and debate information privately, or in public view if they like. The forum works like a subreddit for each group: members can vote content up or down on its importance, relevance, or quality. A set of curators controls each hub, decides what to publish, and is responsible for the hub’s reputation. This recreates something that more closely resembles human society than individuals connected by an algorithm, because it functions like an editorial board in professional journalism or peer review in scientific publishing.
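
Here is a minimal sketch of how such a hub’s mechanics might work, assuming a simple in-memory model; the class, fields, and rules below are illustrative guesses, not Tru’s codebase.

```typescript
// Hypothetical model of a curated hub: members vote posts up or down,
// and only curators may publish under the hub's name.
interface Post {
  id: string;
  author: string;
  upvotes: number;
  downvotes: number;
  published: boolean; // true once a curator approves it
}

class Hub {
  private posts = new Map<string, Post>();

  constructor(public name: string, private curators: Set<string>) {}

  submit(id: string, author: string): void {
    this.posts.set(id, { id, author, upvotes: 0, downvotes: 0, published: false });
  }

  vote(id: string, up: boolean): void {
    const post = this.posts.get(id);
    if (!post) throw new Error(`unknown post: ${id}`);
    if (up) post.upvotes++;
    else post.downvotes++;
  }

  // The hub's reputation rides on what its curators choose to publish.
  publish(curator: string, id: string): void {
    if (!this.curators.has(curator)) throw new Error(`${curator} is not a curator`);
    const post = this.posts.get(id);
    if (post) post.published = true;
  }

  // The members' ranking, highest net score first.
  ranked(): Post[] {
    return [...this.posts.values()].sort(
      (a, b) => b.upvotes - b.downvotes - (a.upvotes - a.downvotes)
    );
  }
}

// Example: two curators run a hub; a member submits, peers vote, a curator publishes.
const hub = new Hub("climate-science", new Set(["alice", "bob"]));
hub.submit("post-1", "carol");
hub.vote("post-1", true);
hub.publish("alice", "post-1");
```

The point of the design, as with an editorial board, is that a named group of people, not an engagement algorithm, stakes its reputation on what gets published.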

Using this system, people can discern the reputation of groups. Groups also have the opportunity to subscribe to content from opposing viewpoints and examine the middle ground, opening the door to discussion and debate across the spectrum of opinion instead of filter bubbles driving us further and further apart.

Society needs secure and trustworthy platforms for individuals, groups, and professional organizations to communicate. Social media curated by humans presents an opportunity to rebuild a foundation for reality on the internet, and to help people work together more effectively to solve the real problems in our world. People, organizations, and movements focused on activism need reliable tools to share, evaluate, and elevate information and to work toward solutions together. The things that will matter in the future, like climate change and biodiversity, are up to us to deal with now.

Jim is a serial social entrepreneur with three decades of experience in tech and four in regenerative design, sustainability, and climate. He is co-founder and CEO of Tru Social Inc. and JLINC Labs, and co-founder of Planetwork. In 2000, the Planetwork conference launched a tech community initiative for a global network, before Facebook existed. Jim also funded a key social patent, resulting in shares in the LinkedIn IPO.

