Whenever it seems like the challenges of protecting my employer from risks to information security or business continuity are towering above me, I stop and think of the people whose job it is to think about the risks that technology (specifically social media these days) poses to a rational, democratic society. In comparison, my job suddenly seems much, much easier.
Back in the 90s, when we were all exploring the amazing possibilities offered by that newfangled Internet dealie, the future seemed bright. Giving everyone a voice and a platform to promote their ideas seemed like a great idea – no longer would one have to be a media mogul to become a publisher. The barriers to entry to the marketplace of ideas were crumbling like the Berlin Wall. Good news all round!
Well, we all know how THAT turned out.
Today, we are awash in a toxic cesspool of misinformation, disinformation, conspiracy fever dreams, and hate speech brought to us by amoral tech giants. The same laws which enabled (small, fragile) social platforms to grow and offer a voice to everyone now provide those (now huge, monopolistic) platforms a shield as they preside over and profit from the death of rationality and truth.
Something’s got to change.
First, the platforms themselves need to take a consistent stand against those who use them to spread misinformation, disinformation, and hate. This is not about silencing those who have unpopular opinions; it is about acting against those who weaponize demonstrable lies. The First Amendment does not apply to private platforms and no one is legally obligated to give demagogues and hatemongers a global megaphone.
Second, consumers need to stop using platforms that enable those who wield social media as a weapon against society as a whole. Our eyeballs are the oxygen these platforms need to survive.
Third, we need to change the laws that shield platform owners from responsibility for the content they carry. Today’s social media platforms have matured and can continue to operate profitably even if they must invest more in moderation. They should be held financially and legally responsible for carrying content that incites criminal violence or that is defamatory.
There’s a lot of great content out there on social media – I look to Twitter, LinkedIn, and the blogosphere every day to learn about infosec and to gain insight I would never have found otherwise. But I sometimes wonder whether the cost imposed on us by those who misuse social media outweighs the good.
If we don’t want to proceed further down the road to a tech-enabled “endarkenment,” I hope there are people far smarter than I working on ways to protect society from “malinformation.”