There is an abundance of discussion about fears of AI going rogue, turning on humanity because of some oversight in its programming. The more treacherous reality is that there is no need for it to go rogue. In truth, the self-aware Artificial Intelligence of these apocalyptic scenarios is hardly needed at all – though it is surely an asset to bad actors.
Twitter users might remember a little notice that appeared if one’s Tweet was too strongly worded – “Want to review this before Tweeting? We’re asking people to review replies with potentially harmful or offensive language.” This might seem like a simple attempt to reduce the toxicity of a rather poisonous environment, but the implication is that Twitter has an algorithm that reads your tweet, scores its offensiveness, and prompts you to change your behavior. This machinery can be expanded to all manner of speech. YouTube’s own algorithms issue copyright strikes and demonetization based on their own set of programmed criteria. A recently implemented rule will demonetize creators for profanity. Discussion of controversial topics and distribution of ‘misinformation’ have already been frequent grounds for automated demonetization – though whose pocket does that dirty money go into, once the creator is deprived of their pay-cheque for their controversial habits?
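To see just how mundane such a gate is, here is a minimal sketch in Python. The keyword scorer is a crude stand-in for whatever model Twitter actually runs; the lexicon, function names, and threshold are all invented for illustration.

```python
# A toy pre-publish "offensiveness" gate. The keyword scorer is a crude
# stand-in for a real classifier; the lexicon and threshold are invented.

FLAGGED_TERMS = {"idiot", "moron", "trash"}  # hypothetical lexicon

def offensiveness_score(text: str) -> float:
    """Fraction of words that appear in the flagged lexicon."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    if not words:
        return 0.0
    return sum(w in FLAGGED_TERMS for w in words) / len(words)

def pre_publish_check(draft: str, threshold: float = 0.1) -> str:
    if offensiveness_score(draft) > threshold:
        return "Want to review this before Tweeting?"
    return "ok to publish"

print(pre_publish_check("You absolute idiot, this take is trash."))
```

Swap the keyword set for a trained model and the warning string for a harder consequence, and the control surface is already in place.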
These are overt efforts to control our behavior. Twitter’s was a simple warning, but YouTube’s has financial teeth. For creators on YouTube who rely on that income for their livelihood, do you truly think they were not aware of the algorithmic criteria and their incentives, and perhaps cautious about which topics they discussed? Twitter and YouTube, at least, do not have police officers, soldiers, and guns – but indulge me. Imagine that instead of Twitter sending you that warning on a Tweet, your text to a friend was held up by an automated notice from your local police department. Perhaps a three-strike rule, and then your case may be escalated? Watch your language.
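That imagined three-strike rule is, mechanically, almost nothing. A sketch in which every name and the escalation step are hypothetical:

```python
# Hypothetical three-strike escalation on flagged messages.
from collections import Counter

strikes: Counter = Counter()

def on_flagged_message(sender: str) -> str:
    strikes[sender] += 1
    if strikes[sender] >= 3:
        return "case escalated for review"
    return f"warning {strikes[sender]} of 3: watch your language"

for _ in range(3):
    print(on_flagged_message("alice"))
```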
Open your phone and dabble with its speech-to-text functionality – it might not be perfect, but you will notice it’s quite good. Now imagine that everything you ever said with your phone nearby was saved to a database. Even just to perform the task of displaying your words back to you, something like this was likely the case – though we can hope, at least, that this information was not retained indefinitely, or collected when the program was not active. I worry that’s a naive hope. Just your name and the text, perhaps arranged by date, collected and stored. Maybe to market something to you, or to improve the product. Something innocuous. But this content, text just like your Tweets, can be run through the very same algorithm. Perhaps it contained harmful or offensive language, and that won’t do.
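A hedged sketch of that pipeline, with the records and the flagging rule invented; note that nothing about it cares whether the text came from a tweet or from your kitchen.

```python
# Invented dictation transcripts run through the same kind of gate as tweets.
transcripts = [  # (name, date, what the microphone heard)
    ("alice", "2023-01-04", "remind me to call mom"),
    ("bob",   "2023-01-05", "the new mandate is tyrannical garbage"),
]

def offensive(text: str) -> bool:
    # Hypothetical lexicon; note it need not be limited to profanity.
    return any(term in text.lower() for term in {"tyrannical", "garbage"})

flagged = [(name, date) for name, date, text in transcripts if offensive(text)]
print(flagged)  # [('bob', '2023-01-05')]
```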
The East German Stasi would salivate at the very idea of these algorithms, able to do the surveillance work of thousands of men nearly instantaneously, able to generate databases and produce lists of potential troublemakers from a brief query. Let’s not forget that the Stasi were not some force from another era – their heyday was well within living memory. And consider what a surveillance state like China would make of these technologies. Of course, the ones it deploys are likely far more sophisticated.
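Mechanically, the “brief query” is no exaggeration. A sketch using Python’s built-in sqlite3, where the schema, contents, and scores are all invented:

```python
import sqlite3

# An invented message store with a precomputed "offensiveness" score.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE messages (name TEXT, date TEXT, text TEXT, score REAL)")
db.executemany("INSERT INTO messages VALUES (?, ?, ?, ?)", [
    ("alice", "2023-01-04", "remind me to call mom", 0.0),
    ("bob",   "2023-01-05", "we should protest the new mandate", 0.7),
])

# One line of SQL replaces thousands of man-hours of reading other people's mail.
troublemakers = db.execute(
    "SELECT DISTINCT name FROM messages WHERE score > 0.5"
).fetchall()
print(troublemakers)  # [('bob',)]
```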
Most of us are well aware of the intrusiveness of what some call surveillance capitalism, and we usually take at least a breath of relief that most of these companies are supposedly just using our data to market to us – but that is so clearly, demonstrably, not the whole story. YouTube controls speech. Twitter controls speech. Do we really think governments don’t have access to these tools too? Do we really suppose that, from now until eternity, they will never have the will to use them?
In January and February 2022, a large protest gathered outside Canada’s Parliament, opposing COVID-19 mandates set out by the federal government, such as vaccine requirements for truckers crossing the border between the United States and Canada. This noisy and inconvenient movement was ultimately cleared from Parliament Hill by riot police – not an unconventional tactic for dispersing protesters in many parts of the world. Canada’s Prime Minister, Justin Trudeau, used the Emergencies Act to do something else, however – a less violent but no less coercive form of persuasion. He had the bank accounts of 76 protesters frozen. The government also pressured – and legally threatened – donation services such as GoFundMe and GiveSendGo into freezing the movement’s finances. Think back to the YouTube example, where certain forms of speech are grounds for financial penalties and the power of the purse is used to control what users produce, and you can see the troubling potential of these tactics in the hands of government.
Consider recent legislation passed by the American Congress that will require all new cars to have a “kill switch” – ostensibly to prevent drunk driving, or perhaps to help prevent high-speed chases. Some vehicles will soon carry an algorithm that monitors for abnormal driving behavior and infers whether you’re misbehaving behind the wheel. I will not be surprised to see controls against speeding as well. Of course, the privilege of driving was already a matter of state approval – licensed – but as technologies to surveil and automatically punish grow more sophisticated, your vehicle becomes one more thing to lock you out of if you happen not to cooperate. Let’s also take a moment to recall that dissident journalists have already been murdered by the state via misuse of vehicular kill switches. Kill switch indeed.
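What might such an inference look like? A minimal sketch under invented assumptions: the telemetry signals, the thresholds, and the consequence are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Telemetry:  # hypothetical signals a modern vehicle already collects
    speed_kmh: float
    speed_limit_kmh: float
    lane_departures_per_min: float

def abnormal(t: Telemetry) -> bool:
    # Invented heuristics: grossly over the limit, or weaving between lanes.
    return (t.speed_kmh > 1.5 * t.speed_limit_kmh
            or t.lane_departures_per_min > 4)

def control_loop(t: Telemetry) -> str:
    if abnormal(t):
        return "throttle restricted; incident logged"
    return "normal operation"

print(control_loop(Telemetry(165.0, 100.0, 0.2)))
```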
Think about this every time your government seizes more power to bully or ‘protect’ its citizenry. Some posit that authoritarian regimes tend to collapse when, at a certain point, their coercive apparatus loses the will to continue its oppressive tactics. The collapse of the Soviet Union happened in no small part because the people in the roles of soldiers and police refused to fire on protesters; when that happens, the power of the state to crush opposition crumbles, sometimes remarkably quickly. But what about when these regimes of oppression are not carried out by people? What happens when the machinery of authoritarianism is almost completely automated, largely abstract, out of sight and out of mind of its operators? Stalin planned much of the collectivization that resulted in the Ukrainian famine from the comfort of a countryside dacha, and the main inconvenience it caused him was the strain on the lieutenants he sent to carry out the orders. What if the control of dissent can be as simple as a programmed ‘for loop’?
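To make the closing question concrete, here is one, with every name hypothetical and freeze_account a stand-in for a real banking API:

```python
def freeze_account(name: str) -> None:
    print(f"account frozen: {name}")  # stand-in for a real banking API call

# The output of a query like the one sketched earlier.
dissidents = ["alice", "bob"]

# The 'for loop' in question. No soldiers required.
for name in dissidents:
    freeze_account(name)
```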