In Flashmob, John Smith, an ex-military badass who can read minds, tracks down the shadowy operator of a nasty website called Downvote, which crowd-sources violence and murder against people who have displeased its users, who then vote on the violence.
Sounds like an extreme version of 4chan, incel derangement, QAnon dementia, and white supremacist websites, doesn’t it? In addition to being a cracking good read, Flashmob poses an important if unsettling question: What if violence can be created and fomented by software? Meaning, people don’t just somehow stumble onto a murderous site like Downvote. Rather, they are deliberately led to it, then manipulated into rage and violence once there.
Well, that’s just crazy talk, isn’t it? Maybe not. The only difference between targeting middle-aged soccer moms in Philadelphia to buy Diet Coke using Pandora ads and riling up the already unstable to commit mayhem and murder is morality and ethics. The same technology can do both.
Facebook has experimented with manipulating news feeds to produce what it calls “emotional contagion”: shaping users’ emotions so they feel good or bad based on what Facebook shows them in their feeds. Facebook and others also have the ability to deliver micro-targeted ads based on what they know about you. We saw this happen in the 2016 election, when openly racist Facebook ads appeared only in targeted people’s news feeds.
Two years ago, most people thought the idea that Russia interfered in a U.S. presidential election using technology and the net was crazy conspiracy theory. Today, even Republican senators acknowledge it happened.
There have been too many lone killers radicalized on the internet for it all to be random. Maybe they’re being guided to commit violence. It wouldn’t be that hard to do.
And do read Flashmob. Farnsworth is a gifted writer.