A former employee of an international agency explains how bots and trolls are used to exploit the social network's flaws and manipulate public opinion
Social networks are where we project the image of everything we pretend to be, even if it is a lie. How others perceive us has become an obsession not only for ordinary users but also for companies, political parties and football clubs. This preoccupation with appearances fuels the business of opaque agencies that make a living creating fake accounts to amplify messages and make citizens believe something that is not true, an industry of deception that already moves billions of euros.
“Being able to take control is very tempting,” the former employee of an international agency tells EL PERIÓDICO. The firm offered as a service the creation of artificial campaigns to manipulate public opinion, a practice known as ‘astroturfing’. For years he coordinated a paid digital guerrilla of 10 people dedicated to fraudulently viralizing messages on Twitter, defending the reputation of their clients (from large multinationals to television celebrities) and harassing their enemies. “I can create an army capable of making the world see you as an exemplary citizen,” he adds. Now he recounts his experience in ‘Confessions of a Russian Bot’ (Editorial Debate).
“Imagine you are the director of a company and an investigation uncovers your involvement in a corruption case. You are done for, but suddenly a guy calls offering you a way out of that reputation crisis without having to cave in. That call would have come from my former boss, and I would have been in charge of planning, designing and executing the strategy,” says this former agent, who hides his identity because a confidentiality agreement prevents him from publicly attaching his name to the campaigns he orchestrated.
Twitter has become one of the main stages of this battle for the political, social and cultural narrative. Part of the debates, arguments and insults poured out there do not arise naturally or organically; they are coordinated and executed through bots (automated accounts, fake or not) with the intention of amplifying certain messages and thus lending them greater legitimacy. You are more likely to stop to read or share a post with 2,000 retweets than a marginal one.
In turn, the repeated propagation of messages on the same topic seeks to sneak into the most commented topics (‘trending topics’), hijack the debate and try to set the media agenda. “Getting a ‘hashtag’ to trend is very easy: you just have to analyze how many tweets the other trends are generating and calculate how many you need to launch to match them,” he points out.
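The calculation the former agent describes can be illustrated with a minimal, purely hypothetical sketch: observe the tweet volume of the topics currently trending, then size a coordinated burst to overtake the weakest of them. The function name, the safety margin and all the figures below are invented for illustration, not drawn from the book or from Twitter's actual ranking logic.

```python
# Hypothetical sketch of the volume estimate described in the article.
# Twitter's real trend ranking also weighs recency and novelty, so a
# campaign would pad its target with a safety margin. All numbers invented.

def tweets_needed(trend_volumes, margin=1.2):
    """Estimate tweets required to displace the weakest current trend.

    trend_volumes: observed tweets per hour for each current trending topic.
    margin: padding factor to account for ranking signals beyond raw volume.
    """
    weakest = min(trend_volumes)
    return int(weakest * margin)

# If the lowest-ranked trend is doing 5,000 tweets/hour, the campaign
# would aim for roughly 6,000 coordinated tweets in that hour.
print(tweets_needed([12_000, 8_500, 5_000]))  # -> 6000
```

Distributed across a few hundred bot accounts, a target of that size amounts to only a handful of tweets per account per hour, which is why the former agent calls the tactic "very easy".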
«I have spent a few years insulting you on social networks because someone paid me. Now that I am no longer on anyone's payroll, I want to tell you how I did it»
— Editorial Debate (@debatebooks) January 24, 2022
Making a message go viral can be simple and relatively inexpensive. But in the world of psychological manipulation, success goes beyond the figures: to create a “real bond” with the victim, everything has to be planned in detail, mapping what happens on the networks and identifying weak points from which to influence the conversation. “If we run an anti-vaccine account, we need to know how they speak and what worries them, so we know which content will sink deepest into their hearts,” he warns.
Harassing and taking down rivals
The improvement of Twitter’s security measures against bots, and the ease with which users detect them, has led more and more ‘astroturfing’ agencies to opt for a more aggressive strategy: trolls, fake accounts from which abuse and threats are launched against specific targets. “If I want a journalist to stop publishing content that attacks my client, harassment is essential,” he says. “If every time you talk about a topic you receive 50 negative comments, within a month you will stop logging on to the social network, or you will only use it to post photos of the puppy you have adopted.” The best way to dress up those accounts is to humanize them, that is, to invent details of their lives “as if it were a role-playing game.”
Penetrating the target’s mind is key, all the more so if the aim is to get a “quality spokesperson” such as a journalist to buy your story. That also commands a higher price, but it does not mean that those who work for these agencies are flush with cash. “You don’t earn much,” the former agent confesses. “There are a lot of people scraping by on a thousand euros a month, freelancers and people on illegal contracts.”
Like Facebook and YouTube, Twitter has been criticized because its recommendation algorithms tend to amplify the most viral content, which favors incendiary messages that, by attraction or rejection, provoke a reaction from users. For him, however, the problem is the platform’s leniency and the fact that “there is no control”. “How can you ignore the reports journalists publish about botnets and troll accounts?” he asks.
With the publication of the book, the former agent seeks to expose a business that grows in the shadows, something he also does on Twitter under the pseudonym @thebotruso. “We are barely aware of the sheer amount of data we leave on the internet and how valuable it can be in the wrong hands,” he warns. “These agencies still roam free.”