Facebook’s army of malicious bots is being trained to research anti-spam methods
Facebook has created a parallel version of its platform where these bots can roam freely, act out the kinds of situations that arise in real life, and show how its algorithms deal with them.
Facebook might have some well-meaning efforts in place, but some bad actors still manage to get through its safeguards and policies. The social media platform is now raising its guard and experimenting with a new way to strengthen its anti-spam defences and preempt bad behaviour that could breach its safeguards: an army of malicious bots.
The platform is developing a new system of bots that simulate bad behaviour and stress-test the platform to unearth flaws and loopholes. These automated bots have been trained to behave like real people, using the data Facebook has acquired from its more than two billion users.
To make sure this experiment does not interfere with real-life usage, Facebook has built a parallel version of the platform itself where these bots are allowed to run loose. In this parallel Facebook-verse, the bots can message each other, comment on posts, send friend requests, visit pages and so on. Most importantly, they can simulate extreme scenarios, such as trying to sell drugs or guns, to see whether Facebook's algorithms can prevent them.
Facebook says the new system can host thousands or even millions of bots, and since it runs on the same code the platform's users actually use, the actions these bots take faithfully reproduce the effects that would be seen in real life.
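To make the idea concrete, here is a minimal sketch of how such a simulation could look in principle: scripted "bad actor" agents post messages into an isolated sandbox that applies the same enforcement rule real users would hit. All class and function names here are hypothetical illustrations, not Facebook's actual system, and the keyword check stands in for a real policy classifier.

```python
import random

# Stand-in for a real content-policy classifier.
BANNED_TERMS = {"drugs", "guns"}

class SimulatedPlatform:
    """Isolated sandbox: records bot actions and applies the enforcement rule."""
    def __init__(self):
        self.blocked = []
        self.allowed = []

    def post_message(self, bot_id, text):
        # The same moderation logic real users would face runs here.
        if any(term in text.lower() for term in BANNED_TERMS):
            self.blocked.append((bot_id, text))
            return False
        self.allowed.append((bot_id, text))
        return True

class MaliciousBot:
    """Agent that probes the platform with a mix of benign and violating posts."""
    def __init__(self, bot_id):
        self.bot_id = bot_id

    def act(self, platform):
        text = random.choice(
            ["selling guns cheap", "buy drugs here", "hello friend"]
        )
        return platform.post_message(self.bot_id, text)

def run_simulation(num_bots=100, seed=0):
    """Let every bot act once and return the sandbox for inspection."""
    random.seed(seed)
    platform = SimulatedPlatform()
    for i in range(num_bots):
        MaliciousBot(i).act(platform)
    return platform

if __name__ == "__main__":
    sim = run_simulation()
    print(f"blocked: {len(sim.blocked)}, allowed: {len(sim.allowed)}")
```

In a real deployment, the point of running such agents against the production code rather than a mock is that any gap between the classifier and the agents' behaviour surfaces as messages landing in the "allowed" list when they should not.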
Mark Harman, the person leading the project, wrote in a blog post that the project is currently in a research-only stage, and the hope is that it will eventually help Facebook improve its services and spot integrity issues before they affect people on the platform.
Follow HT Tech for the latest tech news and reviews, and keep up with us on Twitter, Facebook, Google News, and Instagram. For our latest videos, subscribe to our YouTube channel.