By Andrew Warshaw
July 6 – So-called fans who hide behind anonymous social media accounts to fire abuse at black and other BAME players face being called out by revolutionary technology that strips away their anonymity.
The initiative, named Threat Matrix and seen by Insideworldfootball, is being hailed by anti-racism groups as a potential game-changer in the global fight to stamp out online discrimination.
Developed by Signify, a London-based data science company, the system could be rolled out in the English Premier League by the start of next season while discussions are also taking place with FIFA about its possible introduction on the international stage.
The Black Lives Matter movement has led to footballers across the world taking the knee before matches, highlighting the scourge of racist abuse – much of it posted anonymously on platforms like Twitter and Instagram.
Now, as a result of Signify’s forensic capability to trace where abuse is coming from, thousands of culprits could be named and shamed for discriminatory insults, threats and images that have caused pain and hurt to so many players and their families, often for years on end.
Signify uses a mixture of Artificial Intelligence and human expertise to process huge amounts of publicly available data and uncover the real identities of online abusers, crucially without breaking any privacy laws.
Explaining why the company, whose technology has already been used to fight anti-Semitism in the British Labour party, was moving into football, Jonathan Hirshler, who co-founded Signify with former journalist and social media strategist Jonathan Sebire, told Insideworldfootball: “We are both football fans and realised that one reason players sometimes under-perform is because of their fears over threats via social media. It’s a widespread problem.”
Signify is offering its services to clubs initially at a cost of around £5,000 a month and interest is growing fast.
“A number of Premier League clubs have shown huge interest and we’re looking to get something up and running for next season,” Hirshler said.
“We also have a multi-lingual capability and are in talks with FIFA over running a pilot study to cover international football, looking at the levels of social media abuse that is being hurled at different teams and players, and picking out the evidence.”
Sebire, the brains behind the enterprise, which has already carried out live trials this season with one London-based Premier League club, says too much emphasis has been placed in the past on simply reacting to racist abuse.
“What we are doing is pro-active, focussing on people we can prove have real-world relationships with clubs, some of them season ticket holders, and de-anonymising them,” he said.
“The model can identify the type of abuse, provide evidence of where it is coming from and, sometimes by deep-dive investigation, who it is coming from. We can capture it before it’s reported.”
“In the long term, if it becomes known we have the technology, we hope we can help lower the amount of online abuse. But it can only happen if the clubs buy into it. The world has changed. It’s no longer tenable for clubs and leagues to wait until something happens and then deal with it.”
Sanjay Bhandari, head of the anti-discrimination body Kick It Out, hailed Signify’s forensically sound model as a huge breakthrough in football’s fight against racism and homophobia and is backing a pilot programme for up to eight top-flight English clubs next season.
“Online abuse is growing massively; it has been on the rise significantly over the last few years,” said Bhandari.
“The reality is there are just not enough systems in place to manage it and while there is no one silver bullet, what Signify is doing is a hugely important part of the armoury.
“Why does online abuse proliferate? Because of anonymity and the feeling of being able to act with impunity. It’s like the Wild West on social media. This could be a game-changer and I would seriously advise clubs to buy into it.”
Contact the writer of this story at firstname.lastname@example.org