Artificial Intelligence (AI) software is set to be implemented in Australia to stop top-flight footballers from being abused online.
Online abuse directed at players has become one of football's biggest problems in recent years. A teenager was jailed for six weeks last month for sending a racist tweet to Manchester United forward Marcus Rashford after England lost the Euro 2020 final.
Countless Premier League footballers have highlighted the sickening abuse they receive online, and the Australian A-Leagues - both the men's and women's competitions - have taken action by announcing the use of AI technology to filter harmful comments before they are seen.
The automated machine-learning technology was developed by a British company named GoBubble and will monitor the social media accounts of players to block harmful messages, images and emojis. If successful, the software could be implemented elsewhere.
The news comes after Josh Cavallo - the only openly gay man currently playing professional top-flight football - received homophobic abuse in January. The 22-year-old plays for A-League outfit Adelaide United and has represented Australia at U20 level.
"I'm not going to pretend that I didn't see or hear the homophobic abuse at the game last night," Cavallo wrote on Instagram. "There are no words to tell you how disappointed I was. As a society this shows we still face these problems in 2022.
"This shouldn't be acceptable and we need to do more to hold this [sic] people accountable. Hate never will win. I will never apologise for living my truth and most recently who I am outside of football.
"To Instagram, I don't want any child or adult to have to receive the hateful and hurtful messages that I've received. I knew truly being who I am that I was going to come across this. It's a sad reality that your platforms are not doing enough to stop these messages."
Last year, a report from the Professional Footballers' Association (PFA) found that 44 per cent of Premier League footballers had received discriminatory abuse on Twitter and that 50 per cent of abusive tweets came from UK-based accounts.
The study also revealed that 20 per cent of all detected abuse on Twitter was directed at just four players, while 33 per cent of all abusive tweets contained homophobic messages. Only 10 accounts, however, met the threshold for criminal prosecution.
"Taking comments down is easy," said PFA chief executive Maheta Molango. "It's not about taking down comments, it is about holding the people behind those accounts accountable. This report shows that, if we want, there are ways to actually identify people and hold them accountable."
Birmingham City striker and PFA players' board representative Troy Deeney added: "Social media companies are huge businesses with the best tech people. If they wanted to find solutions to online abuse, they could. This report shows they are choosing not to. When is enough, enough? More must be done to hold these people accountable."