Mabel works for an organization that scours the internet to remove the worst images available online.
At home, she is a loving grandmother who enjoys playing with her grandchildren, but at work Mabel must look at what she calls the "abominable" content of child sexual abuse available on the internet.
She works for one of the few organizations authorized to actively search for indecent content online to help police and technology companies remove the images.
The Internet Watch Foundation (IWF) helped take down a record number of almost 300,000 web pages last year, including images generated by artificial intelligence (AI), whose number rose almost fivefold in a year.
"The content is horrible; it should never have been created in the first place," says Mabel, a former police officer.
"You never become immune to it, because in the end all of these children are victims. It is abominable."
Mabel – not her real name, to protect her identity – is exposed to some of the most depraved and horrific images available online. She says her family is her main motivation for doing this job.
Mabel, originally from Wales, calls her work "disruption", because it prevents criminal gangs from sharing videos and images of abuse to make money.
IWF analysts work anonymously so they feel safe from those who oppose their work, such as criminal gangs.
"There aren't many jobs where you go to work in the morning and do good all day, as well as annoying some truly harmful people, so I have the best of both worlds," says Mabel.
"When I remove an image, I am physically preventing harmful people from accessing those images."
"I have children and grandchildren and I just want to make the internet a safer place for them," she adds.
"On a wider scale, we collaborate with law enforcement around the world so they can open investigations and perhaps stop gangs."
The IWF, based in Cambridge in the United Kingdom, is one of only three organizations in the world with permission to actively search for child abuse content online. Last year it helped take down 291,270 web pages, which may contain thousands of images and videos.
The Foundation also said it helped remove almost five times more AI-generated images of child sexual abuse last year than in the previous year – 245 in 2024, against 51 in 2023.
Last month, the UK government announced four new laws to combat AI-generated images.

The content is not easy to see. Tamsin McNally and her team of 30 know it well. But they also know their work helps protect children.
"We make a difference, and that's why I do it," says the team leader.
"One Monday morning I joined the reporting channel and there were more than 2,000 reports from members of the public who said they had come across this type of image. We receive hundreds of reports every day."
"I really hope everyone sees that this is a problem and that everyone does their part to stop it from happening."
"I would love for my job not to exist, but unfortunately, as long as there are online spaces, there will be a need for jobs like mine," she says.
"When I tell people what I do, they often can't believe this type of work exists. Then they ask: why would you want to do it?"

Moderators at many technology companies have taken legal action against their employers, claiming the work destroyed their mental health, but the Foundation said its duty of care was "gold standard".
The institution's analysts have compulsory monthly therapy sessions, team meetings and regular well-being support.
"Alongside these formal things there are informal ones too: we have a pool table, games, puzzles – they are big fans of puzzles – where we can take a break if needed," adds Mabel.
"All these things combined help keep us here."

The IWF has strict guidelines ensuring that personal phones are not allowed in the office and that no work, including email, is taken home.
Although she applied to work there, Manon – a fictitious name to protect her identity – was not sure it was a job she could do.
"I don't even like watching horror movies, so I wasn't sure I would be able to do the job," says Manon, who is in her twenties.
"But the support you receive is so intense and comprehensive that it is reassuring," she adds.
"Besides, you are making the internet a better place, and I don't think there are many jobs where you can do that every day."

She studied linguistics at university, which included work on online language and grooming, and this sparked her interest in the Foundation's work.
"Criminals can be described as having their own community – and, as part of it, they have their own language or code they use to hide in plain sight," explains Manon.
"Being able to apply what I learned at university to a real-world scenario, finding images of child sexual abuse and disrupting that community, is really satisfying."
Source: Terra
