‘Flasher Detection’ Algorithm Aims to Clean Up Video Chat

Computer scientists have developed software that spots flashers in the act on video chat sites.

One of the more extraordinary trends in internet use has been the rapid rise of video chat services such as Chatroulette. These services randomly link the webcams of people who visit the site.

But Chatroulette has a problem. The site is dominated by flashers who expose their genitals.

Some 6.3 million visitors used Chatroulette in July 2010, perhaps because of the sexual nature of its content.

But this poses a significant threat to minors. There is no easy way to police the age of people who visit websites, and minors can gain access easily. According to Xinyu Xing at the University of Colorado at Boulder and a few pals, a significant number of Chatroulette users appear to be minors.

“Our observations on a typical Saturday night indicate that as many as 20-30% of Chatroulette users are minors,” they say in a paper published today on the arXiv.

Xing and co have a solution, however. This team has developed a “flasher detection” algorithm that spots the offenders, allowing them to be kicked out.

It turns out that catching flashers is harder than it might seem at first. One way is to employ a crowdsourcing mechanism in which users report offenders, whose video feeds can then be evaluated by trained individuals and stopped if necessary.

But that’s a time-consuming and expensive task that is open to abuse. And with upwards of 20,000 users at any time, it’s unlikely to work for Chatroulette in the long run.

Another approach is to use existing algorithms designed to detect pornographic content. Exactly how these algorithms work isn’t entirely clear, but they appear to look for skin content in images.
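
To see why that approach appeals, here is a minimal sketch, in Python with OpenCV, of the kind of skin-colour test such a filter might run. The YCrCb threshold values are illustrative assumptions, not figures from any commercial product:

```python
import cv2
import numpy as np

# Illustrative skin-colour bounds in the YCrCb colour space.
# The thresholds used by real commercial filters are not public.
SKIN_LOWER = np.array([0, 133, 77], dtype=np.uint8)
SKIN_UPPER = np.array([255, 173, 127], dtype=np.uint8)

def skin_fraction(frame_bgr):
    """Return the fraction of pixels whose colour falls in the skin range."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LOWER, SKIN_UPPER)
    return cv2.countNonZero(mask) / mask.size

# A naive filter might flag any frame where, say, more than 30% of the
# pixels look like skin -- which is exactly what a beige wall defeats.
```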

Unfortunately, this type of software does not work well with video chat content, say Xing and co. That’s because the video images are often poorly lit, making it hard to distinguish skin from yellowy-white walls in the background, for example.

So Xing and co have developed their own algorithm, called SafeVchat, which they’ve tested on some 20,000 stills taken from Chatroulette videos and supplied by the service’s founder, Andrey Ternovskiy. Their paper gives a detailed insight into how it works.

The new approach is interesting because it analyses the images using several different criteria and then fuses the results before deciding whether the image is acceptable or not.

To get over the problem of skin-coloured walls and furniture, they combine skin detection with motion detection that compares sequential frames to see whether the “skin” is moving. And they use face, eye and nose detectors to distinguish facial from non-facial skin. The results are fused and passed to a classifier, trained on the initial dataset, which labels the image as normal or offensive.
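
The paper’s code isn’t published in the article, but a rough sketch conveys the fusion idea. The Python/OpenCV snippet below is an assumption-laden illustration, not SafeVchat itself: a stock Haar-cascade face detector stands in for the paper’s face, eye and nose detectors, the thresholds are invented, and the final trained classifier is omitted:

```python
import cv2
import numpy as np

# OpenCV's bundled Haar cascade stands in for the paper's detectors.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

SKIN_LOWER = np.array([0, 133, 77], dtype=np.uint8)
SKIN_UPPER = np.array([255, 173, 127], dtype=np.uint8)

def moving_nonfacial_skin(prev_bgr, curr_bgr, motion_thresh=25):
    """Fraction of the frame that is skin-coloured, moving and not a face."""
    # Skin mask, as in the naive filter above.
    ycrcb = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2YCrCb)
    skin = cv2.inRange(ycrcb, SKIN_LOWER, SKIN_UPPER)

    # Motion mask: pixels whose grey level changed between sequential frames.
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY)
    _, motion = cv2.threshold(cv2.absdiff(curr_gray, prev_gray),
                              motion_thresh, 255, cv2.THRESH_BINARY)

    # Fuse: keep only skin that is moving. A static wall fails this test.
    moving_skin = cv2.bitwise_and(skin, motion)

    # Zero out skin that a detected face accounts for.
    for (x, y, w, h) in face_cascade.detectMultiScale(curr_gray, 1.1, 4):
        moving_skin[y:y + h, x:x + w] = 0

    return cv2.countNonZero(moving_skin) / moving_skin.size
```

A frame scoring high on moving, non-facial skin would then go to the trained classifier for the final normal-or-offensive call.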

Xing and co say it works well, and significantly better than a commercial pornographic image detector programme called PicBlock. They have a video showing how it works on their website.

In fact, SafeVchat works so well that Chatroulette began using it on its website earlier this month.

Whether this makes Chatroulette more or less popular, we’ll have to wait and see. But with any luck, it will make the site safer for all concerned.

Ref: arxiv.org/abs/1101.3124: SafeVchat: Detecting Obscene Content and Misbehaving Users in Online Video Chat Services
