Facebook is now warning members not to click dangerous links as part of a suite of new security steps announced today. The company is using WOT, or Web of Trust, whose community-based rating tools will fuel the pop-up warnings. If WOT's users have deemed a site untrustworthy, or have flagged it as carrying malware, that collective judgment will now translate into an automatic warning message on Facebook.
The move helps Facebook. To the extent the warnings give users pause, it could mean they don't click away from Facebook. But malware pushers love to figure out ways to make their links spread on Facebook or any other large social platform, because people are prone to clicking links that appear to come from friends. Facebook has told me that attacks typically affect "less than a fraction of one percent" of members. But with 500 million active accounts, that can still mean a lot of people. If malware sites have been noticed by WOT, then the warnings will help, though they won't necessarily result in those sites being taken down.
On the other hand, could this change the reliability of WOT itself? Anyone who wants to spread malware or spam on Facebook will now also want to fly under WOT's radar. It's plausible that scam artists will either use links that haven't yet been noticed by WOT, or will gin up a community on WOT to furnish rave reviews.
What's clear enough is that the idea of community-based website ratings may now be facing its ultimate test. WOT's free service has more than 20 million users (anyone can install the WOT tool to see its warnings on search results or in their browser). Now it may have 500 million of them.