Retailers face an evolving landscape of fraud tactics each day. It’s why companies are increasingly turning to AI to try to catch threat patterns never seen before, and block attacks before they ever happen. While this approach lends itself to efficiency, it’s also one that relies on increasingly complex data profiles of consumers. In this episode, we peer into the world of retail fraud detection.
- David Cost, VP of eCommerce and marketing at Rainbow Apparel
- Will Douglas Heaven, senior editor for AI at MIT Technology Review
- Rajesh Ramanand, co-founder & CEO at Signifyd
- Gaudet, A., Pullapilly, G. (2021). Queenpins. AGC Studios.
- Couple Allegedly Milks $31 Million From Fake Coupons, via YouTube
This episode was reported by Jennifer Strong and produced by Anthony Green and Emma Cillekens. It was edited by Mat Honan and contains original music from Garret Lang and Jacob Gorski. Our mix engineer is Garret Lang and our artwork is made by Stephanie Arnett.
Connie: Actually, how much would you pay me for this Cottonelle coupon? Would you pay maybe half of what it's worth?
JoJo: 20 bucks? Yeah. Why not?
Connie: So you'd give me 20 bucks for this coupon that I got for free? That's quite a profit. Don't you think?
Jennifer: That’s a scene from the movie Queenpins. It’s about two women who delve into the world of counterfeit couponing… inspired partly by a real case of fraud here in the U.S.
NewsNation Anchor: The FBI has busted a couple in what they say is one of the largest fraud operations ever discovered.
Jennifer: This couple spent years reaching out to people on Facebook who were really into coupons… inviting them to chat privately on the encrypted messaging app Telegram… Where they sold counterfeit coupons in exchange for digital payments.
NewsNation Anchor: Prosecutors say a Virginia Beach couple had at least $1 million in coupons in “every crevice of the house.” Retailers lost $31.8 million because of this scam.
Jennifer: The coupon problem is just one example of the kind of fraud retailers face.
It’s why companies are increasingly turning to AI… trying to catch threat patterns that have never been seen before… and block attacks before they happen.
Jelther Goncalves: It works like this. We collect the information about the purchase and the behavior of the client.
Jennifer: This is a YouTube demo of AI-powered software from the fraud detection company ClearSale.
Jelther Goncalves: We run machine learning models, business rules and historical data from different industries to check if the purchase is fraudulent or not. All this runs in a matter of seconds.
Jennifer: It’s an approach that lends itself to efficiency… it’s also one that relies on increasingly complex data profiles of consumers.
I’m Jennifer Strong and this episode we peer into the world of retail fraud detection.
David Cost: This is kind of where it all starts.
David Cost: This is new merchandise coming into the building. It's gonna get processed through these lines, where everything has to be individually taken out. It has to be individually bagged. It's gotta get tagged.
Jennifer: We’re inside a retail distribution center… in Philadelphia.
David Cost: My name is David Cost. I run eCommerce and marketing for Rainbow shops... So why don’t we go this way...
Jennifer: Rainbow sells clothing, shoes and accessories for women and kids.
David Cost: We operate about 1,100 brick and mortar stores in the US, Puerto Rico, U.S. Virgin Islands, as well as this e-commerce operation where we are today. So these are, you know, what happens as merchandise gets received. Each piece by color, style and size gets put into a location on one of these shelves. And then as customers place orders, we have technology that tells them to pick. If you had ordered this blue shirt, it would get picked. It gets put into a bin, gets put onto a conveyor. And ultimately it's gonna get sent to this sorting machine where we're gonna go see next.
Jennifer: Seeing all this in action really puts into perspective how clicking that buy button on a screen sets off a whole physical process.
David Cost: Items will drop out of those trays, go into a slot, go into a bin. That's how your order gets put together. It gets pushed through. There's a conveyor at the end that's gonna wind it up to the top. And then that's where we're gonna finally check the order. It's gonna get packed. Box gets sealed. It gets shipped out to the customer.
Jennifer: He says fraud management in places like this has changed quite a bit since he started this job. For example, they used to set up filters based on some simple questions.
David Cost: Was it a customer that we had ever seen before, right? We had certain flags that we would look for and somebody would manually go look at those orders and kind of have to just make a guess on whether they thought that order was good or not. We ship it out and if it turns out it's not good, then we, the retailer, you know, take that loss. So it's as if somebody stole those goods. It's the same thing as shoplifting. You know, and it can, it can have a real impact on your business and you want to be careful. If you're too cautious, you turn good business away. I mean, there's nothing worse than taking a customer who's got a good valid form of payment and telling them, nope. Sorry. Doesn't look quite right, so we're not gonna ship it.
Jennifer: One place Rainbow came up against this… was in doing business with buyers in other countries.
David Cost: So they're placing the order from one of the islands in the Caribbean. So their billing address is a Caribbean address, they're shipping the goods mainly to a Florida or a Texas re-shipper's address where a bunch of orders are combined to go to the island, right? And then they're taken into those islands. These are good customers but for traditional fraud, to be able to see that foreign credit card, foreign address, but a US delivery address would raise all kinds of alarm bells. So the sophistication of the AI now to be able to help us accurately figure out what's a good order and what's a bad order, lets us raise our level of customer service… So that we're taking care of customers at the same time, minimizing the potential loss to the business, with those orders that we would fulfill and ship and then not get paid for.
Jennifer: Rainbow is using a tool from fraud detection company, Signifyd… and he says these days they turn down less than 1% of their orders.
David Cost: In earlier times when we had to do this all manually… they would go to Google Maps and look up the address to see if the address looked legitimate. You know, now with the sophistication of the tech to know who you are, where else have we seen you order, digital IDs on your device, distance between your billing and shipping address. So many factors. So again, even five or six years ago, that process was manual and it could take a day or two, you know, now they're being made in fractions of a second.
Will Douglas Heaven: I mean, this happened to me yesterday, actually, you know, I got a text message. I'd taken my car to the garage and, you know, put a larger than usual sum on it.
Jennifer: That’s Will Douglas Heaven, Tech Review’s senior AI editor.
Will Douglas Heaven: And I got text messages from my bank saying, you know, Hey, was this you? Are you okay with that? And, you know, text Y if yes, text N if no. That happened because, you know, there's just constant monitoring of my transactions, like everybody else's. And the system will have built up an idea of my spending habits.
Jennifer: Fraud detection systems traditionally flag transactions based on specific rules… Like where and when a transaction took place, or if the customer shops there often.
Will Douglas Heaven: But the problem with doing that by rules is you've always got to, you know, humans have gotta say what is or isn't okay. And increasingly that's hard because you can't sort of think ahead and, you know, try and predict ahead what things might be suspicious and, and what things aren't. And it's also not very flexible. It doesn't adapt very quickly. So in the last few years, people have been turning increasingly to machine learning, to a kind of AI to add that flexibility and to be able to spot, sort of, anomalies suspicious behavior without sort of having to say beforehand what that might be.
Jennifer: These AI-based fraud detection systems are typically trained with a method called unsupervised learning… where an algorithm is given a large amount of data, but not told what to look for. The system finds patterns in that data… and anything outside what it deems ‘normal’… gets flagged.
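The unsupervised approach described here can be sketched with an anomaly detector such as scikit-learn's Isolation Forest. Everything below is illustrative: the transaction features, their distributions, and the thresholds are invented for the example, not drawn from any vendor's system.

```python
# A minimal sketch of unsupervised anomaly detection on transactions.
# The model is fit on historical behavior only -- no fraud labels --
# and flags anything outside what it learns to be "normal".
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Synthetic "normal" history: [amount in dollars, hour of day,
# km between billing and shipping address].
normal = np.column_stack([
    rng.gamma(2.0, 30.0, 5000),   # typical amounts, averaging ~$60
    rng.integers(8, 23, 5000),    # daytime and evening shopping hours
    rng.exponential(5.0, 5000),   # billing and shipping usually close
])

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal)

# Score new transactions: -1 means "outside what the model deems normal".
new_orders = np.array([
    [45.0, 14, 2.0],     # ordinary afternoon purchase
    [2500.0, 3, 900.0],  # huge amount, 3 a.m., distant ship-to address
])
flags = model.predict(new_orders)
print(flags)  # [ 1 -1]: the second order gets flagged for review
```

Note that the model was never told what fraud looks like; the second order is flagged only because it sits far outside the learned distribution.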
Will Douglas Heaven: When a flag is raised because it's broken a rule, it's very clear to the humans who need to step in and figure out why that alert has been raised. And if you're gonna flag a large transaction as being suspicious, then you need to have a good reason for doing so. And if this transaction has been flagged because it breaks a rule, then that's very clear cut. But if it's been flagged by a machine learning system that sort of has taught itself what is, and isn't normal, it might not be immediately clear why this alert has been raised. And if you need to go to a big business client and give them a reason for why you're stopping their transaction, “the system says so,” isn't always good enough.
So one thing we're seeing is sort of a hybrid approach where machine learning systems are being used to discover new types of suspicious activity. And then if they're right in what they spot, then that could be entered into the system as a new rule. And, you know, then you have sort of, you know the transparency of a rule based system, but the sort of the flexibility of a machine learning system, that's able to spot things that you may not have anticipated.
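The hybrid approach Heaven describes can be sketched as a transparent rule layer that grows over time: when analysts confirm a pattern the machine learning model surfaced, it gets codified as a new, explainable rule. All rule names, fields, and thresholds below are hypothetical.

```python
# A minimal sketch of a hybrid rule/ML pipeline. The rule layer is
# human-readable, so every flag comes with a reason a reviewer can
# give a client; ML-discovered patterns are promoted into it.

RULES = [
    ("large_amount", lambda tx: tx["amount"] > 2000),
    ("late_night", lambda tx: tx["hour"] < 5),
]

def check_rules(tx):
    """Return the names of the rules a transaction breaks -- the
    'reason' a human can cite for stopping it."""
    return [name for name, rule in RULES if rule(tx)]

def promote_to_rule(name, predicate):
    """Once analysts verify a pattern the anomaly detector kept
    flagging, codify it so future flags are transparent."""
    RULES.append((name, predicate))

tx = {"amount": 150, "hour": 2, "ship_km": 950}
print(check_rules(tx))  # ['late_night']

# Suppose the ML system kept flagging orders shipped far from the
# billing address, and analysts confirmed it was suspicious:
promote_to_rule("distant_shipping", lambda tx: tx["ship_km"] > 500)
print(check_rules(tx))  # ['late_night', 'distant_shipping']
```

The point of the design is the division of labor: the ML side spots things nobody anticipated, while the rule side supplies the clear-cut explanations the interview says clients expect.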
Jennifer: You can find links to our reporting in the show notes... and you can support our journalism by going to tech review dot com slash subscribe.
We’ll be back… right after this.
Raj Ramanand: Most of us have experienced this at some point in our life. You know, you have a credit card and you've got a phone call from your credit card company saying, Hey, it looks like you've done a transaction and this doesn't look like it's yours. We could either be traveling somewhere internationally and it may be the first time we're traveling there and we get almost a cancellation on the credit card saying that this doesn't look like it's you, even though it could be you. What happens for the non-consumer, which is the retailer. If you look at their side of the story, they end up declining billions of dollars of transactions on an annual basis because they think it's fraud.
I'm Raj Ramanand, the co-founder and CEO of Signifyd.
Jennifer: His company uses machine learning in its fraud detection platform. And it’s what’s used in the retail distribution center we visited earlier. The company is one of several industry players including Riskified, ClearSale and Sift.
Raj Ramanand: So we look at trying to approve more good transactions in the world of commerce and turning away bad users, thereby protecting merchants from the downside of fraud and risk. While at the same time, increasing conversions and driving higher commerce.
Jennifer: When it comes to fraud, reporting a strange charge on your credit card typically leads to you and your card company getting that money back. But it’s a different story for the retailer, which usually ends up eating those costs.
Raj Ramanand: And so to protect themselves from the liability of fraud they start putting barriers to be able to turn away the fraudsters and in return for turning away the fraudsters, they turn away a lot of good people like us. So when you look at the size of the problem today and what retailers have to deal with, they're turning away anywhere from 1% to 20% of their traffic, which is good people who are trying to buy from sites, which is a massive problem. Because if you look at eCommerce in general, it's close to 4 trillion globally and if you're turning away 20% of that, you're turning away about a trillion of commerce and that's a massive amount of revenue for these guys, but at the same time, more than that, it's, it's the experience you face when you buy at a retailer and get turned away.
Jennifer: And returns are another issue for retailers.
Raj Ramanand: If you've ever bought a pair of say, Nike shoes online, and then you return it. It takes you anywhere from seven to 14 days for that money to come back on your credit card. It's not because the payment rails don't exist to put that money back onto your card. It's because Nike is afraid that if you don't ship those pairs of shoes and you ship something else. Then they've lost the money. And so it comes back to a fraud problem that effectively impacts a customer experience problem. And so if you could tell the retailer that, Hey, Jennifer is a good person, give her her money back today, but because you gave her the money back today, she may shop again at your store instantly. You drive conversion at that point, you deliver a great consumer experience. But you don't block their transactions because of the fear of fraud, that's one of the interesting problems that I think is gonna be the next phase of what retailers have to deal with.
Jennifer: But getting to this next phase of consumer trust, requires closing the information gap between retailers and card companies.
Raj Ramanand: The biggest problem in payments, which is a little bit hidden under the rails that no one talks about, is that most of these pipes, if you want to call them—the connections have been built 20 years ago. And those pipes were not built for the internet. And over the 20 years as eCommerce boomed, they simply built on top of these rails and they sent the same information that was expected, but they never sent a lot of the new information you collect in the digital age like an email address or a shipping address or a phone number or… All of that information just sits with the retailer and doesn't flow down the stream all the way down to the issuer. And so what's happened is, because these chains of custodians if you will, don't get that information. They can't make the most accurate decisions on a transaction. Think of it like if you're applying for a home loan, but you had no credit history ever, how could I approve you for that home loan, because I've never seen you before. It's very similar. I have seen nothing about you. How do I make that decision?
Jennifer: And these decisions are increasingly nuanced. The pandemic caused a boom in curbside pickup, and added an extra layer of complexity in fraud detection.
Will Douglas Heaven: I spoke to one company that builds these fraud detection systems, that uses AI for fraud detection.
Jennifer: Again… Here's Will Douglas Heaven.
Will Douglas Heaven: And they were talking about how in the early days of lockdown, everybody started buying different things. Lots of people started buying lawn mowers or home improvement tools because they were stuck at home and they wanted to do different things. And so they got some flags for, for that kind of behavior, but it, they just needed to step in and, and, and teach the system that actually, you know, people are now doing this, it's okay.
Jennifer: And this ability for constant self-improvement, makes the system harder to outsmart.
Will Douglas Heaven: So if I'm a money launderer or, you know, a, a fraudster, then it's gonna be harder for me to do that because unless I'm very, very good at making this simply look like regular everyday behavior, then whatever I do will be flagged as out of the ordinary… So, if I'm a criminal, the game is just to do stuff that looks completely boring, completely normal.
Jennifer: This episode was produced by Anthony Green with help from Emma Cillekens. It was edited by me and Mat Honan, mixed by Garret Lang… with original music from Jacob Gorski.
If you have an idea for a story or something you’d like to hear, please drop a note to podcasts at technology review dot com.
Thanks for listening… I’m Jennifer Strong.