
TransUnion Fraudcast Episode 1: Digital Fraud

Episode 1

In this inaugural episode of the TransUnion Fraudcast, host Jason Lord talks with digital fraud expert Jim Hendrie about why digital fraud has exploded since the beginning of the COVID-19 pandemic and what signals and technologies can be used to keep it in check — without degrading the customer experience.

Jason Lord:

Welcome to the TransUnion Fraudcast, your one-stop shop for all the absolute linkages between the day’s emerging fraud and authentication topics. We’ll cover trends, tropes and travails delivered with all the straight talk and none of the false positives.

I’m your host, Jason Lord, VP of Global Fraud Solutions and 15-year veteran of the fraud and marketing ecosystems.

If you’re new to the Fraudcast –– and I suspect you are because this is episode one –– each week we narrow in on a specific subtopic within the fraud and authentication universe, bringing on a special guest to help us dive in while keeping it high level enough that you don’t need a PhD in data analysis to understand what’s going on.

This week we’ll be focusing on fraud within the digital space.

Now, the digital channel has exploded since the pandemic to become the predominant way consumers interact with brands and agencies. And certainly it was already headed that way, but the pandemic accelerated the trend, and if you happen to see this year’s TransUnion State of Omnichannel Fraud report, you’ll have seen that fraud volumes have kept pace with the increase in digital interactions, which is just another way of saying that digital fraud is at an all-time high.

So what’s to be done? How can an organization build trust and safety back into their digital channels without making it difficult for legitimate consumers to interact with them?

Here to discuss this and much more is Jim Hendrie, Product Manager of Digital Insights Solutions at TransUnion, who has worked in fraud mitigation technology for 18 years, and certainly knows a thing or two about how technology can be used to improve fraud capture within the digital space.

Jim, welcome to the Fraudcast.

Jim Hendrie:

Happy to be here. Thank you.

Jason Lord:
Wonderful. So let’s dive in.

Where do many organizations that you speak with go wrong when fighting fraud in the digital channels?

Jim Hendrie:
So I don’t want to say “wrong,” necessarily, although there are some missteps for sure…we can always look for improvements, right? But I think the biggest challenge is getting away from a one-size-fits-all solution.

Not only do different fraud types need different solutions, but you also want to make sure, of course, you’re taking care of your good customers, who are the majority.

The overwhelming majority of your interactions with the public are with good customers, and you want to make sure they’re having a good experience while, at the same time, you’re protecting the services and the capabilities and products that you’re offering to the market.

And so “being nimble” is probably a phrase I’ll use a couple of times in this podcast: make sure you’re aware of the circumstances and what’s happening now, and adjust accordingly. And that means it’s not set-and-forget, right?

We always hear about fraudsters continuing to change their tactics using different technologies, different approaches coming back to old technologies, things that maybe a lot of us have forgotten and trying those again to see if they can still bear some fruit for them.

So, making sure that there’s an opportunity to consider what strategies do you have in place with what you knew at the time versus what you know now, and the data that you have now, and making sure that’s being properly implemented into the fraud strategies so that you can have that better experience for your good customers –– and better fraud catch as well.

So kind of wrapping that back up: the idea is to be nimble and not rest on what you’ve done in the past, but make sure you’re always looking at new ways to apply what you have.

Jason Lord:
Well, you said something really interesting in there, Jim, which is that it’s not just about new technology; it’s also perhaps about seeing whether technologies you may have left in the past might still be applicable.

And I’m really curious to know, can you give me some examples of both the new and the old technologies that people should be considering?

Jim Hendrie:
Yeah. So you referenced the Omnichannel report as we went through COVID.

Remember that many financial institutions, as an example, didn’t have branch access for their customers, and there was a massive increase in call center interaction. Which of course meant that fraud moved into the call center channel.

So whereas many people think about digital as the future, call centers, current and future, are still a key part of the interactions and are certainly vulnerable to varying degrees from a fraud perspective.

Jason Lord:
It’s also worth probably pointing out that a lot of fraud doesn’t stay within one channel, right?

Jim Hendrie:
Yeah, exactly.

Jason Lord:
So it might start in the call center and end up in the digital channel.

Jim Hendrie:
Yeah, there are more avenues for the fraudsters to pursue to accomplish their goals, and that could be with social engineering: they’re trying to get more information from the call center, as an example, to use for a digital submission, or vice versa, right?

So these all are working together, but again, every day there’s more context that’s being provided and therefore there is more opportunity for fraudsters to take advantage of that.

Jason Lord:
Now, one of the other things that you mentioned in your opening remarks is about the customer experience, and a lot of fraud analysts will talk about balancing “fraud prevention and customer experience.” And I always dislike that phrase because to me it sounds like a zero-sum thing, right?

You can either have fraud controls and a friction-filled experience, or you can have an ideal customer experience with a ton of fraud slipping through.

I’m curious to know how you think of that dichotomy, and is there a way to have both strong fraud controls and a good customer experience?

Jim Hendrie:
Yeah, absolutely. And you’re correct, it shouldn’t be an either-or. It should be an “and” conversation, not an “or” conversation, so that you have a good mix of appropriate tools and screenings and vetting in place, and the right friction is applied to the right risk involved in that interaction.

So the history of that customer, the appropriateness of the interactions they’re having with your site, as an example, the timing in which they are doing these things or they’re jumping right ahead to change their contact info and they’ve never placed an order with you before –– these kinds of contextual things are certainly important.

But as we’re going through this continual advancement in how fraud strategies can be brought to bear on the fraudsters, there’s always got to be an understanding of the impact, and perhaps the benefits, for the good customers. A good customer expects a certain level of security and fraud prevention. Of course, when they’re interacting with a financial site, you probably want some validation, a confirmation request, when you’re transferring $10,000 to somebody.

Jason Lord:
Right. If I were to log into my bank account and do a wire transfer and nobody questioned my identity at any point, I might freak out a little bit.

Jim Hendrie:
Exactly right. Like something got missed. And that’s part of the thing: it’s really great to have a lot of those checks happen under the covers.

Again, that reduces the friction –– it makes it for a better user experience, but at the same time you want to know that there’s some handholding going on and that people are paying attention to these things with an appropriate level of seriousness.

And so when we’re talking about this in terms of that balance, as you mentioned, again most of your transactions are with good customers, and they are the paramount concern.

But you have to do it with that meaningful understanding of what types of restrictions or additional steps you might put in place that are appropriate for the risk of that interaction.

If I’m logging in, maybe there’s not as much damage that I can do as I log in, but certainly when I’m doing financial transfers or I’m changing contact info, etc., that should have a higher level of that friction to make sure that the right people are doing the right things.
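The risk-proportional friction Jim describes can be sketched as a simple policy: low-risk actions like login draw no visible checks, while high-risk actions like transfers or contact-info changes trigger a step-up challenge. A minimal illustrative sketch in Python; the action names, risk tiers, and challenge labels are hypothetical examples, not any actual TransUnion API:

```python
# Illustrative sketch of risk-based step-up friction. Action names, risk tiers,
# and challenge labels are hypothetical examples, not any vendor's actual API.

ACTION_RISK = {
    "login": 1,                # limited damage possible at login alone
    "place_order": 2,
    "change_contact_info": 3,  # common account-takeover move
    "wire_transfer": 3,        # e.g., the $10,000 transfer confirmation case
}

def required_friction(action: str, trusted_history: bool) -> str:
    """Return a challenge level proportional to the risk of the action."""
    risk = ACTION_RISK.get(action, 3)  # unknown actions default to high risk
    if not trusted_history:
        risk = min(risk + 1, 3)        # new or unusual users get extra scrutiny
    if risk >= 3:
        return "step_up_challenge"     # visible confirmation: "Did you do this?"
    if risk == 2:
        return "passive_device_check"  # behind-the-scenes device/behavior signals
    return "none"                      # established customer, low-risk action
```

Under these assumptions, a well-established customer logging in sees no friction at all, while a wire transfer always draws a visible confirmation, matching the expectation Jim describes.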

Jason Lord:
And presumably the data and technology you’re referencing, in addition to identifying risky interactions, also identify the safe interactions and then remove the friction from them, so that consumers have that ideal experience they’re looking for.

Jim Hendrie:
Exactly. So in a login scenario and with a well-established customer you have history, you have context. You can start to understand their normal behaviors, etc.

And so things that stay within that realm certainly help to build that confidence, that it’s the same user and they’re doing their normal process of interacting. And therefore you don’t have to necessarily put as many barriers in their way, whether it’s visible to them or even behind the scenes.

So you can save money on the processing of those transactions if you don’t need it, but if something is different, you have that ability to put something new in the gate, so to speak, so that you can make sure you’re challenging appropriately.

Jason Lord:
And that’s an interesting point you make, because fraud prevention is often thought of as a cost center, but you’re indicating that if done appropriately, it might actually save you operational costs in the long run.

Jim Hendrie:
Oh, absolutely, yeah. If you don’t need to ask four questions about something because you already know you’re going to stop this transaction based on prior history, then yeah, you can save a bunch of money on a per-transaction basis by not doing additional verification steps that aren’t going to be used in your final disposition, because you’ve already answered that question. So…save cost, shortcut, save time: lots of value there.

Jason Lord:
And you’ve already started to dive into that.

There are different points in the customer journey that probably require a different type of engagement.

So on the one side, there’s account origination and onboarding. And then on the other side, there might be transactions and account management. As you’re thinking about that customer lifecycle: How do the technology and signals change as you move through the customer journey?

Jim Hendrie:
Yeah, great point.

So at account origination, application, those sorts of events, that’s where you have your best opportunity to get the most information about that customer as appropriate.

We want to make sure that you’re always being privacy appropriate, right? Only asking for information that’s necessary for what you’re trying to do.

But at that point it’s normal for somebody who’s requesting, let’s say, an account opening at a financial institution to provide first and last name, Social Security number, address, contact info, things of that nature. And they also understand that it’s going to take a moment for that application to be processed.

So on the back end, you’re calling out to different vendors, different services in-house to address those fraud risks, and it takes a little time to sequence those together.

Customers are generally going to be a little more accepting of that, but at login, typically we all expect a login experience that’s very, very fast because I’m providing my login credentials.

No additional checks are really happening as far as, like, identity verification or things of that nature, ‘cause I’m not normally going to be expected to provide that at a login, so it’s a much quicker, faster type of expectation.

But again, when we get into the high-risk types of transactions like you mentioned, a money transfer, you kind of want to be asked, “Did you do this?” because you want that extra step in place to make sure that the right people are getting the right outcomes.

Jason Lord:
I’m going to bring us back to my first question as we think about the customer journey. And I know that you’re a nice person, Jim; you don’t want to criticize anybody for any choice that they’ve ever made… But what are inappropriate ways to apply technology or data or friction, depending on where you are in the customer journey?

Jim Hendrie:
So I mean, there are certainly times where the lowest common denominator has been discussed as a fraud strategy. So limiting the use of, let’s say, one device per user forever.

Sounds good, maybe, the first time it’s said.

But then there’s the immediate realization that, well, you would expect a customer to evolve and buy a new computer over time. As an example, right?

So with that type of thing, obviously you have to think about the next step: overbearing restrictions on the number of computers that might be associated with an account, or blocking by geography, where you actually do have a policy in your terms of service about what regions you serve.

But sometimes there have been suggestions that were a little too strict, and luckily most customers have figured this out in the discussion phase, because obviously that could result in some poor customer experience.

So think of it in terms of all the tools that you have, because they are tools: they’re going to do what you tell them to do. Working on them individually is certainly beneficial, but thinking about all those tools collectively, and how they interact with one another, is a really important aspect of this.

To make sure that you have a relatively seamless and straightforward strategy in place so that you’re not creating trouble for yourself and, more importantly, not creating contradictory outcomes for your good customers.

Jason Lord:
That seems to be a motif I hear a lot: don’t look at the data, and don’t look at the technology, in isolation, right? These things have an impact on one another, and if you want to understand the person at the other end of the interaction, you have to look not just at the phone individually or the offline identifiers individually, but at how all these things in combination reflect your understanding of the risk of this individual.

I wanted to talk a little bit about, you know, the concept of identity proofing is probably well known to many of the people listening. It’s using data and documents to verify that the person’s claimed identity matches their actual identity.

I’ve heard you speak on a similar need to create a standard for device proofing. So what is device proofing, and how might it differ from other forms of digital fraud management?

Jim Hendrie:
So device proofing as a general concept is really trying to establish more depth of understanding, exactly to your point. There are three things that we’re thinking of it as.

So I’ll say it is three-dimensional, though I guess in some sense you could look at fourth and fifth dimensions as well. The idea is to provide, in that onboarding interaction, as much detail as you can to understand the different data that’s available: how they corroborate one another, where there are differences, and where those differences are meaningful.

So as an example, in this case, we use a multifaceted approach with those different views. I call it PMC, for provided, monitored and collected.

So I’ll break that down a little bit.

Provided would be the information that I am providing to you in this application for a loan, as an example. I could put in whatever information I want.

There’s going to be validation of that, but the provided information is from the end user who’s making the application.

There’s a monitoring aspect. How am I providing that information?

Am I putting that information in a normal form flow, as you’d expect? First and last name, I type it out, no problem; I put in my address and go through the form top to bottom. Or am I jumping around and copying and pasting? Am I flipping between the application webpage and the rest of the browser, going off and coming back? I might have the correct information, but I’m putting it in, submitting it, in curious ways, maybe even fraudulent ways. And then the last part of that is the collection of data.

Typically this would be passively done, device fingerprinting as an example, where you’re gathering information about the device and the network, the IP address, etc., that the customer is not providing, but it’s being pulled in and used in combination with the other aspects just mentioned to provide a more robust, comprehensive understanding of the user and where they’re coming from.

Is there consistency with the data they provided? You bring that all together into a score or result that’s inclusive of all those things at that step, and bring it to bear for the decisioning you want to make: are you going to allow that transaction to continue, are you going to outright deny it, or, where there are still questions, does it go to the review queue? With more answers about that transaction, you can make a quicker decision about what you want to do with it.

And as you learn from that, you can even fine-tune it so that you can get to allow or deny a little more quickly and a little further upstream, so you don’t even have to go into the review queue and have a resource take a look at it and spend time trying to decide which way to go with it.
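The PMC idea of rolling provided, monitored, and collected signals into one score that routes a transaction to allow, review, or deny can be sketched like this; the weights and thresholds below are invented for illustration and are not TransUnion’s actual model:

```python
# Illustrative sketch of combining "PMC" signals (provided, monitored, collected)
# into one score and routing the transaction. Weights and thresholds are
# invented for illustration, not TransUnion's actual model.

def pmc_score(provided_mismatches: int,
              monitored_anomalies: int,
              collected_flags: int) -> int:
    """Higher score = riskier. Each PMC dimension contributes independently."""
    return 10 * provided_mismatches + 5 * monitored_anomalies + 8 * collected_flags

def route(score: int) -> str:
    """Push clear cases out of the review queue so analysts see fewer transactions."""
    if score < 10:
        return "allow"    # consistent data, normal behavior: no visible friction
    elif score < 25:
        return "review"   # mixed signals: queue for a human or a step-up check
    else:
        return "deny"     # risk corroborated across dimensions
```

The point of combining the dimensions before routing is the one Jim makes: a transaction that looks clean on any single dimension can still accumulate enough corroborated risk across all three to skip the review queue entirely, in either direction.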

Jason Lord:
And to your earlier point, presumably save money or increase conversions over time.

So it’s not just fraud prevention. Again, it’s about improving the business KPIs.

So if I heard you correctly, and please correct me if I misunderstood, it sounds like there are three components to this concept of device proofing.

There’s the device risk or device reputation piece of it, right? So that’s consortium data.

Presumably, then, there’s device-to-identity linkages, which is about what we know about the individual and how it links to the device they hold.

And then there’s the behavioral analytics portion, which is how am I interacting with the form.

Am I doing it in a way that a normal consumer would, or am I doing it like a bot or a fraudster might, behaving in some way that seems abnormal? So that combination…what does that do differently than device reputation alone, or than what device biometric technologies alone might do? How does this combination impact business decisions or the customer experience differently?

I think there’s probably a couple aspects here.

Obviously individual data points are very valuable, of course, but they can be misleading and so the context in which these three data points come together is really valuable. The sum is ––

Jim Hendrie:
Sorry, the total is greater than the sum of the parts where all of these things are combining together to provide a more accurate and more dimensional understanding of that interaction.
Whereas I might have said this device doesn’t have any peculiarities to it, any anomalies; there’s no negative history with it; it hasn’t done a variety of transactions previously, so it doesn’t have high velocity. But again, from the behavioral standpoint, perhaps there are a lot of oddities in how that data was presented.

That’s context. That is really important.

In addition, then, tie that back to the device. So let’s say it did have a slightly high velocity.

It’s done a couple of applications in a 24-hour period, as an example, and that in itself may not be too bad, but it’s something that raises attention as a risk.

Jason Lord:
So it sounds like what you’re saying is that there are many times when you know there may be enough signal based on device reputation, but there might be times when it’s not.

And now if I’m thinking as a person who has to make a decision about whether to let this transaction through, let’s say I’m a bank, so it’s a risky transaction, I’m probably going to default to treating it as risky, right?

I’m probably more likely to decline it, accepting that it may be a false positive, than to let it through, just because I don’t have enough signal. So presumably that bumps up the false positive rate, and the conversion rate goes down.

Jim Hendrie:
Yeah, exactly.

Jason Lord:

So if you’re using these things in combination, then you’re able to see a group of transactions that you might otherwise have labeled as false positives, because you now have more signal than you ever had before.

Jim Hendrie:
Yeah, exactly. It’s just trying to build out…I use this phrase probably too often, but comfort and confidence so that the outcome that you have has had different dimensional aspects brought to bear upon it.

So you build out a stronger confidence that that decision is a good one. It’s not made in isolation; it’s made in context with other data. And, you know, all the markets have really brilliant people working on these teams.

And so this is one aspect of many things that they’re bringing to bear with their experience and their knowledge on their respective spaces on what they should be doing with this.

And so anything that can add more context, and again, that comfort and confidence is certainly a win.

Jason Lord:
If you have to leave our listeners with one piece of advice about how they think about their digital fraud mitigation strategy and staying one step ahead of the fraudsters, what would you recommend?

Jim Hendrie:

In a single word: Nimbleness.

Expect that you’ll have to make changes to your strategy and plan ahead.

And I know it’s very difficult sometimes to get the resources to do these things. So the more you’re planning ahead and anticipating that changes will be needed, the better off you are.

I often refer to the OODA loop; I’m not sure how familiar people are with that, but it’s basically a principle of observe, orient, decide and act.

So observe the issue, the fraud, the risk, whatever those things are, orient it to the context in which it’s happening to understand the why and the how and what those options could be to then decide on what technologies or strategies you’re going to put in place, whether they exist already or if they’re new things that have to be developed.

And then actually implement those so that you’re acting on that and then the loop starts again.

Now observe the outcomes of those: Has the fraud rate dropped? Excellent.

Has the conversion rate also dropped? Uh, let’s work through that. Let’s try to make that better again.

 Ideally you want to be looking at things that are very specific and very targeted in their approach so that you are addressing the problem, but not creating additional problems for your good customers.
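One turn of the observe-orient-decide-act loop Jim walks through could look like the following sketch; the metric names, thresholds, and the single `step_up_threshold` knob are hypothetical stand-ins for a tunable fraud strategy:

```python
# Illustrative sketch of one OODA (observe, orient, decide, act) turn applied
# to a fraud strategy. Metric names and thresholds are hypothetical examples.

def ooda_step(metrics: dict, strategy: dict) -> dict:
    # Observe: pull the latest outcomes of the current strategy.
    fraud_rate = metrics["fraud_rate"]
    conversion_rate = metrics["conversion_rate"]

    # Orient: interpret them in context. Did fraud drop at the cost of
    # conversions, or is fraud still leaking through?
    fraud_leaking = fraud_rate >= 0.01
    too_much_friction = fraud_rate < 0.01 and conversion_rate < 0.80

    # Decide: pick a targeted adjustment rather than a blanket rule.
    if fraud_leaking:
        adjustment = {"step_up_threshold": strategy["step_up_threshold"] - 5}
    elif too_much_friction:
        adjustment = {"step_up_threshold": strategy["step_up_threshold"] + 5}
    else:
        adjustment = {}

    # Act: apply the change; the loop then starts again with fresh observations.
    return {**strategy, **adjustment}
```

The targeted single-knob adjustment mirrors Jim’s closing advice: address the specific problem observed in this cycle without creating new problems for good customers, then observe again.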

Jason Lord:
And once again, look at it all in combination; don’t look at it in isolation, right?

Well, Jim, I want to thank you so much for your time today. I found this conversation valuable.

I hope our listeners did as well.
Jim Hendrie, the Product Manager of Digital Fraud Solutions at TransUnion, thank you for your time today, and thank you all for listening out there.

I hope you take a listen to the rest of our Fraudcast. In the meantime, stay smart and stay safe.

TransUnion Fraudcast

Your essential go-to for all the absolute linkages between the day’s emerging fraud and identity trends, tropes and travails — delivered with straight talk and none of the false positives. Hosted by Jason Lord, VP of Global Fraud Solutions. 

For questions or to suggest an episode topic, please email

The information discussed in this podcast constitutes the opinion of TransUnion, and TransUnion shall have no liability for any actions taken based upon the content of this podcast.

Learn more about TransUnion’s TruValidate Device Proofing, and how it has made a difference for TransUnion customer Poshmark.
