Deepfake & AI-Driven Fraud: The Largest Challenge in Payment Security in the Next Decade, by Raksha Vashishta

Raksha Vashishta
Product Owner

Automatic Summary

The Future of Finance: Women in Technology and AI-Driven Fraud

Welcome to our deep dive into the evolving landscape of finance, particularly focusing on the intersection of women in technology and the challenges posed by AI-driven fraud. My name is Raksha Vasishta, a product owner at Vira Mobility and a passionate advocate for inclusion in technology. With over six years of experience in product ownership, as well as roles in education and mentorship, I aim to shed light on the pressing financial security issues we face today.

The Financial Landscape in 2025

It is predicted that by the end of 2025, financial losses attributed to AI-driven fraud could reach a staggering $10.5 trillion. This alarming figure underscores a deeper issue: AI scales asymmetrically, allowing fraudsters to exploit vulnerabilities faster than financial institutions can respond.

Why Focus on Women?

  • Women are at a unique intersection in the financial technology landscape—both as architects of security systems and as consumers navigating a complex digital economy.
  • Despite advancements, women hold only 24% of cybersecurity positions globally, creating blind spots in how financial systems are secured.
  • Emily Chang's book Brotopia highlights the ramifications of this lack of diversity, noting that it shapes how security mechanisms are conceptualized, built, and deployed.

The Gender Gap in Fraud Vulnerability

Research indicates that women, on average, lose about $1,700 more to AI-driven fraud compared to men. Female-headed households are also disproportionately affected, making it crucial to address systemic vulnerabilities in financial transactions and digital payments.

The Evolution of AI Fraud

Over the past five years, the approach to fraud detection has shifted drastically. We are moving from a reactive stance—waiting for fraudulent activity to occur—to a more proactive, preventive approach. Current strategies include:

  • Enhanced Pattern Recognition: Utilizing two-factor authentication and transaction alerts.
  • Multichannel Protection: Employing systems that monitor transactions across various payment channels simultaneously.
  • Operational Efficiency: Automating identity verification and fraud detection to free human resources for higher-value tasks.
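As a minimal illustration of the pattern-recognition and transaction-alert ideas above, the sketch below flags a transaction when it comes from a previously unseen location or is far above an account's running average. The `Transaction` and `AccountProfile` names, the 5x-average threshold, and the new-location rule are hypothetical choices for this example, not details from the talk; a production system would use far richer signals.

```python
from dataclasses import dataclass, field


@dataclass
class Transaction:
    account_id: str
    amount: float
    country: str
    channel: str  # e.g. "card", "wallet", "ach"


@dataclass
class AccountProfile:
    # rolling history used for simple pattern checks
    usual_countries: set = field(default_factory=set)
    avg_amount: float = 0.0
    n: int = 0

    def update(self, tx: Transaction) -> None:
        """Fold a legitimate transaction into the profile."""
        self.usual_countries.add(tx.country)
        self.n += 1
        # incremental running average
        self.avg_amount += (tx.amount - self.avg_amount) / self.n


def alerts_for(tx: Transaction, profile: AccountProfile) -> list:
    """Return human-readable alert reasons; an empty list means no alert."""
    reasons = []
    if profile.n and tx.country not in profile.usual_countries:
        reasons.append("new location")    # place not seen before
    if profile.n and tx.amount > 5 * max(profile.avg_amount, 1.0):
        reasons.append("unusual amount")  # far above the running average
    return reasons
```

In practice, an alert like this would feed the notification and step-up-authentication (e.g. two-factor) flows the talk describes, rather than blocking the payment outright.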

The Dark Side of Innovation

While innovation brings improvements, it also opens doors for fraudsters to exploit systems:

  • No-Code Fraud Environment: Fraudsters no longer need extensive technical knowledge to exploit vulnerabilities.
  • Deepfake Engineering: AI technologies are being used to impersonate executives, complicating fraud detection.
  • Generative Manipulation: Tailored phishing campaigns are designed to bypass traditional security systems.

A Case Study: The 2024 £20 Million Deepfake Fraud

A notable case from early 2024 involved an employee in the Hong Kong office of the British engineering firm Arup, who was deceived into transferring roughly £20 million to a fraudster masquerading as the company's CFO. By leveraging voice simulation and deepfake video, the fraudster exploited the company's ongoing acquisition process. The incident highlights how sophisticated fraud has become and the urgency of robust protective mechanisms.

Defense Mechanisms Against Fraud

To combat these growing threats, the traditional approach to security must adapt:

  • Continuous Authentication: A shift from perimeter defense to ongoing verification of identities.
  • Data Sharing Solutions: Developing international frameworks for sharing fraud-related data across borders.
  • Consortium Approaches: Establishing early warning systems that can identify anomalies in fraud patterns.
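The repository and consortium ideas above can be sketched as a shared registry of fraud fingerprints: member institutions report salted hashes of confirmed synthetic identities rather than raw data, so the registry can cross borders without exposing personal information. The `FraudConsortium` class, the hashing scheme, and all field names are hypothetical illustrations for this example, not an actual consortium protocol.

```python
import hashlib


class FraudConsortium:
    """Toy shared repository of known fraud fingerprints.

    Members submit salted hashes, not raw identity data, so the
    repository can be shared across institutions and borders
    without exposing personally identifiable information.
    """

    def __init__(self, salt: bytes = b"consortium-demo-salt"):
        self._salt = salt
        self._known: set = set()

    def _fingerprint(self, identity_fields: tuple) -> str:
        # deterministic salted hash of the identity attributes
        payload = "|".join(identity_fields).encode()
        return hashlib.sha256(self._salt + payload).hexdigest()

    def report(self, identity_fields: tuple) -> None:
        """A member institution reports a confirmed synthetic identity."""
        self._known.add(self._fingerprint(identity_fields))

    def check(self, identity_fields: tuple) -> bool:
        """Early warning: True if any member has reported this identity."""
        return self._fingerprint(identity_fields) in self._known
```

A real early-warning service would also handle fuzzy matches and revocation, but the design choice the sketch shows, sharing derived fingerprints instead of raw records, is what makes cross-border data sharing tractable.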

What Can We Do?

The future demands a concerted effort from various stakeholders:

  • Financial Institutions: Ensure precision in digital payments and develop inclusive AI ethics.
  • Technology Teams: Create explainable AI systems and monitor biases to serve diverse populations effectively.
  • Next-Generation Awareness: Enhance digital financial literacy, particularly for non-technical roles and small business owners.

The Call for Diversity

Diversity is not merely beneficial; it's essential. Teams that are homogeneous create predictable blind spots in fraud defenses. As Aishwarya, an AI expert, states, "the algorithms we build to protect financial systems will only be as comprehensive as the diversity of the teams that create them."

Conclusion: Empowering a New Future

In an increasingly automated financial world, human judgment remains our unique capacity. As women in technology, we have two opportunities: to reshape security systems with our perspectives and experiences, and to empower the next generation with the tools and confidence to navigate a complex digital landscape. The question is no longer simply how to stop fraud, but how to build inclusive protection that serves everyone.

Video Transcription

So to get started, I want to introduce myself. My name is Raksha Vasishta, and I have been working as a product owner for six-plus years. I'm currently with Vira Mobility, a global leader in smart technology. Apart from my professional life, I also work on various initiatives: I serve as a faculty adviser for Virginia Commonwealth University, I'm a researcher and startup adviser for various AI/ML projects, and I serve as a mentoring coach for small business initiatives across the United States. I'm based out of Austin, Texas. So that's pretty much about me. Let's get started. The future of finance is not necessarily just about innovation; it's a race between protection and deception.

And I say this not merely because of the prediction that AI-driven fraud will cause about $10.5 trillion in financial losses by the end of 2025. The number doesn't necessarily mean everything. I'm also saying it because AI scales very asymmetrically. And as women in technology, we stand at a unique intersection: we are architects of these financial security systems, and at the same time we are consumers navigating an increasingly complex digital economy. Moving forward, why does this actually matter, and why women in particular? When I was doing my research for this presentation, I came across an amazing book written by Emily Chang, a journalist for Bloomberg. She wrote it to bring out the gender inequality and sexism that still shows its effect on Silicon Valley.

And she noted that the lack of diverse perspectives in financial technology does not just impact representation; it also fundamentally shapes how security is conceptualized, built, and deployed. I felt that was amazing. A lot of this gender inequality, combined with upcoming AI technologies, is not just going to create new challenges; it's also going to amplify the existing challenges we have across the world. And I found an interesting fact: women, on average, are more susceptible to fraud. A survey shows that women lost about $1,700 more than men to AI-driven fraud. Also, female-headed households are more susceptible to these frauds, especially in terms of digital payments and accounting transactions.

It's not just the numbers that concern me. The main concern is: what are we actually doing about it? Financial security teams across the globe are predominantly male, with women holding only about 24% of cybersecurity positions globally. What does this mean? It's creating blind spots in how we build these systems, how we design them, and how we protect the diverse categories of people that exist across the world, because one group is making decisions for the other. That's definitely going to increase risk. Moving forward, I wanted to speak a little bit about AI fraud evolution and where we stand today. From how things were five years ago to how things are now, I think it's a complete turnaround, especially in the way financial institutions have become more proactive than reactive.

We are moving away from waiting for a fraudulent activity to happen and then reacting; we are approaching things from a more preventive standpoint, which is really good. On enhanced pattern recognition, a lot of us have come across this: we are asked for two-factor authentication, and if you're making a transaction from a place you haven't visited before, there are multiple checkpoints and recognized patterns that make sure you immediately get notified about those kinds of transactions.

Then there's multichannel protection, which some industry experts call the "octopus effect." It allows a system to monitor transactions across all payment channels simultaneously, not just through one channel. If you're making a transaction through your phone wallet, say Apple Pay or Google Pay, it's being monitored from a holistic standpoint rather than just from one payment channel. And then operational efficiency: AI has increased automation, making identity verification and fraud detection far more efficient and freeing up human capital for higher-value work. It's really nice that we are making things more efficient rather than relying on conventional approaches to these processes. However, with every innovation there's always a flip side; it's a double-edged sword, I would say.

There is always a counterpart to these innovations, one that is weaponized by those seeking to exploit the system rather than leveraging it to make things better for people holistically. One of those is the no-code fraud environment: today, fraudsters don't need any technical expertise to carry out these activities. You don't have to be a technical person or have a lot of technical knowledge to design or redesign these approaches for your own advantage. We are also entering a new age of technology: deepfake engineering. In 2024 we've already seen multiple cases of deepfake technology being used to impersonate executives.

I'm going to talk about one of them in the next few slides as part of a case study. This is going to heavily influence financial institutions as a whole, especially payment channels and institutions that rely heavily on digital payments and credit card transactions, my company being one of them. There's also cross-channel exploitation.

There are ways for fraudsters to ask for details related to specific payment channels, say your ACH or wire transactions, but that information is not limited to those channels alone. They can leverage it to exploit other payment channels as well, which makes things more complex. And then generative manipulation: fraudulent actors are now using generative AI to build perfectly tailored phishing campaigns that bypass traditional detection systems. These are some of the major challenges I foresee in the upcoming decade. Moving forward, this is the case study I was talking about. The source is the Financial Times; it made news in early 2024, and it was pretty big.

In 2024, a Hong Kong-based employee of the British engineering firm Arup was deceived into transferring approximately £20 million to a fraudster. How did this even happen? That's such a large amount of money, and for somebody to get it through so easily, it must have been somebody really informed about the know-how of how the company operates and what the process looks like for making these transactions.

It was a very interesting case. It appeared to be an urgent video call from the company's CFO asking to transfer this amount to finalize a confidential acquisition. One of the major things the fraudster leveraged was the fact that the company was undergoing an acquisition; they were acquiring a new company, and this was all over the news. The fraudster was aware of this and used voice simulation, visual deepfakes (making the CFO appear to be in the office itself), contextual knowledge (as I said, they knew about the ongoing acquisition), and emotional manipulation.

It was timed exactly when the acquisition was being pursued. So the employee, following what they believed were proper protocols, actually authorized the transfer, and the sophisticated nature of this attack surpassed previous deepfake fraud attempts. Later it came out that the attack was carried out by a fraudulent group based in Eastern Europe. This just illustrates how what was science fiction only twenty-four months ago has become reality: nobody could have imagined that you could create an AI counterpart of a CFO, with the same virtual background, and show things so realistically that it results in a fraudulent activity worth £20 million. That was a very interesting case, which I felt was worth mentioning as part of this talk.

So moving forward, what can we actually do to circumvent some of these frauds? Are there defense mechanisms we can leverage? The traditional approach of isolated security systems is obsolete; financial security must now function as an immune system rather than a fortress. The critical paradigm shifts I believe are necessary here are, first, moving from perimeter defense to continuous authentication. Second, cross-border data sharing: international regulatory frameworks should be built so that financial institutions can freely exchange fraud transaction data across the globe. Next is repository development: some sort of repository that can confirm synthetic identities or fraud patterns is becoming more and more critical. And finally, a consortium approach, which is going to work more like an early warning service.

As soon as irregular anomalies or patterns in these fraud mechanisms are detected, expanding that warning beyond traditional banking systems would be highly beneficial. Moving forward, what does this actually mean for us? Different institutions are responsible for different things. From a financial institution's standpoint, what's extremely important is precision in how digital payments go through, cross-channel visibility (having more visibility into transactions), inclusive AI ethics, and gender analysis requirements.

I personally believe that being the victim of a fraudulent activity, especially in the digital space, should not be something that impacts your financial freedom, and you should not be specifically targeted because you're a financial minority. Also, as women in technology and leadership positions, I specifically feel that representation matters. As previously discussed, having more representation and visibility in cybersecurity roles would be highly beneficial, as would challenging some of the conventional assumptions that were put in place when designing these processes. Systems should be more holistic in nature, promoting inclusive design and showcasing leadership examples for future decisions in this space.

For development and technology teams: making things inclusive in nature, building explainable AI systems, and ensuring data equity that serves not just one category of people but a diverse range of genders.

Monitoring the biases these systems introduce would also be highly beneficial. For the next generation, and for nontechnical roles from a layman's perspective, digital financial literacy and amplifying your voice in this regard are highly beneficial, as are mentorship chains that can educate people, including small business owners. I personally work with a lot of small business owners on a day-to-day basis as part of my mentorship program, so I truly understand the value of this; knowing about digital payment fraud is highly beneficial. Moving forward, I know I have spoken a little about this already, so I'm going to make it quick for this slide: diversity is highly important.

As research shows, gender diversity in teams is extremely important. One of the AI experts in this space, Aishwarya, mentions that the algorithms we build to protect financial systems will only be as comprehensive as the diversity of the teams that create them. Homogeneous teams inevitably create homogeneous protections with predictable blind spots, which is truly accurate. Mixed-gender teams create more robust synthetic-fraud defenses and more holistic solutions. Last but not least, when I was thinking about this conclusion slide, I mainly wanted to focus on what we, as firsthand recipients of this AI transformation and as subjects of these AI frauds, can actually do, and how this impacts us.

In a world of financial automation, the unique human capacity is judgment, which is extremely important. We could say a lot more from a technical standpoint about the various challenges payment ecosystems will face. But as women in technology, I truly believe we have two main opportunities: to reshape security systems with our perspectives and experiences, and to empower a new generation of women and others with more tools and confidence to navigate this complex digital landscape.

Also, the question is no longer how do we stop the fraud, but how can we truly create inclusive protection that serves everybody? How do we maintain human agency in increasingly automated financial systems? And how do we ensure security does not come at the cost of financial inclusion?