Image from Canva
I sat down for a virtual interview with Dominic Forrest, Chief Technology Officer at iProov — a man who has spent the last 12 years staring at forged faces. Within the first five minutes of our call, he did something that made me do a double-take: he swapped his face for someone else's.
It wasn't a clunky, lagging filter. It was seamless, hyper-realistic, and, frankly, terrifying. It reminded me of the face-morphing in Michael Jackson's "Black or White" music video (probably aging myself there), except this time the magic trick is the new favorite tool of cybercriminals.
The democratization of the deepfake
A year or two ago, creating a convincing digital clone required a team of PhDs and a server room’s worth of power. Today? "The skills required to do this have been taken out," Dom explained. Anyone with a mid-range gaming PC can download free software and, using just a single image from your LinkedIn or Facebook, appear as you during a live video call.
In 2025, deepfakes came of age for criminals. 2026 is the year for the rest of us to catch up. According to iProov's data, 99.9% of people cannot consistently tell a real person from a deepfake across a sample of 10 videos. Just think of all those deepfake videos of Vico Sotto or Leni Robredo urging people to invest in sketchy companies.
I even wrote a story on a content creator whose image was used in a deepfake to promote an online sportsbook app.
Dominic Forrest, Chief Technology Officer at iProov
Where the walls are thinnest
We often worry about opening new accounts, but Dom pointed out a "silent" danger zone: Rebinding.
Think about what happens when you get a new phone. You need to move your banking app, reset your credentials, and prove it's still you. This "re-issuing" of credentials is where the biggest heists happen. Dom cited the 2023 MGM hack in Las Vegas, a roughly $100 million disaster triggered simply by resetting a system administrator's credentials.
If we rely on one-time passwords sent via SMS or email (which the BSP has already called to phase out by the end of next month), we put ourselves at risk: those codes can be phished or SIM-swapped.
This is where strong liveness detection comes in. It's not just about matching a face, which, as we've discussed, could be a deepfake (and fraudsters are getting better and better at it); it's about proving there is a real, live, breathing human on the other end of the lens.
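To make the idea concrete, here is a minimal, hypothetical sketch of the challenge-response principle behind liveness checks. This is not iProov's actual method; all names and the 3-second window are illustrative assumptions. The core idea: the server issues a random, one-time challenge (in real systems, something the camera must capture live, such as a sequence of screen colors), so a pre-recorded video or pre-rendered deepfake cannot contain the right answer.

```python
import secrets
import time

# Illustrative only: a toy challenge-response liveness check.
CHALLENGE_TTL = 3.0  # seconds; a live human responds quickly, replays lag

def issue_challenge():
    """Return a one-time random challenge and its issue timestamp."""
    return secrets.token_hex(8), time.monotonic()

def verify_response(challenge, issued_at, response):
    """Accept only a matching response that arrives within the window."""
    fresh = (time.monotonic() - issued_at) <= CHALLENGE_TTL
    return fresh and secrets.compare_digest(challenge, response)

# A live client echoes the fresh challenge and passes...
challenge, issued_at = issue_challenge()
print(verify_response(challenge, issued_at, challenge))   # True
# ...while a replayed recording carries a wrong, stale answer and fails.
print(verify_response(challenge, issued_at, "0" * 16))    # False
```

Because each challenge is unpredictable and expires fast, an attacker would have to generate a correct response in real time, which is exactly what live-rendered deepfakes try to do and what production liveness systems are built to detect.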
Leaving no one behind
One thing that really resonated with me—both as a journalist and a "tech mom"—is the necessity of inclusive security.
"Why should it only be the iPhone user who gets the security?" Dom asked. In the Philippines, where digital payments are exploding, security has to work on a budget handset in a remote province just as well as it does on the latest flagship in Makati.
The good news? The bias issues that plagued facial biometrics seven years ago (when systems struggled with different skin tones and ethnicities) are largely a thing of the past. Leading vendors are now hitting "equality of outcome," meaning your grandma's face is just as secure as yours, regardless of her technical literacy. That matters, too, since senior citizens are frequent targets of online scams and fraud.
Educate and verify with tech
I learned from my chat with Dom that we have reached a point where we can no longer trust our own eyes. We shouldn't expect older people or the non-tech-savvy to spot deepfakes on their own. Still, sitting down with your grandparents and elderly uncles and aunts, at least to make them aware that deepfakes exist, is a step in the right direction.
Companies and banks should step up with their identification and verification processes as well.
The Philippines is moving in the right direction, with regulators mandating a shift away from interceptable SMS codes toward strong biometrics. I also wrote about silent authentication, which you can read about here.
As for us? We need to keep asking the hard questions. Because in a world where anyone can wear your face, "seeing is believing" may not always be true.