How can you be sure an image wasn’t Photoshopped? Make sure it was shot with Truepic. This startup makes a camera feature that captures photos and adds a watermark URL linking to a saved copy of the image, so viewers can compare the two and confirm the version they’re seeing hasn’t been altered.
Now Truepic’s technology is getting its most important deployment yet as one way Reddit will verify that Ask Me Anything Q&As are being conducted live by the actual person advertised — oftentimes a celebrity. [Update: To be clear, there’s no Reddit-wide or corporate partnership here. The moderators of Reddit’s independent r/IAmA subreddit have opted to suggest people use Truepic.]
But beyond its utility for verifying AMAs, dating profiles and peer-to-peer e-commerce listings, Truepic is tackling its biggest challenge yet: identifying artificial intelligence-generated deepfakes, in which AI convincingly replaces one person’s face in a video with someone else’s. Right now the technology is being used to create fake pornography that combines an adult film star’s body with an innocent celebrity’s face without their consent. But the bigger concern is that it could be used to impersonate politicians and make them appear to say or do things they haven’t.
The need for ways to weed out deepfakes has attracted a new $8 million round for Truepic. The cash comes from nontraditional startup investors, including Dowling Capital Partners, former Thomson Financial (which became Reuters) CEO Jeffrey Parker, Harvard Business School professor William Sahlman and more. The Series A brings Truepic to $10.5 million in funding.