Ishbel Macleod

Met Gala deepfakes spark fears over authenticity



Met Gala outfits are a strong conversation topic each year, but at the 2024 event it wasn’t designers Maison Margiela, Givenchy or Alexander McQueen making a splash…it was AI. As the public turned to social media to see what the stars wore to the event, some people did a double-take: alongside the genuine red-carpet photos, deepfake AI images of celebrities were being shared.


One picture of Katy Perry, which has had over 14 million views and 314,000 likes at the time of writing, features her in a white gown strewn with flowers and trimmed with moss. The only problem? The singer was not at the event, and the image was a doctored photo using the Met Gala carpet from 2018. Despite this, the photo was so realistic it fooled Katy Perry’s own mum, as well as fans.


This was not the only AI image to circulate from the event: other fake photos of Katy Perry, as well as of Rihanna, were also shared.


It might seem like a bit of fun, bringing back memories of dressing up dolls, but the fact that these images were believed and shared so widely is an indicator of things to come. While a few wonky fingers appear if you zoom in, AI is far more sophisticated than it once was…and that means it is getting harder to tell what is real and what is not. This brings fears around authenticity and brand image to the forefront.


Earlier this year, singers including Billie Eilish and Zayn Malik signed an open letter calling for wider protection against deepfakes. While that letter focused mostly on audio, deepfake images can do just as much damage to artists and their reputations. In fact, at the start of May, the All-Party Parliamentary Group on Music in the UK called on the UK Government to create “a specific personality right to protect creators and artists from misappropriation and false endorsement”.

If a line isn’t drawn, what is to stop a brand from creating a deepfake of an artist wearing or using its products and presenting it as real?


There have already been deepfake ads: some where the celebrity didn’t know about it, and others, like the award-winning Mondelez campaign with Shah Rukh Khan, in which the Bollywood star allowed his likeness to be used by small businesses.


It may seem small or funny, but deepfakes can have serious consequences for brands and celebrities: immediate reputational damage, legal issues (such as IP infringement), and loss of public trust. And with 47% of the UK public using social media for news, a single tweet or Instagram post can leave a lasting impression.


This becomes even more serious when the upcoming US election (and the next UK general election) is considered: it is now easier than ever to create fake ‘leaked’ clips of candidates doing or saying damaging things, and the onus falls on them to prove they didn’t. This happened to London mayor Sadiq Khan earlier this year.


The problem is that generative AI is evolving faster than the law, and for now it is down to the public and companies to spot fakes themselves. We’re in uncharted territory, and reliably authenticating content is difficult – which is a worry given that 39% of the public say authenticity is what matters most to them in a brand.


Some platforms, like Meta and YouTube, are attempting to put a defensive strategy in place by asking that AI-generated images be labelled on social media…but will people uploading images specifically to deceive bother to tag them? Unlikely. Education and awareness will be key, alongside the development of detection tools that can spot and flag potential deepfakes, but this will take time.

That’s not to say that all AI or deepfakes are bad: Mondelez’s campaign proves that, and artist FKA Twigs has said she is using the technology herself to help handle her social media and engage with fans.


The Met Gala pictures may have been innocent fun, but the response showed how easily these images can take people in: if Katy Perry’s own mum can be fooled, what chance do the rest of us have?!

