R/GA finds 'sweet spot' for AI Christmas cards after bizarre beginnings

R/GA London's Christmas card project demonstrates that AI and people can work together creatively, and that machine learning needs 'just the right amount of wrong' and a highly selective use of data.

'Cookies and cake greetings': copy was produced by AI

Many say "it’s the thought that counts" at Christmas time, but what happens if that festive sentiment is created by artificial intelligence?

R/GA London has explored what Christmas cards would look like if the greeting message was written by a program powered by machine-learning – and the results are pretty strange.

After feeding the AI with hundreds of songs, greetings and sayings that have become familiar during the festive season, HappyHolid.AI returned its own messages, such as "Don’t get the tinsel" and "Christmas waves a Nutcracker".

The 20 cards will be sold online, with all profits going to Code Your Future, a coding school for refugees and disadvantaged people.  

R/GA wanted to show that AI and people can work together to open up creative possibilities, in contrast to dystopian headlines heralding the technology as the death of creativity.

The AI is powered by a recurrent neural network, which improves over time at recognising patterns in the source data as it trains.
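R/GA has not published its code, but a character-level recurrent neural network for generating text like this can be sketched as below. Everything here is illustrative: the tiny corpus stands in for the "hundreds of songs, greetings and sayings", and the network sizes, learning rate and training length are arbitrary. The structure follows the classic minimal char-RNN pattern: one-hot characters in, a tanh hidden state carried across time steps, and a softmax over the next character.

```python
import numpy as np

np.random.seed(0)

# Illustrative stand-in for the festive training corpus.
corpus = "merry christmas and a happy new year " * 20

chars = sorted(set(corpus))
vocab_size = len(chars)
char_to_ix = {c: i for i, c in enumerate(chars)}
ix_to_char = {i: c for i, c in enumerate(chars)}

hidden_size, seq_len, lr = 32, 10, 0.1

# Model parameters: input-to-hidden, hidden-to-hidden, hidden-to-output.
Wxh = np.random.randn(hidden_size, vocab_size) * 0.01
Whh = np.random.randn(hidden_size, hidden_size) * 0.01
Why = np.random.randn(vocab_size, hidden_size) * 0.01
bh = np.zeros((hidden_size, 1))
by = np.zeros((vocab_size, 1))

def loss_and_grads(inputs, targets, hprev):
    """Forward pass over one character sequence, then backprop through time."""
    xs, hs, ps = {}, {-1: hprev}, {}
    loss = 0.0
    for t, ix in enumerate(inputs):
        xs[t] = np.zeros((vocab_size, 1)); xs[t][ix] = 1  # one-hot input
        hs[t] = np.tanh(Wxh @ xs[t] + Whh @ hs[t - 1] + bh)
        y = Why @ hs[t] + by
        ps[t] = np.exp(y - y.max()); ps[t] /= ps[t].sum()  # softmax
        loss += -np.log(ps[t][targets[t], 0])
    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dbh, dby = np.zeros_like(bh), np.zeros_like(by)
    dhnext = np.zeros((hidden_size, 1))
    for t in reversed(range(len(inputs))):
        dy = ps[t].copy(); dy[targets[t]] -= 1
        dWhy += dy @ hs[t].T; dby += dy
        draw = (1 - hs[t] ** 2) * (Why.T @ dy + dhnext)  # tanh backprop
        dbh += draw; dWxh += draw @ xs[t].T; dWhh += draw @ hs[t - 1].T
        dhnext = Whh.T @ draw
    for g in (dWxh, dWhh, dWhy, dbh, dby):
        np.clip(g, -5, 5, out=g)  # guard against exploding gradients
    return loss, (dWxh, dWhh, dWhy, dbh, dby), hs[len(inputs) - 1]

def sample(h, seed_ix, n):
    """Generate n characters by feeding each sampled character back in."""
    x = np.zeros((vocab_size, 1)); x[seed_ix] = 1
    out = []
    for _ in range(n):
        h = np.tanh(Wxh @ x + Whh @ h + bh)
        y = Why @ h + by
        p = np.exp(y - y.max()); p /= p.sum()
        ix = np.random.choice(vocab_size, p=p.ravel())
        x = np.zeros((vocab_size, 1)); x[ix] = 1
        out.append(ix_to_char[ix])
    return "".join(out)

hprev = np.zeros((hidden_size, 1))
pos = 0
for step in range(200):  # a deliberately short run
    if pos + seq_len + 1 >= len(corpus):
        pos, hprev = 0, np.zeros((hidden_size, 1))
    inputs = [char_to_ix[c] for c in corpus[pos:pos + seq_len]]
    targets = [char_to_ix[c] for c in corpus[pos + 1:pos + seq_len + 1]]
    loss, grads, hprev = loss_and_grads(inputs, targets, hprev)
    for param, g in zip((Wxh, Whh, Why, bh, by), grads):
        param -= lr * g  # plain SGD for brevity
    pos += seq_len

greeting = sample(np.zeros((hidden_size, 1)), char_to_ix["m"], 40)
print(greeting)
```

Stopping training early, as in the short run above, is what produces the half-coherent output the team describes: the network has learned some letter patterns but not yet memorised the source phrases.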

The R/GA collaborators on the project are group strategy director Lachlan Williams and Max Wilke, Igor Pancaldi, Dayoung Yun, Matt Rooney, Lachlan McDonald and Jessica Briggs.

Williams told Campaign that early results produced "some really bizarre stuff", because of the way the AI interpreted modern festive carols and pop songs, such as Wham’s Last Christmas, Mariah Carey’s All I Want for Christmas is You and The Pogues’ Fairytale of New York (which contains words that would be deemed offensive out of context). 

"The AI was picking up a lot of meaning and it had a huge influence, but unfortunately produced sentences that didn’t make sense," Williams said. "We found them quite funny, but we wanted the messages to be positive.

"We found that there is a sweet spot to how long you allow the AI to run different iterations. At first, it comes out with a lot of nonsensical stuff, but if you let it go on too long, it becomes quite boring, as it gets really good at looking for patterns and connections. At a certain point, somewhere in the middle, it’s just the ‘right amount of wrong’. That’s where we begin to get stuff that you wouldn’t necessarily have thought of yourself."

The pitfalls of how AI amplifies bias have been well documented in recent times, such as when Amazon's recruitment trial produced bias against women for software developer jobs, because the historical hiring data it was trained on mostly reflected men in those roles.

Professor Steven Pinker, a leading academic in the field of cognitive psychology, told Campaign this year that AI systems developers should be prepared to sacrifice statistical accuracy in return for fairness by feeding the technology with positive biases.

The HappyHolid.AI website launches today (11 December).