By Simon Moss, Business Development Director
We all had a chuckle in the Whiteoaks office at the misfortune of Microsoft’s AI chatbot, Tay, which was corrupted by the “yoof of today” and became a Holocaust-denying racist within a day of being let loose on the world.
Like so many sci-fi films gone wrong, Tay was meant to be beautiful: a bot that would respond to users’ queries and emulate the casual, jokey speech patterns of a stereotypical millennial based on what users tweeted to it. Its aim was “to experiment with and conduct research on conversational understanding”, with Tay able to learn from conversations and get progressively smarter.
While I’m still (just) in my 20s, I feel I speak for this generation when I denounce the activities of a small section of my peers, however humorous it may have been that Tay could be so easily corrupted.
What hope does Facebook Messenger have, then, with plans to fill the service with bots that will attempt to sell users things based on their data? Done well with targeted data, it could be brilliant; more probably, it will be offering you discounted Flumps or nail cream (though hopefully not in my case).
The social networking site introduced its new chatbot platform at its F8 conference, saying it could allow people to talk to artificially intelligent machines, but at the same time send messages to people telling them to buy more.
This is all in response to concerns from social media giants like Facebook and Instagram that ads are driving people off their sites. This further step in personalisation shows the increasing confidence companies have in AI when dealing with their customers.
This drive for personalisation is something we see a lot of here at Whiteoaks, given our strong heritage in the retail technology space. Imagine Dorothy Perkins sending personalised, chatty offers on jeans or trendy cardigans based on your latest Google searches, or recommending what you should wear with that tricky mustard-coloured top.
I hope that Facebook users don’t corrupt a defenceless AI in the way that Twitter’s users did. However, if they do, at least make it amusing…