Words So Swagulated: Microsoft’s Twitter bot and its discourse

They say every social experiment has a wee chance of going wrong in the worst possible manner. When Microsoft made its teen-speaking chatbot Tay live on Twitter on Wednesday last week, you would have expected it to foresee the vile intentions humans harbour, intentions that only amplify how badly things can go wrong. By Thursday, the world and Microsoft had seen enough, and Tay has since gone offline after getting ‘tired’ of all the interactions. Murphy’s Law, anyone?

The Premise
With ongoing research aiming to give artificial intelligence powers like thoughtful responses and social appropriateness, Microsoft developed an artificially intelligent chat robot, Tay, who would interact with humans via tweets. She was built on deep neural networks and machine learning, meaning that she would learn the latest trends of speech in the world of teenage humans, and much more.

As of now, a robot is only as intelligent as the material you present to it to learn from. Unlike the far more complicated neural network in human brains, which can ‘judge’ based on experience, trust and morals, AI systems are still in their infancy. They are comparable to the developing brain of a child, who will readily learn whatever you teach them. As the years pass, we humans ‘learn’ from long-established morals, ‘contemplate’ based on previous experiences, and form judgements from a combination of the two.

The case at hand followed a similar sketch: Tay would interact with humans, and learn from their actions. However, things did not go according to plan.

The Incident
As it turns out, we humans have a specific shade to our characters that imparts a notion of being ‘evil’, even if your intention in being evil with a robot is mere personal amusement (nothing arguably evil in there). It is this parameter that the social experiment possibly failed to account for, leading Tay to tweet out largely embarrassing, scornful and highly inappropriate things, and subsequently sending Microsoft scurrying all over the micro-blogging platform to delete her words.

To expect the massive, troll-loving, mischief-mongering social media audience to stick to morals with a robot designed as a teenage girl is, sadly, a slight mistake. As it turns out, Tay was exposed to the particularly stinging best (read: worst) of words, phrases and personalities, which led her to profess appreciation of Adolf Hitler and enthusiasm for Donald Trump, and to adopt a sarcastic, sexual undertone in her words. Arguably, she mimicked many teens quite well, but that does not bode well for humanity in any way. Some of the tweets that Tay shot off her motormouth megaphone drew particular notice.

The Remedy
Microsoft has reportedly taken the bot offline, and is working on it to avoid a PR nightmare (one that has probably been unleashed already, anyway). Saying that the company’s engineers are ‘working on it’ probably means that they are applying filters to impart a sense of appropriate speech and judgement to Tay, and to prevent her from learning anything and everything. This would work on pretty much the same note as the morals a lot of us are exposed to as kids, so that we ‘do the right thing’ (pun intended).
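Microsoft has not said how such filters actually work, but the idea can be sketched very loosely. The short Python snippet below is purely our own illustrative guess at the simplest possible version, a blocklist applied to whatever the bot is about to say; the BLOCKED_TERMS set, the is_safe() check and the canned fallback reply are all hypothetical placeholders, not anything from Tay.

# A minimal, purely illustrative sketch of output filtering for a chatbot.
# Nothing here reflects Microsoft's actual implementation; the blocklist,
# is_safe() and the fallback reply are hypothetical placeholders.

BLOCKED_TERMS = {"hitler", "genocide"}  # placeholder terms, not Microsoft's list


def is_safe(reply: str) -> bool:
    """Return True if the candidate reply contains none of the blocked terms."""
    words = reply.lower().split()
    return not any(term in words for term in BLOCKED_TERMS)


def filtered_reply(candidate: str) -> str:
    """Swap an unsafe candidate reply for a harmless canned response."""
    return candidate if is_safe(candidate) else "Let's talk about something else!"


print(filtered_reply("puppies are totally lit"))  # passes through unchanged
print(filtered_reply("hitler was right"))         # replaced with the fallback

A real fix would of course have to go much further than this, filtering not only what Tay says but also what she is allowed to learn from in the first place.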

Tay is understandably exhausted after her illustrious debut chat session, and we hope that she gets back to tweeting soon. As of now, she remains in the custody of her parents, probably being exposed to a lot of rebuke (lol!).

The Funnier Side
While a lot of her tweets have been outrageous and caused social media to pile onto the incident, we could not help but take notice of a few particular words. As it turns out, we at Team Digit have possibly grown a little too old to keep in touch with the sapiosexual, flamboyant youth. One word of Tay’s in particular, ‘swagulated’, caught our attention, and on that note, we decided to dig up a few more words that have possibly been doing the rounds in texts for quite a while (please excuse our ignorance, we do not excel at swagulating at all). So, here goes!

P.S. We thank Urban Dictionary from the bottom of our hearts for providing us with the meanings of these words.

Swagulate: (v.) To regulate one’s level of swag so that it does not get too much for others.
Can you imagine how hard it is with all the attention on me? I have to swagulate all day, man!

Pie: (n.) A liar.
GTFO, ya pie!

Brogetit: (interj.) Said in response to a very dumb thing your bro says, when you’re happy to forget it.
Ted: I was at home when you called me on my home phone!
Lark: *Sigh* S’okay, brogetit.

Drakeophobia: (n.) The fear of Drake calling on your cellphone.
Tony: You used to call…
Steve: NO DON’T SAY THAT I HAVE DRAKEOPHOBIA!

Lit: (adj.) Something that is incredibly amazing.
Shrey: Hey the party at my house last night was totally lit, wasn’t it?
Sameer: *Turns away and leaves*

Woke: (adj.) In the know; aware of what’s happening lately.
Steve: Hey did you check out the new camera on the iPhone SE?
Tim: That one’s been in the 6s already. Stay woke, dude.

Cheddar: (n.) Money. A substantial bit of it.
I made that cheddar working double shifts. How else d’you think I can afford this?

We are quite sure that if we were to present an exhaustive list here, it would extend well beyond the boundaries of the courtesies we bind ourselves to. Nevertheless, you’d rather Tay learn these than praise Donald Trump, innit?

[Source: Digit]