Can robots acquire and learn language? Researchers at the University of Delaware believe they can. The language the robots will use to communicate with each other is based on a set of actions "the robots perform in a certain order," which would allow "robots to navigate tough tasks in small groups."
What are the implications if we can engineer something this complex? For starters, it would give companies the option of owning their labor in addition to their capital. What does that mean for humans? With minimal supervision, robots could essentially take over many of the work-related tasks humans already perform.
Remember Watson, the IBM computer that's good at playing Jeopardy!? Well, Watson is also being programmed to understand and answer medical questions. So when you have a question about a particular ailment, why bother asking a doctor when Watson can answer quickly and with a high level of accuracy? Sure, humans will still be available for quality control, but how many do you really need when robots and machines can already perform much of the work?
Imagine a set of robots performing surgery on a human being. I know we're years and years away from anything that elaborate, but it's worth thinking about as you contemplate whether the robots should "speak" with a British or an American accent.