Futurists believe that within the next 50 years, human-robot interaction will become a necessity to carry on with our daily lives.
However, to achieve that kind of integration, robots will need to process human behavior as quickly as they would a math equation in order to be accepted by their human counterparts.
According to reports, researchers might have managed to do just that.
Here's all about it.
Understanding human behavior
Researchers at Brown University and Tufts University in the US have created a machine-learning algorithm that enables robots to mimic human social behavior patterns while interacting with a human counterpart.
The researchers have devised a cognitive-computational model of human norms that can be embedded into a robot, enabling it to respond with appropriate behavior derived from available human data.
Building the trust
This project, funded by the US Defense Advanced Research Projects Agency (DARPA), aims to make robots more human-like, rather than cold, highly functional machines that many people shun for their aloofness.
Since robots are being positioned as worthy collaborators for human partners, this breakthrough is an important milestone toward building the trust that such collaboration requires.
Sooner than you think
Reza Ghanadan, a DARPA program manager, said the idea behind this research was to pinpoint human normative systems and to understand how they influence human behavior in particular situations.
Once that's done, these norms could be taught to next-generation AI machines so that they can interact with humans without hiccups and be successfully integrated into our day-to-day lives.
There is a lot to be achieved
This will not be an easy feat, as human behavior and responses are multilayered; we respond to each situation differently depending on context and mood.
However, a research framework for understanding and mimicking complex human behavior will hopefully bear fruit in the near future. We may soon have a robot that knows what it should say and what it shouldn't.