If you've ever seen the Disney Pixar movie "Wall-E," you may have noticed feelings brewing for its robot protagonist over the film's two hours. Throughout his struggles, triumphs, moments of uncertainty, flickers of love and messages of hope, we as viewers may have felt compassion, empathy, concern, even love, for Wall-E. And from what we could tell, those feelings were shared by the garbage-compacting bot himself, especially in scenes with his romantic interest, Eve. So what's so unusual about this? Wall-E is a robot, and robots aren't designed to feel the same emotions we do. Or are they? More and more forms of artificial intelligence (AI) are beginning to mimic facial expressions and body language, as well as detect those of others, creating the illusion that these robots are contextually processing the social cues we give them. However, some studies and theories hold that a robot can never break down, analyze and ingest information the way people do when we exercise critical thinking and consciousness. But is that true?
Not only are advanced forms of AI mirroring bodily cues; designers are also creating prototypes that imitate a person's physicality. Facial features, hair, clothing, height and movement can all serve to blur the line between a heartbeat and a power switch. Will robots ever develop emotional intelligence, or is that a possibility we've already ruled out? Learn more from this playlist and decide for yourself.