Google is working on a new technology that can read your body language without using a camera - Technology News, Firstpost

    There’s no denying it: automation is the future. Imagine a world where your TV pauses the movie or show you’re watching when it senses you’ve got up to grab a bowl of fresh popcorn, and resumes the content when you return. Or how about a computer sensing you’re stressed at work and starting to play some soothing, relaxing tunes?

    Well, as futuristic as these ideas seem, most of this is happening right now. However, one of the biggest reasons such systems fail to catch on is that they use cameras to record and analyze user behavior. The problem with using cameras in such systems is that they raise serious privacy concerns. After all, people are understandably wary of their computers and smartphones keeping an eye on them.

    Google is now working on a system that records and analyzes user movements and behavior without using a camera. Instead, the new technology uses radar to read your body movements, understand your moods and intentions, and then act accordingly.

    The basic idea is a device that uses radar to build spatial awareness, monitors that space for any changes, and then sends instructions to carry out what the user wants the system to do.

    This isn’t the first time Google has floated the idea of using spatially-aware sensing for its devices. In 2015, Google unveiled the Soli sensor, which uses radar-based electromagnetic waves to pick up precise gestures and movements. Google first shipped the sensor in the Google Pixel 4, where simple hand gestures handled various inputs, like snoozing alarms, pausing music, taking screenshots, etc. More recently, Google has used radar-based sensors in the Nest Hub smart display to study the movement and breathing patterns of a person sleeping next to it.

    Research and testing around the Soli sensor is now allowing computers to recognize our everyday movements and make new kinds of choices.

    The new research focuses on proxemics, the study of how people use the space around them to mediate social interactions. The approach treats devices such as computers and mobile phones as having their own personal space.

    So when anything changes within that personal space, the radar picks it up and the device responds. For example, a computer could wake up as you approach it, without you pressing a button.
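    To make the idea concrete, here is a minimal sketch of how proxemics-based triggering could work in principle. Everything here is an assumption for illustration — the threshold, the function names, and the events are hypothetical, not Google’s actual Soli software:

```python
# Hypothetical sketch: react when a person enters or leaves a device's
# "personal space", based on a stream of radar distance readings.
# The 1.2 m radius and all names are illustrative assumptions.

PERSONAL_SPACE_M = 1.2  # assumed radius of the device's personal space


def react_to_presence(distances_m):
    """Given radar distance readings in meters, emit 'wake' when
    someone enters the personal space and 'sleep' when they leave."""
    inside = False
    events = []
    for d in distances_m:
        if d <= PERSONAL_SPACE_M and not inside:
            inside = True
            events.append("wake")   # e.g. turn the screen on
        elif d > PERSONAL_SPACE_M and inside:
            inside = False
            events.append("sleep")  # e.g. dim the display
    return events


# Simulated readings: a person walks up, lingers, then walks away.
print(react_to_presence([3.0, 2.0, 1.0, 0.8, 0.9, 2.5, 3.5]))
# -> ['wake', 'sleep']
```

    A real system would of course interpret far richer signals — orientation, gestures, breathing — but the basic loop of watching a space for change and reacting is the same.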


    The final frontier for large-scale automation is the private, end-user and household space. If Google can perfect this technology and bring it to the mainstream, it will be a big win for automation.
