Controllers, haptic devices, and gesture controls have dominated the VR and AR industries as inputs for their respective content. Boston-based Neurable plans to disrupt that. Earlier this summer, Neurable unveiled its brain-computer interface for virtual reality. This unprecedented technology allows developers to immerse their users by letting them control objects in the virtual space with their thoughts.
Brain-controlled input is an intuitive way for users to interact with digital objects tailored to their own thoughts, opening up the potential for user-generated storytelling. Neurable’s brain-computer interface encourages a hands-free experience that points to a future of virtual experiences that do not limit the user. This new approach to the user interface also makes content accessible to users with physical disabilities – an area of user engagement that has not been fully explored.
Every major computing medium has had its killer interaction: the personal computer had the mouse and keyboard; the smartphone, the touch interface. Mixed reality, the next major computing medium, requires brain-computer interfaces (BCIs) to reach its full potential. Neurable’s current BCI features a hands-free, silent, and unrestrained UX/UI that enables mental selections.
We are working on creating the operating system of the future, a world without limitations from technology that works with the user, as opposed to by the user.
Dr. Ramses Alcaide
Dr. Ramses Alcaide will highlight the power of brain-computer interfaces and how the technology could change the accessibility of virtual reality applications. Neurable will accept requests for private demos of its brain-computer interface on October 26th from registered attendees. Email firstname.lastname@example.org if you are registered and would like more information on requesting a private demo.
For more information on VRS 2017 registration, visit here.