UC Berkeley students explore drone command with HoloLens

Michael Cottuli

Microsoft's HoloLens

Technology has been making leaps and bounds recently, but we haven't heard much about everybody's favorite corner of science fiction: robots. Dr. Allen Yang at UC Berkeley's Center for Augmented Cognition has plans for drone and robot operation that might bring our mechanical sidekicks back into the limelight.

A post from the Next Reality blog published today showed off the work that Yang and his team have been doing: a project that aims to bring an entirely new user interface to drone and robot operation. The Immersive Semi-Autonomous Aerial Command System (ISAACS) uses virtual reality rigs to put users in the driver's seat when operating a drone, expanding the audience that might use one.

The goal is to make sure that robots can work for ordinary people, those who might not necessarily have a computer science or electronic engineering background.

While public availability of ISAACS is still quite a ways off, you can get your hands on some of Berkeley's work through OpenARK, an open-source AR SDK that should "allow you to rapidly prototype AR applications." You can check that out here, and take a peek at the development of ISAACS through its home page here.