
Princeton researchers Parastoo Abtahi and Mohamed Kari are pushing VR beyond visuals, melding mixed reality with physical robotics so that interactions feel like magic. Imagine wearing a headset, selecting a virtual drink or popcorn with a gesture, and having a robot quietly deliver the real item: wordlessly, invisibly, and right into your view. That's the core of their work, as described in this article from Princeton University.
Here’s the thing: making the robot invisible and interactions intuitive isn’t gimmicky—it’s vital. They strip away the robot’s presence via clever rendering and interaction design, so the tech disappears, and human–computer interaction feels natural.
Gestures do the heavy lifting. With a simple hand movement, you can select objects across the room. Those gestures convert into robot commands. The robot even wears its own mixed-reality headset, letting it understand virtual coordinates and place objects precisely.
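The article doesn't detail the math, but the key idea of a shared coordinate system between the two headsets can be sketched as a frame transformation: a point the user selects in their own headset frame is mapped through the common world frame into the robot's frame. All function names and poses below are hypothetical illustrations, not the researchers' actual code.

```python
import numpy as np

def pose_to_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def user_point_to_robot_frame(p_user, T_world_user, T_world_robot):
    """Map a gesture-selected point from the user's headset frame
    into the robot's frame via the shared world frame."""
    p_h = np.append(p_user, 1.0)                       # homogeneous coordinates
    p_world = T_world_user @ p_h                       # user frame -> world frame
    p_robot = np.linalg.inv(T_world_robot) @ p_world   # world frame -> robot frame
    return p_robot[:3]

# Toy setup: the user's headset sits at the world origin; the robot is
# 2 m along world x, rotated 180 degrees about the vertical (z) axis.
Rz180 = np.array([[-1.0, 0.0, 0.0],
                  [ 0.0, -1.0, 0.0],
                  [ 0.0,  0.0, 1.0]])
T_world_user = pose_to_matrix(np.eye(3), np.zeros(3))
T_world_robot = pose_to_matrix(Rz180, np.array([2.0, 0.0, 0.0]))

# A point 1 m in front of the user, expressed in the robot's frame:
target = user_point_to_robot_frame(np.array([1.0, 0.0, 0.0]),
                                   T_world_user, T_world_robot)
```

Once the target lands in the robot's frame, it can feed an ordinary pick-and-place planner; the mixed-reality layer only supplies the "where."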
They also tackled scene realism. Using 3D Gaussian splatting, they digitize physical spaces so virtual objects blend seamlessly, and physical objects (such as the moving robot) vanish from your view. Conversely, a virtual bee might appear and carry chips to you, as shown in the team's video on Vimeo.
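Making the robot vanish is a diminished-reality trick: because the room has already been reconstructed, the system can paint the pre-scanned background over the pixels the robot occupies. A minimal sketch of that compositing step, using a toy grayscale frame and an assumed robot mask (the real pipeline renders the Gaussian-splat reconstruction, not a static image):

```python
import numpy as np

def erase_robot(live_frame: np.ndarray, background: np.ndarray,
                robot_mask: np.ndarray) -> np.ndarray:
    """Composite the pre-scanned background over the pixels where the
    robot was detected, so the robot disappears from the user's view."""
    out = live_frame.copy()
    out[robot_mask] = background[robot_mask]  # boolean-mask overwrite
    return out

# Toy 4x4 grayscale example: the scanned background is uniform (value 10),
# and the live frame shows a "robot" (value 99) in the upper-left 2x2 block.
background = np.full((4, 4), 10, dtype=np.uint8)
live = background.copy()
live[:2, :2] = 99
mask = live == 99  # stand-in for a real robot-segmentation mask

clean = erase_robot(live, background, mask)
```

In practice the mask would come from the robot's known pose and geometry rather than pixel values, but the compositing logic is the same.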
Right now, full-room scanning is tedious, and they acknowledge it. Automating that scan, perhaps with a robot, is the clear next step.
Their research will be presented at the upcoming ACM Symposium on User Interface Software and Technology (UIST) in Busan, South Korea.