Sector: Healthcare
Status: Idea

The application consists of two parts that communicate with each other over a secure network: a vision processing unit installed on the slave robot side, which reports information to a haptic force computation unit installed on the master side.
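As an illustration of the exchange described above, the sketch below shows one way the vision processing unit might report proximity information to the haptic force computation unit over a TLS-secured TCP connection. The message fields, the length-prefixed framing, and the function names are illustrative assumptions, not the project's actual protocol.

```python
# Hypothetical sketch of the slave-to-master report, assuming a TLS-secured
# TCP link and a minimal JSON proximity message (all names are illustrative).
import json
import socket
import ssl

def send_proximity_report(host: str, port: int, report: dict) -> None:
    """Send one vision-unit report to the haptic force computation unit."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            payload = json.dumps(report).encode("utf-8")
            # Length-prefixed frame so the receiver knows where the message ends.
            tls_sock.sendall(len(payload).to_bytes(4, "big") + payload)

# Example of the kind of per-frame report the vision unit might produce
# (field names and units are assumptions for illustration).
example_report = {
    "closest_point": [0.012, -0.034, 0.150],   # meters, slave frame
    "distance_to_protected_region": 0.004,     # meters
    "timestamp": 1700000000.123,               # seconds
}
```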

The goal of the application is to provide haptic force feedback to the surgeon operating the robot when the remote surgical instrument approaches a protected region during surgery.
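The text does not specify the force law, but one common way to render this kind of proximity feedback is a spring-like repulsive force that grows as the instrument tip moves inside a safety margin around the protected region. The sketch below illustrates that idea under those assumptions; the stiffness, margin, and function names are placeholders, not the project's published algorithm.

```python
# Illustrative spring-like repulsive force for proximity feedback.
import numpy as np

def repulsive_force(tip: np.ndarray,
                    closest_point: np.ndarray,
                    safety_margin: float = 0.005,   # meters (assumed)
                    stiffness: float = 200.0        # N/m (assumed)
                    ) -> np.ndarray:
    """Return a 3D force pushing the instrument tip away from the protected region."""
    offset = tip - closest_point
    distance = float(np.linalg.norm(offset))
    if distance >= safety_margin or distance == 0.0:
        return np.zeros(3)                  # outside the margin: no feedback
    direction = offset / distance           # unit vector away from the region
    penetration = safety_margin - distance  # how far inside the margin we are
    return stiffness * penetration * direction
```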

The slave robot side software is installed on a CPU unit, which is responsible for collecting data from multiple RGB-D sensors, and on a GPU unit that generates a point-cloud description of the scanned surgical field in real time. The master robot side software is installed on a simulated surgeon's console (for example, a PC) that controls the robot hardware; it runs the haptic force rendering algorithm to compute the haptic force that is sent to the haptic device at 1000 Hz.
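To make the 1000 Hz requirement concrete, the sketch below shows a fixed-rate haptic rendering loop on the master side. It assumes the latest proximity report from the vision unit is available in shared state, reuses the hypothetical repulsive_force sketch above, and uses placeholder device calls (get_tip_position, set_force) standing in for whatever haptic-device SDK the console actually uses.

```python
# Sketch of a master-side 1 kHz haptic rendering loop (names are assumptions).
import time
import numpy as np

HAPTIC_RATE_HZ = 1000.0
PERIOD = 1.0 / HAPTIC_RATE_HZ

def haptic_loop(device, shared_state, stop_event):
    """Run the fixed-rate force rendering loop until stop_event is set."""
    next_tick = time.perf_counter()
    while not stop_event.is_set():
        tip = np.asarray(device.get_tip_position())          # meters, master frame (placeholder API)
        closest = np.asarray(shared_state["closest_point"])  # latest report from the vision unit
        force = repulsive_force(tip, closest)                 # see the sketch above
        device.set_force(force)                               # command the haptic device (placeholder API)
        # Sleep until the next 1 ms tick to hold the 1000 Hz update rate.
        next_tick += PERIOD
        time.sleep(max(0.0, next_tick - time.perf_counter()))
```

Keeping the force computation inside this fixed-rate loop, rather than at the slower vision frame rate, is what allows the device to receive a fresh force command every millisecond even when point-cloud updates arrive less frequently.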

Acknowledgments: This work was supported by NSF under Award Number CNS 15-45069 and by a grant through the JUMP ARCHES (Applied Research for Community Health through Engineering and Simulation) program for addressing safety and reliability of surgical robots.

Team Information:
Thenkurussi (Kesh) Kesavadas, [email protected], (217) 244-9341
Xiao Li, [email protected], (412) 706-3374
