The research aims to create a human-assistive robotic system capable of autonomously sensing 3D space and recognizing human gestures (commands) using stereovision and computational intelligence algorithms. Such systems are intended to assist people in the most natural way, by interpreting gesture or voice commands. The robot's 3D scanning system is designed to perceive the surrounding space and allow the robot to move without colliding with objects in its environment. Research is also underway on monitoring eye movements.
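As an illustration of the stereovision-based 3D sensing described above, the following is a minimal sketch, assuming OpenCV and a calibrated, rectified stereo camera pair; the file names, focal length, and baseline are placeholder values, not the project's actual implementation.

```python
# Minimal stereovision depth-sensing sketch (illustrative only; assumes a
# calibrated, rectified stereo pair and the OpenCV library).
import cv2
import numpy as np

# Load a rectified left/right image pair (file names are placeholders).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block-matching stereo correspondence: numDisparities must be a multiple
# of 16; blockSize is the odd-sized matching window.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Convert disparity to metric depth: depth = f * B / d, where f is the focal
# length in pixels and B the camera baseline in metres (example values).
focal_px, baseline_m = 700.0, 0.12
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = focal_px * baseline_m / disparity[valid]

# A depth map like this could feed obstacle avoidance: objects closer than a
# safety threshold would trigger a change of path.
print("Nearest point (m):", depth[valid].min() if valid.any() else float("inf"))
```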