A.I. & Robotics

Research & Development in AI and Robotics

Besides biometrics, Neurotechnology's current fields of research include artificial intelligence (AI) and mobile autonomous robots. The company's AI division was founded in 2004. The project's scope is the development of mobile autonomous robot technology capable of performing simple tasks in a real-world environment.

First mobile robot prototype

In 2004 the first mobile robot was built for autonomous navigation tests. The robot included a wheeled chassis, a vision system for landmark detection, a wireless connection to a regular PC, and PC-based neural network software for robot control. Although this robot was partially able to navigate in the office, it had some major limitations due to its chassis construction and vision system.

Second mobile robot prototype

In 2005 the next version of the mobile robot was built. It featured a bigger four-wheel-drive chassis and used improved turning and vision systems. Like the previous robot, it was connected to a regular PC, where a neural-network-based control system was implemented.

The video clip shows a sample of the robot's navigation in a regular office environment:

  1. The robot starts near a chessboard; the goal is set to a flowerpot. The robot cannot see the flowerpot from its initial position; however, it stores a cognitive map of the environment in memory, so it is able to move properly toward the goal.
  2. After this, the goal is set to go back from the flowerpot to the chessboard.
  3. Finally, the goal is set to the video camera's position. Here you can see details of the robot's construction.

Download the robot's navigation sample video
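The goal-seeking behavior shown in the steps above — reaching a target that is not visible from the start position by consulting a stored map — can be illustrated with a minimal sketch. The robot's actual controller is a neural network; the breadth-first search below, over a hypothetical grid "cognitive map", is only meant to show the idea of planning a route from a stored map:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a stored occupancy grid ("cognitive map").

    grid: list of strings, '#' = obstacle, '.' = free space.
    Returns the list of (row, col) cells from start to goal, or None
    if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the parent chain back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and (nr, nc) not in parents):
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# A toy office map: chessboard at (0, 0), flowerpot at (4, 4),
# with walls blocking the direct line of sight between them.
office = [
    ".....",
    ".###.",
    ".....",
    ".###.",
    ".....",
]
path = plan_path(office, (0, 0), (4, 4))
print(len(path) - 1)  # -> 8 moves along the planned route
```

Even though the goal cell is "invisible" behind the walls, the stored map is enough to produce a valid route — the same principle the robot applies at a much larger scale.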

SentiBotics Development Kit

In 2014 Neurotechnology released SentiBotics, a mobile robotics development kit. SentiBotics enables the rapid development and testing of mobile robots and comes with software, sample programs, a tracked platform with a grasping robotic arm, 3D vision, and object recognition and autonomous navigation capabilities.

The kit includes a mobile robotic platform with a 3D vision system, a modular robotic arm and accompanying Neurotechnology-developed, ROS-based software with complete source code. The programming samples included in the kit demonstrate a number of capabilities, including how to teach the robot to recognize and grasp objects, and how to have the robot build a map of the local environment that it can then use for autonomous navigation.
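The map-then-navigate workflow described above can be illustrated with a minimal, self-contained sketch. This is not the SentiBotics or ROS API — just a generic occupancy-grid update in which simulated range readings (as a laser or depth sensor would report) are converted into occupied cells of a map that a planner could later use:

```python
import math

def update_map(grid, pose, readings, cell_size=0.25):
    """Mark obstacle cells in a square occupancy grid from range readings.

    pose: (x, y) position of the robot in metres.
    readings: list of (angle_radians, distance_metres) pairs.
    Cells hold 0 (unknown/free) or 1 (occupied); out-of-map hits are ignored.
    """
    n = len(grid)
    x, y = pose
    for angle, dist in readings:
        hx = x + dist * math.cos(angle)   # world coordinates of the hit
        hy = y + dist * math.sin(angle)
        col, row = int(hx / cell_size), int(hy / cell_size)
        if 0 <= row < n and 0 <= col < n:
            grid[row][col] = 1
    return grid

# A 2 m x 2 m area at 0.25 m resolution; the robot at the centre sees a
# wall 0.6 m ahead (angle 0) and another obstacle 0.4 m to its left.
grid = [[0] * 8 for _ in range(8)]
update_map(grid, (1.0, 1.0), [(0.0, 0.6), (math.pi / 2, 0.4)])
print(sum(map(sum, grid)))  # -> 2 cells marked occupied
```

A real system would also trace the free cells along each beam and fuse many noisy scans probabilistically, but the core idea — accumulating sensor hits into a grid the robot can later plan over — is the same.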

Robotic vision

In 2007 Neurotechnology released the SentiSight SDK for developing PC-based object recognition systems. To date, several major versions of the SDK have been released. The underlying technology enables learning objects and searching for learned objects in images from almost any camera or webcam, in still pictures or in live video.
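SentiSight's recognition algorithms are proprietary, but the general idea of learning an object's appearance and then searching for it in an image can be illustrated with a toy sum-of-absolute-differences template search. The function and the tiny grayscale arrays below are hypothetical examples, not the SDK's API:

```python
def best_match(image, template):
    """Slide a learned template over a grayscale image (lists of rows)
    and return the (row, col) offset with the smallest sum of absolute
    differences -- a toy stand-in for appearance-based object search."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_score, best_pos = None, None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            score = sum(
                abs(image[r + i][c + j] - template[i][j])
                for i in range(th) for j in range(tw)
            )
            if best_score is None or score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# "Learn" a 2x2 bright patch, then locate it inside a larger frame.
template = [[200, 200],
            [200, 200]]
frame = [[0, 0, 0,   0],
         [0, 0, 200, 200],
         [0, 0, 200, 200],
         [0, 0, 0,   0]]
print(best_match(frame, template))  # -> (1, 2)
```

Production systems use far more robust features that tolerate scale, rotation and lighting changes, which is what separates a toy matcher like this from a commercial SDK.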

In 2012 the SentiSight Embedded SDK was released. SentiSight Embedded brings all the advantages of the PC-based SentiSight technology to smartphones and tablets, and it is fully compatible with the PC-side SentiSight SDK.

Mechanical manipulator

A mechanical manipulator has been developed and produced by our partner Neuromechanica. The manipulator features:

  • Rigid aluminium construction with six degrees of freedom. The carrying capacity is 0.7 kg (1.5 lbs) with the manipulator outstretched in the horizontal position.
  • The total weight of the manipulator is 5 kg (11 lbs).
  • The height of the vertically stretched manipulator is 780 mm (2'7").
  • All joints can flex to a 180-degree angle.
  • The manipulator's "fingers" move along paths parallel to each other; a pressure sensor is mounted on them.
  • The "fingers" can spread up to 100 mm (3.9").
  • The manipulator uses 20 W (50 W peak) of electric power at 10-16 V.
  • DC motors with PID controllers are used for control.
  • Control is performed via a USB port using a dedicated protocol.
  • Feedback is provided to the controlling device (PC).
  • The longest movement path takes 3 seconds.
  • Integrated electronics.
  • The manipulator's movements and operations are programmed using an API.

Download manipulator demonstration videos

Note that the manipulator in the sample videos was controlled by preprogrammed sequences of actions.
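The joint-level control mentioned in the feature list — DC motors driven by PID controllers — can be sketched as follows. This is a minimal illustration with made-up gains and a toy first-order joint model, not Neuromechanica's actual controller:

```python
def pid_step(setpoint, measured, state, kp, ki, kd, dt):
    """One discrete PID update. state carries (integral, previous_error)."""
    integral, prev_error = state
    error = setpoint - measured
    integral += error * dt
    derivative = (error - prev_error) / dt
    output = kp * error + ki * integral + kd * derivative
    return output, (integral, error)

# Drive a toy velocity-controlled joint toward a 90-degree setpoint.
# The gains and the joint model are illustrative only.
angle, state, dt = 0.0, (0.0, 0.0), 0.01
for _ in range(2000):
    command, state = pid_step(90.0, angle, state,
                              kp=4.0, ki=1.0, kd=0.05, dt=dt)
    angle += command * dt  # the command acts as a joint velocity
print(round(angle, 1))  # settles close to the 90-degree setpoint
```

The proportional term does most of the work here; the integral term removes residual offset and the derivative term damps the approach, which is why PID remains the standard choice for DC-motor joint control.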

Cooperation and employment opportunities

We would be interested in cooperating with companies and research groups working on similar problems.

We also have job openings for artificial intelligence and autonomous robotics developers to take part in this project. Please contact us for more information.

Please also review our job openings in the biometrics field.

Contact information

Laisves pr. 125A,
Vilnius, LT-06118, Lithuania

Phone: +370 5 277 3315
Fax: +370 5 277 3316
