Ready-to-use robotics development kit

The SentiBotics development kit is designed to provide a starting point for researchers and developers who want to focus on robot development. SentiBotics can also be used as an educational platform at universities.

The kit includes a mobile autonomous robot with a manipulator, a ROS-based SDK and the full source code of all algorithms used in the robot software. The robot can navigate in common environments, recognize 3D objects, and grasp and manipulate them.

The ROS-based SDK includes 3D models of the robot and a number of programming samples for these tasks. A trial of the SentiBotics SDK can be run in the Gazebo robotics simulator.

Download 30-day SDK Trial.

Object Recognition, Grasping and Delivery

The robot must learn an object's appearance in advance before it can be sent to locate, grasp and retrieve that object.

Object learning and recognition

[Image: SentiBotics object recognition]

The SentiBotics object segmentation algorithm tries to locate object candidates with certain properties (e.g. well-separable point clusters that lie on a planar support). Each candidate is compared with the learned object models, and a label is assigned if a match is found.

An object is learned simply by placing it in front of the robot and specifying the object's name. Only one object at a time should be present in the frame during the learning phase. It is recommended to enroll the object from different positions and distances.
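The recognition step described above can be sketched as a nearest-neighbor search over learned object models. This is an illustrative simplification, not the SentiBotics API: the descriptor representation, the `recognize` function and the matching threshold are all assumptions.

```python
# Illustrative sketch (not the SentiBotics SDK): matching a segmented object
# candidate against learned models by nearest-neighbor descriptor distance.
import math

def descriptor_distance(a, b):
    """Euclidean distance between two fixed-length feature descriptors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(candidate_descriptor, learned_models, threshold=0.5):
    """Return the label of the closest learned model, or None if no
    model lies within the matching threshold."""
    best_label, best_dist = None, float("inf")
    for label, model_descriptor in learned_models.items():
        d = descriptor_distance(candidate_descriptor, model_descriptor)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label if best_dist <= threshold else None

# Hypothetical descriptors learned during the enrollment phase.
models = {"cup": [0.9, 0.1, 0.4], "box": [0.2, 0.8, 0.5]}
print(recognize([0.85, 0.15, 0.42], models))  # close to "cup"
print(recognize([0.0, 0.0, 0.0], models))     # no match -> None
```

Enrolling an object from several positions and distances, as recommended above, amounts to storing several descriptors per label so that at least one is close to any later view.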

Object grasping

[Images: horizontal grasp, vertical grasp, grasping scene with occlusions]

The user can command the robot to grasp a particular object. The robot grasps the object once it has been correctly recognized.

The SentiBotics robot can automatically determine an object's orientation and arrange its manipulator in the way best suited for grasping the object, according to its position in the scene. The robot can also automatically reposition itself to perform the grasping task; for example, it can drive closer and/or turn to reach an optimal position for picking up the object.

Path planning for the robot's manipulator is performed automatically to avoid obstacles that might lie between the recognized object and the manipulator. Grasping is performed by closing the gripper while measuring the gripper's position and the force feedback of the finger servo motor. The grasp is considered successful if the gripper is not fully closed and the measured force is large enough.
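The success check just described is simple enough to sketch directly. The threshold values below are assumed placeholders; the actual values used by the SDK are not documented on this page.

```python
# Minimal sketch of the grasp success check: the gripper did not close
# completely (something is between the fingers) and the finger servo
# reports enough force (the fingers are pressing against it).
def grasp_succeeded(gripper_position, finger_force,
                    fully_closed_position=0.0, min_force=0.3):
    not_fully_closed = gripper_position > fully_closed_position
    enough_force = finger_force >= min_force
    return not_fully_closed and enough_force

print(grasp_succeeded(0.02, 0.5))   # object held -> True
print(grasp_succeeded(0.0, 0.05))   # closed on nothing -> False
```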

The robot can perform both vertical and horizontal grasps. The grasp type is determined from the recognized object's point cloud.
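One plausible way to derive the grasp type from a point cloud is to compare the object's bounding-box height to its horizontal footprint. The rule and the example clouds below are illustrative assumptions, not the SDK's actual criterion.

```python
# Hypothetical heuristic: fit an axis-aligned bounding box to the object's
# point cloud and pick the grasp direction from its proportions.
def choose_grasp(points):
    """points: iterable of (x, y, z) tuples from the object's point cloud."""
    xs, ys, zs = zip(*points)
    footprint = max(max(xs) - min(xs), max(ys) - min(ys))
    height = max(zs) - min(zs)
    # Low, flat objects are approached from above; taller ones from the side.
    return "vertical" if height < footprint else "horizontal"

flat_box = [(0, 0, 0), (0.2, 0.15, 0), (0.2, 0, 0.05), (0, 0.15, 0.05)]
bottle = [(0, 0, 0), (0.05, 0.05, 0), (0.05, 0, 0.25), (0, 0.05, 0.25)]
print(choose_grasp(flat_box))  # -> vertical
print(choose_grasp(bottle))    # -> horizontal
```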

SentiBotics object grasping and manipulation have some requirements and constraints, which are described below.

Requirements for grasping scene

The grasping scene should satisfy certain conditions:

  • The grasping scene should be static.
  • The planar support for the objects should be made from a non-reflecting, non-transparent material (glass or marble-like surfaces, for example, are not suitable).

Requirements for the objects on the grasping scene

The objects on the grasping scene should also satisfy certain conditions:

  • The objects should be positioned:
    • on a sufficiently large planar support (e.g. the ground, or a table-like platform);
    • in front of the robot, within 1 m (3'3") of the short-range camera;
    • not too close to each other, so that they are easily separable as point clusters.
  • The objects' shape should be approximable by an oriented bounding box.
  • The objects should be small enough to fit in the gripper.
  • The objects should require no special grasps (e.g. it is currently impossible to grasp a cup by its handle).
  • The objects should be made from non-reflecting and non-transparent material.
  • Each object can weigh up to 0.5 kg (1.1 lbs). Lighter objects are recommended to extend the service life of the manipulator's servo motors.
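The numeric constraints above can be collected into a simple pre-grasp feasibility check. The distance and weight limits come from the text; the gripper opening is an assumed placeholder value.

```python
# Sketch of a feasibility check mirroring the listed constraints.
def grasp_feasible(distance_m, weight_kg, object_width_m,
                   max_distance_m=1.0, max_weight_kg=0.5,
                   gripper_opening_m=0.08):
    """Return a list of violated constraints (empty list = feasible)."""
    problems = []
    if distance_m > max_distance_m:
        problems.append("object farther than 1 m from the short-range camera")
    if weight_kg > max_weight_kg:
        problems.append("object heavier than 0.5 kg")
    if object_width_m > gripper_opening_m:
        problems.append("object too wide for the gripper")
    return problems

print(grasp_feasible(0.7, 0.3, 0.05))  # [] -> feasible
print(grasp_feasible(1.5, 0.6, 0.05))  # two violations
```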

Object delivery

SentiBotics autonomous object delivery combines the autonomous navigation, object recognition and object grasping functionalities. The robot performs the following sequence of actions after receiving a delivery command:

  1. The robot navigates through its previously mapped locations until it reaches the location where the specified object was recognized.
  2. The robot tries to recognize the assigned object directly, repositioning itself until recognition succeeds and grasping is possible.
  3. The object is grasped with the robotic arm and placed into the attached box.
  4. The robot delivers the object to the location where the delivery command was issued.
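The four-step delivery sequence above can be sketched as a sequential routine. All robot methods here are hypothetical stand-ins for the ROS-based SDK calls, which are not shown on this page; a stub robot is included only to make the sketch runnable.

```python
# Illustrative delivery sequence (not the SentiBotics SDK API).
def deliver(robot, object_label):
    # 1. Navigate through mapped locations to where the object was last seen.
    robot.navigate_to(robot.last_seen_location(object_label))
    # 2. Reposition until the object is recognized and grasping is possible.
    while not robot.recognize(object_label):
        robot.reposition()
    # 3. Grasp the object and place it into the attached box.
    robot.grasp(object_label)
    robot.place_in_box()
    # 4. Return to where the delivery command was issued.
    robot.navigate_to(robot.command_origin())

class StubRobot:
    """Minimal stand-in that records the actions taken."""
    def __init__(self):
        self.log = []
        self._attempts = 0
    def last_seen_location(self, label): return f"loc({label})"
    def navigate_to(self, loc): self.log.append(f"navigate:{loc}")
    def recognize(self, label):
        self._attempts += 1
        return self._attempts > 1       # recognized on the second attempt
    def reposition(self): self.log.append("reposition")
    def grasp(self, label): self.log.append(f"grasp:{label}")
    def place_in_box(self): self.log.append("place_in_box")
    def command_origin(self): return "home"

robot = StubRobot()
deliver(robot, "cup")
print(robot.log)
```

In the real system each step would be a ROS action with its own failure handling, but the control flow follows this shape.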
Copyright © 1998 - 2021 Neurotechnology