The Humanoid Sensing and Perception group is part of the Istituto Italiano di Tecnologia (IIT).
Our group studies algorithms and technologies that allow robots to sense the environment and react appropriately. Our strategy is to leverage the robots' capability to learn under human guidance or from interaction with the environment, drawing on multiple sources of information (e.g. proprioception, vision, touch, and audition). We work on visual and tactile perception for robot navigation and object manipulation. We also develop software tools and study software integration methodologies for the development of complex behaviors.
Our platforms are the iCub and R1 humanoid robots; we focus on applications in the domain of service robotics.
Check out our page on IIT's website.
Whenever possible, we make the code related to our research available to the community under open-source licenses.
Bookmarks:
- ⚙️ On-line detection and segmentation: the On-line Object Detection and Instance Segmentation project.
- ⚙️ ROFT: Real-time Optical Flow-aided 6D Object Pose and Velocity Tracking.
- ⚙️ MASK-UKF: Instance Segmentation Aided 6D Object Pose and Velocity Tracking using an Unscented Kalman Filter.
- ⚙️ Fast-YCB Dataset: an annotated dataset for 6D object tracking with fast-moving YCB objects.
- ⚙️ Digit simulator for Gazebo: a tentative C++ wrapper for the Python-based Digit tactile sensor simulation.
- ⚙️ Tracking sliding objects with tactile feedback: a differentiable Extended Kalman Filter for object tracking under a sliding regime.
- 📚 Visualization for grasp candidates: a barebones library to visualize simple manipulation environments.
- ⚙️ Robot environments for PyBullet: a Python package that collects robotic environments based on the PyBullet simulator.
- ⚙️ GRASPA Benchmark: a grasping benchmark for comparing grasp planners across different robot platforms.
- ⚙️ YARP: Yet Another Robot Platform, our middleware; check out the official documentation page (a minimal usage sketch follows this list).
- ⚙️ YCM: Extra CMake Modules for YARP and friends; check out the official documentation page.
- ⚙️ visual-tracking-control: a suite of cross-platform applications for visual tracking and visual servoing on the iCub humanoid robot platform.
- ⚙️ navigation: a collection of modules to perform 2D navigation with a YARP-based robot.
- ⚙️ Cardinal points grasping: a simple superquadric-based grasping pose generator for the iCub.
- ⚙️ Superquadric fitting: solves an optimization problem to find the superquadric that best fits a given partial point cloud (see the fitting sketch after this list).
- ⚙️ On the Fly recognition: a demo to teach the iCub to visually recognize new objects "on the fly".
- ⚙️ himrep (https://github.com/robotology/himrep): a collection of modules to extract features from images or to perform classification tasks on feature vectors.
- ⚙️ r1-grasping (https://github.com/robotology/r1-grasping): grasping on the R1 robot.
- ⚙️ point-cloud-read (https://github.com/robotology/point-cloud-read): acquires point clouds of specific objects in the scene in order to save or stream them.
- ⚙️ superquadric-grasp-demo (https://github.com/robotology/superquadric-grasp-demo): object modeling and grasping with superquadrics and visual servoing.
- ⚙️ tactile-control (https://github.com/robotology/tactile-control): improving grasp stability using tactile feedback.
- 📚 superquadric-lib: a YARP-free library for computing and visualizing the superquadric representing an object and the corresponding grasping candidates for a generic robot.
- 📚 superimpose-mesh-lib: an augmented-reality library to superimpose 3D objects on images.
- 📚 bayes-filters-lib: a recursive Bayesian estimation library.
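
Since many of the bookmarks above are YARP modules, here is a flavor of how the middleware is used: YARP connects processes through named ports registered with a name server. This is only a minimal sketch, assuming the Python bindings are installed and a `yarpserver` is already running; the port name `/demo/out` is arbitrary.

```python
import yarp

# Initialize the YARP network (requires a running yarpserver).
yarp.Network.init()

# A buffered output port carrying Bottles (YARP's generic container type).
port = yarp.BufferedPortBottle()
port.open("/demo/out")   # register the port name with the name server

bottle = port.prepare()  # obtain the next free message buffer
bottle.clear()
bottle.addString("hello from HSP")
port.write()             # non-blocking publish to all connected readers

port.close()
yarp.Network.fini()
```

A reader is attached externally, e.g. with `yarp connect /demo/out /some/reader` from the command line once a reader port exists.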
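
For the superquadric entries, the core idea is the inside-outside function F: F = 1 holds exactly on the superquadric surface, so fitting a partial point cloud reduces to driving F toward 1 at every point. Below is a minimal NumPy sketch under stated assumptions: points are already expressed in the superquadric's reference frame (the 6-DoF pose is omitted), and the function and parameter names are illustrative rather than the repositories' actual API.

```python
import numpy as np

def inside_outside(points, a1, a2, a3, e1, e2):
    """Superquadric inside-outside function (Barr, 1981).

    F < 1 inside, F = 1 on the surface, F > 1 outside. `points` is an
    (N, 3) array in the superquadric's own frame; a1..a3 are the
    semi-axes, e1/e2 the shape exponents (e1 = e2 = 1 gives an
    ellipsoid, values near 0 approach a box).
    """
    x, y, z = np.abs(points).T
    xy = (x / a1) ** (2.0 / e2) + (y / a2) ** (2.0 / e2)
    return xy ** (e2 / e1) + (z / a3) ** (2.0 / e1)

def fit_cost(points, a1, a2, a3, e1, e2):
    """Least-squares fitting cost in the style of Solina & Bajcsy (1990);
    the sqrt(a1*a2*a3) factor discourages degenerate, oversized shapes."""
    F = inside_outside(points, a1, a2, a3, e1, e2)
    return np.sum((np.sqrt(a1 * a2 * a3) * (F ** e1 - 1.0)) ** 2)
```

In practice this cost is minimized over the five shape parameters plus a 6-DoF pose with a nonlinear optimizer (e.g. Ipopt or SciPy's `minimize`).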
Here is a list (autogenerated by this GitHub Action) of all the public repositories in this organization:
Name | Description | CI status | Docker |
---|---|---|---|
GRASPA-benchmark | Repository gathering all code related to the GRASPA paper (Bottarel, Vezzani, Pattacini, Natale, IEEE RA-L, vol. 5, no. 2, pp. 836-843, April 2020). | ||
GRASPA-test | This repo contains the code for testing GRASPA 1.0 on the iCub. | ||
HannesImitation | |||
adaptive-tactile-force-control | This repository contains the code associated to the paper "Adaptive Tactile Force Control in a Parallel Gripper with Low Positioning Resolution" | ||
behavior-stack-example | |||
bt_nav2_ergocub | Behavior Tree nodes for Nav2 for the ergoCub project | ||
byogg | [ICRA 2025] Bring Your Own Grasp Generator: Leveraging Robot Grasp Generation for Prosthetic Grasping | ||
concon-chi_benchmark | Repository to host the code associated to the CVPR 2024 paper "ConCon-Chi: Concept-Context Chimera Benchmark for Personalized Vision-Language Tasks" | ||
convince_bts | |||
dekf-tactile-filtering | A differentiable Extended Kalman Filter for object tracking under sliding regime | ||
ergocub-behavior | |||
ergocub-bimanual | Two-hand control of the ergoCub robot. | ||
ergocub-cartesian-control | |||
ergocub-gaze-control | |||
ergocub-perception | |||
ergocub-realsense-pose | |||
ergocub-rpc-interfaces | This repository groups the RPC interfaces exposed by the ergoCub modules | ||
ergocub_navigation | ErgoCub Navigation Stack | ||
ergocub_suite | This repo contains all the software dependencies and instructions needed by the ergoCub robot | ||
fast-ycb | The Fast-YCB Dataset | ||
gazebo-yarp-digit-plugin | A tentative C++ wrapper for the Python-based Digit tactile sensor simulation | ||
hannes-wrist-control | [ICRA 2025] Continuous Wrist Control on the Hannes Prosthesis: a Vision-based Shared Autonomy Framework | ||
hsp-land-annotation-tool | |||
icub-bimanual | Control classes for 2-handed control of the iCub robot | ||
learn_ltl | Tool for passive learning of Linear Temporal Logic formulae | ||
manip-env-visu | Barebones library to visualize simple manipulation environments | ||
mask-ukf | Instance Segmentation Aided 6D Object Pose and Velocity Tracking using an Unscented Kalman Filter | ||
masterThesisProject-Piquet | |||
multi-tactile-6d-estimation | Experiments for 6D estimation with tactile features. | ||
mutual-gaze-classifier-demo | |||
mutual-gaze-detection | |||
online-attentive-object-detection | |||
online-detection | This repository contains the Python version of the source code for the experiments carried out for the On-line Object Detection and Instance Segmentation project. | ||
prosthetic-grasping-experiments | [IROS 2022] Grasp Pre-shape Selection by Synthetic Training: Eye-in-hand Shared Control on the Hannes Prosthesis. Code to replicate the results in our paper. | ||
prosthetic-grasping-simulation | [IROS 2022] Grasp Pre-shape Selection by Synthetic Training: Eye-in-hand Shared Control on the Hannes Prosthesis. Code for synthetic data generation. | ||
pybullet-robot-envs | A Python package that collects robotic environments based on the PyBullet simulator, suitable to develop and test Reinforcement Learning algorithms on simulated grasping and manipulation applications. | ||
r1-object-retrieval | |||
r1-steamdeck-launcher | A repository to store all the scripts and files used to navigate R1 with the Steam Deck | ||
rl-icub-dexterous-manipulation | This repository contains the code to reproduce the experiments related to the Dexterous Manipulation with RL project on the iCub humanoid. | ||
roft | Real-time Optical Flow-aided 6D Object Pose and Velocity Tracking | ||
roft-samples | A suite of applications based on ROFT | ||
sim2real-surface-classification | |||
tour-guide-robot | A collection of modules and classes that can be used to perform guided tours with the R1 robot or simply interact with it. It also contains the configuration files to perform autonomous navigation with R1