Database and Code

Overview of Data Recording

We record human data from 19 subjects recruited from ETVJ in total. Among these subjects, 12 are students in the 1st and 2nd year of the school; we consider them novice subjects. We keep track of their learning progress and record their watchmaking manipulation every semester (three semesters so far). The remaining seven subjects comprise five students from the 3rd and 4th year, as well as two teachers of the school; these seven are considered expert subjects. We record both motion and tactile information from all subjects, and we use the data recorded from expert subjects as baselines to evaluate the performance and assess the improvement of the novice subjects.

Motion Segmentation

The recorded videos of human subjects are manually segmented and labelled according to the meaning of each motion. The labels of the motion segments include:

  • Tool Selection: subject selects tools (e.g. tweezers, pegwood, screwdriver) for performing the task;
  • Adjustment of Hand Pose/Watchface: subject adjusts hand poses or orientation of watch face before task execution;
  • Localization: subject moves hands to localize the tips of tools towards the desired location, e.g. to pick up the component;
  • Pick and Place: subject uses tools to pick and place the watch component in use (e.g. screw, plate, spring) at its target position, either on the watchface or on the table surface;
  • Execution: subject executes the required assembly/disassembly task.
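For working with these segment lists programmatically, a minimal sketch of how a labelled segment could be represented is shown below; the class, field names, and frame rate are illustrative assumptions, not the repository's actual file format.

```python
from dataclasses import dataclass

# Hypothetical representation of one labelled motion segment; the actual
# segmentation files in the repository may use a different layout.
@dataclass
class MotionSegment:
    label: str        # one of the five labels listed above
    start_frame: int  # first video frame of the segment (inclusive)
    end_frame: int    # last video frame of the segment (inclusive)

    def duration(self, fps: float = 30.0) -> float:
        """Segment duration in seconds at an assumed frame rate."""
        return (self.end_frame - self.start_frame + 1) / fps

segments = [
    MotionSegment("Tool Selection", 0, 89),
    MotionSegment("Localization", 90, 149),
    MotionSegment("Execution", 150, 329),
]

# Total time spent per label, e.g. to compare how long novices versus
# experts spend in each phase of the task.
time_per_label = {}
for seg in segments:
    time_per_label[seg.label] = time_per_label.get(seg.label, 0.0) + seg.duration()
```

Aggregating durations per label in this way is one simple route to the novice-versus-expert comparisons described above.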

The segmented motion lists of each subject’s data recording are available at:

https://github.com/epfl-lasa/SAHR_database/tree/master/Database/Motion%20Segmentation

Modelling of Human Hand

Manipulation motions of human subjects are captured by recording the movement of hand markers with a camera array. Using the reconstructed marker trajectories, we construct a kinematic model of the human hand to analyze subjects' hand poses and finger motions during manipulation tasks.

The model includes two parts:

  • Kinematic Model: a 21-DoF kinematic model for analyzing the static hand pose (determined by finger joint positions);
  • Animation Model: a model constructed from the recorded hand trackers for visualizing the hand movement.

A simple example of how to use the model is given in the main file of each package. These models are available at:

https://github.com/epfl-lasa/SAHR_database/tree/master/Code/Hand%20Model
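As a rough illustration of what such a kinematic model computes, the sketch below evaluates planar forward kinematics for a single finger treated as a serial chain. It is a simplified 2-D stand-in with hypothetical link lengths, not the 21-DoF model shipped in the repository.

```python
import numpy as np

def finger_tip_position(joint_angles, link_lengths):
    """Planar forward kinematics for one finger modelled as a serial
    chain: accumulate joint rotations, then advance along each link.
    A 2-D simplification of one sub-chain of a full hand model."""
    x = y = theta = 0.0
    for q, length in zip(joint_angles, link_lengths):
        theta += q                    # joint rotation adds to orientation
        x += length * np.cos(theta)   # step along the rotated link
        y += length * np.sin(theta)
    return np.array([x, y])

# Hypothetical phalanx lengths (metres); a straight finger puts the tip
# at the summed link lengths along the x-axis.
straight = finger_tip_position([0.0, 0.0, 0.0], [0.04, 0.025, 0.02])
# Bending the base joint by 90 degrees rotates the whole chain upward.
bent = finger_tip_position([np.pi / 2, 0.0, 0.0], [0.04, 0.025, 0.02])
```

The repository's model extends this idea to three dimensions and 21 degrees of freedom, with joint positions recovered from the marker trajectories.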

In addition, the SynGrasp toolbox is used for analyzing hand poses. This MATLAB toolbox is available at:

http://sirslab.dii.unisi.it/syngrasp/?p=247

Data Visualizer

Tactile information is recorded during the experiments. This data visualizer allows users to easily visualize the tactile data recorded in the project, including:

  • Finger pressure information, recorded by the FingerTPS sensors;
  • Force/torque applied on the watchface, recorded by the ATI Force/Torque sensor;
  • Pressure applied to the tweezers, recorded by the DigiTacts system.

It also provides several data analysis functions, such as:

  • Calculating signal features (mean, standard deviation, amplitude etc.);
  • Fitting signals using 3D force/torque ellipsoid (or 2D ellipse);
  • Visualizing correlation coefficient matrix of signal sequences.
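The first and third of these analysis functions can be illustrated with a short sketch; the function name and the toy signals below are our own, and the project's actual implementation may differ.

```python
import numpy as np

def signal_features(x):
    """Basic features of a 1-D signal sequence (illustrative names)."""
    x = np.asarray(x, dtype=float)
    return {
        "mean": float(x.mean()),
        "std": float(x.std()),
        "amplitude": float(x.max() - x.min()),  # peak-to-peak
    }

# Correlation coefficient matrix between recorded channels, e.g. finger
# pressure versus force/torque components (toy data).
channels = np.array([
    [0.0, 1.0, 2.0, 3.0],   # channel A
    [0.0, 2.0, 4.0, 6.0],   # channel B: perfectly correlated with A
    [3.0, 2.0, 1.0, 0.0],   # channel C: perfectly anti-correlated with A
])
corr = np.corrcoef(channels)   # symmetric 3x3 matrix, ones on the diagonal
```

Off-diagonal entries of `corr` close to +1 or -1 reveal channels that move together or in opposition during a trial.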

Instructions for installing and using this application can be found at the following page.

GitHub link: https://github.com/epfl-lasa/SAHR_database.git

Encoding and Switching Between Attractors and Limit Cycles Using Dynamical Systems With Bifurcations

This MATLAB/C++ package provides tools for:

  • finding parameters of a dynamical system with bifurcation from data trajectories via optimization (MATLAB folder);
  • controlling a robot’s end effector with the learned DS (cpp folder, as a ROS node);
  • modifying the parameters of the DS during operation.
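A minimal example of the underlying idea — a dynamical system whose parameter moves it through a bifurcation between a point attractor and a limit cycle — is the Hopf normal form sketched below. This is a textbook illustration, not the parameterization learned by the package.

```python
import numpy as np

def hopf_ds(state, mu, omega=2.0):
    """Velocity field of the Hopf normal form. For mu < 0 the origin is
    a stable point attractor; for mu > 0 a stable limit cycle of radius
    sqrt(mu) appears, so changing mu online switches between regimes."""
    x, y = state
    r2 = x * x + y * y
    return np.array([(mu - r2) * x - omega * y,
                     (mu - r2) * y + omega * x])

def rollout(state, mu, dt=1e-3, steps=20000):
    """Forward-Euler integration of the DS from a given start state."""
    s = np.array(state, dtype=float)
    for _ in range(steps):
        s = s + dt * hopf_ds(s, mu)
    return s

# mu < 0: the trajectory collapses onto the point attractor at the origin.
point = rollout([0.5, 0.5], mu=-1.0)
# mu > 0: the same start state settles onto the limit cycle of radius 1.
cycle = rollout([0.5, 0.5], mu=1.0)
```

Because the switch is a single scalar parameter, it can be modified during operation without re-learning the system, which is the property the package exploits.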

GitHub link: https://github.com/epfl-lasa/SAHR_bifurcation

Learning Lyapunov-Type Energetic Functions for Multiple-Attractor Dynamical Systems

This MATLAB package provides a tool for:

  • Clustering the sub-dynamics of a multiple-attractor dynamical system from unlabeled training data (positions and velocities) of demonstrated trajectories;
  • Identifying the attractors' positions in a so-called embedding space;
  • Reconstructing plausible Lyapunov-type energetic functions as weighted sums of kernels.
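The kernel-based reconstruction in the last item can be illustrated as follows; the Gaussian kernels, weights, and attractor locations here are illustrative assumptions, and the package's actual embedding and identification steps are more involved.

```python
import numpy as np

def energy(x, centers, weights, sigma=1.0):
    """Lyapunov-type energy built as a weighted sum of Gaussian kernels.
    Negative weights carve wells into the function, so its local minima
    sit near the encoded attractors (kernel type and weights here are
    illustrative assumptions)."""
    d2 = np.sum((centers - np.asarray(x, dtype=float)) ** 2, axis=1)
    return float(np.dot(weights, np.exp(-d2 / (2.0 * sigma ** 2))))

# Two attractors encoded as energy wells at (-2, 0) and (2, 0).
centers = np.array([[-2.0, 0.0], [2.0, 0.0]])
weights = np.array([-1.0, -1.0])

e_attractor = energy([2.0, 0.0], centers, weights)   # deep inside a well
e_far = energy([10.0, 10.0], centers, weights)       # far from both wells
```

A state far from every attractor has near-zero energy, while states near an attractor sit at the bottom of a well, which is the qualitative shape a Lyapunov-type function needs.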

GitHub link: https://github.com/epfl-lasa/SAHR_multids