|
--- |
|
license: cc-by-nc-4.0 |
|
tags: |
|
- sparsh |
|
- force field |
|
- digit |
|
--- |
|
|
|
# Sparsh (DINO) + force field decoder for DIGIT sensor |
|
|
|
We decode Sparsh touch representations into normal and shear force fields. This gives a human-interpretable view of the force information that the representations capture.
|
|
|
## How to Use |
|
To test Sparsh (DINO) + force field decoder live, you only need a DIGIT sensor. Follow these steps to run the demo:
|
|
|
1. Clone the [sparsh repo](https://github.com/facebookresearch/sparsh.git).
|
|
|
2. Create a folder for downloading the task checkpoints. For example, `${YOUR_PATH}/outputs_sparsh/checkpoints`. |
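   This step can be sketched in Python; the base path below is an illustrative stand-in for `${YOUR_PATH}`:

   ```python
   from pathlib import Path

   # Illustrative stand-in for ${YOUR_PATH}/outputs_sparsh/checkpoints;
   # substitute your own base directory.
   ckpt_dir = Path("outputs_sparsh") / "checkpoints"
   ckpt_dir.mkdir(parents=True, exist_ok=True)
   ```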
|
3. Download the Sparsh (DINO) base [checkpoint](https://huggingface.co/facebook/sparsh-dino-base).
|
4. Download the decoder checkpoints from this repo. |
|
5. Connect the sensor to your PC. For DIGIT, make sure you have [digit-interface](https://github.com/facebookresearch/digit-interface) installed.
|
6. Make sure the device is recognized by the OS (on Linux, you can use Cheese to view the video stream from the sensor).
|
|
|
7. Run the demo for DIGIT (please refer to the Sparsh repo README for more information about how to set up the path configs):
|
|
|
```bash |
|
python demo_forcefield.py +experiment=digit/downstream_task/forcefield/digit_dino paths=${YOUR_PATH_CONFIG} paths.output_dir=${YOUR_PATH}/outputs_sparsh/checkpoints/ test.demo.digit_serial=${YOUR_DIGIT_SERIAL}
|
``` |
|
The DIGIT serial number is printed on the back of the sensor and has the format `DXXXXX`. |
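   You can sanity-check the serial you pass via `test.demo.digit_serial` against this format; a minimal sketch (the helper below is illustrative, not part of the Sparsh codebase):

   ```python
   import re

   def is_valid_digit_serial(serial: str) -> bool:
       """Check a DIGIT serial like 'D20001': 'D' followed by five digits."""
       return re.fullmatch(r"D\d{5}", serial) is not None

   print(is_valid_digit_serial("D20001"))  # True
   print(is_valid_digit_serial("20001"))   # False
   ```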
|
|
|
8. Take the sensor and slide it across the edge of a table, or across objects with interesting textures! Look at the normal field to localize where you're making contact on the sensor's surface. Look at the shear field to gather an intuition about the direction of the shear force that you applied while sliding the sensor. For example, slide the sensor over an edge up and down to get translational shear or rotate the sensor in place to see torsional slip! |
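The two sliding patterns above leave distinct signatures in the shear field: translational shear appears as a roughly uniform vector field, while torsional slip produces a rotational pattern around the contact point. A minimal NumPy sketch of this distinction on synthetic fields (illustrative only, not real Sparsh outputs):

```python
import numpy as np

# Synthetic 2D shear fields on a small grid.
n = 32
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]

# Translational shear: roughly uniform vectors pointing in +x.
trans_u, trans_v = np.ones((n, n)), np.zeros((n, n))

# Torsional slip: vectors rotate around the center, (u, v) = (-y, x).
tors_u, tors_v = -y, x

def mean_vector_norm(u, v):
    """Magnitude of the mean shear vector: high for translation, ~0 for rotation."""
    return float(np.hypot(u.mean(), v.mean()))

def mean_curl(u, v):
    """Mean discrete curl (dv/dx - du/dy): ~0 for translation, nonzero for rotation."""
    dv_dx = np.gradient(v, axis=1)
    du_dy = np.gradient(u, axis=0)
    return float((dv_dx - du_dy).mean())

print(mean_vector_norm(trans_u, trans_v))  # ~1.0, strong net direction
print(mean_curl(trans_u, trans_v))         # ~0.0, no rotation
print(mean_vector_norm(tors_u, tors_v))    # ~0.0, no net direction
print(mean_curl(tors_u, tors_v))           # positive, rotational pattern
```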
|
|
|
<!-- <p align="center"> |
|
<img src="assets/demo_digit.gif" alt="animated" /> |
|
</p> --> |
|
|
|
### BibTeX entry and citation info |
|
```bibtex |
|
@inproceedings{higuera2024sparsh,
  title={Sparsh: Self-supervised touch representations for vision-based tactile sensing},
  author={Carolina Higuera and Akash Sharma and Chaithanya Krishna Bodduluri and Taosha Fan and Patrick Lancaster and Mrinal Kalakrishnan and Michael Kaess and Byron Boots and Mike Lambeta and Tingfan Wu and Mustafa Mukadam},
  booktitle={8th Annual Conference on Robot Learning},
  year={2024},
  url={https://openreview.net/forum?id=xYJn2e1uu8}
}
|
``` |