2020 Real Robot Challenge
Challenge Protocol
Overview
The tasks which participants will tackle in this challenge are divided into three phases:
- Phase 1: A qualification phase in simulation, where the objective is to move a cube to a goal pose. There are different levels of difficulty, ranging from pushing the object to picking it up and reorienting it.
- Phase 2: Participants who passed phase 1 are granted access to the robots and are required to solve the same task as in phase 1, but now on the real robots.
- Phase 3: The same as phases 1 and 2, but with a more difficult object.
Participants will use the robots similarly to a cluster: they can submit jobs, where each job corresponds to executing one episode on a real robot. After the episode terminates, participants have access to the data generated during it (all sensory data and the actions taken).
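A minimal sketch of this cluster-style workflow, with a fake local backend standing in for the real system (all names here are invented for illustration, not the actual submission interface):

```python
class FakeCluster:
    """Invented stand-in for the job-submission system: one job is
    queued per episode, and the recorded data (sensor readings and
    actions) becomes available once the episode has terminated."""

    def __init__(self):
        self._jobs = {}

    def submit(self):
        # In reality this would queue one episode on a real robot.
        job_id = len(self._jobs)
        self._jobs[job_id] = {
            "status": "done",
            "data": {"actions": [], "observations": []},
        }
        return job_id

    def get_data(self, job_id):
        # Episode data is only available after the job has finished.
        job = self._jobs[job_id]
        if job["status"] != "done":
            return None
        return job["data"]


cluster = FakeCluster()
job = cluster.submit()        # one job == one episode on a real robot
data = cluster.get_data(job)  # all sensory data and actions of the episode
```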
The code will be executed in a Singularity image (similar to Docker), in which participants can install additional software and test their code in simulation. For this purpose, we provide a simulator with an interface identical to that of the real robots.
Prizes
Starting from phase 2, prizes are awarded in each phase to the participants with the best scores.
Phase 2:
- Winner: 3500 EUR
- Runner-up: 2000 EUR
Phase 3:
- Winner: 9000 EUR
- Runner-up: 2000 EUR
Teams affiliated with the Max Planck Institute for Intelligent Systems are allowed to participate in the challenge but are not eligible for prize money.
Phase 1 (Simulation)
Anyone can participate; no registration is required at the beginning of this phase. Participants can simply download a publicly available repository containing the simulator (see instructions here).
The goal is to move the cube to a goal pose. There are 4 levels of difficulty:
1. Pushing the cube to a goal position on the table.
2. Lifting the cube to a certain height.
3. Lifting the cube and moving it to a goal position.
4. Lifting the cube and moving it to a goal position and orientation.
Participants’ code will be evaluated on each level, and the level scores will be averaged into an overall score.
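As a rough illustration of how such goal-reaching tasks might be scored (the cost function and all values below are assumptions for illustration, not the official evaluation code):

```python
import math

def position_error(pos, goal_pos):
    """Euclidean distance between the actual and goal cube positions."""
    return math.sqrt(sum((a - g) ** 2 for a, g in zip(pos, goal_pos)))

# Level 1: goal is a position on the table (cube stays at resting height).
# Level 2: goal only requires reaching a certain height.
# Level 3: goal is a full 3D position, including lifting.
# Level 4: goal additionally specifies an orientation.
cube_pos = (0.05, 0.02, 0.0325)   # hypothetical cube position (meters)
goal_pos = (0.10, -0.04, 0.0325)  # hypothetical level-1 goal on the table
cost = -position_error(cube_pos, goal_pos)  # less negative is better
```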
At the end of the phase, teams will be required to hand in a 2-page proposal and their code. Teams will be selected based on a combination of their score and the quality of the proposal.
Phase 2
The teams which passed phase 1 will be granted access to the real robots and will be able to submit their code (instructions coming soon). The objective of phase 2 is to solve the same tasks as in phase 1, but now on the real robot.
At the end of phase 2, participants will be required to hand in a 2-page report detailing the approaches used, as well as their code (instructions for both coming soon). Teams will be ranked based on the score achieved by their code, provided the report passes an acceptance threshold.
Phase 3
The task is similar to phases 1 and 2, except that a more difficult object has to be manipulated. More details will follow.
Important Dates
- August 10: Start of phase 1 - The simulator is made available so that participants can get started.
- September 25: End of phase 1 - Participants must hand in their proposal, code and score (see here for details).
- October 12: Start of phase 2 - Participants who passed phase 1 will be notified and given access to the robots.
- November 13: End of phase 2 - Participants must hand in a 3-page report and the Singularity image containing their code. Robot access will be withdrawn.
- November 16: Start of phase 3 - Winners of phase 2 will be announced, and participants will be granted robot access again so that they can start working on phase 3.
- December 11: End of phase 3
- 14:00 UTC: Submission deadline for the code and Singularity image
- For more information regarding the submission, see How to Submit Your Final Version.
- December 14:
- 23:59 UTC: Submission deadline for the report (extended from December 11)
Robotic Platform
An open-source version of the challenge robot, with identical actuators, almost identical kinematics, and identical software, is described in this paper and on the corresponding site. The main difference from the challenge platform is that the construction of the open-source version has been simplified considerably, so that researchers can build it themselves.
Please note: For this challenge, all robotic platforms are hosted at our institute; you do not need to build your own.
Some highlights of the design (of both versions):
Hardware Design
Each finger has 3 DoF, and the fingers share a workspace, permitting complex fine manipulation.
This design has the following qualities:
- low weight, high torque
- 1 kHz torque control and sensing
- robustness to impacts due to the transparency of the transmission
Software Design
The key strengths of the software framework are:
- simple user interface in Python and C++ for control at up to 1 kHz
- safety checks to prevent the robot from breaking
- a synchronized history of all inputs and outputs that can be accessed and logged
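To illustrate the append-action/get-observation pattern typical of such an interface, here is a toy control loop against an invented stand-in class (not the actual challenge API; the real software runs at 1 kHz and synchronizes each action with the observation of the matching time step):

```python
class FakeFrontend:
    """Invented stand-in for a real-time robot frontend.  For each
    appended action it returns a time index, and observations are
    synchronized with that index."""

    def __init__(self, n_joints=9):  # three fingers x 3 DoF
        self.positions = [0.0] * n_joints
        self.t = -1

    def append_desired_action(self, torques):
        # Toy dynamics: joint position drifts proportionally to torque.
        self.positions = [
            p + 1e-3 * tau for p, tau in zip(self.positions, torques)
        ]
        self.t += 1
        return self.t  # time index of this action

    def get_observation(self, t):
        # Observations are only valid for the matching time step.
        assert t == self.t, "observation must match the appended action"
        return list(self.positions)


# Simple P-controller driving all joints toward a target position.
frontend = FakeFrontend()
target = [0.5] * 9
obs = [0.0] * 9
for _ in range(1000):  # one second at a 1 kHz control rate
    torques = [20.0 * (g - q) for g, q in zip(target, obs)]
    t = frontend.append_desired_action(torques)
    obs = frontend.get_observation(t)
```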
External References
- Follow us on Twitter: @robo_challenge
- Software Documentation
- If you have any questions, contact us through our forum.
- Reports are on OpenReview
Results
Phase 3
- Winner: ardentstork
- Runner-up: sombertortoise
Results of the final evaluation:
# | Username | Level 1 | Level 2 | Level 3 | Level 4 | Total Weighted Score |
---|---|---|---|---|---|---
1. | ardentstork | -9239 | -4040 | -6525 | -25625 | -139394 |
2. | sombertortoise | -5461 | -8522 | -10323 | -36135 | -198016 |
3. | sincerefish | -7428 | -25291 | -26768 | -52311 | -347560 |
4. | innocenttortoise | -16872 | -31977 | -33357 | -55611 | -403344 |
5. | hushedtomatoe | -18304 | -31917 | -36835 | -60219 | -433521 |
6. | troubledhare | -18742 | -42831 | -36272 | -56503 | -439233 |
7. | giddyicecream | -33329 | -57372 | -53694 | -59734 | -548090 |
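The totals in the table appear consistent with weighting each level's score by its level number (1 to 4); small deviations in some rows are presumably due to rounding of the displayed per-level scores. A quick check in Python:

```python
def total_weighted_score(level_scores):
    """Sum of level scores, each weighted by its level number (1-4).
    This is a reconstruction from the published tables, not the
    official evaluation code."""
    return sum(i * s for i, s in enumerate(level_scores, start=1))

# Phase 3 winner (ardentstork): matches the table's total exactly.
total = total_weighted_score([-9239, -4040, -6525, -25625])  # -139394
```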
Phase 2
- Winner: ardentstork
- Runner-up: troubledhare
# | Username | Level 1 | Level 2 | Level 3 | Level 4 | Total Weighted Score |
---|---|---|---|---|---|---
1. | ardentstork | -5472 | -2898 | -9080 | -21428 | -124221 |
2. | troubledhare | -3927 | -4144 | -4226 | -48572 | -219182 |
3. | sombertortoise | -8544 | -15199 | -14075 | -44989 | -261123 |
4. | sincerefish | -6278 | -13738 | -17927 | -49491 | -285500 |
5. | hushedtomatoe | -17976 | -41389 | -41832 | -60815 | -469509 |
6. | giddyicecream | -22379 | -46650 | -41655 | -61845 | -488023 |
Reports and Source Code of the Winning Teams
ardentstork
- Report: OpenReview
- Code:
sombertortoise
- Report: OpenReview
- Code:
troubledhare
- Report: OpenReview
- Code:
Phase 1
Note: The scores listed here are the ones reported by the users themselves and are based on different, randomly generated goals, so they are not fully comparable. In deciding which teams were allowed to proceed to phase 2, the reports also played an important role.
Username | Level 1 | Level 2 | Level 3 | Level 4 | Total Weighted Score |
---|---|---|---|---|---|
giddyicecream | -32 | -284 | -288 | -501 | -3474 |
ardentstork | -68 | -188 | -110 | -898 | -4370 |
hushedtomatoe | -170 | -89 | -109 | -1190 | -5440 |
sombertortoise | -282 | -126 | -178 | -1403 | -6683 |
troubledhare | -264 | -479 | -528 | -1495 | -8787 |
anonymous1 | -247 | -472 | -661 | -1537 | -9327 |
innocenttortoise | -295 | -763 | -675 | -1519 | -9927 |
sincerefish | -450 | -560 | -676 | -1785 | -10746 |
anonymous2 | -367 | -1161 | -897 | -1862 | -12833 |
anonymous3 | -502 | -1289 | -1134 | -1853 | -13897 |