Robotics company 1X showcases androids driven by neural networks
Briefly

The actions in the video above are all controlled by a single vision-based neural network that emits actions at 10Hz. The network takes in camera imagery and issues commands to control the driving, arms, grippers, torso, and head. That is all: there are no cuts, no video speedups, no teleoperation, and no computer graphics.
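The paragraph above describes a vision-to-action loop running at 10Hz. As a minimal illustrative sketch (not 1X's actual implementation), the structure of such a loop might look like this; `get_camera_frame` and `policy` are hypothetical stand-ins for the robot's camera feed and the neural network:

```python
import time

CONTROL_HZ = 10          # the article states the network emits actions at 10Hz
PERIOD = 1.0 / CONTROL_HZ  # 100 ms per control step

def get_camera_frame():
    """Hypothetical stand-in for the robot's camera feed."""
    return [[0.0] * 4 for _ in range(4)]  # dummy image

def policy(frame):
    """Hypothetical stand-in for the vision-based network: maps an
    image to one command per actuator group named in the article."""
    return {"drive": 0.0, "arms": 0.0, "gripper": 0.0,
            "torso": 0.0, "head": 0.0}

def run_control_loop(steps=3):
    """Run a fixed number of 10Hz perceive-act cycles."""
    commands = []
    for _ in range(steps):
        start = time.monotonic()
        frame = get_camera_frame()
        commands.append(policy(frame))
        # sleep off whatever remains of the 100 ms control period
        time.sleep(max(0.0, PERIOD - (time.monotonic() - start)))
    return commands

if __name__ == "__main__":
    out = run_control_loop()
    print(len(out), sorted(out[0]))
```

The key point the sketch captures is that a single model maps raw images directly to commands for every actuator group, with the loop period setting the 10Hz action rate.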
Thirty EVE robots were used to assemble a high-quality, diverse dataset of demonstrations from which to learn these behaviors. The data was then used to train a "base model" that covers a broad range of physical behaviors, from everyday tasks such as tidying homes and picking up objects to interacting with people (and other robots).
Read at ReadWrite