Human Motion Prediction.

Overview


Human motion modelling is a classical problem at the intersection of graphics and computer vision, with applications spanning human-computer interaction, motion synthesis, and motion prediction for virtual and augmented reality. Following the success of deep learning methods in several computer vision tasks, recent work has focused on using deep recurrent neural networks (RNNs) to model human motion, with the goal of learning time-dependent representations that perform tasks such as short-term motion prediction and long-term human motion synthesis. We examine recent work, with a focus on the evaluation methodologies commonly used in the literature, and show that, surprisingly, state-of-the-art performance can be achieved by a simple baseline that does not attempt to model motion at all. We investigate this result, and analyse recent RNN methods by looking at the architectures, loss functions, and training procedures used in state-of-the-art approaches. We propose three changes to the standard RNN models typically used for human motion, which result in a simple and scalable RNN architecture that obtains state-of-the-art performance on human motion prediction.
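The "simple baseline that does not attempt to model motion" is a zero-velocity baseline: the last observed pose is simply repeated for the entire prediction horizon. A minimal sketch (the array shapes and dimension of 54 joint-angle values are illustrative, not taken from the repository):

```python
import numpy as np

def zero_velocity_baseline(observed, horizon):
    """Predict future motion by repeating the last observed pose.

    observed: array of shape (seq_len, pose_dim), e.g. joint angles
              in exponential-map format.
    horizon:  number of future frames to predict.
    """
    last_pose = observed[-1]                 # (pose_dim,)
    return np.tile(last_pose, (horizon, 1))  # (horizon, pose_dim)

# Example: 50 observed frames of a 54-dimensional pose, predict 25 frames.
past = np.random.randn(50, 54)
pred = zero_velocity_baseline(past, 25)
print(pred.shape)  # (25, 54)
```

Despite its simplicity, this constant prediction is hard to beat on short horizons, which is what motivates the paper's closer look at evaluation methodology.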






This is a practical implementation of the paper

https://arxiv.org/pdf/1705.02445v1.pdf



Dependencies

h5py -- to save samples
Tensorflow 1.2 or later


Get this code and the data

First things first, clone this repo and get the Human3.6M dataset in exponential map format.


git clone https://github.com/una-dinosauria/human-motion-prediction.git
cd human-motion-prediction
mkdir data
cd data
wget http://www.cs.stanford.edu/people/ashesh/h3.6m.zip
unzip h3.6m.zip
rm h3.6m.zip
cd ..
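The downloaded data stores each joint rotation as an exponential-map 3-vector r, which encodes a rotation of angle ||r|| about the axis r/||r||. A hedged sketch of the conversion to a rotation matrix via Rodrigues' formula (the repository ships its own equivalent utilities; this standalone version is just for illustration):

```python
import numpy as np

def expmap_to_rotmat(r):
    """Convert an exponential-map 3-vector to a 3x3 rotation matrix
    using Rodrigues' formula."""
    theta = np.linalg.norm(r)
    if theta < 1e-8:
        return np.eye(3)                 # near-zero rotation
    k = r / theta                        # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])   # cross-product (skew) matrix
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

# A rotation of pi/2 about the z-axis maps the x-axis onto the y-axis.
R = expmap_to_rotmat(np.array([0.0, 0.0, np.pi / 2]))
print(R @ np.array([1.0, 0.0, 0.0]))
```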


Quick demo and visualization

For a quick demo, you can train for a few iterations and visualize the outputs of your model.

To train, run

python src/translate.py --action walking --seq_length_out 25 --iterations 10000

To save some samples of the model, run

python src/translate.py --action walking --seq_length_out 25 --iterations 10000 --sample --load 10000

Finally, to visualize the samples run

python src/forward_kinematics.py

This should create an animated visualization similar to the one in photo 1.
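The visualization step works by forward kinematics: per-joint rotations are chained along the skeleton, and each bone offset is pushed through the accumulated rotation to get 3D joint positions. A toy sketch of the idea on a serial chain (the skeleton layout and conventions here are simplified assumptions, not the repository's actual skeleton definition):

```python
import numpy as np

def forward_kinematics(rotations, offsets):
    """Compute world-space joint positions of a serial kinematic chain.

    rotations: per-joint 3x3 rotation matrices, relative to the parent.
    offsets:   per-joint bone offsets, expressed in the parent frame.
    """
    R = np.eye(3)          # accumulated world rotation
    p = np.zeros(3)        # current world position
    positions = []
    for R_joint, offset in zip(rotations, offsets):
        R = R @ R_joint                        # chain the rotation
        p = p + R @ np.asarray(offset, float)  # place the next joint
        positions.append(p.copy())
    return np.array(positions)

# Two unit-length bones along x, with a 90-degree bend about z at the
# second joint: the chain ends near (1, 1, 0).
bend = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
pts = forward_kinematics([np.eye(3), bend], [[1, 0, 0], [1, 0, 0]])
print(pts[-1])  # approximately [1. 1. 0.]
```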





References

Julieta Martinez, Michael J. Black, Javier Romero. On human motion prediction using recurrent neural networks. CVPR 2017. https://arxiv.org/pdf/1705.02445v1.pdf



Coordination: Hassan Jacobsen