On Thursday, June 26th, we celebrated our new robot badlands (aka: our brand new headquarters) with a grand opening, accompanied of course by a robot ribbon-cutting. The Clearpath co-founders made their way to the stage and gathered around ‘Jake’, the PR2. Together, they held the ceremonial ribbon so the PR2 could take an epic swing of the arm and tear it in two (video of the event is on the CTV page: CTV Kitchener Extended: Robot ribbon cutting). PR2 did none other than locate and terminate the grand opening ribbon, and we’re going to tell you how.

Skeleton Tracking

Although Jake the PR2 robot didn’t move autonomously, he was controlled with some pretty cool gadgets! A topic I have always been interested in is how effective humans are at directly controlling robots. For example, when search-and-rescue robots are tested, it often takes a very long time to even open a door, because the operators have trouble perceiving the environment and controlling the robot effectively. To this end, I wanted to map human joint movement onto a robot’s arms and see how easy it is to control a PR2. Turns out, not quite as easy as I had hoped. First, our system consists of a Kinect 2.0 that feeds skeleton tracking data to ROS via rosserial_windows. This is necessary because the Kinect 2.0 drivers are currently Windows 8(!) only. The skeleton tracking data looks similar to the following, and arrives as a stream of data points, each with X, Y, and Z components in 3D space.
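To give a sense of what that stream looks like on the ROS side, here is a minimal sketch of a node that receives per-joint positions as geometry_msgs/Point messages. The topic names and the one-topic-per-joint layout are assumptions for illustration only; the actual messages coming over rosserial_windows may be packaged differently.

```python
#!/usr/bin/env python
# Minimal sketch: receive skeleton joint positions streamed from the
# Windows/Kinect 2.0 side (via rosserial_windows) as geometry_msgs/Point.
# Topic names and the per-joint layout are illustrative assumptions.
import rospy
from geometry_msgs.msg import Point

JOINTS = ['shoulder_right', 'elbow_right', 'wrist_right']  # hypothetical names

latest = {}  # most recent (x, y, z) for each tracked joint


def make_callback(joint):
    def callback(msg):
        latest[joint] = (msg.x, msg.y, msg.z)
    return callback


if __name__ == '__main__':
    rospy.init_node('kinect_skeleton_listener')
    for j in JOINTS:
        rospy.Subscriber('skeleton/' + j, Point, make_callback(j))
    rospy.spin()
```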

Kinect 1 skeleton tracking

This data is in the camera reference frame, with the Z axis pointing out from the camera lens, Y pointing vertically up, and X to the left of the user (facing the camera). Once this data is collected and published directly to the PR2, Python code on the PR2 computes the angles of the user’s arms, applies some offsets, and publishes the result to the arm controllers. Lastly, the previously published pr2_surrogate package is used to provide feedback from the PR2 to the user (check out the ROS wiki page for pr2_surrogate to see Kinect data displayed on the Oculus Rift). Thus, by simply stepping in front of the Kinect 2.0 and putting on the headset, we can teleoperate a PR2. This control is intuitive enough that first-time users are able to grasp objects and move them around.
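As an illustration of the angle-extraction step, here is a minimal sketch that computes an elbow flex angle from three skeleton points and publishes it to the PR2’s right arm trajectory controller. The zero positions for the other joints, the example points, and the sign convention on the elbow are assumptions; the real code computes and offsets all of the arm angles.

```python
#!/usr/bin/env python
# Sketch of the angle-extraction step: compute the elbow flex angle from
# the shoulder, elbow, and wrist positions and send it to the right arm
# trajectory controller. Offsets and sign convention here are assumed.
import numpy as np
import rospy
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint

R_ARM_JOINTS = ['r_shoulder_pan_joint', 'r_shoulder_lift_joint',
                'r_upper_arm_roll_joint', 'r_elbow_flex_joint',
                'r_forearm_roll_joint', 'r_wrist_flex_joint',
                'r_wrist_roll_joint']


def elbow_flex(shoulder, elbow, wrist):
    """Angle between the upper arm and the forearm, in radians."""
    upper = np.asarray(elbow) - np.asarray(shoulder)
    fore = np.asarray(wrist) - np.asarray(elbow)
    cos_a = np.dot(upper, fore) / (np.linalg.norm(upper) * np.linalg.norm(fore))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))


if __name__ == '__main__':
    rospy.init_node('arm_angle_mapper')
    pub = rospy.Publisher('r_arm_controller/command', JointTrajectory)
    rospy.sleep(1.0)  # give the publisher time to connect

    # Example skeleton points in the camera frame (metres): an arm bent ~90 deg.
    angle = elbow_flex(shoulder=(0.2, 0.5, 2.0),
                       elbow=(0.2, 0.2, 2.0),
                       wrist=(0.2, 0.2, 1.7))

    traj = JointTrajectory()
    traj.joint_names = R_ARM_JOINTS
    pt = JointTrajectoryPoint()
    # Hold every joint at zero except the elbow; the PR2 elbow flexes toward
    # negative values, hence the sign flip (an assumed convention here).
    pt.positions = [0.0, 0.0, 0.0, -angle, 0.0, 0.0, 0.0]
    pt.time_from_start = rospy.Duration(1.0)
    traj.points = [pt]
    pub.publish(traj)
```

The same snippet should also work against the Gazebo simulation described below, since the simulated PR2 exposes the same r_arm_controller/command topic.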

An earlier (un-smoothed) version of the full system can be seen here:

 

Test it out in simulation

To try a simulated version of the same system at home, do the following:

Download these files: pr2_teleop_kinect

Or from here: GitHub

Start with a ROS Groovy install. On the system, do:

sudo apt-get update

sudo apt-get install ros-groovy-pr2-simulator

roslaunch pr2_gazebo pr2_empty_world.launch 

In another terminal, use rosbag to play back some of the pre-recorded data files included in the download above:

rosbag play -l elbow_rotate_flex.bag

Lastly, execute the end_effect.py file included in the download. The PR2 should start moving its arms.

Porting this example to use Kinect 1 data should be fairly straightforward, though somewhat lengthy; a rough sketch of a starting point follows below.
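With a Kinect 1, openni_tracker publishes each skeleton joint as a tf frame rather than as a message stream, so the port mostly amounts to swapping the input side. Here is a rough sketch of reading those frames; the frame names (including the user-ID suffix) and the fixed frame are assumptions based on openni_tracker’s defaults, so adjust them to your setup.

```python
#!/usr/bin/env python
# Sketch of the Kinect 1 input side: look up skeleton joints published by
# openni_tracker as tf frames. Frame names and fixed frame are assumed
# defaults (e.g. right_shoulder_1 for user 1) and may need adjusting.
import rospy
import tf

FIXED_FRAME = 'openni_depth_frame'
JOINT_FRAMES = ['right_shoulder_1', 'right_elbow_1', 'right_hand_1']

if __name__ == '__main__':
    rospy.init_node('kinect1_skeleton_reader')
    listener = tf.TransformListener()
    rate = rospy.Rate(30)
    while not rospy.is_shutdown():
        points = {}
        for frame in JOINT_FRAMES:
            try:
                (trans, _) = listener.lookupTransform(FIXED_FRAME, frame,
                                                      rospy.Time(0))
                points[frame] = trans  # (x, y, z) in the camera frame
            except (tf.LookupException, tf.ConnectivityException,
                    tf.ExtrapolationException):
                continue
        # points[...] can now feed the same angle-extraction code shown above.
        rate.sleep()
```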

Other highlights of the night


The Grand Opening Husky Cake

The evening included tours of the new robot badlands, a special presentation from our local MP Harold Albrecht, and an award for entrepreneurship presented by Ernst & Young. Along with food and beverages, there was also a life-sized Husky cake! (And yes, it tasted as good as it looked!)

Thank you to all who came out to the event to celebrate with us. And thanks to everyone in the robotics community for your support and your ongoing curiosity to achieve new heights in the world of robotics!
