Android phone as orientation sensor for Raspberry Pi
A project I started working on about a month ago, and one that I've wanted to do for about a year now, is to create an intelligent robot from scratch. By "from scratch", I mean from the component level: microcontrollers, motors, wires, etc. I'm not going to build my own microcontroller or motors, for reasons that should become quite obvious once I start describing my process. Incorporating appropriate sensors is a pretty important part of robot development, and here I'll be talking about how I got an Android phone to act as an orientation sensor for a Raspberry Pi.
The idea is quite simple: we get the Android phone to send its inertial measurement unit data to the Raspberry Pi, and the Raspberry Pi does all the calculations to figure out the three-axis orientation of the phone. If the phone and the RPi are fixed to each other, then the orientation of the RPi is the same as the phone's.
To get the Android phone to send sensor data to the RPi, I found a very simple Android application called Wireless IMU. It streams accelerometer, gyroscope and magnetometer data over UDP to a port and IP address that you specify.
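Here's a minimal sketch of the receiving end, assuming the CSV layout that Wireless IMU appears to use - a timestamp followed by groups of a sensor ID and x, y, z values, with IDs 3, 4 and 5 for the accelerometer, gyroscope and magnetometer. Print a raw packet first to verify the format before relying on this:

```python
import socket

UDP_PORT = 5555  # placeholder; use whatever port you configured in the app

SENSOR_NAMES = {3: "accel", 4: "gyro", 5: "mag"}  # assumed IDs in the CSV stream

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", UDP_PORT))  # accept packets on any interface of the RPi

while True:
    packet, _addr = sock.recvfrom(1024)
    values = packet.decode("ascii").strip().split(",")
    timestamp = float(values[0])
    readings = {}
    i = 1
    while i + 3 < len(values):  # each group is: sensor ID, x, y, z
        sensor_id = int(float(values[i]))
        readings[SENSOR_NAMES.get(sensor_id, sensor_id)] = [
            float(v) for v in values[i + 1 : i + 4]
        ]
        i += 4
    print(timestamp, readings)
```

Note that the app only includes a sensor's group in a packet when that sensor has a fresh reading, which is why the parser walks the fields rather than assuming a fixed length.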
To calculate the orientation of the phone on the RPi, we do some simple vector calculations. This Google talk by David Sachs gives a very good overview of the phone's sensor suite, as well as the operating principles of each of the sensors. In my program, I used the accelerometer data to calculate the direction of gravity w.r.t. the phone. I then used this vector and the magnetometer data to calculate North w.r.t. the phone. These two vectors then completely define the three-axis orientation of the phone.
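In code, the calculation looks roughly like this (a simplified sketch using NumPy; the function name is mine, and note that the accelerometer only approximates gravity while the phone isn't otherwise accelerating):

```python
import numpy as np

def orientation_from_accel_mag(accel, mag):
    """Build a phone-to-world rotation matrix from gravity and the magnetic field.

    accel: accelerometer reading (m/s^2); at rest it measures the reaction
           to gravity, so it points "up" in phone coordinates.
    mag:   magnetometer reading (uT); points along the local magnetic field.
    Returns a 3x3 matrix whose rows are East, North and Up in phone coordinates.
    """
    up = np.asarray(accel, dtype=float)
    up /= np.linalg.norm(up)

    # The magnetic field has a horizontal (northward) and a vertical component.
    # Its cross product with "up" is purely horizontal and points east.
    east = np.cross(np.asarray(mag, dtype=float), up)
    east /= np.linalg.norm(east)

    # Up x East completes the right-handed triad and points north.
    north = np.cross(up, east)

    return np.vstack([east, north, up])
```

Multiplying this matrix by any vector expressed in phone coordinates gives its east-north-up components, so things like compass heading fall out of where the phone's axes land in that frame.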
I wrote the program in Python, just because I'm comfortable with it.
I think it's pretty cool to use a phone's sensors for an external application such as robotics. For one thing, you save on buying additional sensors. The downside is that you need a local network wherever you're developing your robot - not an issue if you have a home network.
After this, I want to make use of this sensory input to perform useful tasks. A simple one would be a wheeled robot that follows North - that's my immediate goal, and I'm thinking it'll be straightforward. (famous last words)
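For the curious, here's a rough sketch of what that control loop might look like, assuming the phone is mounted with its y-axis along the robot's direction of travel. The function names and gain are placeholders, and the motor interface is whatever your driver provides:

```python
import math

def heading_error(north_in_phone):
    # 'north_in_phone' is the north row of the rotation matrix above, i.e.
    # North expressed in phone coordinates. With the phone's y-axis pointing
    # forward, this is the signed angle (radians) the robot must turn to face
    # North; positive means North is to the right.
    return math.atan2(north_in_phone[0], north_in_phone[1])

def wheel_speeds(error, k_p=0.5, base=0.5):
    # Toy proportional controller: steer toward North by speeding up one
    # wheel and slowing the other. Clamped so duty cycles stay in [0, 1].
    turn = max(-0.5, min(0.5, k_p * error))
    return base + turn, base - turn  # (left, right) duty cycles
```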