Optical mouse odometer for robots using Raspberry Pi

One of my university projects involved developing an autonomous search and rescue robot using a Raspberry Pi. The application logic required the precise position of the robot. The usual approaches are stepper motors (not cheap) or wheel encoders (I didn’t want to interpret the IO signals on a Raspberry Pi). An optical mouse odometer is a cheap alternative, since the surface of the game arena for the robot is flat.

John Graham’s post about using an optical mouse odometer served as inspiration. However, he suggests reading the signals directly from the sensor, which again means interpreting raw signals on the Raspberry Pi, and it requires a mouse with a particular optical sensor.

Mouse data

The file /dev/input/mice provides data in the PS/2 format for any connected USB mouse. The Python code below demonstrates how to read the relative position of the mouse. The file access is “blocking”: a read returns a relative movement value whenever there is a mouse event.

import struct

class Point:
    def __init__(self, x=0.0, y=0.0):
        self.x = x
        self.y = y

mouse = open("/dev/input/mice", "rb")

point_x = 0
point_y = 0

def getMouseEvent():
    # Each event is a 3-byte PS/2 packet: flags, then signed dx and dy
    buf = mouse.read(3)
    x, y = struct.unpack("bb", buf[1:3])
    return Point(x, y)

try:
    while True:
        dis = getMouseEvent()
        point_x += dis.x
        point_y += dis.y
        print("%d  %d" % (point_x, point_y))
finally:
    mouse.close()

This method will work for any standard USB mouse connected to a Raspberry Pi.
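The first byte of each 3-byte packet carries button and overflow flags, which the code above discards. A minimal sketch of decoding those flags, following the standard PS/2 mouse protocol (the helper name is mine, not from the original code):

```python
import struct

def decode_ps2_packet(buf):
    """Decode a 3-byte PS/2 mouse packet into (buttons, dx, dy).

    Byte 0: flags -- bit 0 left, bit 1 right, bit 2 middle button.
    Bytes 1-2: signed relative X and Y movement.
    """
    flags = buf[0]
    dx, dy = struct.unpack("bb", buf[1:3])
    buttons = {
        "left":   bool(flags & 0x01),
        "right":  bool(flags & 0x02),
        "middle": bool(flags & 0x04),
    }
    return buttons, dx, dy

# Example packet: left button held, dx = +5, dy = -3 (0xFD as a signed byte)
buttons, dx, dy = decode_ps2_packet(bytes([0x09, 0x05, 0xFD]))
print(buttons["left"], dx, dy)  # True 5 -3
```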

Running as a service

Accessing /dev/input/mice from multiple processes could result in conflicting file access and missed mouse events. So I created a background process that acquires the relative position data, converts it into an absolute value with a scaling factor, and puts it onto one or more FIFOs (named pipes) for other applications to read.

The Python code below writes the calculated position onto a FIFO. Refer to this post for a better understanding of FIFOs.
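The writer assumes the named pipe already exists. A minimal sketch for creating it first (assuming the name mouse_FIFO and the same working directory as the service):

```python
import os, errno, stat

fifo_path = "mouse_FIFO"  # must match the name the writer opens

# Create the named pipe if it does not already exist
try:
    os.mkfifo(fifo_path)
except OSError as err:
    if err.errno != errno.EEXIST:
        raise

print(stat.S_ISFIFO(os.stat(fifo_path).st_mode))  # True
```

Alternatively, `mkfifo mouse_FIFO` from the shell does the same thing.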

import struct, os, errno

class Point:
    def __init__(self, x=0.0, y=0.0):
        self.x = x
        self.y = y

mouse = open("/dev/input/mice", "rb")
output = "mouse_FIFO"  # the FIFO must already exist (created with mkfifo)

point_x = 0.0
point_y = 0.0
scaling = 0.046875  # determined by trial and calibration

def getMouseEvent():
    buf = mouse.read(3)
    x, y = struct.unpack("bb", buf[1:3])
    return Point(x, y)

try:
    while True:
        dis = getMouseEvent()
        point_x += scaling * dis.x
        point_y += scaling * dis.y

        try:
            # Non-blocking write: skip this sample if no reader has the FIFO open
            pipe = os.open(output, os.O_WRONLY | os.O_NONBLOCK)
            os.write(pipe, ("%d %d" % (point_x, point_y)).encode())
            os.close(pipe)
        except OSError as err:
            if err.errno != errno.ENXIO:
                raise
finally:
    mouse.close()
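The scaling factor above was found by trial and calibration. One hedged sketch of how such a constant might be derived: push the robot a known distance, accumulate the raw counts, and divide (the numbers here are purely illustrative, not the author’s measurements):

```python
def calibrate_scaling(known_distance_cm, accumulated_counts):
    # Convert raw mouse counts to physical units (cm per count)
    return known_distance_cm / accumulated_counts

# e.g. moving 30 cm while the raw counter accumulates 640 counts:
print(calibrate_scaling(30.0, 640))  # 0.046875
```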

The following C++ function demonstrates reading the FIFO to determine the absolute position of the robot. Note, however, that the data can be read from any other programming language.

// Requires <cstdio>, <cstring> and <cstdlib>
Point robot_position(){
    char readbuf[32] = {0};

    // Opening the FIFO for reading blocks until the writer opens it
    FILE* fp = fopen("mouse_FIFO", "r");
    if (fp == NULL)
        return Point(0, 0);
    fgets(readbuf, sizeof(readbuf), fp);
    fclose(fp);

    // The service writes "x y" as two space-separated integers
    char* pch = strtok(readbuf, " ");
    int x = atoi(pch);
    pch = strtok(NULL, " ");
    int y = atoi(pch);

    return Point(x, y);
}
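Since the FIFO carries plain text, the same read can be done from Python as well. A sketch, assuming the same mouse_FIFO name and the "x y" format the service writes (the function names are mine):

```python
def parse_position(data):
    # The service writes "x y" as two space-separated integers
    x_str, y_str = data.split()
    return int(x_str), int(y_str)

def robot_position(fifo_path="mouse_FIFO"):
    # Opening a FIFO for reading blocks until the writer has it open
    with open(fifo_path, "r") as fifo:
        return parse_position(fifo.read())

print(parse_position("12 -7"))  # (12, -7)
```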

I hope these examples provide you with a cheap solution for determining the position of a robot. Keep in mind that the low cost comes with restrictions: an optical mouse requires a suitable surface to produce accurate movement data.

6 thoughts on “Optical mouse odometer for robots using Raspberry Pi”

  1. Thanks for sharing this inspirational odometry method. But I think there will be an error in the following piece of code:
    point_x = point_x + (scaling * dis.x);
    point_y = point_y + (scaling * dis.y);

    The coordinate plane changes whenever the angle of the mouse changes even slightly. Hence, simply adding dx to x cannot give a correct result.
    Did you use this code on a robot? If yes, what did you change in this code? Or did you use this method for one-directional movement only? I will be glad if you answer.
    Thanks.

    1. Yes, you are right. I used this method for one-directional movement alone. I configured my robot to make only 90-degree turns, so separate coordinates are maintained in my main code to track where the robot is.

      The point_x and point_y don’t correspond to the position of the robot in this case.
      Using Mecanum wheels (which move in all directions without turning the robot) would allow tracking the robot completely based on the mouse.
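For robots that do turn freely, the commenter’s objection can be addressed by rotating each mouse-frame displacement into the world frame using a heading from, say, a gyro or compass. A hedged sketch, not part of the original code:

```python
import math

def update_position(x, y, dx, dy, heading_rad):
    # Rotate the mouse-frame displacement (dx, dy) into the world frame
    # before accumulating; heading_rad comes from a separate sensor
    world_dx = dx * math.cos(heading_rad) - dy * math.sin(heading_rad)
    world_dy = dx * math.sin(heading_rad) + dy * math.cos(heading_rad)
    return x + world_dx, y + world_dy

# A forward step of 1 unit while facing 90 degrees moves the robot along +Y
x, y = update_position(0.0, 0.0, 1.0, 0.0, math.pi / 2)
print(round(x, 6), round(y, 6))  # 0.0 1.0
```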

  2. Hi, I wonder if it is possible to use two mice together? If so, how does Python identify each mouse? With two mice it would be possible to calculate the position of the robot using dead reckoning. Thank you very much in advance.

    1. Hi, I don’t see any problem with using two mice together. However, it needs testing.
      “/dev/input/mice” provides a single merged interface for multiple mice; it exists for hotplugging purposes. Please try using the files below, which are meant for individual mice, in the same way. I will test it when I have time; update me if you make progress.

      crw-r--r-- 1 root root 13, 32 Mar 28 22:45 mouse0
      crw-r--r-- 1 root root 13, 33 Mar 29 00:41 mouse1
      crw-r--r-- 1 root root 13, 34 Mar 29 00:41 mouse2
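A sketch of reading one packet from such a per-device node; each open file gives an independent stream, so two mice can be polled separately (the device paths and numbering are assumptions to verify):

```python
import struct

def read_delta(dev):
    # Read one 3-byte PS/2 packet from an open mouse stream, return (dx, dy)
    buf = dev.read(3)
    return struct.unpack("bb", buf[1:3])

# Hypothetical usage with individual device nodes:
#   left  = open("/dev/input/mouse0", "rb")
#   right = open("/dev/input/mouse1", "rb")
#   dx_l, dy_l = read_delta(left)
#   dx_r, dy_r = read_delta(right)
```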

    1. Hello Kent, the accuracy was fairly sufficient for my use case. However, it depends on the surface you use it on and how well the mouse is mounted. What I remember is that it was accurate to within a few centimetres, but again many other factors matter. I would be interested in testing a “trackball mouse” on different surfaces.
