Friday, February 11, 2011

The Eyes, They be Moving!

Here's a short video that shows the eyes on Exuro moving. You can see Mac moving around on the screen in the lower right of the video and the eyes moving to track the closest part of him. I'll post more details soon but right now I'm just psyched that they're moving and tracking pretty well!

Sunday, January 23, 2011

Kinect, Python and Pushing Through

There's a tangent project that I'm working on that involves robotics, Arduinos, a Kinect and fire. It's a small robot called Exuro with a pair of stainless steel eyes that are meant to track a person coming up to a donation box and, when they make a donation, set off a small poofer. The idea is to track people using a Kinect and have the eyes move as if they're watching the closest person to the donation box. Working with an Arduino to control external systems is pretty straightforward for me; it's something that I've done before. But pulling sensor data from something like a Kinect and interpreting it is something I've never done. It's rather intimidating. Processing video data at something like 30 frames per second is not something I'm used to doing. But it sounds like fun!

There's an open source driver for accessing the Kinect called libfreenect that's available from the OpenKinect project. Included are wrappers for using the library from Python, which is most definitely my preferred programming language. That works.

Getting libfreenect to build on an Ubuntu 10.10 system was pretty straightforward: just follow the instructions in the README.asciidoc file. Getting the Python wrappers to work took a bit more effort. cython is used to create the bindings between libfreenect and Python. Unfortunately, the version that's currently included with Ubuntu 10.10 isn't up to the task. Once I removed the Ubuntu package and installed cython from the latest source, the Python bindings built and worked just fine. I'm sure the fine folks maintaining Ubuntu will make a newer version available at some point, I'm just not willing to put this project on hold till they do ;-)

There are a few demo files included with the wrapper so you can start to play with the interface, the library and the Kinect data. Two of them make for a good demo. The first opens two windows and shows a live video feed of the RGB camera in one and the depth camera in the other. The other shows a video of the depth camera but sweeps through the data, showing what's seen at different depths. These are really interesting demos to help wrap your head around what's available from the Kinect.

I got to wondering about the depth data and whether there was a way to combine the two demos to slide through the depth manually and see what's there. The result is a new demo that allows you to slide along at any depth to see what's there and then contract or expand the view to see what's around that depth. Here's a sample video showing my hand pushing through a virtual wall:

The depth slider sets the focal point for what data to display and the threshold provides a +/- tolerance for how much data to display. A depth of 535 and a threshold of 0 would show just the data at 535, while a depth of 535 and a threshold of 6 would show the data from 529 through 541.
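For anyone curious what that windowing might look like in code, here's a minimal sketch using numpy; the function name and the tiny fake frame are mine, not from the demo:

```python
import numpy as np

def depth_window(frame, depth, threshold):
    """Return a boolean mask of pixels within depth +/- threshold.

    `frame` stands in for a 2-D array of raw Kinect depth values,
    like the ones the libfreenect Python wrapper hands back.
    """
    return (frame >= depth - threshold) & (frame <= depth + threshold)

# A tiny fake frame instead of live Kinect data:
frame = np.array([[520, 535, 541],
                  [529, 600, 528]])
mask = depth_window(frame, 535, 6)   # keeps values from 529 through 541
```

The mask can then be used to black out everything outside the chosen depth band before displaying the frame.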

It's an interesting application to play with to gain a basic understanding of the data being returned and possible ways to use it. I've submitted a pull request on github to the maintainers of libfreenect to see if they're willing to include it in the next release. Here's hoping that they will.

There's a lot more work I need to do for this project. The next steps will be to find the closest person in the data stream and calculate their location in the real world relative to the location of the Kinect. And I have almost no idea how to go about doing that. Time to read up on numpy and opencv...
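As a first stab at the "closest" part, something like this numpy sketch might work. It assumes, on my part, that smaller raw depth values are closer and that 2047 is the sentinel for pixels with no reading in the 11-bit data:

```python
import numpy as np

def closest_point(frame, invalid=2047):
    """Return ((row, col), depth) of the nearest valid pixel.

    Pixels equal to `invalid` have no reading, so they're replaced
    with a huge value before taking the minimum.
    """
    masked = np.where(frame == invalid, np.iinfo(np.int32).max, frame)
    idx = np.unravel_index(np.argmin(masked), masked.shape)
    return idx, int(frame[idx])

# A tiny fake frame standing in for a 640x480 Kinect image:
frame = np.array([[2047,  700],
                  [ 650, 2047]])
(row, col), depth = closest_point(frame)
```

Turning that pixel into a real-world location relative to the Kinect is the part I still need to figure out.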

Thursday, May 13, 2010

OS X and the mystery bits

Sometimes I have to learn a lesson over and over again before I get a glimmer of what's going on. I spent some time again today trying to get the MySQL client library module for Python working. I need it to be able to talk w/ a MySQL database for an online store. I'd rather use PostgreSQL but in this case, I don't have a choice.

I downloaded the MySQL OS X package and ran the installation program. It did what it was supposed to: install the software under /usr/local. The MySQL client runs and all seems right with the world. I download and install the MySQL Python client module and it builds as expected. But when I try to run Django, I get this error:

django.core.exceptions.ImproperlyConfigured: Error loading MySQLdb module: dynamic module does not define init function (init_mysql)

Grumble, grumble, grumble.

I try to load the MySQL Python module directly:

peter@drag:~> python
Python 2.6.1 (r261:67515, Feb 11 2010, 00:51:29)
[GCC 4.2.1 (Apple Inc. build 5646)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import _mysql
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "build/bdist.macosx-10.6-universal/egg/", line 7, in <module>
  File "build/bdist.macosx-10.6-universal/egg/", line 6, in __bootstrap__
ImportError: dynamic module does not define init function (init_mysql)

I checked the build output. Checked the install output. Everything looks fine. This should work. Searching the web doesn't help much since the references I find are from 2007 and 2008. The patches that they describe have already been incorporated into the MySQL Python module.

About this time, I realize that I've solved this problem before. It has to do with bits. I run:

peter@drag:~> python
Python 2.6.1 (r261:67515, Feb 11 2010, 00:51:29)
[GCC 4.2.1 (Apple Inc. build 5646)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import platform
>>> platform.architecture()
('64bit', '')

I look at the original MySQL disk image that I downloaded and see that it's a 32-bit application. Mixing a 32-bit dynamic library with a 64-bit executable doesn't work.

After downloading the 64-bit MySQL disk image, installing it and rebuilding the MySQL Python module, everything works. Now it's on to writing code for the online store. Yea.

Sunday, March 28, 2010

Bad data and the wonder of RegEx

Regular expressions aren't something that I use regularly but I think that's about to change. I came up in a programming world where C, Cobol and Fortran were just the best thing, considering that the alternative was writing assembler. And in that world, regular expressions didn't really exist. The basic toolkit I developed internally for solving programming problems didn't include them. To this day, I sometimes still think about iterating over a character sequence in memory until a NULL is found. But in the modern programming world, languages like Python have tools built into the language and standard libraries that provide a lot of help to a developer. One of those tools is regular expression matching.

I'm currently working on a project to convert a FileMaker 5.5 database to SalesForce. Inside the FileMaker database, the data is in very strange and odd states because there was no data validation done. So a date field can contain strings like '1/1/01', '6/24/2007', '9-1-2005', '2/27' or 'Jan 1.' These different formats cause a world of grief in trying to move the data to a database that expects a date to be structured, something like 1/1/2001. It doesn't know about the various formats of our dates and rejects them as invalid. So the dates had to be filtered into something more standard.

I started to write some code that searched the strings for '/' and '-' and the text names of months. Naturally, it quickly became a rat's nest of nested ifs and conditions that made understanding the code very cumbersome. So I went looking for another way to solve the problem. I remembered that Python has a module called re that provides regular expression processing, but I hadn't really used it and wasn't sure how it would work. So I started searching the web for some help on using regular expressions and the re module. What I found was just wonderful. A.M. Kuchling wrote up the fantastic Regular Expression HOWTO. In almost no time flat I was starting to put together a regular expression that would match most of the permutations of dates (those based on '1/1/01') that I've seen in our database and a second that would match the ones with the months written out. The quickest way I found to test the regular expressions was with Python's unittest module. I would edit the regular expression, run the unittests, re-edit, re-run, etc. until it worked and all the tests passed.

The code that I developed looks like this:

import unittest
import re

class TestDateRegEx(unittest.TestCase):
    def test_slashes(self):
        slashes = re.compile(r'(\d{1,2})[/-](\d{1,2})[/-](\d{2,4})$')

        self.assertEqual(slashes.match('1/1/10').group(), '1/1/10')
        self.assertEqual(slashes.match('1/1/10').group(1), '1')
        self.assertEqual(slashes.match('1/1/10').group(2), '1')
        self.assertEqual(slashes.match('1/1/10').group(3), '10')
        self.assertEqual(slashes.match('01/1/10').group(), '01/1/10')
        self.assertEqual(slashes.match('11/1/10').group(), '11/1/10')
        self.assertEqual(slashes.match('11/13/10').group(), '11/13/10')
        self.assertEqual(slashes.match('1/1/2010').group(), '1/1/2010')
        self.assertEqual(slashes.match('12/1/2010').group(), '12/1/2010')
        self.assertEqual(slashes.match('1/21/2010').group(), '1/21/2010')
        self.assertEqual(slashes.match('1/21/2010').group(1), '1')
        self.assertEqual(slashes.match('1/21/2010').group(2), '21')
        self.assertEqual(slashes.match('1/21/2010').group(3), '2010')

        self.assertEqual(slashes.match('111/1/10'), None)
        self.assertEqual(slashes.match('11/111/10'), None)
        self.assertEqual(slashes.match('11/11/10100'), None)

        self.assertEqual(slashes.match('1-1-10').group(), '1-1-10')
        self.assertEqual(slashes.match('01-1-10').group(), '01-1-10')
        self.assertEqual(slashes.match('11-1-10').group(), '11-1-10')
        self.assertEqual(slashes.match('11-13-10').group(), '11-13-10')
        self.assertEqual(slashes.match('1-1-2010').group(), '1-1-2010')
        self.assertEqual(slashes.match('12-1-2010').group(), '12-1-2010')
        self.assertEqual(slashes.match('1-21-2010').group(), '1-21-2010')
        self.assertEqual(slashes.match('1-21-2010').group(1), '1')
        self.assertEqual(slashes.match('1-21-2010').group(2), '21')
        self.assertEqual(slashes.match('1-21-2010').group(3), '2010')

        self.assertEqual(slashes.match('111-1-10'), None)
        self.assertEqual(slashes.match('11-111-10'), None)
        self.assertEqual(slashes.match('11-11-10100'), None)

        self.assertEqual(slashes.match('1'), None)
        self.assertEqual(slashes.match('1/'), None)
        self.assertEqual(slashes.match('1/1'), None)
        self.assertEqual(slashes.match('Jan 1'), None)
        self.assertEqual(slashes.match('Jan 1, 2010'), None)

    def test_names(self):
        names = re.compile(r'(jan|feb|mar|apr|may|jun|jul|aug|sep|oct|nov|dec)\s(\d{1,2})(,\s(\d{2,4}))?$', re.IGNORECASE)

        self.assertEqual(names.match('Jan 1').group(), 'Jan 1')
        self.assertEqual(names.match('Jan 1').group(1), 'Jan')
        self.assertEqual(names.match('Jan 1').group(2), '1')
        self.assertEqual(names.match('feb 21').group(), 'feb 21')
        self.assertEqual(names.match('feb 21').group(1), 'feb')
        self.assertEqual(names.match('feb 21').group(2), '21')
        self.assertEqual(names.match('Jan 1, 2010').group(), 'Jan 1, 2010')
        self.assertEqual(names.match('Jan 1, 2010').group(1), 'Jan')
        self.assertEqual(names.match('Jan 1, 2010').group(2), '1')
        self.assertEqual(names.match('Jan 1, 2010').group(4), '2010')
        self.assertEqual(names.match('MAR 14, 2010').group(), 'MAR 14, 2010')
        self.assertEqual(names.match('MAR 14, 2010').group(1), 'MAR')
        self.assertEqual(names.match('MAR 14, 2010').group(2), '14')
        self.assertEqual(names.match('MAR 14, 2010').group(4), '2010')

        self.assertEqual(names.match('MA 20, 2010'), None)
        self.assertEqual(names.match('xyz 2,'), None)
        self.assertEqual(names.match('jan 2,'), None)

if __name__ == '__main__':
    unittest.main()

The use of regular expressions and the ability to match and group the matches makes the code needed to clean up the dates much simpler and a lot easier to maintain. It might take a moment next time I need to parse strings to remember to think about regular expressions, and I'm pretty sure I'll have to refer to the HOWTO a couple more times, but I'm really happy with how powerful they are and how much simpler text processing can be using them.
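To sketch where this goes next, here's how the groups from the slashes pattern could drive the actual cleanup. The normalize function is mine, and mapping two-digit years to 20xx is an assumption that happens to fit this data set:

```python
import re

slashes = re.compile(r'(\d{1,2})[/-](\d{1,2})[/-](\d{2,4})$')

def normalize(text):
    """Turn a matched m/d/y string into ISO yyyy-mm-dd, or None.

    Assumes two-digit years belong to the 2000s -- fine for this
    database, but an assumption all the same.
    """
    m = slashes.match(text)
    if not m:
        return None
    month, day, year = (int(g) for g in m.groups())
    if year < 100:
        year += 2000
    return '%04d-%02d-%02d' % (year, month, day)
```

Anything normalize returns None for, like the 'Jan 1' style dates, would fall through to the second pattern for another try.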

Friday, October 16, 2009

Web service monitoring w/ Nagios and JSON

I'm using Nagios to act as a watchdog for my network and the various services that live on it. Nagios does the job pretty well. It lets me know when there's a problem, when things are back to normal and generally keeps an eye on things for me.

The checks that Nagios performs are done through a series of check commands. These commands are your typical Unix-style programs, with the exceptions that they produce a single line of text describing the state of the item being checked and that their exit value lets Nagios know what's going on.

So for instance, to check the health of the web service on the localhost:

peter@sybil:~$ /usr/lib/nagios/plugins/check_http -H localhost
HTTP OK HTTP/1.1 200 OK - 361 bytes in 0.001 seconds |time=0.001021s;;;0.000000 size=361B;;;0
peter@sybil:~$ echo $?
0

The single line of text that is displayed follows a specific format. It starts with a prefix naming what's being tested, HTTP. Next is the status, OK. This can be OK, WARNING, CRITICAL or UNKNOWN. Everything after the status is eye candy that provides details specific to the test being done. Nagios doesn't really care about it, but it does provide important information when looking into problems that may be occurring.

Writing these check programs in Python is pretty straightforward.
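As a sketch of the convention, here's a hypothetical check that reports the one-minute load average. The name and thresholds are made up, but the shape is what Nagios expects: one status line on stdout plus the matching exit code:

```python
import os
import sys

# Nagios reads the exit code: 0 OK, 1 WARNING, 2 CRITICAL, 3 UNKNOWN.
OK, WARNING, CRITICAL, UNKNOWN = range(4)

def check_load(warn=4.0, crit=8.0):
    """Report the one-minute load average -- a made-up example check."""
    load = os.getloadavg()[0]
    if load >= crit:
        return CRITICAL, 'LOAD CRITICAL - load average: %.2f' % load
    if load >= warn:
        return WARNING, 'LOAD WARNING - load average: %.2f' % load
    return OK, 'LOAD OK - load average: %.2f' % load

def main():
    # Print the single status line and hand the exit code back to Nagios.
    status, message = check_load()
    print(message)
    sys.exit(status)
```

Calling main() from a script wrapper prints the line and exits with the status, which is all Nagios needs.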

I recently had a situation where our ISP moved our web servers from one physical machine to another. This caused the credit card processing for our online store to fail. The payment provider uses the IP address of the server as part of the authentication process when submitting credit cards for processing. Since the server changed, the IP address changed. Things went around in circles for a while until we figured out the problem and gave the new IP address to the payment provider.

I thought it would be a good additional Nagios check for the store web site to check on the IP address of the physical server. Unfortunately, the ISP doesn't provide access to the IP address. But they do provide access to the hostname.

To get the hostname, I added a simple CGI program that determines the hostname and then packages it up as a JSON data structure.

#!/usr/bin/env python

"""Bundle the hostname up as a JSON data structure.

Copyright (c) 2009 Peter Kropf. All rights reserved.
"""

import cgi
import popen2
import sys
sys.path.insert(1, '/home/crucible/tools/lib/python2.4/site-packages')
sys.path.insert(1, '/home/crucible/tools/lib/python2.4/site-packages/simplejson-2.0.9-py2.4-linux-x86_64.egg')

import simplejson as json

field = cgi.FieldStorage()
print "Content-Type: application/json\n\n"

r, w, e = popen2.popen3('hostname')
host = r.readline()

fields = {'hostname': host.split('\n')[0]}

print json.dumps(fields)

There are a couple of things to note. Since the ISP provides a very restrictive environment, I have to add the location of the simplejson module to sys.path before it can be imported. It's a bit annoying but it does work.
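The other thing worth noting is that popen2 was deprecated in later Python releases in favor of subprocess. An equivalent sketch of the hostname lookup (the function name is mine):

```python
import json
import subprocess

def hostname_json():
    """Build the same JSON payload using subprocess instead of popen2."""
    out = subprocess.Popen(['hostname'], stdout=subprocess.PIPE).communicate()[0]
    if isinstance(out, bytes):                 # newer Pythons return bytes
        out = out.decode('ascii', 'replace')
    # strip() drops the trailing newline, like host.split('\n')[0] above
    return json.dumps({'hostname': out.strip()})
```

The CGI program would just print the Content-Type header followed by this string.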

On the Nagios service side, I created a new check program called check_json. It takes the name of a field, the expected value and the URI from which to pull the JSON data.

#! /usr/bin/env python

"""Nagios plugin to check a value returned from a uri in json format.

Copyright (c) 2009 Peter Kropf. All rights reserved.

Example: compare the "hostname" field in the json structure returned
from the uri against a known value.

./check_json hostname buenosaires
"""

import urllib2
import simplejson
import sys
from optparse import OptionParser

prefix = 'JSON'

class nagios:
    ok       = (0, 'OK')
    warning  = (1, 'WARNING')
    critical = (2, 'CRITICAL')
    unknown  = (3, 'UNKNOWN')

def exit(status, message):
    print prefix + ' ' + status[1] + ' - ' + message
    sys.exit(status[0])

parser = OptionParser(usage='usage: %prog field_name expected_value uri')
options, args = parser.parse_args()

if len(args) < 3:
    exit(nagios.unknown, 'missing command line arguments')

field = args[0]
value = args[1]
uri   = args[2]

try:
    j = simplejson.load(urllib2.urlopen(uri))
except urllib2.HTTPError, ex:
    exit(nagios.unknown, 'invalid uri')

if field not in j:
    exit(nagios.unknown, 'field: ' + field + ' not present')

if j[field] != value:
    exit(nagios.critical, j[field] + ' != ' + value)

exit(nagios.ok, j[field] + ' == ' + value)

Some checking is done to ensure that the JSON data can be retrieved, that the needed field is in the data and then that the field's value matches what's expected.

These examples show the basic testing that's done and the return values:

peter@sybil:~$ /usr/lib/nagios/plugins/check_json hostname buenosaires
JSON OK - buenosaires == buenosaires
peter@sybil:~$ echo $?
0
peter@sybil:~$ /usr/lib/nagios/plugins/check_json hostname buenosaires
JSON UNKNOWN - invalid uri
peter@sybil:~$ echo $?
3
peter@sybil:~$ /usr/lib/nagios/plugins/check_json hostname buenosairs
JSON CRITICAL - buenosaires != buenosairs
peter@sybil:~$ echo $?
2
peter@sybil:~$ /usr/lib/nagios/plugins/check_json ostname buenosaires
JSON UNKNOWN - field: ostname not present
peter@sybil:~$ echo $?
3

Once the Nagios server is configured with the new command, the hostname on the server can be monitored, which will hopefully ease any problems that may occur the next time things change at the ISP.

More details on Nagios, and on developing check programs, can be found in the Nagios documentation.

Monday, June 01, 2009

Running External Django Scripts

Django is pretty good at creating a database driven website. The documentation is clear and the tutorials show how to use the framework to create web based applications. But one part that I wish was a bit more straightforward is running scripts outside the web server. The issue is that Django code expects to have a certain environment configured and set up for the framework. With this in place, you can perform tasks like polling an IMAP server for incoming email messages or monitoring a directory for new files or whatever else needs to be done. There are several posts online to help you get the environment set up, but some of them seem not to work correctly because of the changes to Django for the 1.0 release or other reasons.

I have a fairly straightforward example of how to set up the Django environment and allow the rest of your code to access the Django framework for your web application. It's remarkably simple.

Suppose that I've created a Django project in my tmp directory called demo_scripts and within that project, I create an app called someapp.

peter@fog:~/tmp> django-admin.py startproject demo_scripts
peter@fog:~/tmp> cd demo_scripts/
peter@fog:~/tmp/demo_scripts> ./manage.py startapp someapp

I create a model in someapp that looks like:

from django.db import models

class Foo(models.Model):
    name = models.CharField(max_length=21,
                            help_text="Name of the foo.")

    def __unicode__(self):
        return self.name

    class Meta:
        ordering = ('name',)

Next step is to sync the database:

peter@fog:~/tmp/demo_scripts> ./manage.py syncdb
Creating table auth_permission
Creating table auth_group
Creating table auth_user
Creating table auth_message
Creating table django_content_type
Creating table django_session
Creating table django_site
Creating table someapp_foo

You just installed Django's auth system, which means you don't have any superusers defined.
Would you like to create one now? (yes/no): yes
Username (Leave blank to use 'peter'):
E-mail address:
Password:
Password (again):
Superuser created successfully.
Installing index for auth.Permission model
Installing index for auth.Message model

And add some initial data to the database:

peter@fog:~/tmp/demo_scripts> ./manage.py shell
Python 2.5.4 (r254:67916, May 1 2009, 17:14:50)
[GCC 4.0.1 (Apple Inc. build 5490)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> from someapp.models import Foo
>>> Foo(name='A Foo').save()
>>> Foo(name='Another Foo').save()

Now we can write a standalone script to do something with the data model. For simplicity's sake, I'll just print out all the Foo objects. The script is going to live in a new directory called scripts. Here's the source:

#! /usr/bin/env python

import sys
import os
import datetime

sys.path.insert(0, os.path.expanduser('~/tmp/demo_scripts'))
os.environ['DJANGO_SETTINGS_MODULE'] = 'settings'

from someapp.models import *
print Foo.objects.all()

When I run the script, it prints the array of the two Foo objects that I previously created:

peter@fog:~/tmp/demo_scripts> ./scripts/ 
[<Foo: A Foo>, <Foo: Another Foo>]

The sys.path.insert and os.environ lines are the critical ones in the script. The first adds the project directory to the Python system path so that the settings module can be found. The second tells the Django code which module to import to determine the project settings.

Wednesday, May 27, 2009

Django Google Apps Authentication

Django has an excellent user management and authentication system built into the framework. With it you can easily create users that can be authenticated against the website. But there are times when you just need to authenticate against a different system. In the case of an app I recently developed, I originally wanted to authenticate against an OS X Server. The OpenDirectory service on OS X Server is an LDAP server; under the hood you'll find slapd from OpenLDAP running. So it should be pretty straightforward to create an authentication module that uses Python's LDAP module. And this article from the Carthage WebDev site shows you how to do it.

After I got the module working on my site, I realized the site would be better served if the authentication happened against Google Apps. Since Google Apps is currently being used by the organization for email, calendaring and sharing documents, everyone already has an account there. And with the module from Carthage WebDev as a starting point, I thought it would be pretty simple to write a Google Apps version.

To get started, I had to install gdata. The installation instructions found on the Google Apps APIs page were pretty easy to follow. Specifically, I had to install the Provisioning API.

On a side note, I'm using Python 2.5 as installed via MacPorts. Before I could use the gdata APIs, I had to install py25-socket-ssl.

The APIs are pretty well documented via the examples from the Python Developer's Guide. Here's how I'm authenticating a Django project with users on Google Apps.

To start, there are three configuration variables that I added to the Django project's settings module:

# Google Apps Settings
GAPPS_DOMAIN = 'your_google_apps_domain'
GAPPS_USERNAME = 'name_of_an_admin_user'
GAPPS_PASSWORD = 'admin_users_password'

These will allow the module to authenticate against Google Apps and ask for specific details about the user.

Here's the code for the authentication backend:

import logging
from django.contrib.auth.models import User
from django.conf import settings
from gdata.apps.service import AppsService, AppsForYourDomainException
from gdata.docs.service import DocsService
from gdata.service import BadAuthentication


class GoogleAppsBackend:
    """ Authenticate against Google Apps """

    def authenticate(self, username=None, password=None):
        logging.debug('GoogleAppsBackend.authenticate: %s - %s' % (username, '*' * len(password)))
        admin_email = '%s@%s' % (settings.GAPPS_USERNAME, settings.GAPPS_DOMAIN)
        email = '%s@%s' % (username, settings.GAPPS_DOMAIN)

        try:
            # Check the user's password
            logging.debug('GoogleAppsBackend.authenticate: gdocs')
            gdocs = DocsService()
            gdocs.email = email
            gdocs.password = password
            gdocs.ProgrammaticLogin()

            # Get the user object
            logging.debug('GoogleAppsBackend.authenticate: gapps')
            gapps = AppsService(email=admin_email, password=settings.GAPPS_PASSWORD, domain=settings.GAPPS_DOMAIN)
            gapps.ProgrammaticLogin()
            guser = gapps.RetrieveUser(username)

            logging.debug('GoogleAppsBackend.authenticate: user - %s' % username)
            user, created = User.objects.get_or_create(username=username)

            if created:
                logging.debug('GoogleAppsBackend.authenticate: created')
                user.email = email
                user.last_name = guser.name.family_name
                user.first_name = guser.name.given_name
                user.is_active = not guser.login.suspended == 'true'
                user.is_superuser = guser.login.admin == 'true'
                user.is_staff = True
                user.save()

        except BadAuthentication:
            logging.debug('GoogleAppsBackend.authenticate: BadAuthentication')
            return None

        except AppsForYourDomainException:
            logging.debug('GoogleAppsBackend.authenticate: AppsForYourDomainException')
            return None

        return user

    def get_user(self, user_id):
        user = None
        try:
            user = User.objects.get(pk=user_id)
        except User.DoesNotExist:
            logging.debug('GoogleAppsBackend.get_user - DoesNotExist')
            return None

        return user

It was pretty easy to write and debug this code using the LDAP module from the Carthage article as a working example.

One downside to this code is that any newly created users in the Django auth database don't have any rights. So if the Django project is expecting to be able to dynamically change the contents based on the rights that the user has, the account will have to be manually modified via the Django admin interface. Not too bad, but annoying.