Art 483 | Fundamentals of Interface Design | Winter 2007

The Living Robot

DESCRIPTION

The Living Robot is a simple simulation of a robot that uses seven people linked together in a group, each acting as a different component of the robot by following a precise set of instructions. The simulation aims to bring some of the basic principles of software engineering to life and to give non-technical people a feeling for what is involved in designing software to control robots.

When the independent instructions of each component are followed properly and precisely, the robot (a composite of the seven people) performs the desired actions. In this way, the participants can comprehend the resulting behavior of the whole robot as an integrated set of interdependent smaller actions.

Here is a schematic of the robot's configuration:


COMPONENTS

Here are the functions of the seven components:

  • BRAIN: controls all operations and actions of the other components
  • EYES: sense the intensity of the ambient light in front
  • LEFT ARM: senses physical contact with an object on the left side
  • RIGHT ARM: senses physical contact with an object on the right side
  • LEFT LEG: propels the left side of the robot forward or backward
  • RIGHT LEG: propels the right side of the robot forward or backward
  • TAIL: signals with rear-facing lamp

Using this configuration, we specify a behavior as follows:

The robot moves steadily forward until a light is shined on it, at which point it stops. While moving, if it is bumped on the right side, it turns to the left and turns on the tail lamp for the duration of the turn. Similarly, if it is bumped on the left side, it turns to the right, again lighting the lamp during the turn.

To accomplish this, the brain must synchronize the activities of all of the other components. It continuously monitors the information arriving from the input sensors (the eyes, left arm, and right arm), processes that information, and then directs the activities of the action components (the left leg, right leg, and tail lamp).
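As a rough sketch of that monitor-process-direct cycle, here is a short, purely illustrative Python fragment (not part of the exercise itself); the decide() function shows only the bright-light rule, and the full rule set is given by the spoken instructions in the next section:

    # Purely illustrative sketch of the brain's monitor-process-direct cycle.
    # Only the bright-light rule is shown; the full rules appear in the tables below.

    def decide(reports):
        """Turn a list of sensor reports into a list of spoken commands."""
        commands = []
        if "BRIGHT LIGHT" in reports:
            commands += ["LAMP OFF", "LEFT LEG STOP", "RIGHT LEG STOP"]
        return commands

    def brain_loop(read_sensors, announce, cycles=3):
        """Repeatedly gather sensor reports, decide, and announce commands."""
        for _ in range(cycles):
            for command in decide(read_sensors()):
                announce(command)

    # Demo with faked input: the eyes report a bright light on the second cycle.
    fake_inputs = iter([[], ["BRIGHT LIGHT"], []])
    brain_loop(lambda: next(fake_inputs), print)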


INSTRUCTIONS

Here are the instructions for each component of the Living Robot:

Component   WHEN YOU...           DO THIS...
LEFT ARM    contact anything      say LEFT CONTACT
RIGHT ARM   contact anything      say RIGHT CONTACT
EYES        see a bright light    say BRIGHT LIGHT
LEFT LEG    hear LEFT LEG GO      walk slowly forward
            hear LEFT LEG STOP    stop walking
RIGHT LEG   hear RIGHT LEG GO     walk slowly forward
            hear RIGHT LEG STOP   stop walking
TAIL        hear LAMP ON          turn on light
            hear LAMP OFF         turn off light
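For readers who like to see these rules as data, here is a small, purely illustrative Python sketch that encodes the table above as a lookup from (component, trigger) to response; the names here are invented for the illustration and are not part of the exercise:

    # Illustrative encoding of the component table as a lookup.
    # Each entry reads: WHEN the component perceives <trigger>, it DOES <response>.
    COMPONENT_RULES = {
        ("LEFT ARM",  "contact anything"):    "say LEFT CONTACT",
        ("RIGHT ARM", "contact anything"):    "say RIGHT CONTACT",
        ("EYES",      "see a bright light"):  "say BRIGHT LIGHT",
        ("LEFT LEG",  "hear LEFT LEG GO"):    "walk slowly forward",
        ("LEFT LEG",  "hear LEFT LEG STOP"):  "stop walking",
        ("RIGHT LEG", "hear RIGHT LEG GO"):   "walk slowly forward",
        ("RIGHT LEG", "hear RIGHT LEG STOP"): "stop walking",
        ("TAIL",      "hear LAMP ON"):        "turn on light",
        ("TAIL",      "hear LAMP OFF"):       "turn off light",
    }

    def react(component, trigger):
        """Return what a component does when it perceives the given trigger."""
        return COMPONENT_RULES.get((component, trigger), "do nothing")

    print(react("RIGHT ARM", "contact anything"))   # -> say RIGHT CONTACT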


The brain is a little more complicated because it has to orchestrate many actions based on input from the other components:

Component   WHEN YOU...               DO THIS...
BRAIN       want to start the robot   say LAMP OFF
                                      say LEFT LEG GO
                                      say RIGHT LEG GO
            hear BRIGHT LIGHT         say LAMP OFF
                                      say LEFT LEG STOP
                                      say RIGHT LEG STOP
            hear LEFT CONTACT         say LAMP ON
                                      say RIGHT LEG STOP
                                      wait one second
                                      say RIGHT LEG GO
                                      say LAMP OFF
            hear RIGHT CONTACT        say LAMP ON
                                      say LEFT LEG STOP
                                      wait one second
                                      say LEFT LEG GO
                                      say LAMP OFF
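The brain's table can be sketched the same way. This purely illustrative Python fragment maps each message the brain hears to its ordered list of spoken commands; the "start the robot" key and the printing of "wait one second" are conveniences of the illustration, not part of the exercise:

    # Illustrative encoding of the brain's table: each heard message maps to
    # an ordered list of spoken commands. "wait one second" is printed here
    # so the ordering stays visible; a real run would pause instead.
    BRAIN_RULES = {
        "start the robot": ["say LAMP OFF", "say LEFT LEG GO", "say RIGHT LEG GO"],
        "BRIGHT LIGHT":    ["say LAMP OFF", "say LEFT LEG STOP", "say RIGHT LEG STOP"],
        "LEFT CONTACT":    ["say LAMP ON", "say RIGHT LEG STOP", "wait one second",
                            "say RIGHT LEG GO", "say LAMP OFF"],
        "RIGHT CONTACT":   ["say LAMP ON", "say LEFT LEG STOP", "wait one second",
                            "say LEFT LEG GO", "say LAMP OFF"],
    }

    def brain_hears(message):
        """Print the brain's ordered responses to one heard message."""
        for step in BRAIN_RULES.get(message, []):
            print(step)

    brain_hears("LEFT CONTACT")   # lamp goes on and the robot turns to the right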

A printer-friendly version of these instructions can be found here.


IMPLEMENTATION

For the microprocessor version of the simulation, a LEGO Mindstorms robot is constructed and a software program is written to implement the behavior. The robot has two touch sensors (for the left and right arms) and a light sensor representing the eyes. Two motors are used for the left and right legs, and a simple bulb is used for the tail lamp.
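As a rough, Python-flavored sketch of the control logic that the Robolab program expresses (the sensor and motor functions below are hypothetical stubs standing in for the real hardware calls, and the light threshold is an assumed value, not taken from the actual program):

    import time

    LIGHT_THRESHOLD = 55   # assumed reading that counts as "bright"; not from the real program

    def read_touch(side):
        """Stub for a touch sensor on the left or right arm."""
        return False

    def read_light():
        """Stub for the light sensor representing the eyes."""
        return 40

    def motor(side, running):
        """Stub for driving one leg motor."""
        print(side, "leg", "GO" if running else "STOP")

    def lamp(on):
        """Stub for the tail lamp."""
        print("lamp", "ON" if on else "OFF")

    def run(cycles):
        lamp(False)
        motor("LEFT", True)
        motor("RIGHT", True)                       # start: lamp off, both legs go
        for _ in range(cycles):
            if read_light() > LIGHT_THRESHOLD:     # bright light: stop both legs
                lamp(False)
                motor("LEFT", False)
                motor("RIGHT", False)
                break
            if read_touch("LEFT"):                 # bumped on the left: turn right
                lamp(True)
                motor("RIGHT", False)
                time.sleep(1)
                motor("RIGHT", True)
                lamp(False)
            if read_touch("RIGHT"):                # bumped on the right: turn left
                lamp(True)
                motor("LEFT", False)
                time.sleep(1)
                motor("LEFT", True)
                lamp(False)
            time.sleep(0.1)

    run(cycles=5)   # short demo run with the stub sensors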

Here are some photos of an implementation of this design on an older-style RCX robot:

You can click on any of these to see a large version in a separate window.

Here is a picture of a Robolab program that implements this design:

You can click on this to see a large version in a separate window.

And finally, you can find the actual Robolab program here.

OK, I'm lying, for now. This will be an assignment later in the course, so I'm not putting it online until after you've done it.