SAIL Automatic Mode

The SAIL robot can be launched in automatic mode simply by clicking the 'Alive' button, although it won't perform any tasks until one of the buttons on the button bar at the bottom of the screen is pressed. Pressing a function button causes the communication software associated with that function to be loaded into memory; the 'All' button can be pressed to load all functions of the robot simultaneously. The only feedback available to the user in automatic mode (other than visually watching the robot perform and respond) is the text displayed in the message window. This is done to reduce the processing load of the user interface, freeing more processing power for the robot's functionality. The text in the message window can be saved to a file or printed for diagnostic use. If more interface feedback is desired for troubleshooting, the robot should be operated in manual mode.
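The actual software interface is not documented on this page, but the load-on-demand and message-window behavior described above can be sketched roughly as follows. Everything in this sketch (the SailConsole name, its methods, and the way handlers are represented) is hypothetical and only illustrates the idea.

import datetime

class SailConsole:
    """Illustrative stand-in for the automatic-mode console."""

    def __init__(self):
        self.loaded = {}      # function name -> communication handler
        self.messages = []    # text shown in the message window

    def press_function_button(self, name):
        # Pressing a button-bar button loads that function's communication
        # software into memory (represented here by a placeholder handler).
        self.loaded.setdefault(name, lambda: None)
        self.log(name + " communication software loaded")

    def log(self, text):
        # Feedback is limited to text messages to keep interface overhead low.
        stamp = datetime.datetime.now().isoformat(timespec="seconds")
        self.messages.append(stamp + "  " + text)

    def save_messages(self, path):
        # The message window contents can be saved for diagnostic use.
        with open(path, "w") as fh:
            fh.write("\n".join(self.messages))

console = SailConsole()
console.press_function_button("Head")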

The controls available for input and output communication with the robot are described below.

The main screen of the SAIL GUI shows the message window with the button bar along the bottom; each button on that bar is described below.

Alive

This button is used to activate the SAIL robot. If it is not depressed, the robot will not function in automatic mode. It is mutually exclusive with the Asleep mode.

Asleep

This button turns the robot back off. It can be used as an emergency stop because it shuts down communication with the robot entirely. It is mutually exclusive with the Alive mode.
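A minimal sketch of the Alive/Asleep relationship, assuming a single boolean state and a placeholder shutdown routine (none of these names come from the actual SAIL software):

class RobotState:
    def __init__(self):
        self.alive = False            # the robot starts out asleep

    def press_alive(self):
        # Alive activates the robot; Alive and Asleep cannot both hold.
        self.alive = True

    def press_asleep(self):
        # Asleep doubles as an emergency stop: communication with the
        # robot is shut down entirely.
        self.alive = False
        self._shutdown_communication()

    def _shutdown_communication(self):
        pass   # placeholder for closing every link to the robot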

Learn

This button toggles the SAIL learning mode on and off. If it is depressed, the robot will learn as it operates. When it is unpressed, the robot will operate with knowledge already gained but will perform no new learning.

All

This button turns all of the robot's functions on or off at once. To use individual portions of the robot, turn this button off (unpressed) and then depress only the buttons you want to have functional.
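As a sketch only (the set-based bookkeeping below is an assumption, not how the GUI is actually implemented), the relationship between 'All' and the individual function buttons could look like this:

ALL_FUNCTIONS = {"Head", "Listen", "Speak", "Move", "Reach"}

def enabled_functions(all_pressed, pressed_buttons):
    # With 'All' depressed every function runs; otherwise only the
    # individually depressed buttons are functional.
    return set(ALL_FUNCTIONS) if all_pressed else set(pressed_buttons)

print(enabled_functions(True, set()))                # every function
print(enabled_functions(False, {"Head", "Listen"}))  # only Head and Listen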

Head

The head control offers the functionality of the eyes (video input). It also controls the neck (turntable) and a height adjustment (torso). Additionally, the eyes (cameras) are mounted on individual pan-tilt units, which allows them to move independently of one another if desired. There are three separate interfaces at work here: the pan-tilt units, the video input, and the neck-torso adjustment.
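The three interfaces can be pictured roughly as below. The class names and methods are invented for illustration; only the split into pan-tilt units, video input, and neck-torso adjustment comes from the description above.

class PanTiltUnit:
    # One unit per camera, so the two eyes can move independently.
    def point(self, pan_deg, tilt_deg):
        pass   # placeholder for commanding the pan-tilt hardware

class VideoInput:
    def grab_frame(self):
        pass   # placeholder for reading an image from the cameras

class NeckTorso:
    def set_pose(self, turn_deg, height_cm):
        pass   # placeholder for the turntable and height adjustment

class HeadControl:
    # The head control simply ties the three interfaces together.
    def __init__(self):
        self.left_eye = PanTiltUnit()
        self.right_eye = PanTiltUnit()
        self.video = VideoInput()
        self.neck_torso = NeckTorso()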

Listen

The listen control offers the functionality of the ears (audio input). It determines the source of the sound (direction) as well as trying to recognize the word or sound and the speaker.
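The kind of result the listen function is after might be summarized as below; the field names are assumptions made only to show the three pieces of information (direction, word or sound, speaker):

from dataclasses import dataclass
from typing import Optional

@dataclass
class ListenResult:
    direction_deg: float            # estimated bearing of the sound source
    word: Optional[str] = None      # recognized word or sound, if any
    speaker: Optional[str] = None   # recognized speaker, if any

result = ListenResult(direction_deg=45.0, word="hello", speaker="operator")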

Speak

The speak control offers the functionality of a mouth (audio output). It can perform as a text reader or may try to phonetically create realistic dialog.
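A sketch of the two output modes, with invented function names standing in for the actual text-reading and phonetic back ends:

def speak(text, mode="reader"):
    # "reader" reads the text directly; "phonetic" tries to build more
    # natural-sounding dialog from phonemes.
    if mode == "reader":
        return read_text_aloud(text)
    return synthesize_phonetically(text)

def read_text_aloud(text):
    pass   # placeholder for the text-reader back end

def synthesize_phonetically(text):
    pass   # placeholder for the phonetic synthesis back end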

Move

The move control offers the functionality of the legs (locomotion). It controls the movements of the robot base unit.

Reach

The reach control offers the functionality of an arm. It can grasp items and rotate them in a number of different ways. The current arm on the robot has joints corresponding to shoulder, elbow, and wrist joints. The wrist joint can perform movements well beyond the human wrist.
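As a rough illustration of the joint layout (the joint names, and especially the extra wrist axes, are assumptions; only shoulder, elbow, and wrist come from the description above):

ARM_JOINTS = [
    "shoulder",
    "elbow",
    "wrist_pitch",   # several wrist axes are assumed here because the
    "wrist_yaw",     # wrist can move well beyond a human wrist
    "wrist_roll",
]

def rotate_grasped_item(joint_angles, roll_deg):
    # Rotating a grasped item can be done by adjusting a wrist axis.
    angles = dict(joint_angles)
    angles["wrist_roll"] = angles.get("wrist_roll", 0.0) + roll_deg
    return angles

print(rotate_grasped_item({"shoulder": 30.0, "elbow": 45.0}, 90.0))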

NOTE:

As the SAIL project progresses and more functionality is added, this page will be updated. Check back often for the latest information.
