I wrote a program to simulate a two-legged robot, like the scout walkers in Star Wars.
I never bothered to add a neck.
The robot is controlled like a puppet. The A and S keys take left and right steps forward, and the Z and X keys take backward steps. The length of time you hold down the keys controls the step length. It’s easy to turn around in place by making forward and backward steps with alternate feet.
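The control scheme above can be sketched like this. The key bindings come from the text; the step-length constants and function names are my assumptions, since the original code isn't shown:

```python
# Sketch of the puppet-style step control. Holding a key longer
# yields a longer step, clamped to the leg's reach. The constants
# here are assumptions, not values from the original program.

MAX_STEP = 1.0          # longest step the leg can take (assumed units)
STEP_PER_SECOND = 2.0   # step length gained per second of key hold

def step_length(hold_time_s):
    """Map how long a key was held down to a step length."""
    return min(hold_time_s * STEP_PER_SECOND, MAX_STEP)

# Key -> (foot, direction) mapping described in the text
KEY_BINDINGS = {
    'a': ('left',  +1),   # left foot steps forward
    's': ('right', +1),   # right foot steps forward
    'z': ('left',  -1),   # left foot steps backward
    'x': ('right', -1),   # right foot steps backward
}
```

Turning in place falls out of the mapping for free: a forward step with one foot plus a backward step with the other pivots the body.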
The mouse controls the direction the head is looking.
Part of this idea came from when I visited the Battletech Center in Chicago. They have little “mech” simulators you sit in. The robots are controlled with foot pedals. I was hoping that you’d control the robots with your feet in a way that matched the motion of the robot’s feet, but it was more like driving a tank. I thought this was lame.
There’s some nice pseudo-physics that give the illusion of momentum. It’s mostly based on damped oscillators. The eyes track more quickly than the head. This is what happens with your own eyes. When you turn to look at something, your head always lags behind your eyes because it weighs a lot more.
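A minimal sketch of that effect, using the damped oscillators the text mentions. All constants are assumptions (the original code isn't shown): the eyes get a stiff spring and the head a soft one, so the eyes snap to the target while the heavier head lags behind.

```python
def spring_step(pos, vel, target, stiffness, damping, dt):
    """One semi-implicit Euler step of a damped spring toward target."""
    accel = stiffness * (target - pos) - damping * vel
    vel += accel * dt
    pos += vel * dt
    return pos, vel

target_yaw = 90.0                  # degrees; the player flicks the mouse left
eye_yaw, eye_vel = 0.0, 0.0
head_yaw, head_vel = 0.0, 0.0

for _ in range(60):                # simulate one second at 60 Hz
    dt = 1.0 / 60.0
    # Critically damped: damping = 2 * sqrt(stiffness), so neither overshoots
    eye_yaw, eye_vel = spring_step(eye_yaw, eye_vel, target_yaw, 400.0, 40.0, dt)
    head_yaw, head_vel = spring_step(head_yaw, head_vel, target_yaw, 9.0, 6.0, dt)

# After one second the eyes have essentially arrived; the head is still turning.
```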
The code to position the robot involved a lot of wacky inverse kinematics. I also wrote code to create optimized triangle meshes to speed everything up a little bit.
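For a two-segment leg, the inverse kinematics can be solved analytically with the law of cosines. This is the standard textbook two-bone solution in 2D, not the original code, and the function and parameter names are hypothetical:

```python
import math

def leg_ik(hip, foot, thigh_len, shin_len):
    """Given hip and desired foot positions in 2D, return the hip angle
    (radians from +x) and the interior knee angle for a two-bone leg."""
    dx, dy = foot[0] - hip[0], foot[1] - hip[1]
    dist = math.hypot(dx, dy)
    # Clamp to the reachable range so the leg never tries to overextend
    dist = min(dist, thigh_len + shin_len - 1e-9)
    # Interior knee angle, by the law of cosines
    cos_knee = (thigh_len**2 + shin_len**2 - dist**2) / (2 * thigh_len * shin_len)
    knee = math.acos(max(-1.0, min(1.0, cos_knee)))
    # Hip angle: direction to the foot, offset by the triangle's corner angle
    cos_hip = (thigh_len**2 + dist**2 - shin_len**2) / (2 * thigh_len * dist)
    hip_offset = math.acos(max(-1.0, min(1.0, cos_hip)))
    hip_angle = math.atan2(dy, dx) + hip_offset
    return hip_angle, knee
```

A fully extended leg comes back with a knee angle of π (straight); a foot at distance √2 with unit-length bones gives a right-angle knee.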
My original intent was to drive the robot from a first-person point of view. If the mouse was moved to the left side of the screen, the robot’s head (and the view) would rotate 90 degrees to the left.
You could quickly look around a 180-degree field of view with a sweep of your hand. This would compensate for the lame 30-degree field of view that most monitors provide. Larger movements of the camera would require rotating the robot’s body around by taking steps.
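The mapping described above amounts to a linear remap of mouse x onto head yaw. A sketch, with assumed constants and names:

```python
# Mouse x across the screen sweeps the head across a 180-degree
# field of regard, centered straight ahead. SCREEN_WIDTH and the
# sign convention (left edge = negative yaw) are assumptions.

SCREEN_WIDTH = 640      # pixels (assumed)
MAX_HEAD_YAW = 90.0     # degrees left or right of straight ahead

def head_yaw_from_mouse(mouse_x):
    """Map mouse x in [0, SCREEN_WIDTH] to yaw in [-90, +90] degrees."""
    t = mouse_x / SCREEN_WIDTH          # 0.0 at left edge, 1.0 at right
    return (t * 2.0 - 1.0) * MAX_HEAD_YAW
```

Anything past ±90 degrees is out of the head's range, which is where the body would have to turn by stepping.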
With camera controls like this, the computer would have a pretty good idea where you were looking. In a networked environment, the program could draw your robot’s head in the same direction. In this way, the players could make eye contact through the robot proxies.