Dendiablo is not affiliated with any Devils.

About Me

Carlsbad, California, United States
Humans are screwing up the place.

Wednesday, March 15, 2006

Robot Eyes

When I was still in junior high school in the '60s, I developed a strong interest in robots. It was still over my head at that time, since I understood only basic electronics and nothing about computers. My father had lots of ham radio equipment, but even the smallest computers of that era were still the size of a semi-truck. My father also had many wood- and metal-working tools, which I put to use on lots of little projects.

One project was to build a robot. I could not find anything useful about robots in the library of our little Midwestern town, nor did any of my teachers know anything about them, so most everything I knew came from science fiction – Robby the Robot, et cetera. Of course that meant I knew nothing whatsoever. All I had for a design model was my own body, really.

The first thing I designed was a robot arm, which was really nothing more than a rather complicated puppet arm. I built the arm from wood, metal hinges and rotary joints, and made a system of little pulleys and fishing line sinews which all led up through guide sleeves to a final “control box” where the sinews terminated with grommets, such as those at the end of guitar strings. Depending on how many sinews I pulled at once, the arm could bend at the various joints, or twist at the wrist, or grasp things with its little wooden fingers.

At that point, still far from having any kind of robot, I knew that controlling the thing was hopeless. I could not build any workable system of servos or actuators from the scraps of electronics and slot car parts that I'd collected, and my family was not wealthy enough to buy any other parts. The problem was truly way over my head. Yet I felt that the arm was a kind of baroque artistic work, and many of my friends were impressed with the many moving parts I had fashioned. It also shaped my future career choices.

Still, I gave up on it, and went through a period of tumultuous distractions, including the Vietnam war.

Many years later, after the first 8080 microprocessor appeared, I developed a renewed interest in robots. Anyone could see that it would take a computer to control all the little moving parts of any useful robot. I was already writing software for IBM System/360s and PDP-11s by then, but those were too clunky to be suitable for any robot. At least I knew the ins and outs of analog electronics, such as those for model airplanes and TV or radio. Yet the little robot arm that hung silent on my father's work bench for so many years would stay there forever. It was and always would be just a childish piece of art.

After a few more years, due to my software skills, I gained employment in an automation company that built industrial machines. Suddenly I felt like I was in my dream world, and I had access to all the microcomputers, servo controllers and other machine control electronics I could ever want. Still, due to the crude systems available at the time, it was not robots that I directly worked on, but the operating systems of the computers that controlled them. I was somewhat disappointed, but the job was still quite educational.

It was then that I realized, around 1980, that robots were not difficult to produce – they were difficult to program. Software was the real missing link. This especially became evident when I tried to write sensor analysis programs for real-time event recognition. It is one thing to input a few bits and determine some rational response, but quite another to input thousands of bits at one time and make heads or tails of all that stuff in real-time.

Vision was the main culprit. Hearing (really, signal recognition) is also difficult, but there was just enough bandwidth available to handle such limited linear signals. Vision was hundreds of times harder, mainly because it added so many more dimensions. For every dimension that needs to be processed, the problem is multiplied by the number of resolvable values in that dimension.

Any other kind of single-sensor input was just a graph formed by Y amplitude values over X time samples. Monochrome vision added a Z dimension on top of that. Limit sensors and various other state sensors (there to keep parts from destroying themselves) accounted for several more dimensions, more than single-letter symbols could label. Yet robots needed almost ALL of their sensors analyzed at the same time. That meant the problem had a very, very large number of matrix elements.
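The difference in scale is easy to see with a back-of-envelope calculation. The sample rates and resolution below are hypothetical, chosen only to show how each added dimension multiplies the data rate:

```python
# Rough comparison (made-up but plausible numbers) of the data rates
# for a one-dimensional sensor versus monochrome vision.

audio_rate = 8_000            # values per second for a single linear signal
width, height = 320, 240      # a modest monochrome image: X and Y dimensions
frame_rate = 30               # time dimension: frames per second

# Vision multiplies the resolvable values of every dimension together.
vision_rate = width * height * frame_rate

print(audio_rate)                # 8000 values/s for the linear signal
print(vision_rate)               # 2304000 values/s for vision
print(vision_rate / audio_rate)  # vision is 288x more data, even at this low resolution
```

And that is only monochrome; color, stereo, and extra sensors each multiply the total again.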

So I was stumped. I read everything I could from MIT, Caltech and many other scientific institutions, and even tried my own experiments to solve the problem of making sense of large numbers of sensors in real-time. If fish and insects can do it, surely I could figure it out. So how did a fish or insect brain work? That was when it became obvious that the so-called “neural networks” employed by living brains were needed in robots. Even the logic gates (like AND, OR) of computers were really simplified versions of brain cells. The main difference was that real brain cells allow far more inputs than logic gates for their decision making. A typical logic gate has only two inputs, but a brain cell might have thousands.
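That relationship between gates and neurons can be sketched with a simple threshold unit (a minimal Python illustration, not anything from my actual work): with two inputs and the right weights it behaves like an AND or OR gate, yet the very same unit accepts any number of inputs, which is what makes it more like a brain cell.

```python
def threshold_unit(inputs, weights, bias):
    """A simple threshold unit: outputs 1 when the weighted sum of its
    inputs exceeds the bias, else 0.  A logic gate is the two-input
    special case; a neuron-like unit just has many more inputs."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > bias else 0

# Two-input AND gate: both inputs must be on to exceed the bias.
AND = lambda a, b: threshold_unit([a, b], [1, 1], 1.5)

# Two-input OR gate: either input alone exceeds the bias.
OR = lambda a, b: threshold_unit([a, b], [1, 1], 0.5)

# The same unit scales to any number of inputs -- here, a majority vote.
majority = lambda bits: threshold_unit(bits, [1] * len(bits), len(bits) / 2)
```

The point is that nothing structural changes between the gate and the “neuron” — only the number of inputs and the choice of weights.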

I already knew about the “perceptron” at that time, and had read many papers about elementary neural systems. But the machines of that era were too slow to really emulate any sizable “brain” of that type, and the few parallel-processing computers in existence were far too expensive and clunky. It takes a very large number of connections between a very large number of neurons before any non-trivial functions resembling animal vision can emerge.

So it became apparent to me that robotics was pretty much impossible without being able to process vast amounts of sensory data in real-time -- just like a living brain. My little wooden puppet arm would still be almost as impossible to control as any humanoid robot envisioned in science fiction stories.

My little wooden arm was lost after my father died. I have no idea what happened to it, or to any of those tools, since my stepmother inherited everything and probably threw it all out as junk. I had moved several states away long before, and my stepmother was virtually a stranger to me.

For the remainder of my career, and to this day, my work has concentrated on the “vision” problem of robotics. If a truly intelligent “machine eye” could be made, then that would solve 99% of the problem of robotics. All other robot problems, although always complex in themselves, would be nearly trivial in comparison, and could probably be solved in much the same general way as the vision problem.

Some people argue that machines can be simplified, and like airplanes compared to birds, do not need the same complexity as living things. That may be so, but even cruise missiles need a great deal of human intervention and are built in extremely complex factories. They do not build themselves, nor learn on their own how to operate their deadly mechanisms.

Most higher animals devote a large portion of their nervous systems to vision. Human vision uses color, brightness and motion attention, stereo differential distance, motion differential distance, and X,Y field mapping, all at the same time. That is a huge amount of processing and it doesn't even count the other senses or the effects of memory, linear reasoning, imagination and decision making. Yet even fish have most of those same abilities (but not all – they have very little reasoning ability), plus they have sensors that humans do not have, such as for electric fields and hydraulic shock waves.

There are many clever designs for machine vision that younger, smarter people than myself have developed, many for the military but also for general security systems. Most people are familiar with face or fingerprint recognition, especially if they watch CSI-type TV programs. Carver Mead, who developed many early integrated circuits, also developed an artificial retina that had many features of real retinas, especially for real-time motion oriented vision.

There is probably not much useful work I can do as I get older, but whatever time I have left I will still devote to understanding the science of robotics. Since machines have already replaced most of my past occupational skills, I suspect that robots will eventually do away with any need for me at all. That might not be such a good thing for me, personally, but might be good for other people in the future.
