Mother Nature on a Motherboard

Genetic algorithms help planes fly and buses run.

By Jessie Scanlon

John Holland, the father of genetic algorithms, laid out the basic theory of GAs in the early 1970s. The underlying principle was that the powerful creative forces of evolution - the laws of natural selection and genetics - could be applied to complex computational problems.

As the name implies, a genetic algorithm is a formula or solution based on the core tenets of genetics: reproduction, crossover, and mutation. Take the problem of designing the most efficient airplane wing. A team of aerodynamics engineers could spend months designing a wing, crunching numbers about airflow, lift, and stress before finding a solution. A genetic algorithm would define a wing as a string of variables (such as size, shape, weight), create hundreds or thousands of sample wings, test the individual members of this wing "population" to determine how well they measure up to the desired performance, and let the fittest - the most aerodynamic - survive.

The beauty of GAs lies in their robustness, their ability to solve a wide range of problems (including machine learning), and their knack for engineering optimization, in which a problem's most efficient solution must be found. Another advantage is GAs' ability to evaluate multiple solutions simultaneously. This implicit parallelism not only makes GAs faster than traditional algorithms - GAs can solve problems in sub-quadratic time - it also lends itself to parallel computing.

To understand the core technology, consider a very simple optimization problem. Imagine a black box with five on/off switches that control the box's output - voltage. We want to find the combination of settings that produces the greatest electrical charge. The first step is to develop a "coding scheme" - a numeric representation of the problem. In our example, each possible solution will be expressed by a five-bit binary string in which on is represented by a 1 and off by a 0. We must also define the parameters of the problem: population size, crossover and mutation rates, and the number of generations. There are no standards - populations could range from 50 to 640,000 strings depending on the problem's complexity, and the rate of genetic operations will vary by user. The next, most critical step is to write a fitness function that calculates the fitness of each member of the population. In the box example, solutions generating a higher voltage earn a higher score.
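
As a rough illustration, here is how the coding scheme and fitness function for the switch box might look in Python. This is a minimal sketch: the voltage formula is invented for the example, since a real black box would be measured rather than computed.

import random

# Each candidate solution is a five-bit string: one bit per switch, 1 = on, 0 = off.
def random_string(length=5):
    return [random.randint(0, 1) for _ in range(length)]

# Hypothetical fitness function: a real application would query the black box;
# here we simply pretend each "on" switch adds one volt.
def fitness(bits):
    return sum(bits)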

Once the fitness function has been written, we need to create an initial population of organisms, or strings of bits, by picking a set of random strings. The initial population should be large (and diverse) enough to ensure healthy evolution. Too small a population will lead to the GA equivalent of inbreeding.
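
In code, seeding the population amounts to generating a batch of random strings - assuming, purely for the sketch, a population of 100 members:

import random

POPULATION_SIZE = 100  # illustrative only; real populations range far wider

def initial_population(size=POPULATION_SIZE, length=5):
    # A diverse random start: each member is an independent random bit string.
    return [[random.randint(0, 1) for _ in range(length)]
            for _ in range(size)]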

Reproduction - the first GA operator - essentially copies members of the initial generation and passes them on to the second. Strings with higher fitness scores have a proportionally greater chance of contributing offspring to the next generation, but even the fittest is not guaranteed to survive. This probabilistic reproduction reflects the randomness of biological reproduction - meeting Mr. Right is largely a matter of chance, after all.
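
One common way to implement this probabilistic copying is fitness-proportional, or "roulette wheel," selection. The sketch below assumes the toy fitness function from the switch-box example:

import random

def fitness(bits):
    return sum(bits)  # stand-in for the black box's voltage reading

def reproduce(population):
    # Fitter strings get proportionally more chances to be copied forward,
    # but even the best string can lose the draw.
    weights = [fitness(s) + 1e-9 for s in population]  # small floor avoids a zero total
    return random.choices(population, weights=weights, k=len(population))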

Individual strings of the new generation are then paired or "mated" based on fitness, and crossover takes place. This technique mimics biological sexual recombination: each parent contributes a portion of DNA to the offspring. For example, the first three bits of string A (11011) are swapped with the corresponding three bits of string B (10110), thereby creating two new strings - 11010 and 10111. The crossover specifics vary from the simple one-point swap illustrated above to a process involving multiple cut points, but in all GAs this is the workhorse operation, forming up to 90 percent of the next generation.
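
A one-point crossover like the one in the example above can be written in a few lines. The 90 percent rate comes from the text; choosing the cut point at random is one common convention.

import random

def crossover(parent_a, parent_b, rate=0.9):
    # With probability `rate`, swap everything before a randomly chosen cut point.
    if random.random() < rate:
        point = random.randint(1, len(parent_a) - 1)
        return (parent_b[:point] + parent_a[point:],
                parent_a[:point] + parent_b[point:])
    return parent_a[:], parent_b[:]

# With a cut after the third bit, the strings 11011 and 10110
# become 10111 and 11010, as in the example above.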

In the final genetic operation - mutation - one bit is flipped or reversed. The importance of mutation to evolutionary progress is hotly debated: Stanford University computer scientist John Koza once described mutation as "a sideshow," while David Goldberg, a GA researcher who worked with John Holland, thinks of it as an "insurance policy": mutation restores bits lost through reproduction and crossover. At most, however, the computer might insert a random bit change into 3 percent of the population.
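
Mutation is the simplest operator of the three. The sketch below flips each bit with a small probability, treating the roughly 3 percent ceiling mentioned above as a per-bit rate - one of several common conventions.

import random

def mutate(bits, rate=0.03):
    # Flip each bit with a small probability to restore diversity
    # lost through reproduction and crossover.
    return [bit ^ 1 if random.random() < rate else bit for bit in bits]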

Once the new generation's fitness has been calculated, the three-step process repeats for a set number of generations or until a specified fitness level has been achieved. The simple box problem would be solved within several generations, but GAs really shine on more complex problems with hundreds of variables.
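
Putting the pieces together, a self-contained toy GA for the switch box might look like the sketch below. The parameter values and the voltage stand-in are illustrative, not prescriptive.

import random

STRING_LENGTH = 5
POPULATION_SIZE = 100
CROSSOVER_RATE = 0.9
MUTATION_RATE = 0.03
GENERATIONS = 20

def fitness(bits):
    # Stand-in for the black box: voltage rises with the number of "on" switches.
    return sum(bits)

def run_ga():
    population = [[random.randint(0, 1) for _ in range(STRING_LENGTH)]
                  for _ in range(POPULATION_SIZE)]
    for _ in range(GENERATIONS):
        # Reproduction: fitness-proportional copying into a mating pool.
        weights = [fitness(s) + 1e-9 for s in population]
        pool = random.choices(population, weights=weights, k=POPULATION_SIZE)
        # Crossover and mutation build the next generation.
        next_generation = []
        for a, b in zip(pool[::2], pool[1::2]):
            if random.random() < CROSSOVER_RATE:
                point = random.randint(1, STRING_LENGTH - 1)
                a, b = b[:point] + a[point:], a[:point] + b[point:]
            for child in (a, b):
                next_generation.append(
                    [bit ^ 1 if random.random() < MUTATION_RATE else bit
                     for bit in child])
        population = next_generation
        if fitness(max(population, key=fitness)) == STRING_LENGTH:
            break  # all switches on: the optimum for this toy problem
    return max(population, key=fitness)

if __name__ == "__main__":
    print(run_ga())  # typically converges to [1, 1, 1, 1, 1] within a few generations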

So why are genetic algorithms the bastard children of computer science? Why were they rejected by most scientists? "Many of the accusations leveled against the early GAs were accurate," admits Goldberg. "People were fixed on a particular way of doing crossover and mutation that turned out to be very brittle with more difficult problems." In other words, populations tended to converge before an optimal solution was reached, and once a population has lost its diversity, improvement in successive generations slows drastically. But just as modern genetics has moved far beyond Gregor Mendel's pea-pod observations, GA theory has evolved far beyond John Holland.

The best improvements in GA design hinge on what Holland called schemata - highly fit combinations of bits within the strings. "Identifying and propagating these building blocks as a unit, a supergene, is a key issue now," says Goldberg. Researchers have also refined GA design with alternate operators - such as adaptive mutation, dominance, and diploidy - and developed a better understanding of how the traditional operators interact. "We are learning how to design around the limitations of simple GAs," says Goldberg, "and we are crossing the threshold from being a weird science to being accepted."

GAs are also crossing into commercial applications. General Electric developed a hybrid system for jet engine design that combined GA technology with an expert system. And in Japan, Hitachi has developed a GA-based system to create efficient bus schedules. More experimental projects are using GAs to generate art and breed avatars. The fittest of these will survive.

Jessie Scanlon (jessie@wired.com) is a section editor at Wired.
