Computers, like humans, can learn. But when Google tries to fill in your search box based on only a few keystrokes, or your iPhone predicts words as you type a text message, it’s only a narrow mimicry of what the human brain is capable of.

The challenge in training a computer to behave like a human brain is technological and physiological, testing the limits of computer and brain science. But researchers from IBM Corp. (NYSE: IBM) say they’ve made a key step toward combining the two worlds.

The company announced Thursday that it has built two prototype chips that it says process data more like how humans digest information than the chips that now power PCs and supercomputers.

The chips represent a significant milestone in a six-year-long project that has involved 100 researchers and some $41 million in funding from the government’s Defense Advanced Research Projects Agency, or DARPA. IBM has also committed an undisclosed amount of money.

“Researchers at IBM have been working on a cognitive computing project called Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE),” wrote IBM’s Dharmendra Modha, project leader for IBM Research, in a blog post.

“By reproducing the structure and architecture of the brain—the way its elements receive sensory input, connect to each other, adapt these connections, and transmit motor output—the SyNAPSE project models computing systems that emulate the brain’s computing efficiency, size and power usage without being programmed.”

The prototypes offer further evidence of the growing importance of “parallel processing,” or computers doing multiple tasks simultaneously. That is important for rendering graphics and crunching large amounts of data.
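The chunk-and-combine pattern behind parallel data crunching can be sketched in a few lines. The example below is an illustration only, not IBM's code; the `sum of squares` task is a hypothetical stand-in for real work. (CPython threads interleave CPU-bound work rather than truly parallelizing it; process pools or GPUs provide real parallelism, but the decomposition is the same.)

```python
# Illustrative only: splitting one data-crunching job across workers.
# Note: CPython threads interleave CPU-bound work; process pools or GPUs
# give true parallelism, but the chunk-and-combine pattern is identical.
from concurrent.futures import ThreadPoolExecutor

def crunch(chunk):
    # Stand-in for real work: sum of squares over one slice of the data.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # Split the data into roughly equal chunks, one per worker.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Each worker crunches its own chunk; the partial results
        # are then combined into one answer.
        return sum(pool.map(crunch, chunks))
```

The same split-work-then-combine shape underlies graphics rendering and large-scale data analysis, where thousands of independent pieces can be processed at once.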

The uses of the IBM chips so far are prosaic, such as steering a simulated car through a maze, or playing Pong. It may be a decade or longer before the chips make their way out of the lab and into actual products.

But what’s important is not what the chips are doing, but how they’re doing it, says Giulio Tononi, a professor of psychiatry at the University of Wisconsin at Madison who worked with IBM on the project.

The chips’ ability to adapt to types of information they weren’t specifically programmed to expect is a key feature.

“There’s a lot of work to do still, but the most important thing is usually the first step,” Tononi said in an interview. “And this is not one step, it’s a few steps.”

Beyond machines

“For more than half a century, computers have been little better than calculators with storage structures and programmable memory, a model that scientists have continually aimed to improve,” Modha wrote in his blog titled “Beyond machines.”

“Comparatively, the human brain—the world’s most sophisticated computer—can perform complex tasks rapidly and accurately using the same amount of energy as a 20 watt light bulb in a space equivalent to a 2 liter soda bottle.”

Modha pointed to cognitive computing as a means of advancing what computers can do.

“Making sense of real-time input flowing in at a dizzying rate is a Herculean task for today’s computers, but would be natural for a brain-inspired system,” he wrote. “Using advanced algorithms and silicon circuitry, cognitive computers learn through experiences, find correlations, create hypotheses, and remember—and learn from—the outcomes.

“For example, a cognitive computing system monitoring the world’s water supply could contain a network of sensors and actuators that constantly record and report metrics such as temperature, pressure, wave height, acoustics and ocean tide, and issue tsunami warnings based on its decision making.”

Technologists have long imagined computers that learn like humans. Your iPhone or Google’s servers can be programmed to predict certain behavior based on past events. But the techniques being explored by IBM and other companies and university research labs around cognitive computing could lead to chips that are better able to adapt to unexpected information.

Chips as neurons and synapses

IBM is a leader in a movement to link physical infrastructure, such as power plants or traffic lights, and information technology, such as servers and software that help regulate their functions. Such projects can be made more efficient with tools to monitor the myriad analog signals present in those environments.

Modha said the new chips have parts that behave like digital “neurons” and “synapses,” which sets them apart from other chips. Each “core,” or processing engine, has computing, communication and memory functions.

“You have to throw out virtually everything we know about how these chips are designed,” he said. “The key, key, key difference really is the memory and the processor are very closely brought together. There’s a massive, massive amount of parallelism.”
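Modha's point about bringing memory and processing together can be sketched in software. The toy model below is my illustration, not IBM's design: each "core" keeps its synaptic weights in local memory right beside the neuron state it updates, so integrating a spike never reaches out to a separate memory bank. The leaky integrate-and-fire neuron model and every parameter value here are assumptions chosen for clarity.

```python
# A toy sketch (not IBM's design) of collocated memory and processing:
# one "core" holds its own synapse weights and neuron state together.

class NeuroCore:
    """One core: local synapse weights + leaky integrate-and-fire neurons."""

    def __init__(self, n_inputs, n_neurons, threshold=1.0, leak=0.9):
        # Local memory: one weight per (input, neuron) synapse.
        self.weights = [[0.1] * n_neurons for _ in range(n_inputs)]
        self.potential = [0.0] * n_neurons  # membrane potential per neuron
        self.threshold = threshold
        self.leak = leak

    def step(self, spikes_in):
        """Advance one tick: integrate input spikes, fire, then leak."""
        spikes_out = []
        for j in range(len(self.potential)):
            # Integrate: add the weight of every synapse whose input spiked.
            for i, spiked in enumerate(spikes_in):
                if spiked:
                    self.potential[j] += self.weights[i][j]
            # Fire and reset if the membrane potential crosses threshold.
            if self.potential[j] >= self.threshold:
                spikes_out.append(True)
                self.potential[j] = 0.0
            else:
                spikes_out.append(False)
                self.potential[j] *= self.leak  # decay toward rest
        return spikes_out
```

In this sketch a neuron that repeatedly receives input spikes accumulates potential over several ticks before firing; because the weights live inside the core, every core can run its update loop independently of the others, which is the parallelism Modha describes.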

The project is part of the same research that led to IBM’s announcement in 2009 that it had simulated a cat’s cerebral cortex, the thinking part of the brain, using a massive supercomputer.

Using progressively bigger supercomputers, IBM had previously simulated 40 percent of a mouse’s brain in 2006, a rat’s full brain in 2007, and 1 percent of a human’s cerebral cortex in 2009.

A computer with the power of the human brain remains far off. But Modha said the latest development is an important step.

“It really changes the perspective from ‘What if?’ to ‘What now?’” Modha said. “Today we proved it was possible. There have been many skeptics, and there will be more, but this completes in a certain sense our first round of innovation.”

Get the latest news alerts: Follow WRAL Tech Wire on Twitter.