Computers that learn from experience and adapt to novel situations are set to change the face of artificial intelligence (A.I.).

A commercial version of a new kind of 'neuromorphic processor' that learns from its own mistakes is set to enter the market in 2014, according to The New York Times.

The new technology could do away with hours of programming and, by tolerating minor errors, even make 'computer crashes' obsolete. The approach is modeled on biological nervous systems, and the artificial neural networks it enables could advance facial recognition, which is still in its infancy.

The brain doesn't crash when it encounters something novel. It forms new neural connections and interprets the signals as best it can. The new computer chips work in a similar way, and according to the NYT, they are already in use at a few large tech companies.
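To make that analogy concrete, the sketch below (plain Python with NumPy, purely illustrative and not the actual chip design) uses a Hebbian-style rule, in which connections between co-active units are strengthened, so that any input, familiar or not, produces a best-effort response rather than an error.

    # Illustrative sketch only, not the neuromorphic chip itself:
    # connections strengthen with use, and novel input still gets an answer.
    import numpy as np

    rng = np.random.default_rng(0)
    n_inputs, n_outputs = 8, 4
    weights = rng.normal(scale=0.01, size=(n_outputs, n_inputs))  # weak initial "synapses"
    learning_rate = 0.01

    def respond(x):
        """Interpret an input pattern as well as the current connections allow."""
        return weights @ x                 # any input yields some answer; nothing to crash on

    def adapt(x):
        """Hebbian-style rule: strengthen connections between co-active units."""
        global weights
        y = respond(x)
        weights += learning_rate * np.outer(y, x)
        weights /= max(1.0, np.linalg.norm(weights))   # keep connection strengths bounded

    for _ in range(200):                   # learn from a stream of experience
        adapt(rng.random(n_inputs))

    print(respond(rng.random(n_inputs)))   # a never-before-seen pattern still gets a response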

"We're moving from engineering computing systems to something that has many of the characteristics of biological computing," Larry Smarr, an astrophysicist at the California Institute for Telecommunications and Information Technology told NYT.

The new chips are being designed by I.B.M., Qualcomm, and researchers at Stanford University.

Machines That Learn and Adapt

Deep learning, which is inspired by the way neurons in the brain are organized, is one of the most popular areas of machine learning. Companies such as Facebook already depend on deep learning techniques to target their advertisements at the right consumers.
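As a rough illustration of what 'inspired by neurons' means, here is a minimal sketch of a small deep network, stacked layers of simple neuron-like units, scoring a made-up user for an ad click. The feature names, layer sizes, and random (untrained) weights are all hypothetical; this is not Facebook's system.

    # Illustrative only: a tiny "deep" network with hand-wired random weights.
    # In practice the weights are learned from data; they are random here for brevity.
    import numpy as np

    rng = np.random.default_rng(1)

    def layer(x, w, b):
        """One layer of artificial neurons: weighted sum followed by a nonlinearity."""
        return np.maximum(0.0, w @ x + b)        # ReLU activation

    # Hypothetical features: [age / 100, pages visited today, likes-sports flag, likes-tech flag]
    user_features = np.array([0.34, 5.0, 1.0, 0.0])

    w1, b1 = rng.normal(size=(6, 4)), np.zeros(6)   # first hidden layer
    w2, b2 = rng.normal(size=(3, 6)), np.zeros(3)   # second hidden layer
    w3, b3 = rng.normal(size=(1, 3)), np.zeros(1)   # output layer

    hidden1 = layer(user_features, w1, b1)
    hidden2 = layer(hidden1, w2, b2)
    score = 1.0 / (1.0 + np.exp(-(w3 @ hidden2 + b3)))   # sigmoid -> click probability

    print(f"predicted click probability: {score[0]:.2f}")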

In 2012, a research team at Google connected 1,000 computers into a neural network. The system was then shown thousands of pictures commonly found on the internet, and within three days the machine had learned to spot cats.
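The experiment was an example of unsupervised learning: the system was never told what a cat is, but useful features emerged from the images themselves. The toy sketch below captures the idea at a vastly smaller, hypothetical scale, using a tiny autoencoder that learns to compress and reconstruct unlabeled 'images'; it is not Google's system.

    # Toy analogue of unsupervised feature learning, nowhere near Google's scale:
    # an autoencoder rebuilds its inputs, so useful features emerge without labels.
    import numpy as np

    rng = np.random.default_rng(2)
    images = rng.random((500, 64))                  # 500 fake 8x8 grayscale "images"

    n_hidden = 16
    w_enc = rng.normal(scale=0.1, size=(64, n_hidden))
    w_dec = rng.normal(scale=0.1, size=(n_hidden, 64))
    lr = 0.01

    for step in range(2000):                        # gradient descent on reconstruction error
        h = images @ w_enc                          # encode: compress each image to 16 numbers
        recon = h @ w_dec                           # decode: try to rebuild the original pixels
        err = recon - images
        grad_dec = h.T @ err / len(images)
        grad_enc = images.T @ (err @ w_dec.T) / len(images)
        w_dec -= lr * grad_dec
        w_enc -= lr * grad_enc

    print("final reconstruction error:", float(np.mean(err ** 2)))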

The research team worked with Prof. Andrew Ng, head of the artificial intelligence lab at Stanford University in California, the BBC had earlier reported.

Graduate-level machine learning was the most popular course at Stanford University this fall, with more than 760 students enrolled, Forbes reported.

