Machine Learning from the Brain

Current Intel i7 processors have about 1.75 billion transistors, at a density of roughly 17 million transistors per sq mm. The human brain, in comparison, has about 100 billion neurons, at a density of less than a hundred thousand neurons per cubic mm. Transistor switching times in fast processors are approaching a picosecond, whereas a neuron can only switch on a timescale of about 1 millisecond: 10⁹ times slower!
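
A quick back-of-the-envelope check makes the ratio explicit (the switching times below are the rough order-of-magnitude figures quoted above, not exact device specifications):

```python
# Rough switching times (illustrative orders of magnitude only).
transistor_switch = 1e-12   # ~1 picosecond for a fast transistor
neuron_switch = 1e-3        # ~1 millisecond for a neuron
print(neuron_switch / transistor_switch)   # 1e9 -> the neuron is about a billion times slower
```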

Although the brain has far more neurons, modern processors beat it in both switching speed and packing density. It would therefore seem that computers should beat humans at every task. But they seldom do, especially at intelligent and creative tasks. With the advent of machine learning, however, this is changing. Machine learning is achieving feats that were earlier considered very difficult or impossible for computers. Machines today drive cars autonomously, create art and music, and assist in cancer diagnostics, among hundreds of other applications hitherto considered out of their domain.

Neural networks, inspired by the brain's neuronal connections, are behind this revolutionary development. Machine learning code creates a network of virtual neurons on transistor chips, which is then trained on input-output pairs to discover a general mathematical representation of the input-to-output transformation: an extremely complex non-linear model which, in theory, is quite close to the way neurons represent knowledge in the brain. Yet humans still perform better and faster on most learning tasks, with far less data. Why? This appears illogical given the vastly denser and faster transistor networks available in computer chips today.
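
To make this concrete, here is a minimal sketch of such a network of virtual neurons: a tiny two-layer network in plain NumPy, trained by gradient descent to discover a simple non-linear input-to-output mapping (XOR). The layer sizes, activation function and learning rate are illustrative choices of mine, not the design of any particular system.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # target outputs (XOR)

# One hidden layer of 8 "virtual neurons" with sigmoid activations, one output neuron.
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(10000):
    # Forward pass: each layer transforms its inputs, like a layer of neurons firing.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: nudge the connection weights to reduce the prediction error.
    grad_out = (out - y) * out * (1 - out)
    grad_h = (grad_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ grad_out
    b2 -= 0.5 * grad_out.sum(axis=0)
    W1 -= 0.5 * X.T @ grad_h
    b1 -= 0.5 * grad_h.sum(axis=0)

print(out.round(2))   # typically close to [[0], [1], [1], [0]] after training
```

Each virtual neuron here is just a weighted sum followed by a squashing function; the training loop adjusts the connection weights until the network reproduces the target outputs, which is the "discovered" representation of the input-to-output transformation.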

Here, I think we still have to make one more critical leap to come closer to the brain: parallelism. In the brain, each neuron fires independently and does not wait for all previous calculations to complete, which is exactly what happens in sequential digital circuits. Synchronous digital circuits driven by a clock were essential for writing conventional sequential programs. In neural networks, however, the paradigm of sequential programming is no longer required, so we can very well explore chips with completely asynchronous digital circuits. Such chips would have the perceptron as their unit of computation, with all perceptrons working asynchronously. The evolution of this new type of chip, and the re-emergence of asynchronous digital electronics, might just be the next quantum leap in machine learning, bringing computers closer to the brain.
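
To illustrate what perceptrons working without a global clock could look like, here is a toy software simulation: each perceptron runs in its own thread and fires as soon as all of its inputs have arrived, with nothing sequencing the overall computation. The class, wiring and weights are my own illustrative assumptions; this is only a sketch of the idea on conventional hardware, not a description of any existing asynchronous chip.

```python
import threading
import queue
from types import SimpleNamespace

class AsyncPerceptron(threading.Thread):
    """A perceptron that fires as soon as all of its inputs have arrived (no global clock)."""

    def __init__(self, name, weights, bias, downstream):
        super().__init__(daemon=True)
        self.name, self.weights, self.bias = name, weights, bias
        self.inbox = queue.Queue()     # inputs arrive here, in any order
        self.downstream = downstream   # units to notify when this one fires

    def run(self):
        received = {}
        while len(received) < len(self.weights):
            source, value = self.inbox.get()   # block until some upstream unit fires
            received[source] = value
        # Fire: weighted sum plus bias, thresholded, then pushed to downstream units.
        activation = sum(self.weights[s] * v for s, v in received.items()) + self.bias
        output = 1 if activation > 0 else 0
        for unit in self.downstream:
            unit.inbox.put((self.name, output))

# A tiny two-layer XOR network: h1 and h2 fire independently; out fires once both arrive.
result = queue.Queue()
sink = SimpleNamespace(inbox=result)                                # collects the final output
out = AsyncPerceptron("out", {"h1": 1.0, "h2": 1.0}, -1.5, [sink])  # AND of h1 and h2
h1 = AsyncPerceptron("h1", {"x1": 1.0, "x2": 1.0}, -0.5, [out])     # OR-like unit
h2 = AsyncPerceptron("h2", {"x1": -1.0, "x2": -1.0}, 1.5, [out])    # NAND-like unit

for unit in (out, h1, h2):
    unit.start()
for unit in (h1, h2):              # present the input (x1, x2) = (1, 0) to both hidden units
    unit.inbox.put(("x1", 1))
    unit.inbox.put(("x2", 0))

print(result.get())                # ('out', 1) -> the network computed XOR(1, 0) = 1
```

Note that h1 and h2 may fire in either order; out simply waits for whichever messages arrive and fires once it has them all, which is the essential difference from a clocked, sequential pipeline.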
