Neuromorphic Computing Is a Big Deal for A.I., But What Is It?
Engineering computers to work like brains could revolutionize technology as we know it. Here’s everything you need to know about neuromorphic computing.
Thinking in Silicon https://www.technologyreview.com/s/522476/thinking-in-silicon/
“Computers are incredibly inefficient at lots of tasks that are easy for even the simplest brains, such as recognizing images and navigating in unfamiliar spaces. Machines found in research labs or vast data centers can perform such tasks, but they are huge and energy-hungry, and they need specialized programming.”
SpiNNaker Home Page http://apt.cs.manchester.ac.uk/projects/SpiNNaker/
“SpiNNaker is a novel computer architecture inspired by the working of the human brain. A SpiNNaker machine is a massively parallel computing platform, targeted towards three main areas of research: neuroscience, robotics, and computer science.”
Breakthrough in construction of computers for mimicking human brain https://www.sciencedaily.com/releases/2018/07/180711093119.htm
“A computer built to mimic the brain's neural networks produces similar results to that of the best brain-simulation supercomputer software currently used for neural-signaling research.”
____________________
Elements is more than just a science show. It’s your science-loving best friend, tasked with keeping you updated and interested in all the compelling, innovative and groundbreaking science happening all around us. Join our passionate hosts as they help break down and present fascinating science, from quarks to quantum theory and beyond.
Seeker explains every aspect of our world through a lens of science, inspiring a new generation of curious minds who want to know how today’s discoveries in science, math, engineering and technology are impacting our lives, and shaping our future. Our stories parse meaning from the noise in a world of rapidly changing information.
This video is the eleventh in a multi-part series discussing computing. In this video, we’ll be discussing what cognitive computing is and the impact it will have on the field of computing.
00:00 Intro
[0:28-5:09] What is Cognitive Computing - Starting off, we'll discuss what cognitive computing is – more specifically, the difference between the current von Neumann computing architecture and the more biologically representative neuromorphic architecture, and how pairing the two will yield massive performance and efficiency gains!
[5:09-10:46] Benefits of Cognitive Computing - Following that, we'll discuss the benefits of cognitive computing systems further, as well as current cognitive computing initiatives: TrueNorth and Loihi.
[10:46-17:11] Future of Cognitive Computing - To conclude, we'll extrapolate and discuss the future of cognitive computing in terms of brain simulation, artificial intelligence and brain-computer interfaces!
Thank You To The Members Who Supported This Video ➤
♫ 00;00 "70" by Taylor King
♫ 00;28 "Sun" by HOME
♫ 02;54 "Flood" by HOME
♫ 06;09 "Resonance" by HOME
♫ 09;33 "If I'm Wrong" by HOME
♫ 13;31 "Accelerated" by Miami Nights 1984
♫ 17;13 "June" by Aire Atlantica
Dr. Dharmendra S. Modha is an IBM Fellow and IBM Chief Scientist for Brain-inspired Computing. He is a cognitive computing pioneer who envisioned and now leads a highly successful effort to develop brain-inspired computers. The project has received ~$70 million in research funding from DARPA (under the SyNAPSE Program), the US Department of Defense, the US Department of Energy, and commercial customers. The ground-breaking project is multi-disciplinary, multi-institutional, and multi-national, and has a worldwide scientific impact. The resulting architecture, technology, and ecosystem break with the prevailing von Neumann architecture (circa 1946) and constitute a foundation for energy-efficient, scalable neuromorphic systems.
Will Neuromorphic Computing Prevent AI Winter?
Throughout its history, artificial intelligence has run into several so-called winters. Are we headed for another, or are we in for a perpetual spring break?
While some would have us believe that Moore’s law, the doubling of transistor density every 18 months or so, will yield intelligent computers within the next 10 to 15 years, a little basic logic paints a very different picture.
You see, this blob of fat between our ears is actually quite the magnificent computer. It’s responsible for everything from creating symphonies to discovering quantum mechanics to finding humor in death metal kitty videos. It does all this while consuming only around 20 watts of power, less than a typical lightbulb. Compare this to modern petaflop-scale supercomputers that can’t even meme, yet hoover up tens of megawatts of power, and, well, you can see where this is going.
So what’s a nerd to do? Should we give up on our dream of an all-powerful artificial general intelligence that is just as capable of writing music as it is of discovering the grand unified theory of everything? I suppose that if we’re just going to end up with more Justin Bieber pop without the train wreck to gawk at, then perhaps we shouldn’t bother.
Either way, it’s clear that to make progress towards real artificial intelligence we’re going to need new computing paradigms. That’s where neuromorphic computing comes in. It’s an attempt to more accurately mimic the functionality of the human brain while simultaneously reducing power consumption.
Our brain has 86 billion neurons connected by 3 million kilometers of nerve fibers, and the Human Brain Project is mapping it all. One of the key applications is neuromorphic computing - computers inspired by brain architecture that may one day be able to learn as we do.
#BloombergGiantLeap #Science #Technology
--------
"Moonshot" introduces you to the scientists and thinkers chasing humanity’s wildest dreams. The series takes a deeper look into how science is solving the world's most complex problems in order to create a better tomorrow. The first season explores major breakthroughs from scientists including plastic eating bacteria, asteroid hunting and oceanic exploration. Watch every episode: https://youtube.com/playlist?list=PLqq4LnWs3olXYh0FhU2KgOg1Mzleojbie
Mike Davies, director of Intel’s Neuromorphic Computing Lab, describes the aim of neuromorphic processor research: to make a #chip that mimics the human brain. #Intel Labs is leading research efforts to help realize #neuromorphic computing’s goal of enabling next-generation intelligent devices and autonomous systems.
#Technology #Innovation #AI
About Intel Newsroom
Intel Newsroom brings you the latest news and updates on world-changing technology that enriches the lives of everyone on Earth. Catch up on the latest innovations in client computing, artificial intelligence, security, data centers, international news and more. Watch recaps and replays from industry events where Intel has a major role, such as Mobile World Congress (MWC), Intel Innovation, the Consumer Electronics Show (CES) and others.
With around 86 billion neurons and up to 1 quadrillion synapse connections, the human brain contains over 400,000 km of nerve fiber, long enough to reach the moon!
All of these connections allow the brain to perform a quintillion calculations per second, or one exaFLOP in computing terms.
For the uninitiated, an exaFLOP is far beyond what the most powerful supercomputer in the world is capable of.
This is the IBM-designed OLCF-4, a.k.a. Summit, the fastest supercomputer in the world.
Summit’s speed is generated by 250 refrigerator-sized cabinets taking up over 520 square meters and weighing 340 tons.
With over 200,000 processor cores and over 27,000 GPUs, Summit is capable of 200 quadrillion calculations per second, or 200 petaflops.
That is over 600,000 times faster than the CPU in an iPhone X!
Even with all of Summit’s brute power, the human brain can still make calculations five times faster.
And crazy as it sounds, the next comparison is far more fascinating.
Summit’s more than 73 trillion transistors generate so much heat that the system requires over 15,000 liters of water per minute to keep it cool.
The system as a whole consumes 13 MW of power, enough for 13,000 microwaves.
The human brain, on the other hand, consumes just 20 watts of power, less than a single lightbulb!
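Taking the figures quoted above at face value, a quick back-of-the-envelope calculation makes the gap concrete. This is an illustrative sketch in Python; the inputs are simply the numbers from this script, not independent measurements.

```python
# Illustrative arithmetic using only the figures quoted above.
brain_ops_per_sec = 1e18      # ~1 exaFLOP attributed to the brain
summit_ops_per_sec = 200e15   # Summit: 200 petaflops
brain_watts = 20              # ~20 watts
summit_watts = 13e6           # 13 megawatts

print(brain_ops_per_sec / summit_ops_per_sec)   # 5.0 -> "five times faster"
print(summit_ops_per_sec / summit_watts)        # ~1.5e10 operations per second per watt
print(brain_ops_per_sec / brain_watts)          # 5e16 operations per second per watt
```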
DRAMATIC PAUSE
Why are computers far superior to humans at performing specific specialized tasks but do not come close to humans regarding cognitive faculties like perception, imagination, and consciousness?
And how does the brain manage five times the FLOPS of Summit while being 130,000 times more efficient?
While there are many answers to these questions, the differences boil down to architecture.
Today’s computers are extraordinarily complex, yet they are still based on the von Neumann architecture, dating back to 1945!
This architecture has four components:
First is the Control Unit that decodes instructions and controls how the data flows through the computer.
Second is the Arithmetic Logic Unit or ALU that processes all of the mathematical operations. The ALU and control unit together make up the Central Processing Unit or CPU.
The third component is the Main Memory Unit, which holds data and instructions.
And fourth are the input/output devices.
One of the issues with this architecture is the von Neumann bottleneck, caused by the separation of the memory and processing components.
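To make those four components and the bottleneck concrete, here is a minimal toy sketch in Python. The instruction set, addresses, and run loop are invented purely for illustration (not any real machine): a single memory holds both the program and the data, and the control unit and ALU step through it one instruction at a time over the same path.

```python
# Toy von Neumann-style machine: one memory for instructions AND data,
# one sequential fetch-decode-execute loop. Purely illustrative.
memory = {
    0: ("LOAD", 100),   # program...
    1: ("ADD", 101),
    2: ("STORE", 102),
    3: ("HALT",),
    100: 2, 101: 3, 102: 0,   # ...and data, in the same memory
}

def run(memory):
    pc, acc = 0, 0                 # program counter and accumulator (ALU state)
    while True:
        op, *args = memory[pc]     # fetch and decode (control unit)
        if op == "LOAD":
            acc = memory[args[0]]  # every operand is another trip to memory:
        elif op == "ADD":          # this shared path is the bottleneck
            acc += memory[args[0]]
        elif op == "STORE":
            memory[args[0]] = acc
        elif op == "HALT":
            return memory
        pc += 1

print(run(memory)[102])   # 2 + 3 = 5
```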
This brings us to IBM’s TrueNorth chip, inspired by the brain.
TrueNorth is the product of 16 years of research by scientists at IBM’s Almaden lab in California, led by Dr. Dharmendra Modha.
IBM received a DARPA contract in 2008 to develop a new kind of cognitive computer with an architecture similar to the brain’s.
By 2011, IBM built two prototype chips called Golden Gate and San Francisco each containing just 256 neurons, roughly the size of the nervous system of a worm.
Despite the limited number of neurons, the chips were capable of simple cognitive exercises such as playing pong and recognizing handwritten digits.
By 2013, Modha’s team had shrunk the components of the Golden Gate chip 15-fold and reduced its power consumption 100-fold to create a neurosynaptic core.
4,096 of these smaller, more efficient cores fit together on a 64 x 64 grid, forming the TrueNorth chip, equipped with 1 million neurons and 256 million synapses.
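As a quick sanity check of those figures (illustrative arithmetic only, using the per-core numbers given in this script):

```python
cores = 64 * 64            # 4,096 cores on the grid
neurons = cores * 256      # 1,048,576 -> the "1 million neurons"
synapses = neurons * 256   # 268,435,456 (256 x 2^20), i.e. the quoted "256 million"
print(cores, neurons, synapses)
```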
The cores operate independently and in parallel with one another.
Each core houses both memory and processing functions: all of the instructions and information it needs are stored locally, mimicking the neurons of the brain.
One small section of each core holds the axon buffers, 256 inputs that receive incoming data spikes, much like a neuron’s dendrites in the brain.
The neuron block holds the 256 neurons, each individually programmed to send its output when its threshold is reached, similar to neurons in the brain.
As in the brain, the outputs are messages called spikes, which indicate neuron activity.
Each spike is sent to the routing network, which forwards it to other neurons, similar to axon terminals in the brain.
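The accumulate-until-threshold behavior described above can be sketched in a few lines of Python. To be clear, this is a generic integrate-and-fire illustration, not IBM’s actual TrueNorth programming model; the thresholds, weights, and wiring below are invented purely for illustration.

```python
import random

class Neuron:
    def __init__(self, threshold):
        self.threshold = threshold
        self.potential = 0    # accumulated input, loosely a membrane potential
        self.targets = []     # indices of neurons this one routes its spikes to

    def receive(self, weight):
        """Integrate an incoming spike; fire if the threshold is crossed."""
        self.potential += weight
        if self.potential >= self.threshold:
            self.potential = 0    # reset after firing
            return True           # a spike: the only message a neuron emits
        return False

# Tiny three-neuron "core": neurons 0 and 1 both feed neuron 2.
neurons = [Neuron(threshold=2), Neuron(threshold=2), Neuron(threshold=3)]
neurons[0].targets = [2]
neurons[1].targets = [2]

for step in range(10):
    fired = [i for i in (0, 1) if neurons[i].receive(random.choice([0, 1]))]
    for i in fired:                        # the "routing network" delivers spikes
        for j in neurons[i].targets:
            if neurons[j].receive(weight=2):
                print(f"step {step}: neuron {j} spiked")
```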
Indeed, TrueNorth mimics the brain’s architecture quite well, and it also mimics its efficiency.
With 5.4 billion transistors, TrueNorth is the second-largest chip IBM has ever produced, and yet it consumes just 73 milliwatts, around a thousand times less than a typical CPU.
It can run at full blast on an iPhone battery for a whole week.
On top of all of this, TrueNorth can be tiled with other TrueNorth Chips for increasingly complex tasks.
This is the NS16e circuit board incorporating 16 IBM TrueNorth chips.
Since its inception, TrueNorth has proven to be incredibly proficient at machine learning applications such as image recognition.
TrueNorth is capable of monitoring dozens of TV video feeds at the same time, classifying 6,000 images per watt.
To put this in perspective, NVIDIA’s Tesla P4 GPU classifies about 160 images per watt.
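For a rough sense of scale, here is the ratio implied by those two per-watt figures (a quick illustrative calculation; the underlying benchmarks are not necessarily identical):

```python
truenorth_images_per_watt = 6_000
tesla_p4_images_per_watt = 160
print(truenorth_images_per_watt / tesla_p4_images_per_watt)   # 37.5x
```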
- Videos from the AMD, Intel, NVIDIA, and other official YouTube channels were used for illustrative purposes; all copyrights belong to the respective owners, used here under fair use.
- A few seconds from several other YouTube videos have been used for illustrative and educational purposes. Please contact me if you'd like to be credited!
#NEUROMORPHIC #PROCESSORS #BRAIN
How to Wire a Computer Like a Human Brain
The goal of neuromorphic computing is simple: mimic the neural structure of the brain. Meet the current generation of computer chips that's getting closer to reaching this not-so-simple goal.
» Subscribe to Seeker! http://bit.ly/subscribeseeker
» Watch more Elements! http://bit.ly/ElementsPlaylist
» Visit our shop at http://shop.seeker.com
The central processing unit, or CPU, that makes your home computer work is often likened to a brain, but the truth is it’s nothing like the brains found in nature, or in our skulls.
CPUs are great at performing precise calculations with huge numbers, but when it comes to learning and abstraction, the thinky meat between our ears has the CPU licked.
An emerging field of artificial intelligence called neuromorphic computing is attempting to mimic how the neurons in our own brains work, and researchers from Intel and IBM are making true silicon brains a reality.
Brain-like computer chips are a totally wild concept. And they might just be the future of AI https://www.inverse.com/innovation/neuromorphic-computing
"Current neuromorphic systems primarily use silicon-based superconducting neural networks that the authors write are set to far surpass their energy limit by 2040 at their current rate."
Elements is more than just a science show. It’s your science-loving best friend, tasked with keeping you updated and interested in all the compelling, innovative and groundbreaking science happening all around us. Join our passionate hosts as they help break down and present fascinating science, from quarks to quantum theory and beyond.
Seeker empowers the curious to understand the science shaping our world. We tell award-winning stories about the natural forces and groundbreaking innovations that impact our lives, our planet, and our universe.