Air Force Research Lab develops brain-like sensory supercomputing

The Air Force Research Laboratory (AFRL) and IBM have embarked upon a joint AI venture to engineer a first-of-its-kind brain-inspired supercomputing system powered by a 64-chip array, industry officials said.

The technology is aimed at improving sensory processing beyond systems powered by standard chips.

The IBM TrueNorth Neurosynaptic System is designed to convert data, such as images, video, audio and text, from multiple distributed sensors into symbols in real time, an IBM statement said.
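
As a rough illustration of that sensors-to-symbols flow, here is a minimal sketch in plain Python; it is not IBM's TrueNorth software stack, and the `Reading` type and `classify` stub are hypothetical stand-ins:

```python
# A minimal, hypothetical sketch of the sensors-to-symbols flow described
# above (not IBM's TrueNorth software stack). Each distributed sensor
# sample is mapped to a discrete symbolic label as it arrives.

from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class Reading:
    sensor_id: str   # which distributed sensor produced the sample
    modality: str    # "image", "video", "audio" or "text"
    payload: bytes   # raw sample data

def classify(reading: Reading) -> str:
    """Placeholder for a per-modality classifier; on the real system a
    trained spiking network would map the raw sample to a symbol."""
    return f"{reading.modality}:unknown"

def to_symbols(stream: Iterable[Reading]) -> Iterator[tuple[str, str]]:
    # Emit (sensor_id, symbol) pairs in arrival order, mimicking the
    # real-time, streaming character of the system.
    for reading in stream:
        yield reading.sensor_id, classify(reading)
```

In a real deployment the stub would be replaced by trained networks running on the neuromorphic chips themselves.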

“AFRL will combine this ‘right-brain’ perception capability of the system with the ‘left-brain’ symbol processing capabilities of conventional computer systems,” the statement continued.

The scalable platform IBM is building for AFRL will feature an end-to-end software ecosystem designed to enable deep neural-network learning and information discovery.

The system’s advanced pattern recognition and sensory processing power will be the equivalent of 64 million neurons and 16 billion synapses, while the processor component will consume the energy equivalent of a dim light bulb, developers explained.

“AFRL was the earliest adopter of TrueNorth for converting data into decisions,” Daniel S. Goddard, director of the Information Directorate at the U.S. Air Force Research Laboratory, said in a written statement. “The new neurosynaptic system will be used to enable new computing capabilities important to AFRL’s mission to explore, prototype and demonstrate high-impact, game-changing technologies that enable the Air Force and the nation to maintain its superior technical advantage.”

Supercomputing technology designed to mirror the function of human neurons presents both advantages and limitations. While high-speed processing can, in some instances, outperform human cognition at procedural tasks, most AI experts maintain that computers still cannot equal human perception in dynamic circumstances that demand real-time problem solving.

Nonetheless, there is broad consensus among experts that supercomputing advances can mirror or accelerate functions currently performed by the human mind. Algorithms controlling autonomy for unmanned air and ground systems are rapidly advancing. Aerial drones and ground robots are increasingly able to navigate and interpret data without needing to be teleoperated by a human being. Also, small robots engineered to find and detonate roadside bombs are now equipped with software enabling them to perform sensory functions autonomously or semi-autonomously.

Weapons systems, such as the advanced seeker technology used on the Navy’s Long-Range Anti-Ship Missile, enable semi-autonomous guidance that can direct flight toward moving targets independently. In addition, computer automation such as that used for the F-35 or the new Ford-class aircraft carriers has already shown an ability to massively reduce the cognitive burden upon humans and, in some cases, reduce the need for personnel. New seeker technology for the Tomahawk missile enables the weapon to adjust to a moving target without needing to be redirected by a person.

According to DARPA program managers and senior military leaders, such as Air Force Chief Scientist Gregory Zacharias, computers can add substantial value when it comes to procedural functions such as avionics checklists needed prior to flight. Zacharias believes the day is quickly approaching when fighter pilots will be able to function in a command-and-control capacity, directing nearby autonomous drones that carry weapons, test enemy radar or conduct surveillance missions.

When it comes to solving emerging problems quickly, however, experts maintain that algorithms for autonomy and perception have a long way to go. While massively increased processing speed is likely to expedite procedural functions and the organization of some sensory input data, many AI experts hold that it is not at all clear such developments can replicate the perceptual functions of human neurons.

The system fits in a 4U-high (7-inch) space in a standard server rack, and eight such systems will drive an unprecedented 512 million neurons per rack. A single processor in the system consists of 5.4 billion transistors organized into 4,096 neural cores, creating an array of 1 million digital neurons that communicate with one another via 256 million electrical synapses.
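
Those figures are straightforward to sanity-check. The sketch below assumes the per-core and per-neuron ratios implied by the published totals (256 neurons per core and 256 synapses per neuron); the vendor's round numbers are power-of-two approximations:

```python
# Back-of-the-envelope check of the published scale figures. The
# 256-neurons-per-core and 256-synapses-per-neuron ratios are
# assumptions inferred from the stated totals.

NEURONS_PER_CORE = 256
SYNAPSES_PER_NEURON = 256
CORES_PER_CHIP = 4_096
CHIPS_PER_SYSTEM = 64
SYSTEMS_PER_RACK = 8

neurons_per_chip = CORES_PER_CHIP * NEURONS_PER_CORE        # 1,048,576 ("1 million")
synapses_per_chip = neurons_per_chip * SYNAPSES_PER_NEURON  # 268,435,456 ("256 million")
neurons_per_system = neurons_per_chip * CHIPS_PER_SYSTEM    # 67,108,864 ("64 million")
synapses_per_system = synapses_per_chip * CHIPS_PER_SYSTEM  # ~17.2 billion ("16 billion")
neurons_per_rack = neurons_per_system * SYSTEMS_PER_RACK    # 536,870,912 ("512 million")

print(f"neurons per system:  {neurons_per_system:,}")
print(f"synapses per system: {synapses_per_system:,}")
print(f"neurons per rack:    {neurons_per_rack:,}")
```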

The large scale of the system is designed to enable both data parallelism, where multiple data sources can be run in parallel against the same neural network, and model parallelism, where independent neural networks form an ensemble that can be run in parallel on the same data, according to IBM.
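
In conventional software terms, the two modes contrast roughly as follows. This is a schematic Python sketch, not the TrueNorth runtime; the `Model` callable stands in for any trained network, and a thread pool stands in for the hardware's native chip-level concurrency:

```python
# Schematic contrast of the two parallelism modes IBM describes.
# Nothing here is TrueNorth-specific.

from concurrent.futures import ThreadPoolExecutor
from typing import Callable, Sequence

Model = Callable[[object], object]

def data_parallel(model: Model, sources: Sequence[object]) -> list[object]:
    # One network, many feeds: every data source is evaluated against
    # the same model concurrently.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(model, sources))

def model_parallel(models: Sequence[Model], sample: object) -> list[object]:
    # Many networks, one feed: an ensemble of independent models all
    # score the same sample concurrently.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda m: m(sample), models))
```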

“The evolution of the IBM TrueNorth Neurosynaptic System is a solid proof point in our quest to lead the industry in AI hardware innovation,” Dharmendra S. Modha, IBM fellow and chief scientist of brain-inspired computing at IBM Research–Almaden, said in a written statement. “Over the last six years, IBM has expanded the number of neurons per system from 256 to more than 64 million – an 800 percent annual increase over six years.”


About the Author

Kris Osborn is editor-in-chief of Defense Systems. He can be reached at kosborn@1105media.com.
