Intel Loihi Robotics Lit Review
A summary of different published robotics applications of Intel's neuromorphic research chip, Loihi
Last updated
All of these sources were found in the review paper: M. Davies et al., "Advancing Neuromorphic Computing With Loihi: A Survey of Results and Outlook," in Proceedings of the IEEE, vol. 109, no. 5, pp. 911-934, May 2021
Combined a spiking neural network (SNN) with a deep neural network (DNN) and trained them in conjunction on mapless navigation
The spiking actor network (SAN), a type of SNN, was deployed on the Loihi
Loihi communicates with the on-board computer of the TurtleBot2, which runs ROS to handle the actual sensing and motor control
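The actor-to-robot handoff above can be sketched in a small example. This is a hypothetical decoding step, not the paper's method: the function name, the spike-window scheme, and all numbers are illustrative. It shows how spike counts from an actor population might be mapped to a wheel-velocity command before being published to the TurtleBot2's ROS stack.

```python
def decode_velocity(spike_counts, max_rate, v_max):
    """Map per-neuron spike counts in one time window to a velocity command.

    A common SNN-actor decoding: normalize the population's total spike
    count into [0, 1], then scale to the robot's velocity range.
    """
    total = sum(spike_counts)
    norm = total / (len(spike_counts) * max_rate)
    return min(norm, 1.0) * v_max

# e.g. 4 actor neurons, each allowed up to 10 spikes per window,
# commanding a TurtleBot2-style base capped at 0.5 m/s
v = decode_velocity([3, 7, 5, 5], max_rate=10, v_max=0.5)  # -> 0.25
```

On the real robot, a value like `v` would then be packed into a ROS velocity message and published to the base each control tick.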
Built their controller without neural networks (using traditional tools), then converted it into an SNN to run on the Loihi
Visual input processor was developed as a DNN and converted to an SNN
Control signals were generated with the Neural Engineering Framework (NEF). The user must first find a conventional circuit that solves their problem, then convert it to an NEF network
Once the vision and control sub-networks were converted, they were connected together on the Loihi
Uses Python interfaces (NengoInterfaces) to communicate with the board
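A minimal sketch of the NEF representation principle this workflow relies on: a population encodes a value through tuning curves, and a linear decode recovers it. The two-neuron rectified-linear example below is purely illustrative; real NEF populations use many noisy spiking neurons with least-squares decoders, and the Nengo tooling handles the conversion.

```python
def encode(x):
    """Tuning curves: one neuron prefers +x, the other -x (rectified linear)."""
    a_pos = max(0.0, x)
    a_neg = max(0.0, -x)
    return a_pos, a_neg

def decode(a_pos, a_neg):
    """Linear decoders (+1, -1) recover the represented value."""
    return a_pos - a_neg

x_hat = decode(*encode(-0.3))  # recovers -0.3
```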
Augments an existing PD controller to help a robot arm (Kinova Jaco) reach while holding an unexpected weight
Loihi adds a context-sensitive integral term to the controller to account for varying parameters. This runs separately from the regular PD operational space controller (OSC)
The PD OSC runs in Python and uses NengoInterfaces to send joint data to the Loihi (or elsewhere; the adaptive network can also run on a regular CPU or GPU)
The NEF handles distributing the sensory signals to the neurons
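The control structure described above can be sketched on a toy one-dimensional plant. This is not the paper's controller: the gains, learning rate, and gravity-like disturbance are invented for illustration, and a simple error-driven bias stands in for the adaptive term learned on the Loihi.

```python
def step(q, dq, target, bias, kp=4.0, kd=2.0, lr=0.002):
    """One control tick: fixed PD term plus a slowly adapting bias."""
    err = target - q
    u = kp * err - kd * dq + bias   # PD OSC term + context-sensitive term
    bias = bias + lr * err          # adaptive term integrates residual error
    return u, bias

# Toy 1-DOF plant with an unexpected constant load of -1.0 (standing in
# for an unmodeled held weight) that the PD terms alone cannot cancel.
q, dq, bias, dt = 0.0, 0.0, 0.0, 0.01
for _ in range(20000):
    u, bias = step(q, dq, 1.0, bias)
    ddq = u - 1.0                   # acceleration = command + disturbance
    dq += dt * ddq
    q += dt * dq
```

With only the PD terms the arm would settle short of the target; the adaptive bias winds up until it cancels the load, which is the role the Loihi network plays around the conventional OSC.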
Controlled a fully simulated hexapod with a central pattern generator (CPG) network on the Loihi
Created a new bursting-neuron compartment model for the Loihi, then used the bursting activity as a pacemaker for the CPG
Communication with the simulation is done through ROS
Achieves "faster than real time" communication between ROS and the Loihi, since everything is simulated
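As a rough illustration of how a pacemaker-driven CPG yields a hexapod gait (a plain phase-oscillator sketch, not the paper's bursting-compartment model): the six legs split into two tripods locked half a cycle apart, with the pacemaker setting the common frequency.

```python
import math

def tripod_phases(t, freq=1.0):
    """Phases (radians) of six legs at time t for an alternating tripod gait."""
    base = 2 * math.pi * freq * t
    # legs 0, 2, 4 form one tripod; legs 1, 3, 5 the other, pi out of phase
    return [base + (i % 2) * math.pi for i in range(6)]

# a quarter-cycle into the gait
phases = tripod_phases(0.25)
```

Each phase would then be mapped to joint angles for its leg, so one tripod swings while the other stances.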
Developed an oculomotor controller and put it into a robot head
Head made with Dynamixel servos
Controller was developed based on knowledge of existing biological structure of eye
Included a training component only to see whether the system would improve with it; the network design strategy itself did not require any training to implement
Did not discuss how the Loihi controller interfaces with the servos, but said C++ code was included in the appendix (which I couldn't immediately find)