Intel Loihi Robotics Lit Review

A summary of different published robotics applications of Intel's neuromorphic research chip, Loihi

  • Combined a spiking neural network (SNN) with a deep neural network (DNN), training them jointly for mapless navigation

  • The spiking actor network (SAN), a type of SNN, was deployed on the Loihi

  • Loihi communicates with the on-board computer of the TurtleBot2, which runs ROS to handle the actual sensing and motor control

Figure 3 from Tang et al. 2020
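The deployment idea above can be sketched in a few lines: a population of leaky integrate-and-fire (LIF) neurons receives the observation as input current, and the action (wheel velocities) is decoded from mean firing rates. This is a toy illustration, not Tang et al.'s actual code; all weights, sizes, and names here are hypothetical.

```python
import numpy as np

def lif_step(v, input_current, dt=1e-3, tau=0.02, v_thresh=1.0):
    """One Euler step of leaky integrate-and-fire neurons; returns (v, spikes)."""
    v = v + dt * (input_current - v) / tau
    spikes = v >= v_thresh
    v = np.where(spikes, 0.0, v)          # reset membrane on spike
    return v, spikes.astype(float)

def san_action(obs, w_in, w_out, T=50):
    """Run the spiking actor for T steps and decode an action
    from mean firing rates (population rate code)."""
    v = np.zeros(w_in.shape[0])
    rates = np.zeros(w_in.shape[0])
    for _ in range(T):
        v, s = lif_step(v, w_in @ obs)
        rates += s / T
    return w_out @ rates                  # e.g. [linear_vel, angular_vel]

rng = np.random.default_rng(0)
obs = rng.uniform(-1, 1, size=24)         # e.g. laser scan + goal (hypothetical)
w_in = rng.normal(0, 0.5, size=(128, 24))
w_out = rng.normal(0, 0.1, size=(2, 128))
action = san_action(obs, w_in, w_out)
```

In the real system these weights come from joint training with the DNN critic, and the SNN itself runs on the Loihi rather than in numpy.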

Rover Problem

  • Built the controller with conventional (non-neural) methods, then converted it into an SNN to deploy on the Loihi

    • Visual input processor developed as a DNN converted to an SNN

    • Control signals generated with the Neural Engineering Framework (NEF): the user first finds a conventional circuit that solves the problem, then converts it into an NEF network

  • Once the vision and control sub-networks were converted, they were connected together on the Loihi

  • Uses Python interfaces (NengoInterfaces) to communicate with the board
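The NEF conversion step mentioned above boils down to solving a regularized least-squares problem for linear decoders over the neurons' tuning curves. A toy numpy sketch of that principle (rectified-linear tuning curves decoding f(x) = x²; all parameters are illustrative, and real NEF tooling like Nengo does this internally):

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_eval = 100, 200
x = np.linspace(-1, 1, n_eval)            # evaluation points

# Random tuning curves: rate_i(x) = max(0, gain_i * enc_i * x + bias_i)
enc = rng.choice([-1.0, 1.0], n_neurons)  # preferred directions
gain = rng.uniform(50, 100, n_neurons)
bias = rng.uniform(-30, 30, n_neurons)
A = np.maximum(0.0, gain * enc * x[:, None] + bias)   # (n_eval, n_neurons)

# Solve for decoders d such that A @ d ~= f(x), with L2 regularization
target = x ** 2
reg = 0.1 * A.max()
d = np.linalg.solve(A.T @ A + (reg ** 2) * n_eval * np.eye(n_neurons),
                    A.T @ target)

estimate = A @ d
rmse = np.sqrt(np.mean((estimate - target) ** 2))
```

The decoders `d` become connection weights in the spiking network, which is what lets a conventionally designed circuit be mapped onto neurons.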

Arm Control Problem

  • Augments an existing PD controller to assist a robot arm (Kinova Jaco) in reaching while holding an unexpected weight

  • Loihi adds a context-sensitive integral-like term to the controller to account for varying parameters. This runs separately from the regular PD operational space controller (OSC)

Fig 2.1 of DeWolf et al. 2020
  • The PD OSC runs in Python and uses NengoInterfaces to send joint data to the Loihi (or elsewhere; the network can also run on a regular CPU or GPU)

  • The NEF handles distributing sensory signals to the neurons
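The benefit of adding an adaptive term alongside a fixed PD controller shows up even in a 1-DOF toy system. The sketch below uses plain integral-style error-driven adaptation, not DeWolf et al.'s neural implementation: PD alone leaves a steady-state error under an unknown load, while the learned bias term removes it.

```python
def run(adapt, kp=50.0, kd=15.0, lr=30.0, load=5.0,
        target=1.0, m=1.0, dt=1e-3, T=8.0):
    """Simulate a 1-DOF mass reaching `target` against an unknown
    constant load, with an optional adaptive bias on top of PD control."""
    q = dq = u_ad = 0.0
    for _ in range(int(T / dt)):
        e = target - q
        u = kp * e - kd * dq + u_ad       # PD + learned bias term
        if adapt:
            u_ad += lr * e * dt           # error-driven adaptation
        dq += (u - load) / m * dt         # Euler-integrate the dynamics
        q += dq * dt
    return target - q                     # final position error

err_pd = run(adapt=False)        # steady-state error ~ load / kp = 0.1
err_adaptive = run(adapt=True)   # adaptation cancels the unknown load
```

In the paper the adaptive signal is produced by a spiking network on the Loihi conditioned on arm state, so it generalizes across contexts rather than being a single scalar integrator.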

Hexapod Problem

  • Controlled a fully simulated hexapod with a central pattern generator (CPG) network on the Loihi

  • Created a new bursting neuron compartment model for the Loihi, then used the bursting activity as a pacemaker for the CPG

Fig 1 of Polykretis et al. 2020
Fig 2 of Polykretis et al. 2020
  • Actual communication to the simulation is done through ROS

    • Communication between ROS and the Loihi can run "faster than real time," since everything is simulated
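Loihi's standard compartments don't burst on their own, which is why a new compartment model was needed. As a stand-in illustration of the bursting dynamics that serve as a pacemaker (not the authors' actual Loihi model), here is the classic Izhikevich neuron with "chattering" parameters, which fires repeated bursts under constant input:

```python
import numpy as np

def izhikevich_burster(T_ms=1000.0, dt=0.1, a=0.02, b=0.2,
                       c=-50.0, d=2.0, I=10.0):
    """Izhikevich model with 'chattering' parameters (c=-50, d=2),
    which produces repeated bursts under constant input current I.
    Returns spike times in milliseconds."""
    v, u = -65.0, b * -65.0
    spike_times = []
    for step in range(int(T_ms / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                 # spike: record time and reset
            spike_times.append(step * dt)
            v, u = c, u + d
    return np.array(spike_times)

spikes = izhikevich_burster()
isi = np.diff(spikes)   # short intervals within a burst, long ones between
```

The bimodal inter-spike intervals (tight clusters separated by long gaps) are what make a bursting cell useful as a rhythmic pacemaker for a CPG.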

Oculomotor Problem

  • Developed an oculomotor controller and deployed it on a physical robot head

    • Head made with Dynamixel servos

  • Controller was developed based on knowledge of existing biological structure of eye

  • Included a training component only to see whether the system would improve with it; the network design itself did not require any training to implement

  • Did not discuss how the Loihi controller interfaces with the servos, but included C++ code in the appendix (which I couldn't immediately find)

Fig 1 of Balachandar et al 2020
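Since the servo interface isn't described in the paper, here is one plausible sketch of the glue code such a setup would need, assuming AX-12-class Dynamixel servos (which map 0–300° onto goal-position ticks 0–1023); the function name and centering convention are my own, not the authors'.

```python
def deg_to_ticks(angle_deg, center_deg=150.0, range_deg=300.0, max_ticks=1023):
    """Convert a signed eye-rotation command in degrees (0 = straight ahead)
    into a Dynamixel AX-12 goal-position value (0-1023 spans 0-300 degrees)."""
    pos = center_deg + angle_deg          # shift so 0 deg sits at mid-range
    pos = max(0.0, min(range_deg, pos))   # clamp to the servo's travel limits
    return round(pos * max_ticks / range_deg)
```

For example, `deg_to_ticks(0)` centers the eye at tick 512, and commands beyond ±150° are clamped to the servo's mechanical range. The actual write to the servo would go over the Dynamixel serial bus (e.g. via a library such as `dynamixel_sdk`).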
