    On-Chip Implementation of Backpropagation for Spiking Neural Networks on Neuromorphic Hardware

    November 26, 2024

    Natural neural systems have inspired innovations in machine learning and neuromorphic circuits designed for energy-efficient data processing. However, implementing the backpropagation algorithm, a foundational tool in deep learning, on neuromorphic hardware remains challenging due to its reliance on bidirectional synapses, gradient storage, and nondifferentiable spikes. These issues make it difficult to achieve the precise weight updates required for learning. As a result, neuromorphic systems often depend on off-chip training, where networks are pre-trained on conventional systems and only used for inference on neuromorphic chips. This limits their adaptability, reducing their ability to learn autonomously after deployment.

    Researchers have developed alternative learning mechanisms tailored for spiking neural networks (SNNs) and neuromorphic hardware to address these challenges. Techniques like surrogate gradients and spike-timing-dependent plasticity (STDP) offer biologically inspired solutions, while feedback networks and symmetric learning rules mitigate issues such as weight transport. Other approaches include hybrid systems, compartmental neuron models for error propagation, and random feedback alignment to relax weight symmetry requirements. Despite progress, these methods face hardware constraints and limited computational efficiency. Emerging strategies, including spiking backpropagation and STDP variants, promise to enable adaptive learning on neuromorphic systems directly.
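The surrogate-gradient idea mentioned above can be illustrated with a minimal sketch: the forward pass keeps the non-differentiable hard-threshold spike, while the backward pass substitutes a smooth bump around the threshold. The fast-sigmoid-style surrogate and its `beta` sharpness parameter here are illustrative choices, not the specific functions used in the paper.

```python
import numpy as np

def spike_forward(v, threshold=1.0):
    """Non-differentiable spike: emit 1 where membrane potential crosses threshold."""
    return (v >= threshold).astype(np.float64)

def spike_surrogate_grad(v, threshold=1.0, beta=5.0):
    """Surrogate derivative: a smooth bump centered on the threshold,
    used during backprop in place of the true derivative, which is
    zero almost everywhere and undefined at the threshold."""
    return beta / (1.0 + beta * np.abs(v - threshold)) ** 2

v = np.array([0.2, 0.8, 1.05, 2.0])   # example membrane potentials
spikes = spike_forward(v)             # binary spike outputs
grads = spike_surrogate_grad(v)       # largest gradient near the threshold
```

The surrogate concentrates learning signal on neurons whose potential is close to the firing threshold, which is exactly where a small weight change can flip the spike decision.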

    Researchers from the Institute of Neuroinformatics at the University of Zurich and ETH Zurich, Forschungszentrum Jülich, Los Alamos National Laboratory, London Institute for Mathematical Sciences, and Peking University have developed the first fully on-chip implementation of the exact backpropagation algorithm on Intel’s Loihi neuromorphic processor. Leveraging synfire-gated synfire chains (SGSCs) for dynamic information coordination, this method enables SNNs to classify MNIST and Fashion MNIST datasets with competitive accuracy. The streamlined design integrates Hebbian learning mechanisms and achieves an energy-efficient, low-latency solution, setting a baseline for evaluating future neuromorphic training algorithms on modern deep learning tasks.

The methods section outlines the system at three levels: computation, algorithm, and hardware. A binarized backpropagation model computes network inference using weight matrices and activation functions, minimizing error through recursive weight updates. A surrogate ReLU replaces the non-differentiable threshold function during backpropagation. Weights are initialized from the He distribution, and MNIST preprocessing involves cropping, thresholding, and downsampling. A spiking neural network implements these computations with a leaky integrate-and-fire neuron model on Intel’s Loihi chip. Synfire gating ensures autonomous spike routing, and learning employs a modified Hebbian rule with supervised updates controlled by gating neurons and reinforcement signals for precise temporal coordination.
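The computation-level pieces described above can be sketched as follows: He initialization of the weights, a binary threshold activation in the forward pass, and a surrogate ReLU derivative in the backward pass. Layer sizes and function names are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def he_init(fan_in, fan_out):
    # He initialization: zero-mean Gaussian with std = sqrt(2 / fan_in)
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

def forward(x, W):
    pre = x @ W                       # pre-activation
    act = (pre > 0).astype(float)     # binary threshold activation
    return pre, act

def backward_surrogate(pre, grad_out):
    # Surrogate ReLU derivative: 1 where the pre-activation is positive,
    # 0 elsewhere, standing in for the hard threshold's derivative.
    return grad_out * (pre > 0).astype(float)

W = he_init(784, 100)                 # e.g. flattened MNIST -> hidden layer
x = rng.random(784)
pre, act = forward(x, W)
grad_in = backward_surrogate(pre, np.ones(100))
```

Because the activation is binary, the surrogate derivative simply gates the incoming error through the units that fired, which keeps the update rule local and Hebbian-compatible.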

The binarized backpropagation (nBP) model was implemented on Loihi hardware, extending a previous architecture with new mechanisms. Each network unit was represented by a spiking neuron using the current-based leaky integrate-and-fire (CUBA) model. The network used binary activations, discrete weights, and a three-layer feedforward MLP. Synfire gating controlled the flow of information, enabling precise Hebbian weight updates. Training on MNIST achieved 95.7% accuracy while consuming only 0.6 mJ per sample; on Fashion MNIST, the model reached 79% accuracy after 40 epochs. Owing to its spiking nature, the network was inherently sparse, further reducing energy use during inference.
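A minimal discrete-time sketch of the CUBA neuron model mentioned above: synaptic current and membrane potential each decay exponentially, the current integrates weighted input spikes, and the potential resets after crossing threshold. The decay constants, weight, and threshold here are illustrative; Loihi itself uses fixed-point state variables and chip-specific parameters.

```python
def cuba_lif(input_spikes, w=0.6, tau_i=0.5, tau_v=0.8, v_th=1.0):
    """Simulate one current-based leaky integrate-and-fire (CUBA) neuron.
    input_spikes: iterable of 0/1 inputs, one per time step.
    Returns the binary output spike train."""
    i, v = 0.0, 0.0
    out = []
    for s in input_spikes:
        i = tau_i * i + w * s      # synaptic current: leak plus weighted input
        v = tau_v * v + i          # membrane potential: leak plus current
        if v >= v_th:
            out.append(1)
            v = 0.0                # reset membrane potential after a spike
        else:
            out.append(0)
    return out

train = cuba_lif([1, 1, 1, 0, 0, 1, 1, 1])
```

The two coupled leaky variables are what distinguish CUBA from a plain LIF neuron: input spikes first shape a current trace, and only that filtered current drives the membrane, smoothing the neuron's response over time.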

The study successfully implements the backpropagation (nBP) algorithm fully on neuromorphic hardware, specifically Intel’s Loihi VLSI processor. The approach resolves key obstacles, including weight transport, backward computation, gradient storage, differentiability, and hardware constraints, through symmetric learning rules, synfire-gated synfire chains, and surrogate activation functions. Evaluated on MNIST and Fashion MNIST, the algorithm achieves high accuracy with low power consumption, highlighting the potential for efficient, low-latency deep learning on neuromorphic processors. Further work is needed to scale to deeper networks, convolutional models, and continual learning while addressing computational overhead.


    The post On-Chip Implementation of Backpropagation for Spiking Neural Networks on Neuromorphic Hardware appeared first on MarkTechPost.
