Four years ago, Jason Eshraghian of UC Santa Cruz created snnTorch, an open-source Python library that merges neuroscience and artificial intelligence to build spiking neural networks, a machine learning approach inspired by the way the brain processes data. Now past 100,000 downloads, the library is used in a wide range of projects, from NASA's satellite-tracking work to semiconductor companies tuning their chips for AI.
A paper published in the journal Proceedings of the IEEE not only documents the code library but also serves as an educational resource for students and programmers who want to understand brain-inspired AI.
Eshraghian, an assistant professor of electrical and computer engineering, sees the widespread adoption of snnTorch as a sign of growing interest in brain dynamics and of a recognition that conventional neural networks are far less efficient than the brain. As concern mounts over the environmental cost of power-hungry neural networks and large language models, brain-inspired approaches offer a promising alternative.
Building snnTorch
Spiking neural networks mimic the brain's efficiency by staying at rest until they receive information, in contrast to traditional neural networks, which process data continuously. Hoping to combine that efficiency with the capabilities of AI, Eshraghian began writing a spiking neural network library in Python during the pandemic, partly as a passion project and partly as a way to learn the language itself.
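To illustrate that event-driven behavior, here is a minimal sketch using snnTorch's Leaky (leaky integrate-and-fire) neuron, following the pattern shown in the library's tutorials; exact argument names and defaults may differ between versions.

```python
import torch
import snntorch as snn

# A single leaky integrate-and-fire neuron: input current accumulates in a
# membrane potential, and the neuron emits a spike only when that potential
# crosses its threshold; otherwise it stays silent.
lif = snn.Leaky(beta=0.9, threshold=1.0)  # beta: membrane decay per time step
mem = lif.init_leaky()                    # start from a resting membrane potential

# A mostly quiet input stream with a brief burst of current in the middle.
current_in = torch.zeros(100)
current_in[40:45] = 0.5

spikes = []
for step in range(current_in.shape[0]):
    spk, mem = lif(current_in[step], mem)  # returns (spike output, updated membrane)
    spikes.append(spk)

print(f"Spikes emitted: {int(torch.stack(spikes).sum())} out of 100 time steps")
```

Most time steps produce no spikes at all, which is where the potential power savings come from on hardware that only does work when a spike arrives.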
Eshraghian started out as a chip designer, and he saw an opportunity to optimize computing chips by co-designing the software and hardware together for better power efficiency. snnTorch has since gained traction worldwide, supporting projects such as NASA's satellite tracking and collaborations with major chip designers like Graphcore.
As he built the Python library, Eshraghian documented the code and produced educational materials alongside it, a natural outgrowth of teaching himself the language. Those resources became go-to references for people entering neuromorphic engineering and spiking neural networks, and they contributed significantly to the library's widespread adoption.
An honest resource
Recognizing that his educational materials could be valuable to the growing community of computer scientists and others moving into the field, Eshraghian decided to consolidate his extensive documentation into a paper.
The paper serves as a companion to the snnTorch code library, written in a tutorial format and in an unapologetically opinionated voice. In it, Eshraghian openly addresses the uncertainties of brain-inspired deep learning and offers his own perspective on where the field is headed.
That forthrightness is meant to spare students the frustration of hunting for theoretical justifications behind code decisions in a field as unsettled as neuromorphic computing. Eshraghian emphasizes that the paper is honest about what remains unknown in deep learning and about the sometimes murky reasoning behind approaches that happen to work.
Unlike a traditional research paper, the publication includes code blocks, some accompanied by explanations that point out which areas remain unsettled and why researchers believe a given approach might succeed.
That candid approach has resonated within the community, and Eshraghian has heard of the paper being used in onboarding materials at neuromorphic hardware startups. He says he is committed to sparing others the struggles he faced during his own research.
Learning from and about the brain
The paper also takes on the challenges of brain-inspired deep learning, given how little is understood about the way the brain processes information. Eshraghian argues that AI researchers need to confront the differences between deep learning and biology if they want to adopt more brain-like learning mechanisms.
One crucial difference is that the brain cannot look back over all the data it has ever seen, the way an AI model trained on a stored dataset can. Instead, the brain learns from data as it arrives in real time, coupling training and processing, which Eshraghian notes is a promising route to better energy efficiency.
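To make that contrast concrete, the sketch below compares offline training, which replays a stored dataset many times, with an online update that sees each sample once and then discards it. The function names and the simple delta-rule update are purely illustrative assumptions, not part of snnTorch or the paper.

```python
import torch

# Offline, batch-style learning: the full dataset is stored and can be
# revisited on every epoch, something a biological brain cannot do.
def train_offline(data, targets, w, lr=0.1, epochs=10):
    for _ in range(epochs):
        for x, y in zip(data, targets):     # replay all past samples
            w = w + lr * (y - x @ w) * x
    return w

# Online, brain-like learning: each sample updates the weights immediately
# and is then thrown away; there is no stored history to replay.
def train_online(stream, w, lr=0.1):
    for x, y in stream:
        w = w + lr * (y - x @ w) * x
    return w

data = torch.randn(50, 3)
targets = data @ torch.tensor([0.5, -1.0, 2.0])
w0 = torch.zeros(3)
print("offline:", train_offline(data, targets, w0.clone()))
print("online: ", train_online(zip(data, targets), w0.clone()))
```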
The paper also explores the neuroscience principle that neurons which fire together strengthen their connections, a phenomenon that is still not fully understood at the scale of the whole brain. That kind of local learning rule has traditionally been seen as opposed to deep learning's backpropagation, but Eshraghian suggests the two may be complementary, opening new avenues for the field.
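The "fire together, wire together" idea can be written as a simple local update in which a weight grows only when both of the neurons it connects are active. The sketch below is illustrative only; the function name hebbian_update and the learning rate are assumptions, and this is neither snnTorch's API nor the specific rule analyzed in the paper.

```python
import torch

# Hebbian-style local update: the change to each weight depends only on the
# activity of its pre- and post-synaptic neurons, not on a global error
# signal propagated backward through the network.
def hebbian_update(w, pre_spikes, post_spikes, lr=0.01):
    # Outer product of post- and pre-synaptic activity: a connection is
    # strengthened in proportion to how often both of its endpoints fired.
    return w + lr * torch.outer(post_spikes, pre_spikes)

pre = torch.tensor([1.0, 0.0, 1.0])   # which input neurons fired
post = torch.tensor([0.0, 1.0])       # which output neurons fired
w = torch.zeros(2, 3)                 # weights from 3 inputs to 2 outputs
w = hebbian_update(w, pre, post)
print(w)  # only entries linking an active input to an active output grow
```

Backpropagation, by contrast, adjusts every weight using an error computed at the network's output, which is one reason the two are usually framed as competing rather than complementary approaches.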
Eshraghian is also enthusiastic about collaborating with biomolecular engineering researchers on cerebral organoids, models of brain tissue grown from stem cells. Working with the Braingeneers group at the UCSC Genomics Institute offers a rare opportunity to bring "wetware" (biological models) into the familiar software/hardware co-design paradigm. The snnTorch code may serve as a platform for simulating organoids, yielding insights that are hard to obtain in a lab setting.
Ultimately, the collaboration could help unravel how the brain processes information and point toward ways of making deep learning more efficient.
Brain-inspired learning at UCSC and beyond
Eshraghian brings the ideas from his library and the new paper into his UC Santa Cruz course on neuromorphic computing, "Brain-Inspired Deep Learning." Undergraduate and graduate students from a range of disciplines take the class to learn deep learning fundamentals and complete projects in which they write tutorials for snnTorch, work that can feed back into the library's development.
For Eshraghian, the class is about more than grades: he wants students to leave having made tangible contributions. That collaborative ethos extends beyond the classroom, where Discord and Slack channels devoted to the spiking neural network code bring industry and academia together. He even recently came across a job posting that listed proficiency in snnTorch as a desired qualification, a sign of the work's real-world impact.
Eshraghian's collaborations range from uncovering biological insights about the brain to pushing neuromorphic chips toward efficient, low-power AI workloads. He also hopes to facilitate collaborations that bring the spiking style of computing into domains beyond traditional AI, including applications in natural physics.