The objective of this project is to build a framework for experimenting with and further developing Sparsey, a brain-inspired machine learning model developed by Dr. Rod Rinkus that employs a hierarchical memory structure and sparse representations to support lifelong learning without catastrophic forgetting.

Team Members

Sashwat Anagolum | JD Padrnos | Andy Klawa | Christopher Pereira

Project Poster



Watch the Project Video



Project Summary

Overview

This project builds a framework for experimenting with and further developing Sparsey, a brain-inspired machine learning model created by Dr. Rod Rinkus that uses a hierarchical memory structure and sparse representations to support lifelong learning without catastrophic forgetting. To that end, we built a modernized, Python-based implementation of the core Sparsey model as part of a complete machine learning framework with extensive hyperparameter optimization functionality and robust visualization capabilities for analysis of quantitative results. The framework aims to facilitate lifelong learning and other experiments for Dr. Rinkus, streamline the benchmarking process, and make Sparsey accessible for wider experimentation in the machine learning community as a Python package.
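
To make the intended package workflow concrete, the snippet below sketches how an experiment might be launched from a validated configuration file. Every name in it (the package, function, and configuration path) is hypothetical and used purely for illustration; it is not the framework's published interface.

    # Hypothetical usage sketch; the package and function names are illustrative only.
    from sparsey_framework import run_experiment

    # A validated configuration file describes the model, dataset, training
    # schedule, and metrics, so the user writes no model code at all.
    results = run_experiment(config_path="configs/lifelong_learning_demo.yaml")
    print(results)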

Objectives

  • Complete Python reimplementation of the Sparsey model and its Code Selection Algorithm (see the simplified sketch after this list).
  • Build a modern, extensible testing and hyperparameter optimization framework to support Dr. Rinkus’ current and future research into all aspects of Sparsey.
  • Make Sparsey accessible for wider experimentation in the machine learning community.
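
As a rough illustration of the kind of computation the Code Selection Algorithm performs, the sketch below picks one winning cell per competitive module (CM) of a macrocolumn, with the input's familiarity controlling how deterministic the choice is. This is a deliberately simplified stand-in, not Dr. Rinkus' actual CSA: the layout, normalization, and familiarity-to-temperature mapping are assumptions made purely for illustration.

    import numpy as np

    def select_code(input_vec, weights, num_cms, cells_per_cm, rng=None):
        """Pick one winning cell per competitive module (CM).

        Simplified illustration only: the input's familiarity modulates how
        deterministic winner selection is, so novel inputs receive near-random
        (well-separated) codes while familiar inputs reactivate near-identical
        codes.
        """
        rng = rng or np.random.default_rng()
        # Raw input summation for every cell, arranged as (CMs, cells per CM).
        # weights has shape (num_cms * cells_per_cm, input dimension).
        u = (weights @ input_vec).reshape(num_cms, cells_per_cm)
        # Normalize summations to at most 1.
        u_norm = u / max(u.max(), 1e-9)
        # Familiarity G: average of each CM's best-matching cell.
        G = u_norm.max(axis=1).mean()
        # High familiarity -> low temperature -> near-deterministic winners;
        # low familiarity -> high temperature -> near-random winners.
        temperature = (1.0 - G) + 1e-3
        logits = (u_norm - u_norm.max(axis=1, keepdims=True)) / temperature
        probs = np.exp(logits)
        probs /= probs.sum(axis=1, keepdims=True)
        # The winners (one index per CM) together form the sparse code.
        return np.array([rng.choice(cells_per_cm, p=p) for p in probs])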

Approach

  • Analyzed relevant papers and the existing Java implementation to fully understand the problem domain.
  • Held extensive discussions with Dr. Rinkus about his requirements and use cases to understand the intricacies of Sparsey and tailor the framework to his exact needs.
  • Designed an end-to-end machine learning framework covering every stage of the Sparsey training and testing process.
  • Used a layered architecture so that key components can be added or replaced modularly (see the first sketch after this list).
  • Built on PyTorch to provide high performance and compatibility with researchers’ existing workflows.
  • Integrated extensive functionality for systematic hyperparameter optimization.
  • Leveraged Weights & Biases for experiment tracking and visualization (see the second sketch after this list).
  • Provided automatically validated configuration files and execution scripts so experiments can be run without writing a single line of code (see the third sketch after this list).
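
To show what a layered, modular design on top of PyTorch can look like, the first sketch below assembles a model from interchangeable layer modules. The class and parameter names are hypothetical and chosen only for illustration; they are not the framework's actual API, and the placeholder layer does not implement Sparsey's coding scheme.

    import torch
    from torch import nn

    class PlaceholderLayer(nn.Module):
        """Stand-in layer: any module exposing the same interface could be
        swapped in without touching the rest of the stack."""

        def __init__(self, in_features, out_features):
            super().__init__()
            self.linear = nn.Linear(in_features, out_features)

        def forward(self, x):
            # Dummy computation; a real layer would perform Sparsey-style
            # sparse coding here.
            return torch.relu(self.linear(x))

    class LayeredModel(nn.Module):
        """Builds a stack of layer modules from a simple size specification."""

        def __init__(self, layer_sizes):
            super().__init__()
            self.layers = nn.ModuleList(
                [PlaceholderLayer(i, o)
                 for i, o in zip(layer_sizes[:-1], layer_sizes[1:])]
            )

        def forward(self, x):
            for layer in self.layers:
                x = layer(x)
            return x

    model = LayeredModel([64, 32, 16])
    print(model(torch.rand(1, 64)).shape)  # torch.Size([1, 16])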
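
The second sketch shows what Weights & Biases experiment tracking and a hyperparameter sweep typically look like: each run logs metrics to the cloud, and a sweep configuration drives the search. The project name, metric, and hyperparameter below are placeholders rather than the framework's actual settings.

    import random
    import wandb

    def train():
        # Each sweep trial starts its own tracked run.
        with wandb.init(project="sparsey-experiments") as run:
            # Hyperparameters chosen by the sweep are available via run.config,
            # e.g. run.config["cells_per_cm"].
            for step in range(10):
                # Replace with real training; here we log a dummy metric.
                run.log({"accuracy": random.random(), "step": step})

    # Grid search over a single hypothetical hyperparameter.
    sweep_config = {
        "method": "grid",
        "metric": {"name": "accuracy", "goal": "maximize"},
        "parameters": {"cells_per_cm": {"values": [8, 16, 32]}},
    }
    sweep_id = wandb.sweep(sweep_config, project="sparsey-experiments")
    wandb.agent(sweep_id, function=train, count=3)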
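
The third sketch illustrates the idea behind automatically validated configuration files: every setting is checked against an expected schema before an experiment starts, so mistakes surface immediately instead of mid-run. The section and key names here are assumptions for illustration; the framework's actual configuration format may differ.

    # Hypothetical experiment configuration; key names are illustrative only.
    config = {
        "model": {"num_layers": 2, "cells_per_cm": 16},
        "training": {"dataset": "mnist", "batch_size": 32},
    }

    # Expected sections and the required type of each setting.
    schema = {
        "model": {"num_layers": int, "cells_per_cm": int},
        "training": {"dataset": str, "batch_size": int},
    }

    def validate(config, schema):
        """Raise a descriptive error for any missing or mistyped setting."""
        for section, fields in schema.items():
            if section not in config:
                raise ValueError(f"missing section: {section}")
            for key, expected_type in fields.items():
                value = config[section].get(key)
                if not isinstance(value, expected_type):
                    raise ValueError(
                        f"{section}.{key} must be {expected_type.__name__}, "
                        f"got {value!r}"
                    )

    validate(config, schema)  # passes silently when the config is well-formed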

Outcomes

  • Easy-to-use system for running experiments at scale.
  • Modern Python framework that enables easy collaboration.
  • Vastly expanded range of explorations of the Sparsey model available to Dr. Rinkus and other researchers.
  • Cloud storage and visualization of all experiment data, letting Dr. Rinkus experiment from anywhere.
  • End-to-end experiment time reduced from days to minutes.