
Enterprise AI Analysis

Dynamical modeling of nonlinear latent factors in multiscale neural activity with real-time inference

Real-time decoding of target variables from multiple simultaneously recorded neural time-series modalities, such as discrete spiking activity and continuous field potentials, is important across various neuroscience applications. However, a major challenge for doing so is that different neural modalities can have different timescales (i.e., sampling rates) and different probabilistic distributions, or can even be missing at some time-steps. Existing nonlinear models of multimodal neural activity do not address different timescales or missing samples across modalities. Further, some of these models do not allow for real-time decoding. Here, we develop a learning framework that can enable real-time recursive decoding while nonlinearly aggregating information across multiple modalities with different timescales and distributions and with missing samples. This framework consists of 1) a multiscale encoder that nonlinearly aggregates information after learning within-modality dynamics to handle different timescales and missing samples in real time, 2) a multiscale dynamical backbone that extracts multimodal temporal dynamics and enables real-time recursive decoding, and 3) modality-specific decoders to account for different probabilistic distributions across modalities. In both simulations and three distinct multiscale brain datasets, we show that our model can aggregate information across modalities with different timescales and distributions and missing samples to improve real-time target decoding. Further, our method outperforms various linear and nonlinear multimodal benchmarks in doing so.

Authors: Eray Erturk, Maryam M. Shanechi

Executive Impact

This research introduces MRINE, a framework for real-time decoding of complex neural data. By fusing multimodal time-series with differing timescales and distributions, MRINE improves decoding accuracy and robustness over existing multimodal approaches, paving the way for advanced brain-computer interfaces and deeper neuroscientific insight.

MRINE leverages a novel multiscale encoder and dynamical backbone to nonlinearly aggregate information across discrete spiking activity and continuous field potentials, even with missing samples. It enables real-time recursive decoding, outperforming current state-of-the-art linear and nonlinear multimodal models in simulations and diverse brain datasets.

Key results at a glance:
  • Latent reconstruction CC on the stochastic Lorenz simulation
  • Behavior decoding improvement over benchmarks on NHP datasets
  • 5.4% accuracy decrease with 40% of spike samples missing
  • 5.09 frame ID prediction MAE on the visual stimuli dataset

Deep Analysis & Enterprise Applications

The modules below explore specific findings from the research, reframed for enterprise applications.

MRINE: A Novel Framework for Real-time Multiscale Fusion

MRINE introduces a comprehensive approach to address key challenges in neural data analysis: multimodal integration, disparate timescales, missing data, and real-time inference. Its core innovation lies in a novel multiscale encoder architecture combined with a dynamical backbone and modality-specific decoders. This design enables nonlinear aggregation of information across neural signals like spikes (Poisson, fast timescale) and LFPs (Gaussian, slower timescale), while dynamically handling missing observations.
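To make the design concrete, here is a minimal sketch of how the encoder, fusion, and decoder pieces described above could fit together. It is not the authors' implementation: the per-modality dynamical (Kalman) stage is omitted, and all module names and sizes are illustrative assumptions.

```python
# Illustrative sketch only; names and dimensions are assumptions, not MRINE's released code.
import torch
import torch.nn as nn

class ModalityEncoder(nn.Module):
    """Per-modality MLP that maps raw observations to an embedding."""
    def __init__(self, in_dim: int, hidden: int, out_dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, out_dim))

    def forward(self, x):
        return self.net(x)

class MultiscaleFusionSketch(nn.Module):
    """Spike and LFP embeddings are fused into a shared latent, then mapped back out
    through modality-specific likelihood heads (Poisson rates for spikes, Gaussian means for LFP)."""
    def __init__(self, spike_dim: int, lfp_dim: int, embed_dim: int = 32, latent_dim: int = 16):
        super().__init__()
        self.spike_enc = ModalityEncoder(spike_dim, 64, embed_dim)
        self.lfp_enc = ModalityEncoder(lfp_dim, 64, embed_dim)
        self.fusion = nn.Sequential(nn.Linear(2 * embed_dim, 64), nn.ReLU(), nn.Linear(64, latent_dim))
        self.spike_dec = nn.Sequential(nn.Linear(latent_dim, spike_dim), nn.Softplus())  # non-negative rates
        self.lfp_dec = nn.Linear(latent_dim, lfp_dim)                                    # Gaussian mean

    def forward(self, spikes, lfp):
        z = self.fusion(torch.cat([self.spike_enc(spikes), self.lfp_enc(lfp)], dim=-1))
        return self.spike_dec(z), self.lfp_dec(z), z

model = MultiscaleFusionSketch(spike_dim=20, lfp_dim=20)
rates, lfp_mean, latent = model(torch.rand(8, 20), torch.randn(8, 20))
print(rates.shape, lfp_mean.shape, latent.shape)  # torch.Size([8, 20]) x2, torch.Size([8, 16])
```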

Enterprise Process Flow

Raw Multimodal Data (Spikes, LFPs) → Modality-Specific MLPs → Modality-Specific LDMs (Kalman Filtering) → Fusion Network → Multiscale LDM (Temporal Dynamics) → Modality-Specific Decoders (Reconstruction)
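The flow above implies a causal, per-time-step recursion: each modality maintains its own linear dynamical state that is time-updated at every step but measurement-updated only when that modality actually produces a sample, which is how different sampling rates and dropped samples are absorbed before fusion. The sketch below illustrates that recursion with a textbook Kalman filter; all matrices, dimensions, and the simple concatenation standing in for the fusion network are assumptions, and the multiscale LDM and decoders are omitted.

```python
# Hedged sketch of causal per-step inference; system matrices and the concatenation
# "fusion" are illustrative assumptions, not MRINE's actual parameters.
import numpy as np

def kalman_step(x, P, y, A, Q, H, R):
    """Time update always runs; measurement update only when a sample y is available."""
    x, P = A @ x, A @ P @ A.T + Q
    if y is not None:
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (y - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
    return x, P

rng = np.random.default_rng(0)
dim_x, dim_y = 4, 20
A, Q = 0.95 * np.eye(dim_x), 0.01 * np.eye(dim_x)
H, R = rng.standard_normal((dim_y, dim_x)), 0.1 * np.eye(dim_y)

x_spk, P_spk = np.zeros(dim_x), np.eye(dim_x)
x_lfp, P_lfp = np.zeros(dim_x), np.eye(dim_x)

T, lfp_stride = 100, 4  # pretend LFP features arrive 4x slower than spike features
for t in range(T):
    y_spk = rng.standard_normal(dim_y)                                    # stand-in spike features, every step
    y_lfp = rng.standard_normal(dim_y) if t % lfp_stride == 0 else None   # LFP sample only every 4th step
    x_spk, P_spk = kalman_step(x_spk, P_spk, y_spk, A, Q, H, R)
    x_lfp, P_lfp = kalman_step(x_lfp, P_lfp, y_lfp, A, Q, H, R)
    fused = np.concatenate([x_spk, x_lfp])  # stand-in for the fusion network feeding the multiscale LDM

print(fused.shape)  # (8,)
```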

MRINE Outperforms Leading Multimodal Models

In direct comparisons, MRINE consistently surpassed various linear and nonlinear multimodal benchmarks, including MSID, mmPLRNN, MMGPVAE, and MVAE. Across channel configurations and datasets (NHP grid-reaching, NHP center-out), MRINE achieved significantly higher behavior decoding accuracy. For instance, with 20 spike and 20 LFP channels, MRINE reached a decoding CC of 0.611 on the grid-reaching dataset versus 0.519 for MSID, highlighting its superior ability to capture complex neural dynamics.

Method | Avg. CC (20 spike / 20 LFP channels) | Key capabilities
MRINE | 0.611 - 0.649 | Nonlinear; multiscale; real-time inference; robust to missing data
MSID | 0.519 - 0.561 | Linear; multiscale; real-time inference
mmPLRNN | 0.540 - 0.538 | Nonlinear; single-timescale (non-causal)
MMGPVAE | 0.479 - 0.601 | Nonlinear; single-timescale (non-causal)
MVAE | 0.425 - 0.544 | VAE-based; no explicit dynamics
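The CC values in this table are presumably Pearson correlation coefficients between decoded and ground-truth behavior (e.g., arm kinematics), averaged across behavior dimensions; a minimal sketch under that assumption:

```python
# Minimal sketch, assuming "CC" means the Pearson correlation between decoded and
# true behavior, averaged over behavior dimensions.
import numpy as np

def decoding_cc(true_beh: np.ndarray, decoded_beh: np.ndarray) -> float:
    """Average Pearson CC over columns of (time x dims) arrays."""
    ccs = [np.corrcoef(true_beh[:, d], decoded_beh[:, d])[0, 1] for d in range(true_beh.shape[1])]
    return float(np.mean(ccs))

rng = np.random.default_rng(0)
truth = rng.standard_normal((1000, 2))                     # e.g., 2-D arm kinematics
estimate = truth + 0.5 * rng.standard_normal(truth.shape)  # noisy decoded estimate
print(f"decoding CC: {decoding_cc(truth, estimate):.3f}")
```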

Unwavering Performance Amidst Incomplete Data

A critical strength of MRINE is its robustness to missing data, whether due to disparate timescales or random sample drops. Because the multiscale encoder includes a modality-specific LDM for each signal, missing samples can be predicted from that modality's learned dynamics. In scenarios where 40% of spike samples were missing (with 20% of LFP samples also missing), MRINE's behavior decoding accuracy decreased by only 5.4%, a significantly smaller drop than for competing methods. This resilience is vital for real-world brain-computer interface applications where data loss is common.

  • 5.4%: decoding accuracy decrease with 40% of spike and 20% of LFP samples missing.
  • 1.5: overall average rank for neural reconstruction across missing-data scenarios, the best among the compared methods.

High-Dimensional Visual Cortex Data: A New Frontier

MRINE's capabilities extended successfully to a high-dimensional (800-D) visual stimuli dataset, integrating Neuropixels spiking activity (120 Hz) and calcium imaging data (30 Hz). In a frame ID decoding task, MRINE achieved a mean absolute error (MAE) of 5.09, outperforming the CEBRA baseline (MAE 9.31). This demonstrates its versatility beyond motor tasks and its potential for diverse neuroscience applications, including brain-computer interfaces that require real-time, robust multimodal decoding.
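The 120 Hz spiking and 30 Hz calcium rates quoted above mean the slower modality is observed only at every fourth step of the faster time base; the snippet below shows one simple way such an observation mask could be constructed (the masking scheme itself is an illustrative assumption, only the sampling rates come from the text).

```python
# Illustrative: aligning a 30 Hz modality to a 120 Hz time base with an observation mask.
import numpy as np

fast_hz, slow_hz = 120, 30          # spiking vs. calcium sampling rates (from the text)
stride = fast_hz // slow_hz         # one calcium frame every 4 fast steps
T_fast = 120                        # one second at the fast rate

calcium_mask = np.zeros(T_fast, dtype=bool)
calcium_mask[::stride] = True       # True where a calcium sample is actually observed

# At masked-out steps, a multiscale model must rely on the modality's dynamical
# prediction instead of a real sample.
print(calcium_mask[:8])             # [ True False False False  True False False False]
print("observed fraction:", calcium_mask.mean())   # 0.25
```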

Advanced ROI Calculator

Estimate the potential return on investment for implementing real-time multimodal AI in your enterprise operations.


Your AI Implementation Roadmap

A phased approach to integrate MRINE into your operations, from initial data assessment to continuous optimization and adaptation.

Discovery & Data Integration

Initial assessment of existing neural data modalities, including spiking activity and local field potentials (LFP) with varying timescales and distributions. Integration into the MRINE framework, ensuring real-time data ingestion and alignment.

Model Training & Optimization

Training MRINE's multiscale encoder, dynamical backbone, and modality-specific decoders. Hyperparameter tuning using stochastic Lorenz attractor simulations and NHP datasets to optimize for real-time inference and robustness to missing samples.
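For readers unfamiliar with the simulation testbed mentioned here, a stochastic Lorenz attractor is the classic Lorenz system driven by process noise; below is a minimal Euler-Maruyama sketch of such a latent trajectory (the sigma/rho/beta values are the textbook ones, and the noise scale and step size are assumptions rather than the paper's exact settings).

```python
# Minimal Euler-Maruyama sketch of a noise-driven Lorenz system; noise level and
# step size are assumptions, not the paper's simulation settings.
import numpy as np

def simulate_stochastic_lorenz(T=2000, dt=0.01, noise_std=0.1, seed=0):
    rng = np.random.default_rng(seed)
    sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0      # classic Lorenz parameters
    x = np.empty((T, 3))
    x[0] = [1.0, 1.0, 1.0]
    for t in range(T - 1):
        xt, yt, zt = x[t]
        drift = np.array([sigma * (yt - xt),
                          xt * (rho - zt) - yt,
                          xt * yt - beta * zt])
        x[t + 1] = x[t] + dt * drift + noise_std * np.sqrt(dt) * rng.standard_normal(3)
    return x

latents = simulate_stochastic_lorenz()
print(latents.shape)  # (2000, 3) latent trajectory that could drive simulated spikes/LFP
```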

Real-time Decoding & Validation

Deployment of the MRINE model for real-time, recursive decoding of target variables (e.g., NHP arm kinematics, visual stimuli frame IDs). Comprehensive validation against linear and nonlinear multimodal benchmarks, ensuring superior performance and clinical applicability.

Continuous Adaptation & Monitoring

Establishment of a continuous monitoring system for model performance. Implementation of adaptive strategies to account for temporal variability in neural dynamics, ensuring long-term stability and effectiveness in dynamic neuroscience applications.

Ready to Transform Your Enterprise with AI?

Schedule a consultation with our AI strategists to discuss how MRINE and similar advanced AI solutions can drive efficiency and innovation in your organization.
