🧠 Run-1 LIQUID NEURAL NETWORK (LNN) 🚀

True Liquid Time-Constant (LTC) Cell

  • Implemented proper continuous-time dynamics with learnable time constants
  • Added a decay factor derived from the tau parameter that controls how quickly hidden-state information decays (see the sketch below)
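
As a rough illustration only (not the repository's exact code), a liquid time-constant cell along these lines can be written in PyTorch. The class name `LTCCell`, the softplus parameterization that keeps `tau` positive, and the fixed step size `dt` are assumptions made for this sketch:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LTCCell(nn.Module):
    """Sketch of a liquid time-constant cell: the hidden state relaxes
    toward a candidate activation at a rate set by a learnable tau."""

    def __init__(self, input_size: int, hidden_size: int, dt: float = 1.0):
        super().__init__()
        self.input_map = nn.Linear(input_size, hidden_size)
        self.hidden_map = nn.Linear(hidden_size, hidden_size)
        # One learnable time constant per hidden unit, kept positive via softplus.
        self.raw_tau = nn.Parameter(torch.zeros(hidden_size))
        self.dt = dt

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        tau = F.softplus(self.raw_tau) + 1e-3          # tau > 0
        candidate = torch.tanh(self.input_map(x) + self.hidden_map(h))
        decay = torch.exp(-self.dt / tau)              # decay factor derived from tau
        # Large tau -> decay near 1 -> old state is retained; small tau -> fast update.
        return decay * h + (1.0 - decay) * candidate
```

Rolling the cell over a sequence is then just a loop over time steps that carries `h` forward.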

Enhanced Architecture

  • Multi-layer stacking for deeper networks
  • Self-attention mechanism for capturing temporal relationships
  • Skip connections and layer normalization for better gradient flow (a combined sketch follows this list)
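
A minimal sketch of how these pieces might fit together, reusing the `LTCCell` from the previous snippet. The block and network names, the residual projection, and the use of `nn.MultiheadAttention` over the per-step hidden states are illustrative assumptions (`hidden_size` must be divisible by `num_heads`):

```python
class LiquidBlock(nn.Module):
    """One recurrent layer with a skip connection and layer normalization."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.cell = LTCCell(input_size, hidden_size)
        self.norm = nn.LayerNorm(hidden_size)
        # Project the input so the skip connection matches the hidden size.
        self.skip = nn.Linear(input_size, hidden_size) if input_size != hidden_size else nn.Identity()
        self.hidden_size = hidden_size

    def forward(self, x_seq: torch.Tensor) -> torch.Tensor:   # (batch, time, features)
        batch, steps, _ = x_seq.shape
        h = x_seq.new_zeros(batch, self.hidden_size)
        outputs = []
        for t in range(steps):
            h = self.cell(x_seq[:, t], h)
            outputs.append(self.norm(h + self.skip(x_seq[:, t])))  # skip + layer norm
        return torch.stack(outputs, dim=1)

class LiquidNetwork(nn.Module):
    """Stack of LiquidBlocks, self-attention over time, and a task head."""

    def __init__(self, input_size, hidden_size, num_layers, num_outputs, num_heads=4):
        super().__init__()
        sizes = [input_size] + [hidden_size] * num_layers
        self.blocks = nn.ModuleList(LiquidBlock(i, o) for i, o in zip(sizes[:-1], sizes[1:]))
        self.attention = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.head = nn.Linear(hidden_size, num_outputs)

    def forward(self, x_seq: torch.Tensor) -> torch.Tensor:
        for block in self.blocks:
            x_seq = block(x_seq)
        attended, _ = self.attention(x_seq, x_seq, x_seq)   # self-attention over time steps
        return self.head(attended[:, -1])                   # predict from the last step
```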

Robust Training Framework

  • Learning rate scheduling with ReduceLROnPlateau
  • Early stopping to prevent overfitting
  • Gradient clipping to prevent exploding gradients
  • AdamW optimizer with weight decay for regularization (a training-loop sketch follows this list)
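
A hedged sketch of such a training loop (classification shown; swap the loss for regression). The function signature, default hyperparameters, and patience values are assumptions, not the repository's settings:

```python
import copy
import torch

def train(model, train_loader, val_loader, epochs=100, lr=1e-3, patience=10):
    """Sketch: AdamW + weight decay, ReduceLROnPlateau scheduling,
    gradient clipping, early stopping, and best-model tracking."""
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr, weight_decay=1e-4)
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
        optimizer, mode="min", factor=0.5, patience=3)
    criterion = torch.nn.CrossEntropyLoss()       # use nn.MSELoss() for regression
    best_val, best_state, stale = float("inf"), None, 0

    for epoch in range(epochs):
        model.train()
        for x, y in train_loader:
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            # Clip the gradient norm to keep the recurrence from exploding.
            torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
            optimizer.step()

        model.eval()
        with torch.no_grad():
            val_loss = sum(criterion(model(x), y).item()
                           for x, y in val_loader) / len(val_loader)
        scheduler.step(val_loss)                  # lower the LR when validation stalls

        if val_loss < best_val:                   # keep the best checkpoint
            best_val, best_state, stale = val_loss, copy.deepcopy(model.state_dict()), 0
        else:
            stale += 1
            if stale >= patience:                 # early stopping
                break

    model.load_state_dict(best_state)
    return model, best_val
```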

Comprehensive Evaluation

  • Detailed metrics for both classification and regression
  • Visualization of results (confusion matrices, prediction plots)
  • Feature importance analysis for regression tasks (metric and plotting helpers are sketched below)
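
The exact metrics and plots are not reproduced here, but helpers of this kind typically look like the following scikit-learn/matplotlib sketch; the function names and the chosen metrics are assumptions:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import (accuracy_score, f1_score, confusion_matrix,
                             mean_squared_error, r2_score)

def report_classification(y_true, y_pred, class_names):
    """Print headline metrics and plot the confusion matrix."""
    print(f"Accuracy: {accuracy_score(y_true, y_pred):.4f}")
    print(f"Macro F1: {f1_score(y_true, y_pred, average='macro'):.4f}")
    cm = confusion_matrix(y_true, y_pred)
    plt.imshow(cm, cmap="Blues")
    plt.xticks(range(len(class_names)), class_names, rotation=45)
    plt.yticks(range(len(class_names)), class_names)
    plt.xlabel("Predicted"); plt.ylabel("True"); plt.colorbar()
    plt.tight_layout(); plt.show()

def report_regression(y_true, y_pred):
    """Print RMSE and R^2 for a regression run."""
    print(f"RMSE: {np.sqrt(mean_squared_error(y_true, y_pred)):.4f}")
    print(f"R^2:  {r2_score(y_true, y_pred):.4f}")
```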

Hyperparameter Tuning

  • Simple grid search to find a good model configuration
  • Best-model checkpointing (a grid-search sketch follows this list)
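
A sketch of the grid-search-plus-checkpointing idea, reusing the `train` function sketched above. The grid values and the checkpoint filename are assumptions chosen for illustration:

```python
import itertools
import torch

def grid_search(train_loader, val_loader, input_size, num_outputs,
                checkpoint_path="best_model.pt"):
    """Try every combination in a small grid and keep the best checkpoint."""
    grid = {
        "hidden_size": [32, 64, 128],
        "num_layers": [1, 2],
        "lr": [1e-3, 3e-4],
    }
    best_loss, best_config = float("inf"), None
    for hidden_size, num_layers, lr in itertools.product(*grid.values()):
        model = LiquidNetwork(input_size, hidden_size, num_layers, num_outputs)
        model, val_loss = train(model, train_loader, val_loader, lr=lr)
        if val_loss < best_loss:
            best_loss, best_config = val_loss, {"hidden_size": hidden_size,
                                                "num_layers": num_layers, "lr": lr}
            torch.save(model.state_dict(), checkpoint_path)   # best-model checkpointing
    return best_config, best_loss
```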

Improved Data Handling

  • Better error handling for dataset loading
  • Proper input normalization
  • Support for both sequence and non-sequence (flat feature-vector) data formats (see the sketch below)
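
A sketch of input preparation along the lines described above; the helper name and the reshape convention that turns flat feature vectors into fixed-length sequences are assumptions:

```python
import numpy as np
import torch

def prepare_inputs(X, seq_len=None):
    """Normalize 2-D feature data and shape it into (batch, time, features)."""
    X = np.asarray(X, dtype=np.float32)
    # Z-score each feature so all inputs are on a comparable scale.
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-8)
    if seq_len is not None:
        # Non-sequence data: split the feature dimension into seq_len time steps.
        n_features = X.shape[1]
        if n_features % seq_len != 0:
            raise ValueError(f"{n_features} features cannot be split into {seq_len} steps")
        X = X.reshape(len(X), seq_len, n_features // seq_len)
    else:
        # No sequence structure: treat each sample as a length-1 sequence.
        X = X[:, None, :]
    return torch.from_numpy(X)
```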

Together, these changes make the code more robust, improve its handling of edge cases, and should yield noticeably better performance on both classification and regression tasks.