
Full Keras Transformer neural network (encoder and decoder) for SOC estimation (time-series prediction)


att-ar/transform_decode_soc


Transformers + TensorFlow and Pandas for SOC Estimation

The Testing branch is the most up to date.

The repo without the decoder implemented: Attar's GitHub Repo

Building a transformer neural network in Python using TensorFlow, with the goal of predicting Li-ion state of charge (SOC) from real-time voltage, current, and delta-time data.

The transformer network uses Batch Normalization instead of the Layer Normalization typically found in NLP transformers. This choice follows literature reporting that Batch Normalization is significantly more effective than the standard NLP formulation for time-series applications.
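As a minimal sketch of what such an encoder block might look like (hyperparameters and layer sizes here are illustrative, not the repo's actual values):

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

def encoder_block(x, num_heads=4, key_dim=16, ff_dim=64):
    """One transformer encoder block using BatchNormalization
    in place of the LayerNormalization used in NLP transformers.
    All hyperparameters are illustrative assumptions."""
    # Self-attention sub-layer with a residual connection
    attn = layers.MultiHeadAttention(num_heads=num_heads, key_dim=key_dim)(x, x)
    x = layers.BatchNormalization()(x + attn)
    # Position-wise feed-forward sub-layer with a residual connection
    ff = layers.Dense(ff_dim, activation="relu")(x)
    ff = layers.Dense(x.shape[-1])(ff)
    return layers.BatchNormalization()(x + ff)

# (window_size, num_features) = (32, 3): voltage, current, SOC
inputs = keras.Input(shape=(32, 3))
outputs = encoder_block(inputs)
model = keras.Model(inputs, outputs)
print(model.output_shape)  # (None, 32, 3)
```

The block preserves the input shape, so several of these can be stacked before feeding the decoder.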

The transformer's input will be voltage, current, and previous SOC points in a batch of windowed data of shape:
(G.batch_size, G.window_size, G.num_features)

The voltage, current, and SOC data will be from time $t - \text{windowsize} \rightarrow t$:

  • The encoder's input will be from time: $t - \text{windowsize} \rightarrow t - \text{targetlength}$;
  • The decoder's input will be from time: $t - \text{targetlength} \rightarrow t$;
  • The decoder's output should be from time: $(t + 1) - \text{targetlength} \rightarrow t + 1$;
  • The transformer output should be the decoder output with shape (G.batch_size, G.tgt_len, 1);

Note that the value that is actually wanted is the one at time $t + 1$.
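The slicing scheme above can be sketched in NumPy. The window and target lengths, the column order, and the helper name are assumptions for illustration; the slices follow the time ranges listed above, with `t` exclusive:

```python
import numpy as np

window_size, tgt_len, num_features = 32, 4, 3  # illustrative values
series = np.random.rand(1000, num_features)    # columns: voltage, current, SOC

def make_window(series, t):
    """Slice one training example whose window ends at time t."""
    enc_in = series[t - window_size : t - tgt_len]  # encoder input
    dec_in = series[t - tgt_len : t]                # decoder input
    # Decoder target: the SOC column, shifted forward one step,
    # so its last entry is the wanted value at time t + 1
    target = series[t + 1 - tgt_len : t + 1, 2:3]
    return enc_in, dec_in, target

enc_in, dec_in, target = make_window(series, t=100)
print(enc_in.shape, dec_in.shape, target.shape)  # (28, 3) (4, 3) (4, 1)
```

Stacking `G.batch_size` such examples yields the batched shapes `(G.batch_size, G.window_size, G.num_features)` for the inputs and `(G.batch_size, G.tgt_len, 1)` for the transformer output.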
