Recurrent Neural Network: Time Delay Neural Network Implemented in Fast Compressed Neural Network for R

This is a general overview of how the data flow in a time delay network works. Note that I've omitted the actual weight calculations/tuning, generally skipped the hidden layer calculations, and shown only a 2X time delay. In theory you can extend this to any number of time delays.

Red are the current set of inputs.

Blue are the outputs from the previous evaluation.

Green are the outputs from the run before that, and so on.

Purple is the hidden layer.

Teal is the output layer.

Time Delay Feed Forward Neural Network Diagram


Code coming soon.
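Until that code is posted, here is a minimal sketch in plain R of the structure the diagram implies: 12 input nodes (4 current inputs + 4 outputs from the previous run + 4 from the run before that), 8 hidden nodes (the CALC1-CALC8 values in the tables below), and 4 output nodes. The weights here are random placeholders, since weight tuning is omitted in this post; with FCNN4R you would swap this hand-rolled feed-forward for a trained network object.

```r
# Layer sizes implied by the diagram:
#   12 inputs = 4 current + 4 previous outputs + 4 outputs from two runs back
#   8 hidden nodes, 4 outputs
n_in     <- 12
n_hidden <- 8
n_out    <- 4

set.seed(42)  # placeholder weights only; real weights would come from training
W1 <- matrix(rnorm(n_hidden * n_in),  nrow = n_hidden, ncol = n_in)
W2 <- matrix(rnorm(n_out * n_hidden), nrow = n_out,    ncol = n_hidden)

sigmoid <- function(x) 1 / (1 + exp(-x))

# One forward pass: 12-element input vector in, 4-element output vector out
feed_forward <- function(input) {
  hidden <- sigmoid(W1 %*% input)   # stands in for CALC1..CALC8
  as.vector(sigmoid(W2 %*% hidden))
}
```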

The general flow is this:

In the diagram above, the data consists of a time series with four inputs; it doesn't matter what they represent.

Height, Weight, Length, Width

Outputs are what we are attempting to predict/react to. Let's say four base-10 (numeric) outputs: Up, Down, Left, Right.

Time series data would look something like this:

TimeSequence  Height  Weight  Length  Width
TS1           6       160     320     280
TS2           7.02    170     310     290
TS3           5.8     130     320     280
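
For reference, here is that same table as an R data frame, which the later snippets read rows from:

```r
ts_data <- data.frame(
  Height = c(6, 7.02, 5.8),
  Weight = c(160, 170, 130),
  Length = c(320, 310, 320),
  Width  = c(280, 290, 280),
  row.names = c("TS1", "TS2", "TS3")
)
```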

 

So the values for the input nodes on the first run are (the last eight inputs are zero because there are no previous outputs yet):

INPUT LAYER:   6   160   320   280   0   0   0   0   0   0   0   0
HIDDEN LAYER:  CALC1  CALC2  CALC3  CALC4  CALC5  CALC6  CALC7  CALC8
OUTPUT LAYER:  5   8   7   8
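
In code, the first-run input is just the TS1 row followed by eight zeros, because there are no previous outputs yet. Note the 5 8 7 8 above are only illustrative numbers, so the placeholder network's outputs won't match them:

```r
prev_outputs  <- rep(0, 4)  # outputs from the previous run (none yet)
older_outputs <- rep(0, 4)  # outputs from two runs back (none yet)

input_run1 <- c(unlist(ts_data["TS1", ], use.names = FALSE),
                prev_outputs, older_outputs)
# input_run1 is: 6 160 320 280 0 0 0 0 0 0 0 0

out_run1 <- feed_forward(input_run1)  # stands in for the "5 8 7 8" in the table
```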


So the values for the input nodes on the second run are (the first-run outputs 5 8 7 8 now fill inputs 5 through 8):

INPUT LAYER:   7.02   170   310   290   5   8   7   8   0   0   0   0
HIDDEN LAYER:  CALC1  CALC2  CALC3  CALC4  CALC5  CALC6  CALC7  CALC8
OUTPUT LAYER:  9   3   5   8
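
The only thing that changes on the second run is the delay buffer: the run-1 outputs slide into input positions 5 through 8, while positions 9 through 12 stay zero because there is still nothing from two runs back. Continuing the sketch:

```r
older_outputs <- prev_outputs   # shift: the old "previous" slot moves back one step
prev_outputs  <- out_run1       # run-1 outputs become the newest delayed values

input_run2 <- c(unlist(ts_data["TS2", ], use.names = FALSE),
                prev_outputs, older_outputs)
# conceptually: 7.02 170 310 290  <run-1 outputs>  0 0 0 0

out_run2 <- feed_forward(input_run2)
```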

 

Third run (the second-run outputs 9 3 5 8 fill inputs 5 through 8, and the first-run outputs 5 8 7 8 shift back to inputs 9 through 12):

INPUT LAYER:   5.8   130   320   280   9   3   5   8   5   8   7   8
HIDDEN LAYER:  CALC1  CALC2  CALC3  CALC4  CALC5  CALC6  CALC7  CALC8
OUTPUT LAYER:  5   8   7   8


etc...
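
Putting the whole flow together, the same bookkeeping as a loop over the time series: each run takes the current row plus the delayed output vectors, evaluates the network, and then shifts the delay buffer. This is still the plain-R sketch from above; swapping feed_forward for a trained FCNN4R network would not change the data flow.

```r
run_tdnn <- function(ts_data, delays = 2, n_out = 4) {
  # delay buffer: one zeroed slot per time delay, newest first
  buffer  <- rep(list(rep(0, n_out)), delays)
  outputs <- list()

  for (ts in rownames(ts_data)) {
    # current inputs, then previous outputs, then outputs from two runs back, etc.
    input         <- c(unlist(ts_data[ts, ], use.names = FALSE), unlist(buffer))
    outputs[[ts]] <- feed_forward(input)

    # shift the delays: newest outputs go to the front, the oldest drop off
    buffer <- c(list(outputs[[ts]]), buffer[-delays])
  }
  outputs
}

results <- run_tdnn(ts_data)
```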