
@gregn610
Created September 22, 2016 18:46
Keras SimpleRNN
# Imports assumed for the Keras 1.x-style API this gist uses.
from keras.models import Sequential
from keras.layers import SimpleRNN, Dense

# modelData is an external object supplying the training tensors.
batch_size = modelData.batch_size
timesteps = modelData.X_train.shape[1]
input_dim = modelData.X_train.shape[2]
in_neurons = input_dim + 7
hidden_layers = 1
hidden_neurons = in_neurons
out_neurons = 3
ret_sequences = True
rnn_activation = 'relu'
dense_activation = 'linear'
epochs = 111

# Input shape: 3D tensor with shape (nb_samples, timesteps, input_dim).
model = Sequential()
model.add(SimpleRNN(in_neurons,
                    return_sequences=ret_sequences,
                    stateful=False,
                    activation=rnn_activation,
                    batch_input_shape=(batch_size, timesteps, input_dim)))
# Stack any additional recurrent layers (adds none when hidden_layers == 1).
for _ in range(1, hidden_layers):
    model.add(SimpleRNN(hidden_neurons,
                        return_sequences=True,
                        stateful=False,
                        activation=rnn_activation))
# Not really a hidden layer, but the easiest way to handle 1-layer nets:
# a final SimpleRNN with return_sequences=False collapses the sequence
# so the Dense output layer receives a 2D tensor.
if hidden_layers > 0:
    model.add(SimpleRNN(out_neurons,
                        return_sequences=False,
                        stateful=False,
                        activation=rnn_activation))
model.add(Dense(1, activation=dense_activation))
model.compile(loss='mean_squared_error', optimizer='rmsprop')
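To make the recurrence concrete without a Keras install, here is a minimal NumPy sketch of the computation a SimpleRNN layer performs: h_t = activation(x_t @ W + h_{t-1} @ U + b). The function name `simple_rnn_forward` and the weight shapes are illustrative assumptions, not Keras internals.

```python
import numpy as np

def simple_rnn_forward(x, W, U, b, activation=np.tanh):
    """Run a SimpleRNN-style recurrence over one (timesteps, input_dim) sequence.

    h_t = activation(x[t] @ W + h_{t-1} @ U + b)
    Returns the full hidden-state sequence, shape (timesteps, units),
    i.e. what return_sequences=True would yield for a single sample.
    """
    timesteps, _ = x.shape
    units = b.shape[0]
    h = np.zeros(units)                    # initial hidden state
    outputs = np.zeros((timesteps, units))
    for t in range(timesteps):
        h = activation(x[t] @ W + h @ U + b)
        outputs[t] = h
    return outputs

# Tiny example: 4 timesteps, 3 input features, 2 recurrent units.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 2))   # input-to-hidden weights
U = rng.normal(size=(2, 2))   # hidden-to-hidden (recurrent) weights
b = np.zeros(2)
seq = simple_rnn_forward(x, W, U, b)
print(seq.shape)  # (4, 2)
```

Taking only `seq[-1]` corresponds to return_sequences=False, which is why the gist's final SimpleRNN layer hands a 2D tensor to the Dense output.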