@waleedka
Last active December 17, 2016 02:18
import tensorflow as tf

# Create a graph to hold the model.
graph = tf.Graph()

# Create the model in the graph.
with graph.as_default():
    # Placeholders for inputs and labels.
    images_ph = tf.placeholder(tf.float32, [None, 32, 32, 3])
    labels_ph = tf.placeholder(tf.int32, [None])

    # Flatten input from: [None, height, width, channels]
    # to: [None, height * width * channels] == [None, 3072]
    images_flat = tf.contrib.layers.flatten(images_ph)

    # Fully connected layer.
    # Generates logits of shape [None, 62].
    logits = tf.contrib.layers.fully_connected(images_flat, 62, tf.nn.relu)

    # Convert logits to label indexes (int).
    # Shape [None], a 1-D vector of length == batch_size.
    predicted_labels = tf.argmax(logits, 1)

    # Define the loss function.
    # Cross-entropy is a good choice for classification.
    loss = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(
        logits, labels_ph))

    # Create the training op.
    train = tf.train.AdamOptimizer(learning_rate=0.001).minimize(loss)

    # And, finally, an initialization op to execute before training.
    # TODO: rename to tf.global_variables_initializer() on TF 0.12.
    init = tf.initialize_all_variables()
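
For context, here is a minimal sketch of how this graph could be run for training and prediction. It assumes `images` and `labels` are NumPy arrays the caller has already loaded (32x32 RGB images and integer class labels in [0, 62)); those two names are not part of the original gist.

# Minimal training-loop sketch for the graph defined above.
# Assumption: `images` has shape (N, 32, 32, 3) and `labels` has shape (N,).
with tf.Session(graph=graph) as session:
    # Run the initialization op before any training steps.
    session.run(init)
    for i in range(201):
        # One gradient step; here the whole dataset is fed as a single batch.
        _, loss_value = session.run(
            [train, loss],
            feed_dict={images_ph: images, labels_ph: labels})
        if i % 10 == 0:
            print("Loss:", loss_value)
    # Predicted class indexes for the inputs, shape (N,).
    predictions = session.run(predicted_labels,
                              feed_dict={images_ph: images})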