Gareth Booth (Gareth001)

  • Brisbane, Queensland
@iridiumblue
iridiumblue / AttentionWithContext.py
Last active October 5, 2020 17:16 — forked from cbaziotis/AttentionWithContext.py
Keras layer that implements an attention mechanism, compatible with Eager Execution and TF 1.13. Supports masking. Follows the work of Yang et al., "Hierarchical Attention Networks for Document Classification" [https://www.cs.cmu.edu/~diyiy/docs/naacl16.pdf]
# AttentionWithContext adapted for TensorFlow 1.13 with Eager Execution.
# Adapted from https://gist.github.com/cbaziotis/7ef97ccf71cbc14366835198c09809d2
# IMPORTANT - you can't use regular Keras optimizers. You need to grab one that is
# subclassed from tf.train.Optimizer. Not to worry, your favorite is probably there,
# for example: https://www.tensorflow.org/api_docs/python/tf/train/AdamOptimizer
# That's it, now you can use this layer. Tested with the functional API - just plop
# it on top of an RNN, like so -
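The usage snippet itself is cut off here, so the following is a minimal functional-API sketch of what "plop it on top of an RNN" could look like. The sequence length, vocabulary size, and GRU width are illustrative assumptions, not values from the gist; only the AttentionWithContext layer name and the tf.train optimizer requirement come from the notes above.

import tensorflow as tf

# Illustrative shapes: 100-token padded sequences over a 20,000-word vocabulary.
inputs = tf.keras.layers.Input(shape=(100,), dtype='int32')
x = tf.keras.layers.Embedding(20000, 128, mask_zero=True)(inputs)  # mask_zero exercises the layer's masking support
x = tf.keras.layers.Bidirectional(tf.keras.layers.GRU(64, return_sequences=True))(x)
x = AttentionWithContext()(x)  # collapses the time axis into one attention-weighted vector
outputs = tf.keras.layers.Dense(1, activation='sigmoid')(x)

model = tf.keras.Model(inputs, outputs)
# Per the note above: an optimizer subclassed from tf.train.Optimizer, not a plain Keras one.
model.compile(optimizer=tf.train.AdamOptimizer(), loss='binary_crossentropy')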
@jubjamie
jubjamie / pb_viewer.py
Created March 31, 2017 10:01
Load a .pb into TensorBoard
import tensorflow as tf
from tensorflow.python.platform import gfile

with tf.Session() as sess:
    model_filename = 'PATH_TO_PB.pb'
    # Read the serialized GraphDef from disk and import it into the default graph.
    with gfile.FastGFile(model_filename, 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
        g_in = tf.import_graph_def(graph_def)
    LOGDIR = '/logs/tests/1/'
    # Write the imported graph to the log directory so TensorBoard can render it.
    train_writer = tf.summary.FileWriter(LOGDIR)
    train_writer.add_graph(sess.graph)
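Once the writer has flushed, the graph can be inspected by launching TensorBoard against the same log directory and opening the Graphs tab in a browser (the default port is 6006):

    tensorboard --logdir=/logs/tests/1/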