Last active: December 25, 2018 21:58
Multilayer Perceptron in Julia
```julia
using TensorFlow
using Distributions   # needed for the Normal initializer below

num_layers = 3 # number of hidden layers
nh = 20        # number of neurons per hidden layer

"""
    neural(x, scope="default", reuse=false)

A multilayer perceptron with `num_layers` hidden layers; the last layer is linear.
This architecture is expressive enough for many useful function approximations.
"""
function neural(x, scope="default", reuse=false)
    variable_scope(scope, initializer=Distributions.Normal(0, .01), reuse=reuse) do
        W1 = get_variable("W0", shape=[2, nh], dtype=Float32)
        b1 = get_variable("b0", shape=[nh], dtype=Float32)
        net = x*W1 + b1
        net = nn.tanh(net)
        for i = 1:num_layers-1
            W1 = get_variable("W$i", shape=[nh, nh], dtype=Float32)
            b1 = get_variable("b$i", shape=[nh], dtype=Float32)
            net = net*W1 + b1
            net = nn.tanh(net)
        end
        W1 = get_variable("Wend", shape=[nh, 1], dtype=Float32)
        b1 = get_variable("bend", shape=[1], dtype=Float32)
        z = net*W1 + b1
        return z
    end
end
```
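A minimal usage sketch, assuming TensorFlow.jl's graph-mode `Session` API; the scope name `"mlp"`, the batch size, and the placeholder shape are illustrative:

```julia
using TensorFlow
using Distributions

x = placeholder(Float32, shape=[-1, 2])   # batches of 2-dimensional inputs
y = neural(x, "mlp")                      # first call creates the variables
y_shared = neural(x, "mlp", true)         # reuse=true shares the same weights

sess = Session()
run(sess, global_variables_initializer())
out = run(sess, y, Dict(x => Float32.(rand(16, 2))))  # 16×1 output
```

Note that the input is converted with `Float32.(...)` so it matches the `Float32` variables created above.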
Troubleshooting
If you encounter the following error:

```
Tensorflow error: Status: Input 'ref' passed double expected ref type while building NodeDef 'Assign_3' using Op<name=Assign; signature=ref:Ref(T), value:T -> output_ref:Ref(T); attr=T:type; attr=validate_shape:bool,default=true; attr=use_locking:bool,default=true; allows_uninitialized_input=true>
```

check the datatypes you are using (`Float32` vs. `Float64`): the `assign` operator requires its input tensors to have consistent types.
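The mismatch typically comes from Julia itself: `rand` and floating-point literals default to `Float64`, while every `get_variable` call above declares `dtype=Float32`. A minimal sketch of the fix (the array names are illustrative):

```julia
# rand and literals like 0.01 produce Float64 values by default in Julia,
# which clash with Float32 variables when an assign node is built.
data = rand(10, 2)              # eltype: Float64
@assert eltype(data) == Float64

data32 = Float32.(data)         # broadcast-convert before building tensors
@assert eltype(data32) == Float32

# e.g. construct graph inputs from the converted array:
# x = constant(data32)
```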
This is an implementation of a multilayer perceptron with `num_layers` hidden layers; the last layer is a linear layer. The network is expressive enough for many useful function approximations. See TensorFlow.jl for details.