The Wayback Machine - https://web.archive.org/web/20200404050104/https://github.com/aymericdamien/TensorFlow-Examples/issues/290
Questions about biLSTM example #290

Open
aitsc opened this issue Dec 3, 2018 · 2 comments

Comments


@aitsc aitsc commented Dec 3, 2018

return tf.matmul(outputs[-1], weights['out']) + biases['out']

I think `outputs[-1]` and `outputs[0]` should be equivalent (just reversed) in this line of code, but the former gives better accuracy (89%) than the latter (86%). Why?

@song-ting song-ting commented Jan 14, 2019

outputs is a length-T (T = 28) list with one entry per time step, where each entry is the concatenation of the forward and backward outputs for that step. Indexing each direction in its own processing order:
output[-1] = concat(output_fw[-1], output_bw[0])
output[0] = concat(output_fw[0], output_bw[-1])
while in general:
output_fw[-1] != output_bw[-1]
output_fw[0] != output_bw[0]
So outputs[-1] and outputs[0] are not equivalent: in outputs[-1] the forward half has seen the whole sequence, while in outputs[0] it is the backward half that has.
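The asymmetry can be illustrated with a small NumPy sketch. A toy "cell" that just accumulates a running sum stands in for the real LSTM cells; this is only an illustration of how static_bidirectional_rnn assembles its output list, not the actual TensorFlow code:

```python
import numpy as np

# Toy stand-in for an RNN cell: the "hidden state" at step t is the
# cumulative sum of the inputs seen so far.
def run_rnn(inputs):
    states, h = [], 0.0
    for x in inputs:
        h = h + x          # recurrent update (toy)
        states.append(h)
    return states

x = [1.0, 2.0, 3.0, 4.0]        # a length-T input sequence

output_fw = run_rnn(x)           # forward pass over x
output_bw = run_rnn(x[::-1])     # backward pass over reversed x

# The bidirectional wrapper aligns the backward outputs to input time
# and concatenates per step: outputs[t] = concat(fw[t], bw_aligned[t]).
bw_aligned = output_bw[::-1]
outputs = [np.array([f, b]) for f, b in zip(output_fw, bw_aligned)]

print(outputs[-1])  # [10.  4.] - fw half saw everything, bw half only x[-1]
print(outputs[0])   # [ 1. 10.] - fw half only x[0], bw half saw everything
```

Neither end of the list carries full context in both halves, so swapping `outputs[-1]` for `outputs[0]` changes which direction's fully-informed state the classifier sees.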

@gugarosa gugarosa commented Feb 15, 2019

Using tf.nn.bidirectional_dynamic_rnn and concatenating its two output tensors with outputs = tf.concat([outputs[0], outputs[1]], 2) should perform better. It builds the graph on the fly and allows feeding variable batch sizes.
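For shape intuition, here is a NumPy stand-in for that concat (random arrays replace the real forward/backward output tensors, and the batch/time/unit sizes are illustrative, not from the example):

```python
import numpy as np

batch, time_steps, units = 4, 28, 128   # illustrative sizes

# bidirectional_dynamic_rnn returns a pair (output_fw, output_bw),
# each shaped [batch, time, units]; random arrays stand in here.
output_fw = np.random.rand(batch, time_steps, units)
output_bw = np.random.rand(batch, time_steps, units)

# The suggested tf.concat(..., 2) joins them on the feature axis,
# giving one [batch, time, 2*units] tensor with both directions
# available at every time step.
outputs = np.concatenate([output_fw, output_bw], axis=2)
print(outputs.shape)   # (4, 28, 256)
```

Because every time step then carries both directions' states, the downstream layer is no longer forced to choose between `outputs[-1]` and `outputs[0]`.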
