Section 2 Exercise: Softmax clone doesn't work the same as TensorFlow's softmax #308

Hey @Raunak-Singh-Inventor,

I updated your code to reflect the correct solution, based on the TensorFlow docs here: https://www.tensorflow.org/api_docs/python/tf/nn/softmax

import tensorflow as tf
print(tf.__version__)  # check which TensorFlow version is installed

# Create a sample tensor of logits
tensor = tf.constant([[1, 2, 3, 6],
                      [2, 4, 5, 6],
                      [3, 8, 7, 6]], dtype=tf.float32)
print(f"input to softmax activation: {tensor[0, :]}")
print("--------------------------------------------------------------------------------------")
outputs = tf.keras.activations.softmax(tensor, axis=-1)  # apply softmax across the last axis (axis=-1 is the default)
print(f"output of softmax activation: {outputs[0, :]}")
print("----------------…

Answer selected by mrdbourke