
Commit 1981248

update python test for softmax to use default-initialized reduction
The test used to pass because output tensors were seemingly zero-initialized implicitly, and the values we generated were strictly positive, making 0 an acceptable initial value for the max reduction.
1 parent c1796f3 · commit 1981248

1 file changed

test_python/layers/test_softmax.py

Lines changed: 1 addition & 1 deletion
@@ -27,7 +27,7 @@ class TestSoftmax(unittest.TestCase):
     def test_softmax(self):
         LANG = """
         def softmax(float(N, D) I) -> (O, maxVal, expDistance, expSum) {
-            maxVal(n) max= I(n, d)
+            maxVal(n) max=! I(n, d)
             expDistance(n, d) = exp(I(n, d) - maxVal(n))
             expSum(n) +=! expDistance(n, d)
             O(n, d) = expDistance(n, d) / expSum(n)
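
For context, a minimal NumPy sketch (not part of the commit, and not the Tensor Comprehensions implementation) of the issue described in the commit message: a max reduction that merely accumulates into a zero-filled output happens to work for strictly positive inputs but fails otherwise, whereas the default-initialized `max=!` form first resets the output to the reduction's identity.

import numpy as np

# Toy model of a max-reduction that accumulates into whatever the output
# buffer already holds (the old `max=` behavior when the buffer happens to
# be zero-filled), versus one that starts from the identity of max (-inf),
# which is what the default-initialized `max=!` form requests.
def max_reduce_into(out, I):
    for n in range(I.shape[0]):
        for d in range(I.shape[1]):
            out[n] = max(out[n], I[n, d])
    return out

I_pos = np.array([[0.5, 2.0, 1.0]])     # strictly positive, like the old test data
I_neg = np.array([[-3.0, -2.0, -1.0]])  # negative values expose the bug

# Relying on an implicitly zero-filled output only works for positive inputs.
print(max_reduce_into(np.zeros(1), I_pos))  # [2.]  correct, but only by luck
print(max_reduce_into(np.zeros(1), I_neg))  # [0.]  wrong: the stale 0 wins

# Initializing to -inf (the identity of max) is correct for any input.
print(max_reduce_into(np.full(1, -np.inf), I_neg))  # [-1.]

Since the zero-initialization was only incidental, the accumulate-only `max=` form could also return garbage even for positive inputs with a truly uninitialized buffer, which is why the test now uses `max=!`.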
