This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Commit

Correct the misdescription of LogSoftmax class
Correct the docstring of the LogSoftmax class from 'softmax loss function' to 'logarithm of softmax'
cchung100m committed Mar 12, 2019
1 parent 174a9e7 commit 4a02a72
1 changed file, 1 addition and 1 deletion: example/bayesian-methods/bdk_demo.py
@@ -58,7 +58,7 @@ def backward(self, out_grad, in_data, out_data, in_grad):


 class LogSoftmax(mx.operator.NumpyOp):
-    """Generate helper functions to evaluate softmax loss function"""
+    """Generate helper functions to calculate the logarithm of softmax"""
     def __init__(self):
         super(LogSoftmax, self).__init__(False)
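For context on what the corrected docstring describes: log-softmax is the elementwise logarithm of the softmax probabilities. A minimal NumPy sketch of a numerically stable version (an illustration only, not the `mx.operator.NumpyOp` implementation in bdk_demo.py) could look like:

```python
import numpy as np

def log_softmax(x, axis=-1):
    # Shift by the max along the axis for numerical stability:
    # log(softmax(x)) = (x - m) - log(sum(exp(x - m))), where m = max(x)
    shifted = x - np.max(x, axis=axis, keepdims=True)
    return shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))
```

Exponentiating the result recovers the softmax probabilities, which sum to 1 along the chosen axis; every log-softmax value is at most 0.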
