Any way to calculate feature significance with TensorFlow? #228
Unanswered
ibilalmirza
asked this question in
Q&A
Replies: 1 comment
-
Hey @ibilalmirza Unlike traditional ML methods, finding the important features in a neural network is really hard, because the network learns patterns through the layers we construct: the patterns learned by the first layer are not the same as the patterns learned in the later layers. I found this StackOverflow thread and I hope it might help you.
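One model-agnostic technique that sidesteps the layered-representation problem described above is permutation importance: shuffle one input feature at a time and measure how much the model's error grows. Below is a minimal sketch using only NumPy; the model here is a hypothetical stand-in callable, but in practice you would pass a trained Keras model's `model.predict` (and your own metric) in its place. This is an illustration of the general idea, not a TensorFlow-specific API.

```python
import numpy as np

def permutation_importance(predict, X, y, metric, n_repeats=5, seed=0):
    """Permutation feature importance: permute one column at a time
    and record the mean increase in the error metric. `predict` is any
    function mapping an (n, d) array to predictions, e.g. a Keras
    model's `model.predict`."""
    rng = np.random.default_rng(seed)
    baseline = metric(y, predict(X))           # error with intact inputs
    importances = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        drops = []
        for _ in range(n_repeats):
            X_perm = X.copy()
            # Destroy feature j's relationship to y, keep its marginal
            # distribution.
            X_perm[:, j] = rng.permutation(X_perm[:, j])
            drops.append(metric(y, predict(X_perm)) - baseline)
        importances[j] = np.mean(drops)        # mean error increase
    return importances

# Demo with a stand-in "model" (not a real network): the target depends
# only on feature 0, so permuting feature 0 should hurt far more than
# permuting feature 1.
X = np.random.default_rng(1).normal(size=(200, 2))
y = 3.0 * X[:, 0]
predict = lambda X: 3.0 * X[:, 0]              # pretend: model.predict
mse = lambda y_true, y_pred: np.mean((y_true - y_pred) ** 2)
imp = permutation_importance(predict, X, y, mse)
```

In the demo, `imp[0]` comes out large while `imp[1]` is essentially zero, which matches the construction of the data. Note that permutation importance only says how much the trained model relies on each feature as-is; correlated features can share or hide importance.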
-
Hi, I am a doctoral student learning TensorFlow (deep and machine learning) for my research work. I know that in black-box models it's complicated to get true feature significance; however, is there any easy way to achieve this? There are several projects on GitHub that are advertised as capable of extracting feature importance from a TensorFlow network. However, I cannot understand how to use them or integrate them with my TensorFlow model, as their instructions are non-existent or too complicated for me.
So is there any easy way to come up with some level of feature importance for a TensorFlow model?