Lesson 2: Theory of Split Learning - Compiled questions #303
gonzalo-munillag asked this question in Course: Foundations of Private Computation
-
Hi @gonzalo-munillag, regarding your question no. 3: the error says that the shape of your labels y (which should be the labels of the current batch) is not the same as the shape of your original labels. They should have the same shape. Please check the shape you set in the output layer where the model architecture is defined. Also, does it occur only for the very last batch of the input?
-
Hi :)
I am enjoying the course very much!
I have compiled some questions from Lesson 2, Split Learning:
Video of class 5
There is a typo: on the Entropy slide, the expected value of the information content should be written as E[I(X)], not E[I(x)]. I(x) is a single number, so its expectation would just be that same number, which is not useful.
It should be:
H(X) = - Σ_{i=1}^{n} P(x_i) log₂ P(x_i) = E[-log₂ P(X)] = E[I(X)]
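The corrected formula can be checked numerically: entropy is the probability-weighted average of the information content I(x_i) = -log₂ P(x_i). A minimal sketch (the function name `entropy` is my own, not from the course materials):

```python
import math

def entropy(probs):
    """H(X) = sum_i P(x_i) * I(x_i), where I(x_i) = -log2 P(x_i)."""
    # Skip zero-probability outcomes: they contribute nothing to the sum.
    return sum(p * -math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin -> 1.0 bit
print(entropy([1.0]))       # certain outcome -> 0.0 (a sure event carries no information)
```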
Question: how does torch.cat() work for the two activation tensors in class number 10?
This question is due to my rusty DL knowledge. Does this mean that the input neurons of the data scientist's NN ingest a vector of vectors rather than a single vector, because we concatenated the prior activation outputs?
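For what it's worth, my understanding is that concatenating along the feature dimension (dim=1) produces one longer flat vector per sample, not a vector of vectors. A sketch of the semantics with plain Python lists standing in for tensors (the shapes here are made up for illustration):

```python
# Hypothetical activation outputs from two model segments,
# each with batch size 2 and 3 activations per sample.
a = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0]]
b = [[7.0, 8.0, 9.0],
     [10.0, 11.0, 12.0]]

# torch.cat((a, b), dim=1) joins along the feature dimension:
# each sample ends up with a single flat vector of 3 + 3 = 6 features.
cat_dim1 = [row_a + row_b for row_a, row_b in zip(a, b)]
print(cat_dim1)  # shape (2, 6), not (2, 2, 3)
```

So the downstream model's input layer simply has as many input neurons as the combined feature count, and each sample is still a single vector.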
I get an error in the last notebook, even though my code is the same as the provided solutions.
Where might the mistake be in the sample notebooks?
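One common cause, consistent with the reply's hint about the last batch: if the dataset size is not divisible by the batch size, the final batch is smaller, and any code that hard-codes the batch size in a reshape or loss computation will fail only on that batch. A sketch (the sizes here are assumptions for illustration):

```python
# Assumed sizes: 10 samples, batch size 4 -> last batch has only 2 samples.
dataset_size = 10
batch_size = 4

batches = [list(range(i, min(i + batch_size, dataset_size)))
           for i in range(0, dataset_size, batch_size)]
sizes = [len(b) for b in batches]
print(sizes)  # [4, 4, 2] -- only the final batch has a different shape
```

In PyTorch this is typically handled either by passing `drop_last=True` to the `DataLoader`, or by using the actual batch size (e.g. `y.shape[0]`) instead of a hard-coded constant when reshaping.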
Minor things
The audio quality is low, with echoes and background noise; the video of class 9 is an example. I could not understand the audio without subtitles on.
Again thank you so much for putting all this together, it is an amazing opportunity! Looking forward to cryptography basics!
Cheers!