Reminder
I have read the above rules and searched the existing issues.
Description
Sometimes the pre-embedding (hidden) layer outputs or the prediction logits are more useful than the predicted tokens themselves.
To achieve features similar to those of Guidance-AI or layer-wise analysis, is there a configuration option for batch inference?
For example,
--output-layer: -1
--output-layer-length: 10
to obtain the prediction logits of the top 10 tokens.
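For reference, here is a minimal sketch of how these outputs can be obtained today with a plain Hugging Face transformers forward pass, assuming the model is loadable via AutoModelForCausalLM. The checkpoint name and the k=10 value are placeholders, and --output-layer / --output-layer-length are only the proposed flags, not existing options of this project.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-2-7b-hf"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

inputs = tokenizer(["The capital of France is"], return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)

# Logits for the next token: shape (batch, vocab_size).
next_token_logits = outputs.logits[:, -1, :]

# Top-10 candidate tokens and their logits (the --output-layer-length: 10 case).
top_logits, top_ids = torch.topk(next_token_logits, k=10, dim=-1)
print(tokenizer.convert_ids_to_tokens(top_ids[0].tolist()))

# Hidden states of the last layer (the --output-layer: -1 case):
# shape (batch, seq_len, hidden_size).
last_hidden = outputs.hidden_states[-1]
```

Exposing equivalent switches in the batch-inference configuration would presumably just forward these tensors to the output file alongside the generated tokens.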
Pull Request
No response