Replies: 1 comment
- I have tried to add the GELU activation setting with `activation_setting = { ... }` followed by `QuantizationSetting.register(torch.nn.GELU(), activation_setting)`, but I am still getting the error.
  
- Getting the error `AssertionError: GELU is not registered, please register setting with QuantizationSetting.register()`. I tried `from nni.compression.quantization import QuantizationSetting`, but it reports that no such module is available under `compression`.
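Since the thread does not show where `QuantizationSetting` actually lives, one way to stop guessing import paths is to search the installed package for it. This is a minimal, generic Python sketch that only assumes `nni` is installed; it does not assume the class exists there:

```python
import importlib
import pkgutil

import nni  # the package being searched; swap in whichever quantization library you installed

# Walk every submodule of the package and report the ones that define a
# `QuantizationSetting` attribute, so the correct import path (if any)
# can be read off instead of guessed.
for mod_info in pkgutil.walk_packages(nni.__path__, prefix="nni."):
    try:
        mod = importlib.import_module(mod_info.name)
    except Exception:
        continue  # optional submodules may fail to import; skip them
    if hasattr(mod, "QuantizationSetting"):
        print(f"found QuantizationSetting in {mod_info.name}")
```

If nothing is printed, the class most likely comes from a different library than the one installed, which would also explain why the import from `nni.compression.quantization` fails.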