How to change RPU config? #571
-
Hi there! Suppose you've trained a model in a hardware-aware fashion and saved it. As such, when you load it back, the saved RPU config is restored along with the weights. Now, let's say I want to change some parameters of the RPU config at load time. So far I've been using a workaround that I'm not sure is the intended way. How can I do that? What is the most correct way?
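For concreteness, the setup I have in mind is roughly the following (a minimal sketch; the layer size, the `InferenceRPUConfig` choice and the checkpoint path are just placeholders for illustration):

```python
import torch
from torch.nn import Linear

from aihwkit.nn.conversion import convert_to_analog
from aihwkit.simulator.configs import InferenceRPUConfig

# Convert a digital layer to its analog counterpart for hardware-aware training.
digital_model = Linear(10, 5)
analog_model = convert_to_analog(digital_model, InferenceRPUConfig())

# ... hardware-aware training loop ...

# Save the trained model; the state dict carries the RPU config along with the weights.
torch.save(analog_model.state_dict(), "analog_model.pt")
```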
-
Hi @edminge,

thanks for the question. The most general way, if you have a checkpoint file, is to construct a new model with the new `RPUConfig` and do:

```python
analog_model.load_state_dict(state_dict, load_rpu_config=False)
```

However, there is also a convenient function to change the `RPUConfig` of an existing model via:

```python
for tile in analog_model.analog_tiles():
    tile.to(rpu_config=rpu_config)
```

This loops through all the tiles (aka crossbars) and applies the new `RPUConfig`. Note that different tiles could in principle have different `RPUConfig`s, to specify e.g. different noise levels, so the `RPUConfig` is a tile property and not a model property.
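To make the checkpoint route concrete, here is a minimal self-contained sketch (the `AnalogLinear` layer, the specific `InferenceRPUConfig` and the file name are placeholder assumptions; only the `load_state_dict(..., load_rpu_config=False)` call is the API from the reply above):

```python
import torch

from aihwkit.nn import AnalogLinear
from aihwkit.simulator.configs import InferenceRPUConfig

# Load the checkpoint saved from the hardware-aware-trained model.
state_dict = torch.load("analog_model.pt")  # placeholder path

# Build a fresh model of the same architecture, but with the *new* RPU config.
new_rpu_config = InferenceRPUConfig()
# ... adjust new_rpu_config fields here (e.g. noise settings) ...
analog_model = AnalogLinear(10, 5, rpu_config=new_rpu_config)

# Load the trained weights while keeping the new config instead of the saved one.
analog_model.load_state_dict(state_dict, load_rpu_config=False)
```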
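And a sketch of the in-place route (which, as the follow-up below notes, requires the 0.9.0 release); again the model and the configs are placeholders, while the tile loop itself is the one from the reply:

```python
from aihwkit.nn import AnalogLinear
from aihwkit.simulator.configs import InferenceRPUConfig

# An already-trained analog model (placeholder architecture).
analog_model = AnalogLinear(10, 5, rpu_config=InferenceRPUConfig())

# The new config to switch to, e.g. with different noise settings.
rpu_config = InferenceRPUConfig()

# The RPU config lives on each tile rather than on the model, so the switch is
# applied tile by tile; different tiles could even receive different configs.
for tile in analog_model.analog_tiles():
    tile.to(rpu_config=rpu_config)
```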
-
Apologies for the double post. To properly make use of your suggestion I need to install 0.9.0 with CUDA support, but installing from a local package (i.e. pulling from git and building it myself) has given me trouble. Any tips for installing this version smoothly? Secondly, perhaps I am misunderstanding something here. Many thanks.