Replies: 1 comment
I seem to have the same issue. Did you figure out a workaround?
This is my model:
Layer (type)                 Output Shape              Param #
=================================================================
input_1 (InputLayer)         [(None, 16, 128, 1)]      0
conv_0 (Conv2D)              (None, 16, 128, 256)      256
bn_conv_0 (BatchNormalizatio (None, 16, 128, 256)      1024
conv_act_0 (Activation)      (None, 16, 128, 256)      0
conv_1 (Conv2D)              (None, 16, 128, 256)      65536
bn_conv_1 (BatchNormalizatio (None, 16, 128, 256)      1024
conv_act_1 (Activation)      (None, 16, 128, 256)      0
conv_2 (Conv2D)              (None, 16, 128, 2)        512
bn_conv_2 (BatchNormalizatio (None, 16, 128, 2)        8
conv_act_2 (Activation)      (None, 16, 128, 2)        0
flatten (Flatten)            (None, 4096)               0
dense_0 (Dense)              (None, 1)                  4096
bn_dense_0 (BatchNormalizati (None, 1)                  4
dense_act_0 (Activation)     (None, 1)                  0
dense_1 (Dense)              (None, 32)                 32
bn_dense_1 (BatchNormalizati (None, 32)                 128
dense_act_1 (Activation)     (None, 32)                 0
output_dense (Dense)         (None, 2)                  66
output_softmax (Activation)  (None, 2)                  0
=================================================================
Total params: 72,686
Trainable params: 71,592
Non-trainable params: 1,094
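For reference, here is a minimal Keras sketch that reproduces the summary above. The 1x1 kernel sizes and the use_bias settings are inferred from the parameter counts; the ReLU activations are an assumption (only the final softmax is confirmed by the layer name output_softmax).

```python
# Hedged reconstruction of the model in the summary above.
# 1x1 kernels and use_bias=False are inferred from the Param # column;
# ReLU activations are assumed, only the final softmax is confirmed.
from tensorflow.keras.layers import (Input, Conv2D, BatchNormalization,
                                     Activation, Flatten, Dense)
from tensorflow.keras.models import Model

inputs = Input(shape=(16, 128, 1), name='input_1')

x = inputs
for i, filters in enumerate([256, 256, 2]):
    x = Conv2D(filters, (1, 1), padding='same', use_bias=False,
               name=f'conv_{i}')(x)
    x = BatchNormalization(name=f'bn_conv_{i}')(x)
    x = Activation('relu', name=f'conv_act_{i}')(x)  # assumed ReLU

x = Flatten(name='flatten')(x)

for i, units in enumerate([1, 32]):
    x = Dense(units, use_bias=False, name=f'dense_{i}')(x)
    x = BatchNormalization(name=f'bn_dense_{i}')(x)
    x = Activation('relu', name=f'dense_act_{i}')(x)  # assumed ReLU

x = Dense(2, name='output_dense')(x)
outputs = Activation('softmax', name='output_softmax')(x)

model = Model(inputs, outputs)
model.summary()  # should report 72,686 total parameters
```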
When I run hls_model.build(csim=False), the process gets killed. The log is as follows. Please help me.
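For context, a minimal sketch of how such a model is typically converted and built with hls4ml, assuming the standard Python API; the config granularity, output_dir, and part values below are placeholders, not taken from the original post.

```python
# Hedged sketch of the hls4ml conversion and build flow.
# granularity, output_dir and part are placeholder choices, not the
# poster's actual settings.
import hls4ml

config = hls4ml.utils.config_from_keras_model(model, granularity='model')

hls_model = hls4ml.converters.convert_from_keras_model(
    model,
    hls_config=config,
    output_dir='hls_project',      # placeholder output directory
    part='xcu250-figd2104-2L-e',   # placeholder FPGA part
)

hls_model.compile()           # builds the C++ emulation library
hls_model.build(csim=False)   # runs the HLS synthesis flow; this is the
                              # step reported as being killed
```

If the build is killed by the operating system, it is often during synthesis of large, fully parallel layers running out of memory; checking available RAM and increasing the ReuseFactor in the hls4ml configuration may help.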