Commit ebb22a7

Update phishing_email_detection_gpt2.py
Fix access to .vocab_size for new tokenizer.
1 parent 6951f19 commit ebb22a7

File tree

1 file changed: +1 −1 lines changed


phishing_email_detection_gpt2.py

Lines changed: 1 addition & 1 deletion
@@ -337,7 +337,7 @@ def call(self, x):
 inp = tf.keras.layers.Input(shape=(), dtype=tf.string)
 # gp2_tokenizer = TokenizerLayer(max_seq_length=max_seq_length)
 gp2_tokenizer = NewTokenizerLayer(max_seq_length=max_seq_length,tokenizer_checkpoint=tokenizer_checkpoint)
-VOCABULARY_SIZE = gp2_tokenizer.tokenizer.vocabulary_size()
+VOCABULARY_SIZE = gp2_tokenizer.tokenizer.vocab_size
 tokens = gp2_tokenizer(inp)
 
 # On larger hardware, this could probably be increased considerably and
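The likely reason for this one-line fix (an assumption based on the commit message, not stated in the diff): KerasNLP-style tokenizers expose the vocabulary size as a `vocabulary_size()` method, while Hugging Face `transformers` tokenizers expose it as a `vocab_size` property, so calling it as a method on the new tokenizer would fail. A minimal self-contained sketch of the two API shapes (both classes and the value 50257, GPT-2's vocabulary size, are illustrative stand-ins):

```python
class KerasNLPStyleTokenizer:
    """Mimics the keras_nlp API shape: vocabulary size is a method."""
    def vocabulary_size(self):
        return 50257  # GPT-2's vocabulary size, used here as an example value


class HFStyleTokenizer:
    """Mimics the transformers API shape: vocab_size is a property."""
    @property
    def vocab_size(self):
        return 50257


old_tok = KerasNLPStyleTokenizer()
new_tok = HFStyleTokenizer()

print(old_tok.vocabulary_size())  # method call works on the old tokenizer
print(new_tok.vocab_size)         # plain attribute access on the new one

# new_tok.vocab_size() would raise TypeError: 'int' object is not callable,
# which is why the commit drops the parentheses and switches the name.
```

Swapping `gp2_tokenizer.tokenizer.vocabulary_size()` for `gp2_tokenizer.tokenizer.vocab_size` is therefore the minimal change needed to keep `VOCABULARY_SIZE` correct after replacing `TokenizerLayer` with `NewTokenizerLayer`.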
