Support Phi 3.5 MoE #9115
Dampfinchen started this conversation in Ideas
Replies: 3 comments · 2 replies
-
Microsoft released a 16x3.8B MoE model. In my short tests it has proven to be extremely capable. I would really like to see llama.cpp support it!
https://huggingface.co/microsoft/Phi-3.5-MoE-instruct
-
From: https://huggingface.co/microsoft/Phi-3.5-MoE-instruct/blob/main/config.json

```json
{
  "_name_or_path": "Phi-3.5-MoE-instruct",
  "architectures": [
    "PhiMoEForCausalLM"
  ],
  "attention_bias": true,
  "attention_dropout": 0.0,
  "auto_map": {
    "AutoConfig": "configuration_phimoe.PhiMoEConfig",
    "AutoModelForCausalLM": "modeling_phimoe.PhiMoEForCausalLM"
  },
  "bos_token_id": 1,
  "eos_token_id": 32000,
  "hidden_act": "silu",
  "hidden_dropout": 0.0,
  "hidden_size": 4096,
  "initializer_range": 0.02,
  "input_jitter_noise": 0.01,
  "intermediate_size": 6400,
  "lm_head_bias": true,
  "max_position_embeddings": 131072,
  "model_type": "phimoe",
  "num_attention_heads": 32,
  "num_experts_per_tok": 2,
  "num_hidden_layers": 32,
  "num_key_value_heads": 8,
  "num_local_experts": 16,
  "original_max_position_embeddings": 4096,
  "output_router_logits": false,
  "rms_norm_eps": 1e-05,
  "rope_scaling": {
    "long_factor": [
      1.0199999809265137,
      1.0299999713897705,
      "...",
      64.95999908447266
    ],
    "long_mscale": 1.243163121016122,
    "original_max_position_embeddings": 4096,
    "short_factor": [
      1.0,
      1.0399999618530273,
      1.0399999618530273,
      "...",
      2.7899999618530273
    ],
    "short_mscale": 1.243163121016122,
    "type": "longrope"
  },
  "rope_theta": 10000.0,
  "router_aux_loss_coef": 0.0,
  "router_jitter_noise": 0.01,
  "sliding_window": 131072,
  "tie_word_embeddings": false,
  "torch_dtype": "bfloat16",
  "transformers_version": "4.43.3",
  "use_cache": true,
  "vocab_size": 32064
}
```
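A few useful shape facts follow directly from those fields. As a minimal sketch (standard-library Python only; the local path to config.json is an assumption), one can parse the file and print the MoE and attention geometry:

```python
import json

# Path is hypothetical -- point it at a local copy of the model's config.json.
with open("Phi-3.5-MoE-instruct/config.json") as f:
    cfg = json.load(f)

hidden = cfg["hidden_size"]            # 4096
heads = cfg["num_attention_heads"]     # 32
kv_heads = cfg["num_key_value_heads"]  # 8
experts = cfg["num_local_experts"]     # 16
active = cfg["num_experts_per_tok"]    # 2

print(f"head_dim          : {hidden // heads}")    # 128
print(f"GQA groups        : {heads // kv_heads}")  # 4 query heads per KV head
print(f"experts (total)   : {experts}")
print(f"experts per token : {active} of {experts} routed (sparse activation)")
print(f"context           : {cfg['max_position_embeddings']} "
      f"(longrope-extended from {cfg['original_max_position_embeddings']})")
```

In particular, only 2 of the 16 experts are routed per token, so the active parameter count per token is far below the total parameter count.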
-
Please see #9119
-
Support for Phi-MoE architectures was recently merged (commit f8feb4b).
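For anyone wanting to try it: after converting the HF checkpoint to GGUF (llama.cpp ships a convert_hf_to_gguf.py script for this), a quick smoke test is possible from Python via the llama-cpp-python bindings. This is a sketch under assumptions: the GGUF filename is hypothetical, and a build recent enough to include the Phi-MoE support is required.

```python
# Minimal smoke test -- assumes `pip install llama-cpp-python` (built against a
# llama.cpp version with Phi-MoE support) and an already-converted GGUF on disk.
from llama_cpp import Llama

llm = Llama(
    model_path="Phi-3.5-MoE-instruct-Q4_K_M.gguf",  # hypothetical filename
    n_ctx=4096,  # modest context; the model itself supports up to 131072
)

out = llm("Explain mixture-of-experts routing in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```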