Add phi1 and phi4 models #1377
base: main
Conversation
module_name = build_module_name(variant, Source.HUGGINGFACE, Framework.PYTORCH, Task.CAUSAL_LM)

# Record Forge Property
# record_forge_property("model_name", module_name)
Why is this commented out?
module_name = build_module_name(variant, Source.HUGGINGFACE, Framework.PYTORCH, Task.CAUSAL_LM)

# Record Forge Property
# record_forge_property("model_name", module_name)
Please add the line below to all priority models:
record_forge_property("group", "priority")
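The reviewer's request can be sketched as follows. This is a minimal illustration with a local stand-in recorder, not the real Forge helper, and `module_name` is a made-up example value (in the tests it comes from `build_module_name(...)`).

```python
# Minimal sketch of the requested property calls. `record_forge_property`
# here is a stand-in that just collects key/value pairs; in the real tests
# it comes from the Forge test harness.
recorded = {}

def record_forge_property(key, value):
    # Stand-in: the real helper reports the property to the test harness.
    recorded[key] = value

# Hypothetical example value; real tests derive this via build_module_name(...)
module_name = "pt_phi1_clm_hf"

record_forge_property("model_name", module_name)
record_forge_property("group", "priority")  # the tag requested for priority models
```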
)

# Record Forge Property
record_forge_property("model_name", module_name)
same here
module_name = build_module_name(variant, Source.HUGGINGFACE, Framework.PYTORCH, Task.CAUSAL_LM)

# Record Forge Property
# record_forge_property("model_name", module_name)
same here
config_dict = config.to_dict()
config_dict["return_dict"] = False
config_dict["use_cache"] = False
# config_dict["resid_pdrop"] = 0.0
Why is this commented out?
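For context, the override pattern in this block can be shown with a plain dict standing in for `PhiConfig.to_dict()`: `return_dict=False` makes the model return plain tuples, and `use_cache=False` disables the KV cache so the traced graph stays static. The `resid_pdrop` line may have been left commented because `eval()` already disables dropout, but that is a guess; the values below are illustrative, not taken from a real Phi checkpoint.

```python
# Plain-dict sketch of the config-override pattern (illustrative values).
config_dict = {"return_dict": True, "use_cache": True, "resid_pdrop": 0.1}

config_dict["return_dict"] = False  # return plain tuples instead of ModelOutput
config_dict["use_cache"] = False    # no KV cache -> static graph for compilation
# config_dict["resid_pdrop"] = 0.0  # redundant in eval mode, where dropout is off

# A real test would then rebuild the config: config = PhiConfig(**config_dict)
```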
compiled_model = forge.compile(framework_model, sample_inputs, module_name)

# Model Verification
# verify(sample_inputs, framework_model, compiled_model)
Why is this commented out?
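For context, a verification step like `verify` compares the compiled model's outputs against the framework model's, typically with a correlation-style check. A self-contained sketch of such a check (plain Python, not the actual Forge implementation):

```python
import math

def pcc(a, b):
    # Pearson correlation coefficient between two equal-length sequences.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = math.sqrt(sum((x - ma) ** 2 for x in a))
    db = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (da * db)

# Illustrative outputs: "golden" from the framework model, "observed" from
# the compiled one. A verification step would require the PCC to exceed a
# threshold such as 0.99.
golden = [0.1, 0.5, -0.2, 0.9]
observed = [0.1001, 0.4999, -0.2002, 0.8998]
assert pcc(golden, observed) > 0.99
```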
Codecov Report: All modified and coverable lines are covered by tests ✅

@@ Coverage Diff @@
## main #1377 +/- ##
=======================================
Coverage 43.40% 43.40%
=======================================
Files 48 48
Lines 7860 7860
=======================================
Hits 3412 3412
Misses 4448 4448

View full report in Codecov by Sentry.
# PhiConfig from pretrained variant, disable return_dict and caching.
config = PhiConfig.from_pretrained(variant)
config_dict = config.to_dict()
config_dict["return_dict"] = False
config_dict["use_cache"] = False
config = PhiConfig(**config_dict)

# Load tokenizer and model from HuggingFace
tokenizer = AutoTokenizer.from_pretrained(variant)
framework_model = PhiForCausalLM.from_pretrained(variant, config=config)
framework_model.eval()
Can you use download_model to load the model and tokenizer, so that we can get rid of the redundant code in each test?
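A helper along the lines the reviewer suggests could look like this. The name `download_model` and its signature here are assumptions (the repo's actual test utility may differ); the idea is to wrap any `from_pretrained`-style loader with retries so each test stops repeating the same load boilerplate.

```python
import time

def download_model(loader, *args, max_retries=3, wait_seconds=5, **kwargs):
    """Call loader(*args, **kwargs), retrying on transient failures."""
    for attempt in range(max_retries):
        try:
            return loader(*args, **kwargs)
        except Exception:
            if attempt == max_retries - 1:
                raise
            time.sleep(wait_seconds)

# Usage in a test (assuming HuggingFace transformers is available):
#   tokenizer = download_model(AutoTokenizer.from_pretrained, variant)
#   framework_model = download_model(PhiForCausalLM.from_pretrained, variant, config=config)
```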
No description provided.