
Add phi1 and phi4 models #1377

Draft · dsudhakarTT wants to merge 1 commit into main from dsudhakar/phi_models
Conversation

dsudhakarTT
Contributor

No description provided.

@dsudhakarTT self-assigned this on Mar 6, 2025
module_name = build_module_name(variant, Source.HUGGINGFACE, Framework.PYTORCH, Task.CAUSAL_LM)

# Record Forge Property
# record_forge_property("model_name", module_name)
Contributor

Why is this commented?

module_name = build_module_name(variant, Source.HUGGINGFACE, Framework.PYTORCH, Task.CAUSAL_LM)

# Record Forge Property
# record_forge_property("model_name", module_name)
Contributor

Please add the line below to all priority models:

record_forge_property("group", "priority")

cc: @meenakshiramanathan1
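
For illustration, a minimal sketch of where the requested property would sit in one of the new tests; the import path, fixture name, and variant string are assumptions based on the snippets quoted in this review, not the PR's actual code:

```python
# Sketch only: import path, fixture name, and variant are assumed for illustration.
import pytest

from test.models.utils import Framework, Source, Task, build_module_name


@pytest.mark.parametrize("variant", ["microsoft/phi-4"])  # hypothetical variant
def test_phi4_causal_lm(record_forge_property, variant):
    # Tag priority models so reporting can group them.
    record_forge_property("group", "priority")

    module_name = build_module_name(
        variant, Source.HUGGINGFACE, Framework.PYTORCH, Task.CAUSAL_LM
    )
    record_forge_property("model_name", module_name)

    # ... model load, forge.compile, verify, as in the rest of the test ...
```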

)

# Record Forge Property
record_forge_property("model_name", module_name)
Contributor

Same here.

module_name = build_module_name(variant, Source.HUGGINGFACE, Framework.PYTORCH, Task.CAUSAL_LM)

# Record Forge Property
# record_forge_property("model_name", module_name)
Contributor

Same here.

config_dict = config.to_dict()
config_dict["return_dict"] = False
config_dict["use_cache"] = False
# config_dict["resid_pdrop"] = 0.0
Contributor

Why is this commented?

compiled_model = forge.compile(framework_model, sample_inputs, module_name)

# Model Verification
# verify(sample_inputs, framework_model, compiled_model)
Contributor

Why is this commented?

@dsudhakarTT force-pushed the dsudhakar/phi_models branch from 48bae96 to 8a4a0c1 on March 6, 2025 at 10:44
@codecov-commenter

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 43.40%. Comparing base (87c5d8a) to head (8a4a0c1).
Report is 3 commits behind head on main.

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #1377   +/-   ##
=======================================
  Coverage   43.40%   43.40%           
=======================================
  Files          48       48           
  Lines        7860     7860           
=======================================
  Hits         3412     3412           
  Misses       4448     4448           


github-actions bot commented Mar 6, 2025

TT-Forge-FE Tests: 625 ran, 489 passed ✅, 136 skipped ⚠️, 0 failed
No test annotations available

github-actions bot commented Mar 6, 2025

TT-Forge-FE Tests: 684 ran, 541 passed ✅, 143 skipped ⚠️, 0 failed
No test annotations available

Comment on lines +31 to +41
# PhiConfig from pretrained variant, disable return_dict and caching.
config = PhiConfig.from_pretrained(variant)
config_dict = config.to_dict()
config_dict["return_dict"] = False
config_dict["use_cache"] = False
config = PhiConfig(**config_dict)

# Load tokenizer and model from HuggingFace
tokenizer = AutoTokenizer.from_pretrained(variant)
framework_model = PhiForCausalLM.from_pretrained(variant, config=config)
framework_model.eval()
Contributor

Can you use download_model to load the model and tokenizer, so that we can get rid of the redundant code in each test?
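
For illustration, a hedged sketch of the refactor being requested, assuming download_model is the shared test helper that wraps a HuggingFace loader callable; its import path and the helper's name below are assumptions, not the PR's code:

```python
# Sketch only: the import path of download_model is assumed.
from test.utils import download_model
from transformers import AutoTokenizer, PhiConfig, PhiForCausalLM


def load_phi_model_and_tokenizer(variant: str):
    # Same config handling as the snippet above: disable return_dict and caching.
    config = PhiConfig.from_pretrained(variant)
    config_dict = config.to_dict()
    config_dict["return_dict"] = False
    config_dict["use_cache"] = False
    config = PhiConfig(**config_dict)

    # Route the downloads through download_model so each test shares one code path.
    tokenizer = download_model(AutoTokenizer.from_pretrained, variant)
    framework_model = download_model(
        PhiForCausalLM.from_pretrained, variant, config=config
    )
    framework_model.eval()
    return tokenizer, framework_model
```

A shared helper like this (hypothetical name) would let each phi1/phi4 test call one loader instead of repeating the loading block.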
