Update clm.py
AakritiKinra authored Jan 1, 2025
1 parent a07dac5 commit d19bf86
Showing 1 changed file with 2 additions and 2 deletions.
llments/eval/factscore/clm.py: 2 additions & 2 deletions
@@ -64,10 +64,10 @@ def load_model(self) -> None:
     def _generate(
         self,
-        prompt: str,
+        prompts: Union[str, List[str]],
         sample_idx: int = 0,
         max_sequence_length: int = 2048,
         max_output_length: int = 128,
         end_if_newline: bool = False,
         end_if_second_newline: bool = False,
         verbose: bool = False,
@@ -76,8 +76,8 @@ def _generate(
         Args:
-            prompt (str): The input prompt to generate text from.
+            prompts (Union[str, List[str]]): Single prompt string or a list of prompt strings.
             sample_idx (int, optional): Index to differentiate between samples. Defaults to 0.
             max_sequence_length (int, optional): Maximum length of the input sequence.
                 Defaults to 2048.
            max_output_length (int, optional): Maximum length of the generated output.
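The commit widens `_generate` to accept either a single prompt string or a list of prompt strings. A minimal sketch of how a `Union[str, List[str]]` parameter like this is typically normalized inside the function body (the helper name `normalize_prompts` is illustrative, not taken from the repository):

```python
from typing import List, Union

def normalize_prompts(prompts: Union[str, List[str]]) -> List[str]:
    """Accept a single prompt string or a list of prompt strings.

    Returns a list in either case, so downstream batching code
    only ever has to deal with List[str].
    """
    if isinstance(prompts, str):
        # A lone string is wrapped in a one-element list.
        return [prompts]
    # A list (or other iterable of strings) is copied defensively.
    return list(prompts)

# Usage: both call styles produce a list of prompts.
single = normalize_prompts("Tell me about FActScore.")
batch = normalize_prompts(["prompt one", "prompt two"])
```

This pattern keeps the public signature flexible while the generation loop itself remains uniform over a list.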
