
Commit

refactor: introduced client_parameters attribute; renamed input_variables to template_variables; other_data to custom_data
MoritzLaurer committed Dec 9, 2024
1 parent bdb5645 commit f88a2eb
Showing 9 changed files with 122 additions and 114 deletions.
4 changes: 2 additions & 2 deletions docs/index.md
Original file line number Diff line number Diff line change
@@ -39,8 +39,8 @@ pip install hf-hub-prompts
>>> # 3. Inspect the template:
>>> prompt_template.template
[{'role': 'system', 'content': 'You are a coding assistant who explains concepts clearly and provides short examples.'}, {'role': 'user', 'content': 'Explain what {concept} is in {programming_language}.'}]
>>> # Check required input variables
>>> prompt_template.input_variables
>>> # Check required template variables
>>> prompt_template.template_variables
['concept', 'programming_language']

>>> # 4. Populate the template with variables
6 changes: 3 additions & 3 deletions docs/repo_types_examples.md
@@ -39,7 +39,7 @@ prompt_template = PromptTemplateLoader.from_hub(
)

print(prompt_template)
# ChatPromptTemplate(template=[{'role': 'system', 'content': '<artifacts_info> The assistant can create and reference artifacts during conversations. Artifacts are ... Claude is now being connected with a human.'}, {'role': 'user', 'content': '{user_message}'}], input_variables=['current_date', 'user_message'], metadata=[{'source': 'https://gist.github.com/dedlim/6bf6d81f77c19e20cd40594aa09e3ecd'}])
# ChatPromptTemplate(template=[{'role': 'system', 'content': '<artifacts_info> The assistant can create and reference artifacts during conversations. Artifacts are ... Claude is now being connected with a human.'}, {'role': 'user', 'content': '{user_message}'}], template_variables=['current_date', 'user_message'], metadata=[{'source': 'https://gist.github.com/dedlim/6bf6d81f77c19e20cd40594aa09e3ecd'}])
```

Prompt templates are downloaded as either `ChatPromptTemplate` or `TextPromptTemplate` classes. These classes make it easy to populate a prompt template and convert it into a format that's compatible with different LLM clients. The type is automatically determined based on whether the YAML contains a simple string (TextPromptTemplate) or a list of dictionaries following the OpenAI messages format (ChatPromptTemplate).
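As a rough sketch of that dispatch rule (an illustration only, not the library's actual loader code; the function name `template_kind` is made up here), the decision reduces to a type check on the parsed `template` value:

```python
def template_kind(template) -> str:
    """Illustrative dispatch: a plain string maps to TextPromptTemplate,
    a list of OpenAI-style message dicts maps to ChatPromptTemplate."""
    if isinstance(template, str):
        return "TextPromptTemplate"
    if isinstance(template, list) and all(
        isinstance(m, dict) and {"role", "content"} <= m.keys() for m in template
    ):
        return "ChatPromptTemplate"
    raise ValueError("template must be a string or a list of message dicts")

print(template_kind("Explain {{concept}}."))  # TextPromptTemplate
print(template_kind([{"role": "user", "content": "Hi"}]))  # ChatPromptTemplate
```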
@@ -48,8 +48,8 @@ Prompt templates are downloaded as either `ChatPromptTemplate` or `TextPromptTem
With the `create_messages` method, we can then populate the prompt template for a specific use-case.

```python
# Check which input variables the prompt template requires
print(prompt_template.input_variables)
# Check which variables the prompt template requires
print(prompt_template.template_variables)
# ['current_date', 'user_message']

user_message = "Create a simple calculator web application"
22 changes: 11 additions & 11 deletions docs/standard_prompt_format.md
@@ -6,7 +6,7 @@ A prompt template YAML or JSON file must follow the following standardized struc

- Top-level key (required): `prompt`. This top-level key signals to the parser that the content of the file is a prompt template.
- Second-level key (required): `template`. This can be either a simple string, or a list of dictionaries following the OpenAI messages format. The messages format is recommended for use with LLM APIs or inference containers. Variable placeholders for populating the prompt template string are denoted with double curly brackets {{...}}.
- Second-level keys (optional): (1) `input_variables`: an optional list of variables for populating the prompt template. This is also used for input validation; (2) `metadata`: Other information, such as the source, date, author etc.; (3) Any other key of relevance, such as `client_settings` with parameters for reproducibility with a specific inference client, or `metrics` from evaluations on specific datasets.
- Second-level keys (optional): (1) `template_variables`: a list of variables for populating the prompt template. This is used for input validation and to make the required variables for long templates easily accessible; (2) `metadata`: information about the template such as the source, date, author etc.; (3) `client_parameters`: parameters for the inference client (e.g. temperature, model).

Example prompt template following the standard in YAML:
```yaml
@@ -16,7 +16,7 @@ prompt:
content: "You are a coding assistant who explains concepts clearly and provides short examples."
- role: "user"
content: "Explain what {{concept}} is in {{programming_language}}."
input_variables:
template_variables:
- concept
- programming_language
metadata:
@@ -29,7 +29,7 @@ prompt:
author: "Karl Marx"
```
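As a rough illustration of the double-brace syntax above (a minimal sketch, not the library's actual populator; the `populate` helper is made up here), filling the placeholders amounts to a regex substitution:

```python
import re

def populate(template: str, **variables) -> str:
    """Replace {{var}} placeholders with the supplied variable values."""
    def sub(match: re.Match) -> str:
        name = match.group(1).strip()
        if name not in variables:
            raise KeyError(f"Missing template variable: {name}")
        return str(variables[name])
    return re.sub(r"\{\{(.*?)\}\}", sub, template)

msg = populate(
    "Explain what {{concept}} is in {{programming_language}}.",
    concept="list comprehension",
    programming_language="Python",
)
print(msg)  # Explain what list comprehension is in Python.
```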
**Naming convention:** We call a file a *"prompt template"* when it has placeholders ({{...}}) for dynamically populating the template, similar to an f-string. This makes files more useful and reusable by others for different use-cases. Once the placeholders in the template are populated with specific input variables, we call it a *"prompt"*.
**Naming convention:** We call a file a *"prompt template"* when it has placeholders ({{...}}) for dynamically populating the template, similar to an f-string. This makes files more useful and reusable by others for different use-cases. Once the placeholders in the template are populated with specific variables, we call it a *"prompt"*.
The following example illustrates how the prompt template becomes a prompt.
@@ -41,13 +41,13 @@ The following example illustrates how the prompt template becomes a prompt.
... filename="code_teacher.yaml"
... )

>>> # 2. Inspect the template and its input variables:
>>> # 2. Inspect the template and its variables:
>>> prompt_template.template
[{'role': 'system', 'content': 'You are a coding assistant who explains concepts clearly and provides short examples.'}, {'role': 'user', 'content': 'Explain what {concept} is in {programming_language}.'}]
>>> prompt_template.input_variables
>>> prompt_template.template_variables
['concept', 'programming_language']

>>> # 3. Populate the template with its input variables
>>> # 3. Populate the template with its variables
>>> prompt = prompt_template.populate_template(
... concept="list comprehension",
... programming_language="Python"
@@ -120,10 +120,10 @@ prompt_template_langchain = prompt_template.to_langchain_template()


### Existing prompt template repos:
- [LangChain Hub](https://smith.langchain.com/hub) for prompts (main hub is proprietary. See the old public oss [repo](https://github.com/hwchase17/langchain-hub), using JSON or YAML, with {...} for input variables)
- [LangChain Hub](https://smith.langchain.com/hub) for prompts (main hub is proprietary. See the old public oss [repo](https://github.com/hwchase17/langchain-hub), using JSON or YAML, with {...} for template variables)
- [LangGraph Templates](https://blog.langchain.dev/launching-langgraph-templates/) (underlying data structure unclear, does not seem to have a collaborative way of sharing templates)
- [LlamaHub](https://llamahub.ai/) (seems to use GitHub as backend)
- [Deepset Prompt Hub](https://github.com/deepset-ai/prompthub) (seems not maintained anymore, used YAML with {...} for input variables)
- distilabel [templates](https://github.com/argilla-io/distilabel/tree/main/src/distilabel/steps/tasks/templates) and [tasks](https://distilabel.argilla.io/latest/components-gallery/tasks/) ([source](https://github.com/argilla-io/distilabel/tree/main/src/distilabel/steps/tasks)) (using pure jinja2 with {{ ... }} for input variables)
- [Langfuse](https://langfuse.com/docs/prompts/get-started), see also [example here](https://langfuse.com/guides/cookbook/prompt_management_langchain) (no public prompt repo, using JSON internally with {{...}} for input variables)
- [Promptify](https://github.com/promptslab/Promptify/tree/27a53fa8e8f2a4d90f887d06ece65a44466f873a/promptify/prompts) (not maintained anymore, used jinja1 and {{ ... }} for input variables)
- [Deepset Prompt Hub](https://github.com/deepset-ai/prompthub) (seems not maintained anymore, used YAML with {...} for template variables)
- distilabel [templates](https://github.com/argilla-io/distilabel/tree/main/src/distilabel/steps/tasks/templates) and [tasks](https://distilabel.argilla.io/latest/components-gallery/tasks/) ([source](https://github.com/argilla-io/distilabel/tree/main/src/distilabel/steps/tasks)) (using pure jinja2 with {{ ... }} for template variables)
- [Langfuse](https://langfuse.com/docs/prompts/get-started), see also [example here](https://langfuse.com/guides/cookbook/prompt_management_langchain) (no public prompt repo, using JSON internally with {{...}} for template variables)
- [Promptify](https://github.com/promptslab/Promptify/tree/27a53fa8e8f2a4d90f887d06ece65a44466f873a/promptify/prompts) (not maintained anymore, used jinja1 and {{ ... }} for template variables)
30 changes: 10 additions & 20 deletions examples/example-usage.ipynb
@@ -53,18 +53,7 @@
"execution_count": 2,
"id": "947ac23c",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"<module 'hf_hub_prompts.tools' from '/Users/moritzlaurer/huggingface/projects/hf-hub-prompts/hf_hub_prompts/tools.py'>"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"outputs": [],
"source": [
"#import importlib\n",
"#import hf_hub_prompts.hub_api\n",
@@ -86,15 +75,15 @@
},
{
"cell_type": "code",
"execution_count": 2,
"execution_count": 3,
"id": "08e6da78",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"TextPromptTemplate(template='Translate the following text to {{language}}:\\n{{..., input_variables=['language', 'text'], metadata={'name': 'Simple Translator', 'description': 'A si..., other_data={}, populator_type='double_brace', populator=<hf_hub_prompts.prompt_templates.DoubleBracePopula...)\n",
"TextPromptTemplate(template='Translate the following text to {{language}}:\\n{{..., input_variables=['language', 'text'], metadata={'name': 'Simple Translator', 'description': 'A si..., client_parameters={}, other_data={}, populator_type='double_brace', populator=<hf_hub_prompts.prompt_templates.DoubleBracePopula...)\n",
"True\n"
]
}
@@ -127,25 +116,25 @@
},
{
"cell_type": "code",
"execution_count": 3,
"execution_count": 6,
"id": "cd085b87",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"ChatPromptTemplate(template=[{'role': 'system', 'content': 'You are a coding a..., input_variables=['concept', 'programming_language'], metadata={'name': 'Code Teacher', 'description': 'A simple ..., other_data={}, populator_type='double_brace', populator=<hf_hub_prompts.prompt_templates.DoubleBracePopula...)\n",
"ChatPromptTemplate(template=[{'role': 'system', 'content': 'You are a coding a..., input_variables=['concept', 'programming_language'], metadata={'name': 'Code Teacher', 'description': 'A simple ..., other_data={}, populator_type='double_brace', populator=<hf_hub_prompts.prompt_templates.DoubleBracePopula...)\n"
"ChatPromptTemplate(template=[{'role': 'system', 'content': 'You are a coding a..., input_variables=['concept', 'programming_language'], metadata={'name': 'Code Teacher', 'description': 'A simple ..., client_parameters={'temperature': 0.5}, other_data={}, populator_type='double_brace', populator=<hf_hub_prompts.prompt_templates.DoubleBracePopula...)\n",
"ChatPromptTemplate(template=[{'role': 'system', 'content': 'You are a coding a..., input_variables=['concept', 'programming_language'], metadata={'name': 'Code Teacher', 'description': 'A simple ..., client_parameters={}, other_data={}, populator_type='double_brace', populator=<hf_hub_prompts.prompt_templates.DoubleBracePopula...)\n"
]
},
{
"data": {
"text/plain": [
"True"
"False"
]
},
"execution_count": 3,
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
@@ -169,7 +158,8 @@
"template_3 = ChatPromptTemplate(\n",
" template=template,\n",
" input_variables=input_variables,\n",
" metadata=metadata\n",
" metadata=metadata,\n",
" client_parameters={\"temperature\": 0.5}\n",
")\n",
"print(template_3)\n",
"\n",
