
Add CloudFlare models helper #161

Open
hadley opened this issue Nov 18, 2024 · 5 comments

@hadley
Member

hadley commented Nov 18, 2024

https://developers.cloudflare.com/workers-ai/models/

@durraniu

durraniu commented Mar 6, 2025

Thanks for creating ellmer. I am trying to use the Cloudflare Workers AI models but couldn't figure out how to configure `chat_openai()`. Here is a working example with httr2:

ACCOUNT_ID <- Sys.getenv("ACCOUNT_ID")
API_KEY <- Sys.getenv("API_KEY")
max_tokens <- 1000
base_url <- "https://api.cloudflare.com/client/v4/accounts/"
model <- "@cf/meta/llama-3.3-70b-instruct-fp8-fast"
url <- paste0(base_url, ACCOUNT_ID, "/ai/run/", model)

system_prompt <- "You tell 5-sentence stories"
prompt <- "Tell me a story about a turtle"

response <- httr2::request(url) |>
  httr2::req_headers(
    "Authorization" = paste("Bearer", API_KEY)
  ) |>
  httr2::req_body_json(list(
    max_tokens = max_tokens,
    messages = list(
      list(role = "system",
           content = system_prompt),
      list(
        role = "user",
        content = prompt
      )
    ))) |>
  httr2::req_method("POST") |>
  httr2::req_error(is_error = \(resp) FALSE) |>
  httr2::req_perform() |>
  httr2::resp_body_json()

response$result$response
# > response$result$response
# [1] "In a small pond, a turtle named Finley lived a peaceful life surrounded by water lilies and lazy fish. One day, while swimming near the surface, Finley spotted a shiny object lying on the bottom of the pond. Curiosity getting the better of him, Finley dove down to investigate and found an old harmonica buried in the mud. As he swam back to the surface, Finley began to play a lively tune on the harmonica, and the other pond creatures gathered around to listen. From that day on, Finley became known as the pond's resident musician, and his harmonica playing brought joy to all who lived there."

This is what I tried with ellmer:

library(ellmer)

chat <- chat_openai(
  base_url = paste0(base_url, ACCOUNT_ID, "/ai/run/"),
  api_key = API_KEY,
  system_prompt = system_prompt,
  model = model
)
> live_console(chat)
╔═══════════════════════════════╗
║ Entering chat console.        ║
║ Use """ for multi-line input. ║
║ Type 'Q' to quit.             ║
╚═══════════════════════════════╝
>>> Tell me a story
Error in `req_perform_connection()`:
! HTTP 400 Bad Request.

Is it possible to use `chat_openai()` with Cloudflare at the moment?

@gadenbuie
Contributor

gadenbuie commented Mar 7, 2025

From the "OpenAI compatible API endpoints" page in the Cloudflare Workers AI docs, it looks like the endpoint for the OpenAI-compatible API is `/ai/v1/`.

It almost works, but it looks like Cloudflare omits the `role` field in its responses, which causes an error in ellmer.

library(ellmer)

base_url <- "https://api.cloudflare.com/client/v4/accounts/"
model <- "@cf/meta/llama-3.3-70b-instruct-fp8-fast"

chat <- chat_openai(
  base_url = paste0(base_url, Sys.getenv("CLOUDFLARE_ACCOUNT_ID"), "/ai/v1/"),
  api_key = Sys.getenv("CLOUDFLARE_API_KEY"),
  model = model
)

chat$chat("Write a haiku about pizza")
#> Melty cheesy bliss
#> Flavors dancing on my tongue
#> Pizza's sweet
#> Error: <ellmer::Turn> object properties are invalid:
#> - @role must be <character>, not <NULL>

Btw, I created an account following these instructions and was able to run the above on the free tier.

@gadenbuie
Contributor

@hadley what do you think about using `"assistant"` as the default role, either at the end of `method(value_turn, ProviderOpenAI)` or more generally?

Turn(message$role, content, json = result, tokens = tokens)

A very simple fix for this case is

- Turn(message$role, content, json = result, tokens = tokens)
+ Turn(message$role %||% "assistant", content, json = result, tokens = tokens)

but maybe that's a generally reasonable assumption that could be made in Turn?
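For reference, a minimal sketch of the `%||%` (null-default) operator used in the proposed fix; it is exported by rlang and, as of R 4.4.0, also in base R (defined inline here so the snippet is self-contained):

```r
# Null-default operator: return x unless it is NULL, in which case return y.
# (Defined here for illustration; ellmer gets it from rlang / base R.)
`%||%` <- function(x, y) if (is.null(x)) y else x

NULL %||% "assistant"   # "assistant" — falls back when the API omits role
"user" %||% "assistant" # "user" — keeps an explicitly supplied role
```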

@hadley
Member Author

hadley commented Mar 8, 2025

@gadenbuie yeah, I think it's fine to fix that in value_turn. I'm not sure I can say why, but it feels better to fix at that level than in the Turn constructor.

If you do make that fix, it would be worth taking a look to see if other value_turn() methods for classes that extend ProviderOpenAI could be removed/simplified.

@hadley
Member Author

hadley commented Mar 8, 2025

(I think it's still worth having a chat_cloudflare() even if it's mostly just for advertising purposes)
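For what it's worth, a hypothetical sketch of what such a helper could look like, just wrapping `chat_openai()` with the OpenAI-compatible base URL discussed above — the function name, argument names, and default model are assumptions, not the eventual ellmer API:

```r
# Hypothetical: build the OpenAI-compatible base URL for a Cloudflare account.
cloudflare_base_url <- function(account) {
  paste0("https://api.cloudflare.com/client/v4/accounts/", account, "/ai/v1/")
}

# Hypothetical chat_cloudflare() wrapper; the default model and the
# environment variable names are illustrative only.
chat_cloudflare <- function(system_prompt = NULL,
                            account = Sys.getenv("CLOUDFLARE_ACCOUNT_ID"),
                            api_key = Sys.getenv("CLOUDFLARE_API_KEY"),
                            model = "@cf/meta/llama-3.3-70b-instruct-fp8-fast") {
  ellmer::chat_openai(
    base_url = cloudflare_base_url(account),
    api_key = api_key,
    model = model,
    system_prompt = system_prompt
  )
}
```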
