
Suggestions for fastapi support #3

Open

StudyingLover opened this issue Apr 4, 2024 · 12 comments

Comments

@StudyingLover

I often use FastAPI to build the backend of my applications, so I was excited to see that I can write code using Python and FastAPI with Cloudflare Workers. However, after using it, I don't feel as comfortable as before.

For example, if I want to access my D1 database, I need to declare the handler's parameter as `req: Request` and use `env = req.scope["env"]` to get the env. Doing so means FastAPI can't validate my parameters for me, which is one of the main advantages of using it. Additionally, the lack of type hints makes development harder for me.

On top of that, the absence of an ORM for D1 makes it quite troublesome to use. Do you have any plans to develop an ORM?

@kflansburg

We have plans for an SDK to make things more pythonic. I think that would be the best place to improve the FastAPI ergonomics for bindings.

Regarding an ORM, I think it would make sense to try to support SQLAlchemy. Is that too opinionated?

@hoodmane

hoodmane commented Apr 4, 2024

Pyodide already includes SQLAlchemy. If we add raw socket support, the two should work very well together.

@dom96
Contributor

dom96 commented Apr 4, 2024

If we want to make SQLAlchemy work with D1 then we'd need to implement a Dialect for it.

If we add socket support, then we can support Hyperdrive in SQLAlchemy pretty much for free, and support hosted databases that way. But this won't work with D1.
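Roughly, such a Dialect's skeleton might look like this (all names here are hypothetical; a real implementation would also need a PEP 249 "DBAPI" shim that forwards queries to the D1 binding):

```python
from sqlalchemy.dialects import registry
from sqlalchemy.engine.default import DefaultDialect

class D1Dialect(DefaultDialect):
    """Hypothetical skeleton of a SQLAlchemy Dialect for D1."""
    name = "d1"
    driver = "binding"
    supports_statement_cache = True

    @classmethod
    def import_dbapi(cls):
        # A real dialect would return a PEP 249 module that forwards
        # execute() calls to the D1 binding; left out of this sketch.
        raise NotImplementedError("D1 DBAPI shim not written yet")

# Registering the dialect would let create_engine("d1+binding://...")
# locate the class by name.
registry.register("d1.binding", __name__, "D1Dialect")
```

Most of the work would be in the DBAPI shim and in teaching the dialect about D1's SQLite-flavored type system, not in the class above.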

> Additionally, the lack of type hints makes development harder for me.

Yeah, I'd really love to see us offering type definitions like we do for TypeScript for our APIs. It's already on our todo list.

@kflansburg

Agreed; I wasn't sure what it was called, but a Dialect is what I had in mind. It would be great to support both.

@Qxxxx

Qxxxx commented Apr 16, 2024

Since this issue discusses environment variables, I also have some confusion.

  • Is it possible to get the env variable at file scope, instead of only inside the fetch function, as in the following code?

```python
model = ChatOpenAI(model="gpt-3.5-turbo", temperature=0, openai_api_key=env.OPENAI_API_KEY)

async def on_fetch(request, env):
    prompt = PromptTemplate.from_template("Complete the following sentence: I am a {profession} and ")
    chain = prompt | model

    res = await chain.ainvoke({"profession": "electrician"})
    return Response.new(res.split(".")[0].strip())
```

  • As this is a serverless function, does it still make sense to create variables/objects outside of the fetch function?

@hoodmane

hoodmane commented Apr 16, 2024

Well, we do have a way to get the env parameter:

```python
async def on_fetch(request, env: Env):
    ...
```

@Qxxxx

Qxxxx commented Apr 16, 2024

I may not have expressed myself clearly. I want to get the env parameter outside of `on_fetch`; is that possible?

@StudyingLover
Author

StudyingLover commented Apr 16, 2024 via email

@hoodmane

`os.getenv` requires all of the values to be strings; if we try to put anything else in there, we get a `TypeError`. Since many of the bindings aren't strings, I don't think this is a good approach.
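For example, the constraint is easy to demonstrate (`D1Binding` below is just a stand-in class for any non-string binding):

```python
import os

os.environ["API_KEY"] = "secret"        # plain string vars work fine
assert os.getenv("API_KEY") == "secret"

class D1Binding:
    """Stand-in for a non-string binding like a D1 database object."""

try:
    os.environ["DB"] = D1Binding()      # os.environ only accepts str values
except TypeError:
    print("non-string bindings cannot go in os.environ")
```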

@StudyingLover
Author

StudyingLover commented Apr 16, 2024 via email

@Qxxxx

Qxxxx commented Apr 16, 2024

> • Is it possible to get the env variable at file scope, instead of only inside the fetch function?
> • As this is a serverless function, does it still make sense to create variables/objects outside of the fetch function?

How about the second question? If I have an object that will be used across different request handlers, should I initialize it outside the handlers or in every request handler? Since this is a serverless function, I don't know what the best practice is here.

@hoodmane

I think initializing at top level is fine for most purposes.
