
[Feature Request]: Integrate with Vercel AI SDK #70


Open
toan5ks1 opened this issue Feb 14, 2025 · 0 comments
Labels
enhancement New feature or request

Comments

@toan5ks1

Problem Description

Would you consider integrating mlc-ai/web-llm with the Vercel AI SDK? It would provide:

Easier API abstraction – Developers could interact with web-llm through Vercel's familiar AI tooling instead of wiring up the engine by hand.
Automatic streaming & caching – The SDK's built-in streaming helpers would reduce boilerplate for token-by-token output while keeping in-browser inference efficient.
Better compatibility with Next.js – Many AI developers build on Next.js & Vercel, so this integration would lower the barrier to entry.
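To make the idea concrete, here is a minimal sketch of the glue layer I have in mind. The pure helper below collects text deltas from OpenAI-style stream chunks (the shape web-llm already emits); the commented-out usage shows how it would sit on top of web-llm's `CreateMLCEngine` streaming chat API in the browser. The adapter boundary and function names here are my own illustration, not an existing API of either project.

```typescript
// Shape of a streamed chat-completion chunk (OpenAI-compatible, as
// emitted by web-llm's engine.chat.completions.create with stream: true).
type StreamChunk = { choices: { delta: { content?: string } }[] };

// Pure helper: concatenate the text deltas of a chunk sequence.
// This is the kind of small adapter an AI SDK integration would need.
export function collectDeltas(chunks: StreamChunk[]): string {
  return chunks.map((c) => c.choices[0]?.delta?.content ?? "").join("");
}

// Browser-side usage sketch (requires WebGPU, so not runnable outside
// the browser; model id is one of web-llm's prebuilt examples):
//
// import { CreateMLCEngine } from "@mlc-ai/web-llm";
// const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC");
// const stream = await engine.chat.completions.create({
//   messages: [{ role: "user", content: "Hello" }],
//   stream: true,
// });
// for await (const chunk of stream) {
//   render(chunk.choices[0]?.delta?.content ?? "");
// }
```

Because web-llm already speaks the OpenAI chat-completions wire format, an official Vercel AI SDK provider would mostly be a thin wrapper like this rather than a new protocol.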

Why This Could Help the Community

It could streamline browser-based AI deployments and encourage more developers to adopt WebGPU-powered LLMs.
Vercel AI SDK is already popular among AI developers, so native support for mlc-ai/web-llm would make integration much smoother.
A collaboration between MLC AI & Vercel AI SDK could push WebGPU adoption forward in the AI space.

Would love to hear your thoughts! Thanks again for all the amazing work you do. 😊

Solution Description

I'm sharing my project here as an example of what's possible:
🔗 Deepseek Local on Web GPU - Vercel AI SDK
🔗 Demo

Alternatives Considered

No response

Additional Context

No response

@toan5ks1 toan5ks1 added the enhancement New feature or request label Feb 14, 2025