A flexible Laravel application that integrates with Large Language Model (LLM) providers such as OpenAI, Anthropic, and DeepSeek for content generation. It provides a modern interface for generating different types of content with a range of AI models.
- 🤖 Support for multiple LLM providers (OpenAI, Anthropic, DeepSeek)
- 📝 Generate different types of content (blog posts, social media, emails)
- 🎨 Modern and responsive user interface
- ⚡ Real-time parameter adjustments
- 🔄 Extensible architecture for adding new providers
- 📊 Content history and management
- ⚙️ Configurable generation parameters
- PHP 8.2 or higher
- Laravel 11.x
- Composer
- MySQL/PostgreSQL
- API keys for LLM providers
- Clone the repository:

```shell
git clone https://github.com/iritmaulana/content-assistant.git
cd content-assistant
```
- Install dependencies:

```shell
composer install
```
- Copy the environment file and configure it:

```shell
cp .env.example .env
```
- Set up your database and LLM API keys in `.env`:

```env
DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3306
DB_DATABASE=content_assistant
DB_USERNAME=root
DB_PASSWORD=

OPENAI_API_KEY=your_openai_key
ANTHROPIC_API_KEY=your_anthropic_key
DEEPSEEK_API_KEY=your_deepseek_key
```
- Generate the application key:

```shell
php artisan key:generate
```
- Run migrations:

```shell
php artisan migrate
```
- Start the development server:

```shell
php artisan serve
```
```
app/
├── Interfaces/
│   └── LLMProviderInterface.php
├── Services/
│   ├── LLM/
│   │   ├── DeepseekProvider.php
│   │   ├── OpenAIProvider.php
│   │   └── AnthropicProvider.php
│   └── LLMFactory.php
├── Http/
│   └── Controllers/
│       └── ContentController.php
└── Models/
    └── Content.php
```
- Create a new provider class in `app/Services/LLM/`:

```php
namespace App\Services\LLM;

use App\Interfaces\LLMProviderInterface;

class NewProvider implements LLMProviderInterface
{
    public function generateContent(string $prompt, array $options = []): array
    {
        // Call the provider's API here and return the standard
        // response format (success, content, provider, error).
    }
}
```
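Filling in the template, a minimal provider could look like the sketch below. It always returns the standard response format, even on failure. The injectable `$transport` closure, the default option values, and the error handling are illustrative assumptions, not part of any real provider SDK; a real implementation would POST to the provider's API with the key in a header.

```php
<?php
// Illustrative sketch: transport closure, defaults, and error handling
// are assumptions, not a real provider SDK.

interface LLMProviderInterface
{
    public function generateContent(string $prompt, array $options = []): array;
}

final class NewProvider implements LLMProviderInterface
{
    public function __construct(
        private string $apiKey,
        private ?Closure $transport = null, // injectable so it can be faked in tests
    ) {
    }

    public function generateContent(string $prompt, array $options = []): array
    {
        try {
            if ($this->transport === null) {
                throw new RuntimeException('No HTTP transport configured');
            }

            // Send the request through the transport; a real provider would
            // POST this payload to its API endpoint.
            $content = ($this->transport)([
                'prompt'      => $prompt,
                'max_tokens'  => $options['max_tokens'] ?? 1024,
                'temperature' => $options['temperature'] ?? 0.7,
            ]);

            return [
                'success'  => true,
                'content'  => $content,
                'provider' => 'new_provider',
                'error'    => null,
            ];
        } catch (Throwable $e) {
            // Failures are reported through the shared response format
            // instead of bubbling up as exceptions.
            return [
                'success'  => false,
                'content'  => '',
                'provider' => 'new_provider',
                'error'    => $e->getMessage(),
            ];
        }
    }
}
```

A fake transport makes the provider testable without network access, e.g. `new NewProvider('key', fn ($payload) => 'stub reply')`.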
- Add the provider configuration in `config/llm.php`:

```php
'new_provider' => [
    'api_key' => env('NEW_PROVIDER_API_KEY'),
    'base_url' => env('NEW_PROVIDER_BASE_URL'),
],
```
- Register the provider in `LLMFactory.php`:

```php
public static function make(string $provider): LLMProviderInterface
{
    return match ($provider) {
        'new_provider' => new NewProvider(),
        // ... other providers
        default => throw new InvalidArgumentException("Unsupported provider: {$provider}"),
    };
}
```
All LLM providers must return responses in this format:
```php
[
    'success' => bool,
    'content' => string,
    'provider' => string,
    'error' => string|null,
]
```
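Because every provider returns this same shape, callers can validate responses in one place. A small helper might look like the following (the function name and strictness are our own, not part of the project):

```php
<?php
// Helper name and strictness are illustrative; adjust to taste.

function isValidLLMResponse(array $response): bool
{
    return array_key_exists('success', $response)
        && is_bool($response['success'])
        && array_key_exists('content', $response)
        && is_string($response['content'])
        && array_key_exists('provider', $response)
        && is_string($response['provider'])
        && array_key_exists('error', $response)
        && ($response['error'] === null || is_string($response['error']));
}
```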
```php
class ContentController extends Controller
{
    public function store(Request $request)
    {
        $llm = LLMFactory::make($request->provider);

        $result = $llm->generateContent($request->prompt, [
            'max_tokens' => $request->max_tokens,
            'temperature' => $request->temperature,
        ]);

        // Handle the result, e.g. persist the content on success
        // or report $result['error'] back to the user.
    }
}
```
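The controller also needs a route. In `routes/web.php` it might be registered along these lines (the URI and route name are assumptions, not taken from the project):

```php
use App\Http\Controllers\ContentController;
use Illuminate\Support\Facades\Route;

// URI and route name are assumptions; adjust to the app's routing scheme.
Route::post('/content', [ContentController::class, 'store'])->name('content.store');
```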
```php
$provider = LLMFactory::make('openai');
$result = $provider->generateContent(
    "Write a blog post about AI",
    ['max_tokens' => 1000]
);
```
- Blog Posts
- Social Media Posts
- Email Templates
Each type can be customized with:
- Title
- Prompt
- Generation parameters (temperature, max tokens)
- Provider selection
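Since these parameters come from user input, it is worth normalising them before they reach a provider. The sketch below clamps values to typical ranges (temperature 0–2, a 4096-token cap); the ranges, defaults, and helper name are common conventions we are assuming, not values mandated by this project:

```php
<?php
// Clamping ranges and defaults below are common conventions, not
// project requirements; the helper name is our own.

function normalizeGenerationOptions(array $options): array
{
    $temperature = (float) ($options['temperature'] ?? 0.7);
    $maxTokens   = (int) ($options['max_tokens'] ?? 1024);

    return [
        'temperature' => max(0.0, min(2.0, $temperature)),
        'max_tokens'  => max(1, min(4096, $maxTokens)),
    ];
}
```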
Key configuration options in `config/services.php`:

```php
return [
    'openai' => [
        'api_key' => env('OPENAI_API_KEY'),
    ],
    'anthropic' => [
        'api_key' => env('ANTHROPIC_API_KEY'),
    ],
    'deepseek' => [
        'api_key' => env('DEEPSEEK_API_KEY'),
        'base_url' => env('DEEPSEEK_BASE_URL'),
    ],
];
```
- Fork the repository
- Create a feature branch: `git checkout -b feature/new-feature`
- Commit your changes: `git commit -am 'Add new feature'`
- Push to the branch: `git push origin feature/new-feature`
- Submit a pull request
Please do not include any API keys in your code or pull requests. Always use environment variables for sensitive data.
- Laravel team for the amazing framework
- All LLM providers for their APIs
- Bootstrap team for the UI framework
For issues and feature requests, please use the GitHub issue tracker.
- Add support for more LLM providers
- Implement caching for API responses
- Add batch processing capability
- Create API documentation
- Add more content types
- Implement user feedback system