BLOOM-Based Marketing Email Generation

Overview

This project fine-tunes a BLOOM-based language model to generate marketing emails from product descriptions. Adapter-based fine-tuning and model compression keep training and inference efficient.

Features

  • Model Fine-tuning: Fine-tunes bigscience/bloom-1b7 for causal language modeling (see the training sketch below).
  • Dataset Preparation: Loads the synthetic AryanPrakhar/marketing_mail_data dataset from the Hugging Face Hub.
  • Training: Configures batch size, learning rate, and gradient accumulation.
  • Inference: Generates marketing emails from product descriptions (see the generation sketch below).
  • Model Deployment: Publishes the fine-tuned model and adapters on the Hugging Face Hub.
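
Training sketch

The sketch below shows how the fine-tuning steps above could fit together using transformers, datasets, and peft. The LoRA settings, dataset column name, and hyperparameter values are illustrative assumptions, not the project's exact configuration.

```python
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from peft import LoraConfig, get_peft_model

model_name = "bigscience/bloom-1b7"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Wrap the base model with LoRA adapters (assumed setup; the README mentions
# publishing adapters, but the exact LoRA configuration is not specified).
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["query_key_value"],  # BLOOM attention projection
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

# Load the synthetic marketing-email dataset from the Hub.
dataset = load_dataset("AryanPrakhar/marketing_mail_data")

def tokenize(batch):
    # Assumes the examples live in a "text" column; adjust to the actual schema.
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(
    tokenize, batched=True, remove_columns=dataset["train"].column_names
)

# Batch size, learning rate, and gradient accumulation as listed in Features;
# the concrete values here are placeholders.
args = TrainingArguments(
    output_dir="bloom-marketing-email",
    per_device_train_batch_size=4,
    gradient_accumulation_steps=4,
    learning_rate=2e-4,
    num_train_epochs=3,
    logging_steps=50,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()

# Save the LoRA adapters for later inference or upload.
model.save_pretrained("bloom-marketing-email")
```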
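
Generation and deployment sketch

A minimal follow-up for inference and publishing might look like the snippet below. The prompt format, adapter path, and repository id are placeholders; adapt them to the actual training data and Hub account.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("bigscience/bloom-1b7")
tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-1b7")

# Load the LoRA adapters saved by the training sketch (path is a placeholder).
model = PeftModel.from_pretrained(base, "bloom-marketing-email")

# Generate a marketing email for a product description
# (the prompt format is an assumption, not the dataset's actual template).
prompt = (
    "Product description: A reusable stainless-steel water bottle.\n"
    "Marketing email:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=200,
    do_sample=True,
    top_p=0.9,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

# Publish the adapters and tokenizer to the Hugging Face Hub
# (repository id below is a placeholder).
model.push_to_hub("your-username/bloom-marketing-email")
tokenizer.push_to_hub("your-username/bloom-marketing-email")
```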