
LLMs in Action: How Booking.com Scans, Detects, and Monitors Fake and Unsafe Content

Workshop description

Explore the power of LLMs in this hands-on workshop featuring real use cases from Booking.com. Learn how to scan and detect fake and unsafe content in reviews, and discover how AI is applied to keep reviews trustworthy and the platform secure.

The workshop will be divided into two parts. The first part will introduce the business use case and explain how LLMs come to the rescue in detecting and preventing fake content. In the second part, participants will get hands-on experience working with script-generated reviews, gaining an overview of relevant models from Hugging Face, and training a model specifically for this purpose. You’ll also learn practical applications using Hugging Face's powerful transformers library.
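The hands-on flow described above can be sketched in a few lines. This is a minimal, hypothetical example, not the workshop's actual code: the helper name, the stub classifier, and the `FAKE` label are placeholders, and in the real notebook a fine-tuned text-classification checkpoint from Hugging Face's `pipeline` would stand in for the stub.

```python
# Sketch: flag reviews using any Hugging Face-style text classifier.
# `flag_reviews` and the label/threshold values are illustrative only.

def flag_reviews(reviews, classifier, fake_label="FAKE", threshold=0.5):
    """Return True for each review the classifier labels as fake
    with confidence at or above *threshold*."""
    results = classifier(reviews)  # list of {"label": ..., "score": ...}
    return [r["label"] == fake_label and r["score"] >= threshold
            for r in results]

# With the real library this could be:
#   from transformers import pipeline
#   clf = pipeline("text-classification", model="<your-checkpoint>")
# A stub stands in here so the sketch runs without model downloads.
stub = lambda texts: [{"label": "FAKE", "score": 0.93} for _ in texts]
print(flag_reviews(["Best hotel ever!!! Five stars!!!"], stub))  # → [True]
```

Keeping the classifier a plain callable makes it easy to swap the stub for a real `transformers` pipeline once the notebook is set up.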

Whether you're interested in LLMs, online safety, or the mechanics of content moderation, this session offers an in-depth look at the cutting-edge tools that help maintain a safe digital environment.

Requirements

  • Google Colab with GPU.

Setting Up Notebook for Part II (Hands-On)

Download the files from Google Drive and upload them to your own Google Drive. This can take a while: the download is around 4 GB, since it contains a large dataset and some pre-trained model weights.

Then:

  1. Open the notebook in Google Colab.
  2. Enable the GPU runtime:
  • Go to Runtime > Change runtime type in the Colab menu
  • Set Hardware accelerator to GPU
  • Click Save
  3. Mount your Drive. This can be done with a cell:

```python
from google.colab import drive
drive.mount('/content/drive')
```

  4. Add the Drive files to the Colab runtime:

```python
import sys
sys.path.append('/content/drive/MyDrive/<PATH_TO_YOUR_FILES>')
```
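A mistyped Drive path fails later with a confusing ImportError, so it can help to check the folder before appending it. The helper below is my own suggestion, not part of the workshop materials:

```python
import os
import sys

def ensure_on_path(path):
    """Add *path* to sys.path only if the directory exists.
    Returns True when the path is usable, False otherwise."""
    if not os.path.isdir(path):
        return False
    if path not in sys.path:
        sys.path.append(path)
    return True

# In Colab, with Drive mounted, you would call this with your folder:
#   ensure_on_path('/content/drive/MyDrive/<PATH_TO_YOUR_FILES>')
print(ensure_on_path(os.getcwd()))  # → True (current dir always exists)
```

If the call returns False, re-check the folder name in your Drive before running the rest of the notebook.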

Credits

This workshop was set up by @pyladiesams and @flozefi.
