Merge pull request #111 from NillionNetwork/feat/ai-vm
AIVM Docs
crypblizz8 authored Oct 23, 2024
2 parents 8d75452 + 59011cf commit f10914d
Showing 5 changed files with 380 additions and 5 deletions.
188 changes: 188 additions & 0 deletions docs/aivm-reference.md
# API Reference

This document provides detailed information about all available functions and classes in the AIVM client library.

## Functions

### Model Management

#### `upload_model(model_path, model_name, model_type)`
Uploads a custom model to the AIVM server.

**Parameters:**
- `model_path` (str): Path to the model file
- `model_name` (str): Name to identify the model
- `model_type` (ModelType): Type of the model (must be one of the supported types)

**Example:**
```python
import aivm_client as aic

# Upload a custom model directly
aic.upload_model("path/to/model.pth", "MyCustomModel", aic.ModelType.LeNet5)
```

#### `upload_bert_tiny_model(model_path, model_name)`
Convenience function to upload a BertTiny model.

**Parameters:**
- `model_path` (str): Path to the model file
- `model_name` (str): Name to identify the model

**Example:**
```python
import aivm_client as aic

# Upload a custom BertTiny model
aic.upload_bert_tiny_model("path/to/bert_model.pth", "MyBertModel")
```

#### `upload_lenet5_model(model_path, model_name)`
Convenience function to upload a LeNet5 model.

**Parameters:**
- `model_path` (str): Path to the model file
- `model_name` (str): Name to identify the model

**Example:**
```python
import aivm_client as aic

# Upload a custom LeNet5 model
aic.upload_lenet5_model("path/to/lenet_model.pth", "MyLeNetModel")
```

### Inference

#### `get_prediction(inputs, model, model_type=None)`
Performs secure inference using encrypted inputs.

**Parameters:**
- `inputs` (ArithmeticSharedTensor): Encrypted input tensor
- `model` (str): Name of the model to use
- `model_type` (ModelType, optional): Type of the model if using raw cryptensor

**Returns:**
- Prediction result from the model

**Raises:**
- ValueError: If inputs are not of a supported cryptensor type and no model_type is provided

**Example:**
```python
import aivm_client as aic
import torch

# For LeNet5
input_tensor = torch.randn(1, 1, 28, 28)
encrypted_input = aic.LeNet5Cryptensor(input_tensor)
result = aic.get_prediction(encrypted_input, "LeNet5MNIST")

# For BertTiny
sentence = "Hello, this is a test message"
tokens = aic.tokenize(sentence)
encrypted_input = aic.BertTinyCryptensor(*tokens)
result = aic.get_prediction(encrypted_input, "BertTinySpam")
```

#### `get_supported_models()`
Retrieves a list of all models available for inference.

**Returns:**
- list: Names of supported models

**Example:**
```python
import aivm_client as aic

# Get list of available models
models = aic.get_supported_models()
print(f"Available models: {models}")
```
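Model names passed to `get_prediction` must match the server's list exactly, so a small client-side guard can fail fast before you encrypt any inputs. This is a sketch: `pick_model` is a hypothetical helper, not part of `aivm_client`, and the client calls are shown in comments because they need a running devnet.

```python
def pick_model(available, preferred):
    """Return the first name in `preferred` that the server reports, else None."""
    for name in preferred:
        if name in available:
            return name
    return None

# With a running devnet (assumes aivm_client is installed):
# import aivm_client as aic
# chosen = pick_model(aic.get_supported_models(), ["MyCustomLeNet", "LeNet5MNIST"])
# if chosen is None:
#     raise RuntimeError("None of the expected models are deployed")
print(pick_model(["LeNet5MNIST", "BertTinySpam"], ["MyCustomLeNet", "LeNet5MNIST"]))  # → LeNet5MNIST
```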

### Text Processing

#### `tokenize(sentence)`
Tokenizes input text for use with BertTiny models.

**Parameters:**
- `sentence` (str): Input text to tokenize

**Returns:**
- tuple: (input_ids, attention_mask) tensors

**Example:**
```python
import aivm_client as aic

# Tokenize text for BertTiny
sentence = "This is a sample text for sentiment analysis"
tokens = aic.tokenize(sentence)
encrypted_input = aic.BertTinyCryptensor(*tokens)
```
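Since `BertTinyCryptensor` (documented below) expects exactly two tensors of shape `(1, 128)`, a quick shape check after tokenization can catch mismatches before encryption. This is a sketch: `validate_token_shapes` is a hypothetical helper, not part of `aivm_client`, and it only assumes each element exposes a torch-style `.shape` attribute.

```python
def validate_token_shapes(tokens, expected=(1, 128)):
    """Check the (input_ids, attention_mask) pair before encrypting it."""
    if len(tokens) != 2:
        raise ValueError(f"Expected 2 tensors, got {len(tokens)}")
    for t in tokens:
        if tuple(t.shape) != expected:
            raise ValueError(f"Expected shape {expected}, got {tuple(t.shape)}")

# With aivm_client installed:
# tokens = aic.tokenize("This is a sample text")
# validate_token_shapes(tokens)
# encrypted_input = aic.BertTinyCryptensor(*tokens)
```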

## Classes

### `BertTinyCryptensor`
Specialized cryptensor for BertTiny model inputs.

**Inherits from:** `ArithmeticSharedTensor`

**Parameters:**
- `*inputs` (torch.Tensor): Two tensors (tokens and attention_mask), each of shape (1, 128)

**Example:**
```python
import aivm_client as aic

# Create encrypted input for BertTiny
sentence = "Sample text for classification"
tokens = aic.tokenize(sentence)
encrypted_input = aic.BertTinyCryptensor(*tokens)
```

### `LeNet5Cryptensor`
Specialized cryptensor for LeNet5 model inputs.

**Inherits from:** `ArithmeticSharedTensor`

**Parameters:**
- `inputs` (torch.Tensor): Input tensor for the LeNet5 model

**Example:**
```python
import aivm_client as aic
import torch

# Create encrypted input for LeNet5
image = torch.randn(28, 28) # Sample input image
encrypted_input = aic.LeNet5Cryptensor(image)
```

## Complete Usage Example

Here's a complete example showing how to use the AIVM client for both BertTiny and LeNet5 models:

```python
import aivm_client as aic
import torch

# List available models
models = aic.get_supported_models()
print(f"Available models: {models}")

# Example with LeNet5
image_input = torch.randn(28, 28)
encrypted_image = aic.LeNet5Cryptensor(image_input)
digit_prediction = aic.get_prediction(encrypted_image, "LeNet5MNIST")

# Example with BertTiny
text_input = "This is a sample message for spam detection"
tokens = aic.tokenize(text_input)
encrypted_text = aic.BertTinyCryptensor(*tokens)
spam_prediction = aic.get_prediction(encrypted_text, "BertTinySpam")

# Upload custom models
aic.upload_lenet5_model("path/to/custom_lenet.pth", "MyCustomLeNet")
aic.upload_bert_tiny_model("path/to/custom_bert.pth", "MyCustomBert")
```
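`get_prediction` returns the model's raw outputs, so a final post-processing step is usually needed to turn them into a label. The sketch below assumes the result decrypts to one logit per class (check your model's output format before relying on this); `interpret_logits` is a hypothetical helper, not part of `aivm_client`, and uses only the standard library.

```python
import math

def interpret_logits(logits, labels):
    """Map per-class logits to (label, softmax confidence).
    Assumption: one logit per class, in the same order as `labels`."""
    peak = max(logits)
    exps = [math.exp(x - peak) for x in logits]  # subtract max for numerical stability
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return labels[best], probs[best]

label, confidence = interpret_logits([0.3, 2.1], ["not spam", "spam"])
print(label)  # → spam
```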
170 changes: 170 additions & 0 deletions docs/aivm.md
# Nillion AIVM

:::warning
[Nillion AIVM](https://github.com/NillionNetwork/nillion-aivm) is currently in early development. While functional, it may contain bugs and is not recommended for deployment in production or critical systems.
:::

[Nillion AIVM](https://github.com/NillionNetwork/nillion-aivm) is a secure inference platform for Deep Learning models based on Multi-Party Computation (MPC). It enables private model inference and custom model deployment while maintaining data confidentiality throughout the computation process. This documentation covers installation, supported models, and usage instructions.

## Supported Models

AIVM currently supports the following pre-trained models for specific learning tasks. You can either utilize these existing models or deploy your own custom-trained versions:

### BertTiny
- SMS Spam Classification
- Binary classification for detecting spam messages
- Input: Text string
- Output: Binary classification (spam/not spam)
- Movie Rating Sentiment Analysis
- Sentiment analysis for movie reviews
- Input: Text string
- Output: Sentiment score (-1 to 1)

### LeNet5
- Handwritten Digit Recognition (MNIST)
- Classification of handwritten digits
- Input: 28x28 grayscale image
- Output: Digit classification (0-9)
- Cats vs Dogs Classification
- Binary image classification
- Input: 28x28 grayscale image
- Output: Binary classification (cat/dog)
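For quick reference, the tasks above can be captured in a small lookup table. This is illustrative only: the task keys are made up for this sketch and are not identifiers used by AIVM.

```python
# (architecture, input, output) per supported task — mirrors the lists above
SUPPORTED_TASKS = {
    "sms_spam":        ("BertTiny", "text", "binary (spam / not spam)"),
    "movie_sentiment": ("BertTiny", "text", "sentiment score (-1 to 1)"),
    "mnist_digits":    ("LeNet5", "28x28 grayscale image", "digit (0-9)"),
    "cats_vs_dogs":    ("LeNet5", "28x28 grayscale image", "binary (cat / dog)"),
}

def tasks_for(architecture):
    """List task keys served by a given architecture."""
    return [k for k, v in SUPPORTED_TASKS.items() if v[0] == architecture]

print(tasks_for("LeNet5"))  # → ['mnist_digits', 'cats_vs_dogs']
```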

## Installation

Installing Nillion AIVM is straightforward:

1. Create a virtual environment:

```bash
python3 -m venv .venv
```

2. Activate the virtual environment:

On Linux/macOS:

```bash
source .venv/bin/activate
```

On Windows:

```bash
.\.venv\Scripts\activate
```

3. Install the package:

```bash
pip install "nillion-aivm[examples]"
```

This command installs all necessary dependencies for performing secure inference on AIVM.

### Starting the Development Network

Launch the AIVM development network with:

```shell
aivm-devnet
```

This command starts a persistent process that manages the secure computation infrastructure. To stop the network, use `CTRL`+`C`.

:::info
**Note**: Ensure `aivm-devnet` is running before proceeding with the following examples.

If you get stuck at this step in VS Code, make sure the interpreter for the correct virtual environment is selected. If VS Code prompts you to install packages or reports that pip cannot be found, reinstall the kernel with `pip install ipykernel -U --force-reinstall`.
:::

## Performing Secure Inference

### Basic Usage

1. First, import the AIVM client and check available models:

```python
import aivm_client as aic

# List all supported models
available_models = aic.get_supported_models()
print(available_models)
```

2. Prepare your input data. Here's an example using PyTorch to generate a random input:

```python
import torch

# Create a sample input (e.g., for LeNet5 MNIST)
random_input = torch.randn((1, 1, 28, 28)) # Batch size 1, 1 channel, 28x28 pixels
```

3. Encrypt your input using the appropriate Cryptensor:

```python
# Encrypt the input
encrypted_input = aic.LeNet5Cryptensor(random_input)
```

4. Perform secure inference:

```python
# Get prediction while maintaining privacy
result = aic.get_prediction(encrypted_input, "LeNet5MNIST")
```

The `get_prediction` function automatically handles the secure computation protocol with the `aivm-devnet` nodes, ensuring that your input data remains private throughout the inference process.

## Custom Model Deployment

You can deploy your own trained models to AIVM, provided they follow the supported architectures (BertTiny or LeNet5).

### Uploading Custom Models

1. Import the AIVM client:

```python
import aivm_client as aic
```

2. Upload your custom model:

```python
# model_path points to your trained weights (.pth or .pt)

# For BertTiny models
aic.upload_bert_tiny_model(model_path, "MyCustomBertTiny")

# For LeNet5 models
aic.upload_lenet5_model(model_path, "MyCustomLeNet5")
```

3. Perform inference with your custom model:

```python
# Encrypted inputs are created exactly as for the built-in models
# (see "Performing Secure Inference" above)

# For BertTiny models
result = aic.get_prediction(private_berttiny_input, "MyCustomBertTiny")

# For LeNet5 models
result = aic.get_prediction(private_lenet5_input, "MyCustomLeNet5")
```

### Model Requirements

Custom models must meet these requirements:
- Follow the exact architecture of either BertTiny or LeNet5
- Be trained using PyTorch
- Use the same input dimensions as the original architectures
- Be saved in PyTorch's standard model format (.pth or .pt)
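A lightweight pre-upload check can catch the file-format requirement early. This is a sketch: `check_model_file` is a hypothetical helper that validates the extension only — architecture compatibility is checked by the server at upload time — and the `torch.save` step is shown in a comment because it requires a trained model.

```python
from pathlib import Path

ALLOWED_SUFFIXES = {".pth", ".pt"}

def check_model_file(path):
    """Reject files that are not in PyTorch's standard saved-model format
    (by extension only; contents are validated server-side on upload)."""
    p = Path(path)
    if p.suffix not in ALLOWED_SUFFIXES:
        raise ValueError(f"Expected a .pth or .pt file, got {p.suffix!r}")
    return p

# Typical save step after training (requires torch and a trained model):
# torch.save(model.state_dict(), "custom_lenet.pth")
check_model_file("custom_lenet.pth")  # passes
```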

:::info
**Note**: Custom model names must be unique within your AIVM instance.
:::

### Next Steps
Now you can check out the [examples](https://github.com/NillionNetwork/nillion-aivm/tree/main/examples) folder and get started with your own fine-tuned and custom models.

You can try:
- LeNet5 for digit classification
- BertTiny for spam detection
- BertTiny for tweet sentiment
9 changes: 8 additions & 1 deletion docs/nada-by-example/nada-ai.md
Nada-AI is a Python library designed for AI/ML on top of Nada DSL and the Nillion Network.

To learn more about the library, check out the full [Nada-AI docs here](/nada-ai-introduction).

:::warning
If you are looking for Nillion AIVM, a secure inference platform for deep learning models, see the [AIVM docs](../aivm.md).

Nada-AI, by contrast, is for building on top of Nada DSL and the Nillion Network.
:::

:::info

## Nada AI Limitations

The choice for blind computing implies certain trade-offs in comparison to conventional computing. What you gain in privacy, you pay in extra computational overhead & capacity constraints.
That said, the Nillion team is working around the clock to push the boundaries o

## Nada AI Examples with Google Colab Links

<DocCardList/>