Commit

More notebook fixes
josevalim committed May 28, 2024
1 parent 422cfea commit 92cb3d2
Showing 4 changed files with 7 additions and 7 deletions.
2 changes: 1 addition & 1 deletion notebooks/k_nearest_neighbors.livemd
@@ -47,7 +47,7 @@ key = Nx.Random.key(42)
>
```

- ## Introduction
+ ## Use cases

This notebook will cover the three primary applications of k-nearest neighbors: classification, regression, and anomaly detection. Let's start with a practical example. Imagine you've just moved to a new city and you're here for the first time. Since you're an active person, you'd like to find a nearby gym with good facilities. What would you do? You'd probably start by searching for gyms on online maps. The search results might look something like this:
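The nearest-neighbor idea behind that search can be sketched directly in Nx (a minimal illustration with made-up 2-D points standing in for gym locations; the data and the value of `k` are hypothetical):

```elixir
# Hypothetical 2-D points (e.g. gym locations) and a query point (you).
points = Nx.tensor([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0], [1.2, 0.8]])
query = Nx.tensor([1.0, 1.0])
k = 2

# Euclidean distance from the query to every point.
distances =
  points
  |> Nx.subtract(query)
  |> Nx.pow(2)
  |> Nx.sum(axes: [1])
  |> Nx.sqrt()

# Indices of the k nearest points, closest first.
distances |> Nx.argsort() |> Nx.slice_along_axis(0, k, axis: 0)
```

Classification, regression, and anomaly detection all build on this same neighbor lookup, differing only in what they do with the neighbors once found.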

6 changes: 3 additions & 3 deletions notebooks/linear_regression.livemd
@@ -16,7 +16,7 @@ Mix.install([
])
```

- ## Introduction
+ ## Setup

In this livebook, we will cover the typical use cases of linear regression through practical examples.

@@ -50,7 +50,7 @@ key = Nx.Random.key(42)

<!-- livebook:{"branch_parent_index":0} -->

- ## Linear Regression on Synthetic Data
+ ## Linear regression on synthetic data

Before we dive into real-life use cases of linear regression, we start with a simpler one. We will generate data with a linear pattern and then use `Scholar.Linear.LinearRegression` to compute the regression.
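A minimal sketch of that workflow, assuming the current `Nx.Random` and `Scholar.Linear.LinearRegression` APIs (the slope, intercept, and noise level here are made up for illustration):

```elixir
key = Nx.Random.key(42)

# Synthetic data with a linear pattern: y = 3x + 4 plus Gaussian noise.
{x, key} = Nx.Random.uniform(key, 0.0, 10.0, shape: {100, 1})
{noise, _key} = Nx.Random.normal(key, 0.0, 1.0, shape: {100, 1})
y = x |> Nx.multiply(3.0) |> Nx.add(4.0) |> Nx.add(noise)

# Fit the model, then predict at a new point.
model = Scholar.Linear.LinearRegression.fit(x, y)
Scholar.Linear.LinearRegression.predict(model, Nx.tensor([[5.0]]))
```

With enough samples, the fitted coefficients should land close to the generating slope and intercept.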

@@ -225,7 +225,7 @@ x_b |> Nx.LinAlg.pinv() |> Nx.dot(y)

<!-- livebook:{"branch_parent_index":0} -->

- ## Polynomial Regression on Synthetic Data
+ ## Polynomial regression on synthetic data

Before moving on to a more complex example, this section will briefly show how to use another regression method. While not strictly linear, the approach and the calculations behind it are similar enough that it makes sense to explain it alongside linear regression.
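One way to see why the calculation is so similar: polynomial regression can be reduced to linear regression by expanding the feature matrix, e.g. adding an x² column and fitting the usual linear model. A sketch under that assumption (the quadratic target below is made up for illustration):

```elixir
key = Nx.Random.key(42)
{x, _key} = Nx.Random.uniform(key, -3.0, 3.0, shape: {100, 1})

# Quadratic target: y = 0.5x^2 + x + 2 (noise-free to keep the sketch short).
y = x |> Nx.pow(2) |> Nx.multiply(0.5) |> Nx.add(x) |> Nx.add(2.0)

# Expand features with an x^2 column, then fit a plain linear regression:
# the model is still linear in its coefficients.
x_poly = Nx.concatenate([x, Nx.pow(x, 2)], axis: 1)
Scholar.Linear.LinearRegression.fit(x_poly, y)
```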

4 changes: 2 additions & 2 deletions notebooks/mds.livemd
@@ -30,7 +30,7 @@ Nx.global_default_options(compiler: EXLA)

<!-- livebook:{"branch_parent_index":0} -->

- ## Swiss Roll
+ ## Swiss roll

Multidimensional scaling (MDS) seeks a low-dimensional representation of the data in which the distances closely reflect the distances in the original high-dimensional space.

@@ -1081,7 +1081,7 @@ As we see, MDS collapsed one dimension and what we see is similar to a cross sec

<!-- livebook:{"branch_parent_index":0} -->

- ## Digits Dataset
+ ## Digits dataset

In the next section we switch to the Digits dataset, which consists of almost 1800 8x8 images of digits.

2 changes: 1 addition & 1 deletion notebooks/nearest_neighbors.livemd
@@ -121,7 +121,7 @@ x = Scholar.Preprocessing.StandardScaler.fit_transform(x)
{x, y}
```

- Once the data is preprocessed, we can move on to the KDTree model. First, we need to set up the model using `Scholar.Neighbors.KDTree.fit`. This method initializes the tree structure for further processing.
+ Once the data is preprocessed, we can move on to the KDTree model. First, we need to set up the model using `Scholar.Neighbors.KDTree.fit/2`. This method initializes the tree structure for further processing.

```elixir
num_neighbors = 6
