
Commit

update Lux Linear Regression
NeroBlackstone committed Oct 28, 2024
1 parent 7168262 commit 7bb4a63
Showing 1 changed file with 2 additions and 2 deletions.
@@ -147,7 +147,7 @@
"\n",
"For standard operations, we can use a framework’s predefined layers, which allow us to focus on the layers used to construct the model rather than worrying about their implementation. Recall the architecture of a single-layer network as described in Fig. 3.1.2. The layer is called fully connected, since each of its inputs is connected to each of its outputs by means of a matrix-vector multiplication.\n",
"\n",
-"In Flux, a `Dense(2 => 1)` layer denotes a layer of one neuron with two inputs (two features) and one output. "
+"In Lux, a `Dense(2 => 1)` layer denotes a layer of one neuron with two inputs (two features) and one output. "
]
},
{
@@ -187,7 +187,7 @@
"id": "c717e1ec-be46-4e88-b238-5a3b3dbc745a",
"metadata": {},
"source": [
-"The `Flux.mse` function computes the mean squared error. By default, `Flux.mse` returns the average loss over examples. It is faster (and easier to use) than implementing our own."
+"The `MSELoss()` function computes the mean squared error. By default, `MSELoss()` returns the average loss over examples. It is faster (and easier to use) than implementing our own."
]
},
{
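The renamed APIs in this diff can be exercised end to end. The sketch below is a minimal, illustrative example of the Lux usage the updated cells describe (assuming Lux v1's explicit-parameter API and its exported `MSELoss`; the input and target arrays are made-up placeholder data):

```julia
using Lux, Random

# A layer of one neuron: two inputs (two features), one output.
model = Dense(2 => 1)

# Unlike Flux, Lux keeps parameters and state outside the layer object.
rng = Random.default_rng()
ps, st = Lux.setup(rng, model)

# Five made-up examples, features along the first dimension.
x = rand(Float32, 2, 5)
ŷ, st = model(x, ps, st)   # forward pass returns (output, new state)

# MSELoss() averages the squared error over examples by default.
y = rand(Float32, 1, 5)
loss = MSELoss()
loss(ŷ, y)
```

This separation of the layer definition from its parameters (`ps`) and state (`st`) is the main API difference from Flux that the renamed text reflects.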
