Baremetal-NN
Baremetal-NN API documentation
void nn_silu1d_f32(Tensor1D_F32 *y, const Tensor1D_F32 *x)
Applies the SiLU (Sigmoid Linear Unit) activation function to a 1D floating-point tensor.
y[i] = x[i] * sigmoid(x[i])
Parameters:
| y | The result tensor. |
| x | The input tensor. |