Implement Softmax Function

Difficulty: easy

Given a 2D tensor of raw scores (logits) where each row represents a distinct data point and each column represents a class, implement the softmax function to transform the raw scores into probabilities. Ensure that the probabilities for each data point sum to 1. Return the resulting tensor of probabilities.

The softmax function is often used in the output layer of a neural network to represent the probability distribution over the possible classes. Check the hint for specific steps.

$$\text{Softmax}(\mathbf{z})_i = \frac{\exp(z_i)}{\sum_{j=1}^{K} \exp(z_j)}$$
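The formula above can be sketched in NumPy as follows. This is a minimal reference implementation, not the site's official solution; the helper name `softmax` and the max-subtraction trick (a standard numerical-stability step, since softmax is invariant to shifting each row by a constant) are assumptions beyond the problem statement.

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    """Row-wise softmax for a 2D array of logits."""
    # Subtract each row's max before exponentiating to avoid overflow;
    # this does not change the result because softmax is shift-invariant.
    shifted = logits - logits.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    # Normalize each row so its entries sum to 1.
    return exp / exp.sum(axis=1, keepdims=True)

probs = softmax(np.array([[1.0, 2.0, 3.0],
                          [2.0, 4.0, 6.0]]))
print(np.round(probs, 4))
# → [[0.09   0.2447 0.6652]
#    [0.0159 0.1173 0.8668]]
```

Using `keepdims=True` keeps the row-wise max and sum as column vectors so they broadcast correctly against the 2D input.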

Examples:

Input:
[[1.0, 2.0, 3.0],
 [2.0, 4.0, 6.0]]

Output:
[[0.0900, 0.2447, 0.6652],
 [0.0159, 0.1173, 0.8668]]

2024 © TensorGym