Log-softmax function: log_softmax(x) = log(softmax(x)) = x - log(sum(exp(x)))
Numerically more stable than computing softmax(x) first and then taking its logarithm. Commonly used in cross-entropy loss computation.
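As a runtime illustration of why the fused form is more stable (this helper is a hypothetical sketch, not part of this type-level API): subtracting the row maximum before exponentiating prevents exp() from overflowing, and the result is unchanged because log-softmax is shift-invariant.

// Minimal runtime sketch of a numerically stable log-softmax.
// logSoftmax is a hypothetical helper, not part of this library's API.
function logSoftmax(x: number[]): number[] {
  // x - log(sum(exp(x))) is invariant under shifting x by a constant,
  // so subtract max(x) to keep exp() in a safe range.
  const m = Math.max(...x);
  const logSumExp = m + Math.log(x.reduce((s, v) => s + Math.exp(v - m), 0));
  return x.map(v => v - logSumExp);
}

// Large logits that would overflow a naive softmax:
console.log(logSoftmax([1000, 1001, 1002])); // ≈ [-2.408, -1.408, -0.408]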
Input tensor storage type
The axis along which to apply log-softmax (supports negative indexing)
type Logits = TensorStorage<Float32, [32, 1000], [1000, 1], DefaultLayoutFlags>;
type LogProbs = LogSoftmax<Logits, -1>; // Log probabilities over classes
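As a sketch of negative axis indexing on a higher-rank tensor (the shape, strides, and type names below are assumptions following the TensorStorage pattern above; the positive-index form assumes non-negative axes are also accepted):

type SeqLogits = TensorStorage<Float32, [8, 128, 50257], [6432896, 50257, 1], DefaultLayoutFlags>;
type SeqLogProbsNeg = LogSoftmax<SeqLogits, -1>; // last axis (vocabulary)
type SeqLogProbsPos = LogSoftmax<SeqLogits, 2>;  // same axis via a positive index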