Standard softmax function: softmax(x) = exp(x) / sum(exp(x))
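Equivalently, in element-wise notation for the i-th entry along the softmax axis:

\[
\operatorname{softmax}(x)_i = \frac{e^{x_i}}{\sum_j e^{x_j}}
\]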
Type parameters:
- Input tensor storage type
- The axis along which to apply softmax (supports negative indexing)
type Input = TensorStorage<Float32, [32, 10], [10, 1], DefaultLayoutFlags>;
type Result = Softmax<Input, -1>; // Softmax over classes (last dimension)
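For intuition, here is a minimal value-level sketch of what the type describes. It is plain TypeScript, not part of the type-level API (which performs no runtime computation), and the helper name softmaxLastAxis is ours, for illustration only:

// Runtime softmax over the last axis of a 2-D array, with the usual
// max subtraction for numerical stability.
function softmaxLastAxis(rows: number[][]): number[][] {
  return rows.map((row) => {
    const max = Math.max(...row);
    const exps = row.map((v) => Math.exp(v - max));
    const sum = exps.reduce((a, b) => a + b, 0);
    return exps.map((v) => v / sum);
  });
}

// Each output row sums to 1, matching softmax over the class dimension.
console.log(softmaxLastAxis([[1.0, 2.0, 3.0]]));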
type AttentionScores = TensorStorage<Float32, [32, 8, 128, 128], [131072, 16384, 128, 1], DefaultLayoutFlags>;
type AttentionWeights = Softmax<AttentionScores, -1>; // Softmax over key sequence
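For context, this is the normalization step of scaled dot-product attention, which turns raw scores into weights that sum to 1 over the key positions:

\[
\operatorname{Attention}(Q, K, V) = \operatorname{softmax}\!\left(\frac{QK^\top}{\sqrt{d_k}}\right) V
\]

Here the [32, 8, 128, 128] shape is presumably batch × heads × query positions × key positions, so axis -1 normalizes each query's weights over the keys.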