Question
A 2013 paper by Ian Goodfellow et al. introduced a technique named analogously to dropout in which layers that perform this operation are added to a neural network. The LogSumExp, or LSE, function essentially performs a smooth version of this operation. This is the most common operation used for pooling in convolutional neural networks. The rectifier function, which is by far the most popular activation function in neural networks today, performs this operation on (*) 0 and the input. The L infinity norm of a vector is equal to this operation applied to the absolute values of its components. Generalizing the logistic function to multiple dimensions produces a function called “soft [this operation]”. For 10 points, name the larger of the two values that are subtracted to find the range of a dataset. ■END■
ANSWER: maximum [accept descriptions like “the biggest value”; accept softmax or argmax; accept supremum]
<AW>
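Each clue above reduces to the maximum operation. The following is a minimal NumPy sketch (the code and variable names are illustrative assumptions, not part of the question) showing the clues in order: LogSumExp as a smooth max, max pooling's underlying operation, the rectifier as max(0, x), the L infinity norm as the max of absolute values, and the range as max minus min.

```python
import numpy as np

def logsumexp(x):
    # Numerically stable LogSumExp: a smooth approximation of max(x).
    m = np.max(x)
    return m + np.log(np.sum(np.exp(x - m)))

x = np.array([-5.0, 1.0, 3.0])

print(np.max(x))              # 3.0 -- the answer: the maximum
print(logsumexp(x))           # ~3.13 -- smoothly approximates the max
print(np.maximum(0.0, x))     # ReLU: max(0, x) elementwise -> [0., 1., 3.]
print(np.max(np.abs(x)))      # L-infinity norm: max of absolute values -> 5.0
print(np.max(x) - np.min(x))  # range of the "dataset": max minus min -> 8.0
```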
Conv. % | Power % | Average Buzz
---|---|---
100% | 25% | 86.50

Average Buzz = average correct buzz position