Question

A 2013 paper by Ian Goodfellow et al. introduced a technique named analogously to dropout in which layers that perform this operation are added to a neural network. The LogSumExp, or LSE, function essentially performs a smooth version of this operation. This is the most common operation used for pooling in convolutional (15[1]) neural networks. The rectifier function, which is by far the most popular activation function in neural networks today, performs this operation on (*) 0 and the input. (-5[1]) The L infinity norm (10[1]) of a vector is equal (10[1]) to this operation applied to the absolute values of its components. Generalizing the logistic function to multiple dimensions produces a function called “soft [this operation]”. For 10 points, name the larger of the two values that are subtracted to find the range of a dataset. ■END■ (10[1])

ANSWER: maximum [accept descriptions like “the biggest value”; accept softmax or argmax; accept supremum]
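
Every clue in this tossup describes the max function. As a quick illustration (not part of the original packet), here is a minimal NumPy sketch checking each clue numerically on an example vector; the variable names are illustrative only:

```python
import numpy as np

x = np.array([-2.0, 1.0, 3.0])

# A maxout unit (Goodfellow et al., 2013) outputs the max over a group of pre-activations.
maxout = np.max(x)                        # 3.0

# LogSumExp is a smooth approximation to max: LSE(x) >= max(x).
lse = np.log(np.sum(np.exp(x)))           # ~3.13, close to max(x) = 3

# The rectifier (ReLU) is max(0, x), applied elementwise.
relu = np.maximum(0.0, x)                 # [0., 1., 3.]

# The L-infinity norm is the max of the absolute values of the components.
linf = np.max(np.abs(x))                  # 3.0

# Softmax generalizes the logistic function to multiple dimensions.
softmax = np.exp(x) / np.sum(np.exp(x))   # sums to 1

# The range of a dataset is its maximum minus its minimum.
rng = np.max(x) - np.min(x)               # 3 - (-2) = 5.0
```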
<AW>
Conv. % | Power % | Average Buzz
100%    | 25%     | 86.50


Buzzes

Player | Team | Opponent | Buzz Position | Value
Seth Ebner | Dianetics for Diabetics | Edwardian Manifestation of All Colonial Sins | 51 | 15
Dan Ni | playing emacs while my parents are arguing | a neural-net processor; a thinking machine | 77 | -5
Earthflax Geologybuzzer | Macro Editors | Eight Megabytes And Constantly Swapping | 81 | 10
Henry Cafaro | screaming into the public static void main(String[] args) | We Bought a Complexity Zoo Story | 86 | 10
Kevin Wang | a neural-net processor; a thinking machine | playing emacs while my parents are arguing | 128 | 10
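
The summary statistics above follow directly from this buzz table. As a minimal sketch (not from the original page), the following recomputes them, assuming four rooms heard the question, since Dan Ni's neg and Kevin Wang's get occurred in the same game:

```python
# (player, buzz position, value); taken from the buzz table above.
buzzes = [
    ("Seth Ebner", 51, 15),
    ("Dan Ni", 77, -5),
    ("Earthflax Geologybuzzer", 81, 10),
    ("Henry Cafaro", 86, 10),
    ("Kevin Wang", 128, 10),
]
rooms = 4  # assumption: Dan Ni and Kevin Wang played in the same room

correct = [(pos, val) for _, pos, val in buzzes if val > 0]
conv_pct = 100 * len(correct) // rooms                             # 100
power_pct = 100 * sum(1 for _, v in correct if v == 15) // rooms   # 25
avg_buzz = sum(pos for pos, _ in correct) / len(correct)           # 86.5
```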