Question

A 2013 paper by Ian Goodfellow et al. introduced a technique named analogously to dropout in which layers that perform this operation are added to a neural network. The LogSumExp, or LSE, function essentially performs a smooth version of this operation. This is the most common operation used for pooling in convolutional neural networks. The rectifier function, which is by far the most popular activation function in neural networks today, performs this operation on the two arguments of (*) 0 and the input. The L infinity norm of a vector is equal to this operation applied to the absolute values of its components. Generalizing the logistic function to multiple dimensions produces a function called “soft [this operation]”. For 10 points, name the larger of the two values that are subtracted to find the range of a dataset. ■END■

ANSWER: maximum [accept descriptions like “the biggest value”; accept softmax or argmax; accept supremum]
<AW>
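
For readers unfamiliar with the clues, here is a minimal NumPy sketch (illustrative values only; not part of the original question) showing how each clue reduces to the maximum operation:

```python
import numpy as np

x = np.array([1.0, 3.0, -2.0])

# A maxout unit (Goodfellow et al. 2013) takes the max over a group of
# linear outputs, analogously to how dropout layers are inserted
z = np.array([0.2, -0.4, 0.9])    # outputs of k linear units for one maxout unit
maxout = np.max(z)                # 0.9

# LogSumExp is a smooth approximation of the maximum:
# max(x) <= logsumexp(x) <= max(x) + log(n)
lse = np.log(np.sum(np.exp(x)))   # ~3.13, close to max(x) = 3.0

# The rectifier (ReLU) is the maximum of 0 and the input, elementwise
relu = np.maximum(0.0, x)         # [1., 3., 0.]

# The L-infinity norm is the maximum absolute value of the components
linf = np.max(np.abs(x))          # 3.0

# Softmax ("soft argmax") generalizes the logistic function to n dimensions
softmax = np.exp(x) / np.sum(np.exp(x))   # puts most weight on the max entry

# Max pooling: the maximum over each window of a feature map
feature = np.array([1.0, 5.0, 2.0, 4.0])
pooled = feature.reshape(2, 2).max(axis=1)  # [5., 4.]

# The range of a dataset is its maximum minus its minimum
data = np.array([2.0, 9.0, 4.0])
rng = data.max() - data.min()     # 7.0
```
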
Conv. % | Power % | Average Buzz
100% | 60% | 67.00


Buzzes

Player | Team | Opponent | Buzz Position | Value
Liam Kusalik | I Paused My Unique Game to Be Here | Carnegie Lemons | 47 | 15
Andrew Hunter | A TV Guide for Netheads | Computer Science: Going Outside | 49 | 15
Michał Gerasimiuk | Why does ACF have electrons do its work? | I thought this was a Counter-Strike themed tournament | 53 | 15
Eric Chen | Eventually Munches All Computer Storage | foo | 92 | 10
David Bass | JAX guide -league -of -legends -lol -mortal -kombat | The | 94 | 10