Question

Raymond Mooney is known for an expletive-laden quip arguing against using these objects. A log-bilinear model named for these objects attempts to learn the logarithm of the ratio of co-occurrence probabilities. A team at Stanford led by Jeffrey Pennington created a model named for “global” examples of these objects. A model whose name includes an abbreviation of these objects uses two contrasting approaches called CBOW and skip-gram. For maximum margin (*) separators, (-5[1]) data that lie exactly on the margin are partly named for these objects. (10[2]) A word embedding model partly named for these objects (10[1]) is usually illustrated with the example of (-5[1]) solving analogies. A classification technique that can use the kernel trick is named for the “support” type of these objects. (10[1]) For 10 points, a semantic model beginning “word2” is named for what mathematical objects and might calculate similarity by taking their dot product? ■END■ (10[1])

ANSWER: vectors [accept support vectors; accept global vectors or GloVe; accept word2vec; accept support vector machines after “kernel trick” and reject beforehand; reject “arrays” or “lists”] (The Raymond Mooney quote is “You can't cram the meaning of a whole %&!$# sentence into a single $&!#* vector!”)
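
The giveaway clues mention computing similarity with a dot product and illustrating word2vec with analogy solving. The Python sketch below (not part of the question; the tiny three-dimensional vectors and the word list are made up purely for illustration) shows both ideas: cosine similarity, which is a length-normalized dot product, and the "king - man + woman ≈ queen" analogy solved by nearest neighbor.

import numpy as np

# Hypothetical 3-D "embeddings" with made-up values, chosen only so the
# analogy below works out; real word2vec/GloVe vectors are learned from text.
emb = {
    "king":  np.array([0.80, 0.60, 0.10]),
    "queen": np.array([0.78, 0.58, 0.90]),
    "man":   np.array([0.75, 0.55, 0.05]),
    "woman": np.array([0.73, 0.53, 0.85]),
    "apple": np.array([0.10, 0.90, 0.40]),
}

def cosine(u, v):
    # Dot product of u and v, normalized by their lengths.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Similarity via dot product: "king" scores higher against "queen" than "apple".
print(cosine(emb["king"], emb["queen"]), cosine(emb["king"], emb["apple"]))

# Analogy solving: king - man + woman should land nearest to queen.
target = emb["king"] - emb["man"] + emb["woman"]
candidates = [w for w in emb if w not in {"king", "man", "woman"}]
print(max(candidates, key=lambda w: cosine(target, emb[w])))  # -> "queen"

Real word2vec or GloVe embeddings typically have a few hundred dimensions and are trained on large corpora, but the similarity and analogy arithmetic is the same.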
<JX>
Conv. % | Power % | Average Buzz
100%    | 0%      | 104.00
(Average Buzz = average correct buzz position)

Buzzes

Player | Team | Opponent | Buzz Position | Value
Nathan Neequaye | Carnegie Lemons | Why does ACF have electrons do its work? | 70 | -5
Andrew Hunter | A TV Guide for Netheads | JAX guide -league -of -legends -lol -mortal -kombat | 83 | 10
Michael Du | I Paused My Unique Game to Be Here | Eventually Munches All Computer Storage | 83 | 10
Michał Gerasimiuk | Why does ACF have electrons do its work? | Carnegie Lemons | 92 | 10
Sam Braunfeld | foo | The | 99 | -5
Geoffrey Wu | I thought this was a Counter-Strike themed tournament | Computer Science: Going Outside | 119 | 10
Zac Bennett | The | foo | 143 | 10