Question

Raymond Mooney is known for an expletive-laden quip arguing against using these objects. A log-bilinear model named for these objects attempts to learn the logarithm of the ratio of co-occurrence probabilities. A team at Stanford led by Jeffrey Pennington created a model named for “global” examples of these objects. A model whose name includes an abbreviation of these objects uses two contrasting approaches called CBOW and skip-gram. For maximum margin (*) separators, data that lie exactly on the margin are partly named for these objects. A word embedding model partly named for these objects is usually illustrated with the example of solving analogies. A classification technique that can use the kernel trick is named for the “support” type of these objects. For 10 points, a semantic model beginning “word2” is named for what mathematical objects and might calculate similarity by taking their dot product?

ANSWER: vectors [accept support vectors; accept global vectors or GloVe; accept word2vec; accept support vector machines after “kernel trick” and reject beforehand; reject “arrays” or “lists”] (The Raymond Mooney quote is “You can't cram the meaning of a whole %&!$# sentence into a single $&!#* vector!”)
<JX>
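The final clue says the model "might calculate similarity by taking their dot product," and an earlier clue mentions the classic analogy illustration. A minimal sketch of both ideas, using made-up toy 3-dimensional embeddings (real word2vec or GloVe vectors are learned from corpora and typically have 100–300 dimensions; these values are purely illustrative):

```python
# Hypothetical toy embeddings; the numbers are invented for illustration only.
emb = {
    "king":  [0.8, 0.6, 0.1],
    "man":   [0.7, 0.1, 0.2],
    "woman": [0.6, 0.2, 0.9],
    "queen": [0.7, 0.7, 0.8],
    "apple": [0.1, 0.9, 0.1],
}

def dot(u, v):
    # The dot product from the question's final clue.
    return sum(a * b for a, b in zip(u, v))

def cosine(u, v):
    # Dot product normalized by vector lengths (cosine similarity).
    return dot(u, v) / (dot(u, u) ** 0.5 * dot(v, v) ** 0.5)

# The standard analogy illustration: king - man + woman ≈ queen.
target = [k - m + w for k, m, w in zip(emb["king"], emb["man"], emb["woman"])]

# Pick the most similar word, excluding the three query words themselves.
best = max((w for w in emb if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(target, emb[w]))
print(best)  # → queen
```

With these toy values the analogy arithmetic lands exactly on the "queen" vector, so it wins the similarity comparison; with real embeddings the result is only approximate, which is why nearest-neighbor search over cosine similarity is used.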
Conv. % | Power % | Average Buzz
100%    | 25%     | 78.50

(Average Buzz = average correct buzz position)


Buzzes

Player       | Team                                                      | Opponent                                   | Buzz Position | Value
Seth Ebner   | Dianetics for Diabetics                                   | playing emacs while my parents are arguing | 12            | 15
Rahul Keyal  | Edwardian Manifestation of All Colonial Sins              | Macro Editors                              | 92            | 10
Swapnil Garg | We Bought a Complexity Zoo Story                          | Eight Megabytes And Constantly Swapping    | 95            | 10
Henry Cafaro | screaming into the public static void main(String[] args) | a neural-net processor; a thinking machine | 115           | 10