Question
Raymond Mooney is known for an expletive-laden quip arguing against using these objects. A log-bilinear model named for these objects attempts to learn the logarithm of the ratio of co-occurrence probabilities. A team at Stanford led by Jeffrey Pennington created a model named for “global” examples of these objects. A model whose name includes an abbreviation of these objects uses two contrasting approaches called CBOW and skip-gram. For maximum margin (*) separators, data that lie exactly on the margin are partly named for these objects. A word embedding model partly named for these objects is usually illustrated with the example of solving analogies. A classification technique that can use the kernel trick is named for the “support” type of these objects. For 10 points, a semantic model beginning “word2” is named for what mathematical objects and might calculate similarity by taking their dot product? ■END■
ANSWER: vectors [accept support vectors; accept global vectors or GloVe; accept word2vec; accept support vector machines after “kernel trick” and reject beforehand; reject “arrays” or “lists”] (The Raymond Mooney quote is “You can't cram the meaning of a whole %&!$# sentence into a single $&!#* vector!”)
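For readers unfamiliar with the clues: the "log-bilinear model" clue refers to GloVe's objective from Pennington et al. (2014), which fits word vectors and biases so that

$$w_i^\top \tilde{w}_j + b_i + \tilde{b}_j \approx \log X_{ij},$$

where $X_{ij}$ counts co-occurrences of words $i$ and $j$; differences of such vectors then encode log ratios of co-occurrence probabilities, $(w_i - w_j)^\top \tilde{w}_k \approx \log(P_{ik}/P_{jk})$.

The giveaway's dot-product similarity and the analogy clue can be illustrated with a minimal sketch. The four-dimensional embeddings below are made-up toy values, not real word2vec vectors, chosen only so the arithmetic works out:

```python
import numpy as np

# Hypothetical toy embeddings; real word2vec vectors have hundreds of dimensions.
embeddings = {
    "king":  np.array([0.8, 0.1, 0.7, 0.2]),
    "queen": np.array([0.7, 0.9, 0.6, 0.1]),
    "man":   np.array([0.9, 0.1, 0.2, 0.3]),
    "woman": np.array([0.8, 0.9, 0.1, 0.2]),
}

def cosine_similarity(u, v):
    """Dot product of u and v, normalized by their lengths."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# The classic analogy: king - man + woman should land nearest queen.
target = embeddings["king"] - embeddings["man"] + embeddings["woman"]
best = max(
    (w for w in embeddings if w not in {"king", "man", "woman"}),
    key=lambda w: cosine_similarity(target, embeddings[w]),
)
print(best)  # -> "queen" (with these toy vectors, the match is exact)
```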
<JX>
Average Buzz = average correct buzz position

| Conv. % | Power % | Average Buzz |
| --- | --- | --- |
| 100% | 0% | 104.00 |