The 'snunkoople' effect in literature quantified. The what??

The Snunkoople Effect: what makes a word funny?
Every writer has wondered at some point: what makes a word funny?

Is it the unexpected usage à la S. J. Perelman?  ("I once shot an elephant in my pajamas.  What it was doing in my pajamas I'll never know," a line he allegedly wrote for Groucho Marx.)

Is it a straight-up pun?

Is it absurd usage?

Is it the inspired nonsense of Lewis Carroll?

One of the most vexing problems in modern literary science has finally been researched, resulting in a scale that rates the funniness of a word in objective terms, based on the word's inherent entropy.  I have no idea what that means either, but you've come this far; you may as well plunge ahead.

I've long labored under the delusion that funniness is in the eye of the beholder.  Well, I'm wrong.  Despite the hilarity you may engender with your prose, it now has to pass the test of the Snunkoople Effect.  Can we now expect some editor or producer to reject your best work because of a low Snunkoople score?  Despite your roommates rolling on the ground when you read it to them?  Or your mother responding, "That's nice, dear," when you read it to her?

I'm sorry, but despite being a nerd, I'm not comfortable with nerds taking over the language.  We've got enough problems dealing with rigid, miserable Latinate grammarians.  (You can always spot these folks.  Their computer screens are covered in blue pencil marks.)

Read at your own peril.
*  *  *  *  *

How funny is this word? The 'snunkoople' effect

How do you quantify something as complex and personal as humour? University of Alberta researchers have developed a mathematical method of doing just that -- and it might not be quite as personal as we think.

"This really is the first paper that's ever had a quantifiable theory of humour," says U of A psychology professor Chris Westbury, lead author of the recent study. "There's quite a small amount of experimental work that's been done on humour."

"We think that humour is personal, but evolutionary psychologists have talked about humour as being a message-sending device."

The idea for the study was born from earlier research in which test subjects with aphasia were asked to review letter strings and determine whether they were real words or not. Westbury began to notice a trend: participants would laugh when they heard some of the made-up non-words, like snunkoople.

It raised the question -- how can a made-up word be inherently funny?

The snunkoople effect
Westbury hypothesized that the answer lay in the word's entropy -- a mathematical measure of how ordered or predictable it is. Non-words like finglam, with uncommon letter combinations, are lower in entropy than other non-words like clester, which have more probable combinations of letters and therefore higher entropy.

"We did show, for example, that Dr. Seuss -- who makes funny non-words -- made non-words that were predictably lower in entropy. He was intuitively making lower-entropy words when he was making his non-words," says Westbury. "It essentially comes down to the probability of the individual letters. So if you look at a Seuss word like yuzz-a-ma-tuzz and calculate its entropy, you would find it is a low-entropy word because it has improbable letters like Z."

Inspired by the reactions to snunkoople, Westbury set out to determine whether it was possible to predict what words people would find funny, using entropy as a yardstick.

"Humour is not one thing. Once you start thinking about it in terms of probability, then you start to understand how we find so many different things funny."

For the first part of the study, test subjects were asked to compare two non-words and select the option they considered to be more humorous. In the second part, they were shown a single non-word and rated how humorous they found it on a scale from 1 to 100.

"The results show that the bigger the difference in the entropy between the two words, the more likely the subjects were to choose the way we expected them to," says Westbury, noting that the most accurate subject chose correctly 92 per cent of the time. "To be able to predict with that level of accuracy is amazing. You hardly ever get that in psychology, where you get to predict what someone will choose 92 per cent of the time."

People are funny that way
This nearly universal response says a lot about the nature of humour and its role in human evolution. Westbury refers to a well-known 1929 linguistics study by Wolfgang Köhler in which test subjects were presented with two shapes, one spiky and one round, and were asked to identify which was a baluba and which was a takete. Almost all the respondents intuited that takete was the spiky object, suggesting a common mapping between speech sounds and the visual shape of objects.

The reasons for this may be evolutionary. "We think that humour is personal, but evolutionary psychologists have talked about humour as being a message-sending device. So if you laugh, you let someone else know that something is not dangerous," says Westbury.

He uses the example of a person at home believing they see an intruder in their backyard. This person might then laugh when they discover the intruder is simply a cat instead of a cat burglar. "If you laugh, you're sending a message to whomever's around that you thought you saw something dangerous, but it turns out it wasn't dangerous after all. It's adaptive."

Just as expected (or not)
The idea of entropy as a predictor of humour aligns with a 19th-century theory from the German philosopher Arthur Schopenhauer, who proposed that humour is a result of an expectation violation, as opposed to a previously held theory that humour is based simply on improbability. When it comes to humour, expectations can be violated in various ways.

In non-words, expectations are phonological (we expect them to be pronounced a certain way), whereas in puns, the expectations are semantic. "One reason puns are funny is that they violate our expectation that a word has one meaning," says Westbury. Consider the following joke: Why did the golfer wear two sets of pants? Because he got a hole in one. "When you hear the golfer joke, you laugh because you've done something unexpected -- you expect the phrase 'hole in one' to mean something different, and that expectation has been violated."

The study may not be about to change the game for stand-up comedians -- after all, a silly word is hardly the pinnacle of comedy -- but the findings may be useful in commercial applications such as in product naming.

"I would be interested in looking at the relationship between product names and the seriousness of the product," notes Westbury. "For example, people might be averse to buying a funny-named medication for a serious illness -- or it could go the other way around."

Finding a measurable way to predict humour is just the tip of the proverbial iceberg. "One of the things the paper says about humour is that humour is not one thing. Once you start thinking about it in terms of probability, then you start to understand how we find so many different things funny. And the many ways in which things can be funny."

Story Source: Materials provided by the University of Alberta; original article written by Kristy Condon.  Journal reference: Chris Westbury, Cyrus Shaoul, Gail Moroschan, Michael Ramscar. "Telling the world's least funny jokes: On the quantification of humor as entropy." Journal of Memory and Language, 2016.
