The Demise of Football?

Wednesday, May 02, 2012

In an interview at Slate, best-selling author Malcolm Gladwell of The New Yorker predicts the demise of college and professional football as we know it.

The factor that I think will be decisive is the head-injury issue. Colleges are going to get sued, and they will have to decide whether they can afford their legal exposure. That said, the issue ought to be how big-time college sports subverts the academic mission of university education.

...

[B]oxing and horseracing didn't end. They have persisted, just in vastly less popular forms than before. They have gone into slow and irreversible decline. I suspect that the same will happen with football. It's going to wither as the supply of talent slowly dries up. I heard on ESPN Michael Wilbon--who is one of the most influential sports journalists in the country--say that he will not let his kids play pro football. If Wilbon won't, who will?

This may be, but what I find more interesting is that Gladwell's observations come out against the backdrop of an argument he plans to make that college football should be banned -- or at least that's what Slate claims he'll argue. But observing that, in today's increasingly risk-averse, paternalistic culture -- and litigious yet responsibility-free legal environment -- a sport millions enjoy is in danger isn't really the same thing as saying that it should be banned. Indeed, Gladwell later says the following:

If you want college athletes to assume an as yet unknown risk of permanent physical and neurological damage, you should pay them. Properly. It's a bit much both to maim AND exploit college football players.

I don't think Gladwell is exactly a laissez-faire capitalist, but I think he stumbles into something close to what would (and should) happen with college football in a free economy: It would become -- completely, that is -- lower-tier professional football. Whatever Gladwell ends up arguing, I doubt it will be anything like, "Get the government out of the businesses of education and professional sports," but his prediction is worth noting.

I once observed that you could probably scratch many conservatives and draw pro-big government blood when it came to education (and college sports). This would happen once you connected the dots and made it clear that their college football teams (and probably many of their almae matres) might receive much less funding, if indeed they continued to exist at all, under capitalism. But Gladwell's prediction serves as something of an antidote by highlighting the dark side of government support: government meddling.

A government big enough to grant largesse is also big enough to take it (and more) away. Football may appear to benefit from its improper association with an educational system the government shouldn't be running, but the same excuse -- need -- that drives the government's involvement in education drives the redistributionism in the legal system and the elevation of risk assessment from a personal affair to an excuse to confiscate money, issue orders, or both.

Ultimately, more freedom might temporarily be painful until professional football (including college football) adjusted to complete self-support, but it would allow the sport to survive.

-- CAV

20 comments:

Steve D said...

If college football is going to die anyway, then why would anyone suggest banning it?
The only answer I can come up with is the raw exercise of power.
‘It would become -- completely, that is -- lower-tier professional football.’
But money isn’t the only contributing factor. There is a lot of prestige involved in college sports, especially football. So the college football players do get something out of it: fame and a chance at a professional career where they could make millions (and possibly a free education if they choose to take it). I could imagine a scenario under capitalism where nothing much would change – the prestige would carry the day. However, you might expect other football leagues to form in competition…

Jennifer Snow said...

There's not a significantly worse risk of injury in football than in any other organized sport. Heck, people who run simply for exercise often wind up in the hospital with severe injuries. People even die.

If you're going to ban college football on the basis that it poses an "unacceptable" risk, you're going to wind up banning EVERY college sport by logical extension. If that dude does give a talk on banning college football, I hope someone shows up with a comprehensive list of common track injuries and some numbers on how many athletes suffer from them every year.

Oh, and it's alma maters

Gus Van Horn said...

Steve and Jenn,

Good points, both. And yes, I'd love for someone to educate Gladwell on running injuries.

Regarding the plural of alma mater, we're both right. Mine has the added advantage of being correct in Latin.

It's interesting how, when risk becomes a collective concern, there is suddenly zero tolerance for it.

Gus

z said...

What would be interesting is if in the next few decades there happened to be advances in brain injury treatments on par with the improvement of knee treatments over the last 30-40 years. It used to be that football players would retire after a knee injury that tore ligaments. Not sure what it would consist of, but I imagine a better understanding of brain medicine would save football from what you're describing. And then we could say, "back in my day, a brain injury and they'd have to keep you out for a few games, or even the rest of the season." Maybe I've been watching too many TED talks?!

Gus Van Horn said...

Heh! I think some kind of mitigating treatment is possible in the time frame you describe. I think it's a hard enough problem that it would take that long. Coincidentally, that's about how long it would take to achieve the kind of cultural change that would remove politics as an existential threat to the game.

Jennifer Snow said...

I'm not sure it's *possible* to have a correct Latin term in an English sentence. After all, since it's part of the object in that sentence it ought to be in the accusative case anyway.

Hence why you should pluralize it as if it were an English term.

Gus Van Horn said...

So how should I pluralize minimum? Minima or minimums? What if I speak of a minimum's units of measure? (or the units of measure for several minima) Should I use the genitive case?

English is inconsistent about how it handles foreign plurals, and when there is a choice, as there is here, I prefer the Latin.

Snedcat said...

Jennifer Snow: "I'm not sure it's *possible* to have a correct Latin term in an English sentence. After all, since it's part of the object in that sentence it ought to be in the accusative case anyway."

Well, actually, to be persnickety about it, it's a partitive genitive modifying the subject of an object clause, so it "should" be "many of their almarum matrum" or something like that with effectively redundant "of," or, biting the bullet, "many almarum matrum suarum." Which looks ghastly, but C.S. Lewis did exactly that in one of his books on medieval literature, in which he wrote phrases like "The image horti conclusi..."

So the question is, do we ban all distinctively Latin forms or do we go whole hog on Latin? We could do the former, though you'd have to surgically mutilate English with an empowered academy or the like, and it would be painful to attempt the latter. But surely there's no rational dividing line, is there?

Actually, there is. English doesn't have much in the way of a case system, for there are only distinct case forms for pronouns. (Possessive s is not strictly speaking a case ending any longer, unlike the possessive case in the pronouns, since it actually gets stuck on the last word in the possessor noun phrase, not the head noun in the phrase: The King of England's horse, not The King's horse of England, which in fact was the appropriate form of the phrase a few centuries ago when possessive s was a case ending. Instead, it's a possessive particle, or to use the appropriate linguistic terminology, it's a possessive proclitic, meaning an independent meaningful unit that contracts with or fuses with a preceding word; Latin que 'and' is a further example, and classical Greek was full of the suckers.)

So there's no call for importing the case forms of a Latin word into English except in fields like astronomy or zoology where Latin is required as part of the institution of that branch of science. (So in zoology, a description of a new species is accepted as official only when a thorough description of the species has been published in Latin, since that makes it fully accessible to all zoologists of all countries, or notionally so at least.) The plural, however, is fully alive and well, and so there's a reason to import the plural forms used in the source language if one wants to do so.

On the other hand, the question of how much of the case system of one language to borrow into another is a live issue in languages with a strong case system. For medieval Latin, many Greek words were borrowed in that way, slotted into the corresponding Latin declension through matching of principal parts, and in German relics remain of distinctly Latin cases in the use of such phrases as Jesus Christus, though the distinctly Latin cases actually used in German have diminished during the last two centuries. Similarly, Russian has a fun old time with trying to fit English words into appropriate declensions. For the most part this is easy, since most Russian declensions are based purely on the pronunciation, but not always...and foreign last names are especially difficult because Russian last names typically have a mixed nominal-adjectival declension; the result is that men's last names get declined while all women's last names that are not native adjective forms are invariant.

Snedcat said...

Yo, Gus, you write: "English is inconsistent about how it handles foreign plurals, and when there is a choice, as there is here, I prefer the Latin."

I prefer the choice most accepted by relevant native speakers for the particular context, with the option reserved of disapproving of a particular usage that comes across as ugly or illiterate. For many technical nouns there's a native plural in s for less technical use and a Greco-Latin plural for more technical use--schemas versus schemata, for example. This distinction is felicitous, and it's distinct from the semi-literate pseudointellectual bastard Latin so many allegedly intelligent adults with white-collar union cards (a.k.a. BA's) like to use to feel smugly superior to the great unwashed: Basically, people who seem to think Latin had only two declensions and so use tempi and corpi as the plurals of tempus and corpus--when I hear that I think to myself, "It's Latin, not Italian."

And the botching of Greek plurals in Greco-Latin terminology is truly sad, though understandable, since Greek had an even greater variety of third-declension forms than Latin did, and often with much less transparent stem formation. Rhinoceros (note the Greek ending -os, which in fact has omega, not omicron, so it's not even mistakable by accident for a second-declension noun in Greek) has the Greco-Latin plural rhinocerotes (used by Strabo, for example, so it's not just Greco-Latin), which is also I think the accepted plural in zoology; outside of that field, the proper plural is rhinoceros or rhinoceroses, or even rhinos, but not rhinoceri, which is an excrescence of ignorance.

In the long term, yes, perhaps that sort of plural form will win out, but as a native speaker of English my vote counts too.

(And I should add I am much more forgiving of its use in the mouth of the same undereducated folk who backform the singulars matricee, thesee, and parenthesee than I am of supercilious miseducated dolts who misuse Latin to give themselves an unearned patina of learning. The former are just trying to use their mother tongue; the latter are aiming for a much higher level of game, and as such should be judged at an appropriate level--and in saying corpi or rhinoceri they're tripping on their untied shoelaces on the way to the playing field. Latin--learn it and love it or else leave it out entirely, I say.)

Gus Van Horn said...

Snedcat,

I always enjoy hearing your take on such matters.

My "favorite" backform is "bicep", the supposed singular of "biceps".

Your mention of "semi-literate pseudointellectual bastard Latin" (closely related to the Starbucks Esperanto dialect identified by Joe Queenan) reminds me of a funny Latin sentence I am waiting for the right occasion to use, and whose meaning is, "Whatever is said in Latin seems profound."

Gus

Snedcat said...

Yo, Gus, you write: "I always enjoy hearing your take on such matters." It's an interesting subject, especially for someone like me who is a professional editor and translator on the one hand and a professional linguist on the other. There is of course a lot of ugly stereotyping back and forth, that descriptivists are devoted to the destruction of culture through a commitment to the equality of the learned and the ignorant, or that prescriptivists are too busy strapping English into the procrustean bed of Latin to understand what the actual rules of English usage are, but those are hysterical exaggerations with quotes from a token atypical figure or two thrown in. Many prescriptivists know English quite well, and few descriptivists let grammatical errors off lightly in assignments. In fact, the basic difference is that descriptivists focus on speech, prescriptivists on writing, with the additional fact that descriptivists typically work on languages for which written grammars are not available. In that circumstance what is necessary above all is to describe the facts of language use accurately and to describe any important language attitudes (what is colloquial, what is considered substandard, etc.) objectively. Prescriptivists (or at least the thoughtful ones, of whom there are many) fully recognize the reality of language change and are quite willing to accept new usages, provided they do not impair the expression of thought by conflicting with accepted usage, spreading ambiguity, and the like—and this requires a solid descriptive knowledge of the language.

The problem is that language attitudes can be deeply emotional, and for many less thoughtful prescriptivists proper usage is accepted essentially unthinkingly—it is considered more beautiful, logical, or what have you, and questioning it or even examining it descriptively draws a negative reaction. (Especially amusing is when this less thoughtful prescriptivist then sticks a foot in it, such as an elementary school English teacher I remember railing against lax usage and concluding that the schools have to teach students to "pronunciate" their words correctly.) I'll leave the question of beauty to the beholder and focus on "logical" language use. In fact, language radically translates aspects of the world into a fundamentally dissimilar medium, and it's no more "logical" in the sense of more closely reflecting the structure of the world to have adjectives precede or follow nouns they modify, for example, to have verbs precede or follow their subjects, and so on; the structure of language is arbitrarily (better, conventionally) related to the structure of the world. If you dig into the examples given of what is "logical" in proper usage, it's hard to see exactly what the term means—it's not clear-cut but seems in general to mean "consistent." However, some forms of proper usage are more consistent than solecisms and others are less consistent, or more or less expressive in some sense. For example, is it more consistent with the system of English to say "myself" but "himself" or "myself" and "hisself"? In that case standard usage is less consistent. On the other hand, the prohibition on double negatives allows for a more flexible and expressive use of indefinite pronouns and the like—in this case standard usage takes the win.

[Part I cut off here for length]

Snedcat said...

[Part II]
This issue is in fact distinct from what is accepted, and the basic issue is not whether language is "logical" but that language is a conventional system of associations between sound and sense (conventional not in the sense that you consciously ratify what is approved usage on reaching puberty or adulthood or whatever, but in the sense that you use the basic rules in speaking, and have every reason to expect other speakers to understand the rules and follow them as well—of course, the whole terminology of "rule" is a little inapt there if you haven't actually followed a rule consciously but simply acted in such a way that the regularities in your speech are fully described thereby, but we can leave that aside). Proper usage codifies those conventions, and those conventions are the basis of and justification for proper usage, the same as the conventions governing social interactions; this is especially true for writing, for which conscious attention should be paid to every action taken and decision made in it until it becomes habit (unlike speech, which is for the greatest part learned unconsciously), for it has distinct cognitive characteristics (visual patterning and the like) and different expectations of the intended audience than speech that make "write like you talk" extraordinarily ineffective advice for the student.

It is true that a descriptivist is likely to be forgiving of non-standard usage, especially in speech, in the same way that someone who has lived in another country is likely to be more forgiving of lapses in politeness or etiquette, but it's a stereotype and an unfair one to suggest that a thoughtful prescriptivist won't or can't be, or that a descriptivist is opposed to etiquette as such. Some few are, but that's due to a radical take on politics rather than scholarly training.

Snedcat said...

[Part III--and at least this one is a bit separate from what has gone before]
The controversy over the Oakland school board decision -- described in the press as calling for teaching black English instead of standard English -- is a good example. In fact, the decision was to use black English (more precisely, Afro-American Vernacular English, or AAVE, to use the term linguists prefer, since there is as wide a variety of black English as there is of white English in the US) to help teach standard English. (How would you do that? As an example, consider the problem of teaching the formation of the past tense to students whose home language simplifies consonant clusters, so that the -ed past tense marker is usually lost in speech. They can't rely on their home language to know automatically when to write the marker, but it does include irregular past tense forms like came; the use of the past tense marker is then taught by training them to use mental substitution until the distinction becomes clear. It's not precisely the use of their home language as such to teach standard English, but rather a matter of knowing enough about the features of their home language to suit pedagogy to their particular needs.)

Unfortunately, this quite sensible proposal (which seems to have been what the school board intended to promote) was buried under a horrendously miswritten resolution by a school board that clearly didn't understand what it was saying (the intended sense, that black English is a form of English descended from earlier dialects of English and thus in the linguistic sense and that sense alone genetically fully a form of English, got bastardized into a largely meaningless statement that would easily be read, and was read in the press, as saying it was a form of language genetically conditioned by the skin color of its speakers), and the whole foofaraw ended up associating fine scholars like William Labov with anti-white racists who stoked the controversy on the other side. (It was also shaped by policies of California with regard to bilingual education—the decision to try the policies under the rules for a bilingual situation, which allows the use of the home language in the classroom, was a poor gamble and a bad decision, but about what you'd expect from a bunch of politicians, and was easily misconstrued as an attempt to secure bilingual-education funding in a scurrilously dishonest manner; I gather that in fact bilingual funding was already completely strapped by that time in the late great state of California, so that would have been a vain endeavour in any case.) And of course the whole issue was a nice little set piece for thoughtless prescriptivists and descriptivists of every sort—in the abstract I have sympathy for the Oakland school board, given the constraints they were under and the policies they seem to have had in mind, but lord'a'mercy, what a fiasco, and deservedly so. They could have tried to make their case forthrightly, but it's not clear that they even had their case worked out for themselves and they certainly didn't make it clearly to anyone else, and they stepped right into a mess of their own making.

Jennifer Snow said...

Lol, Sned, that's awesome.

I studied Latin a bit in high school but I was a pretty indifferent student all around. What I find interesting is that there is no discrete English term for some concepts denoted in English usage by certain Latin phrases like, for example, Alma Mater. Or Sub Rosa. Or Sotto Voce. (Wait, that's Italian, isn't it?)

Are there other languages that snarf up terminology from other languages just because that terminology is shorter and less awkward than trying to construct that same term in English?

Supposedly English is full of a lot of similar efficiencies (although that word is probably not one of them), such as the fact that our nouns don't have gender.

Gus Van Horn said...

Jenn and Snedcat,

Your comments "crossed" since I was unable to moderate comments until now. (And I'm in a hurry, too.)

I -- and, I suspect, Jenn -- will enjoy reading your three-parter. Thanks, as always, for your thoughts.

Gus

Snedcat said...

Jennifer Snow: "Lol, Sned, that's awesome."

I should add that besides being a professional editor, translator, and linguist, I'm also a practitioner (or perpetrator?) of linguistic satire. Sometimes I'm only half-serious, but even when I'm fully serious about what I say, I'm also aiming for humorous effect--sometimes to try out an idea for a future piece, sometimes just for the entertainment of the reader. In this case I was mostly serious, but in real life I'm a tad more tolerant of that sort of pretentiousness than I used to be.

Of course, thinking about it, I'm not strictly a linguistic satirist; some of what I write is simple linguistic humor (on occasion not so simple if you're not a linguist): an odd concatenation of ideas, say, or a curious coincidence. Even there, though, I usually have a satirical edge to a piece because I express the humorous idea through the sort of character who'd seize on the idea and run with it as if it were a great pearl of wisdom. (Though others are pretty biting, or at least sharply nibbling, straight satire.) So in general I'd summarize by saying that my humor is academic humor, not just because of the subject matter, which is eminently and unapologetically academic, but because my preferred form of human folly to satirize is what I think of as the mindless academic mindset.

So, to bring it back to topic, or at least an earlier tangent, on the one hand I want written English taught as rigorously as possible so that I can continue to have an audience able to fully appreciate my authorial voice...but perhaps if it were taught a lot better there would be far fewer targets for my satire. Is a puzzlement...

Snedcat said...

"Are there other languages that snarf up terminology from other languages just because that terminology is shorter and less awkward than trying to construct that same term in English?"

Languages vary widely in how much they borrow from other languages. English does it very widely, while other languages will create equivalents based on native elements (called "calquing," or "loan translation"). And borrowing does not always happen because of a clear need due to a lack of a native equivalent--often a language will borrow from another because its speakers consider the other language prestigious. (Thus, English borrowed such words as "face," "cross," and "conscience" before or just after the Norman Conquest even though there were perfectly common native words for those ideas already--rood for "cross" and inwit for "conscience," for example.)

Mongolian, for example, borrows probably as readily as English does. At the opposite extreme, Iceland and the Faroe Islands have language planning academies to coin native words for any foreign word that might try to worm its sinister way into the language. (I remember reading an interesting essay about Faroese language planning by its head language planner of the time, who decried the harms threatening Faroese linguistic purity from the perilous charms of...wait for it...Danish.) The French elites would love to replicate the success of Iceland and the Faroes, but French youngsters are just too damn enamored of other cultures to pay the Academie Francaise any respect.

In the middle of the spectrum, German borrows less than English but more than Icelandic, and Chinese borrows more than is usually realized since it fits borrowed words into native pronunciation masked by characters. So, a humorous Chinese word for "martini" suggested by the linguist Yuanren Chao was ma-ti-ni "a horse kicks you," and there are many others that are fully accepted, such as xiao-ke-li for "chocolate," tu-la-ji for "tractor," and xi-ming-na-er for "seminar," in which some or all of the characters are chosen for meaning--the ji in "tractor" means "machine," for example. Older loanwords are often not realized as such though: hu-die "butterfly" was borrowed from a non-Chinese language of South China, which the two-syllable character of the word suggests and the traditional explanation makes clear. (Native scholars of later generations said the first character was a native word that referred to male butterflies, the second character to females, which is quite as transparently absurd an explanation as you'll find.) On the other hand, there is of course a countervailing tendency in Chinese to coin two- or four-syllable words encoding meaning alone for foreign ideas, items of culture, and institutions.

Of course, then there are dying languages, which term does not usually refer to languages whose speakers are dying out but rather to languages whose speakers are shifting as a group to another language. There you find wholesale borrowing from another language by the last two generations or so, who use the other language in ever greater spheres of life and do not pass on their native language to their children--usually for quite conscious reasons of helping their children have future advantages. Linguists worry about this problem, since linguists are devoted to the study of languages as such, but it's not the sort of problem one can solve very easily since the economic reasons behind language death are well-nigh insuperable.

Snedcat said...

"Supposedly English is full of a lot of similar efficiencies (although that word is probably not one of them), such as the fact that our nouns don't have gender."

Well, yes and no, or maybe just maybe. It's always possible to point to examples of greater efficiency in language change, but sometimes (I won't say always) if you look more widely you can see a countervailing loss of efficiency. Basically, language change is driven by two competing forces--greater efficiency for speakers versus greater clarity or distinctiveness for listeners. If you model it mathematically, I'm sure you'd find it's a problem with an infinite number of optimal solutions, so there will be no final state to which all languages will degrade.

For example, if you look at the matter of gender in a broad view, marking gender has the advantage of allowing greater variation in word order for emphasis and the like, and also of theoretically lessened ambiguity in the reference of pronouns, for example. The extreme in this respect is a language like Swahili with over a dozen noun classes (in a linguistic view, gender systems are a special subset of noun class systems), so think about the advantages versus disadvantages there (extensive noun classes reduce ambiguity a great deal but add to the difficulty for children of learning the language).

And on the other end of the spectrum, again, there are many languages that do not have a gender distinction at all, even in pronouns--Japanese, Korean, Chinese, Mongolian, the Turkic languages, Persian, and many others do not distinguish he, she, and it (well, some languages like Japanese and Chinese have jury-rigged such distinctions under foreign influence, but such distinctions tend to be exclusively written--the pronouns are written differently but pronounced identically in Chinese--or very formal in the case of Japanese). The lack of such a distinction does not in practice add too much to the ambiguity of reference, however unnatural it might feel to an English speaker.

Snedcat said...

[Part II again...]
This is related to some extent to the question of the complexity of languages. The pronouncement in linguistics in its strong form is that all languages are equally complex--this makes sense only once you specify that by complexity is meant potential expressive power; on that reading it does seem to be true.

The issue is complicated by the fact that it is possible to make unambiguous measures of complexity for different subsystems of a language, but coming up with an unambiguous measure of overall complexity is much harder--the straightforward approach is to posit reasonably broad measures of complexity for the major subsystems of a language and take a weighted average. This might not actually answer the question, but it should be a reasonable first stab at the issue that will give some insight into the problem.
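
To make the arithmetic concrete, here is a minimal sketch in Python of that weighted-average idea. The subsystem scores and weights below are entirely invented for illustration; they are nobody's actual measurements.

    def overall_complexity(scores, weights):
        # Weighted average of per-subsystem complexity scores (say, on a 0-10 scale).
        total_weight = sum(weights[name] for name in scores)
        return sum(score * weights[name] for name, score in scores.items()) / total_weight

    weights = {"phonology": 1.0, "morphology": 1.5, "syntax": 1.5, "lexicon": 1.0}

    # Invented profiles for two imaginary languages, the second a young creole.
    older_language = {"phonology": 6, "morphology": 8, "syntax": 5, "lexicon": 7}
    young_creole = {"phonology": 4, "morphology": 2, "syntax": 5, "lexicon": 5}

    print(overall_complexity(older_language, weights))  # 6.5
    print(overall_complexity(young_creole, weights))    # 3.9

How you choose the subsystems, the scales, and the weights is of course where all the real argument lies; the code only shows how little machinery the "first stab" itself requires.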

And in fact John McWhorter did just that; his result was that most languages do have broadly similar amounts of overall complexity, but newly created languages (creoles) are significantly less complex and grow more complex with the number of generations they are spoken until they reach the general level of complexity of "natural languages."

(I should add that McWhorter is more famous as a popular writer on linguistics and politics than as an academic linguist. He failed to make tenure at Stanford, I suspect more because he just didn't produce enough original work than because of his conservative political views. In any case, I find his popular linguistic works unreliable enough that I don't recommend them strongly.) My view is that to a great extent complexity in one part of a language is balanced out by simplicity in another part, and that over time there is a tendency for overmuch complexity to be ironed out and overmuch simplicity to be amplified for expressiveness by speakers.

Besides these more specific measures, there are well-known findings in general psycholinguistics--the experimental psychology of language. There you do have clear indications of tendencies to reduce longer phrases when they are more commonly used, for example, but the actual processes underlying the statistical regularities found are really hard to tease out. Zipf's Law, for example, gives a precise relationship between frequency of use and length of a word that holds remarkably widely, but explaining the regularity with reference to underlying cognitive processes is...a highly disputed matter, let us say.
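
If you want to eyeball that frequency/length tendency for yourself, here is a rough sketch in Python that bands the words of any plain-text file by how often they occur and prints the average word length per band (the file name at the bottom is just a placeholder); the more frequent bands should come out shorter on average. It is nothing like a proper statistical test of the law, of course -- just a quick look.

    from collections import Counter
    import re

    def frequency_vs_length(path):
        # Count how often each word appears in a plain-text file.
        with open(path, encoding="utf-8") as f:
            words = re.findall(r"[a-z]+", f.read().lower())
        counts = Counter(words)
        # Crudely band words by the number of digits in their frequency
        # (band 1 = rare words, higher bands = more frequent words).
        bands = {}
        for word, freq in counts.items():
            bands.setdefault(len(str(freq)), []).append(len(word))
        for band in sorted(bands):
            lengths = bands[band]
            print(f"band {band}: mean length {sum(lengths) / len(lengths):.2f} "
                  f"({len(lengths)} distinct words)")

    # frequency_vs_length("some_text_file.txt")  # hypothetical file name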

Snedcat said...

Yo, Gus, need to make one big correction: "Older loanwords are often not realized as such though: hu-die "butterfly" was borrowed from a non-Chinese language of South China, which the two-syllable character of the word suggests and the traditional explanation makes clear. (Native scholars of later generations said the first character was a native word that referred to male butterflies, the second character to females, which is quite as transparently absurd an explanation as you'll find.)"

Should have thought about this more closely. While hu-die is indeed a loan as I stated, it's not the word I was thinking of, which I can't track down now. It was a two-character set having to do with biting insects that was thus interpreted by later scholars. I forget the text it was in--this was 12 years or so ago at the very end of a grueling semester.

And minor corrections: Chinese for "chocolate" is qiao-ke-li, not xiao-ke-li, or to be more English-speaker-friendly, ch'iao-k'e-li. Also, possessive s is an enclitic (it follows the word it fuses with), not a proclitic (which precedes the word it fuses with, like "the" does to some extent unless it's stressed). In this case I was too busy pondering whether to include such a naughty-looking word--I remember the time I was a TA and had to introduce the topic of clitics right after discussing a reading by, yes, Ray Jackendoff. Lots of, er, tittering in the peanut gallery that day, I assure you...and probably the only time 80% of the class ever paid any attention to what I was saying; teaching a class at 1 PM in a room with overactive heating in mid-September is a recipe for mass afternoon naps. (And interestingly enough, that word is related to the word it sounds like, but quite indirectly so.)