Monthly Archives: December 2007

Learning History: Dates or Trends?

The old conventional wisdom was that dates were important, hence an “educated” person is supposed to know that the Declaration of Independence was signed in 1776, that the Civil War lasted from 1861 to 1865 and that William the Conqueror successfully invaded England in 1066. So generations of school children memorized lists of dates and events in hopes of passing their final exam…and often forgot them 15 minutes later.
But are the dates important or is it the trend that matters? After all, what’s the point of knowing that the Civil War took place in the 1860s if you don’t understand that it was a major conflict over racial equality vs. states’ rights and the impact of both on Western expansion of American territory (more or less)? So the modern trend in historical instruction has often been to focus on the trends behind the dates and not worry so much about the dates themselves. Memorizing trends instead of dates is now the new conventional wisdom.
Makes sense, right? Actually I’ve found there is a teeny problem. It’s often the case that if you completely skip dates you lose some nuance that really could help you understand history at a deep level. If you’re not careful, history can become a weird mélange of events that happened sometime between WWI and the Moon landing.
And if you wish to ANALYZE history instead of just reading about it – you have to deal with dates. Using dates to sequence events is critical to constructing a “narrative”. Without an accurate chronology, your analysis will not just be wrong but usually wrong in a howlingly inaccurate way that will have the date dweebs on the floor screaming, laughing or crying.
Picture, if you will, a confused child of the 22nd century who can’t quite distinguish the two Presidents Bush claiming that we invaded Kuwait to avenge the loss of the World Trade Center and the simultaneous Oklahoma City bombing. Of course this is someone who has merged the first Persian Gulf War in Kuwait with two separate events.
It sounds funny, but it’s also a little dangerous. As an adult, the same person could wonder why we’ve invested so much in Kuwait when they’re our mortal enemies, and underestimate the dangers of domestic terrorists, who were the ones who actually set off the bomb in Oklahoma City.
I’m not saying that we shouldn’t explore deeper issues such as “Why is Al Qaeda attacking us?” but knowing details such as dates and locations really helps form a fuller picture of what’s really going on. I have to say, a lot of modern Middle Eastern politics did confuse me until I learned more about the Ottoman Empire in Turkey and how some of it was divvied up after WWI into French and British territories for a while. Suddenly…a few more things made sense and I really understood some of what Al Qaeda was mad about (it’s not really just the US, by the way).
So if we are serious about students learning higher-order thinking … I really do think we have to teach students how to memorize and use dates (and locations) in their thinking as well as social trends like “nationalism” and “colonialism”.

Both/And vs Either/Or Thinking

I’ve participated in and/or observed a lot of hot academic debates such as “language rules or language constraints?”, “qualitative or quantitative research?”, “English or Spanish?”, “Mac or PC?” and a new addition, “Is history about trends or dates?” The premise behind many of these discussions is that you must choose between one option OR another. Rarely is it the case that a discussion centers around the idea that maybe BOTH categories could be appropriate.
As a linguist, I believe in categories, but have you noticed that items often fall into more than one category? Or that depending on what “filter” you have on at the time – items may appear different?
Take the social science debate of “quantitative” (data primarily from statistical analysis) vs. “qualitative” (data from interviews). Quantitative specialists prefer this method because results are more precisely quantified and easier to generalize (assuming your survey pool is large enough). On the other hand, qualitative specialists feel that interviews allow you to learn unexpected details that a survey might miss and let you follow a productive inquiry path with a subject as needed.
But guess what – I think both perspectives are right…in their own way. I do like that qualitative studies can give you more details and unexpected twists, but they are usually hard to generalize from (unless you bring in some statistical analysis). On the other hand, quantitative statistics are great for highlighting oddball trends a casual observer might miss in an interview. The only problem is that if you don’t ask for the right input, you might miss a discovery. Using both strategies might give you a fuller picture of a social trend.
Yet from what I have seen, students pursuing a degree in the social sciences are typically forced to choose EITHER quantitative OR qualitative. Only rarely are you allowed to combine BOTH quantitative and qualitative together. The idea that you might want to combine the two techniques seems almost heretical.
FYI – Linguistics has the same “either/or” issues to suffer through. For instance it’s rare to find a linguist comfortable with both formal grammatical theory and sociocultural issues.
And it’s not just academia. I often hear discussions of whether immigrant children in the US should learn English or their parents’ language. Why not both? Their little brains are set up for multiple languages.
Or maybe you wonder if a university should be all PC or all Mac. Maybe the answer depends on whether you are working on supply chain management (PC, probably) or video editing (Mac, probably). This is one reason why a major university usually has to support both platforms…even if it adds to overhead.
I know there are times we have to choose (left turn or right turn to the grocery store) but there are many times I wish society were more willing to explore some “both/and” options.
For instance, maybe it really is OK for people to say both “Merry Christmas” and “Happy Holidays” – I never understood why either side asked us to choose just one holiday salutation.

Ah…The Gathering Instinct

As many of us learned from a PBS documentary, humans were “hunter-gatherers” in the days before agriculture developed. Since this “hunting and gathering” has gone on for much longer than agriculture, many anthropologists and biologists have speculated on what impact this has had on humanity as a species.
Normally though, I hear more references to the “hunting” instinct and its impact on our species. Topics have included the thrill of “the chase”, cooperation within the hunting band and speculations on the thrill of the “kill” (not to mention the thrill of the grill). Those who believe that some of the weirder aspects of human behavior can be explained by evolution tend to believe that many humans still have a “hunting” instinct of some sort.
Rarely, however, do I hear discussions of the “gathering” instinct (at least not so much in popular science). Yet “gathering” is probably the more productive of our food-gathering strategies – although hunting does give you the higher-value protein.
So in one of those odd caffeinated moments in the car, I asked myself …do we still “gather” as well as “hunt”? And then it hit me – we shop!
More importantly, we often shop even if we don’t need to. After all, do the Gossip Girls really need another pair of shoes? Do I really need to buy another novel when I have a stack on my bed table? Of course not. And gadget gurus – if you’ve been able to cope without the iPhone, did you really need to be in line at 4 AM on the first sale day? Just asking….
It’s not just modern Western culture that shops, either. Archaeology is full of evidence for cultures going to great lengths to obtain the right gemstones, the best dyes and even the best flint for your flint tool set (those things work!). And there has always been a luxury food market. They don’t call chocolate the “food of the gods” for nothing.
So is anyone investigating this all-important human activity? Yes actually, and some very interesting results can be found at Design of Desire. Marketers have always been interested in exploiting the shopping instinct, but it’s also good that there’s some neuro and cognitive science behind this too. As much as we may not want to admit it, our desire to gather and hoard really does drive a lot of our economic behavior.
(OK – That was such a good site – I had to share it with you!)

How do you get a word for “auntly”?

Here’s a weird one for the “you learn something new every day” files.
Today I was asked if there was a feminine counterpart for the English term avuncular which is derived from Latin avunculus ‘maternal uncle’.
I should note that although avuncular technically means “uncle-like”, the usage has expanded to cover any older kindly gentleman (like your nicer uncles). For instance, Hollywood.com describes actor Gordon Jump (the Maytag Man and the kindly if befuddled boss at WKRP in Cincinnati) as an “avuncular television actor”.
The original question was from a gender studies professor, so I assume her intention was to provide a similarly positive role model for older women (I’m thinking of Aunt Bee from The Andy Griffith Show here).
Well, most of the linguists said they hadn’t heard of one…but surprise, there really is one! The Oxford English Dictionary (OED) does list materteral “aunt-like”, from Latin matertera ‘maternal aunt’.
Now the interesting question was – does anyone really use this term (i.e. is this a “real word”?). As you may guess, most Google searches for this term give you a list of word trivia sites, so it’s certainly not common. I know I never saw it on my SAT vocabulary list, and even the OED classifies it as “humorous”.
What I think is interesting is that someone went to the trouble to coin this little sweetie. In some sense, if you are exposed to enough Greco-Roman roots, you can start making up new combinations (and yes, some people find this fun). For instance, I’m waiting for the new archaeological field of paleotechnology (extracting evidence from old Apple II computers) to be invented in future decades. There is a little sub-grammar just for these terms embedded in English grammar. I’m not sure what that means, but it sure is interesting.
I compare this to bootylicious – no one had heard of this until Beyoncé sang it, but we all knew what it meant because it combines “booty” with part of “delicious.” Instant word creation!
I think the same is true of materteral. It may not be in common usage now, but it has been coined and it has a basic meaning which can be reconstructed from its parts. So it is a word (or maybe a potential word). I did send this along as a “word”, because even if you went to the trouble of making up a Latinate counterpart yourself, you would concoct something much like this. Maybe our instructor will create a new field of “materteral” studies.
And oddly enough – I have seen it used in a real sentence on a British aunt’s blog (it’s a new word for her, but why not?):

The next few weeks are to be consumed by materteral activities. Alexander is coming to Suffolk this weekend and then the following weekend we are going to Hampshire to see him get Christened.

Sounds like she will be having fun with her nephew.

Retronyms and Sapir-Whorf

I’ve gotten myself into another discussion of whether language shapes thought (Sapir-Whorf Hypothesis) or whether language is just a tool to express a thought (versus waving your hands or drawing a picture). As I mentioned in my first blog post, I believe language is just a tool and that thought is not necessarily linguistically based.
An implication of thought driving language is that you predict a human can have a new thought and then CREATE new terminology for it. This happens all the time, but an interesting case of it is the retronym, a word which reflects a new distinction that did not exist in the past.
For instance, when I was growing up, we referred to the (then) recent past by the decades (e.g. the fifties, forties, sixties). Items designed in those decades would be referred to as “forties design” or “fifties design”. But recently a new term, “midcentury design”, has emerged to reflect collective trends from roughly the thirties through the seventies, depending on the design.
That is, as art historians began comparing past trends to more recent trends (e.g. the nineties and 2000s), they saw some similarities in form emerge that had not been as evident before. But did people in 1955 think of themselves as “midcentury”? I doubt it – they were “modern”, just like we are (although we are really early 21st century).
What’s interesting to me about retronyms is that they show not just new words developing but that communities have the capability to reexamine their collective assumptions and reconceptualize their universe. They saw trends emerge and realized that a new name was needed – hence “midcentury”. A need to express a new thought drove innovation in the language.
Now I won’t deny there are interesting differences in vocabulary across languages, but I think they reflect thought rather than constrain it. For instance, I am certain that peoples of the Arctic have more terms for snow than those in more temperate climates. But then again, so do English-speaking skiers. The vocabulary merely shows that these speakers know their snow. But you can be sure that a speaker of Arabic (a “desert” language) would be able to understand differences in snow conditions if he or she took up skiing or dog sledding as a hobby. The mind is flexible enough to adapt.
And I am glad that thought is powering this engine and not language. Because if this weren’t true, we’d never be able to explore and explain the complexity of snow or of sand. We’d also be trapped in a world which didn’t have innovations like “democracy”, “multicultural diversity” or “civil liberties.” A scary thought indeed.

The /Cw/ Clusters of Suzanne Sugarbaker

Just saw a Designing Women re-run the other day and picked up a fun pronunciation tidbit from Suzanne Sugarbaker (the still hilarious Delta Burke). Suzanne’s character is rich, flighty and not known for her multicultural outreach, so that means her Spanish diction is not as bueno as it could be.
The one I noticed was how she pronounced the name of her housekeeper Consuela. In Spanish it should be /kon.swe.la/ with a /swe/ cluster, and while most of the other English speakers could manage it, Suzanne consistently mispronounced it as /kon.su.e.la/ (4 syllables).
I realize that Delta Burke is acting here, but her “miss” is interesting because it’s an example of how English is a little uncomfortable with many consonant+glide clusters. English has a few native /Cw/ clusters such as /kw/ (queen) and some /tw/’s (twin), and the repertoire has been expanding with the help of language contact to include examples such as pueblo (pw), bwana (bw), Gwen (gw), and vichyssoise (sw).
But a few /Cw/ clusters still cause problems – my favorite example being Luigi, which in Italian is /lwi.ǰi/ but in “American” is usually /lu.i.ǰi/. Another one is French roi ‘king’, which should be /rwa/ but usually comes out as something more like a shortened /ruwa/ when I attempt it.
Apparently Suzanne Sugarbaker has similar problems with Spanish /sw/ – hence she ends up with Con-su-e-la. This is plausible since /s/ is coronal like /l,r/…but still humorous.
FYI – A more common example of how English “mangles” consonant+glide is /Cj/ (or /Cy/) becoming /Ci/, as in Tokyo /to.kjo/ (2 syllables) becoming /to.ki.o/ (3 syllables). Similarly Kyoto /kjo.to/ becomes /ki.o.to/ and Spanish fiesta /fje.sta/ becomes /fi.e.sta/.
P.S. Why so many TV examples in the blog? I like to collect examples students may have heard before. Hopefully some of them are Designing Women fans.