PDA

View Full Version : There's no such thing as 'incorrect' English



DamaNegra
01-30-2008, 10:00 AM
Well, I used English because it's the language used in these forums, but you can swap English for the language of your choice.

I'm reading this book by John Lyons and came across this very interesting paragraph:

There are no absolute standards of correctness in language. We can say that a foreigner has made a mistake, if he says something that violates the rules immanent in the usage of native speakers. We can also say, if we wish, that a speaker of a non-standard social or regional dialect of English has spoken ungrammatically if his utterance violates the rules immanent in Standard English. But, in saying this, we are of course assuming that he was intending, or perhaps ought to be intending, to use Standard English. And that is itself an assumption which requires justification.

The text goes on to explain how common 'mistakes', such as double negatives, split infinitives and mixing up 'you and I' with 'you and me', are not really mistakes, only variances of language that make sense in their context.

So, what do you people think about this?

JoNightshade
01-30-2008, 10:29 AM
This is basically the difference between a prescriptive and descriptive approach to grammar. I think the trend now is to view grammarians as descriptive - the experts simply look at how the language is used and describe the underlying rules. In the past it was more of a prescriptive approach - the rules were created from the language, and then anything that deviated from that was "wrong."

At best, language is a set of agreed-upon symbols. As long as the group you're communicating with understands your symbols, mission accomplished.

Which doesn't mean I'm saying there's no such thing as "good" or "proper" English. There are, say, a certain set of rules agreed upon by most upper class businessmen. You need to know and use those rules if you want to make it with them. With writing it's even more strict.

In any case, an interesting little tidbit: when I was trained to teach English as a foreign language, I was told to tell my students to "give up" on attempting to achieve a "native speaker" level. This is virtually impossible. What is important is to reach the level at which you can communicate effectively - not by duplicating native speech, but by fitting into enough of the patterns that you can be easily understood. I found this to be very good advice.

DamaNegra
01-30-2008, 10:38 AM
This is basically the difference between a prescriptive and descriptive approach to grammar. I think the trend now is to view grammarians as descriptive - the experts simply look at how the language is used and describe the underlying rules. In the past it was more of a prescriptive approach - the rules were created from the language, and then anything that deviated from that was "wrong."

Yeah, the approach Lyons takes in his book is definitely descriptive. I had a course last semester; it was supposed to be a grammar course, but instead we studied cultural linguistics (I think that's what it was called) and then proceeded to disprove every single grammar rule ever created.

My problem with this approach is that some prescriptiveness is necessary to achieve standardization in language, not only for didactic but also for communicative purposes. Chaos would ensue if everyone were just left to his or her own devices and every single deviation from the original language were considered 'normal'. Especially in such a globalized world, where so many different cultures are trying to communicate using a 'standard' language, you can't just tell people: "well, you tried, which is important". I feel we're approaching an era in which effective communication is more than necessary, it's vital, and truly effective communication cannot be established unless people agree 100% on the meaning of symbols and words.

Which is just a dream, obviously, but still...

JoNightshade
01-30-2008, 10:52 AM
My problem with this approach is that some prescriptiveness is necessary to achieve standardization in language, not only for didactic but also for communicative purposes. Chaos would ensue if everyone were just left to his or her own devices and every single deviation from the original language were considered 'normal'. Especially in such a globalized world, where so many different cultures are trying to communicate using a 'standard' language, you can't just tell people: "well, you tried, which is important". I feel we're approaching an era in which effective communication is more than necessary, it's vital, and truly effective communication cannot be established unless people agree 100% on the meaning of symbols and words.

Oh, I totally agree. There has to be a happy medium, I think.

But of course, outside of basic education, there really is no force except the culture itself to keep the language in check. I think there are two major solidifying forces that act to this end:

1) Hollywood. Virtually anyone who learns English watches movies, movies, and more movies. In fact, "Hollywood English" has become so prevalent that it's wiping out a lot of American dialects.

2) The written word. More than anything else, I think the standardization of the written language keeps the spoken word in check. Our society depends so heavily upon written communication that it can't evolve rapidly - or else we'd lose comprehension of documents written 50, 100 years ago. And reading those same documents reinforces the language as it is.

Frankly, I find the whole thing fascinating. :)

JoNightshade
01-30-2008, 10:56 AM
Another thought... I was just reading another thread where one person got slammed (ouch) for spelling a couple of things incorrectly. The dude doing the slamming was being a jerk, but it serves to illustrate another way in which language standardizes itself: those outside common usage are often perceived as being less intelligent or capable. Less effective communication = less tolerance. To be accepted into a group, you must match the linguistic standards or risk being seen as an outsider.

DamaNegra
01-30-2008, 11:36 AM
2) The written word. More than anything else, I think the standardization of the written language keeps the spoken word in check. Our society depends so heavily upon written communication that it can't evolve rapidly - or else we'd lose comprehension of documents written 50, 100 years ago. And reading those same documents reinforces the language as it is.

Yeah, but more and more, in literature at least, authors are opting to write in their particular dialects. I recall a work, though I do not remember the author, that is written in 'Cuban'. That is, Spanish phonetically spelled out to be read as if you were speaking the Cuban dialect with a Cuban accent. I've tried reading works by Latin American authors and I don't always understand, even if we're both supposed to be speaking the same language. As I've seen it, the trend is opposing standardization rather than trying to achieve it. And that, at least for me, is kind of scary, because we're losing the ability to communicate effectively with each other. Just look at what happened to the Roman Empire after Latin evolved into completely different languages.

paprikapink
01-30-2008, 11:38 AM
I feel we're approaching an era in which effective communication is more than necessary, it's vital, and truly effective communication cannot be established unless people agree 100% on the meaning of symbols and words.

Which is just a dream, obviously, but still...

It is a nice dream, but you're right, 100% agreement is not attainable. What could help is if we agree to accept each other's sincere efforts at communication for what they are -- communications, not grammar tests or pissing contests. Standard English to me is a beautiful thing. A good sentence can stop me cold, take my breath away. I love the art of language.

But in other contexts, "R U OK?" might be the most important message you see all day. It'd be a shame to reject it because it ain't purty.

I don't think it will lead to chaos. Maybe a little chaos. But if every person on Earth could understand every other person, that'd be the kinda chaos I could get behind.

DamaNegra
01-30-2008, 11:39 AM
Another thought... I was just reading another thread where one person got slammed (ouch) for spelling a couple of things incorrectly. The dude doing the slamming was being a jerk, but it serves to illustrate another way in which language standardizes itself: those outside common usage are often perceived as being less intelligent or capable. Less effective communication = less tolerance. To be accepted into a group, you must match the linguistic standards or risk being seen as an outsider.

Yes, but at the same time, different, specific communities are developing whole new codes of communication. MySpace speak comes to mind, l33t, lolcat, SMS speak, Messenger, etc. While not all of these are used in everyday communication, I see numbers of people using MySpace, SMS and Messenger speak in school writings, and I even saw one person who *gasp* said 'lol' instead of laughing. Seriously. The guy never laughed; when he found something funny, his reaction was to say 'lol'.

donroc
01-30-2008, 04:16 PM
I remember this phrase for when someone mispronounced a word or two: "I am well read but poorly spoken."

Dawnstorm
01-30-2008, 04:28 PM
I've tried reading works by Latin American authors and I don't always understand, even if we're both supposed to be speaking the same language. As I've seen it, the trend is opposing standardization rather than trying to achieve it. And that, at least for me, is kind of scary because we're losing the ability to communicate effectively with each other.

To be honest, I feel that fiction isn't the place to insist on standardisation. Actually, I'd argue that fiction is the perfect place for non-standard usage, since immediate understanding isn't critical to the success of social situations.

I do think that "Standard English" is a good idea, because if communication fails we can always retreat to that position. But "standard English" isn't more correct than any other variation; it's just what we expect everyone to speak and understand in addition to their own dialect.

And to the extent that nobody speaks "standard English" in everyday life, "rules" are necessary, as a sort-of normaliser, a life belt.

The problem with prescriptivism, though, is that most of the rules that get attention are nitpicks that don't hinder communication in any way. If you insist that "the boy with which I go to school" is correct, but "the boy which I go to school with" isn't, you're not going to convince me that the issue is effective communication. There's something else, something about wearing tribal colours.

The thing is, if we realise that the rules that are taught are the usage of the alpha male and not the usage of the community, how should we deal with this? Much prescriptivism is just throwing your weight around. (And sometimes it's just lip service; the people who teach a rule often don't follow it themselves. Is this effective communication?)

Also notice that when we're teaching "correct language", we're focusing on production while pretty much ignoring reception. Students aren't sensitised to variety. Instead you tell them there's a right way and there's a wrong way, and if it isn't the right way, it's wrong. But listening, and awareness of difference, is an important aspect of communication. People whose dialect differs substantially from standard English already have to put more energy into learning these rules of "correctness". You're not helping the cause of communication by making them feel bad about the way they speak.

The more formal the situation, the more important the standard variety of the language will become, as it levels the field to some extent. This is important. But I also think it's a bit unfair to ignore that some varieties are further from the standard than others, and that those may have a harder time of "following the rules of standard English". There's a difference between the occasional slip and not making an effort at all.

I do think fiction is a great opportunity to get in touch with all sorts of dialects in a playful environment. Trying to work your way through a regional variety is - IMO - an exercise in communication. Trainspotting in RP? I don't think so.

A standard language is a good thing. Trying for it in the appropriate contexts (whatever they are) is polite. But dialects aren't "wrong". They have their own set of rules. They're not arbitrary. And - here's the thing - they tend to survive despite nobody teaching them.

Variation is pretty much a universal feature of any language.


Yes, but at the same time, different, specific communities are developing whole new codes of communication. MySpace speak comes to mind, l33t, lolcat, SMS speak, Messenger, etc. While not all of these are used in everyday communication, I see numbers of people using MySpace, SMS and Messenger speak in school writings, and I even saw one person who *gasp* said 'lol' instead of laughing. Seriously. The guy never laughed; when he found something funny, his reaction was to say 'lol'.

That's not much different from saying "ouch" instead of screaming, or gasping, though, is it? Or saying "hehe", rather than laughing. Again, the issue here isn't communication; it's that language spills over from typing to speaking, and you're not used to it yet. If you (not necessarily you) happen to think l33t is a bad thing, the association will also carry over.

I'd argue that using 133t in school writing is taking things a bit far; it's undermining the idea of standard English in the first place. But saying "lol" because you think something's funny, is nothing but a writing-habit translated to a speech-habit. I don't find that too surprising. (And I think I've heard people say "lol" before I ever heard of myspace; and I'm Austrian, so the people who said that would have spoken German.)

DamaNegra
01-30-2008, 06:36 PM
The problem with prescriptivism, though, is that most of the rules that get attention are nitpicks that don't hinder communication in any way. If you insist that "the boy with which I go to school" is correct, but "the boy which I go to school with" isn't, you're not going to convince me that the issue is effective communication. There's something else, something about wearing tribal colours.

Actually, my concern is not with grammar but with vocabulary. I'll use Spanish as an example since it is (obviously) the language I'm most familiar with.

The differences in grammar in Spanish are not really consequential as far as sentence structure goes. You can comfortably say "Ayer fui al cine", "Al cine fui ayer", "Fui al cine ayer", "Fui ayer al cine", "Ayer al cine fui", etc., and the sentence would mean exactly the same. Spanish is very flexible when it comes to sentence structure and, as long as the correct words are there (ayer, cine, fui), then everybody's going to understand what you mean. Heck, you can even say "ayer fui cine", even though that is grammatically incorrect in terms of Standard Spanish, but people are still going to understand you. We can argue that accents in written Spanish are important because they can greatly change the meaning of words (Papa means pope, papa means potato and papá means father, etc.).

The challenges in effective communication come when the difference is lexical instead of grammatical. Hell, I have trouble getting people to understand me even though I'm still in the same country. People in the north barely understand people in the center and south because of vocabulary. In the center, people barely understand those from the north and south, and so on. Example: "Busqué mi lápiz" means either 'I found my pencil' or 'I searched for my pencil' in the south. In the center and north, the sentence means only 'I searched for my pencil', which leads to confusion if people from the south use it meaning they've found the pencil, because everyone else will assume they're still pencil-less.
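The failure mode described above can be sketched as a small lookup: each dialect assigns a word a set of senses, and any sense the speaker intends that falls outside the listener's set is lost without any visible signal. (A toy illustration only; the region names and sense tables below are invented for the example, not real lexicographic data.)

```python
# Invented sense tables for the 'buscar' example; not real lexicography.
SENSES = {
    "north":  {"buscar": {"to search for"}},
    "center": {"buscar": {"to search for"}},
    "south":  {"buscar": {"to search for", "to find"}},
}

def readings(word, speaker_region, listener_region):
    """Split the speaker's possible meanings into those the listener
    shares and those that are silently lost on them."""
    intended = SENSES[speaker_region].get(word, set())
    expected = SENSES[listener_region].get(word, set())
    return {"shared": intended & expected, "lost": intended - expected}

print(readings("buscar", "south", "north"))
# the 'to find' reading never reaches a northern listener,
# and nothing in the sentence itself signals the gap
```

The point of the sketch is that the "lost" set is invisible to both parties: the sentence is well-formed in both dialects, so neither side gets a cue that a sense failed to transfer.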


But I also think it's a bit unfair to ignore that some varieties are further from the standard than others, and that those may have a harder time of "following the rules of standard English". There's a difference between the occasional slip and not making an effort at all.

But here we should ask ourselves, why are they having a hard time following the rules of standard English? I'm not talking about following them in a relaxed, friendly environment, but about people who have trouble following the rules of standard English in an environment where effective communication is necessary. Regional variations are okay when you're sure people are understanding you, but that is not going to be the case all the time. If people cannot agree on the correct meaning of words (as with the previous example, whether 'buscar' should only mean 'to search' or should mean both things depending on the context), then how can anyone be sure the communication is effective? And, since there is nothing orthographically or grammatically wrong with the sentence, how can a non-speaker even pinpoint that there is something different in the sentence and that he or she may not be understanding correctly? More so if no one sits down and says: "Okay, let us first agree on terminology before we start talking business".


A standard language is a good thing. Trying for it in the appropriate contexts (whatever they are) is polite. But dialects aren't "wrong". They have their own set of rules. They're not arbitrary. And - here's the thing - they tend to survive despite nobody teaching them.

I never said dialects were 'wrong'. For intracommunity communicative purposes, they're great. I'm merely arguing against the extreme relaxing of a standardized language. Descriptive linguistics is okay, as long as you don't take it to the extreme (which I have seen some people do) of completely dismissing the necessity of a Standard language agreed upon by all its speakers.



I'd argue that using 133t in school writing is taking things a bit far; it's undermining the idea of standard English in the first place. But saying "lol" because you think something's funny, is nothing but a writing-habit translated to a speech-habit. I don't find that too surprising. (And I think I've heard people say "lol" before I ever heard of myspace; and I'm Austrian, so the people who said that would have spoken German.)

My issue with this guy is that he couldn't laugh. No matter how hard he tried, he'd forgotten how to laugh in favor of a word, because he'd spent so much time playing MMORPGs (Massive Multiplayer Online Role Playing Games, if anyone's curious) that he hadn't had any 'normal' or, shall we say, 'verbal' interaction with people in a long time. That in itself is concerning, but it's a topic for another thread. And yes, even though he said 'lol', he speaks Spanish.

HeronW
01-30-2008, 07:43 PM
Sometimes lack of effective communication is cultural: you have people who insist on saying 'birfday' when there isn't an f to be found in the word. They know it, they don't care; they are proud of their lack of conformity, or of their lack of education in a system that supposedly ignores them, etc.

Then there's the txt msg novel--like it's too difficult for these 'authors' to use whole words. Jeez, lazy much?

paprikapink
01-30-2008, 08:57 PM
Then there's the txt msg novel--like it's too difficult for these 'authors' to use whole words. Jeez, lazy much?

I don't think it's about difficulty or laziness. I think it's about what they want to say, how they want to say it, and who they are saying it to. They aren't talking to me (or I guess you either, HW) but that doesn't mean it doesn't resonate with someone somewhere.

DamaNegra
01-30-2008, 10:39 PM
But are there txt msg novels in, let's say, Western languages? Because as far as I know, the phenomenon originated in Japan, and the novels were in Japanese, which is a whole different animal, and I don't know how well it translates to writing in a reduced screen space. Also, we have to take into account that the Japanese use their cellphones like we use our laptops: blog surfing, news reading, watching TV programs, e-mail and all that. Their screens are bigger and, let's face it, their writing system is much different, so I can see the txt msg novel making sense in that specific cultural context.

Dawnstorm
01-30-2008, 11:11 PM
The challenges in effective communication come when the difference is lexical instead of grammatical. Hell, I have trouble getting people to understand me even though I'm still in the same country. People in the north barely understand people in the center and south because of vocabulary. In the center, people barely understand those from the north and south, and so on. Example: "Busqué mi lápiz" means either 'I found my pencil' or 'I searched for my pencil' in the south. In the center and north, the sentence means only 'I searched for my pencil', which leads to confusion if people from the south use it meaning they've found the pencil, because everyone else will assume they're still pencil-less.

Is there such a thing as a national standard?

Austria is a fairly small country, but just about every region has its own dialect. The most difficulty comes from phonetics (pronunciation), followed by the lexicon. We all learn a common language in school. There are regional differences in the lexicon, though, and they tend to be downplayed, with several results.

For example, in Austria we have a version of the quiz show "Who wants to be a millionaire". Up to €500, the questions are supposed to be easy. But a lot of them are based on the Viennese dialect. If you're not from the capital, you're at a disadvantage. Now, through TV etc. a lot of people are still familiar with the Viennese accent, so it's not that much of an issue. And since most candidates aren't from Vienna, it's not so much a disadvantage as it is an advantage for the Viennese, or those who have lived in Vienna for some time.

We can play the game on a national level, too, considering Germany, Austria and Switzerland. Try selling sausages in a lake resort to German tourists. It usually involves a lot of pointing and shaking of the head. The difference between bacon and ham isn't easy either (but that's already difficult within Austrian regions).

The thing is, though, that the lexicon isn't necessarily standardised; and official dictionaries (e.g. Duden) often aren't precise enough to be of much help.

But this is inherently a social issue. How do you determine the standard? Usually, there are two words, a regional one and a "standard" one. Problems arise when there is no standard word; i.e. when the concepts are unevenly distributed across the terms. Then people often aren't aware that there are differences in the first place.

This kind of mapping usually goes ignored in schools, and there's no real public awareness either. Grocers in lake resorts will most definitely be aware of the difference, though. And they'll be careful. They get along, so the problem probably doesn't spill over.

But without a public awareness of difference, there is no public problem, and therefore no public solution. The best you can hope for is specialised language (such as business standards within the import and export of food, for sausages). But specialised language ("jargon") is pretty different from a "standard language" code. Dictionaries only take you so far.


But here we should ask ourselves, why are they having a hard time following the rules of standard English? I'm not talking about following them in a relaxed, friendly environment, but about people who have trouble following the rules of standard English in an environment where effective communication is necessary. Regional variations are okay when you're sure people are understanding you, but that is not going to be the case all the time. If people cannot agree on the correct meaning of words (as with the previous example, whether 'buscar' should only mean 'to search' or should mean both things depending on the context), then how can anyone be sure the communication is effective? And, since there is nothing orthographically or grammatically wrong with the sentence, how can a non-speaker even pinpoint that there is something different in the sentence and that he or she may not be understanding correctly? More so if no one sits down and says: "Okay, let us first agree on terminology before we start talking business".

Well, that's an issue for sociolinguists, I think. I suspect (but have no proof) that economic and political power of regions gives preference to certain dialects, so that the standard language is modelled after them. (Immigrant influence after a standard has been established is a different topic.)


I never said dialects were 'wrong'. For intracommunity communicative purposes, they're great. I'm merely arguing against the extreme relaxing of a standardized language. Descriptive linguistics is okay, as long as you don't take it to the extreme (which I have seen some people do) of completely dismissing the necessity of a Standard language agreed upon by all its speakers.

Well, see, there is a standard language, but even "standard language" changes, because it's usually not a set of statutes but a set of abstractions based on super-regional media usage, for example. It's less a matter of right/wrong than it is a common ground of all the dialects. Which is how such publications as, say, the OED can change their mind about what's standard. They're not fickle; a certain usage became more widespread.

The OED is pretty much an accepted authority on standard English in GB, but they're descriptivist. They don't set the rules; they set the methodology to determine what constitutes significant usage. They can do so because they've got a large corpus to look at and expand.
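That corpus-based methodology can be sketched in miniature: count competing variants in a corpus and treat a form as established once its share of usage crosses some threshold. (A toy sketch only, not the OED's actual procedure; the mini-corpus and the 30% threshold are invented for illustration.)

```python
from collections import Counter

def established_variants(corpus_tokens, variants, threshold=0.30):
    """Toy descriptivist check: a variant counts as 'established' once it
    makes up at least `threshold` of all occurrences of the competing
    forms in the corpus. The threshold is an invented parameter."""
    counts = Counter(t for t in corpus_tokens if t in variants)
    total = sum(counts.values())
    if total == 0:
        return set()
    return {v for v in variants if counts[v] / total >= threshold}

# Invented mini-corpus: 'alright' competing with 'all right'
corpus = ["alright"] * 4 + ["all right"] * 6 + ["filler"] * 90
print(established_variants(corpus, {"alright", "all right"}))
# each form holds well over 30% of the combined usage,
# so a descriptivist would record both
```

The design point is that nothing in the procedure says which form is "right"; the corpus share alone decides what gets recorded, which is exactly the descriptivist stance described above.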


My issue with this guy is that he couldn't laugh. No matter how hard he tried, he'd forgotten how to laugh in favor of a word, because he'd spent so much time playing MMORPGs (Massive Multiplayer Online Role Playing Games, if anyone's curious) that he hadn't had any 'normal' or, shall we say, 'verbal' interaction with people in a long time. That in itself is concerning, but it's a topic for another thread. And yes, even though he said 'lol', he speaks Spanish.

You know him, I don't. MMORPGs may be the source of "lol", but they needn't be the reason for it. There's sociological research on the way that culture shapes supposedly spontaneous expressions, such as laughter. It's quite possible, for example, that being able to say "lol" is a relief; therapy rather than the disease. Job stress and the fear of being unprofessional when laughing, for example (wild guess; this could be total nonsense).

That said, the influence of internet interaction on face-to-face situations is very interesting.

JBI
01-30-2008, 11:48 PM
There is a Canadian poet named bill bissett (he spells it lowercase) who uses awkward slang spellings such as 2 instead of two, too or to, yu instead of you, and th instead of the. All that really does is make the work less readable, not more profound.

Here is a sample

him her self is alredee enuff
is alredee fine is alredee all ther
can go now can b now she he is
sew flexibul now who 2 trust or
2 find discovr

I just find that annoying to read. I spend time converting the words into English instead of focusing on the meaning. There is no "incorrect", but there is definitely a proper/user-friendly. Everyone bends the rules, of course, from Shakespeare to today, but at least they are readable. Heavy use of junk grammar, spelling, and pointless dialects just cuts the prose/verses up into shreds of unreadable rubbish.

JoNightshade
01-31-2008, 02:27 AM
Maybe this is fundamentally a globalization problem. In the past, languages developed out of fairly isolated communities. There was no real concern about this because, well, they were isolated. Selected members of the population could be educated in other languages and thus communication between people groups continued.

Today, everyone is connected. We have to do business with one another. In countries where the process of "globalization" is taking a bit longer, you have these warring regional dialects and communication issues. I have to think that as these places become more linked in terms of economy, the dialects will filter out just as they (mostly) have in the United States.

All the same, that doesn't stop language from evolving. We still have new terms and change old ones to suit our purposes.

Perhaps most interesting is that now, languages are not evolving in a geographic sense. Rather, languages are being created based on common interests and pursuits. Example: my husband is a "gamer" and a computer programmer. Since I've married him, I've slowly become fluent in what I would definitely call a dialect or subset of English. This is a community highly literate in terms of internet usage and anything technological. I'm not just referring to "1337-speak" here... that's included, but it goes WAY beyond that. When they get together and talk, outsiders - say, our parents - have absolutely no clue what we're talking about. Take the word "Tanking," for instance. Most people think of a tank... so what is tanking? It refers to a big strong guy in a group who takes the brunt of damage when an enemy attacks, protecting everyone else. This applies to MMORPGs, videogames, movies, boardgames, and even to real life, say watching a movie or something.

Another example is lolcatz, which has clearly developed its own grammar system. When we're feeling affectionate, my husband and I frequently switch from regular English grammar to lolcatz grammar. When I'm browsing icanhascheezburger.com (main lolcatz site) and looking at new submissions, my brain singles out the captions which aren't "correct" in terms of lolcatz grammar. You can see that some people "get" it while others are just "pretending," much like a child would mimic the sound of a foreign language.

Ruv Draba
01-31-2008, 11:40 AM
Even a little bit of international travel shows that we can communicate quite functionally with a small local lexicon, not much knowledge of grammar and execrable pronunciation.

The issue of 'correct' language isn't a functional one then; I think it's a social one. To use language familiar to and accepted by your listeners is to say that 'I am of your tribe' or, if you are obviously a visitor, 'I understand and honour your tribe'.

Of course, if you're of the tribe already and don't use the tribe's approved language then you're saying 'I'm of the tribe, but I don't honour my tribe'. For those people for whom tribal tradition and belonging are important, this is close to personal insult.

With languages now spanning nations it can be a constant strain to share a lot of language but not a lot of tribe. While I don't consider myself very tribal, there are atribalisms that irritate me greatly -- like the habit common in young US English-speakers of using 'then' instead of 'than'.

Of course, these atribalisms are emergently tribalisms of their own - they're just not of my tribe.

I agree with Dawnstorm: fiction is a great place to explore both language and the tribal affiliations it imputes.

paprikapink
01-31-2008, 09:12 PM
The issue of 'correct' language isn't a functional one then; I think it's a social one. To use language familiar to and accepted by your listeners is to say that 'I am of your tribe' or, if you are obviously a visitor, 'I understand and honour your tribe'.

Of course, if you're of the tribe already and don't use the tribe's approved language then you're saying 'I'm of the tribe, but I don't honour my tribe'. For those people for whom tribal tradition and belonging are important, this is close to personal insult.


This seems so intuitive stated here. And yet I never thought of it just this way. Brilliant!

ColoradoGuy
02-01-2008, 02:40 AM
Of course, if you're of the tribe already and don't use the tribe's approved language then you're saying 'I'm of the tribe, but I don't honour my tribe'. For those people for whom tribal tradition and belonging are important, this is close to personal insult.
Grammarians can be quite tribal. Just try to split an infinitive in front of one and watch the veins in his head twitch.

Ruv Draba
02-03-2008, 02:01 AM
Grammarians can be quite tribal. Just try to split an infinitive in front of one and watch the veins in his head twitch.
All tribes have their watchdogs, guardians and keepers of traditions. This runs by personality type, which is also often a predictor of profession.

In a Myers-Briggs/Keirseyan (http://www.personalitypage.com/four-temps.html) sense, SJ types are especially adept at zooming in on what's wrong. A lot of SJs notice grammar, spelling and punctuation solecisms like fingernails on a blackboard. Many SJs need to proof-read the way that some people need to pick lint. NT types can also be quite picky (but often more about ideas and definitions than the actual expression), while the SP types are likely to say 'You know what I meant', and the NFs will often say 'Stop trying to constrain my expression'. :D

StephanieFox
03-03-2008, 12:25 AM
Feh!

ColoradoGuy
03-03-2008, 06:58 AM
Feh!
Que?

maxmordon
03-05-2008, 12:59 AM
1) Hollywood. Virtually anyone who learns English watches movies, movies, and more movies. In fact, "Hollywood English" has become so prevalent that it's wiping out a lot of American dialects. :)

This is really subtly affecting Latin America. Since Mexico until fairly recently had a monopoly on dubbing, a lot of kids pick up the "cantadito" from Mexican Spanish, and some slang too.

maxmordon
03-05-2008, 01:07 AM
I was wondering if you people think it is a good idea to have an English academy to regulate the language and dialects, as happens with Spanish, Italian, French, German, Swedish and so on.

ColoradoGuy
03-05-2008, 03:41 AM
I was wondering if you people think it is a good idea to have an English academy to regulate the language and dialects, as happens with Spanish, Italian, French, German, Swedish and so on.
I've always thought one of the strengths of English is that we don't have such an academy. It makes the language more supple and vibrant, I think. Also, the long reach of the British Empire in the last century left behind many authors in the post-colonial world who write in English by choice, adding still more richness to the language. So I sorta like the chaos.

Flay
03-05-2008, 10:15 AM
I was wondering if you people think it is a good idea to have an English academy to regulate the language and dialects, as happens with Spanish, Italian, French, German, Swedish and so on.
No. Never. (For the same reasons given by ColoradoGuy.)

z10
03-27-2008, 05:47 AM
It's a good thing that Shakespeare didn't have a problem with 'incorrect' English.

Axelle
03-27-2008, 05:14 PM
I was told to tell my students to "give up" on attempting to achieve a "native speaker" level.

Dang. Does that mean I'll never be able to write as well as a native speaker ? I think I'm going to shoot myself, now.
Seriously, I'm not sure I agree entirely. My brother has been living in England for a few years now, and sometimes people don't even notice he's French. Actually that's kinda funny because, at his work, he was nicknamed "Frenchie" by some of the staff. And once, a newcomer asked him, "hey, Frenchie, why do they call you Frenchie ?" Duh. Why, I wonder.


My problem with this approach is that some prescriptiveness is necessary to achieve standardization in language, not only for didactic but also for communicative purposes.

True enough. Take orthography, for instance. For a long while there was no standardization of orthography; people wrote as they pleased. But when I had to read texts in Old French at school, the weird orthography (I'm not going to say "misspelling", obviously) put me off.


there are atribalisms that irritate me greatly -- like the habit common in young US English-speakers of using 'then' instead of 'than'.

Ah, yes. After a while, actually, I began to wonder. I stumbled upon that one so often, I wasn't sure anymore whether "then" couldn't be used instead of "than". Now I'll know.


I was wondering if you people think it is a good idea to have an English academy to regulate the language and dialects, as happens with Spanish, Italian, French, German, Swedish and so on.

Well, duh. Looks like so far the English have managed pretty well without an academy. Seriously, we don't pay much attention to what they say at the academy. For instance, they aren't very happy with people saying "e-mail" instead of "courriel", but people keep doing it regardless. All the difference it makes is that you might not find the word "e-mail" in a dictionary, since it's not officially a French word - but frankly, who cares ?
Dunno if you've read Stephen Clarke, but in one of his books he mentioned that, and he thought the lack of an academy is a strength of the English language, because it makes it more flexible and better able to evolve.

Bartholomew
04-09-2008, 09:02 AM
I'm a fairly draconian grammarian, but I think standardized spelling is over-rated.

Honestly, does it matter if you use theatre, theater, or theter?

mscelina
04-09-2008, 09:15 AM
Is that a serious question?

Standardized spelling is important because of the whole concept of an 'agreed-upon series of symbols.' It makes a helluva lot of difference how you spell things. Look at 'to', 'too' and 'two', for example. In the example you cited, 'theater' is primarily American, 'theatre' is primarily British (and those of us who spell it this way in the States get called snobs for it), and 'theter' isn't even a word, save for a misspelled one. What if in Canada the number '1' were written as a '2' instead? It represents the same thing, but if you're an accountant you'd better believe that it's going to give you fits.

Ease of use is no reason to abandon the rules. The rules are there for a reason. And, as writers, it is our responsibility to know the rules, understand the rules, and implement them correctly. *and NO I'm not talking about dramatic dialect usage*

Keyan
04-09-2008, 10:10 PM
I'm a fairly draconian grammarian, but I think standardized spelling is over-rated.

Honestly, does it matter if you use theatre, theater, or theter?

Standard spelling makes reading easier and faster. If it's just a few words that are non-standard, it's fine. But most people read by the "shape" of the word, and if that shape isn't fixed, it slows one's reading speed.

Try reading anything written in the Tudor era, before spellings were standardized. You almost have to sound out each word mentally to understand the text.

Reeding is a lott sloer iff yoo hav to trie and desyfer the meening fromm werds thatt loock lyke thiss.

StephanieFox
04-10-2008, 01:39 AM
Why can't the English teach their children how to speak?
Norwegians learn Norwegian; the Greeks have taught their
Greek. In France every Frenchman knows
his language from "A" to "Zed".
The French never care what they do, actually,
as long as they pronounce it properly.
Arabians learn Arabian with the speed of summer lightning.
And Hebrews learn it backwards,
which is absolutely frightening.
But use proper English, you're regarded as a freak.
Why can't the English,
Why can't the English
Learn To Speak?

Ruv Draba
04-10-2008, 03:18 AM
I was wondering if you people think it is a good idea to have an English academy to regulate the language and dialects.
It's far too late for that, I think. The US would doubtless pay the academy's bills and thereby gain legitimacy for propagating spellings and usage that have otherwise propagated by stealth anyway.

(Sorry, but as an Australian whose language's colour has gradually been bleached modernised modernized by too much US content internationalisation internationalization, I felt obliged obligated to make this comment deliver this feedback. :tongue)

RG570
04-10-2008, 07:12 PM
Has poor discipline and attention span hit such a critical mass that now there is "serious" discussion about this subject?

Of course there is correct English. If you do not want to learn it, then invent your own language and call it something else. Honestly, the discussions that come up these days are very frightening.

Oh, someone will probably name-drop some postmodern theorist to support "let's all invent our own spellings", but I doubt this is what any of them meant.

Keyan
04-10-2008, 11:34 PM
I read a New Scientist article the other day that suggested that English was (again) fragmenting into dialects... largely because the number of people who speak English as a second language is outnumbering the number who speak it as their mother tongue.

It seems to me that standardized English was the result of mass communications and rapid transport - it gradually led to the old dialects being subsumed into RP. But now, with English being the most widely spoken language across the globe, the fragmentation is overwhelming standardization.

What this means for writers I'm not sure. It may imply a lot of niche markets. It may also mean there are markets out there for books that have good plots, interesting characters, but are written in English that's simple enough to be accessible to those who don't necessarily think in the language. And maybe for graphic novels.

Dawnstorm
04-18-2008, 07:35 PM
I just came across this (http://www.ateg.org/conferences/c10/yates.htm) very interesting article. It's a sociolinguistic take from the perspective of a teacher. Very comprehensive and well thought out. I had to share that one.

ColoradoGuy
04-18-2008, 08:21 PM
Interesting article. I agree with the authors that language is always value-laden, in spite of what linguists may think or wish were so. To me that's another way of pointing out that communicating with language goes far beyond the actual meaning of the words themselves; the scaffolding of how the words are used is also key. I also noticed a whiff of Sapir-Whorf (http://www.absolutewrite.com/forums/showpost.php?p=978535&postcount=638) in there, that speech usage determines how we think (or its variant, Sapir-Worf: learn to speak like a Klingon and you will think like a Klingon).

Dawnstorm
04-19-2008, 10:12 AM
Interesting. I didn't get any Sapir-Whorf vibes. Actually, I think their main point (teach the standard and thereby give speakers the confidence to use their own dialects, too) implies that teaching the standard won't change the way you think much. More like diversion tactics: offering a platform of social negotiation, primarily about what sort of language use is appropriate, but also - secondarily, and not really made clear in the article (only in footnote 4) - about how the standard's supposed to look tomorrow.

But I haven't thought this article through enough, yet, so there's probably plenty I'm missing.

What parts reminded you of Sapir-Whorf?

ColoradoGuy
04-19-2008, 06:33 PM
What parts reminded you of Sapir-Whorf?
This part, which I happen to agree with to the extent the author writes:

Finally, because language is so closely related to thought, studying our language patterns, our grammar can give us insight into our own ways of thinking. If in some important sense we are what we say (and write), then examining the principles through which we express our meanings can help us understand ourselves as well as others. (Lunsford & Connors, 1995, pp. 156-7)
(I do realize he's quoting someone else here)

Dawnstorm
04-19-2008, 10:06 PM
(I do realize he's quoting someone else here)

Ah, I see. Actually, the quote is an example of what they explicitly refute:


It is interesting to note the argument beginning with finally. Although without the overt racism of Simon, the same underlying assumption is there; namely, we are what we say and write. If a student does not control Standard English, then there is something fundamentally wrong with her reasoning process. Skretta is right to object to such a reason for learning the standard.

I think the difference between their take and the Sapir-Whorf take is that they'd point towards social circumstance as influencing both thought and speech; or in other words, that Sapir-Whorf are ignoring a couple of hidden variables.

I'm not sure, yet, if I agree with them.

ColoradoGuy
04-20-2008, 10:51 PM
One aspect of teaching "correct" grammar seems obvious to me, so obvious that we ignore it -- the adverse social effects of not using accepted speech forms. Unfair as it is, people judge you -- immediately -- according to your speech. Like having poor table manners, it is a social disability to lack the ear to tell when "correct" speech is called for and when more lax constructions are acceptable. Or so I tell my kids.

Ruv Draba
04-21-2008, 11:53 AM
I know that I'm not well equipped to evaluate papers like this one. It's not my field, and I'll be the first to admit that my science forebrain gets allergies to reading scholarly papers written about imponderables like 'what's good' in language, without seeing a contestable definition of that good, grounded against some set of social objectives. (Did I fall in through the wall from the 'absolute values' thread? Hmm. Maybe. :tongue)

But I'm wondering to myself: these people are writing about my language. I'm a user of it, a consumer of it, and also by virtue of my profession, a vendor and sometime teacher of it. Yates and Kenkel doubtless didn't write their paper with me in mind, but since it's my language just as much as theirs, I'm wondering why the heck not? Or put another way: when do the language owners rather than the language technocrats get to comment? Or must we always be seen as subjects of study in these papers, rather than partners?

I took exception to the self-serving tone, which I'd paraphrase as "We, the elite who understand that grammatical prescriptivism is tosh, but can't persuade the hoi polloi that it is...", and the several dismissive value-statements about other scholars, and a language user or two... and overall it left me concerned. Do they really expect their audience to warm to that tone? If so, what does that say about them? Or their audience? It gave me shudders. (It's not that I felt they were pointed in a stupid direction - it's that they seemed to have to appease an audience that is.)

When Information Technology staff adopt that posture in projects (and goodness knows, many do) - seeing themselves as socially superior to (and not simply better trained and educated than) those they serve - then we typically get the case of the Inmates running the Asylum. Projects fail to deliver value, serving the vested interests of their custodians rather than the customers' interests they hold in trust. When biotechnologists take that tone (about, say, Genetically Modified Organisms or stem-cell research), the public naturally gets suspicious and affronted.

I don't read many scholarly papers about language, so I don't know if that condescending tone is typical, and whether that attitude must be appeased or appealed to. But if it is, then I'm worried. Are these guys even on the planet still? If they consider themselves socially superior to everyone else then what do they see as their accountability for the pronouncements they make about social good? And to whom do they hold it?

Anyway, back to content...

I understand (from the good examples in this document and my own experience) why prescriptions in grammar are frequently self-contradictory. I agree that it can make die-hard "iron fist" prescriptivists look a bit silly. But equally bizarre to me is the idea that a scholarly elite should look down on the culture they're entrusted to serve, just because its cultural prescriptions bite in language as they do everywhere else. All cultures have such prescriptions. They shouldn't need defending. And if they need explaining then surely the explanation must extend far beyond the idiosyncrasies of spoken language to the quirks in how we greet one another, break bad news, court one another and so forth. So this paper got me a big :Huh: I understand what they're saying - just not why they need to argue for it.

As a member of said culture who knows enough grammar to be dangerous but not enough to be scholarly, but who learned from childhood (like every other kid) to vary his grammar and lexicon to suit his audience I confesses to being bewildered, perplexed, worried and borderline offended, I does. :tongue:tongue:tongue

Dawnstorm
04-21-2008, 12:19 PM
One aspect of teaching "correct" grammar seems obvious to me, so obvious that we ignore it -- the adverse social effects of not using accepted speech forms. Unfair as it is, people judge you -- immediately -- according to your speech. Like having poor table manners, it is a social disability to lack the ear to tell when "correct" speech is called for and when more lax constructions are acceptable. Or so I tell my kids.

I agree with that. What I support is a "better safe than sorry" method in using grammar, when in formal situations. For the institutional context of writing this means, for example, standard English for query letters, anything for the story.

That's why I found the article so interesting. Instead of focussing on "appropriateness", they're focussing on speaker confidence, with the hypothesis that teaching standard English (in a prescriptivist manner that does not reference either "correctness" [iron fist] or "appropriateness" [velvet glove], but builds a basic model of language to deviate from) increases speaker confidence. Increasing speaker confidence, so they hypothesise, will in turn decrease the fierceness of the judgement:


To suggest that linguistic security is the appropriate goal of English language teaching is not to suggest that we do not promote linguistic tolerance. Of course we do. However, we believe, following Cameron, that speakers will not abandon making value judgements on linguistic form. In the real world, language is never value neutral. Moreover, a consequence of increased linguistic security will be decreased linguistic intolerance. Finally, a focus on the development of linguistic security will help teachers avoid the pitfalls of the velvet glove prescriptivism of appropriateness; instead, the goal will be for students to have enough confidence in their control of the standard to know that they can choose to follow the norms or can choose to be appropriately inappropriate.

That, I suppose, is the attitude behind the line "You must know the rules to break them," one encounters so often on the web. I've always been uncomfortable with the line, because I focussed on the "rule aspect", what they refer to as "correctness/appropriateness" in this paper. But to approach it from the linguistic-security angle is interesting. This makes standard English not a value in its own right, but a psychological trick. A mental set of crutches, so to say.

Tools, not rules. Insecure speakers will find more utility in the professed rules of standard English than secure ones. (I'll have to reference footnote 4 in the text, at that point, because I think that's vital: " This is a problem for even "velvet glove" prescriptivism which presumes clear delineated language rules in all social situations. We must remember the Standard is always being contested.")

I find that article interesting as it gives a pedagogic frame to the teaching of rules that helps make sense of many interesting phenomena, such as their "Betty". Or the case of the grammar nazi hypocrite (GNH) (http://really-really.blogspot.com/2004/12/do-as-i-say-not-as-i-do.html).

Ruv Draba
04-21-2008, 01:12 PM
Tools, not rules.
Bless you, SS Dawnstorm and all who sail in you!

But if it's 'tools not rules', then our linguistic academics are engineers, architects and quality assurance advisors of language, not door-bitches. Their role is advisory, and they have an ethical duty of care to support the society that supports them - and its values. They are not the arbiters of those values and if they want to comment on them, it must be respectfully as a citizen, not patriarchally as an influential elite.

Dawnstorm
04-21-2008, 01:26 PM
As a member of said culture who knows enough grammar to be dangerous but not enough to be scholarly, but who learned from childhood (like every other kid) to vary his grammar and lexicon to suit his audience I confesses to being bewildered, perplexed, worried and borderline offended, I does. :tongue:tongue:tongue

Heh. I'm quite the opposite. An outsider to said culture with too much theoretical knowledge about grammar not to be scholarly. May well be my problem. ;)

This is an opinion piece, not a study. They pretty much sound like that, I figure. I'm used to it. My filters kick in early, and I try to get straight at the content. I didn't even notice the tone anymore.


But I'm wondering to myself: these people are writing about my language. I'm a user of it, a consumer of it, and also by virtue of my profession, a vendor and sometime teacher of it. Yates and Kenkel doubtless didn't write their paper with me in mind, but since it's my language just as much as theirs, I'm wondering why the heck not? Or put another way: when do the language owners rather than the language technocrats get to comment? Or must we always be seen as subjects of study in these papers, rather than partners?

This paper was delivered at the ATEG conference: Assembly for Teaching English Grammar. It's teachers talking to each other. What tone do you expect?

Second, it's an American affair, and it's so internal that their website (http://www.ateg.org/index.php) merely says it's a "national forum for discussing the teaching of grammar" without bothering to specify the nation. They're not expecting a foreign audience.


I took exception to the self-serving tone which I'd paraphrase as "We, the elite who understand that grammatical prescriptivism is tosh, but can't persuade the hoi polloi that it is..." and the several dismissive value-statements about other scholars, and a language user or two... and over all it left me concerned. Do they really expect their audience to warm to that tone? If so, what does that say about them? Or their audience? It gave me shudders.

When Information Technology staff adopt that posture in projects (and goodness knows, many do) - seeing themselves as socially superior to (and not simply better trained and educated than) those they serve, then we typically get the case of the Inmates running the Asylum. Projects fail to deliver value, but rather serve the vested interests of their custodians, rather than their customers' interests, which they hold in trust. When biotechnologists take that tone (about say, Genetically Modified Organisms or stem-cell research), the public naturally gets suspicious and affronted.

This is English teachers claiming that the goal of teaching grammar should be to build linguistic confidence in students, rather than to instill "correct" or "appropriate" language behaviour. Its drive is pedagogical rather than linguistic. And it's an opinion piece.

Considering this, what is worrying you? I didn't actually get the impression that the authors except themselves. The title of the text, "We're Prescriptivists. Isn't Everyone?", is a bit odd. On the surface it sounds like sarcasm, but after reading the entire article, I'm not so sure. I do think they mean that. And I do think they try to figure out how to teach English when what they teach changes what they teach. I think the tone might be a result of having to leave out stuff (such as the discussion behind "footnote 4" on the contested status of the standard).


I don't read many scholarly papers about language, so I don't know if that condescending tone is typical. But if it is, then I'm worried. Are these guys even on the planet still? If they consider themselves socially superior to everyone else then what do they see as their accountability for the pronouncements they make about social good? And to whom do they hold it?

I'm not convinced they consider themselves "socially superior". I think the impression you get might stem from their job as "teachers". As I said, it's not primarily a linguistic article; it's a pedagogic one. This does include a "teacher" and a "student" as a structural property. And since they hold the role of teacher...

I much prefer this article to the constant professing of the death of language because people misplace their prepositions or start sentences with conjunctions.


I understand (from the good examples in this document and my own experience) why prescriptions in grammar are frequently self-contradictory. I agree that it can make die-hard "iron fist" prescriptivists look a bit silly. But equally bizarre to me is the idea that a scholarly elite should look down on the culture whom they're entrusted to serve, just because its cultural prescriptions bite in language as they do everywhere else. All cultures have such prescriptions. They shouldn't need defending. And if they need explaining then surely the explanation must extend far beyond the idiosyncracies of spoken language to the quirks in how we greet one another, break bad news, court one another and so forth. So this paper got me a big :Huh:

What I find interesting is that you're taking from the article a different set of participants in the game than I do. Or at least that's how I see it.

You: There's the elite. And then there's the culture, which includes the prescriptivists.

Me: There's the culture, which includes the prescriptivists (mavens), on the one hand, and the descriptivists (they mention Nunberg and Pinker) on the other, each fulfilling different roles and clashing occasionally, most often in the media. And then there are teachers, whose job involves prescriptions, but who may be sympathetic to the descriptivist side as well. These are teachers trying to find their place in this conflict. They're basically saying: Prescriptivist teachers? Descriptivist teachers? Bah, humbug. People will make value judgments anyway. The question is, should we, or should we leave it to our students. The former leads into the prescriptivist trap, the latter leaves the students with no foundation. What to do?

Genetics is different. You don't get the general public to tell them how to splice genes. It's esoteric knowledge. Language isn't. Which makes teaching so convoluted in the first place.

***

I'm still processing this article. But to answer your question about the tone in scholarly linguistics papers: in the grammar wars most texts are way fiercer, but most linguistics texts don't play in that field.

SPMiller
04-21-2008, 01:32 PM
Orthographical systems exist to facilitate the recording of speech.

We resist change to these systems as the speech they represent changes. We often refuse to respell loanwords even while we're mangling the pronunciations with our restricted phonetic inventories.

And then we wonder why spelling is so difficult.

Go figure. Silly humans.

Ruv Draba
04-21-2008, 02:26 PM
This is an opinion piece, not a study.
There. You nailed it! Opinion pieces about a generalist topic don't need to be written for an exclusive audience in a condescending tone. By their nature they belong to the culture as a whole. (Agreed, it's US teachers and hence not my culture, but my culture does inherit a lot of teaching trends from the US - though thankfully we still haven't made class-mate shooting part of our curriculum yet. :()

This paper was delivered at the ATEG conference: Assembly for Teaching English Grammar. It's teachers talking to each other. What tone do you expect?
What I expected probably isn't reasonable. Professional neutrality. Respect for other scholars. Leave your political agendas at the door. You're right. At $50K average salary and with sinking morale, it's probably not that community.

This is English teachers claiming that the goal of teaching grammar should be to build linguistic confidence in students, rather than to instill "correct" or "appropriate" language behaviour.
Yes, but surely the primary objective of education is competence, and here I harken to your 'tools' comment. "Confidence" (or "security" as the authors put it) follows from how people treat you, and how you think of yourself.

I speak and write about five distinct versions of English. I have an 'intimate' Australian dialect, a 'friendly but polite' Australian dialect, a 'formal' Australian dialect (almost an oxymoron), an 'international, friendly' dialect (which comes in a few flavours if I know my audience), and an 'international, formal' dialect. Diction, spellings and grammatical principles (if not 'rules') can vary in each. My competence (such as it is) is that I have more than one version to play with, and know how to play with it. My confidence is that I trust myself to use them and say what I want to say. (And by the way, I only got taught one of these forms formally - and by a GNH at that. :) The others were either mocked or ridiculed or beaten into me by my fellows.)


Considering this, what is worrying you?
Imagine that you were reading well outside your field, and came across an opinion piece in a biological magazine (for example), entitled 'Hey! Red-heads are humans too!' It's hard to disagree with, but why the heck was it ever in question?

You: There's the elite. And then there's the culture, which includes the prescriptivists.
Not quite. Me: there's the elite, comprising Big-Endians and Little-Endians ostensibly warring over how people should eat eggs, but really warring for which faction has the power to control society. Then there's the rest of us, who have to farm, harvest and eat the eggs. ("Four eggs good. Two eggs baaaad.")

These are teachers trying to find their place in this conflict. They're basically saying: Prescriptivist teachers? Descriptivist teachers? Bah, humbug. People will make value judgments anyway. The question is, should we, or should we leave it to our students. The former leads into the prescriptivist trap, the latter leaves the students with no foundation. What to do?
See, I think that's a good phrasing of the question: 'We're English Grammar teachers. We're here to serve and strive to do a good job. We're in a jam. How do we get out of it?' Anyone can read and appreciate that phrasing of the question. It's humble, focused, dedicated, professional and practical. There's no *-Endian self-aggrandisement to it. You should have written the paper! :D

TerzaRima
04-21-2008, 08:13 PM
Like having poor table manners, it is a social disability to lack the ear to tell when "correct" speech is called for and when more lax constructions are acceptable.

As a side note, problems with this kind of code switching plague otherwise brilliant people with autism--they tend to use the same forms of speech with everyone, along with the same nonverbal meta aspects of conversation-- frequency of eye contact, congruency of facial expression, and interpersonal distance.

It's interesting to consider what this says about the relationship language bears to theory of mind, which concept is probably disordered or absent in the autistic brain.

Dawnstorm
04-21-2008, 09:30 PM
There. You nailed it! Opinion pieces about a generalist topic don't need to be written for an exclusive audience in a condescending tone. By their nature they belong to the culture as a whole. (Agreed, it's US teachers and hence not my culture, but my culture does inherit a lot of teaching trends from the US - though thankfully we still haven't made class-mate shooting part of our curriculum yet. :()

I agree that it needn't be in a condescending tone, or to an exclusive audience. But if your expected target audience is all teachers it can be. (I still don't really see the condescension, though. They're making generalist statements, yes, but as far as I can tell they're including themselves. I don't have time to hunt the quotes, but there are small hints throughout the text, I think.)


What I expected probably isn't reasonable. Professional neutrality. Respect for other scholars. Leave your political agendas at the door. You're right. At $50K average salary and with sinking morale, it's probably not that community.

The grammar war is an odd arena, where everybody mingles. Everybody has an opinion, but they don't all speak the same language. For example, it's very hard for me not to make fun of Stephen King, when he's referring to the "passive tense" in On Writing. Basically, he's getting the terminology wrong. So what? I still see what he's saying about the passive voice (and I disagree).

The thing is, though, that most of the time, people don't distinguish clearly between grammar and style. Pay attention to the discourse and you'll see most of it is about style not grammar.


Yes, but surely, the primary objective of education is competence and here I harken to your 'tools' comment. "Confidence" (or "security" as the authors put it) follows from how people treat you, and how you think of yourself.

But there you have a problem: most native speakers are already pretty competent. If they weren't, you couldn't use language at all to teach grammar. That's the point. What is it they're teaching in the first place? The subject matter is elusive.

Basically, grammar classes that teach the subject take the student's natural dialect and point out the difference to the standard. Do this in an authoritarian fashion, and you may scare the student into insecurity. Their own dialect isn't good enough, and they don't master the new one yet. In addition, other teachers may give them heat for applying what their English teacher has taught them (as teachers don't always agree what's "correct" or "proper" - see the "don't start sentences with 'but'; say 'however'" vs. "don't start sentences with 'however'; say 'but'" wars. It's enlightening.)

Studying the grammar of the language you already speak is different from studying the grammar of a language you're learning from scratch. All that's left to really "teach" are superficial quibbles. So - of course - people cling to that. You don't generally have to teach a native speaker that - in English - "The tomato ate the man," is a rather odd sentence, because it means that the tomato is doing the eating, for example. All the basic stuff has been mastered years ago. The stuff that's left is up for negotiation. Naturally, teachers go for that. How? That is the question.

Focussing on confidence means that you're making your students aware that what they'll be saying may affect their social standing, but that it may also affect what's considered standard. "Me" in non-object position, and so on.

Also, this means that "standard English" is at its most valuable in non-familiar situations. Which means - in practice - that you can allow your students to use their own dialect in class to discuss the standard. The formality of a situation is - after all - a matter of negotiation.


I speak and write about five distinct versions of English. I have an 'intimate' Australian dialect, a 'friendly but polite' Australian dialect, a 'formal' Australian dialect (almost an oxymoron), an 'international, friendly' dialect (which comes in a few flavours if I know my audience), and an 'international, formal' dialect. Diction, spellings and grammatical principles (if not 'rules') can vary in each. My competence (such as it is) is that I have more than one version to play with, and know how to play with it. My confidence is that I trust myself to use them and say what I want to say. (And by the way, I only got taught one of these forms formally - and by a HGN at that. :) The others were either mocked or ridiculed or beaten into me by my fellows.)

I suppose you speak more, but these are the blanket categories. Heh.

I, personally, don't speak much at all when I come into an unfamiliar surrounding. I spend most of my time listening. I remember one occasion where I was chiming in on a conversation. The one person who didn't know me fell silent, turned to the others and said, amazed tone: "He speaks!" Made me chuckle.

People who tend to interact on a regular basis do adjust their diction to each other. And language can be used to demarcate, erect or tear down barriers. (Which is what causes the most trouble for non-native speakers, or between different cultures who share a surface language. The social antennae suddenly malfunction.)


Imagine that you were reading well outside your field, and came across an opinion piece in a biological magazine (for example), entitled 'Hey! Red-heads are humans too!' It's hard to disagree with, but why the heck was it ever in question?

Yes, but I don't see that analogy. They're not arguing any such thing. They're arguing that English is taught in a way that makes/keeps people insecure about the way they talk. They're pointing out that - out there in the real world - there are people who mouth a liberal dogma about it being good that there are many different dialects and then apologise for their own dialect and turn to the standard for comfort. (The "Betty" episode.)

It seems to me, the worrying thing isn't in the article; it's in the culture.


Not quite. Me: there's the elite, comprising Big-Endians and Little-Endians ostensibly warring over how people should eat eggs, but really warring for which faction has the power to control society. Then there's the rest of us, who have to farm, harvest and eat the eggs. ("Four eggs good. Two eggs baaaad.")

Thanks for the clarification.

But it's really less a question of elitism than one of social status, particularly education. "Oh, he's educated. He must know." This cuts across teachers and students and novelists and poets and journalists and graffiti artists and texters alike. The idea that something as ubiquitous as speech is a sign of social hierarchy. (I wonder whether this explains, to a certain extent, reality TV: "Yay! They speak like us!" (not "like we"!).)



Anyone can read and appreciate that phrasing of the question. It's humble, focused, dedicated, professional and practical. There's no *-Endian self-aggrandisement to it. You should have written the paper! :D

Anyone might understand that, but would it get me funds? ;)

***


As a side note, problems with this kind of code switching plague otherwise brilliant people with autism--they tend to use the same forms of speech with everyone, along with the same nonverbal meta aspects of conversation-- frequency of eye contact, congruency of facial expression, and interpersonal distance.

It's interesting to consider what this says about the relationship language bears to theory of mind, which concept is probably disordered or absent in the autistic brain.

That's a very interesting angle.

C.bronco
04-21-2008, 09:36 PM
Grammar serves a purpose. I don't want a reader or listener to have a vague idea about what I meant; I want the reader or listener to know exactly what I meant.

Double negatives make it more laborious to access meaning. Why say "he didn't have no shoes" when you can say "he had shoes?"

Dawnstorm
04-21-2008, 10:02 PM
Double negatives make it more laborious to access meaning. Why say "he didn't have no shoes" when you can say "he had shoes?"

Actually, "He didn't have no shoes," usually means "He didn't have any shoes." In dialects that say that, "no" is equivalent to "any". There's very little ambiguity in practice.

Of course, context can override that meaning.

A: He had no shoes.
B: He didn't have no shoes.

Ruv Draba
04-22-2008, 04:50 AM
The thing is, though, that most of the time, people don't distinguish clearly between grammar and style. Pay attention to the discourse and you'll see most of it is about style not grammar.

A fair point; but maybe that distinction isn't as clear in usage as it is in structural analysis. When I assemble sentences in my native language I believe that I use a process something like this:

My message, which is sitting in my head in whatever form it normally resides there:

gets moderated by common knowledge, shared objectives and shared values;
becomes a stylised representation of what I really mean, which I construct from numerous previous examples;
only then is checked against context, interpretation and 'rules' (or principles, guidelines) of formation - and then only for clarity and impact.When I'm not communicating in my native language, I do it very differently, but for native communications style is far more important than grammatical perfection. Perhaps this because I know that while my audience can parse and forgive 'imperfect' grammar, they are far less forgiving about imperfect style and often won't even read or hear my messages if they don't like the style. So very often, my style (and the implicit messages it carries about shared values and knowledge) informs my choice of grammatical (or nongrammatical) construction.

Indeed, I can think of many social situations in Australia in which 'perfect' grammar just doesn't work. Australians often use imperfect grammar semideliberately to create intimacy, soften harsh messages and create a sardonic, semiapologetic tone. E.g.

'You can't do that, but!' can work better in some circumstances than 'But you can't do that!' - more so if you don't normally use that construction. This is a local example of 'inappropriate but highly effective' - or my analog of the authors' 'appropriately inappropriate'.

What is it they're teaching in the first place? The subject matter is elusive.

Well, to teach "confidence" then surely they must teach style. But to teach style, then you surely need to teach the building blocks - and grammar is part of that toolkit.

Basically, grammar classes that teach the subject take the student's natural dialect and point out the difference to the standard.

The ones I attended did that too, but not just that. They didn't just reform; they informed too. I learned what nouns and verbs are; what objects and subjects and predicates are, and how they normally hang together. Only once I understood that, did I learn that there were certain conventions that standard English adhered to. And simultaneously, I learned that nobody I knew much adhered to those conventions - including my teachers. My HGN teacher tried to explain that they should but (I felt at around the age of 10) didn't make the case very clearly. It seemed like a moral argument being used inappropriately to justify mere custom.

Do this in an authoritarian fashion, and you may scare the student into insecurity.

Even before you go there, I think there's a bigger problem: Grammar is a dry subject. It's great for analytic nerds like me, but most people aren't like that. Maybe there's an experiential or dramatic angle for teaching it, but the theoretical one turns kids off. How can they learn either rules or comparative linguistics if they can't learn the fundamentals?

Studying the grammar of the language you already speak is different from studying the grammar of a language you're learning from scratch. All that's left to really "teach" are superficial quibbles.

While I agree with the first, I think that it's the "superficial quibbles" that carry much of the cultural import, like shared values and common knowledge. I don't think that they're superficial quibbles at all. "He my baby daddy" can convey a very different social message than: "He is my baby's father". In fact it can mean: "he is a man whom I've chosen to father one of my children" - and convey respect, loyalty and mutual obligation but not necessarily mutual and monogamous devotion.

But if I said it (looking and sounding like an Anglo Australian) it probably wouldn't be seen as meaning that at all. It would more likely be interpreted to mean that I was mocking African Americans and possibly making disparaging cultural judgements about marriage and fidelity.

So... style's important - even paramount... but I suspect that you already know this. What I still don't understand is why, if it's been recognised since the time of writers like Aristophanes, Shakespeare, and Robbie Burns, it's not yet recognised by grammarians. :p

...out of time. More later.

Dawnstorm
04-23-2008, 09:44 PM
...but maybe that distinction isn't as clear in usage as it is in structural analysis.

I don't actually think the distinction is very clear, even in structural analysis. But, I thought, the terms were precise enough for my purposes.

For example:


When I'm not communicating in my native language, I do it very differently, but for native communications style is far more important than grammatical perfection.

See, in the way I used the terms "grammatical perfection" is a stylistic concept, not a grammatical one.


'You can't do that, but!' can work better in some circumstances than 'But you can't do that!' - more so if you don't normally use that construction. This is a local example of 'inappropriate but highly effective' - or my analog of the authors' 'appropriately inappropriate'.

An interesting construction. Are there other instances of "main clause, conjunction", or is "[chiding remark], but" unique?

See, if it's used regularly, it's open to grammatical analysis and thus grammatical. The evidence is that people use it and it's not a slip. It's, of course, not grammatical in any version of Standard English I know.

And this is where style and grammar blur: in the standard. My rule-of-thumb, and it's purely methodological, is that - if some experts say a usage is standard, and others disagree, it's style. That's clearly not scholarly rigour, but it works for me.

The thing is: you can't teach the basics of grammar of your own language easily, because it's stating the obvious. You'll have to point out languages that differ, creating a feeling for how this particular feature you're using unreflectively isn't universal, so that you can go back to your native tongue and realise what you're doing.


Well, to teach "confidence" then surely they must teach style. But to teach style, then you surely need to teach the building blocks - and grammar is part of that toolkit.

I definitely agree. But I think you'll have to point out two things: (a) you're teaching the grammar of the standard, and (b) "standard" isn't better than "non-standard", it's just more widespread.

There's nothing wrong with being non-standard, but if you're not among your own, you're asserting yourself, and you'll have to decide for yourself how comfortable you are with that. There's a huge area between "conformist" and "enfant terrible".


The ones I attended did that too, but not just that. They didn't just reform; they informed too. I learned what nouns and verbs are; what objects and subjects and predicates are, and how they normally hang together. Only once I understood that, did I learn that there were certain conventions that standard English adhered to.

That depends on what terminology they served you. If they fed you on a diet of traditional grammar ("parts of speech theory", "sentence diagramming" etc.), the idea that English should be more like Latin was ingrained within the terms. It's the style of Renaissance humanists, filtered through the centuries. (I've heard people claim "the" is an adjective. See, Latin doesn't have articles, so there are parts-of-speech theories that lack that category. "the" modifies a noun, so it must be an adjective. Simple.) As an example, the (currently not very strong) aversion of traditionalists against "It's me," (which should be "It's I") is encoded in the term "objective pronoun".

Traditional grammarian: "Me" is the objective pronoun.
Me I: But in "It's me," "me" isn't an object.
TG: Exactly!
Me I: Huh?
TG: It should be "It's I."
Me I: Why?
TG: Because "me" is the objective pronoun and as you said "me" in "It's me" is no object.
Me I: But what if "me" wa... weren't an objective pronoun?
TG: Then, I suppose, "It's me," might be correct. But, as it happens, "me" is the objective pronoun.
Me I: Why?
TG: "He gave me the book." Do I need to explain?
Me I: "It's me." "Me, I'm not convinced." All wrong?
TG: They're not objects.
Me I: No, they're not.


And simultaneously, I learned that nobody I knew much adhered to those conventions - including my teachers. My HGN teacher tried to explain that they should but (I felt at around the age of 10) didn't make the case very clearly. It seemed like a moral argument being used inappropriately to justify mere custom.

I suppose you used different words back then? ;)

I wish more people would realise that.


Even before you go there, I think there's a bigger problem: Grammar is a dry subject. It's great for analytic nerds like me, but most people aren't like that. Maybe there's an experiential or dramatic angle for teaching it, but the theoretical one turns kids off. How can they learn either rules or comparative linguistics if they can't learn the fundamentals?

The folks at Language Log suggest Word Rage Therapy. That might even work, if people didn't enjoy ranting so much (me included).


While I agree with the first, I think that it's the "superficial quibbles" that carry much of the cultural import, like shared values and common knowledge. I don't think that they're superficial quibbles at all. "He my baby daddy" can convey a very different social message than: "He is my baby's father". In fact it can mean: "he is a man whom I've chosen to father one of my children" - and convey respect, loyalty and mutual obligation but not necessarily mutual and monogamous devotion.

That goes back to "What do we teach in the first place?" "He my baby daddy," isn't any standard English I recognise. Would anyone you know argue it is? I suspect not, as you said:


But if I said it (looking and sounding like an Anglo Australian) it probably wouldn't be seen as meaning that at all. It would more likely be interpreted to mean that I was mocking African Americans and possibly making disparaging cultural judgements about marriage and fidelity.

Which means that people recognise non-standard grammar, doesn't it?

The operative question, here, is at what point in first-language acquisition do we acquire the distinction between dialects? The idea of "wrong"/"right"? I suppose, it's quite early; usually before we start to learn grammar in school.


What I still don't understand is why, if it's been recognised since the time of writers like Aristophanes, Shakespeare, and Robbie Burns, it's not yet recognised by grammarians. :p

It is, as far as I know. No grammarian I've read, however descriptive, would subscribe to "anything goes", or "there's no difference".

Ruv Draba
04-24-2008, 06:46 AM
An interesting construction. Are there other instances of "main clause, conjunction", or is "[chiding remark], but" unique?

Hard to say... if you take the traditional "FANBOYS" conjunctions of for, and, nor, but, or, yet, and so then I've seen sentences end in:

For: "What are you doing that for?"
But: "You can't do that, but!"
Or: "Would you like some milk in your tea, or...?" (sic)
So: "He's having fun, so..." (sic)

Of these, the but usage is conspicuously different. It's far from standard and always surprising when I hear it. I've never heard of it outside Australia. Yet it's deliberate, rather than accidental - not always deliberately wrong; just deliberate.

As for sentences that end in elisions - I often think that the speaker has no idea what the ending is. They don't just run out of utterance, they run out of ideas and confidence! :tongue


And this is where style and grammar blur: in the standard. My rule-of-thumb, and it's purely methodological, is that - if some experts say a usage is standard, and others disagree, it's style.

That shuffles the problem neatly onto the experts, who doubtless don't agree much in their criteria. :D And expertise does not preclude crackpottery, so maybe it's all style? :D :D :D

The thing is: you can't teach the basics of grammar of your own language easily, because it's stating the obvious.

Actually, I learned almost as much grammar in studying computing as I did when schooled in standard English. A grammar is very important in computer programming, computer language design and information processing.

Computer programming contains statements that are declarative, imperative and interrogative. It has its equivalent of verbs, nouns, subjects, objects and adverbs and (at a stretch) adjectives (just properties of nouns). There are also conjunctions and the occasional preposition; even sorts of definite and indefinite articles. But unlike human languages computer grammars are highly prescriptive. You learn these prescriptions against an operational model which happens to be very limited in comparison to human behaviour.

But it strikes me that learning grammar in your own language can work much the same way. You take stories of very simple behaviours, (e.g. subject acting on object), and relate them to grammar ("The dog chased the ball"). If you do that, you don't really need to know other languages to get a basic understanding of how your own grammar works.
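That "simple behaviours mapped onto grammar" idea can be sketched in a few lines of Python. This is a toy, not a serious parser; the lexicon, the function name and the hard-coded sentence pattern are my own invention for illustration:

```python
# Toy illustration: map a simple declarative sentence onto
# subject / verb / object roles using a tiny hard-coded lexicon.
LEXICON = {
    "the": "DET",
    "dog": "NOUN",
    "ball": "NOUN",
    "chased": "VERB",
}

def parse_svo(sentence):
    """Parse a 'DET NOUN VERB DET NOUN' sentence into S/V/O roles."""
    words = sentence.lower().rstrip(".").split()
    tags = [LEXICON[w] for w in words]
    if tags != ["DET", "NOUN", "VERB", "DET", "NOUN"]:
        raise ValueError("not a simple SVO sentence")
    return {"subject": words[1], "verb": words[2], "object": words[4]}

print(parse_svo("The dog chased the ball"))
# {'subject': 'dog', 'verb': 'chased', 'object': 'ball'}
```

Swap the nouns and the roles swap with them: "The ball chased the dog" makes "ball" the subject, which is exactly the "The tomato ate the man" point from earlier - word order is carrying the grammatical load.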

I think you'll have to point out two things: (a) you're teaching the grammar of the standard, and (b) "standard" isn't better than "non-standard", it's just more widespread.

Relating to the IT domain again, standards are a great joke here. A common quip in IT is that standards are wonderful - if you don't like the one you have, you can always find another. :D

In the information industry, reform and standardisation are driven by the desire to better communicate and integrate information systems. But a countervailing force is the desire to innovate and commercialise (or put another way: control what else you buy if you buy this information system). I suspect that there are analogous forces in general and specialised human languages too.

There's a huge area between "conformist" and "enfant terrible".

Whether one accepts Sapir-Whorf, I think it's incontestable that language holds social power. Changing language helps to redistribute power and perhaps that's why each generation of teenagers invents or reinvents its own words and sometimes its own grammar. Certainly, it helps explain the jargon treadmill in IT and marketing.

But equally, reforming language may be seen as reconsolidating power. Which raises the question: if language reforms are useful (and I agree with the authors, they often are), who gets to decide how the reform occurs? I don't believe that language 'experts' or teachers have this moral right - though they certainly have the right to commentary.

I've heard people claim "the" is an adjective. See, Latin doesn't have articles, so there are parts-of-speech theories that lack that category. "the" modifies a noun, so it must be an adjective. Simple.

My engineering mind doesn't care, because I feel that for most human languages, grammar is just a descriptive model of some artefact of cognition - it's not a model of cognition itself.

Indeed, I'd argue that human language precedes its formal grammar: language is an artefact of cognition and also a tool of cognition, but not all cognition is lingual. We can make decent sense of some utterances without even knowing what the grammar is.

Moreover, my science mind tells me that descriptive models are often found to be flawed, and can be improved over time, and that fuzziness and ambiguity are common properties of complex systems. Given that linguists sometimes argue over vowels and consonants I'm not surprised that they argue over bits of grammar too.

But for computer languages, grammars are a prescriptive model. I know exactly what operations I want language to describe, and so I know exactly what my nouns and verbs are.

I suppose you used different words back then?

I doubt that I used words at all to capture that understanding - I just recognised a familiar pattern of adult hypocrisy. :D

That goes back to "What do we teach in the first place?" "He my baby daddy," isn't any standard English I recognise. Would anyone you know argue it is? I suspect not

If you mean "international standard" then, probably not. But perhaps someone from a sociological perspective might argue that it's a subcultural standard. ("Ebonics" anyone?).

Which means that people recognise non-standard grammar, doesn't it?

I've asserted my belief earlier that people recognise meaning without clear grammar. Since language is both a cognitive artefact and a cognitive tool I think it's easy to confuse a description of the artefact with a description of cognition. We understand many utterances just fine without knowing what the grammar is, because we understand the cognition behind the utterance. Understanding "nonstandard" grammars is fairly trivial by comparison.


The operative question, here, is at what point in first-language acquisition do we acquire the distinction between dialects?

Again, I see the terms "language" and "dialect" as a sort of classification of certain artefacts of cognition - and therefore not to be confused with cognition itself.

For some kids there may not be a first language at all - they may learn multiple languages concurrently and not realise (for a time) that they are different languages.

I suspect that kids learn early on to engage different people in different ways and for different purposes. Certainly, kids learn to express anger, disappointment and so on long before they learn to speak. Might it simply be that 'dialect' is just an extension of varying intent for varying audiences? Is it possible that children learn well enough by example that they don't need grammar as a conscious model, but simply build an "expert system" of thought in their head that creates affine utterances without knowing why they are affine?

Certainly, computer systems can do this. It's possible to build neural networks, Markov chains and similar stochastic information systems that can produce English-like utterances just from examples without ever entering grammatical rules. Indeed, if you try and pull these systems apart to find the grammars, you may not succeed.
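For what it's worth, the Markov-chain version of that really is only a few lines of code. Here's a toy Python sketch (the function names and the little corpus are mine, purely illustrative) that learns bigram successors from example text and then babbles - no grammatical rule is ever entered:

```python
import random

def train_bigrams(text):
    """Collect each word's observed successors - no grammar rules, only examples."""
    words = text.split()
    model = {}
    for a, b in zip(words, words[1:]):
        model.setdefault(a, []).append(b)
    return model

def generate(model, start, length=8, seed=0):
    """Random walk over observed successors; the output imitates the examples."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = model.get(out[-1])
        if not successors:
            break
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = "the dog chased the ball and the cat chased the dog"
model = train_bigrams(corpus)
print(generate(model, "the"))
```

Each pair of words is locally plausible because it was seen in the examples, but the whole utterance needn't mean anything - which is also why pulling the "grammar" back out of such a model is so hard: there isn't one in there, only frequencies.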

Dawnstorm
04-24-2008, 10:52 AM
Hard to say... if you take the traditional "FANBOYS" conjunctions of for, and, nor, but, or, yet, and so then I've seen sentences end in:

For: "What are you doing that for?"
But: "You can't do that, but!"
Or: "Would you like some milk in your tea, or...?" (sic)
So: "He's having fun, so..." (sic)

Of these, the but usage is conspicuously different. It's far from standard and always surprising when I hear it. I've never heard of it outside Australia. Yet it's deliberate, rather than accidental - not always deliberately wrong; just deliberate.

As for sentences that end in elisions - I often think that the speaker has no idea what the ending is. They don't just run out of utterance, they run out of ideas and confidence! :tongue

You're right, the "or" and "so" examples are merely sentences trailing off. (The "for" up there is a preposition; a classic sentence-final preposition of the kind people advise against. Guess what I'm thinking about that advice. ;) )

The "You can't do that, but." is definitely different. I just thought of a "standard English, informal" variant that's similar: "You can't do that, though." "(Al)though" is, generally, a conjunction.

(I'm assuming it's not "You can't do that, bud." :tongue )


That shuffles the problem neatly onto the experts, who doubtless don't agree much in their criteria. :D And expertise does not preclude crackpottery, so maybe it's all style? :D :D :D

Tee hee hee. But seriously, I doubt anyone would say "The cat sat on the mat," is ungrammatical. There's a lot of consensus, really. It's just not headline material.

But I agree, it's an unsystematic and not really foolproof methodology.


Actually, I learned almost as much grammar in studying computing as I did when schooled in standard English. A grammar is very important in computer programming, computer language design and information processing.

Computer programming contains statements that are declarative, imperative and interrogative. It has its equivalent of verbs, nouns, subjects, objects and adverbs and (at a stretch) adjectives (just properties of nouns). There are also conjunctions and the occasional preposition; even sorts of definite and indefinite articles. But unlike human languages computer grammars are highly prescriptive. You learn these prescriptions against an operational model which happens to be very limited in comparison to human behaviour.

Well, the main difference is that "syntax errors" don't crash people as they do programs. The metaphor takes you only so far.

But it does spawn an entire discipline: Computational Linguistics (http://en.wikipedia.org/wiki/Computational_linguistics).


But it strikes me that learning grammar in your own language can work much the same way. You take stories of very simple behaviours, (e.g. subject acting on object), and relate them to grammar ("The dog chased the ball"). If you do that, you don't really need to know other languages to get a basic understanding of how your own grammar works.


The same narrative can be expressed in different grammars. Google "ergative-absolutive languages" and "nominative-accusative languages" (the ones you know, I wager), and then come back and explain the difference in simple narrative. I wouldn't know how.


Relating to the IT domain again, standards are a great joke here. A common quip in IT is that standards are wonderful - if you don't like the one you have, you can always find another. :D

Heh. You're not using the word "standard" as it's usually used in "standard English". More later.


In the information industry, reform and standardisation are driven by the desire to better communicate and integrate information systems. But a countervailing force is the desire to innovate and commercialise (or put another way: control what else you buy if you buy this information system). I suspect that there are analogous forces in general and specialised human languages too.

Actually, there's little drive to change language. You do hear about the occasional codification reform (German had a spelling reform, recently; oh the chaos, what with three countries involved, and nobody really wanting it except the experts...)

Which doesn't matter. Language is naughty and changes anyway.


Whether one accepts Sapir-Whorf, I think it's incontestable that language holds social power. Changing language helps to redistribute power and perhaps that's why each generation of teenagers invents or reinvents its own words and sometimes its own grammar. Certainly, it helps explain the jargon treadmill in IT and marketing.

Agreed.


But equally, reforming language may be seen as reconsolidating power. Which raises the question: if language reforms are useful (and I agree with the authors, they often are), who gets to decide how the reform occurs? I don't believe that language 'experts' or teachers have this moral right - though they certainly have the right to commentary.

Language reform, IMO, is both unnecessary and very hard to pull off. The best you can do is change the sermon every few years or so. Doesn't matter much, as the he(a)rd thrives on hypocrisy anyway. ;)


My engineering mind doesn't care, because I feel that for most human languages, grammar is just a descriptive model of some artefact of cognition - it's not a model of cognition itself.

I never said language is a model of cognition itself. I don't believe that at all. I just think that it's not really plannable.


Indeed, I'd argue that human language precedes its formal grammar: language is an artefact of cognition and also a tool of cognition, but not all cognition is lingual. We can make decent sense of some utterances without even knowing what the grammar is.

Oh yes!

Btw, what I call grammar is the regularities that language needs to have to transcend individuals and facilitate communication.

But cognitive linguistics isn't really my area of expertise; we ought to ask Schweta as soon as we figure out the questions. ;)


Moreover, my science mind tells me that descriptive models are often found to be flawed, and can be improved over time, and that fuzziness and ambiguity are common properties of complex systems. Given that linguists sometimes argue over vowels and consonants I'm not surprised that they argue over bits of grammar too.

Can't argue with that.


But for computer languages, grammars are a prescriptive model. I know exactly what operations I want language to describe, and so I know exactly what my nouns and verbs are.

Yes, but then they talk to computers, who don't easily forgive fuzziness. Humans, by contrast, don't easily forgive that kind of accuracy. ("or" as a logical operator, vs. "or" as a conjunction. or/xor? *Evil chuckle.*)
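The or/xor aside is easy to make concrete. A minimal Python sketch (the coffee/tea example is mine, not from the thread):

```python
coffee, tea = True, True

# Human "or" in "coffee or tea?" is usually exclusive: pick one, not both.
exclusive = coffee != tea   # xor on booleans: False here, since both are True

# Logical "or" is inclusive: true if either (or both) holds.
inclusive = coffee or tea   # True

print(exclusive, inclusive)
```

So a computer happily answers "yes" to "coffee or tea?" when you'd take both, while most humans would hear that as dodging the question.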


I doubt that I used words at all to capture that understanding - I just recognised a familiar pattern of adult hypocrisy. :D

Hehe.


If you mean "international standard" then, probably not. But perhaps someone from a sociological perspective might argue that it's a subcultural standard. ("Ebonics" anyone?).

The way linguists use "standard" in "standard language", there is no subcultural standard. They'd simply call that a dialect (specific to a subculture). The smallest *lect is the idiolect, spoken by a single speaker.

But again, you're right that the borders are fuzzy. The standard is different for America and England. So, in your case, I was probably talking about standard Australian English.

The notion of a "standard language" is anything but well-defined. However, there's a special relationship between the standard and its dialects. A standard is a construct nobody really speaks (as opposed to real dialects) but everybody's supposed to understand (again, as opposed to real dialects). Still, some dialects are closer to the standard than others.

Fuzzy notions abound.


I've asserted my belief earlier that people recognise meaning without clear grammar. Since language is both a cognitive artefact and a cognitive tool I think it's easy to confuse a description of the artefact with a description of cognition. We understand many utterances just fine without knowing what the grammar is, because we understand the cognition behind the utterance. Understanding "nonstandard" grammars is fairly trivial by comparison.

Oh yes!


Again, I see the terms "language" and "dialect" as a sort of classification of certain artefacts of cognition - and therefore not to be confused with cognition itself.

How come Dutch and German are two languages rather than dialects of a blanket language? I've heard/read German dialects that I found more obscure than Dutch (which I never learned). And it's not just phonetics and spelling.

It's not easy to decide what's a language and what's a dialect, and often political relations cloud the judgment (look at what was once former Yugoslavia).


For some kids there may not be a first language at all - they may learn multiple languages concurrently and not realise (for a time) that they are different languages.

Well, that's the point I was making earlier. At what point do children realise that there are different languages/dialects?


I suspect that kids learn early on to engage different people in different ways and for different purposes. Certainly, kids learn to express anger, disappointment and so on long before they learn to speak. Might it simply be that 'dialect' is just an extension of varying intent for varying audiences? Is it possible that children learn well enough by example that they don't need grammar as a conscious model, but simply build an "expert system" of thought in their head that creates affine utterances without knowing why they are affine?

Er, yes, that's what I've been arguing all along. Nobody has ever been taught a formal grammar before learning to speak (I think).


Certainly, computer systems can do this. It's possible to build neural networks, Markov chains and similar stochastic information systems that can produce English-like utterances just from examples without ever entering grammatical rules. Indeed, if you try and pull these systems apart to find the grammars, you may not succeed.

That's interesting. I'll look it up some time.
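For the curious, the point is easy to demonstrate: a Markov chain learns word-transition statistics purely from example text, with no grammatical rules entered anywhere. A minimal Python sketch (the toy training sentence is mine):

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each word-tuple of length `order` to the words that follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=20):
    """Walk the chain from a random start, emitting English-like text."""
    key = random.choice(list(chain))
    out = list(key)
    for _ in range(length):
        followers = chain.get(tuple(out[-len(key):]))
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

sample = ("the cat sat on the mat and the dog sat on the rug "
          "and the cat saw the dog on the mat")
chain = build_chain(sample, order=2)
print(generate(chain))
```

The output often reads like plausible English, yet if you inspect `chain` you find only frequency tables, nothing you could point to and call a grammar.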

Shweta
04-24-2008, 12:37 PM
Oh what the hell, it's this or finish chapter 3 of the WIP :D

*deep breath*
*plunge*


My problem with this approach is that some prescriptiveness is necessary to achieve standardization in language, not only for didactic but also for communicative purposes. Chaos would ensue if everyone were just left to his or her own devices and every single deviation from the original language were considered 'normal'.

Well, sort of. Like any cultural activity, people are pretty good at policing. And kids at self-policing; they fix their own non-standard grammar when they hear that adults say it differently.

So there's no way we'll ever lack prescriptivists; is there anyone here, even in this linguistically-clueful group, who doesn't cringe when a word is used "wrong"? Even if, in 60 years, that usage will be considered right? I know I do. And there's nothing wrong with that! It's just that... who gets to say what's right?
It's generally an issue of power. IMO this is the biggest reason people care about different dialects. It's really got less to do with comprehension than the feeling of group identity. Going back to the tribal thing.
There is no objectively correct language out there. There's just a set of mutually comprehensible dialects that are spoken by interacting communities. The dialects that get called "correct" are the prestige dialects, the ones spoken by people with the power.
Academia is in some sense just another power in the game. Not a big one, in my opinion, but people do respect that white lab coat :D

And communities like (for example) Black communities in the USA resist standardization because it's seen as taking away their power. Their language. Which... honestly, it is. If you're trying to teach people to speak the prestige dialect instead of their own, or telling them they're stupid for speaking a non-prestige dialect. (For an extreme example of this, look at what happened to so many Native American languages and cultural knowledge bases because the people in power decided it would be good for those children to speak fluent English).


Especially in such a globalized world, when so many different cultures are trying to communicate using a 'standard' language, you can't just tell people: "well, you tried, which is important". I feel we're approaching an era in which effective communication is more than necessary, it's vital, and truly effective communication cannot be established unless people agree 100% on the meaning of symbols and words.

Except that... the only way to do that would be to have a monoculture. Which is far worse than miscommunication, IMO. And really, you don't need 100% overlap for effective communication. From getting by in non-native languages, I'd guess that 50% is plenty.
But... yeah, the monoculture thing... language change does (at least) reflect cultural change.

But as far as many people understanding a standard language or languages, I think we're closer to that than ever before. The scary thing is not how many people don't speak one language, but how many language groups and dialect groups are dying off.
People have generally lived in a multilingual world, and it used to be normal for people to learn several different languages, not just one or two. And even now, we're all hugely multidialectal. So we do have pretty good communication without monoculture.

Now, communities with incomprehensible dialects? Either they've been isolated, or they have a social reason to be incomprehensible. It's a tribal mark, as (I think) Ruv said. Most teenagers aren't using IM speak because they can't spell, they're using it because adults don't read it as easily. It marks their culture as distinct. This is nothing new. In-groups have always had their linguistic markers, probably as long as we've had language.



Yeah but more and more, in literature at least, authors are opting to write using their particular dialects.

It's always happened.
As long as we have had written literature, it's happened. If you do language change research you find new usages that came in from other dialects in the fiction first.
I think it's a good thing.


I've tried reading works by Latin American authors and I don't always understand, even if we're both supposed to be speaking the same language.

And that's communicating something. That your groups are not homogenous, even if they are supposed to be "the same". That you don't speak the same language, literally or metaphorically.

I have the hardest time reading Nalo Hopkinson for this reason. But for exactly this reason, she's worth the effort. I'm getting more than just her stories. I'm getting the culture behind them, by immersion, which is scary and exhilarating and really does change the way I think. Which it wouldn't if it was written in standard English.


Just look at what happened to the Roman empire after latin evolved into completely different languages.

Uh... :Wha:
- Imperialism good? Not so much.
- Latin changing caused the Roman Empire to fall? Uh, no.

---


Standard English to me is a beautiful thing. A good sentence can stop me cold, take my breath away. I love the art of language.

What though on hamely fare we dine,
Wear hoddin (http://www.robertburns.org/works/glossary/909.html) grey, an' a that;
Gie (http://www.robertburns.org/works/glossary/769.html) fools their silks, and knaves their wine;
A Man's a Man for a' that:


Not just standard English.
And yep, choice of language is political. Burns (http://www.robertburns.org/works/496.shtml) dips in and out of the dialect with masterful grace.


But in other contexts, "R U OK?" might be the most important message you see all day. It'd be a shame to reject it because it ain't purty.

Yes.

---


Is there such a thing as a national standard?

No.
There are prestige dialects called the standard, and prestige groups who blithely think they're talking right.


But this is inherently a social issue. How do you determine the standard?
Yes :)

Well, that's an issue for sociolinguists, I think. I suspect (but have no proof) that economic and political power of regions gives preference to certain dialects, so that the standard language is modelled after them. (Immigrant influence after a standard has been established is a different topic.)

Yep. There's lots of proof. While less-prestige dialectal words come into the standards all the time as "cool" terms, it's so very clear that the standard is a political/power issue rather than a deep linguistic one, on any in-depth look at dialect use in any language, that I'd be hard-pressed to find you specific citations.

I can try if you're really interested though, or at least tell you who to ask :)


Well, see, there is a standard language, but even "standard language" changes, because it's usually not a set of statutes, but a set of abstractions based on super-regional media usage, for example. It's less a thing of right/wrong than it is a common ground of all the dialects.
Not quite. It's often not common ground at all, except insofar as everyone is made to learn it, so it becomes common ground.

To take an extreme example, the standard "pure" dialect of Tamil is one nobody speaks natively any more. We hear it on the news and in the movies, and in poetry. It's the written dialect, so everyone has to learn it to be literate in the language. But it's something like the Brahmin dialect of 300 years ago (Tamil has serious caste-based and regional-based dialect differences).

So this thing, it's called the standard. Nobody. Speaks. It. Except on TV.

This is an odd example, sure, but it's just an exaggeration of what happens everywhere.


There's sociological research on the way that culture shapes supposedly spontaneous expressions, such as laughter. It's quite possible, for example, that being able to say "lol" is a relief.

Yeah, and nobody looks at me funny when I say "heh" instead of laughing. "lol" is just too new, so it still sounds funny.

There's research into how people say "ouch" in different languages. It's pretty different! :)

---


There is a Canadian poet named bill bissett (he spells it lowercase) who uses awkward slang spellings such as 2 instead of two, too or to, yu instead of you and th instead of the. All that really does is make the work less readable, not more profound.

To you. You're making apparently objective statements, but the thing is that they're judgments, not statements of fact.


Here is a sample

him her self is alredee enuff
is alredee fine is alredee all ther
can go now can b now she he is
sew flexibul now who 2 trust or
2 find discovr

I don't get it, but even I can see that converting it to standard English would totally change the experience.


There is no "incorrect", but there is definitely a proper/user friendly.

But there are many ways in which poetry isn't user friendly, Jon. Do you dislike them all? Or just the neophile ones?


Everyone bends the rules of course, from Shakespeare to today, but at least they are readable. Heavy uses of junk grammar, spelling, and pointless dialects just cut the prose/verses up into shreds of unreadable rubbish.

From well before Shakespeare :)
And again, judgments stated as fact.

What I get from this is that you're not the audience this poet is reaching. Honestly, neither am I. But that very fact means that you and I aren't qualified to judge this poet. We can, of course, but we'll just look silly. Now, if someone who followed work like this had reasons why this particular poem was bad, that's fine.
But what you're saying here is exactly analogous to the people who decided that poetry in English was just bad because you don't write poetry in English! You write it in Latin or French! English is a vulgar tongue!
A little historical perspective makes many of our cherished positions seem a bit silly, ya know?

---


I've always thought one of the strengths of English is that we don't have such an academy. It makes the language more supple and vibrant, I think. Also, the long reach of the British Empire in the last century left behind many authors in the post-colonial world who write in English by choice, adding still more richness to the language. So I sorta like the chaos.

I am being very much agreeing with you, yaar.

---


Dang. Does that mean I'll never be able to write as well as a native speaker ?

No, it means you'll never write exactly like a native speaker.
You already write better than many native speakers who've handed in college papers to me :D


Well, duh. Looks like so far the English managed pretty well without an academy. Seriously, we don't pay much attention to what they say at the academy. For instance, they aren't very happy with people saying "e-mail" instead of "courriel", but people keep doing it regardless.

:ROFL:
and of course you send that courriel on l'ordinateur?
L'Academie is pretty funny, and I say that as someone who can barely put together a French sentence, but I hope you'll forgive me there :)

---


Of course there is correct English. If you do not want to learn it, then invent your own language and call it something else. Honestly, the discussions that come up these days are very frightening.

Tip-top, old chap. Just as you say. Those zany Americans! Dropping vowels left and right, turning all their s's into z's. Whatever will those barbarians do next and call it English? Forsooth.

---


Interesting. I didn't get any Sapir-Whorf vibes. Actually, I think their main point, teach the standard and thereby give the speaker the confidence to use their own dialects, too, states that teaching the standard won't change the way you think much.

Didn't read the article. But your discussion reminded me of a study I did read up on at one point (7 years ago so don't ask me for citation pleeez). They basically found that you could teach kids the standard dialect pretty well if you told them that it was just another dialect, one they needed to use in certain situations.
If you told them the way they spoke was wrong, and this was right, they had a much harder time learning the standard dialect, and were quite resentful of it. Probably because they were being teased at home for talking funny.

---


I took exception to the self-serving tone which I'd paraphrase as "We, the elite who understand that grammatical prescriptivism is tosh, but can't persuade the hoi polloi that it is..."

But you know, I think that's true. Who really deep-down agrees that prescriptivism is tosh? (love that word.) It's horribly counterintuitive. Sure, we can - all - see it when we put on our thinky hats, but as soon as we take them off we're back to going "eww, that's not what 'enormity' means!" And some people never ever want to put that thinky hat on. It's a stinky itchy uncomfortable thinky hat.

So like, we're all hoi polloi sometimes, and some people always are, depending on knowledge and willingness to think that way.


My engineering mind doesn't care, because I feel that for most human languages, grammar is just a descriptive model of some artefact of cognition - it's not a model of cognition itself.

Except that differences in language do correlate with differences in thought, even when a bilingual is using another language entirely. They're not purely descriptive. I've already thrown data at you on this, I think, so I'm just going to stamp my foot and whine this time :D


Indeed, I'd argue that human language precedes its formal grammar:

Well, what then is language?

It's more than the normally-defined "grammar," sure. It's a set of structured mappings between form and meaning, associated with all sorts of things (like social situation, statistical likelihood, etc), sez I. Some of those are what you'd call "grammar", and some aren't. And sure, the bits that aren't, some of them precede the bits you'd call "grammar", but in fact that split (between words and sentences) is fairly arbitrary and has to do with how English and the Classical languages are written.


language is an artefact of cognition and also a tool of cognition, but not all cognition is lingual. We can make decent sense of some utterances without even knowing what the grammar is.

And while I agree with all of this, it doesn't follow from the previous line. I'd say these are entirely independent claims which I happen to think are right :)


Moreover, my science mind tells me that descriptive models are often found to be flawed, and can be improved over time, and that fuzziness and ambiguity are common properties of complex systems. Given that linguists sometimes argue over vowels and consonants I'm not surprised that they argue over bits of grammar too.

All the more so because there is no one standard unchanging dialect to be talking about. And every utterance is in some way unique.

But arguing about the details and arguing about the overall structures and mechanisms -- those are two different things. Both sets of arguments happen, but I do think there's overwhelming empirical evidence for one basic approach to the problem.

Which is, funny enough, to think about language as an artefact and tool of cognition :D


I've asserted my belief earlier that people recognise meaning without clear grammar. Since language is both a cognitive artefact and a cognitive tool I think it's easy to confuse a description of the artefact with a description of cognition. We understand many utterances just fine without knowing what the grammar is, because we understand the cognition behind the utterance.

Yes, but we recognize the cognition by virtue of recognizing sets of form-meaning correspondences. And the only way to say that's not grammar is to draw arbitrary lines in the sand.

We recognize meaning even when the grammar is unclear not because we're not using grammar, but because we're so good at using grammar that we manage to find a context-dependent best fit even when an utterance doesn't match anything we know.


Again, I see the terms "language" and "dialect" as a sort of classification of certain artefacts of cognition - and therefore not to be confused with cognition itself.

I think there's actually a three-way distinction. There's cognition, there's language, and there's language use. Any given utterance, any particular instance of language use, is a cognitive act. And a motor act, and a social act...
A "language" is a generalization over many of those uses, by many different people, over time. Cognition, meanwhile, encompasses language use and a bunch of other things we do, but we certainly don't store the platonic ideal of LANGUAGE in our brains.


For some kids there may not be a first language at all - they may learn multiple languages concurrently and not realise (for a time) that they are different languages.

Yep. One of my favourite examples is the "hathiphant", a term a hindi/english speaking toddler I know came up with to denote an elephant (Hathi in Hindi).

On a mostly unrelated note, it seems that early languages learned together are sort of represented together, while languages learned at different times have distinct rather than overlapping representations.
So something like Broca's Aphasia might not knock out all your languages, if you learned some as a kid and some as an adult. Which I think is just plain odd.


Might it simply be that 'dialect' is just an extension of varying intent for varying audiences? Is it possible that children learn well enough by example that they don't need grammar as a conscious model, but simply build an "expert system" of thought in their head that creates affine utterances without knowing why they are affine?

Yeah, I think this is right. Very few people really have grammar as a conscious model. Conscious cognition is the slow, unwieldy, clunky stuff. The important stuff, like vision, hearing, motor control, expert knowledge, we deal with unconsciously. If we had to build sentences consciously it'd take us days to say how the weather was.

I read a paper a while ago (I know, I suck at citations) that suggested that language is sort of like social grooming, only for larger groups. There seems to be a limit on group size in other primates - the number of members that can groom one another & thus build social bonds. I think there's a brain size correlation... anyway, humans don't fit the pattern at all, our social groups are far too big for our brain sizes. And when the split seems to have happened is about when they think language started...


Certainly, computer systems can do this. It's possible to build neural networks, Markov chains and similar stochastic information systems that can produce English-like utterances just from examples without ever entering grammatical rules. Indeed, if you try and pull these systems apart to find the grammars, you may not succeed.

And in some cases, markov chains model human behavior & child learning better than any rule-based model does. This one I actually can hunt down the cite for if you want it. I know when the paper was given & where; it was the Marr scholar paper at the Cognitive Science conference in um, I think 2005.

---


The same narrative can be expressed in different grammars. Google "ergative-absolutive languages" and "nominative-accusative languages" (the ones you know, I wager), and then come back and explain the difference in simple narrative. I wouldn't know how.

Nobody seems to know how.
And translations between the types of language have to actually change meanings to make sense in the different grammars.


Which doesn't matter. Language is naughty and changes anyway.
:LilLove:


But cognitive linguistics isn't really my area of expertise; we ought to ask Schweta as soon as we figure out the questions. ;)

But if you figure out questions, I have to like, answer them.
Instead of blathering on at random.
Where's the fun in that?
(and yes, I think grammar is just a system of regularities at different levels of abstraction, meself)


How come Dutch and German are two languages rather than dialects of a blanket language? I've heard/read German dialects that I found more obscure than Dutch (which I never learned). And it's not just phonetics and spelling.

*ahem*
"A language is a dialect with an army."
Thankya.

That's what we learn in Ling 5 :D
Division into languages is a political thing, not a linguistic thing.
cf. Dutch/German; Scots/English; Norwegian/Swedish; Hindi/Urdu.
On the other hand, cf. "Chinese", which is (in the spoken form) several mutually incomprehensible "dialects".


Well, that's the point I was making earlier. At what point do children realise that there are different languages/dialects?

Well, when you tell them, pretty much. And not even then, in some cases.
Especially since multilinguals are always code-switching. I think I was twelve or so before I realized that some suffixes I used when speaking Tamil were actually English.


Er, yes, that's what I've been arguing all along. Nobody has ever been taught a formal grammar before learning to speak (I think).
Er, I suppose Deaf Native Signers might, since they don't exactly speak.
But in general yeah, not so much. Formal grammars are useful for analysts, not for speakers.

...Oof!
Did I get stuffs in anything like a comprehensible and sensible manner?

Ruv Draba
04-24-2008, 03:51 PM
is there anyone here, even in this linguistically-clueful group, who doesn't cringe when a word is used "wrong"? Even if, in 60 years, that usage will be considered right? I know I do. And there's nothing wrong with that! It's just that... who gets to say what's right?

Well, maybe the culture is allowed to make its policy decisions on what is taught as standard (calling it 'right' might be stretching the point, but until we selectively breed pomposity out of our gene pool, I don't think we'll stop that either).

As for cringing when words are used in ways that indicate laziness, ignorance and indifference to my expectations, why not? I do the same when I eat food prepared this way, or share the road with drivers who drive this way. Language is a social activity. We flinch when people don't know how to relate to us. It places an additional burden on us to adapt to them after we've already learned to accommodate our local norm. It's painful and irritating. S'only human.


It's generally an issue of power. IMO this is the biggest reason people care about different dialects. It's really got less to do with comprehension than the feeling of group identity.

I don't quite agree. I think that there are different classes of response.

In my teen years I moved from a place with a very high immigrant English population to a place with high blue-collar Australian representation. My language - diction and grammar - and also many of my values and thoughts set me apart for a year or so. Although my family had less money than most of the local kids, and lived in the same sort of house, my schoolmates found it threatening, affronting, condescending and reacted accordingly. I don't believe they were exercising power for its own sake, or reacting to (nonexistent) power of mine, but simply dealing with the same sorts of reactions I described above: who is this tool and why won't he try to relate to me?

Within a year I'd found some humorous books on what we'd probably call working-class Sydney dialect, idiom and accent, and didn't just read them - I studied them. At home I spoke as I normally did, but at school I talked completely differently. While I made a few comical mistakes (because the books were 20 years out of date), it worked just fine.

So at the cultural level I don't think it's power - just human belonging. But at the pedagogic level... well, teachers inherit a lot of referred power. Some get excited by this and see themselves as divinely appointed moral authorities - and that's a point that I've probably flogged enough on this topic already.


And communities like (for example) Black communities in the USA resist standardization because it's seen as taking away their power.

I'm back to agreeing with you here. Cultures make decisions on language based in part on power. That's very evident in how colonial Australian language changed, for instance.

Their language. Which... honestly, it is. If you're trying to teach people to speak the prestige dialect instead of their own, or telling them they're stupid for speaking a non-prestige dialect.

Except that... the only way to do that would be to have a monoculture. Which is far worse than miscommunication, IMO.

Well, in the worst case it's comparable to genocide, so it's a human rights abuse. Article 2 of the 1948 UN convention (http://www.hrweb.org/legal/genocide.html) includes among its genocide definitions "causing serious bodily or mental harm to members of the group" where that act is "committed with intent to destroy, in whole or in part, a national, ethnical, racial or religious group". In certain (admittedly extreme) instances, I think you could mount a case. You can destroy a group without necessarily destroying any of its members. It's sufficient to take a run at their group identity - and language is a key part of that.


The scary thing is not how many people don't speak one language, but how many language groups and dialect groups are dying off.

That scares me for quite selfish reasons - we're losing a truckload of ancient and valuable thought with it. Stuff we don't even know how to translate or digest. Stuff that isn't recorded, and really can't be.

While less-prestige dialectal words come into the standards all the time as "cool" terms, it's so very clear that the standard is a political/power issue rather than a deep linguistic one

Yep. Language follows power, though I'm not terribly persuaded that the reverse is true (and hence while I'm fine with your academic interpretation of Sapir-Whorf, I'm not fine with its political misuses).

Yeah, and nobody looks at me funny when I say "heh" instead of laughing. "lol" is just too new, so it still sounds funny.

I've accommodated emoticons happily, but don't use lol. Maybe I'm afraid of incurring the Smiley Intervention (http://www.youtube.com/watch?v=NVOFmu2ZIqI) but I think more likely it's that LOL seems deceitful to me.


There's research into how people say "ouch" in different languages. It's pretty different! :)

Aduh! Aduh! Panjang sakit!

So like, we're all hoi polloi sometimes, and some people always are, depending on knowledge and willingness to think that way.

I'm fine with that, but I hold professionals to higher standards. I expect them to leave self-interest and cultural knee-jerks at the door (and collect them again on the way out).

(Yes, I'm often disappointed, but at least I get to rant. :rant:)



Except that differences in language do correlate with differences in thought, even when a bilingual is using another language entirely. They're not purely descriptive. I've already thrown data at you on this, I think, so I'm just going to stamp my foot and whine this time :D

Yes, but I've implicitly accepted that in saying that 'language is a tool of cognition', so you needn't whine at me any more. :poke:

On the other hand, that isn't why Sapir-Whorf is socially interesting. It's that there's this almighty Evel Knievel jump from an academic observation ("oh look! the tool changes the hand") to a piece of self-appointed and ill-conceived social policy ("and if we make hammers small, pointy and mandatory we'll breed Big Endianisms out of our society! Nyuk nyuk nyuk"). You couldn't argue that the tool changes the hand that much, and I know - I tried to get you to argue it and you sidestepped neatly. :D So from a policy perspective, I'm not sure that Sapir is Whorfwhile.


Well, what then is language?

It's more than the normally-defined "grammar," sure. It's a set of structured mappings between form and meaning, associated with all sorts of things (like social situation, statistical likelihood, etc), sez I.

I like that description better than anything a grammar-based description can give me, because there are plenty of grammatically well-formed sentences that have no meaning at all - and would never appear in usage because of it. And converselywise, there are plenty of agrammatical utterances (such as those yelled by mothers at children playing on the road) which have plenty of meaning.




language is an artefact of cognition and also a tool of cognition, but not all cognition is lingual. We can make decent sense of some utterances without even knowing what the grammar is.
And while I agree with all of this, it doesn't follow from the previous line. I'd say these are entirely independent claims which I happen to think are right :)

That's because I'm fond of making humpty-dumpty jumps and spraying yolk everywhere. There's a missing middle step, which is about how great lumps of our cognition involve mirroring and modelling our impacts on others and theirs on us. We even have neurons dedicated to this task, and they've been linked (theoretically rather than clinically so far) to language skills. So I don't think they're independent - I just didn't connect the dots clearly.

We recognize meaning even when the grammar is unclear not because we're not using grammar, but because we're so good at using grammar that we manage to find a context-dependent best fit even when an utterance doesn't match anything we know.

Well, we're very good at making up ad-hoc rules to fill in gaps. Wasn't there an experiment done in Japan where bogus ideograms were designed, and subjects were told that they were real but obscure kanji, and asked to decipher meaning?

This is back to tools change hands again, isn't it?


I think there's actually a three-way distinction. There's cognition, there's language, and there's language use.

It's a rare departure for me, but I might go behaviourist on you here, Shweta. When you say that there is language, do you mean that there's a referential abstract that can't be defined, or that there's a clinically isolatable, unitary, culturally circumscribed and shared object called language? Cos I'd believe the first (and argue that it's not unique or terribly well-defined but still handy to talk about), but I'd balk at believing the second.

Permit me to graffito:
Any given utterance, any particular instance of language use, is a cognitive act.

I don't think you need the middle bit.

A "language" is a generalization over many of those uses, by many different people, over time.

Or maybe it's just a thematic field of interest on clusters of affine utterances and scratchings? Okay, you said "language = dialect + army" (which I love!), but I'd argue (or, cos it's not my field, 'speculate' is probably more accurate) that "dialect" is just a cluster of interest.

Cognition, meanwhile, encompasses language use and a bunch of other things we do, but we certainly don't store the platonic ideal of LANGUAGE in our brains.

Which is good, 'cos I'm still trying to keep down my platonic solids.

One of my favourite examples is the "hathiphant", a term a Hindi/English speaking toddler I know came up with to denote an elephant (Hathi in Hindi).

Unconnected, but my sister had a great approach to pluralising in her younger years. You'd just repeat the last syllable of the word as often as you saw an instance. So a herd of elephants was "ellypadumpadumpadumpadump..."

HeronW
04-24-2008, 11:04 PM
Why can't the English teach their children how to speak?
Norwegians learn Norwegian; the Greeks have taught their
Greek. In France every Frenchman knows
his language from "A" to "Zed"
The French never care what they do, actually,
as long as they pronounce it properly.
Arabians learn Arabian with the speed of summer lightning.
And Hebrews learn it backwards,
which is absolutely frightening.
But use proper English you're regarded as a freak.
Why can't the English,
Why can't the English
Learn To Speak?

My Fair Lady!

I recall reading that the original Beowulf, Ms Jolie as a golden-tailed demon notwithstanding, is considered to be Old English, Chaucer is Middle and Shakespeare is Modern. I'm sure most people still find learning the Bard or Merry Wives an onerous task, but it does give a feel for the era, for the insults and slurs: 'don't give a fig' & 'biting my thumb', though I hardly think everyone went about rhyming or making sure their meter was correct.

English is a prodigious borrower of other languages and sometimes the growing pains are eye-watering.

There can be incorrect English (or any other language) when a dialect has degenerated to such a degree that it is unrecognizable and fails to convey the speaker's thoughts. It can be due to laziness of diction, to excessive slang, or verbal/written shortcuts.

As a translator I did a job with an Israeli woman who supposedly spent 10 years living in the US. She spoke fluent and yes, proper American English. She insisted on using a word in her web site and in her presentation that is not a word by any stretch in English, be it American or British.

She liked it, she liked the way it sounds even if it's a complete fabrication and dead wrong. Anyone who knows any variant of English will see it and be drawn to a screeching halt reading or listening to her spiel. She won't appear educated, she certainly won't seem knowledgeable, and all because she wants this non-word.

Yes, there is incorrect English, as there is incorrect language of any sort when those who use the language prefer fiction to reality.

Shweta
04-24-2008, 11:25 PM
As for cringing when words are used in ways that indicate laziness, ignorance and indifference to my expectations, why not? I do the same when I eat food prepared this way, or share the road with drivers who drive this way. Language is a social activity. We flinch when people don't know how to relate to us. It places an additional burden on us to adapt to them after we've already learned to accommodate our local norm. It's painful and irritating. S'only human.
Not saying we shouldn't, but the fact is people are sometimes lazy, ignorant, and indifferent to other people's expectations. Saying that being so is wrong is going prescriptivist on their ass. And yeah, I want to say it's wrong anyway, even though I'm academically committed to a descriptive view of language, because yes it's painful and irritating. It's fingernails on chalkboard to my aesthetic sense of language.

If we view language as just a method of transmitting information, then this shouldn't really be an issue. So long as we stay mutually comprehensible, what's to whine about? But (I'm agreeing with you,) it's a deeply social, socially important, activity. And I think that's why we care enough to be unable to kill our inner prescriptivists.


I don't quite agree. I think that there are different classes of response.

Well yes, I'd be horrified if everything about language actually came down to one word :tongue
But really, power structures are an incredibly important aspect of social structure in general, so much so that I think using "incorrect" language as an intimacy marker is saying "We are intimate, so I can trust that you're not going to play the power games with me". So I'd argue that there are power dynamics hidden in all of this, even if they're not (gods forbid) the only thing going on.

(To briefly derail this onto the topic of writing - I know, gasp, can I do that? - I think this is incredibly important in fictional dialogue. From what I've seen, dialogue works best when the characters are navigating a power dynamic as well as whatever else they're doing. It's one aspect of 'tension'.)


Although my family had less money than most of the local kids, and lived in the same sort of house, my schoolmates found it threatening, affronting, condescending and reacted accordingly. I don't believe they were exercising power for its own sake, or reacting to (nonexistent) power of mine but simply dealing with the same sorts of reactions I described above: who is this tool and why won't he try to relate to me?
But that's also a power issue. Your lack of relating isn't a threat unless it's an implied power issue. Consider: if you came in speaking very Indian English, or English with a German accent, or if you lisped, you'd probably have been viewed as funny rather than threatening. Even though you similarly didn't relate. It was the white-collar implication of your diction that was the threat.

We are always sensitive to differences in diction, and what they say about how we relate, sure. But we only react badly to other people's differences in diction when either we think they sound like they're better than us, or we think we're better than them and they aren't recognizing our power/prestige by adapting.


Within a year I'd found some humorous books on what we'd probably call working-class Sydney dialect, idiom and accent, and didn't just read them - I studied them. At home I spoke as I normally did, but at school I talked completely differently. While I made a few comical mistakes (because the books were 20 years out of date), it worked just fine.
Which is sort of proving my point. When your difference in diction implies that you're trying to adapt and failing, it's funny. When your difference in diction implies that you're not trying to adapt, and it's a difference that conveys a different social standing, it's a threat.

I got bullied horribly for not speaking "proper" English as a kid; but the American girl in the class got teased rather than bullied. Her difference was funny. Mine was a threat: how dare the stupid colonial not learn to be like us? She think she's as good as we are?

And here in lovely Southern California :rolleyes:, as far as I can tell, it's fine to speak English with a non-American accent so long as that accent is not Spanish. Then you're subject to all sorts of social stereotypes, and you're reviled for not trying to fit in. But oh my, if you're doing the same thing with a French or German or Indian accent, how nice that your English is so good.


So at the cultural level I don't think it's power - just human belonging.
Whereas I think humans are pretty good at adapting to and accepting differences, and allowing different people to belong to a group, so long as the difference doesn't go with a perceived power difference.


But at the pedagogic level... well, teachers inherit a lot of referred power. Some get excited by this and see themselves as divinely appointed moral authorities - and that's a point that I've probably flogged enough on this topic already.
Heheh, I'm not disagreeing with you on that, just that I think we all do it, and teachers just have that aspect of human nature exaggerated.
And hell, so do writers.


Well in the worst case, it's comparable to genocide, so it's a human rights abuse.
Yep.


That scares me for quite selfish reasons - we're losing a truckload of ancient and valuable thought with it. Stuff we don't even know how to translate or digest. Stuff that isn't recorded, and really can't be.
Yep.


Yep. Language follows power, though I'm not terribly persuaded that the reverse is true (and hence while I'm fine with your academic interpretation of Sapir-Whorf, I'm not fine with its political misuses).
I'm not fine with the political misuses of S-W either; they're all rooted in misunderstandings/misinterpretations, for a start.

But I do think that speaking "the language of power", the prestige dialect, causes you to gain power, because power is just another social construct here, and it's all about how people view you. So in the individual case, power can follow language.


I've accommodated emoticons happily, but don't use lol. Maybe I'm afraid of incurring the Smiley Intervention (http://www.youtube.com/watch?v=NVOFmu2ZIqI), but I think more likely it's that LOL seems deceitful to me.
Because generally we're not actually lolling when we say it?
Is that really deceitful when nobody expects you to be, or is it just hyperbole?


I'm fine with that, but I hold professionals to higher standards. I expect them to leave self-interest and cultural knee-jerks at the door (and collect them again on the way out).
:ROFL:
You're doomed to be disappointed, my dear. I don't know where one sees more pettiness, self-interest, knee-jerks, defensiveness, and pomposity than academia. Er well... politics, I guess. The saving grace is that generally reputation follows competence in academia.

I do think academia a wonderful thing, but... oh my. A group of people for whom reputation is everything, for whom being part of the in-group determines their whole career... yeah, they're not gonna be pushing their own agendas or anything.



Yes, but I've implicitly accepted that in saying that 'language is a tool of cognition', so you needn't whine at me any more. :poke:
Sort of. I'd also say that cognition is affected by language, just as it's affected by all other cognitive acts.


You couldn't argue that the tool changes the hand that much and I know - I tried to get you to argue it and you sidestepped neatly. :D So from a policy perspective, I'm not sure that Sapir is Whorfwhile.

:roll:
I wouldn't want to argue that the tool changes the hand that much. The evidence only supports the notion that the tool constrains what the hand is likely to do, not that it determines anything at all. The people who claim it does are selling something.
Something icky.


I like that description better than anything a grammar-based description can give me because there are plenty of grammatically well-formed sentences that have no meaning at all - and would never appear in usage because of it. And converselywise, there are plenty of agrammatical utterances (such as those yelled by mothers at children playing on the road) which have plenty of meaning.
Yep. It's where the "formalist" syntax-based theories fail every time, and why the only reasonable approach is one grounded in meaning, not form. In 60 years though, they haven't acknowledged it yet :rolleyes:
That gets us back to self-serving ideas in academics, of course. It'd ruin a lot of careers to acknowledge that.


There's a missing middle step, which is about how great lumps of our cognition involve mirroring and modelling our impacts on others and theirs on us. We even have neurons dedicated to this task, and they've been linked (theoretically rather than clinically so far), to language skills. So I don't think they're independent - I just didn't connect the dots clearly.

So I tend to assume mirror-neuron stuff* all over the place, and I still don't see your connection.... unless you're saying that the existence of non-linguistic or prelinguistic cognition means that some things just aren't affected by language.

Which is intuitive and makes sense but doesn't really seem to be true except in very low-level cognition. Even our categorical color perception seems to be affected by the language we speak. And color categories... well! Some of that neural processing is happening before it even gets to the brain!
The thing is, the brain is an ongoing mess of activation. And we use language so much that some linguistic structures are always, or often, easily activated. And once activated, they're going to have an effect on the other stuffs. So language has effects even when technically we could do that cognitive task without language at all.

* By the way, the clinical/experimental situation there is... not nonexistent. Say rather that it's messy, and being hidden under the carpet a bit until it starts to make sense.


Well, we're very good at making up ad-hoc rules to fill in gaps. Wasn't there an experiment done in Japan where bogus ideograms were designed, and subjects were told that they were real but obscure kanji, and asked to decipher meaning?
I'd agree, except that the rules aren't exactly ad hoc. They're informed by all the other structures we have in our heads, and generally consistent with them.


It's a rare departure for me, but I might go behaviourist on you here, Shweta. When you say that there is language do you mean that there's a referential abstract that can't be defined or that there's a clinically isolatable, unitary culturally circumscribed and shared object called language? Cos I'd believe the first (and argue that it's not unique or terribly well-defined but still handy to talk about), but I'd balk at believing the second.

Oh, I just mean the first. I don't think language is an object at all, it's a process distributed over groups of people, that we try to treat like an object because we like turning everything into nouns. That's why I keep trying to talk about usage and utterances - because language is really action rather than thing. (darn, more nouns.)

However, as a point of reference, as an abstraction/generalization, the notion of a language is a useful one. It's like "race" and "culture" and "gender" and "art" -- sure, they exist, they're useful terms, but we run into trouble when we decide that they're Really Truly Things and draw lines around them and try to shoehorn everything/everybody into them.


Or maybe it's just a thematic field of interest on clusters of affine utterances and scratchings? Okay, you said "language = dialect + army", (which I love!), but I'd argue (or cos it's not my field 'speculate' is probably more accurate) that "dialect" is just a cluster of interest.
It's a cluster of something, anyway. Not entirely clear on how you're using interest here.


Which is good, 'cos I'm still trying to keep down my platonic solids.
Only use platonic solids as a supplement to a full meal. Digestion of platonic solids alone may lead to anorexia.


Unconnected, but my sister had a great approach to pluralising in her younger years. You'd just repeat the last syllable of the word as often as you saw an instance. So a herd of elephants was "ellypadumpadumpadumpadump..."
:LilLove:
That's awesome!
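Purely for fun: that pluralising rule is regular enough to write down. Here's a toy Python sketch of it (the function name and the choice of "adump" as the repeated syllable are my own guesses, nothing from anyone's actual grammar):

```python
def childish_plural(word: str, last_syllable: str, count: int) -> str:
    """Repeat the word's last syllable once for every instance
    seen beyond the first -- the toddler pluralising rule."""
    return word + last_syllable * (count - 1)

# A herd of four elephants:
print(childish_plural("ellypadump", "adump", 4))
# ellypadumpadumpadumpadump
```

One instance gets you the plain word back; every extra sighting tacks on another syllable.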

nybx4life
04-24-2008, 11:36 PM
Wait, so that means that, just possibly, the grammar errors that the other posters keep telling me about my work aren't really errors at all, but a different view of correct English?

Wow, that light just made me say, AWESOME!!:D

BTW, if you think I'm getting ahead of myself, they were verb tense errors (past, present)

Shweta
04-24-2008, 11:42 PM
There can be incorrect English (or any other language) when a dialect has degenerated to such a degree that it is unrecognizable and fails to convey the speaker's thoughts (to whom?). It can be due to laziness of diction, to excessive slang, or verbal/written shortcuts.

Lots of assumptions in here!
"Degenerate" implies that there is a good true ideal that something has fallen from. Which is indeed the old idea. The ideal, though, was Latin, which is why some grammarians are still trying to stop us from splitting our poor infinitives. When a dialect changes, all it's doing is changing. It's still recognizable to the people who speak it, and conveys thoughts to them! It's just to us clueless folk on the outside that it seems like excessive slang.

And when we clueless folk speak the prestige "correct" dialect, we blithely claim that everyone else is wrong. And we get to enforce that, of course, so it's socially true. But it's still nonsense, linguistically speaking. Our "proper" modern English is no more like Chaucer's than all those "wrong" dialects are; indeed, some "wrong" dialects are archaic, and closer to the earlier language than what the in-group speaks now.

"laziness" of diction and "shortcuts" are in fact drivers of language change, and much of why the above is true.


As a translator I did a job with an Israeli woman who supposedly spent 10 years living in the US.

As a translator, it's your job to be a prescriptivist and say "This is right, this is wrong," with an eye to general comprehension. But as a translator, you'd have to say that (for example) Robert Burns' poetry is wrong. It's far too slangy, too dialectal, and so far from standard English as to be incomprehensible at times.

It's all about the purpose for which language use is right or wrong, though. Burns' poetry is wrong for ideal easy comprehension, but that wasn't his intent when he used dialectal language. His intent was political, it was to express a tribal difference. He couldn't have done that without the dialectal language.


She spoke fluent and yes, proper American English. She insisted on using a word in her web site and in her presentation that is not a word by any stretch in English, be it American or British.

And if her only purpose was to come across as a native English speaker, this is certainly wrong. From your point of view, with your professional goals, it's wrong, and you had to advise her of it. If she insisted on using it anyway, she had another goal besides clear/transparent comprehension to the masses.

This goal might have been profound or silly; we don't know. But whether she was actually right or wrong depends on whether her form of communication matched her desired outcome or not. There is no objective right or wrong there.


Yes, there is incorrect English, as there is incorrect language of any sort when those who use the language prefer fiction to reality.

No-o, there are things that seem incorrect when different people come in with different expectations, and there are definitely ways to fail in achieving our own goals, but I'd say that nothing is necessarily wrong; it depends on the language user's goal.
And we just need to look at "A Clockwork Orange" for that.

nybx4life
04-24-2008, 11:47 PM
Wait, so, it's bad to say that grammar is "incorrect" when the standards of grammar are different everywhere, right?

Like, if a person is speaking English, and their grammar is bad according to you, as long as you understand at the end what the person is saying, or meant to say, it's good?

Shweta
04-24-2008, 11:47 PM
Wait, so that means that, just possibly, the grammar errors that the other posters keep telling me about my work aren't really errors at all, but a different view of correct English?

Wow, that light just made me say, AWESOME!!:D

BTW, if you think I'm getting ahead of myself, they were verb tense errors (past, present)

I'd say it depends on your purpose. If your purpose is to communicate clearly with the people here, getting them to think the thoughts you want to put into their heads, and if your purpose is also to appear smart to the group -- then no, they're errors.

If your purpose, on the other hand, is to mess with people's sense of time, then sure, they might be correct.

There's no objective correctness. That doesn't mean there's no "correctness", just that it is the match between intended effect and actual effect, rather than A Grand Ideal In The Sky. Because language is a social/cognitive tool.

For an analogy, if I punch someone in the face and I intend that to be a friendly gesture, I failed. It was incorrect. But if I intended to hurt them and piss them off, then sure, it was right. And there will be consequences, just as there are to any social action.


Wait, so, it's bad to say that grammar is "incorrect" when the standards of grammar are different everywhere, right?

Like, if a person is speaking English, and their grammar is bad according to you, as long as you understand at the end what the person is saying, or meant to say, it's good?

Looking for excuses to get away with sloppy grammar, much? :D
Language is a tool. You can use it well or badly. If someone's grammar is unusual according to me, but I can still understand them, then the next question is what are they communicating with the unusual grammar? It might be a deep and subtle insight. It might also just be that they're a bleeding idiot. Depending on the context and the type of irregularity, I'll come to a conclusion on it.

You can't completely control what I make of your grammar, of course; so variations from the standard better happen for a good reason.

Certainly you can say "U dont gett i, i issa genyus, and gramar mines be fine okay?" and in some contexts that would be a pithy observation. Like (I hope), here. However, if you use the tools in a way that seems clunky and clueless to the reader, you lose the readers.

nybx4life
04-24-2008, 11:53 PM
So it depends on my view.....

Okay, so if it's to tell the story as it is, and not give a damn if they actually understand the time phrase of everything happening because I assume they SHOULD know, I should be correct...right?

For people to adjust the tenses of my work, I guess it's all about telling it as I see it, regardless if the message is clear or not.
I think I'm going completely off right now.

Shweta
04-24-2008, 11:57 PM
Okay, so if it's to tell the story as it is, and not give a damn if they actually understand the time phrase of everything happening because I assume they SHOULD know, I should be correct...right?
Certainly one can call anything "my style" and claim it's therefore correct. If your intent is to give people misleading cues about times and make them figure it out themselves, sure, that's "correct". But does it matter if you're correct or not, if your style causes people to go read something else? If you're careless about tense, they will, because that's a pretty lame style. :Shrug:

Critiques are normally given with the assumption that you want your writing to be readable. If that's true, then you're incorrect. If it's false, why post anything for critique? And seriously, if you care so little for the craft that you're looking for excuses not to work on it, why are you writing?

ETA: I've read at least one story that did mess around with tense, for story-specific reasons. It was excellent. Some people didn't get it, though, and thought the writer had tense issues. I just sold a story with no punctuation at all, for story-specific reasons. Some editors I sent it to before the one who took it, though, clearly thought I was sending them gibberish. That's a risk you take when you deviate from the norm. If you do it in an informed way, certainly it can work. The correct question, I think, as a writer, is not "Is it correct" but "Does it serve the story". If you're using weirdness for any reason other than "The story needs it", IMO you're doing it wrong.

nybx4life
04-25-2008, 12:02 AM
Nah, I was just trying to see if it was possible.

I'm just saying can that style possibly be used and still create a successful novel?
Probably used in a mystery, crime type story.


If I was that careless, I wouldn't be on these forums at all:D

Shweta
04-25-2008, 12:11 AM
I'm just saying can that style possibly be used and still create a successful novel?

Anything can be used to create a successful novel, IMO. Look at Memento/Memento Mori (http://en.wikipedia.org/wiki/Memento_Mori_%28short_story%29). The movie at least is told backwards.

But the trick, the weirdness, needs to be relevant somehow. You can't just expect readers not to notice. They'll notice. But they'll deal (at least some people will), if there's a reason for it.

Ruv Draba
04-25-2008, 04:13 AM
I'm just saying can that style possibly be used and still create a successful novel?

Sure, if it's for effect and the effect is either entertaining or illuminating. E.g. suppose you have a stressed-out character running two concurrent thoughts in its head - one is of something that has happened, and another of something that might. If those thoughts have resonance with one another you might get something like this:

They beat me and I didn't do it I'm not that sort of person even if she wants me to I won't do that not even when she looked at me and did that thing with her hair I didn't do it and I won't I'll just look just look.
The sentence isn't punctuated in a standard fashion. It's a run-on sentence and mixes tenses all over the place - all things that you wouldn't and shouldn't normally do. But it might convey this particular situation better than if you 'fixed' it:

(They beat me and I didn't do it - I'm not that sort of person!)

Even if she wants me to, I won't do that.

(Not even when she looked at me and did that thing with her hair...)

I didn't do it, and I won't...

I'll just look...

...just look.

The first one gives me a sense of urgency or panic; the second one gives me a sense of a character struggling with temptation. Either could be my preferred one in different situations.

Dawnstorm
04-25-2008, 01:20 PM
So there's no way we'll ever lack prescriptivists; is there anyone here, even in this linguistically-clueful group, who doesn't cringe when a word is used "wrong"? Even if, in 60 years, that usage will be considered right? I know I do. And there's nothing wrong with that! It's just that... who gets to say what's right?

Well, I do know the feeling, but I very rarely cringe. Amusement is more likely with me, as in the classic:

"Omg, I just saw a plain barley land!"

for

"Omg, I just saw a plane barley [edit: barely, dammit! ;)] land!"

I generally just translate inwardly and move on. (Which means I have inner correctness conditions that trigger for both production and reception, but don't spill outwards much when perceiving.)

There are things that rile me, but they have little to do with grammatical correctness. This can be oddly inconsistent.

My mother tongue is German. German inflects nouns for gender and has gendered articles, as well as pronouns. (English has gendered pronouns, but that's it.)

Now, the German word for "girl" is "Mädchen". Its grammatical gender is neuter, but that's no surprise, since the word is the diminutive form of "Maid" (which is pretty similar to the English word that looks the same, except for capitalisation). The "-chen" suffix indicates the diminutive form. You can tag it on to lots of nouns (you'll run into trouble with words that end in "ch" or "che", but that's about it). And here's the thing: all diminutive nouns derived in that way become neuter, regardless of original gender.

Now, of course, "Mädchen" takes the neuter article: "Das Mädchen". This doesn't bother me a bit.

Also, it's grammatically correct to refer to a "Mädchen" as "es", which is the neuter pronoun ("it"). Using "she" is technically incorrect. But I simply refuse to use the neuter pronoun.

So what's going on here? I'm viscerally opposed to correctness, but only when it comes to the pronoun, not when it comes to the article ("Die Mädchen" sounds positively weird to me.)

Is it a question of collocation (words that you're used to seeing/hearing together)? I don't know.

What makes us gripe is an interesting question. I feel that "correctness" isn't the whole of the story.


The dialects that get called "correct" are the prestige dialects, the ones spoken by people with the power.
Academia is in some sense just another power in the game. Not a big one, in my opinion, but people do respect that white lab coat :D

And communities like (for example) Black communities in the USA resist standardization because it's seen as taking away their power. Their language. Which... honestly, it is. If you're trying to teach people to speak the prestige dialect instead of their own, or telling them they're stupid for speaking a non-prestige dialect. (For an extreme example of this, look at what happened to so many Native American languages and cultural knowledge bases because the people in power decided it would be good for those children to speak fluent English).

[snip]

Not quite. It's often not common ground at all, except insofar as everyone is made to learn it, so it becomes common ground.

To take an extreme example, the standard "pure" dialect of Tamil is one nobody speaks natively any more. We hear it on the news and in the movies, and in poetry. It's the written dialect, so everyone has to learn it to be literate in the language. But it's like the Brahmin dialect of 300 years ago (Tamil has serious caste-based and regional dialect differences).

So this thing, it's called the standard. Nobody. Speaks. It. Except on TV.

This is an odd example, sure, but it's just an exaggeration of what happens everywhere.


The standard has always fascinated me. When I said "common ground", I meant more of a free-for-all social arena than a lowest common denominator; your "becomes common ground", really. I agree with what you're saying here, except I think that nobody speaking the standard ought to be the default analysis. I see the standard a bit like a pot plant; it's still a plant, obviously, and it grows like all the other plants, but if you don't pay attention, water it too little or too much, it dies. Unlike "real" (I'm trying to cut down on fancy vocabulary) dialects, the standard is dependent on conscious upkeep activity. You need to teach it. You need to compose in it. You need to edit dialects out. Etc.

Now for an abstract visualisation.

Imagine a blank computer screen. I prefer a rather dark grey, but black should work. In the middle of the screen we'll put Standard English. It's a bright amorphous oscillating blob, constantly growing and shrinking and reaching out tentacles and retracting them. In the immediate vicinity, we start to discover the natural dialects. They're less bright than the standard and even more amorphous, but their oscillations are more fluid, less erratic. The further you get to the marginal areas, the dimmer the dialects become, but unlike the more central dialects, their internal brightness varies a lot. Some may have spots of brightness that can, on occasion, even outshine the standard (the poets live there, and the preachers).

And now you start to notice the newsspeaker. He awakes and makes a couple of "wh-" and "mm-" noises. He's somewhere on the margins of his dialect now, standing half in the unlit parts. But as he becomes coherent he moves into the brightness of his native dialect. He lives in a capital, so the dialect is pretty bright already. He'll begin work in a few hours. He'll move to Standard English, then, as that's what newsspeakers do. Chances are, he won't even notice the transition. There's the twilight zone, where the standard has grown dimmer and his native dialect has grown brighter. A couple of tentacles meet there, as long as he's speaking. He could move further in, or stay there. The trip isn't much to speak of. Effortless. Few borders.

And that's why we get bored and go look for an ethnic poet. We'll find her somewhere in the dark zones, near a bright spot. She's reciting a poem. She makes a point of staying where she is, but she's probably facing the standard. She might be glaring, but I prefer to imagine her with a serene smile on her face, wink wrinkles around her eyes. The choice is up to you.

She's going to be interviewed about the poem soon. She's built herself a teleport device to make the trip to the standard, and - boy! - was that a chore. But she has one now, and it works. So as she finishes reciting the poem she steps into the device, and - poof! - she teleports. She ends up not quite in the center, a few dialects away. No biggie. She's used to it. Some of her friends live here. She realises where she is, clears her throat instead of speaking, and back into the device it is. Now she's arrived in the standard, but there's a thread, rather thin but glittering with alternating strands of dark and light, and it connects her to her native dialect. The thread can be embarrassing at times. People might stare and point. But, you know, she's chosen to light her own dialect a bit, so she doesn't want to sever the thread. She's seen what severing the thread has done to others. A former acquaintance, for example, is now running around with a dark cloud over his head, raining bits of discarded dialect on him in the most inopportune situations. Anyway, sometimes a bit of her thread comes off and gets absorbed by the standard. She's not sure how she feels about that, but, you know, why worry about the inevitable. The mission is to light the far reaches. Which can be quite hard, really.

In between Mr. Newsspeaker and Mrs. Ethnic Poet who can do interviews in the standard live lots of people. Most of them will visit the standard at times. Some go there, curious. Some commute daily. Others get dragged there, but they often don't really stay, existing in a confused flicker between disjointed places. A lot is possible.

There's a lot of fun to be had in the middle, where people live, still in the bright regions, but not quite in the centre. They'll tell you how the standard should look, and it's quite obvious they think they live there. But nobody lives there. It's a marketplace. You can take words there, and bring some home. But you can't live there. Not really.

So that's how I think of the standard. Kind of. ;)


I have the hardest time reading Nalo Hopkinson for this reason. But for exactly this reason, she's worth the effort. I'm getting more than just her stories. I'm getting the culture behind them, by immersion, which is scary and exhilarating and really does change the way I think. Which it wouldn't if it was written in standard English.

Absolutely.

I also enjoyed reading Xiaolu Guo's A Concise Chinese-English Dictionary for Lovers (http://www.fantasticfiction.co.uk/x/xiaolu-guo/concise-chinese-english-dictionary-for-lovers.htm). It's a story about a young Chinese country woman being sent by her parents to London to study English to help out in the family firm. It's written in diary form, but addressed to her lover, and her English gets better as the story progresses. In the first few chapters she knows hardly any English at all.

I haven't read anything like that before. A wonderful read (though it contains a couple of sexually explicit scenes that some people might not appreciate).



I can try if you're really interested though, or at least tell you who to ask :)

I am interested, but it's not a priority right now. I'm still reading up on construction grammar (thanks for that), as well as on relevance theory, which sounds interesting, too. Considering that linguistics isn't my job, it's going rather slowly. Hehe.


Dropping vowels left and right, turning all their s's into z's. Whatever will those barbarians do next and call it English? Forsooth.

Those traditionalists. Not realising that the British have turned the z's into s's. The -ize is the classicist's choice. But when the Brits turned francophile, -ise became more common. And now they claim the Americans bastardised their language. (Notice my preference for "s" in the previous sentence? How anti-classicist of me! Tsk.)

No, man, for once tradition is on America's side. And, yes, it feels odd to say that.


Didn't read the article. But your discussion reminded me of a study I did read up on at one point (7 years ago so don't ask me for citation pleeez). They basically found that you could teach kids the standard dialect pretty well if you told them that it was just another dialect, one they needed to use in certain situations.

If you told them the way they spoke was wrong, and this was right, they had a much harder time learning the standard dialect, and were quite resentful of it. Probably because they were being teased at home for talking funny.

Yay! (I haven't left my agenda at the door, see?)


But you know, I think that's true. Who really deep-down agrees that prescriptivism is tosh?

Me!

Wait!

I!

Um...


But if you figure out questions, I have to like, answer them.
Instead of blathering on at random.

Well, I could work out questions that can't be answered in any other way... Actually, I think I've done that before. Often. Hm...


"A language is a dialect with an army."

Hehe.


Er, I suppose Deaf Native Signers might, since they don't exactly, speak.

Te he. But seriously, I know very little about sign language. Are there "functional signs" or are they all lexical? Hm...


For people to adjust the tenses of my work, I guess it's all about telling it as I see it, regardless if the message is clear or not.

There may be something else going on. It's what I call "editor's blindness": when you're looking for flaws to amend them, you see them everywhere, and you forget to actually "read". The rules take over and lead you astray. That's why I need to take breaks from reading.

That is: sometimes things that don't work are "editing artefacts". If you were actually reading, you'd find it works, but you don't have the analytic tools to express why, and this is why it's kind of a blind spot in analytic mode.

I've seen critiques decrying "tense switches", which were - to me - pretty clear examples of the dramatic present tense. Kind of like this:


I was walking down the street the other day, and a nice day it was. I turned a corner and, before you know it, there's this big rhino in the middle of the road. And I'm not sure I can trust my eyes, because, really, what's a rhino doing in the middle of the road. Cars I get. Bikes. Perhaps even trollies, though they'd be odd. But... a rhino?

Anyway, the rhino hadn't noticed me yet, so I started to back off. Slowly. Quietly. So that it wouldn't notice me.

It noticed me.

I was scared.

It ignored me.

I was miffed.

I'm silly, aren't i? [Edit: Look, a mistake. This should be a capital "I". Why is it a mistake? Because I didn't intend to do it. How can you tell? All the other first-person-pronouns in the text are capitalised, even in the middle of a sentence. So it's pretty safe to say it's a slip. (I could have some obscure reason for that particular "i" to be lower case, but that would be a *very* obscure reason. The burden'd be on me to make it clear[er].)]

Things like that aren't even that rare in spoken language. There are plenty of other examples.

When it comes to literary style, the tense purists are more popular these days, certainly. I think this has something to do with a preference for "invisible" narrators, as tense switches midstream usually point towards a psychological relationship between speaker and content.

***

Too many interesting posts, too little time. (This one took me two hours. I wonder if there'll be another post before mine that wasn't there when I started typing? Happens to me a lot.)

Dawnstorm
05-06-2008, 11:28 AM
I'm going to use this thread to archive interesting links concerning rules. ;)

Here's another one (http://paperpools.blogspot.com/2007/08/cormac-mccarthy-semi-colon.html), from Helen DeWitt, whom I've never read but might now look for.

My fave quote:


I say: Look, we give force to these rules by complying with them for the sake of compliance. The more people comply with them, the harder it is for others not to comply with them, the more non-compliance looks like the prerogative of genius rather than just what any writer can choose in doing what's right for--

And later she elaborates in comments:


Yes - I hate those cleaned-up editions of Dickinson. I think one problem is that a lot of people are like Lynne Truss. They are happy to make an exception for writers of genius (if you're James Joyce it's OK to break the rules). The problem is, though, that writers of genius don't walk around with little halos over their heads, enabling us to distinguish the writers of genius from the riffraff. If the only justification for breaking rules is the claim to genius, very few unpublished writers will be allowed to break rules. They are by definition writers who have had no reviews, no critical assessment; the culture at large has not yet endorsed these particular violations of rules. Most writers surely want to make a much more modest claim, which is that the text handed in reflects the sensibility of the person who wrote it. Replacing that sensibility with that of a randomly-chosen stranger should not be the default procedure; it should be done only in exceptional cases, when very compelling reasons can be put forward. As things stand, however, replacing the author's preferences with those of a stranger is the norm; allowing the author's preferences to stand is the exception. Hence, I take it, the dreary sameness of so much that is published.

Shweta
05-06-2008, 12:51 PM
That's interesting, but I think there's also weight to the idea that, as a writer, one needs to know the rules to know how to break them effectively.
Rather than getting garbled prose.
Every diversion from the norm is a great big flag to the reader saying "This Means Something!" So it better.

Dawnstorm
05-06-2008, 08:58 PM
That's interesting, but I think there's also weight to the idea that, as a writer, one needs to know the rules to know how to break them effectively.
Rather than getting garbled prose.

Ah well, there's the rule tango again.

You see a text and think it doesn't work. How do you figure out which rule has been violated? What if someone else thinks it does work?


Every diversion from the norm is a great big flag to the reader saying "This Means Something!" So it better.

Meaning is cheap. I mean we're able to see bunnies in clouds. It's the default attitude that matters. If you're emphasising rules, the default attitude is that a diversion is a mistake, and you'll stop looking for meaning. If the default attitude is that a diversion means something, you'll go looking for meaning.

Really, if your default is to look for meaning all you risk is looking like a fool. But finding meaning that isn't there (or wasn't before you found it) is rewarding, too, in its way.

Rules, as a default, make you miss everything that's interesting about writing, ticking it off as a mistake.

To be sure, very few people are ever extreme in any one direction. Emphasising the "art" (for lack of a better word) to the extreme makes a solipsistic writer. But emphasising the rules to the extreme makes for a solipsistic reader. Neither position is very attractive, but - as a reader - trying to understand the solipsistic writer is at least potentially a rewarding project. Insisting on what I think is correct accomplishes nothing at all. I'd rather be a fool than a blockhead. (And of course I'm both. ;) )

In all human practice, "rule-status" is a thing of negotiation. It would be interesting to compare editing habits of the publishing houses I enjoy most (say Vintage) to the others. I wonder if their editing habits are more relaxed?

Ruv Draba
05-06-2008, 11:20 PM
You see a text and think it doesn't work. How do you figure out which rule has been violated? What if someone else thinks it does work?
I guess that as writers we crit (and surely produce) a lot of texts that don't work. From personal experience, maybe eighty percent of the texts that don't work for me fail for reasons of conceptual design rather than expression. In other words, it's the assembly of ideas rather than the formulas that represent them which somehow fails for me.

Really, when it's just the words failing, that's rather easy to fix - a single, painless revision can do it. Even quite pedestrian expression will carry decent ideas forward. But in my experience no amount of syntactic or grammatical fiddling seems to fix lame ideas (despite what rhetoricians would have us believe). Broken ideas are the ones that seem to require revision after revision to mend - if they're mendable at all.

Meaning is cheap.

Interpretation is cheap! Insight is painful and hard-won for both writer and reader. What's the difference? Robustness I think.


Really, if your default is to look for meaning all you risk is looking like a fool. But finding meaning that isn't there (or wasn't before you found it) is rewarding, too, in its way.

Rules, as a default, make you miss everything that's interesting about writing, ticking it off as a mistake.

It's not mutually exclusive though, is it? If the syntax is making sense to us, we'll tend to jump on meaning. When the meaning's not making sense to us we'll start paying attention to syntax again.

And sure, maybe some pedants refuse to look for meaning when the syntax isn't right, but that's like insisting that every meal is meat on the right, peas on the left and mashed potato at the top of the plate (yes, I believe that rigid grammarians would make fine English cooks).

I'd rather be a fool than a blockhead.

This fine maxim pretty surely sums up all fiction-writing endeavours, while the reverse might make a halfway decent description of editing. "No, no you be the fool. I'll be the blockhead." :banana::banana::banana:


In all human practice, "rule-status" is a thing of negotiation. It would be interesting to compare editing habits of the publishing houses I enjoy most (say Vintage) to the others.

Wow. Is that inferable without knowing what the input manuscripts are? What sample size would you need for 95% confidence.... And against what 'standard' would you need to compare... And can you do that with ideas and tropes too, say? Hmm..hmm...

Shweta
05-07-2008, 03:20 AM
You see a text and think it doesn't work. How do you figure out which rule has been violated? What if someone else thinks it does work?
We negotiate it as a culture. There is no free and easy right and wrong.


Meaning is cheap. I mean we're able to see bunnies in clouds.
Which doesn't actually mean meaning is cheap, it means people are very good at finding meaning. Which is different. It means that was crucial to survival. Finding bunnies in clouds is the same skill as finding tigers in the undergrowth.


It's the default attitude that matters. If you're emphasising rules, the default attitude is that a diversion is a mistake, and you'll stop looking for meaning. If the default attitude is that a diversion means something, you'll go looking for meaning.

Humans go looking for meaning anyway. Humans who have learned that There Are Rules just get it wrong. We think that what's wrong is the syntax, when what's wrong is that we're getting garbled meaning.

Which is why every "rule" can be broken if you do it right.


Really, if your default is to look for meaning all you risk is looking like a fool.

I think you mean something different from me by "looking for meaning" :)
Looking for meaning, by every study I've seen, is just something people do. Looking for Deep Symbolism is something we're taught to do, and it does get silly when taken to extremes. But we all see... bunnies, say, or human faces, in the strangest places.

Can't you just see this one going "eek, that snake is headed my way"?

http://pro.corbis.com/images/42-17662601.jpg?size=572&uid=%7B29650750-8566-4dee-a35b-423a7ae990e3%7D


Rules, as a default, make you miss everything that's interesting about writing, ticking it off as a mistake.

Yes, I think we're in agreement. On the other hand, there are ways to make one's meaning clear and levels it needs to be clear on. There are useful rules of thumb for getting there, and it's silly to think one does not need to know them.


From personal experience, maybe eighty percent of the texts that don't work for me, fail for reasons of conceptual design rather than expression. In other words, it's the assembly of ideas rather than the formulas that represent them, which somehow fails for me.
Is it "bad ideas" or is it "ideas that are not taken to their logical conclusions, that the reader isn't taking seriously enough"?
For me it's generally the latter.



Interpretation is cheap! Insight is painful and hard-won for both writer and reader. What's the difference? Robustness I think.
I like this. I think it's a good way of looking at it.



And sure, maybe some pedants refuse to look for meaning when the syntax isn't right
Well, they can try... :tongue

Ruv Draba
05-07-2008, 06:19 AM
Which doesn't actually mean meaning is cheap, it means people are very good at finding meaning. Which is different. It means that was crucial to survival. Finding bunnies in clouds is the same skill as finding tigers in the undergrowth.

Further to that, there's quite a bit of evidence to show that it's an evolved, hard-wired ability rather than simply learned. Two interesting pieces of evidence (from memory - references available if you press me hard enough):

Facial recognition involves brain cells that appear pre-lingually. People who lose those brain cells (and hence have terrible trouble recognising people) can also no longer see faces in clouds and powerpoints and grilled cheese sandwiches.
Girls have an extra optical sensor for green. It picks up a different frequency to the one that both girls and boys have. Other than giving girls excuses to make up colour names (like 'taupe' and 'teal') that boys seldom use, this would be an extremely useful facility when you're gathering nuts and fruits from among leaves - and avoiding tigers.
Humans go looking for meaning anyway. Humans who have learned that There Are Rules just get it wrong.

This may be an intuitive/sensate judging/perceiving dichotomy in the Myers-Briggs sense (I'm married to a former psychotherapist; she guesses Myers-Briggs types with an accuracy that would make astrologers shrink in shame). At the extremes, intuitive perceivers are very good at considering alternatives and contradictions in parallel, formulating multiple models for an event and making messes all through their house; sensate judgers are very good at finding exceptions to rules and sorting buttons.


there are ways to make one's meaning clear and levels it needs to be clear on. There are useful rules of thumb for getting there, and it's silly to think one does not need to know them.

If you want to make sense to sensates (especially sensate judgers), you have to work in familiar modes. Most writers are intuitives and many are very happy to wade through gobbledygook to find 'possible meaning'. But try to get your accountant or filing clerk to do the same. They'll take a quick look and discard it definitively as 'crap'.


Is it "bad ideas" or is it "ideas that are not taken to their logical conclusions, that the reader isn't taking seriously enough"?

Oh heavens.. don't get me started! It's more than that... wrong characters in the wrong situations, doing the wrong things for the wrong motives with the wrong arcs supporting murky, fuzzy or nonexistent themes... Sometimes, the only logical conclusion is that the writer didn't know what he was doing or why he was doing it.

(Mothers, never let your logician son turn to fiction. It'll blight his mind, curse your family and he'll be a wen on the arse of the world. Even Bertie Russell gave up after his first novel.)

Shweta
05-07-2008, 06:26 AM
Further to that, there's quite a bit of evidence to show that it's an evolved, hard-wired ability rather than simply learned. Two interesting pieces of evidence:

Let's add in that babies mimic facial expressions at a few hours old :)
Apparently they stop doing this after a very little while.


This may be an intuitive/sensate judging/perceiving dichotomy in the Myers-Briggs sense
Maybe!
However, I do think (and this is the cognitive science bit at work) that all humans are pretty much alike on a bigger scale. Compared, for example, to other species or objects. However logical a human, they're still more intuitive and pattern-matchy than a computer; however intuitive, they're still more logical than a bunny.
So we're all a mix-up, no matter how we score on those tests.

Ruv Draba
05-07-2008, 08:45 AM
I do think (and this is the cognitive science bit at work) that all humans are pretty much alike on a bigger scale. Compared, for example, to other species or objects. However logical a human, they're still more intuitive and pattern-matchy than a computer; however intuitive, they're still more logical than a bunny.

Oh, we're something like 98% genetically identical. We're more alike one another than fruit-flies are to each other. From a species perspective, all the cultural, racial, cognitive, gender, personality differences are either meaningless or trivia. We all do our thing like primates and that's that.

However to us as a cognitive animal differences are extremely important because our ideas are so rich, complex and nuanced. If you've ever tried to pitch an idea well outside your Myers-Briggs type (and don't know that you're doing this), it can be the difference between a minute of hand-waving and mutual head-nods and six hours of fruitless mutual frustration. It might not be that the idea isn't comprehensible. It's that the strategies by which we take in information, make decisions and validate them can be very different (http://www.youtube.com/v/Zo1XFz0kac0) (I use this example to help train my techy consultants in how to deal with business-people. :D)

Dawnstorm
05-07-2008, 11:08 AM
Well, that last one wasn't exactly my best post ever. I'm rather confused, lately. (Lately? Well, more confuseder, anyways.)


Really, when it's just the words failing, that's rather easy to fix - a single, painless revision can do it. Even quite pedestrian expression will carry decent ideas forward. But in my experience no amount of syntactic or grammatical fiddling seems to fix lame ideas (despite what rhetoricians would have us believe). Broken ideas are the ones that seem to require revision after revision to mend - if they're mendable at all.

Which is why I do language flow and grammar last. Doing it first is kind of silly, considering all the re-writes and cuts. I don't have that time.

But I was specifically thinking grammar. I know what it's like to have rules disproved. Example: A couple of years ago I thought singular they was a recent phenomenon, born of the political correctness movement to equalise gender. I rejected it, corrected it where I saw it (with the usual disclaimers, since I hate rules and I hate being hypocritical and being hypocritical about being hypocritical makes me feel better). Fine, so far. Then I stumble upon a debunking of the proscription against singular they, and I'm given examples dating back at least to Shakespeare. That pretty much shot my political correctness hypothesis down. (They might have introduced some variant, or popularised something latent, or...) Anyway, I'm intrigued, now. And I found examples everywhere. And then I found an example much like, "A mother should love their children." Fine. Now I know it's definitely not about gender ambiguity either. I don't know many male mothers. One day, this scene pops into my head, just like that. Child at the table, feet dangling. Eyes anywhere but on the plate before him. Mom squints at him: "Somebody hasn't eaten their dinner." Now scenes this banal often pop into my head just like that. But where did that piece of singular their come from? Could that have popped up during my I-hate-singular-their phase? (I wouldn't be surprised, since "rules" we hold tend to have little to do with what we really, deep down, think about language. They're usually rationalisations for... something? Or maybe not.)

And yet something changed. Getting rid of a rule is liberating; and in this case it has improved my analytical reading ability (might have improved my general reading ability, too, but I'm not sure).

This is about daily-life policy as much as it is about language. Did I really ever hold this rule as a language production rule? A reception rule? Or did I only hold the rule in contexts of political correctness, overgeneralising, and not even noticing singular-their occurrences outside of these contexts?

I've become wary of grammar rules we hold discursively since then. There's always the possibility that my practical grammar - the generator and the parser - doesn't really care about my words.

"Grammatical rules" as expressed in words, I think, are pretty different from the actual generative or perceptive rules we're using to make sense of language. The reasons that we're spreading certain rules (such as my singular-their aversion [ex] above) may have little to do with language.

Lurking online, also in grammar forums, has taught me that - at least with native speakers - intuition is a more reliable measure of "correctness" than "rule mongering".


Interpretation is cheap! Insight is painful and hard-won for both writer and reader. What's the difference? Robustness I think.

No insight without interpretation. Plenty of interpretation without insight. ;)


It's not mutually exclusive though, is it? If the syntax is making sense to us, we'll tend to jump on meaning. When the meaning's not making sense to us we'll start paying attention to syntax again.

Nah, I wasn't talking semantics vs. syntax; I was talking the semantics of syntax - as Shweta said (I'm paraphrasing, hopefully correctly) that "breaking the rules" should be a meaningful, rather than an accidental, activity. (Did I get that wrong?)

My take here is that a certain usage can be meaningful, independent of the rule-status you assign.


And sure, maybe some pedants refuse to look for meaning when the syntax isn't right, but that's like insisting that every meal is meat on the right, peas on the left and mashed potato at the top of the plate (yes, I believe that rigid grammarians would make fine English cooks).

I seem to remember Steven Pinker analysing William Safire's reading of a sentence by Barbra Streisand, and showing how (idiosyncratic) rule application has made him blind to the meaning. (Found it. (http://camba.ucsd.edu/~bakovic/ll/grammar_puss.html))

Now, I doubt that Safire is a bad reader. I rather think that's what happens when you're looking for mistakes.


This fine maxim pretty surely sums up all fiction-writing endeavours, while the reverse might make a halfway decent description of editing. "No, no you be the fool. I'll be the blockhead." :banana::banana::banana:

My inner block-head is hard to move...


Wow. Is that inferable without knowing what the input manuscripts are? What sample size would you need for 95% confidence.... And against what 'standard' would you need to compare... And can you do that with ideas and tropes too, say? Hmm..hmm...

Oh, I'm satisfied with qualitative methods, here. I think quantification in these instances would simulate an exactness that doesn't really exist. And, no, no ideas/tropes. Just grammar. As far as you can tell them apart that is...


We negotiate it as a culture. There is no free and easy right and wrong.

Absolutely. But the negotiation still happens between individuals. (My degree's in sociology, so don't ask how individuals intersect with culture. Pleaaase...)


Which doesn't actually mean meaning is cheap, it means people are very good at finding meaning. Which is different. It means that was crucial to survival. Finding bunnies in clouds is the same skill as finding tigers in the undergrowth.

Agreed. (Did I mention that the "meaning-is-cheap" line isn't among my best?)


Which is why every "rule" can be broken if you do it right.

Again, yes. But it's not a given that two people are using the same rules. And it's not a given that people are using the rules they think they are using. And it's not a given that formulated rules are helpful in understanding others/being understood.


I think you mean something different from me by "looking for meaning" :)
Looking for meaning, by every study I've seen, is just something people do. Looking for Deep Symbolism is something we're taught to do, and it does get silly when taken to extremes. But we all see... bunnies, say, or human faces, in the strangest places.

No, that's actually pretty much what I'm thinking. What I'm arguing is that rules hinder that process, and once a rule interferes, you'll have to make an effort to go back to that natural process, and if you fail you end up with nonsense interpretation (not just wrong, but something that's neither conforming to rules, nor intuition; something that makes no sense either way.)

This is why I dislike rules. I find I'm a better reader without them.


Can't you just see this one going "eek, that snake is headed my way"?

Hehe. Good one. (You had to tell me, though.)


Yes, I think we're in agreement. On the other hand, there are ways to make one's meaning clear and levels it needs to be clear on. There are useful rules of thumb for getting there, and it's silly to think one does not need to know them.

Actually, I think it's sufficient to know the ways and levels. You can then figure out how to get there. "Rules of thumb" are really just short cuts. Knowing them can save you from inventing the wheel all over again, but inventing the wheel all over again will give you a more solid understanding.

Shweta
05-08-2008, 02:08 AM
I think we're pretty much in agreement.

Let me try and splain what kind of "rules" I think are useful. Maybe I should call them "consequences" instead? What I mean is stuff like "If you do X the reader will get Y out of it."

For example, if you use lots of "speech" verbs that aren't "to say" the reader will notice them, and might be popped out of the story. Not because they're wrong, IMO, but because they're a deviation from the (current) norm.

Or, if you use the passive construction, you shift focus (generally) from an actor to an experiencer. Overused, this shifts the emphasis of the story away from action, which might hurt you. It's also more words, so it reads as being slower.

Stuff like that. Rather than "Don't ever use anything but say" or "Avoid The Passive".

Ruv Draba
05-08-2008, 03:10 AM
Let me try and splain what kind of "rules" I think are useful. Maybe I should call them "consequences" instead? What I mean is stuff like "If you do X the reader will get Y out of it."

I'd call that a model, not a rule.

Writers need to model the impacts of their language and flow of ideas on readers. If we don't do that then we assume (incorrectly) that whatever way our brains think, will make perfect sense to others.

In practice, writing out thoughts we already have produces a very different impression on us than reading thoughts we don't already have.

I suspect (indeed, more than suspect) that we need less grammar to express existing thoughts to ourselves than we do in expressing new thoughts to others. From that notion I conclude that grammars are often just guide-rails to help hearers of new utterances infer the intended meaning and impact. But just as guide-rails don't always prevent accidents and aren't always needed in well-trodden pathways, so grammars are neither sufficient nor always necessary to assure clarity of meaning.

And as a corollary: if you are very good at modelling impact then you may not need to think in terms of grammar and 'rules' much, or at all.

Shweta
05-08-2008, 03:17 AM
Oh, we're something like 98% genetically identical.
I thought we were like 98% identical to chimps and more like 99.something% identical to one another genetically. Not that the numbers matter.


However to us as a cognitive animal differences are extremely important because our ideas are so rich, complex and nuanced.
Agreed. We're sensitive to the differences. They seem huge. Just saying that, in context, we're not that different. Which is why we can communicate at all.


I'd call that a model, not a rule.
I like that word. I steals it. I have models :tongue


I suspect (indeed, more than suspect) that we need less grammar to express existing thoughts to ourselves than we do in expressing new thoughts to others.

Or to evoke old thoughts in other people, too. Thus, couple-talk.


From that notion I conclude that grammars are often just guide-rails to help hearers of new utterances infer the intended meaning and impact.
Ha. And it only took Linguistics 60 years to get there :D

Ruv Draba
05-08-2008, 05:27 AM
Agreed. We're sensitive to the differences. They seem huge. Just saying that, in context, we're not that different. Which is why we can communicate at all.

Sure. In fact, the Pinker article that Dawny cites points out that kids build language without grammar. Indeed, they can even do it with other kids whose native languages are something else. (It was kids who invented the Creole tongue, for instance.) That's only achievable because of the ninetysomething percent simian similarity, I'll wager.


Or to evoke old thoughts in other people, too. Thus, couple-talk.

Or... to display infantile regression and the triumph of the limbic mind over thousands of years of human civilisation. Thus, LOLcats. :tongue

Shweta
05-08-2008, 09:09 AM
Sure. In fact, the Pinker article that Dawny cites points out that kids build language without grammar. Indeed, they can even do it with other kids whose native languages are something else. (It was kids who invented the Creole tongue for instance.)
There are quite a few creoles. English is, arguably, one of them.
Pinker and his ilk use that to argue that grammar is innate. My lot use it to argue that people are good at finding ways to communicate meaning :)


Or.. to display infantile regression and the triumph of the limbic mind over thousands of years of human civilisation. Thus, LOLcats. :tongue

But LOLcats are pretty sophisticated. They're feigning several types of simplicity for humorous effect, and choosing the language so that it makes people construe/reconstrue a scene accordingly. I hypothesize that, in the best LOLcats, the level/type of misspelling slows the reader down juuust enough for the timing to be right for the joke.
That would take experimentation of course, and can you imagine the grant proposal? :tongue

Dawnstorm
05-08-2008, 10:37 AM
Let me try and splain what kind of "rules" I think are useful. Maybe I should call them "consequences" instead? What I mean is stuff like "If you do X the reader will get Y out of it."

Yep, that's fine with me. (But like Ruv, I wouldn't call that rules. To me rules are in the imperative.)


Or, if you use the passive construction, you shift focus (generally) from an actor to an experiencer. Overused, this shifts the emphasis of the story away from action, which might hurt you. It's also more words, so it reads as being slower.

Must... resist... disagreeing... (Other thread. Other thread. Other thread.)

;)

***

Ruv, I have to get used to the way you're using the word "grammar". Intuitively, I think of grammar as the structure that underlies all language usage and can be analysed, not so much a conscious generative/revision model.


My lot use it to argue that people are good at finding ways to communicate meaning :)

That's more my position, complemented by a good social (re)production model, perhaps. Need to think.


can you imagine the grant proposal? :tongue

http://icanhascheezburger.files.wordpress.com/2008/03/funny-pictures-angry-cat-questions-lolspeak.jpg

http://icanhascheezburger.files.wordpress.com/2007/04/469758086_051b1dd752.jpg

http://languagelog.ldc.upenn.edu/myl/llog/ORly.jpg

I needz munnies!

Plz?

Shweta
05-08-2008, 11:12 AM
Oh uh, I thought we ended up mostly agreeing about the passive thing except for some things we defined differently.

Annnnyway, back on subject, ish:

http://www.sheldoncomics.com/strips/sd080507.gif (http://www.sheldoncomics.com/archive/080507.html)

Ruv Draba
05-08-2008, 03:11 PM
There are quite a few creoles. English is, arguably, one of them.

Fair call, but... but... I Capitalised it. Which is like Trade-marking! You know I meant the Louisiana regional creole. (Same as Miss Universe doesn't include Venusians, and World Wrestling Entertainment barely stretches past Mexico.)


But LOLcats are pretty sophisticated.

Only in the sense that Look Who's Talking (http://www.imdb.com/title/tt0097778/) appears sophisticated. :tongue. The cat images are proxies to decontextualise and condense the human emotions in the dialogue. If we put the lines back into the mouths of adults we'd probably call it infantile and somewhat repulsive. :crazy: To be humorous as opposed to just bewildering, it requires the viewer's partial but imperfect understanding of cat behaviour. Rather than reading cat emotions (which differ substantially from human emotions), we need to be able to do the "bunnies in the clouds" thing and project human analogs onto their actions, while at the same time remaining ignorant of (and not thinking about) the true cause of the events depicted.

It's awfully postmodern in some ways, since the stories are 'found' stories. The cleverness is in the juxtaposition or deconstruction more than the individual elements. I think of LOLcats as the graphic novelist's equivalent of rap music. :D

LOLcats alienates (alienate?) me terribly (which is funny in itself) - because I preconsciously interpret the image and text as separate information streams. I have to actively dismiss my mental image of the author uttering those lines, dispel whatever situational story I was concocting about the cat, and then conceive of the cat uttering them instead. Only then do I get the joke. :D

Ruv Draba
05-08-2008, 03:26 PM
Ruv, I have to get used to the way you're using the word "grammar". Intuitively, I think of grammar as the structure that underlies all language usage and can be analysed, not so much a conscious generative/revision model.

I've noticed the difference too. Bear in mind that this is your field and not really mine. What I've pieced together is largely utilitarian - and may well be flawed since my uses are limited.

On the other hand, I like Pinker's point that we're all (albeit preconscious) experts in language construction and recognition. The nearest that I can come to your view, Dawny, is that a grammar is a heuristic recognition model for a specific language, and hence also a heuristic generative model.

If the high level of fuzz is bothersome there, it needn't be! Modern computation makes use of such models all the time for complex systems. You can find uses in market segmentation, fraud detection, epidemiology, stockmarket predictions - any time where the system is complex, dynamic and reacts to its own behaviour in chaotic ways. My gut feel is that language bears a close resemblance to such systems. (It changes more slowly than most of my examples, but still faster than grammarians seem able to keep up.)

Arguably (in a gedanken sense) one might be able to produce a non-grammatical language recognition/production system that worked much better (i.e. more accurately and comprehensively in the recognition/production of meaning) than any standard language grammar and lexicon. And if that's the case, then grammar is likely not the foundation of language, but merely a convenient if imperfect abstraction to assist study.

ColoradoGuy
05-08-2008, 04:28 PM
My simplistic observation is only that grammar is a system imposed by humans both to understand the cacophony of language noise we make and to bring order to it -- thus both comprehending language and regulating it. One's viewpoint of what grammar fundamentally is then depends upon where he finds himself on that explanatory spectrum.

Higgins
05-08-2008, 05:24 PM
Girls have an extra optical sensor for green. It picks up a different frequency to the one that both girls and boys have. Other than giving girls excuses to make up colour names (like 'taupe' and 'teal') that boys seldom use this would be an extremely useful facility when you're gathering nuts and fruits from among leaves - and avoiding tigers.


The proportion of females with an extra cone type seems to be pretty low...maybe no more than one or two percent.

Supposedly females can also see better at low and high ends of the visual spectrum...but this all seems pretty thin in terms of actual evidence and evolutionary significance.

Dawnstorm
05-08-2008, 08:21 PM
Oh uh, I thought we ended up mostly agreeing about the passive thing except for some things we defined differently.

We came to common ground on this:


Or, if you use the passive construction, you shift focus (generally) from an actor to an experiencer.

Different terms.


Overused, this shifts the emphasis of the story away from action, which might hurt you.

No matter what terms we're using (but I might be confused here), I don't see how this shifts the emphasis away from the action, as the action itself doesn't care which way it flows. The action is exactly the same in both active and passive, no matter "which way it flows". (I can't remember actually talking about this in the other thread, but it might again be terminology: "it shifts the emphasis away from "agency", but towards the action itself," quoth I. *Shrug*)


It's also more words, so it reads as being slower.

This I don't remember having talked about at all. I don't think "slow prose" is as much a function of word count as it is of rhythm. A well-placed auxiliary might actually speed up the reading by connecting two stressed syllables that would otherwise clash (two stressed syllables one after the other slow down *my* reading, I find). When it comes to language flow, I think that rhythm is more important than word count; and the passive isn't necessarily disadvantaged here (it could make the transition from "Dam da" to "da Dam da", which could improve the flow in context [see "TOok | aWAy" vs. "was TA- | -ken aWAy": different prosodic emphasis, different flow (since the break comes within a word rather than between words it feels more connected), plus the accidental parallel "ay" sound]).

(I'm not so fond of the more-words-is-bad school of writing. I find it's better to think in terms of syllables and stress.)

I think we're naturally more sophisticated in writing than the rules can ever be, so that in thinking of our writing in rules of thumb (such as the general "more words is bad" or "have agents do things") we're dumbing down our writing for the sake of easy handling. (Part of this is why I linked to deWitt's article; the point isn't whether she's right or wrong about her choices; the point is that her choices are disregarded in favour of a smooth handling of business - without the rules in the first place communication could be a lot quicker and more reliable, even though - or because - the initial communicative effort is higher.)

The link above is about how industry standards enforce a rigor when it comes to rules that is detrimental to the author's intention, and favours conformity (whether it's detrimental to the text is a different question altogether). I think the rather complex relationship between "rules" and "exceptions" can be puzzled out by looking at this Shakespeare line:

"Shall I compare thee to a summer's day?"

It's the opening line of a sonnet. Shakespearean sonnets are in "iambic pentameter". It's one formal property of the sonnet, though not the most important one. Now look at how many iambs there are in the above line. The only clear "iamb" is "compare". The way I read it, I get a second one at "-er's day" (-> Shall I | compare | thee | to a sum- | -mer's day? ) A consistent iambic reading would make "thee" an unstressed syllable, leading up to stressed "to", which readingwise makes no sense at all.

So what constitutes an iambic pentameter, then, if it's not what's actually on the page? There are many takes on that. A common one's here (http://meadhall.homestead.com/BlankVerse.html):


Blank verse, on the other hand is a traditional form--unrhymed iambic pentameter. In theory it consists of ten syllables made up of five pairs of syllables, the first unstressed, the second, stressed. In practice a passage of iambic pentameter is made of lines that range from nine to eleven syllables with five stresses to each line, though not all arranged in an iambic ( x / ) pattern. The perfectly regular iambic pentameter line lies like a bass line behind the verse, but is seldom followed exactly. We seem to have some inner sense of iambic rhythm, for we hear it in the tick-tock of every clock, though in reality the sound is only in our ear. There is no actual difference between the ticks and the tocks; we merely impose the rhythm upon them.

Basically, we're seeing bunnies in clouds. But it's more complicated, because we not only perceive but also produce. Now the question is: to what extent does practising "perfect iambs" help you get at the behind-the-scenes rhythm? This is where people are different. But at the critique stage, rule-application becomes moot. You won't go tell Shakespeare to regularise his iambs. And it's not because Shakespeare's a genius. It's because the rule was never supposed to be a stricture in the first place.

My take is basically that as a writer you should either go on seeing bunnies and not bother with the clouds; if your intuition works you'll be fine. Or you should go straight at the clouds - as you won't be able to ignore the bunnies anyway. Once you're better able to see the cloud for what it is, you can start to try and find out why you saw a bunny in the first place. I don't think focussing on the bunnies is a good idea. It diminishes natural variation; every bunny will look alike. (I've seen many an interesting but raw text regularised that way - and become dull. I now rarely venture into critique threads; I always end up disagreeing with the critiques, which isn't exactly popular.)


The nearest that I can come to your view, Dawny is that a grammar is a heuristic recognition model for a specific language, and hence also a heuristic generative model.

That's the point, though. Is it a view? Or do we merely use the same sign for different concepts?

How are we capable of telling we made a mistake? Why do we defend some usage, but are embarrassed about other usage? That's where we see grammar; it surfaces like the Loch Ness monster - we're never quite sure whether we've really seen it, but we have a word for it.

Ruv Draba
05-08-2008, 11:18 PM
The proportion of females with an extra cone type seems to be pretty low...maybe no more than one or two percent.

Interesting, Higgins. Got a reference? I've had the extra cone type report come my way from a couple of decent sources -- they never mentioned the low proportion, but it's significant in how we interpret meaning.

One or two percent is quite a high proportion genetically - enough to change a population in a few generations if circumstances were to change. (I wonder what the proportion was 20,000 years ago?)

Ruv Draba
05-08-2008, 11:30 PM
That's the point, though. Is it a view? Or do we merely use the same sign for different concepts?

I don't have enough expertise to answer that. I could just be a numpty who's misusing a word for reasons he doesn't fully comprehend. :D


How are we capable of telling we made a mistake? Why do we defend some usage, but are embarrassed about other usage? That's where we see grammar; it surfaces like the Loch Ness monster - we're never quite sure whether we've really seen it, but we have a word for it.
We're capable of telling that we've made a mistake when we gauge the responses of our listeners. They'll either not get our meaning, or seem uncomfortable in how we've expressed it. In fact (I believe) it's really the only authoritative way of knowing whether we've connected, had a near miss, or are heading in different directions. Experience shows us that grammatically 'correct' communication is not necessarily effective communication.

As to what 'rules' we make up to explain why we did or didn't connect - that's a modelling question. We can produce many alternative models for the cognition of others. Only some of those models will have the structure of grammars.

Here's an odd idea: if we see a grammar as a way to model cultural "belonging" in language, then surely we could consider adapting grammars to model cultural belonging in other kinds of formal social behaviour - such as greeting one another, gift-giving, courtship... But if we did that, would we expect the results to be authoritative (when viewed behaviourally), or merely proximate? And if they're only proximate for other formal behaviours, are we perhaps leaning too hard on the authoritativeness of grammar in language definition too?

Shweta
05-09-2008, 12:49 AM
Dawnstorm, I think our actual disagreement is deeper than the passive. You see grammars as structures. I see them as conventionalized methods of focusing interlocutors' attention and getting them to build up thoughts. Yours is the traditional view. I think my view has a sounder empirical basis. However, I haven't read many of the psych studies on the other side; the ones I did were just too methodologically unsound.

By my view, necessarily, if you're focusing your reader's attention on experience (for example "Her hair was grabbed") rather than action ("he grabbed her hair"), their construal of the scene is going to be hugely different. Even if the scene itself is the same. It's the difference between imagining hair in your fist and imagining your hair being pulled as it's grabbed.

By this view, the passive construction brings with it a risk of over-focus on the object that's not in motion. Same as overuse of the active risks focusing on the actions to the exclusion of the experience.

I realize you disagree. However... well, I'm happy to have dueling empirical data in another thread, if you like. I haven't been convinced by yours so far.


Here's an odd idea: if we see a grammar as a way to model cultural "belonging" in language, then surely we could consider adapting grammars to model cultural belonging in other kinds of formal social behaviour - such as greeting one another, gift-giving, courtship...
I would guess this is pretty close to what we're doing. We are, after all, primates :)


But if we did that, would we expect the results to be authoritative (when viewed behaviourally), or merely proximate?
If I understand your question, I think it would depend on the social standing of the person concerned. As they move from child to adult to elder their understanding gets closer to authoritative.

Dawnstorm
05-09-2008, 10:50 AM
I don't have enough expertise to answer that. I could just be a numpty who's misusing a word for reasons he doesn't fully comprehend. :D

And so could I.

It's never easy to spot these things when you're caught up in the debates; it's always easier when you're standing on the sidelines with little interest in either position.

I suppose lots of disagreement in academia is grant-rivalry induced misunderstandings based on different terminology (which needs to be "sold").


We're capable of telling that we've made a mistake when we gauge the responses of our listeners. They'll either not get our meaning, or seem uncomfortable in how we've expressed it. In fact (I believe) it's really the only authoritative way of knowing whether we've connected, had a near miss, or are heading in different directions.

You're a step ahead of me, here. Sometimes we correct ourselves before people even have a chance to react. Sometimes we're the only ones to notice our own mistakes.


Experience shows us that grammatically 'correct' communication is not necessarily effective communication.

I do think self-corrected mistakes (say when you're alone in your room writing a letter) point towards a plan (or a model as you aptly call it) we're using. Something internal that's possible to deviate from. What would we call this, if not grammar?


As to what 'rules' we make up to explain why we did or didn't connect - that's a modelling question. We can produce many alternative models for the cognition of others. Only some of those models will have the structure of grammars.

Interesting. I'd argue we have a practical grammar and a discursive grammar, and the two don't necessarily match. They're related, though, since they inhabit the same brain. ;)

I have a feeling you're only calling the "discursive grammar" "grammar".


Here's an odd idea: if we see a grammar as a way to model cultural "belonging" in language, then surely we could consider adapting grammars to model cultural belonging in other kinds of formal social behaviour - such as greeting one another, gift-giving, courtship...

...lolcats... :tongue



But if we did that, would we expect the results to be authoritative (when viewed behaviourally), or merely proximate? And if they're only proximate for other formal behaviours, are we perhaps leaning too hard on the authoritativeness of grammar in language definition too?

My hypothesis is that authority only directly impacts "discursive grammar", but that "discursive grammar" in turn impacts "practical grammar". They can remain separate, too, as in the case of the self-professed "grammar nazi hypocrite" I think I linked to somewhere above.


I realize you disagree. However... well, I'm happy to have dueling empirical data in another thread, if you like. I haven't been convinced by yours so far.

Um, actually, I don't disagree. I framed things differently. Again. Yes, we've been through that, too. Your mentioning "experience" in relation to "action" made me realise my mistake. (I've still got a couple of questions, but not here, not now.)

(Empirical data duels are interesting; but I'm a bit out of the roster, there, so I'm mostly talking theory, and old data - vaguely remembered.)


You see grammars as structures. I see them as conventionalized methods of focusing interlocutors' attention and getting them to build up thoughts. Yours is the traditional view. I think my view has a sounder empirical basis.

Yup, I'm a bit of a structural thinker, but I don't see how one precludes the other, here. I'm not a formalist in the sense that I think the structures we detect in grammars are necessarily what's going on in all language users' minds. I don't think grammar is innate.

I can actually live very well with "conventionalised methods of focusing interlocutors' attention," though I'm not knowledgeable enough to comment on the "getting them to build up thoughts" part. Since I come from sociology, the complex part - for me - is "conventionalised" while the "focusing" part is pretty much a crude model (mostly derived from phenomenology). So far, I found that your expertise complements my take pretty nicely. And yet when we're talking about rules...


By this view, the passive construction brings with it a risk of over-focus on the object that's not in motion. Same as overuse of the active risks focusing on the actions to the exclusion of the experience.

See, if you phrase it like that, my misgivings vanish. I might still disagree with one detail or another, but those are quibbles. [For example, I find the above more plausible for syntagmatic transformations (active <-> passive) than for paradigmatic substitution ("to receive/be given a present"). That's not to say there isn't an effect; but it pretty much gets over-ridden by referential semantics, I think. (I realise I'm talking structure again; but wouldn't the fact that the subject stays the same, and that the real-world referent for both "be given" and "receive" is the same process, overlay and thus weaken the "passive effect"? I'm honestly asking, since this isn't my area of expertise at all.)]

Cautioning against the perils of the passive voice while neglecting the perils of the active voice seems to me a bit one-sided, inadvertently increasing the risk of future awkwardness on account of needlessly avoided passive voice. And this isn't unique to the passive voice; the same goes for noun-verb priority (over adjectives and adverbs - while totally ignoring prepositions, which can express as much movement as a verb) and so on.

Ruv Draba
05-09-2008, 01:16 PM
I do think self-corrected mistakes (say when you're alone in your room writing a letter) point towards a plan (or a model as you aptly call it) we're using. Something internal that's possible to deviate from. What would we call this, if not grammar?

Because of how I was educated, I think of a grammar as a particular kind of substitutive structure, expressed as a series of 'production' rules like this.
Sentence ::= Noun Verb Noun
Sentence ::= Noun Verb Noun Adverb
Sentence ::= Sentence Conjunction Sentence
This recursive decomposition of 'goals' into 'strings' of 'tokens' is used all the time in the definition of computer languages, and captures pretty much how 'real' (human) language grammar was taught to me too.
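For concreteness, that recursive decomposition can be sketched as a tiny generator. This is only an illustrative toy (the grammar table, word lists and function names are invented for the example, and the recursive Sentence Conjunction Sentence rule is left out so generation always terminates); it's not a claim about how real parsers or human grammars work:

```python
import random

# Toy grammar in the substitutive style above: each goal rewrites
# into a string of tokens, which rewrite in turn until only
# terminals remain.
GRAMMAR = {
    "Sentence": [
        ["Noun", "Verb", "Noun"],
        ["Noun", "Verb", "Noun", "Adverb"],
    ],
    "Noun": [["the cat"], ["the dog"]],
    "Verb": [["chased"], ["saw"]],
    "Adverb": [["quickly"]],
}

def expand(symbol, rng):
    """Recursively rewrite a goal symbol into terminal tokens."""
    if symbol not in GRAMMAR:  # terminal token: emit as-is
        return [symbol]
    production = rng.choice(GRAMMAR[symbol])
    tokens = []
    for sym in production:
        tokens.extend(expand(sym, rng))
    return tokens

sentence = " ".join(expand("Sentence", random.Random(0)))
print(sentence)
```

Every utterance it produces is "grammatical" by construction, which is exactly the point: the rules define the space of valid strings, nothing more.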

But I know that you can define languages using other modelling structures. Those structures may also represent the space of valid utterances but not look like grammars. Here's a really simple partial model for a language that doesn't use tokens to represent parts of speech.

O(start, 'The', 100)
O(start, 'A', 20)
O('The', 'cat',4)
O('A', 'cat',2)
O('cat', 'sat', 2)
O('cat', 'slept', 1)
....

This is just a record of pairs of words I've seen together and the number of occurrences in which I've seen them. With a bit of cleverness I can use these pairs to randomly generate sentences with a frequency similar to the frequency with which I've seen whole sentences.

This 'model' can produce 'incorrect' English utterances which (theoretically) a grammar cannot. But I can enhance it to ever greater levels of accuracy by storing words in triples, quartets etc... I can also enhance it by adding records of 'likely' word combinations that are known to not be sentences.

What's interesting about this 'model' is that the amount of 'meaning' it produces is much higher than a grammar alone can produce.

(I don't believe that this is a model of human linguistic cognition by the way, but it's an alternative to using grammars.)
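The pair-frequency idea above can likewise be sketched in a few lines. The counts mirror the toy O(previous, next, count) records in the post; the "<end>" records are my own added assumption, since the partial table needs some way for a sentence to stop:

```python
import random
from collections import defaultdict

# Word-pair records in the O(previous, next, count) style above.
PAIRS = [
    ("<start>", "The", 100), ("<start>", "A", 20),
    ("The", "cat", 4), ("A", "cat", 2),
    ("cat", "sat", 2), ("cat", "slept", 1),
    ("sat", "<end>", 1), ("slept", "<end>", 1),  # assumed terminators
]

# Index the successors of each word, keeping the observed counts.
successors = defaultdict(list)
for prev, nxt, count in PAIRS:
    successors[prev].append((nxt, count))

def generate(rng):
    """Walk the pair table from <start>, choosing each next word with
    probability proportional to its recorded count."""
    word, out = "<start>", []
    while True:
        options, weights = zip(*successors[word])
        word = rng.choices(options, weights=weights, k=1)[0]
        if word == "<end>":
            return " ".join(out)
        out.append(word)

print(generate(random.Random(0)))
```

Storing triples or quartets instead of pairs, as suggested above, is the same mechanism with a longer key (the last two or three words instead of one), which is why the model's accuracy can be cranked up without it ever containing anything that looks like a grammar rule.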

StephanieFox
05-09-2008, 07:48 PM
The proportion of females with an extra cone type seems to be pretty low...maybe no more than one or two percent.

Supposedly females can also see better at low and high ends of the visual spectrum...but this all seems pretty thin in terms of actual evidence and evolutionary significance.



This 'extra cone' thing is just an excuse for men to dress badly. "What do you mean they don't go together?" Feh! My husband is a designer and has no problem seeing the difference between teal, turquoise, lime or any other color. Neither do any of his designer friends, male or female. He can even tell you during what decade the color was popular.

(What color is the little smiley face above?)

Shweta
05-10-2008, 01:41 AM
*runs in*
Whee, agreement!
*runs back out again*

Smack me down if I'm back in here before Thursday, seriously. I have a paper to write.

Ruv Draba
05-10-2008, 05:30 AM
What color is the little smiley face above?

BLUE!

Ruv Draba
05-10-2008, 05:31 AM
Er... GREEN!

Ruv Draba
05-10-2008, 05:32 AM
Er... TEALQUOISEMARINE!

Dawnstorm
05-10-2008, 01:56 PM
(I don't believe that this is a model of human linguistic cognition by the way, but it's an alternative to using grammars.)

Actually, what you've described is pretty compatible with the way in which many structuralist grammarians work. ;) Structural grammar doesn't concern itself much with cognition. The thing is: I think all those theories about grammatical form we have are as much data as they are theorising. More interesting than black box models. ;)