25 September 2016

Don't get any big ideas: the dominance of names in social sciences

It has bugged me for quite some time that in language, in education and in the social sciences in general, there are certain names whose big ideas get repeated ad nauseam even once others have moved the state of the art onwards.

For example, when textbooks and courses discuss formal grammars, they typically focus on Noam Chomsky's generative grammars. Chomsky's model of grammar divorced structure from meaning, which he demonstrated with the nonsense sentence "Colorless green ideas sleep furiously." Since then, however, many commentators have observed that a sense of grammar as independent of semantics is only really valid for people with training in grammar, and is far from culturally universal.

Take for example the early-nineties computer game "First Samurai". The game was developed in the UK, and the developers asked a Japanese translator what the game's title would be in Japanese script, as they wanted it for the cover. The response was that it was impossible, because you can't rank samurai. Reportedly, they then asked how you would say the "first samurai" you see in the morning -- still impossible. In the end, they had to ask for another phrase to be translated, take the character for "first" from that, and place it next to the characters for "samurai".

There are dozens of similar anecdotes attested worldwide. In her 1978 book Children's Minds, Margaret Donaldson cites a report of an adventurer who asked a Native American to translate "the white man shot six bears today."
"'How can I do that?' said the Indian. 'No white man could shoot six bears in one day.'"
Donaldson roundly rejects the idea that this sense of grammaticality is anything more than a result of our education.

The other major blow to Chomsky's model was Lucien Tesnière's valency/dependency grammars. Whereas Chomsky built his trees from parts of speech only, Tesnière identified that certain words -- above all verbs -- require particular accompanying elements (their valency) and can optionally be qualified by additional ones.

Chomsky split his basic sentence trees into subject and predicate, as was the norm at the time. Tesnière instead argued for "verb centrality", putting the verb at the top of the tree. This meant that in Tesnière's model the verb had a direct link to the grammatical subject, whereas in Chomsky's it did not. It is trivially obvious that this is a superior model, because with no direct link, Chomsky's model essentially claims that "*He say I does" would pass a grammaticality test. Now I'm sure Chomsky at some point presented a round-about argument for why that's not acceptable, but quite simply Tesnière's model was better. Tesnière's model is widely accepted, and it's key to a lot of computer-based language techniques.
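To make the contrast concrete, here's a minimal sketch in plain Python (my own illustration -- the node structure, the crude agreement tags and the agrees() helper are all invented, not taken from any parsing framework). With the verb at the root of its clause, checking "*He say I does" is just a matter of comparing each verb with its own subject dependent, one edge away; in a subject/predicate split the same information has to be recovered from two separate branches of the tree.

from dataclasses import dataclass, field

@dataclass
class DepNode:
    """One word in a Tesnière-style dependency tree (toy example)."""
    word: str
    agr: str = ""                                  # crude agreement tag, e.g. "3sg"
    deps: dict = field(default_factory=dict)       # relation label -> dependent node

def agrees(verb: DepNode) -> bool:
    """Agreement is a purely local check between a verb and its own subject edge."""
    subj = verb.deps.get("subj")
    return subj is None or subj.agr == verb.agr

# "He says I do": each finite verb heads its clause, with its subject hanging directly off it.
good = DepNode("says", "3sg", deps={"subj": DepNode("he", "3sg")})
good.deps["comp"] = DepNode("do", "1sg", deps={"subj": DepNode("I", "1sg")})
print(agrees(good), agrees(good.deps["comp"]))     # True True

# "*He say I does": both mismatches are visible on single subject edges.
bad = DepNode("say", "pl", deps={"subj": DepNode("he", "3sg")})
bad.deps["comp"] = DepNode("does", "3sg", deps={"subj": DepNode("I", "1sg")})
print(agrees(bad), agrees(bad.deps["comp"]))       # False False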

And yet when I studied language, lots of space was given to Chomsky, and I have no recollection of seeing the name Tesnière or any talk of valency or dependency grammars. When I briefly studied formal grammar in computing, lots of time was given to Chomsky, and dependency grammars were mentioned only in passing. To me, verb centrality was an obvious notion, and every time Chomskyan grammar was presented to me, I wanted to put the verb at the top. It wasn't until about four years ago that I picked up a book on Computational Linguistics/Natural Language Processing and was introduced to Tesnière's theories.

This is a very dangerous state of affairs -- students are being taught outdated, disproven theories instead of the current state of the art. In education, one of the best examples would be Piaget, whose theories have been proven wrong time and again, yet remain one of the main focuses of most introductions to childhood development.

Why? The typical answer is that to understand the current system, we have to understand the underlying theories it's built on. This, I'm afraid, is not a good reason. Or rather, it is true that the underlying theories of modern grammar are derived from Chomsky, and that the underlying theories of modern childhood development are derived from Piaget, but by teaching the original theories we end up holding back development in the field: most courses spend so long talking about the outdated theories that they don't leave time to fully discuss the current ones, and students leave the courses with a working model of the wrong theories. We therefore spend a lot of time debating the same things as the generation before us.

Certainly, we don't do this in the physical sciences. No-one would suggest that in order to learn about the big bang theory we first have to learn about the theories that predated it. Such theories are clearly of some interest, but are best restricted to specific modules on the history of science.

The problem in the social sciences seems to be a reluctance to rewrite part of a major theory based on subsequent observations and refinement. It appears that to be genuinely influential in social sciences, you cannot simply do incremental improvement, and instead must write a new theory practically from the ground up. In doing so, you are guaranteed posterity, because your grand theory will continue to be published, cited, repeated and taught as is long after all the elements it is built of are individually discredited -- the field will not allow anyone else to revise it for you.

Take Bloom's Taxonomy, for instance. Even when it was first devised it was a bit of a kludge. The most common form seen today is still the simplest triangle form, and all versions and derivatives still hold the same ordering of "remembering" before "understanding" -- i.e. they preach rote learning in line with the behaviorist thinking of Bloom's day, even though no modern school of thought actively professes a belief in rote learning as a useful mechanism.

The book Second Language Learning Theories by Mitchell, Myles and Marsden discusses the proliferation of theories in the introduction, and says "We believe that our understanding advances best when theories are freely debated and challenged among a community of scholars." I would certainly not dispute that, but I think we waste an awful lot of time when we discuss disproven theories and treat them as though they have equal merit to theories not yet disproven. We also, as I said earlier, do a lot of damage to the next generation of academics by preparing them to discuss the disproven theories rather than the current ones.

Worse, though, is the effect on non-academics. Teachers overexposed to outdated theories and unfamiliar with current ones are unable to take advantage of advances in the field and translate them into classroom practice.

Does cross-language transfer exist?

[I actually wrote this about 6 months ago and I'm not sure why I didn't publish it. I've read it over and it looks finished to me...]

It's always seemed weird to me that some people claim that L2 errors are not related to the form of the learner's L1. Surely the link is just common sense? For instance, a speaker of Chinese, or Polish, or any other language without articles is going to have a problem learning when to use "a", when to use "the" and when not to use an article at all in English, right?

But of course, common sense is only common sense until it's proven wrong. It was once common sense that the sun went round the Earth and that there was such a thing as "races" of people -- scientific study has since shown us otherwise.

So language must be open to scientific study, and teachers must be open to the results of study. And there are plenty of studies that purport to disprove the idea of cross-linguistic interference.

For example, I'm currently skimming through bits of the book Understanding Second Language Acquisition by Lourdes Ortega. There's a chapter on "cross-linguistic influences" (they reject the term "interference", because they reckon it puts too much blame on the L1), and they open by looking at a couple of studies showing that the similarities and differences between languages have less influence than might be expected. The first of these deals with the placement of negatives by learners of Swedish, who on the whole tend to incorrectly place negatives before the verb, even when their own language, like Swedish, places negatives after the verb. [Hyltenstam, K., 1977. Implicational patterns in interlanguage syntax variation. Language Learning, 27(2), pp.383-410.]

What questions does this raise? First, I would say that negation is an undeniable universal of language, so here we're talking about a linguistic concept that is already known to the learner, and the only variable under examination is word placement -- syntax. There is no examination of usage -- for example, what happens with complex clauses? In English we most commonly say "I don't think so", while some languages are more likely to say the equivalent of "I think not", which now sounds quite old-fashioned in English. It's far more likely that a native English speaker would say "I don't think he's coming" than "I think he's not coming", but "creo que no viene" is perfectly normal in Spanish. It's readily apparent that this sort of transference occurs at the phrasal level, and this is the sort of thing that is rarely taught and simply left to the student to work out. It is also readily observable that learners make errors handling polysemy (multiple meanings of a single word) -- again looking at Spanish, "esperar" is to wait, to hope and to expect, and it doesn't take long in the company of Spanish learners of English to hear someone pick the wrong one of the three.


I haven't had a chance to read the original paper yet (I'm not familiar with my new university's online search and I'm feeling too impatient today to start hunting) so for the moment I'm just thinking about what I'll be looking for when I get round to reading it (which might not be for a while!). First up, is there anything that goes deeper than simple word placement? Secondly, as a tangent to the point about cross-language transfer, and continuing the thread on slots, does something change when it's a multi-part structure, with the two sections separated by other language content?

Besides, is negation even a fair example? My gut reaction is that in every language where I've come across a sort of "Tarzan speak", the stereotypical form is pre-verbal negation ("me no hurt pretty pretty" and the like).

The second paper cited discusses object pronoun placement in French-speaking learners of English and English-speaking learners of French. [Zobl, H., 1980. The formal and developmental selectivity of L1 influence on L2 acquisition. Language Learning, 30(1), pp.43-57.]

Again, I've not read the paper yet, but the summary in the book baffles me a bit, because it seems somewhat obvious. We have the example of the pair "je les vois"/"I see them", and it is observed that English speakers often erroneously say "*je vois les", but French speakers don't tend to say "*I them see". This is not a symmetrical error, but then, this is not a symmetrical pattern. As I see it, the difficulty for the English speaker isn't simply one of the absolute position of the pronoun, but the fact that pronouns appear in a different place from explicit noun phrases as object -- "je les vois" vs "je vois des hommes". The English speaker learning French has the difficulty that he sees pronouns and explicit noun phrases as subcategories of a single thing that share a position, and in order to learn French accurately must learn a fundamentally new distinction between weak pronouns and other noun phrases. The French learner of English, conversely, doesn't necessarily need to learn the English speaker's categorisation: he can maintain the French distinction and its notion of two slots, and then simply focus on the syntax, learning two different positions that happen to coincide. It feels to me that focusing on the superficial differences here has failed to account for the real underlying difference.

Another question this leaves in my head is about my own observations from dealing with Romance speakers. I recall hearing a lot of them dropping weak pronouns altogether, and my first reaction is that the article focuses on the lack of examples of "*I them see" as proof that there's not a problem, but doesn't comment on the presence or absence of examples of pronounless renderings (i.e. "I see", with no them). My observation from my own pupils was that several of them had learned not to place the pronoun before the verb, but if the verb in Spanish would be the last word of a sentence, they'd just stop without including the pronoun at all. I'm now left second-guessing my own recollection here, though, as recently I only recall hearing this error with Spanish people saying "I like" for "me gusta", and of course there's no explicit "it" in the Spanish, so that's a different issue (but still cross-linguistic).

This is an issue that intrigues me, and I hope to revisit it during the year when I'm working on cross-linguistic issues as part of my masters. For now, though, I just wanted to get my thoughts jotted down for future reference.

24 September 2016

The master approaches...

So I've just embarked on a new phase in my career, beginning a masters degree programme in TESOL. After Christmas I get the opportunity to specialise, and the plan at the moment is to specialise in computer-assisted language learning, which really is my kind of thing.

I figure it's time to dust off this old blog and start using it as a scratchpad to reflect on the sort of issues that I'm dealing with on the course, and to comment on the materials I come across in my reading.

It's also an opportunity for me to break the habit of a lifetime and start using proper citations and referencing on the blog, something which I hope to stick to in the future so that things I post here are better informed and therefore more useful to others.

27 February 2016

Edinburgh's trouble with multilingual education

The Scottish Government has long had an aspiration to make multilingual education more widely available, and recently formalised this by adopting the European 1+2 model. 1+2 is the idea that a child will be educated in their first language and, during their primary schooling, will be taught at least two additional languages: the first introduced from the first year of schooling, the second no later than the fifth year of primary school. (Earlier draft versions of the regulations said the first additional language should be introduced no later than P3, but this has since changed.)

There are several steering principles underpinning the 1+2 approach. With regards to the first additional language ("L2") the government themselves state:
" The Working Group expects young people to continue with some form of language study in the L2 language up to the end of the broad general education, i.e to the end of S3. " [Language Learning in Scotland: A 1+2 Approach]
Children are therefore expected to be given the opportunity for continuity of access to their L2 until around the age of 14 or 15, and it is assumed that there will be the option to continue beyond that age, subject to the usual logistical constraints around class sizes and the viability of running exam-level classes for a small number of pupils.

Another of the principles is that language should not merely be taught as a subject, but should be embedded into classroom routine. There is even the hope that in the future it would be possible to offer subjects (or units within subjects) delivered through foreign languages. What could be more natural than listening to accounts from French or German WWII soldiers and civilians in their own language as part of the history curriculum, for example? It's a laudable goal, and even if we're not likely to achieve it in the foreseeable future, it's certainly something to aspire to.

The government's deadline for the implementation of this policy is 2020, and several local authorities are pushing to get themselves ready ahead of this date. Last year, Edinburgh City Council announced their intention to have the scheme implemented by 2017.

This too is laudable, but recent news has thrown the city council's commitment to this into doubt.

Gaelic-medium education (GME) has been available in Edinburgh since 1988, when a Gaelic-medium unit was opened within a mainstream school, Tollcross Primary. Since then, uptake of GME in the city has increased year on year. Tollcross Primary is a feeder school for the city-centre high school James Gillespie's, so secondary GME was implemented there. In 2013, primary GME was moved to a dedicated school on Bonnington Road in the north of the city, outside James Gillespie's normal catchment area, but JG's retained its place as the city's secondary GME facility and the new school was given official status as a "feeder primary" for it.

This year, however, James Gillespie's have found themselves with more applications for new admissions than they have capacity to accept, and the council have announced that the standard rules for oversubscription apply: priority to children within the geographical catchment area and those with older siblings already attending the school. As the intake for the Gaelic primary is drawn from the entire city (and beyond), it is most likely that the pupils who lose out will be those currently going through GME. There are 24 pupils in this year's primary 7 class, and current projections see 9 of them being refused a place at JG's.

The council's current proposed solution is to offer these children the option of attending Tynecastle High School or the school for their local catchment area, but neither option fulfils the aspirations set out for 1+2: local schools will offer these children no continuity in their L2 (Gaelic), and Tynecastle is little better, as it currently only offers Gaelic for learners, which is not appropriate for children from a GME background. Indeed, children who have undergone three or more years of GME are not allowed to sit the Gaelic learners' exams at National 5 or above at all.

Going by the council's current projections, then, we're likely to see 15 GME kids in JG's first-year intake and at most 9 in Tynecastle's. With class sizes pegged at 30, that means we've taken one class's worth of pupils and turned it into two classes, which certainly does nothing to reduce the capacity problems at either school. And when we look at what that means for course choice in third and fourth year, when some of the pupils may be dropping Gaelic, what are the chances that either school will see a continuing Gaelic class for the GME pupils as viable?

This then leads on to a wider issue with GME provision at JG's. Aside from Gaelic itself, the school currently teaches only Art, RE, PE and Modern Studies through Gaelic, and none of these at certificate level, although National 5 Modern Studies will be offered next year (see section 3.78). It seems likely that these classes will not operate in Gaelic for next year's first year, as that would mean half-empty classrooms in a school that had already turned children away over capacity constraints.

Part of Edinburgh Council's justification for this decision is that:
"The level of current Gaelic provision at James Gillespie’s High School is not significant and could be relatively easily replicated, at least in part. There continue to be significant issues nationally with the recruitment of Gaelic speaking staff which limit what could actually be delivered at a secondary level, regardless of where it was provided. " (section 3.75, same document as above)

Both of these statements are true, but this is something of a question of cause and effect.

First of all, the low level of Gaelic provision is down to a lack of critical mass, and dispersing the GME primary cohort across two or more high schools will certainly not resolve this. Secondly, part of the national problem with the availability of Gaelic-speaking staff is that they typically spend the majority of their time teaching in English, and again this stems from a lack of critical mass within the pupil cohort. If the council's actions lead to Gaelic-speaking teachers spending even less time teaching in Gaelic, then the council's justification is little more than a self-fulfilling prophecy, one that further squanders what is already a limited resource and renders their argument somewhat self-defeating.

The lack of availability of trained GME teachers is something that is being addressed at the national level, but there's something of a chicken-and-egg situation: with the low number of classes being taught in GME at present, it is very difficult for a teaching student to gain placement experience in a Gaelic-medium setting. Depending on the subject you are training to teach and the school you are placed in, Gaelic-medium classes may be limited to BGE (the first three years) or even only the first year or two. Some subjects may not be available at all. This makes it very difficult for a new teacher to build up the confidence required in delivering through Gaelic a subject that they themselves will have learned through English. Any action at a local level that risks decreasing the availability of GME has knock-on effects at a national level that hamper our ability to address the issue.

Not just a problem for Gaelic

Many people will shrug their shoulders and say "it's only Gaelic", but they're missing the point: at the moment Gaelic is the only language that gives us a working model of language learning throughout schooling, and much of the Scottish Government's policy on language learning leans on the experience of GME.

Four years away from the government's deadline on 1+2 and one year from its own self-imposed deadline, the council is already making decisions that take it further away from its goal. This does not bode well for children going through primary education in other languages, and Edinburgh council's schools will be offering a fairly broad selection (among them French, Spanish, Mandarin, Polish, Farsi and Gaelic). What happens if a child who has learned Spanish since primary 1 finds themselves allocated to Forrester High School (French and German)?

This is a logistical matter that will become a serious issue for parents across the city in the next few years, and the current GME cohort is an opportunity for the council to pilot a solution on a small scale and work out a strategy before it's too late.

If the council can't handle the transition between primary and secondary correctly, it will turn children off languages: kids placed in a language class that is too easy for them will lose interest in languages, and kids placed in classes above their level will lose confidence in their own ability to learn.

The goal of 1+2 isn't just to give kids "the right language", but to give them the right attitude to language, so that they can go on to be successful language learners and pick up the particular language they need later in life when the need arises. Getting the primary-secondary transition right is absolutely vital in developing this attitude, and if the council can't get this right for 24 pupils next year, how can it hope to do so for the hundreds of pupils moving into its high schools in 2017 and every year after?

15 February 2016

Implicit and explicit, meaningful and meaningless

Last week I was across in Edinburgh catching up with friends. I arrived early, so went into a bookshop to kill time... and came out with two chunky academic texts. I probably would have escaped without buying anything if one particular book title hadn't caught my eye: Implicit and Explicit Knowledge in Second Language Learning, Testing and Teaching.

The main thrust of the book was looking into the ongoing debate as to whether implicit teaching styles lead exclusively to implicit knowledge and explicit styles to explicit knowledge only, or whether explicit teaching could lead to better implicit knowledge. It's an important area of discussion because at present the mainstream theory of language teaching holds that only implicit learning can ever lead to implicit understanding and production, and that explicit teaching only ever makes people consciously aware of rules and able to apply them mechanically and consciously.

And yet there are very few teachers who don't include some explicit instruction in their lessons, whether that's word lists or conjugation tables. (Even Assimil, who sell themselves on the principle of natural assimilation, devote more text to grammatical explanations than they do to the dialogues and their transcriptions and translations.)

I haven't read the whole book yet, and (unsurprisingly) what I've seen so far is pretty inconclusive. However, it does lean towards the view that explicit teaching does indeed help in language mastery. It also discounts a lot of the past counter-evidence to this on the grounds that the studies' models of explicit teaching are simply bad examples, using overly mechanical, rote methods, and not being as "meaningful" as the implicit methods under examination.

It's this word "meaningful" that I think is the crux of the problems faced in language learning – language is nothing without meaning.

In the language classroom, items that seem to be inherently rich in meaning can paradoxically be rendered devoid of meaning by context.

Consider:
My cousins buy trousers.

In an objective sense, it carries a lot of meaning, and there is no truly redundant information in the sentence – every word, every morpheme, brings something not explicitly present elsewhere. (But even then, it has no real personal meaning to me, as I can't imagine myself ever saying it. This is a side issue for the moment, though.)

But what happens when we put that sentence into a classroom exercise?

For an extreme example, let's take the behaviorist idea of substitution drills. In "New Key" style teaching, a substitution drill would be target language only, and one element of the sentence would be substituted with something else in the target language. So our theoretical exercise might go:
Teacher: My aunt buys hats
Learner: My aunt buys hats
Teacher: My mother
Learner: My mother buys hats
Teacher: Trousers
Learner: My mother buys trousers
Teacher: My parents
Learner: My parents buy trousers
Teacher: My cousins
Learner: My cousins buy trousers

At the point of utterance, the learner does not have to pay any attention whatsoever to the meaning of anything in the sentence beyond the plural marking of my cousins and the s-free verb form of buy, which is made even easier by the fact that this plural example follows an earlier plural example. Thus the student has no immediate motivation to attend to meaning, and it is a struggle to do so.
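To underline quite how little the drill demands, here is a toy Python sketch (entirely my own illustration -- the slot names and the little plural lookup table are invented) that produces every "correct" learner response using nothing but slot replacement and an agreement lookup, with no representation of meaning anywhere:

PLURAL_SUBJECTS = {"my cousins", "my parents"}     # invented lookup table

def respond(state, prompt):
    """Replace whichever slot the teacher's prompt targets, then fix the verb form."""
    if prompt.lower().startswith("my "):
        state["subject"] = prompt.lower()          # a new subject was supplied
    else:
        state["object"] = prompt.lower()           # otherwise it's a new object
    state["verb"] = "buy" if state["subject"] in PLURAL_SUBJECTS else "buys"
    return state

state = {"subject": "my aunt", "verb": "buys", "object": "hats"}
for prompt in ["My mother", "Trousers", "My parents", "My cousins"]:
    state = respond(state, prompt)
    print(state["subject"].capitalize(), state["verb"], state["object"])
# -> My mother buys hats / My mother buys trousers / My parents buy trousers / My cousins buy trousers

If a dozen lines of string bookkeeping can pass the drill perfectly, the learner clearly doesn't need to engage with meaning to do the same.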

Substitution drilling is, as I said, an extreme example, but I do feel it is useful in establishing a principle that affects a great deal of learning, even where the effects are not so obvious.

Consider, for example, the fairly established and mainstream idea of focusing on a particular grammar point in some particular lesson, or section thereof. If I am set a dozen questions all of which involve conjugating regular Spanish -er verbs into the present simple third person singular (or whatever), then I do not need to attend to the meaning of the present simple third person singular, just the form -e.
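Again, a toy sketch (my own, not drawn from any textbook) makes the point: the entire exercise set can be answered with a one-line string rule, with no attention whatsoever to what the present simple third person singular actually means.

def third_person_singular(infinitive):
    """Regular Spanish -er verb, present simple, third person singular: drop -er, add -e."""
    assert infinitive.endswith("er")
    return infinitive[:-2] + "e"

for verb in ["comer", "beber", "aprender", "correr"]:
    print("él/ella", third_person_singular(verb))
# -> él/ella come, él/ella bebe, él/ella aprende, él/ella corre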

To me, attending to meaning is the single most important matter when it comes to language learning, and yet it is rarely explicitly discussed. Instead, it is typically wrapped into a specific embodiment of the principle. Krashen's comprehensible input hypothesis suggests we learn language by understanding input; the communicative approach says we learn when we use language to solve problems; Total Physical Response says language has to be tied to physical action. All of these are attempts to address the meaningfulness of language, but each is a narrow, specialised form of attention to meaning. CI and TPR deny us access to the colourfulness of abstract language with its subtle, personal meanings, and the CA doesn't do much better in that regard – while modal language may be taught in a communicative classroom, the nature of the task implies a rather pragmatic, utilitarian meaning, so there isn't really any meaningful difference between blunt orders ("give me it"), plain requests ("can I have...?") or requests indirected with a conditional mood ("could I have...?"), and statements of desire, whether declarative ("I want...") or indirected further in the conditional ("I would like...").

Other teachers take the idea of "personalisation" and raise it above all other forms of meaningfulness, insisting that students only learn by inventing model sentences that are true for them. But isn't, for example, "I want it, but I don't have it" true for everyone? Does the brain not immediately personalise a sentence like that? (When I came across a similar sentence in the Michel Thomas Spanish course, I was cast back to throwing coins into a wishing well as a child.)

Perhaps the reason few writers wish to discuss attention to meaning is that it throws up a lot of questions that often fundamentally challenge their methodologies. For example, comprehensible input (and any similar learn-by-absorption philosophy) is confounded by redundancy in language – there is no need to attend to the meaning of every morpheme when the same information is encoded twice in the sentence. Perhaps the clearest example of this is the present tense -s suffix for English verbs (third person singular, i.e. he/she/it). It is readily apparent that a lot of learners do not pick this up, and there are a great many foreigners who spend years in English-speaking countries, hearing thousands of hours of input with the correct form, but who never pick it up. There is no need for the learner to attend to the meaning of the -s, because they already know from the subject of the sentence that it's the third person singular being discussed. When they speak, they are understood, because even though it sounds incorrect to a native speaker, there's practically no risk of being misunderstood. In the communicative approach, such an error is not a barrier to completing the intended task (particularly seeing as there's a good chance your conversation partner will make the same mistake, given that everyone in your class is a learner), so there is no requirement to attend to it.

Language has a natural tendency to redundancy, in order to make our utterances easier to understand; we are naturally disinclined to attend to every element of the sentence. Therefore any attention to the meaning of all the individual components of an utterance will be a higher-order process, a conscious or semi-conscious overriding of our lower instincts. Surely that makes it an explicit process? And if it is an explicit process, surely it is better for it to be directed by an expert (the teacher) than carried out on an ad hoc basis by a non-expert (the learner)?

02 January 2016

Why I learn languages, and why I'm not learning any languages

I've probably said this before, but the reason I like learning languages is the look on people's faces when I speak their language to them.

What I didn't realise was how far back this went.

My dad was a teacher at my high school, and I knew the main janitor before I knew most of the teachers. Lucien, the janny, was from France, and by a couple of years into high school I would say "Bonjour" to him - I couldn't say much else, but it was enough.

When my dad started talking about him this Christmas, the main memory in my head was just a smile - the delight of being spoken to in his own language, even for a moment. It was the same sensation that I've seen so many times since, and I started to wonder why it had taken me so long to start learning languages properly, and then I realised that I've stopped learning again.

I remember that all through my late teens and early twenties, I was keen on the idea of learning languages, and I picked up a couple of books here and there but never got anywhere. I only started to get the proper motivation back when I started speaking broken high-school Italian to a young woman serving in a local sandwich shop. Again I tried picking up the old books, and again I put them down.

As it turns out, restarting a half-forgotten language was really hard - if you attempt to read notes on things you already sort of know, you switch off. So that's when I switched to new languages: Spanish and then Scottish Gaelic. After that, returning to tidy up my French and Italian was a lot easier.

But right now, I'm not really learning, or relearning, or even consolidating anything. Why not? Maybe it's because I can already give lots of people the satisfaction of hearing their own language. More likely, though, it's just because I'm not meeting enough non-English-speaking people. That would be understandable, I suppose. Last summer I moved to an island off the north-west coast of Scotland, where I'm studying full-time, and there aren't many foreigners in the area at all.

What there is, though, is a lot of native speakers of Gaelic, and I just keep falling back to English.

Am I just being lazy? Or is my brain overworked? Or am I just being antisocial?

Probably a bit of each. I really want to get back on track this year, and start learning something new. To that end, I'm starting to plan my summer holidays now -- an epic cycle journey across part of Europe. I need to take in at least one area which requires a new language, and at the moment, I think Germany fits the bill. I've already got a solid basis to build on, so I just need to build fluency and vocab and see where I can get to.

Even just thinking about it, I can start to feel some of the anticipation building.