I was one of those girls who got grammar. I racked up A’s in English because the rules made sense to me. Diagramming sentences was my favorite part of school. I loved discovering how the different parts of a sentence fit together in an elegant, efficient arrangement. I completely bought the notion that there was such a thing as proper English: That the language had rules and, if you knew them, you were more articulate, more educated, and maybe just plain better than those who didn’t.
Then I studied linguistics. To my shock, I learned that many of the grammar rules I imagined to be permanently etched in the Stone of Correctness were arbitrary, illogical, and historically erroneous.
Take “ain’t.” This much-vilified word had a long, esteemed history as a perfectly acceptable contraction for “am not,” before it was thrown into the pit of unacceptable usages in the 19th century.
Remember being taught that two negatives make a positive? That rule didn’t exist until 1762, when it was invented by Robert Lowth, a bishop of the Church of England. His thinking: since two negatives make a positive in math, it must be true in speech as well. He wrote his maxim into an influential textbook, and it’s been drummed into the heads of schoolchildren ever since. Yes, that’s right: It was made up.
And then there’s that bugaboo of kids everywhere: The difference between who and whom. Children (and most adults) find this distinction difficult precisely because it’s disappearing. This “erosion” didn’t begin with my generation, or the generation before. Or a hundred years ago. Or a thousand. It’s part of a general simplification in what was once a massively complex system of nouns and pronouns. That simplification has been going on for three thousand years.
These were stunning revelations—and ones that filled me with a sense of delight. English, it turned out, wasn’t a pure, crystalline substance that had to be kept free of contaminants. It was a hearty stew with hundreds of ingredients made by dozens of cooks in different kitchens.
Suddenly, I was fascinated with the rich old speech of Appalachia—a dialect that grew right out of Colonial America. With Black English, the source of the most vibrant verbal art forms of the 21st century and possibly the most influential language worldwide. And with the so-called New Englishes that have sprung up in postcolonial Nigeria, Hong Kong, and the Caribbean. I had been taught that these “non-Standard” Englishes were the result of poor education and cultural deprivation. Now, I realized they were part of what linguist David Crystal calls “the kaleidoscopic diversity of dialects and styles which make up ‘the English language.’”
So, decades later, here I am, freshly arrived in India with a group of fifteen university students to explore a form of English spoken nowhere else. We’ll be doing many of the things other travelers to India do—visiting temples, seeing the sights, staying in an ashram. But, unlike most visitors to India, we’ll be paying particular attention to the English language.
I want my students to get this: English doesn’t just belong to the British any longer, or to the U.S., Canada, Australia, New Zealand. It belongs to many places, including India. Indians have taken the speech of their former colonizers and flavored it with new words, new rules, and a singular accent. They’ve played with it, shaped it, and made it their own. They’ve also written in it—producing some of the most vibrant new fiction and poetry coming out today. For nearly a month, we’ll be immersed in this world of Indian English.
My grade school teachers would have been horrified, of course. They didn’t even know Indian English existed, and if they had, they would certainly have labeled it a poor, corrupt form of the language. Ironically, they were the ones who were poorer for not knowing it. You could say they were culturally deprived.