Sunday, September 25, 2016

Bad English – my journey with the language

Published in Peril
From school to my Master's in India, English served as the only language that I could read and write reasonably well – the only language that I could express complex thoughts in; the only language through which I could feel deeply; the language that I could argue in; and the language that formed the basis of my critical faculties. My grasp of my two other primary languages was always less than adequate for any of those higher-order tasks. The foundations for this were laid during school.
However, in school, even though English was the medium of instruction and we studied English Literature quite seriously, our learning of the language was flawed. The absence of an organic link between what we read and how we spoke reflected this most markedly. While we read (or were supposed to read) Wordsworth, Dickens, Shakespeare, and all the rest of the “traditional” canon—our curriculum was developed by English classicists in the early twentieth century, and changed very marginally in later aeons—we spoke English in school in a grammatically inconsistent, free-flowing and highly idiosyncratic manner. While it was mandatory to speak in English at my school, the English that was spoken was a hodgepodge of improvised grammar, idiom and syntactical forms. (Many fellow students were the first in their family to receive an education in English. Most students spoke what we referred to as our mother-tongues at home, but there were a few who also primarily spoke English at home.) This state of linguistic formlessness was, in retrospect, a reflection of broader, ongoing historical and cultural shifts, our little social habitus serving as a microcosm of the wider society.  
History and politics
English came to South Asia with British colonialism. If you are someone who is interested in the history of the region, you will at some point encounter what may be regarded as the manifesto of colonial cultural indoctrination – Lord Macaulay’s ‘Minute on Indian Education’ (1835), which memorably stipulated, among other things, that a key aim of colonial education in India ought to be to produce an intermediary class between the colonial governors and the native governed—“a class of persons, Indian in blood and colour, but English in taste, in opinions, in morals and in intellect”.
Lord Macaulay’s observations and exhortations went on to decisively shape the English Education Act that was passed that same year by the Council of India under the stewardship of Governor General Lord William Bentinck. The Act envisaged a swift end to official support for Sanskrit, Persian and Arabic education in India, and marked what may be regarded as the beginning of the ‘Age of English’ in the subcontinent and, indeed, in the broader region. Since Independence in 1947, English has only more decisively established itself as the dominant language—officially, legally, economically, academically and socially—in India, its ascendancy underpinned as much by Indian realities as the verities of globalisation. English divides as much as it unites, and it remains a site, instigator and marker of vast and seemingly irreconcilable differences in the country.
Without a doubt, growing up, I was conscious of the politics surrounding this language. Everyday life engendered a subliminal cognitive dissonance – ever-present but unobtrusive. English constituted and opened up an imaginary and habitus that at once alienated and also felt completely natural. It was the language of aspiration. One could not but be aware of its status, its power, its dominion, and its all-encompassing and all-pervasive influence. Growing up where I did, and in my particular cultural and educational milieu, one could not but live within the language, taking it for granted. It was powerful—educationally, economically and socially—but its power felt utterly unexceptional. The status quo was not something that you wanted to raise too many troubling questions about, particularly questions to which you had no answers. The power of English seemed to accord with the natural order of things. Every act of speaking or writing was imbued with this power, and every act of expression was also an act of consolidation of this power. Every utterance was meaningful in that it betrayed your place in society – your education, your family background, and perhaps even your professional background. In a chaotic cultural landscape, the English language served as a seemingly coherent marker of status.
Therefore, it goes without saying that for someone like me to discuss the English language here and now, understanding the socio-political framework of its dominance is crucial.
But the average mind tends to gradually despair of the burden of politics.
Mastery over form
Politics is one thing; mastery over form is something else entirely. No amount of political analysis, interrogation and reflection could help me achieve what I’d like to achieve, and what I’ve always wanted to achieve – to overcome my truncated linguistic heritage, to master this one language that has formed the substratum of my intellectual development, and to use it well. To inhabit the language fully, and plumb the depths of complex ideas and fields of knowledge competently and adroitly. To master new knowledge through mastery over language. To generate new ideas through mastery over language. To delve headlong into ideation and thinking, and to do so dexterously.  
As a student of the social sciences, I find myself confronting questions about language daily, perhaps more so than I’d like. Everyone else seems to be able to get on comfortably in life without experiencing the slightest pang over their linguistic ability. I, on the other hand, feel constrained by the incompleteness of my knowledge. I suppose that in this field, where language is—and ought to be—key, becoming more circumspect about language is inevitable. I am now more than ever aware of my incomplete knowledge and grasp of English. In fact, interestingly, flaws that were invisible or unacknowledged earlier have now forced themselves into my attention. If I used to take liberties with word-usage, syntax and meaning earlier, I am now less careless, or rather less confident about throwing language around without being certain of its appropriateness.
My gripe with the English language is that I can never quite master it completely. I can never quite know all of it completely. I can never quite get it absolutely right. I can never quite make it do adequately what I want it to do. I often fail to produce the exact effect that I intend to produce. Even as I write this sentence, I am aware of its incompleteness, its inadequacy. It is almost the lexical representation of a half-formed thought. The thought that I am conveying now is a mere fragment of the real thing. I want to convey my sense of helplessness at being trapped within the confines of the language, but the sense that emerges here is only a refracted version of that thought.
Of course none of this is exclusive to the English language; it would be foolish to believe that what I’ve just said here is anything but universal to the phenomenon of Language itself. Yet something about the English language—about my tortuous journey with/in this language; about my irredeemably incomplete knowledge of it; about my sometimes half-hearted and sometimes refractory attempts at mastering it; and about my feelings of inadequacy within it—something about the wondrous, powerful and utterly mysterious English language accentuates all those feelings and experiences, all that I’ve talked about here.
Foreign masters
Perhaps this is why I have an abiding fascination with ‘non-native’ writers who mastered English so well and so completely that they surpassed the best of their ‘native’ contemporaries. There are a handful—just a handful—who came to the language as outsiders, or as peripheral-knowers, and eventually inhabited it so fully that they made the language their own. Fully in control of the language, and fully capable of effectively demonstrating and deploying its beauty and power, they produced unsurpassable art.
One such is the Polish-born writer Joseph Conrad. Conrad’s writing stuns with its precision, discipline, power, richness and seeming boundlessness. For someone who came to the English language as an outsider, his skill was amazing. Both the tautness and the expansiveness of his writing pierce with their excellence. Conrad’s psychologically penetrative novels are also literary masterpieces – their literariness is as stupefying as their incomparable probing of the human condition.
The early attempts of foreign masters
Recently, I found myself wondering about Conrad’s learning curve, and how he reached the apogee of his skill. How did he write while he was still learning and developing his skills? It is difficult to find out about this because what we have today are these masters’ triumphant works and successes, not their early (and presumably flawed) tentative endeavours in the language.
The permanence of our flaws
A peculiarity of our time is that everything that we write (on the computer) or do today will likely get stored somewhere. In the endless galaxy of information that is the internet, every time we publish something online, we leave a small but seemingly permanent trace. A consequence of this is that you can never fully outlive your flaws – even as you evolve as a writer, the messy writing of your past doesn’t quite disappear. It’s still there somewhere, not fully discardable. The past lingers as a reminder of your inadequacy. You’re still constantly learning and you’d like to believe that your best is still ahead of you. Nevertheless, you can’t help looking back when it’s all there for you to scour. The reflexive embarrassment that accompanies every ill-judged attempt to look back over your shoulder gets to be a bit annoying. You can see the flaws, but you can’t go back and change them.
Recently, for the first time, the ‘right to be forgotten’ came to be conceptualised in internet law in the European Union. Impractical though it may be, I wonder whether we could extend that concept to our past writing as well?
Back to form

As the world increasingly moves towards a more utilitarian and perfunctory approach to language, I find myself pondering the question of formal beauty even more. My bad English reminds me of how much further I could go into this one language that I know well.

Tuesday, September 20, 2016

The furore over cultural appropriation

Published in Southern Crossings

American writer Lionel Shriver recently delivered a keynote speech at the Brisbane Writers Festival where she discussed cultural appropriation, authorial autonomy, social expectations around works of art and a host of other subjects that have arguably been at the forefront of much critical debate in recent times. Her speech provoked a lot of debate, as well as the usual outrage, grandstanding and squabbling on Twitter. I read the speech as well as some responses to it, and also listened to a discussion about it on the radio. I thought the speech raised many valid questions and argued convincingly against certain contemporary trends. In this blog post, using Shriver's speech as a point of departure, I want to discuss four distinct ideas or areas of concern that I think are pertinent to the conversation at hand. 

Authorial voice

Shriver's speech, at its core, was about authorial autonomy. However, from what I can gather, most people's responses and reactions to her speech have been predicated on what may be characterised as the concept of authorial responsibility. Authorial responsibility is a relatively new concept. A traditional understanding of the creative arts would perhaps privilege the former over the latter, but the zeitgeist seems to place responsibility and care where freedom and autonomy would have previously been. This is especially so in respect of the critical-commentary work that creatives habitually engage in outside of their creative endeavours. At conferences, seminars and talks, for example, the creative is scrutinised and assessed by a set of amorphous but easily recognisable criteria that valorise and emphasise goodness rather than critical rigour. A new schema has emerged that circumscribes the artist's social-critical voice within predetermined bounds. In this schema, the author is expected, above all, to offer commentary that is responsible and socially beneficial. Iconoclasm - that quaint concept that Shriver invoked in her speech - is looked upon unfavourably or with mild amusement. Any defiance, rebelliousness and irreverence on the part of the artist can be dismissed as self-indulgence and arrogance, rather than recognised as an expression of genuine discomfort with the social consensus. Provocation is passé. 

A key feature of Shriver's speech that provoked much outrage was its emphasis on the author's proprietorial autonomy. An author/artist can do with their creation and their creatures, so to speak, exactly what they'd like to do, Shriver averred. This truism - an assertion of an unquestionable truth about the creative process - provoked outrage because it was seen as evincing 'arrogance', 'entitlement' and even 'colonialist' attitudes. 

Therefore, it can be argued that what we are witnessing - and perhaps this is not entirely unexpected when a particular art form goes into decline - is an intellectual push towards the disavowal of the agency of the author, and its replacement with bland conformity. Shriver makes this point succinctly: not only is it difficult today to practise and be sustained by writing as a vocation; you must also constantly grapple with internal misgivings about and external pressures on your agency as a writer. These may not always be apparent or obvious but they are indeed often palpable, further etiolating or rendering toothless what Shriver characterised as the most irreverent of vocations (or avocations). 

American hegemony

So-called American liberals have for a long time now been dictating the terms of cultural engagement for the rest of the world. This is especially true when it comes to the issue of determining how people think about the concept of cultural appropriation. It is not so much that we are told what the correct thing to espouse and believe is. Instead, the actions and manoeuvres of so-called liberal students and sundry others demonstrate to us the limits of our engagement. Certain things are verboten, and we know this not because someone or some movement came out with a cognisable set of rules, but because hostility, skepticism and purblind censorious rage hounded those who did not abide by these unspoken rules. Shriver's speech refers to one illuminating essay written by a so-called 'liberal professor' using the pseudonym 'Edward Schlosser' that highlights this, but any number of such essays documenting the rise of liberal intolerance (I am almost tempted to write 'tolerant intolerance') can be found. It is the act of protesting and making impossible certain thoughts, acts, works, practices and what-have-you that eventually determines what people understand and think when the accusation of cultural appropriation is invoked. As most of these protests and acts of proscription take place in the US, this newly re-inflected concept of cultural appropriation embodies and perpetuates American hegemony in the intellectual arena. And make no mistake, in many instances in the US (again, relevant references may be found in Shriver's speech and elsewhere), when protesters and proscribers accuse someone of cultural appropriation, it is an accusation rather than a criticism - the intention is to demonstrate the accused's venality rather than debate their judgment. American sentiments, American race relations and American problems suffuse and overwhelm all conversations about cultural appropriation, rendering this concept a peculiarly and quintessentially American export. 
The rise of cultural appropriation as a tool of protest and analysis marks the coalescence of a range of phenomena that cannot be summarised or delineated in a short blog post, but it also most definitively signals continuing American hegemony in the field of ideas. 

The concept of harm

Harm is an idea that is very much at the centre of many recent debates relating to culture, education and a host of other areas, and it is one that I find most difficult to grapple with. The idea of harm that pervades literary and artistic debates is perhaps its most nebulous version. To me, the claim that someone has been harmed by a work of art is a contentious claim - it is not one that should be expected to go unquestioned or uncontested. Harm is a concept that has decidedly serious connotations, and when it is used to characterise the effect a work of art has on someone, or, even more problematically, a whole group, it should be used with caution rather than reckless abandon. Taking a concept that is intrinsically objective and verifiable, and rendering it nebulous and unverifiable, and then using this against creatives, is not something that I consider particularly laudable. 

Speaking back to history

The rage against cultural appropriation can be seen as rage against colonialism and western hegemony. It is an expression of anger about historic injustices - anger that, perhaps for the first time in history, can demonstrably shape the cultural or artistic practices that are perceived as being intertwined with those historical injustices, and make these accountable. This is the only point on which I am in agreement with the proponents of the concept of cultural appropriation. The charge of cultural appropriation betrays long-held animosity and resentment at racism, conquest, dispossession, theft, loss, misrepresentation and several other phenomena. However, I am almost reluctant to characterise it as 'long-held' resentment, because this animosity is not really something that transcends generations - anger about cultural appropriation is as much a culturally- and temporally-specific phenomenon as it is one that can be said to transcend time. While anger against injustice may transcend generations, this particular manifestation of the anger that we now see before us, pervading protests about costumes, songs and the occasional yoga studio in the US, is something that emerges from the specificities of the wider contemporary socio-political conjuncture. On one level, it is an anger that is empowered with the means of affecting outcomes, and is thus perhaps more virulent than anything authors, artists, musicians and other creative practitioners have encountered before from their intended audiences. On another, more fundamental, level, it is an anger that feeds into the zeitgeist underlying contemporary policies, norms and practices around recognition of historical injustice. In that respect, the charge of cultural appropriation is one that is altogether appropriate. However, my gripe is that I believe the targets are poorly chosen and, in many cases, the manner of protest (again, protest, and not just criticism) is just plain ridiculous. 
Overzealous American millennials with inflated egos have turned a conversation about history into a shouting-match replete with foul histrionics. They have taken what is still in large part a conversation about historical injustices as well as historical ebbs and flows, and turned it into a farce with disturbing authoritarian undertones. There are many who deride criticism of these antics as something akin to hyperbole - 'You are exaggerating!' - but the truth is, you don't need to exaggerate the transparent unfairness of the many recent instances in which cultural appropriation has featured as the central crime.

Question the criticism

There is now a tendency to countenance the excoriation of certain works of art or the maligning of the intentions of certain authors as 'legitimate criticism'. However, as noted in Shriver's speech, in the US, the heartland of all culture wars, the line between de facto proscription and criticism is one that is constantly shifting, and the arts are hardly immune to these shifts. In such instances, 'criticism' belies far less legitimate tendencies. But putting these aside, questions need to be raised even about the validity of such criticism. Debates need to occur that are free of moral posturing, grandstanding and, worst of all, reckless mudslinging about the intentions and character of authors, creatives, critics and others. 

Wednesday, September 14, 2016

Living with millennials

Living with millennials can be tough. I count as a millennial too, I suppose, but my upbringing disabused me of a lot of the sense of entitlement I see underlying millennials' conduct in Australia. 

Allow me to generalise. 

First, they are quite unreliable when it comes to confirming meetings and timings. I remember receiving texts from a number of people when I was looking for housemates earlier this year - texts confirming meetings with them - only to have no notification (or only last-minute notification) of their intention to cancel. The fact that this inconveniences the person you're supposed to meet - that they, in fact, absolutely need to be told that you can (or intend to) no longer meet them - does not seem to occur to some people. 

I suspect this is partly because of the sheer volume of communication that millennials habitually engage in. They are constantly on their phones, texting out hundreds of messages daily, and it does not occur to them that some texts and some messages are more significant than others. For example, informing someone (even if a contact you made while searching for a house) that you are no longer able to meet them, and that they can then do something else with their time - this is essential. Instead, everything is treated with a kind of bland indifference: 'Oh, I can't make it anymore. Meh, forget it.' 

Then, millennials have very little appreciation of the expectations and norms of house-sharing arrangements in Australia. For example, houses are usually rented out for six months or a year or so - and there is usually, if not always, a contract involved, with a contract period stipulated in it. Therefore, it is necessary, even for later housemates (or housemates joining the original lease-holder), to commit to or stipulate a minimum length of time for their stay. It is quite essential that you commit to something. Yes, I must admit that most people know and understand this intuitively, and respect that custom. However, I have encountered some people who do not seem to comprehend the need for commitment. Instead, their approach is essentially - 'I like this house. Can we do, like, a trial period of a few weeks, and then decide?' This may be a reasonable approach in other areas of life, but housemate-searches involve time, effort and money. Why would I be interested in wasting my time doing so-called trial periods if I can't have some kind of a minimum commitment from you? Such ditzy behaviour reflects naivety and poor judgment, and a lack of appreciation of the norms of house-sharing. 

Finally, when you do end up living with millennials, you are very likely to experience behaviour such as: leaving dishes to pile up in the sink; leaving dirty dishes in the room because they're too lazy to wash them or to even take them to the sink; leaving clothes on racks for days on end even after they have dried; having their friend/date/what-have-you stay for days and days on end, etc. Conversations about house etiquette do not go down well with millennials. They don't know how to respond when concerns are raised with them, which I attribute to a lack of experience of communicating, listening to and resolving concerns. Instead, they prefer to skip the conversation altogether, and offer perfunctory (and perhaps empty) assurances. Further, although this is nobody's business but their own, they eat poorly, and they stuff the fridge with a carton-worth of bottles of Coke. 

To summarise, living with millennials is hard because: 1) they don't respect your time, or do not know how to communicate responsibly - for them, leaving people hanging is OK; 2) they don't appreciate the norms of house-sharing; and 3) they have poor house etiquette. The first reflects a self-centred approach to communication, where all that matters is that you get what you want, bugger everyone else. The second can be put down to a lack of experience of life outside their parental home perhaps, or a lack of awareness of how things are done in the share-housing scene. The third is due to laziness and, again, self-centredness, where you do not care how your actions affect others. For example, if you pile up dirty dishes in your room for days on end, you prevent others from using the same cutlery for their needs. This is discourteous and irresponsible. 

Of course all of us have our idiosyncrasies and bad habits. Far be it from me to tell anyone to be perfect, and to behave wonderfully at all times. But some things - some qualities, some values, some forms of decorum - are foundational, and essential for harmonious living. And you cannot do without them. Unfortunately, my experience of sharing houses with millennials in Australia - and I specify millennials because, in my experience, older folks have a better sense of these requirements and qualities - has led me to believe that, as in other areas of life, there's a selfishness that pervades millennials' conduct in relation to their housemates.