Saturday, April 22, 2017

Gone, But Not Forgotten

No, I have not forgotten that I have a blog. Really, I haven’t.

I honestly don’t know where the last month went. I remember being on break. That, I remember.

Everything since then, however, has been a bit of a blur.

There’s been a lot of writing. Not on the blog (obviously), but in articles and course proposals.

There’s been a fair amount of reading. I’ve created a new course on the representation of gender and disability in literature, so I’ve been doing a lot of research related to that. I’ve also been doing a bit of reading involving postcolonial literature—specifically, works by writers from India and Africa.

I haven’t stuck to the “deep work” rituals quite as diligently as I would have liked, but I will say, I have stuck to them, and I think that’s enabled me to stay on task with the projects that are important to me, and to remain mindful of the drawbacks of “shallow work.”

That said, I also have to acknowledge that sometimes, “shallow work” is necessary. There are always meetings to attend and conversations that need to be had if academic projects are going to move forward. While I often wish that they didn’t take up as much time as they do, there is a way in which these kinds of meetings and conversations make me feel productive and useful. So to that extent, I think they differ from the straight-up shallow work of things like pointless email exchanges that are obviously going nowhere.

And I’m proud of myself because, in contrast to years past, I’ve really (really!) scaled back on those. Email and social media are wonderful things—at times. But they are also enormous drains on time and energy and attention and, in the wrong hands, they become serious (and occasionally stressful) distractions.

So instead of resorting to technology, I’ve focused on hooking. Crocheting, that is. (Whatever did you think I meant?) I’ve been able to make several blankets for one of my favorite charities, Project Linus, and I had the really gratifying experience of seeing one of them in the hands of a deserving child. It made my day, my week, and my month.

Maybe even my year.

It was actually a really fun pattern: a Rainbow Ripple Baby Blanket by Celeste Young. It forms a 12-pointed star and really looks quite cheerful and wonderful. It’s crocheted from the center outward, so it shapes up really quickly at first, and then slows down significantly, as the rounds get bigger and bigger. But the nice thing is, you can see it take shape, and that’s often a key factor in staying motivated. Especially when it looks like so:

See what I’m saying? I’m currently working on yet another one that has less of a “rainbow” feel to it (it’s blue, green, yellow, and purple). It’s really just quite wonderful to make.

I also took a little break and made a little getaway for the weekend of Easter, which meant that I had a busy week last week.

But then again, I’m not going anywhere this weekend, and I’m still looking at a busy week next week so… this too was worth it. Because it resulted in food, fun, friends, and… this.


Long story short, it’s been a good month. And I’m hoping that, from now on, I’ll have a bit more time to drop by the blog and make a note of it.

Saturday, March 18, 2017


That is what has happened to this break.  It has simply flown.

There's been a whole lot of writing and a decent amount of grading. There's been a bit less reading than I had hoped, but that's because I was busy with the writing and the grading.

And yes, there has been knitting. Of course there has been knitting.

The break was a good opportunity to pick back up on the Persian blanket.  This is hexagon #5. I'm about midway through hexagon #6.

There are 24 hexagons in the pattern.  Plus a whole lot of stitching and edging. So I can't really say that I'm a quarter of the way through it, but I can say that it's... moving along.

And of course I began at least one other project and worked on a couple of others.

Disaster also struck in the form of HOLES in not one, but TWO of my pairs of socks. I couldn't believe it. Luckily for me, they were in the cuff and the upper leg, not the foot or--heaven forbid--the heel.

You may not realize this, but darning a sock is not easy, and it is somewhat difficult to get the thing mended without leaving a bump, which would be quite uncomfortable anywhere on the part of the foot that goes into a shoe and gets walked on. But for me, such was not the case, so...

You can probably see the mended spot in the sock on the left. That's because in that case, although I searched high and low, I simply did not have any leftover yarn from that skein that I could use for the mending.

For the pair on the right, I had far better luck: I had spare yarn to use, so I could match it and fix it so it's far less noticeable.

All in all, this felt like a triumph, needless to say. Very few things are worse than spending a lot of time knitting something, seeing a hole develop, and realizing that it might very well unravel right before your eyes.

There was also a bit of cooking. In particular, I got a hankering for something I haven't had for years and years (and years--we're talking, like, when I was a child): Boston Brown Bread.

If you've never had it, you don't know what you're missing. It's a whole grain bread (cornmeal, whole wheat, and rye flours) with molasses, raisins, egg, baking soda, and buttermilk. You pour the batter into a (greased! in the name of all that's holy, it must be greased!!) can, cover it with foil, put it in a water bath and steam it for an hour.

You'd hardly believe it's bread, if you saw it in its preparation stage. This is what it looks like when it first comes out of the oven. Kinda funky, I know.

But this is what it looks like when it's been removed from the can, sliced and decorated with a little butter.

It's really quite tasty.

And my childhood craving was quickly satisfied, needless to say.

Since yesterday was St. Patrick's Day--also, shout out to St. Gertrude, Patron Saint of Cats, since it was her day as well--I decided to make a nice little dinner.

I didn't make green food. I don't do that.

Instead, I made a traditional Irish beef stew with stout, which turned out really well, thanks to a couple of hours of slow cooking.  It really makes quite a difference. I also made Colcannon, which is basically mashed potatoes with leeks and cabbage. I liked it.

I was stymied for a dessert, though. Most of the desserts I found involved chocolate and Baileys, which is fine, but I didn't need to be eating a platter of brownies all by myself.

Then my friend, who joined me in the feast, happened to mention St. Joseph's Day and zeppoles. So I got googling and found a recipe for zeppole di San Giuseppe, and...

They're actually not terribly difficult to make. For me, personally, using the pastry bag was the challenge. There's a little knack to that thing, and I don't make pastry enough to practice. (The zeppoles went far better than the time I tried to pipe icing. We won't talk about that.)

My friend was... astounded! So all in all, the break has been a success, and at this point, my only wish is that it could be a bit longer.

But all things must end, breaks included, and we're at the end of this one.

Friday, March 3, 2017

Into the Deep

Recently, a colleague mentioned Cal Newport’s Deep Work: Rules for Focused Success in a Distracted World (2016), so I decided to read it and give its suggestions a try.

Briefly, Newport argues—and studies have shown—that people in general and writers and intellectuals in particular are increasingly trying to function in a state of more or less constant distraction. As Newport points out, “A 2012 McKinsey study found that the average knowledge worker now spends more than 60 percent of the workweek engaged in electronic communication and Internet searching, with close to 30 percent of a worker’s time dedicated to reading and answering e-mail alone” (6).

What’s gone missing from our work experience is what Newport labels “Deep Work,” defined as “Professional activities performed in a state of distraction-free concentration that push your cognitive capabilities to their limit” (3). Instead, we’re increasingly devoting the bulk of our work day to “Shallow Work,” defined as “Noncognitively demanding, logistical-style tasks, often performed while distracted” (6).

We’re tackling the emails in our inbox after surfing the web for a recipe for twice-baked potatoes after checking Facebook, where our newsfeed provided us with an endless scroll of infotainment. We’re now aware that it’s Natalie Portman’s birthday, that no animals were harmed during the filming of the upcoming sequel to “Godzilla,” that there is a way to prevent toenail fungus from decreasing your quality of life, and that if you aren’t outraged by your politicians, you aren’t fully alive.

This is what is now passing for “knowledge.” Oh, and by the way, it’s now 4 p.m. and you’ve officially wasted 2 full hours that you can never get back, but on the upside, you’ve responded to no fewer than 15 emails asking you more or less pointless questions and/or reminding you of upcoming meetings that may or may not be devoted to “revisiting” and/or “addressing” some of these pointless questions, which have been generously reframed as “issues for discussion.”

The problem with spending so much time on this kind of distracted mental busy-work is that, over time, it’s a tendency that will actually—and significantly—reduce your ability to activate the level of attention and concentration necessary to engage in the kind of deep work that leads to intellectual growth (and professional development or career advancement).

Newport’s initial remedy is to simply recognize that the mental concentration necessary for deep work is a skill that must be practiced and honed, on a more or less daily basis. And to do this, we need to make a conscious effort to relegate shallow (but unavoidable) activities to the periphery of our lives.

Or, better yet, to eliminate them entirely.

Deep Work offers a series of helpful suggestions for how to spend less time in “the Shallows” (i.e., doing “shallow work” like responding to emails and writing up documents and reports for upcoming meetings). Newport recognizes that, in the “knowledge industry,” “[i]f you send and answer e-mails at all hours, if you schedule and attend meetings constantly, if you weigh in on instant message systems … within seconds … all of these behaviors make you seem busy in a public manner” (64).

More importantly, Newport suggests that “[i]f you’re using busyness as a proxy for productivity, then these behaviors can seem crucial for convincing yourself and others that you’re doing your job well” (64).

But really, you’re not. And in my experience, an even larger problem is the extent to which this kind of behavior can become the norm. Eventually, if you aren’t doing busy-work, you will be perceived by your colleagues as not really “doing your job.”

Those who constantly wade in the shallows instinctively realize that misery loves company. Busyness breeds more busyness (and less business), and those who seek to pursue deep work are perceived as “selfish” or “isolationist” or characterized as “not pulling their weight.”

On this particular point, Newport quotes the late Richard Feynman who, early on in his career, realized that if he wanted to pursue ground-breaking work in physics, he would have to distance himself from the busywork of academia by cultivating a “myth of irresponsibility.”

When asked to serve on committees, Feynman simply said “no” and embraced the notion that this marked him as “irresponsible.” Because ironically, over time, the “irresponsible” are weighted down with far fewer shallow-work-related responsibilities and commitments.

If you want to replace busyness with actual productivity but have become mired in shallow work, Newport has a few suggestions for ways to help return you to the mindset of deep work. In order to “move beyond good intentions” you need to “add routines and rituals” to your work day—and work life—in order to help “transition into and maintain a state of unbroken concentration” (100).

As we all know, this is easier said than done. The lure of social media is strong, and the cumulative weight of the distractions it offers is great.

To cultivate a life that devotes maximum time (and effort) to the kind of deep work that will produce satisfying intellectual achievements (and hence career advancement), Newport offers several behavioral options, all the while noting that it is important to “choose a philosophy that fits your specific circumstances, as a mismatch … can derail your deep work habit before it has a chance to solidify” (102).

On the one hand, you can withdraw completely from the distractions of shallow work, adopting what Newport calls the “bimodal philosophy” in which you give yourself “at least one full day” for a bout of deep work (108). Sometimes this approach is accompanied by an “internet sabbath” (a day of the week devoted to remaining disconnected from Facebooking and all things Googley) or, more radically, an “internet sabbatical”—that is, an extended period of time in which you unplug from social media completely.

Again, Newport cites a prominent writer who cheerfully notes how happy his life has been since 1990, when he deleted his email account, never to open another. While this sounds rather wonderful, I'm quite certain I'd lose my job if I did that. I’m not only contractually required to have an email account, I’m required by college policy to check it regularly. If I don’t, and a student ends up having a serious academic issue because of my carefree technological disconnection, I’m the one who will be held accountable.

More to the point, though, I’m not sure I could survive such complete disconnection these days. And the sad thing is, I know full well that I used to. I went to college and graduate school back in the days when there was no Internet or social media to offer constant distractions, at a time when email was just beginning to become “a thing.” (I actually didn’t have an email account until I started my first job in 1995.)

At the same time, however, I’m aware that my own intellectual biorhythms typically prevent me from working for long stretches at a time on a mental task. I can, if I put my mind to it, achieve several hours—usually about 4—of mental focus and “flow,” but beyond that, my mental wheels begin to spin and I'm really not accomplishing much for my efforts.

So for me, what Newport refers to as “the rhythmic philosophy” of deep work is the way to go. Under the rhythmic philosophy, you simply try to discover a rhythm for incorporating deep work into your daily life and then consciously set aside time to practice the skill often enough to make it a habit.

To do this, you have to create a routine and a ritual. Newport recommends strictly scheduling your time over the course of the day, from one hour to the next, making specific times for deep work and not allowing the distractions or "commitments" of shallow work to creep into those times.

I’ll admit, when I first read this, I was resistant: I like my freedom.

But I also had to recognize that my freedom was leaving me largely distracted and not nearly as productive as I’d like to be. I’d find myself having a good work day, followed by a string of “distracted days.” And when I tried to fire up the energy to have another productive day, it would feel like it took forever to get going.

As it turns out, it felt that way because switching in and out of states of attention and distraction creates “attention residue”—you’ve moved from one task to another, but a portion of your attention is still a bit “stuck” on the first thing you were working on.  This “attention residue” gums up the works, making it that much harder to achieve a state of focused concentration.

So if you switch back and forth over the course of a day between attention and distraction, multitasking your way through your intellectual life, you will end up with very “sticky” mental processes and an exhausted mental musculature. It will require that much more work to achieve a state of concentration necessary to engage in deep work.

Constant distraction will leave you mired in the mental muck of the Shallows.

Overall, what I like about Newport’s approach is that it is realistic and flexible. He offers general “rules” for pursuing deep work and then suggests various ways in which those rules can be implemented, always noting that the key is to find the system that works best for you.

As I said, I was initially skeptical about having to map my day out so stringently. I tend to keep a “to-do list,” but Newport insists that we need to be even more precise than that. We need to identify what we plan to do from hour to hour (acknowledging that, on some days, the best laid plans, etc. etc.) and then we need to 1) stick to it—the only exception being, if you find yourself in a productive state of deep work, it’s okay to stay in that state and bump the next item on the agenda—and 2) keep a record of the fact that we’re sticking to it, so that we can track our progress.

In my own case, when I actually sat down and made a plan for the next day—all the while thinking, “Okay, this probably isn’t going to work at all and I’m probably going to hate this like poison”—I found that, lo and behold, when I finished, I felt far more optimistic about my ability to be productive the next day.

And when I awoke with that sad-sighing-and-heel-dragging feeling of, “Oh, do I have to??” I had a schedule that I had arranged in manageable pieces, so I found myself thinking, “Okay, well, this isn’t so bad, just get started on this task…” and I was able to get to work more quickly.

And when I did, two hours flew by, and I got a lot of writing done. And when it was done, not only did I not feel exhausted, I actually felt a bit… invigorated. It felt possible to do even a bit more writing, and if I opted not to, it was only because I cheerfully decided that my time would be better spent reading instead—the task that I had outlined for myself the day before.

At this point, it’s too soon to say whether I’ll be able to implement all—or any—of Newport’s suggestions and rules over the long haul, but I’ve decided I’m going to give them a wholehearted try and see what happens. This upcoming week is my last week of classes before break—the following week is Spring break.

So this will be a good time to test these strategies out, first under work-week conditions and then in a “vacation” setting. Honestly, I’m kind of excited and interested to see how much I can actually accomplish—and needless to say, this is a far better state of mind than the constant feeling of, “Well, gosh, I really didn’t get much done today, now did I?”

I think this feeling testifies to the fact that, as Newport points out, deep work is a state of mind that taps into what is most satisfying about human activity. As Newport suggests, “if you spend enough time in this state, your mind will understand your world as rich in meaning and importance” (79).

So here’s to more productive days rich with meaning and intellectual significance. 

Sunday, February 26, 2017

Slow and Steady

Recently, I happened upon a couple of texts that have been helping me think about the main challenge I've faced over the past year or two: how to stay productive and not get down on yourself when you can no longer count on feeling “great” (or even just “healthy”) from one day to the next.

It’s an exercise in patience, obviously, but it’s a bit more than that as well. It requires a new mindset.

I gained some insight into how to implement this mindset by reading Anthony Ongaro’s October 22, 2016 blog post, “25 Simple Habits You Can Build From Scratch.” While I’m not necessarily interested in building a new habit, per se, I have been seeking a way to achieve “the consistency needed to make significant changes over time.”

I’m a planner and a doer. If I plan it, I want to do it. So if I plan it and then I can’t do it, I resort to doing one of two things: 1) feeling bad, and 2) beating up on myself for being a failure. Neither of these behaviors is good for the long haul—and they accomplish nothing.

I’ve found myself inadvertently facing a situation not unlike the one that Ongaro describes, in which I find myself “jogging for 10 hours straight then not jogging for 19 days.” Except that I’m not jogging, of course, because I don’t jog. But you get the idea.

As Ongaro points out, “jogging for 30 minutes every day for 20 days in a row” would result in the same number of hours spent on the activity, but lead to a far more productive (i.e., “beneficial”) outcome. But based on my own experience, it can be hard to make that switch to a drastically reduced level of daily activity if you’re used to being able to work for 10 hours straight.

It forces us to rethink how we define ourselves in relation to our world and our productivity. I don’t think this is too broad of a statement to make, because it echoes the insights offered by Maggie Berg and Barbara K. Seeber’s book, The Slow Professor: Challenging the Culture of Speed in the Academy (2016). Berg and Seeber suggest that we need to rethink how we define what constitutes “productivity,” particularly when it comes to the profession of college teaching.

Building upon the mindset of “the Slow movement” (Slow Food as opposed to fast food, for example), Berg & Seeber argue that “[b]y taking the time for reflection and dialogue, the Slow Professor takes back the intellectual life of the university” (x).

My goal was much less far-reaching and far more self-involved: I wanted to take back my own intellectual life and productivity, so that I felt less disheartened by the way that my body was not syncing with the dictates of my mind.

Going to bed at night, night after night, thinking, “Tomorrow, I will feel better and I’ll be able to do X. And Y. And Z!! Yes, Z needs to get done tomorrow, and I will do it!!” and then awaking to realize, “I’ll be lucky to do X today,” and midway through the morning it becomes clear that it is unlikely Y will get done and Z is just a pipe dream… well, that can be discouraging, to say the least.

In my own case, it wasn’t that I was simply feeling “lazy” or “procrastinating” (although I’m as guilty of those feelings as the next person), it was that my physical health was affecting my mental focus and determination to such an extent that I could no longer “be” the kind of writer, scholar, and thinker that I had always prided myself on being.

The insights of Berg and Seeber helped me to recalibrate, both emotionally and intellectually, by offering a new way of thinking about how my mindset might have been shaped by the “corporatization of the university”—that is, by the idea that we’re always racing against the clock, fixated on the idea of productivity and efficiency.

As Berg and Seeber point out, there is a “link between time pressure and feelings of powerlessness” (26)—if we feel we have to finish by X date (or hour), the realization that we aren’t going to be able to do that can leave us feeling particularly helpless and drained.

But what if we simply reframe our thinking, so that we don’t succumb to (or, at least, try not to succumb to) “time pressure”? Berg and Seeber argue that this would help us to develop a sense of—and a “place” for—“timeless time” in our lives. Ideally, we’d silence the “inner bully” in our minds, tune out the voices of all of the people out there who think that professors in particular and teachers in general aren’t “really” doing anything anymore these days (the “must be nice to get the entire summer off” contingent of the population), and realize that any given writing and research task, in order to be well-done, will probably take at least twice as long as you had hoped.

Sounds easy, I know. But reading that sentence actually makes me wince. (“TWICE as long?! I don’t want it to be like that! Because my summers are more relaxed than what most people experience, working 9 to 5 year round, so I really don’t have any excuses, and hey, didn’t I just take a full hour off to watch an X-Files rerun? Well, the research certainly isn’t going to get done if I keep doing that, now is it?!”) 

The point, I think, is not the question of time or productivity—it’s a question of attitude. To do your best work, you have to be your best self, and you simply can’t do that if you’re constantly setting the bar too high and then failing (or crawling to bed bruised and defeated because the bar actually fell from that great height when you accidentally knocked into the goal post… and both the bar and the goal post subsequently hit you on the head). As Berg and Seeber note, “If we think of time only in terms of things accomplished (“done and done” as the newly popular saying goes), we will never have enough of it” (55).

Yes, even those of us with the (alleged) “summers off.”

Because, as Berg and Seeber point out, “Slowing down is about asserting the importance of contemplation, connectedness, fruition, and complexity” (57). The first idea is not always the best idea, and it takes time to work towards what might eventually be the best—the most connected, fruitful, and complex—iteration of an idea.

The Slow Professor reminded me of what I’ve always known, more or less, but chosen not to highlight about the nature of my own work. That “periods of rest” also have “meaning” because “research does not run like a mechanism; there are rhythms, which include pauses and periods that may seem unproductive” (57). The British novelist Virginia Woolf used to remind herself of this in her journal: when she got on her own case for not writing enough, she would recollect that the creative and intellectual life requires periods of “wool gathering.”

As Berg and Seeber point out, we need to learn to wait (64), to openly acknowledge “the list of detours, delays, and abandoned projects” that we typically hide from view (65), to recognize that “More is not necessarily better”—although paradoxically, sometimes, it actually is (66)—and to give ourselves the time we need to read, think, and reflect (the essence of “research”) so that we can “follow our heart” (i.e., pursue projects “driven by genuine curiosity about a problem even if that is not a ‘hot’ topic at the moment” [68]).

As the fable of the Tortoise and the Hare teaches us, “Slow and steady wins the race.” But more importantly, we need to realize that it isn’t always—because it simply can’t and shouldn’t be—a race.

Monday, February 13, 2017

Who's In Charge Here?

In the wake of the social and political turmoil that has marked the past several weeks, I spent the weekend reading Albert Bandura’s Moral Disengagement: How Good People Can Do Harm and Feel Good About Themselves (2015). As his title suggests, Bandura—an Emeritus Professor of Social Science in Psychology at Stanford and the founder of what has come to be known as “social cognitive theory” —is interested in the question of “moral agency,” the question of when, how, and why human beings behave in ways that align with their stated moral values.

In particular, Bandura is interested in identifying those points at which human beings set aside their moral values and behave inhumanely towards others. In these instances, people rationalize and justify doing harm to others, Bandura argues, not because they are immoral, but because they are “morally disengaged” from the situation and/or the people affected by their behavior.

Bandura’s concept of moral agency thus dispenses with the idea that some people are just innately “bad” or “evil,” while others are inherently “good” or “virtuous.” Instead, Bandura argues that “Moral agency is … exercised through the constraint of negative self-sanctions for conduct that violates one’s moral standards and the support of positive self-sanctions for conduct faithful to personal moral standards” (1)—in short,  that we beat ourselves up for doing things that are not in keeping with our moral values, but feel good about ourselves if we do things that align with our values.

Remember Jiminy Cricket’s advice, to “always let your conscience be your guide”? Well, Bandura argues that this kind of moral guidance has “dual aspects”—it can be “inhibitive” (“don’t do that—you know it’s wrong!”) or “proactive” (“I want to do this, because that’s the kind of person I am.”) 

The problem, according to Bandura, is that “Moral standards, whether characterized as conscience, moral prescripts, or principles, do not function as unceasing internal regulators of conduct” (2)—to put it more simply, our internal moral guidance system isn’t always automatically switched “on.” And that’s not just because Jiminy Cricket is taking a nap or because he stepped out to use the bathroom or go get a sandwich. He might be right where he always is, but he’s… staring at a speck on his sleeve, pretending he doesn’t see what you’re up to.

Even more troubling are the moments in life when you’re doing something quite immoral or “wrong” and Jiminy Cricket is not only not objecting to it, he’s actually cheering you on (“anything your heart desires will come to you—no request is too extreme—who said it’s embezzlement?—fate is kind!!”). These moments—when Jiminy Cricket goes rogue or offline, so to speak—are points of “moral disengagement.” As Bandura notes, “People often face pressures to engage in harmful activities that provide desired benefits but violate their moral standards. To engage in those activities and live with themselves, they have to strip morality from their actions or invest them with worthy purposes. Disengagement of moral self-sanctions enables people to compromise their moral standards and still retain their sense of moral integrity.” (2)

This is why, in many instances, people who have engaged in behavior that strikes the rest of us as blatantly immoral remain convinced that they are “good” people who didn’t really do anything “wrong,” per se. And why they will insist on trying to convince the rest of us that we have them all wrong, that they aren’t callous or cruel or deceitful or downright “evil.” As Bandura puts it, “people sanctify harmful means by investing them with worthy social and moral purposes” (2). Or, as the adage has it, “The road to hell is paved with good intentions.” In the moments when they’re behaving immorally, people are quite convinced that, because their intentions are good, their behavior isn’t all that bad.

This continued belief in one’s own moral stature and integrity stems from the fact that, in instances of “moral disengagement,” people do not alter or abandon their stated moral values. Instead, they “circumvent” their standards by opting to “strip morality from harmful behavior and their responsibility for it” (3). This kind of seductive moral strip-tease, if you will, can occur in various ways. On the behavioral level, people can believe that they’re ultimately preventing more suffering than they’re currently causing, and they can use language in ways that enable them to uphold this perception (through euphemisms that cast their behavior as “innocuous” or that work to “sanitize” it) (2). On the level of agency, people operate by “displacing responsibility to others and by dispersing it widely so that no one bears responsibility” (3)—personally, I consider this as the “everyone-does-it-and/or-anyway-s/he-started-it” system of moral reasoning.

Finally, on the level of what Bandura labels the “victim locus,” people can circumvent their own stated moral values by dehumanizing the people affected by their behavior—in particular, “by divesting them of human qualities or attributing animalistic qualities to them” (3). This is, on one level, the impetus behind the “blame the victim” mentality in general. It is a mindset that is even more toxic than “blame the victim,” however, because it insists that the victim isn’t “one of us,” so s/he isn’t fully human and thus not capable of feeling the way we might feel if these things happened to us.

And if you’re about to deploy the “Well-but-there’s-nothing-anyone-can-do-about-that-because-people-are-naturally-aggressive-and-this-mentality-has-been-around-since-the-dawn-of-time” argument, I’ve got news for you. Research has shown that “stimulating the same brain site activated aggression or submissiveness, depending on the animal’s level of social power” (18, emphasis added).

So don’t blame it all on the human brain. Research shows that, if we want to regulate aggression by biochemical or neurological means, we still need to examine how it is shaped and how it operates “in the context of power relations and other social determinants of aggression” (18).

Perhaps more importantly, Bandura insists that we must regard morality as a more complex operation than we typically do. As he points out, “Higher levels of moral ‘maturity’ do not necessarily foretell stronger commitment to humane conduct. This is because justifications, whatever their level, can be applied in the service of harmful activities as well as benevolent ones” (25). As we all know, context is important: soldiers in battle override their personal moral standard not to kill because they justify what would otherwise be considered a harmful activity as a potentially “benevolent” one (by adopting the moral reasoning for a “just cause” or a “just war,” for example). As a result, we do not morally evaluate all acts of killing in the same way. But at the same time, when a drug dealer refers to his “soldiers” and we characterize their acts of murder as components of an ongoing “war,” for example, we are using language to sanitize—if not legitimate—behavior that does not align with our stated moral values. (The drug dealer and his flunkies aren’t actually “soldiers” at all and their “war” is not only unjustified, but morally unsanctioned and ultimately illegal.)

Which brings me to the point of Bandura’s that I find most compelling, as an English professor and general wordsmith. According to Bandura, “individuals can exercise some measure of control over how situations influence them and how they shape their situations” (45)—in particular, Bandura argues, “People do not usually engage in harmful conduct until they have justified to themselves the morality of their actions” (49). In short, we can strive to cultivate a mindset that helps to prevent moral disengagement.

One of the most potent ways of doing this is by looking at how we speak to—and about—others that we are inclined to dislike, disagree with, or otherwise condemn. As Bandura points out, the increasing predominance of online communications has “ushered in a ubiquitous vehicle for disengaging moral self-sanctions from transgressive conduct” (68). A “highly decentralized system that defies regulation,” the internet is a place where, “Anybody can get into the act, and nobody is in charge” (68).

Except that, ultimately, we are in charge. If nothing else, when we go online, we can strive to ensure that our own moral self-sanctions continue to operate. If I would hesitate to look someone in the eye and call him/her, for example, a “friggin’ orange Cheeto who spews nothing but lies and hate” or a “pathetic little snowflake who needs to grow up and stop expecting mommy to change their diaper,” I can refrain from typing it on Twitter or Facebook or in the comments section on someone’s webpage.

Because, let’s be honest, most of us will self-censor (to an extent) when face-to-face with someone we strongly dislike or vehemently disagree with, because we know that such behavior is potentially rude and unkind and may make us look bad in the eyes of others. But, as Bandura acknowledges, “Cloaked in the anonymity afforded by the Internet, people say cruel things that they would never dare say publicly” (39). The problem, however, is that over time, if you say enough cruel things anonymously or repeatedly justify your own contempt for others who disagree with you (because even if you don’t say it out loud or type it online, odds are, you’re saying it to yourself), your moral self-sanctions begin to weaken with respect to the individuals or groups that you’re targeting.

Keep it up, and you’ll start speaking—and behaving—in ways that you would never condone if you were morally “engaged” with the humanity of those around you, because you’ll cease to see the objects of your contempt as “human” in the same way that you yourself are. You’ll psychologically justify a lack of compassion, the practice of overt aggression, and a mindset that sees your victims as always “guilty,” never potentially “innocent.” And often without even realizing that this is what is happening to you.

This is not to say that we should simply accept or normalize words and behavior that we fundamentally cannot condone. And this is certainly not about silencing or circumscribing the voices of others. It’s not about “going along to get along”—a phrase which itself seeks to “sanitize” unacceptable or potentially immoral behavior.

What I’m suggesting is that empirical psychological research has shown that it is all too easy to get used to stereotyping and dehumanizing people if you never actually have to see the people you’re “talking” to or see the effect of your words. And if the algorithms that structure your social media feeds lead you to believe that “everyone” does it and that “most people” agree with you, you’ll be less likely to perceive your words and actions as unkind or immoral or “wrong.” As Bandura points out, the “unknowns” that go hand-in-hand with online communications “make it easy to persuade oneself that one’s conduct caused little or no harm and that one is merely exercising the constitutionally protected right of free speech” (69).

So what can the morally engaged do, in times of collective crisis, when everyone has gotten into the act but nobody seems to be in charge? 

Well, for starters, we can remember that, although behavioral psychology often focuses on what used to be labelled “man’s cruelty to man,” there is “equally striking evidence that most people refuse to behave cruelly toward humanized others under strong authoritarian commands, and when they have to inflict pain directly rather than remotely” (91).

The key phrase in that statement, of course, is “humanized others.” Not “Cheetos” or “snowflakes” or “deplorables” or “libtards” or “Trumpsters.”

When typing hostile comments or aggressively “debating” on social media, maybe we should imagine ourselves, not saying these things publicly to someone—since moral disengagement can quickly enable us to tell ourselves that the objects of our scorn “started it” and/or “deserve it” and because we’ve all been raised on the very (VERY) flawed maxim that “sticks and stones may break our bones but names will never hurt us”—but instead imagine ourselves walking up to a total stranger on the street and repeatedly punching or kicking him/her because we think we “should” because something about them has told us that we “know” what they’re all about.

And, when or if that person begins to scream or kick back, we can imagine ourselves simply laughing gleefully as we continue to punch and kick them, while others crowd around (and maybe join in too). When we’re done, we can just imagine ourselves walking away to go get a coffee, comfortable in the idea that there’s been “no harm done.” And the next day, we’ll do it again—and maybe even look for the person we did it to the day before.

That’s the scenario of moral disengagement.

By contrast, Bandura notes, “The affirmation of common humanity can bring out the best in others” (91). This doesn’t mean we insist that we’re “all the same,” somehow. It simply means that we realize that we’re human beings who often disagree, that we can control our own levels of moral engagement (or disengagement), and that social forces and the language we use to talk to and about others have not only a measurable, but also a profound effect on our own levels of aggression and moral engagement.

If, instead of labeling someone, we take a moment to sympathetically think about what the world might look like from that person’s perspective—even if it is a viewpoint we cannot possibly agree with, support, or condone—we strengthen and affirm our own moral functioning. 

Empathy is not a sign of stupidity, weakness, indecisiveness, or capitulation to injustice. It’s an exercise in moral reasoning that strengthens our own moral fiber and the integrity of our moral musculature.

And, believe it or not—and despite what may seem to be overwhelming evidence to the contrary—we really do change the world a little whenever we practice empathy.

Because, as Bandura points out, “vicarious influence”—that is, simply watching someone else behave kindly and morally—helps us to “acquire lasting attitudes, emotional reactions, and behavior proclivities toward persons, places, or things that have been associated with the emotional experiences of others” (93). In these moments, we “make what happens to others predictive of what might happen to oneself” (93).

“I’m not going to say that to this person because that could be me.” “I’m not going to say that because I’ve been on the receiving end of comments like that and I know they stay with you and hurt.” “If someone called me a stupid snowflake or a paunchy Cheeto or deplorable, well, gosh, I’d be kind of upset about that.” “She can’t help who her father is.” “I’m just not going to shop there anymore.”

There are plenty of ways to make our voices heard—and our strength and integrity and moral convictions felt—without resorting to the dark side of anger and the language of dehumanization and cruelty.

If no one is in charge, then we each need to be in charge—of ourselves.

If we want to be morally engaged agents of change, I think we need to resist the constant temptation to succumb to the social pressures and the language of moral disengagement.

And yes, we’ll all slip up from time to time. We’re only human.

And that is precisely the point: we’re all—and only—human.

Friday, January 20, 2017

Gearing Up

I've just finished a load of laundry and, like many of my fellow-Americans, I'm waiting for Armageddon, so I figured it was high time I caught up on my blog.

I'm just joking (sort of). After all, where would we be if we all lost our sense of humor? 

It's been a busy and productive two weeks. I did major revisions on an article on Shalamov, and now I'm in the process of doing major revisions on an article on Zola, and when I'm done with all that, I'll need to finish up an article on John Hersey.

I'm not sure I'm going to be able to get all of this done by Sunday night, needless to say, since I'm also going to be doing things like getting a haircut, packing, and traveling. Oh, and I'll be going to a little rally tomorrow as well, because I figure if I can do a little something that'll piss off Kellyanne Conway, I'll definitely be taking the time to do it.

Like many, I've felt discouraged by the direction we've headed in--and when I say that, I mean not simply the political direction, which in many ways seems unlikely to align with some of my own ideals in the upcoming years, but simply the direction of ... mockery and incivility and, well, downright mean-spirited rudeness.

I don't think that's a sign of "thought" or "progress" or "greatness," regardless of people's specific, individual social, political, economic, or religious beliefs. So I hope that we can stop some of that. Actually, I hope that we can stop ALL of that, but I'll settle for "some."

But I'm not optimistic, because we've elected a person to lead us who, in my opinion, thinks being rude makes him look "cool" somehow. And that it's sorta funny, you know, being a little mean and a little crude and a little obnoxious.

I disagree.

So as I gear up for the upcoming semester and move into the full swing of 2017, I decided that this is what I'd like to focus on for myself: to make a commitment to deliberately and insistently opt for kindness whenever and however possible.

Don't get me wrong, I reserve the right to indulge in a bit of snark here and there--I don't think I could survive without it, quite frankly.

Because I'm not a snowflake: I'm not delicate and I don't melt.

And I'm glad that every single class that I'm teaching focuses on literature of other nations and/or literature by populations that have experienced prejudice or oppression in the US. It's important that we hear those voices, and realize that the "greatness" of the past always came with a price for someone.

We don't want to lose what we've learned from the lessons of the past. We just don't.

And I'm glad that I teach skills like critical thinking and that I encourage people to expand their vocabulary and their viewpoint and their perspectives on the world. And that I get to do all of that while also enjoying myself--that the work I do nourishes my own mind and heart and spirit, while it (hopefully) does the same for others.

So here's to the future. We got this.

Thursday, January 5, 2017

Back Around

The holidays have come and gone, and here we are again, starting a new year.

There were some high points during my time away, and some lows. Like many, I was a bit depressed by the prospect of 2017, for all of the (to my mind very obvious) reasons that have made the prospect of 2017 a depressing one for many.

But now that I'm back home, and the holidays are over, and I'm in it ("it" being 2017), I'm determined to make the best of things and not waste a lot of time wishing (or worrying--unless and until it becomes absolutely necessary to do so, that is).

So I've been spending the past couple of days getting back into the swing of writing, and I'm pleased to say that it's going well. I've mapped out what I need to get done over the next few weeks, before classes start, and so far, I'm staying on-task (and thus, on schedule).

I don't always do New Year's resolutions, unless I feel that the preceding year has been marked by a glaringly bad behavior or mindset in need of correction, and this year, I didn't really feel compelled to commit to any sweeping changes.

But I did decide that I want to manage my time on social media and email a bit better and, along with that, to carve out more time to reflect on and, when necessary, adjust the pace I set for myself.

So, for example, instead of beating myself up for not getting in shape right away! or not finishing that article right away! I'd like to commit to taking the time to think about why I might not be motivated to work out when I usually enjoy swimming or biking (is my body trying to tell me something right now, and if so, isn't that okay, really?) and why I might not be able to get past the writing "snarl" that has me entangled.

To help me achieve that goal, I've adopted a policy about when I will (and will not) be available via email and how much time I'll spend on social media (and what I'll do when I'm on there). My New Year's resolution (such as it is) is to stick to those choices, so that I can have my evenings and my weekends to myself, and feel better about the things that I accomplish--not harried or "behind" or stressed out.

Because the thing is, if a professor doesn't set boundaries, s/he can end up working constantly--there's always someone who needs something and always something that needs doing. Something to read, something to write, someone to respond to.

And that's not a bad thing, except that it makes it much harder to conceive of a professor's job as a job with clear boundaries that determine when s/he is "on the job" or "off the clock."

I started thinking about this after I began reading The Slow Professor: Challenging the Culture of Speed in the Academy (2016) by Maggie Berg and Barbara K. Seeber. Berg and Seeber examine the ways in which academia now compels those within it to move quickly and efficiently--a "corporate" mindset that is not necessarily conducive to deep thought or probing insights.

In fact, it's a mindset that many successful corporations are now moving away from, precisely because it is not conducive to thought or reflection, skills that are as necessary in business as they are in the pursuit of book-learning.

As Berg and Seeber--and many, if not most intellectuals employed within academia--acknowledge, jobs in higher education are positions of privilege. We have the option of not only thinking, but actually saying, "Hey, wait a minnit" and arguing for a different pace, something that many people in many jobs do not have.

But perhaps they should have that option. Because really, how rich do we all want to be? How much money do we want to be clutching when the Grim Reaper comes knocking?

Fast food isn't better for us--quite the contrary. Fast cars are fine, but you can only go so fast (outside of the Indy 500) before you're going to end up facing a problem. And if you quickly accomplish 8 million tiny tasks, someone out there is (quickly) going to find another 500 or so that you can do as well, since you "have the time."

At some point, I think we can easily lose sight of what we're rushing around for and why. If we want to spend time with loved ones, we need to slow down and pare back--to live and think more deliberately about what we want to do and when.

And if doing so leads to being branded a loser or a slacker, we sort of need to be "okay" with that.

Because success in our careers is a good thing, but only if we actually have the time to stop and smell the roses that we've so assiduously planted along the way.

And time spent with loved ones doesn't always have to be about getting things done or "catching up" or "keeping up." It could just be about... being.

For my part, I've decided to commit to looking and thinking more deliberately this year--to taking the time and effort to move more slowly, when I feel that I'm getting caught up in the rat-race that is our world.

So with any luck, it will be a year of pondering and savoring the food for thought that 2017 offers.