Sunday, February 26, 2017

Slow and Steady

Recently, I happened upon a couple of texts that have been helping me think about the main challenge I've faced over the past year or two: how to stay productive and not get down on yourself when you can no longer count on feeling “great” (or even just “healthy”) from one day to the next.

It’s an exercise in patience, obviously, but it’s a bit more than that as well. It requires a new mindset.

I gained some insight into how to implement this mindset by reading Anthony Ongaro’s October 22, 2016 blog post, “25 Simple Habits You Can Build From Scratch.” While I’m not necessarily interested in building a new habit, per se, I have been seeking a way to achieve “the consistency needed to make significant changes over time.”

I’m a planner and a doer. If I plan it, I want to do it. So if I plan it and then I can’t do it, I resort to one of two things: 1) feeling bad, or 2) beating up on myself for being a failure. Neither of these behaviors is good for the long haul—and they accomplish nothing.

I’ve inadvertently found myself in a situation not unlike the one Ongaro describes: “jogging for 10 hours straight then not jogging for 19 days.” Except that I’m not jogging, of course, because I don’t jog. But you get the idea.

As Ongaro points out, “jogging for 30 minutes every day for 20 days in a row” would result in the same number of hours spent on the activity, but lead to a far more productive (i.e., “beneficial”) outcome. But based on my own experience, it can be hard to make that switch to a drastically reduced daily level of activity if you’re used to being able to work for 10 hours straight.

It forces us to rethink how we define ourselves in relation to our world and our productivity. I don’t think this is too broad a statement to make, because it echoes the insights offered by Maggie Berg and Barbara K. Seeber’s book, The Slow Professor: Challenging the Culture of Speed in the Academy (2016). Berg and Seeber suggest that we need to rethink how we define what constitutes “productivity,” particularly when it comes to the profession of college teaching.

Building upon the mindset of “the Slow movement” (Slow Food as opposed to fast food, for example), Berg & Seeber argue that “[b]y taking the time for reflection and dialogue, the Slow Professor takes back the intellectual life of the university” (x).

My goal was much less far-reaching and far more self-involved: I wanted to take back my own intellectual life and productivity, so that I felt less disheartened by the way that my body was not syncing with the dictates of my mind.

Going to bed at night, night after night, thinking, “Tomorrow, I will feel better and I’ll be able to do X. And Y. And Z!! Yes, Z needs to get done tomorrow, and I will do it!!” and then awaking to realize, “I’ll be lucky to do X today,” and midway through the morning it becomes clear that it is unlikely Y will get done and Z is just a pipe dream… well, that can be discouraging, to say the least.

In my own case, it wasn’t that I was simply feeling “lazy” or “procrastinating” (although I’m as guilty of those feelings as the next person); it was that my physical health was affecting my mental focus and determination to such an extent that I could no longer “be” the kind of writer, scholar, and thinker that I had always prided myself on being.

The insights of Berg and Seeber helped me to recalibrate, both emotionally and intellectually, by offering a new way of thinking about how my mindset might have been shaped by the “corporatization of the university”—that is, by the idea that we’re always racing against the clock, fixated on the idea of productivity and efficiency.

As Berg and Seeber point out, there is a “link between time pressure and feelings of powerlessness” (26)—if we feel we have to finish by X date (or hour), the realization that we aren’t going to be able to do that can leave us feeling particularly helpless and drained.

But what if we simply reframe our thinking, so that we don’t succumb to (or, at least, try not to succumb to) “time pressure”? Berg and Seeber argue that this would help us to develop a sense of—and a “place” for—“timeless time” in our lives. Ideally, we’d silence the “inner bully” in our minds, tune out the voices of all of the people out there who think that professors in particular and teachers in general aren’t “really” doing anything anymore these days (the “must be nice to get the entire summer off” contingent of the population), and realize that any given writing and research task, in order to be well done, will probably take at least twice as long as you had hoped.

Sounds easy, I know. But reading that sentence actually makes me wince. (“TWICE as long?! I don’t want it to be like that! Because my summers are more relaxed than what most people experience, working 9 to 5 year round, so I really don’t have any excuses, and hey, didn’t I just take a full hour off to watch an X-Files rerun? Well, the research certainly isn’t going to get done if I keep doing that, now is it?!”) 

The point, I think, is not the question of time or productivity—it’s a question of attitude. To do your best work, you have to be your best self, and you simply can’t do that if you’re constantly setting the bar too high and then failing (or crawling to bed bruised and defeated because the bar actually fell from that great height when you accidentally knocked into the goal post… and both the bar and the goal post subsequently hit you on the head). As Berg and Seeber note, “If we think of time only in terms of things accomplished (“done and done” as the newly popular saying goes), we will never have enough of it” (55).

Yes, even those of us with the (alleged) “summers off.”

Because, as Berg and Seeber point out, “Slowing down is about asserting the importance of contemplation, connectedness, fruition, and complexity” (57). The first idea is not always the best idea, and it takes time to work towards what might eventually be the best—the most connected, fruitful, and complex—iteration of an idea.

The Slow Professor reminded me of what I’ve always known, more or less, but chosen not to highlight about the nature of my own work: that “periods of rest” also have “meaning” because “research does not run like a mechanism; there are rhythms, which include pauses and periods that may seem unproductive” (57). The British novelist Virginia Woolf used to remind herself of this in her journal: when she got on her own case for not writing enough, she would recollect that the creative and intellectual life requires periods of “wool gathering.”

As Berg and Seeber point out, we need to learn to wait (64), to openly acknowledge “the list of detours, delays, and abandoned projects” that we typically hide from view (65), to recognize that “More is not necessarily better”—although paradoxically, sometimes, it actually is (66)—and to give ourselves the time we need to read, think, and reflect (the essence of “research”) so that we can “follow our heart” (i.e., pursue projects “driven by genuine curiosity about a problem even if that is not a ‘hot’ topic at the moment” [68]).

As the fable of the Tortoise and the Hare teaches us, “Slow and steady wins the race.” But more importantly, we need to realize that it isn’t always—because it simply can’t and shouldn’t be—a race.

Monday, February 13, 2017

Who's In Charge Here?

In the wake of the social and political turmoil that has marked the past several weeks, I spent the weekend reading Albert Bandura’s Moral Disengagement: How People Do Harm and Live with Themselves (2016). As his title suggests, Bandura—an Emeritus Professor of Social Science in Psychology at Stanford and the founder of what has come to be known as “social cognitive theory”—is interested in the question of “moral agency”: when, how, and why human beings behave in ways that align with their stated moral values.

In particular, Bandura is interested in identifying those points at which human beings set aside their moral values and behave inhumanely towards others. In these instances, people rationalize and justify doing harm to others, Bandura argues, not because they are immoral, but because they are “morally disengaged” from the situation and/or the people affected by their behavior.

Bandura’s concept of moral agency thus dispenses with the idea that some people are just innately “bad” or “evil,” while others are inherently “good” or “virtuous.” Instead, Bandura argues that “Moral agency is … exercised through the constraint of negative self-sanctions for conduct that violates one’s moral standards and the support of positive self-sanctions for conduct faithful to personal moral standards” (1)—in short,  that we beat ourselves up for doing things that are not in keeping with our moral values, but feel good about ourselves if we do things that align with our values.

Remember Jiminy Cricket’s advice, to “always let your conscience be your guide”? Well, Bandura argues that this kind of moral guidance has “dual aspects”—it can be “inhibitive” (“don’t do that—you know it’s wrong!”) or “proactive” (“I want to do this, because that’s the kind of person I am.”) 

The problem, according to Bandura, is that “Moral standards, whether characterized as conscience, moral prescripts, or principles, do not function as unceasing internal regulators of conduct” (2)—to put it more simply, our internal moral guidance system isn’t always automatically switched “on.” And that’s not just because Jiminy Cricket is taking a nap or because he stepped out to use the bathroom or go get a sandwich. He might be right where he always is, but he’s… staring at a speck on his sleeve, pretending he doesn’t see what you’re up to.

Even more troubling are the moments in life when you’re doing something quite immoral or “wrong” and Jiminy Cricket is not only not objecting to it, he’s actually cheering you on (“anything your heart desires will come to you—no request is too extreme—who said it’s embezzlement?—fate is kind!!”). These moments—when Jiminy Cricket goes rogue or offline, so to speak—are points of “moral disengagement.” As Bandura notes, “People often face pressures to engage in harmful activities that provide desired benefits but violate their moral standards. To engage in those activities and live with themselves, they have to strip morality from their actions or invest them with worthy purposes. Disengagement of moral self-sanctions enables people to compromise their moral standards and still retain their sense of moral integrity.” (2)

This is why, in many instances, people who have engaged in behavior that strikes the rest of us as blatantly immoral remain convinced that they are “good” people who didn’t really do anything “wrong,” per se. And why they will insist on trying to convince the rest of us that we have them all wrong, that they aren’t callous or cruel or deceitful or downright “evil.” As Bandura puts it, “people sanctify harmful means by investing them with worthy social and moral purposes” (2). Or, as the adage has it, “The road to hell is paved with good intentions.” In the moments when they’re behaving immorally, people are quite convinced that, because their intentions are good, their behavior isn’t all that bad.

This continued belief in one’s own moral stature and integrity stems from the fact that, in instances of “moral disengagement,” people do not alter or abandon their stated moral values. Instead, they “circumvent” their standards by opting to “strip morality from harmful behavior and their responsibility for it” (3). This kind of seductive moral strip-tease, if you will, can occur in various ways. On the behavioral level, people can believe that they’re ultimately preventing more suffering than they’re currently causing, and they can use language in ways that enable them to uphold this perception (through euphemisms that cast their behavior as “innocuous” or that work to “sanitize” it) (2). On the level of agency, people operate by “displacing responsibility to others and by dispersing it widely so that no one bears responsibility” (3)—personally, I think of this as the “everyone-does-it-and/or-anyway-s/he-started-it” system of moral reasoning.

Finally, on the level of what Bandura labels the “victim locus,” people can circumvent their own stated moral values by dehumanizing the people affected by their behavior—in particular, “by divesting them of human qualities or attributing animalistic qualities to them” (3). This is, on one level, the impetus behind the “blame the victim” mentality in general. It is a mindset that is even more toxic than “blame the victim,” however, because it insists that the victim isn’t “one of us,” so s/he isn’t fully human and thus not capable of feeling the way we might feel if these things happened to us.

And if you’re about to deploy the “Well-but-there’s-nothing-anyone-can-do-about-that-because-people-are-naturally-aggressive-and-this-mentality-has-been-around-since-the-dawn-of-time” argument, I’ve got news for you. Research has shown that “stimulating the same brain site activated aggression or submissiveness, depending on the animal’s level of social power” (18, emphasis added).

So don’t blame it all on the human brain. Research shows that, if we want to regulate aggression by biochemical or neurological means, we still need to examine how it is shaped and how it operates “in the context of power relations and other social determinants of aggression” (18).

Perhaps more importantly, Bandura insists that we must regard morality as a more complex operation than we typically do. As he points out, “Higher levels of moral ‘maturity’ do not necessarily foretell stronger commitment to humane conduct. This is because justifications, whatever their level, can be applied in the service of harmful activities as well as benevolent ones” (25). As we all know, context is important: soldiers in battle override their personal moral standard not to kill because they justify what would otherwise be considered a harmful activity as a potentially “benevolent” one (by adopting the moral reasoning for a “just cause” or a “just war,” for example). As a result, we do not morally evaluate all acts of killing in the same way. But at the same time, when a drug dealer refers to his “soldiers” and we characterize their acts of murder as components of an ongoing “war,” for example, we are using language to sanitize—if not legitimate—behavior that does not align with our stated moral values. (The drug dealer and his flunkies aren’t actually “soldiers” at all and their “war” is not only unjustified, but morally unsanctioned and ultimately illegal.)

Which brings me to the point that Bandura makes that I find most compelling, as an English professor and general wordsmith. According to Bandura, “individuals can exercise some measure of control over how situations influence them and how they shape their situations” (45)—in particular, Bandura argues, “People do not usually engage in harmful conduct until they have justified to themselves the morality of their actions” (49). In short, we can strive to cultivate a mindset that helps to prevent moral disengagement.

One of the most potent ways of doing this is by looking at how we speak to—and about—others that we are inclined to dislike, disagree with, or otherwise condemn. As Bandura points out, the increasing predominance of online communications has “ushered in a ubiquitous vehicle for disengaging moral self-sanctions from transgressive conduct” (68). A “highly decentralized system that defies regulation,” the internet is a place where, “Anybody can get into the act, and nobody is in charge” (68).

Except that, ultimately, we are in charge. If nothing else, when we go online, we can strive to ensure that our own moral self-sanctions continue to operate. If I would hesitate to look someone in the eye and call him/her, for example, a “friggin’ orange Cheeto who spews nothing but lies and hate” or a “pathetic little snowflake who needs to grow up and stop expecting mommy to change their diaper,” I can refrain from typing it on Twitter or Facebook or in the comments section on someone’s webpage.

Because, let’s be honest, most of us will self-censor (to an extent) when face-to-face with someone we strongly dislike or vehemently disagree with, because we know that such behavior is potentially rude and unkind and may make us look bad in the eyes of others. But, as Bandura acknowledges, “Cloaked in the anonymity afforded by the Internet, people say cruel things that they would never dare say publicly” (39). The problem, however, is that over time, if you say enough cruel things anonymously or repeatedly justify your own contempt for others who disagree with you (because even if you don’t say it out loud or type it online, odds are, you’re saying it to yourself), your moral self-sanctions begin to weaken with respect to the individuals or groups that you’re targeting.

Keep it up, and you’ll start speaking—and behaving—in ways that you would never condone if you were morally “engaged” with the humanity of those around you, because you’ll cease to see the objects of your contempt as “human” in the same way that you yourself are. You’ll psychologically justify a lack of compassion, the practice of overt aggression, and a mindset that sees your victims as always “guilty,” never potentially “innocent.” And often without even realizing that this is what is happening to you.

This is not to say that we should all just accept or normalize words and behavior that we fundamentally cannot agree with or condone.

And this is certainly not about silencing or circumscribing the voices of others. It’s not about “going along to get along”—a phrasing that itself seeks to “sanitize” unacceptable or potentially immoral behavior.

What I’m suggesting is that empirical psychological research has shown that it is all too easy to get used to stereotyping and dehumanizing people if you never actually have to see the people you’re “talking” to or see the effect of your words. And if the algorithms that structure your social media feeds lead you to believe that “everyone” does it and that “most people” agree with you, you’ll be less likely to perceive your words and actions as unkind or immoral or “wrong.” As Bandura points out, the “unknowns” that go hand-in-hand with online communications “make it easy to persuade oneself that one’s conduct caused little or no harm and that one is merely exercising the constitutionally protected right of free speech” (69).

So what can the morally engaged do, in times of collective crisis, when everyone has gotten into the act but nobody seems to be in charge? 

Well, for starters, we can remember that, although behavioral psychology often focuses on what used to be labelled “man’s cruelty to man,” there is “equally striking evidence that most people refuse to behave cruelly toward humanized others under strong authoritarian commands, and when they have to inflict pain directly rather than remotely” (91).

The key phrase in that statement, of course, is “humanized others.” Not “Cheetos” or “snowflakes” or “deplorables” or “libtards” or “Trumpsters.”

When typing hostile comments or aggressively “debating” on social media, maybe we should imagine ourselves, not saying these things publicly to someone—since moral disengagement can quickly enable us to tell ourselves that the objects of our scorn “started it” and/or “deserve it” and because we’ve all been raised on the very (VERY) flawed maxim that “sticks and stones may break our bones but names will never hurt us”—but instead imagine ourselves walking up to a total stranger on the street and repeatedly punching or kicking him/her because we think we “should” because something about them has told us that we “know” what they’re all about.

And, when or if that person begins to scream or kick back, we can imagine ourselves simply laughing gleefully as we continue to punch and kick them, while others crowd around (and maybe join in too). When we’re done, we can just imagine ourselves walking away to go get a coffee, comfortable in the idea that there’s been “no harm done.” And the next day, we’ll do it again—and maybe even look for the person we did it to the day before.

That’s the scenario of moral disengagement.

By contrast, Bandura notes, “The affirmation of common humanity can bring out the best in others” (91). This doesn’t mean we insist that we’re “all the same,” somehow. It simply means that we realize that we’re human beings who often disagree, that we can control our own levels of moral engagement (or disengagement), and that social forces and the language we use to talk to and about others have not only a measurable, but also a profound effect on our own levels of aggression and moral engagement.

If, instead of labeling someone, we take a moment to sympathetically think about what the world might look like from that person’s perspective—even if it is a viewpoint we cannot possibly agree with, support, or condone—we strengthen and affirm our own moral functioning. 

Empathy is not a sign of stupidity, weakness, indecisiveness, or capitulation to injustice. It’s an exercise in moral reasoning that strengthens our own moral fiber and the integrity of our moral musculature.

And, believe it or not—and despite what may seem to be overwhelming evidence to the contrary—we really do change the world a little whenever we practice empathy.

Because, as Bandura points out, “vicarious influence”—that is, simply watching someone else behave kindly and morally—helps us to “acquire lasting attitudes, emotional reactions, and behavior proclivities toward persons, places, or things that have been associated with the emotional experiences of others” (93). In these moments, we “make what happens to others predictive of what might happen to oneself” (93).

“I’m not going to say that to this person because that could be me.” “I’m not going to say that because I’ve been on the receiving end of comments like that and I know they stay with you and hurt.” “If someone called me a stupid snowflake or a paunchy Cheeto or deplorable, well, gosh, I’d be kind of upset about that.” “She can’t help who her father is.” “I’m just not going to shop there anymore.”

There are plenty of ways to make our voices heard—and our strength and integrity and moral convictions felt—without resorting to the dark side of anger and the language of dehumanization and cruelty.

If no one is in charge, then we each need to be in charge—of ourselves.

If we want to be morally engaged agents of change, I think we need to resist the constant temptation to succumb to the social pressures and the language of moral disengagement.

And yes, we’ll all slip up from time to time. We’re only human.

And that is precisely the point: we’re all—and only—human.

Friday, January 20, 2017

Gearing Up

I've just finished a load of laundry and, like many of my fellow Americans, I'm waiting for Armageddon, so I figured it was high time I caught up on my blog.

I'm just joking (sort of). After all, where would we be if we all lost our sense of humor? 

It's been a busy and productive two weeks. I did major revisions on an article on Shalamov, and now I'm in the process of doing major revisions on an article on Zola, and when I'm done with all that, I'll need to finish up an article on John Hersey.

I'm not sure I'm going to be able to get all of this done by Sunday night, needless to say, since I'm also going to be doing things like getting a haircut, packing, and traveling. Oh, and I'll be going to a little rally tomorrow as well, because I figure if I can do a little something that'll piss off Kellyanne Conway, I'll definitely be taking the time to do it.

Like many, I've felt discouraged by the direction we've headed in--and when I say that, I mean not simply the political direction, which in many ways looks unlikely to align with some of my own ideals in the coming years, but the broader direction of ... mockery and incivility and, well, downright mean-spirited rudeness.

I don't think that's a sign of "thought" or "progress" or "greatness," regardless of people's specific, individual social, political, economic, or religious beliefs. So I hope that we can stop some of that. Actually, I hope that we can stop ALL of that, but I'll settle for "some."

But I'm not optimistic, because we've elected a person to lead us who, in my opinion, thinks being rude makes him look "cool" somehow. And that it's sorta funny, you know, being a little mean and a little crude and a little obnoxious.

I disagree.

So as I gear up for the upcoming semester and move into the full swing of 2017, I decided that this is what I'd like to focus on for myself: to make a commitment to deliberately and insistently opt for kindness whenever and however possible.

Don't get me wrong, I reserve the right to indulge in a bit of snark here and there--I don't think I could survive without it, quite frankly.

Because I'm not a snowflake: I'm not delicate and I don't melt.

And I'm glad that every single class that I'm teaching focuses on literature of other nations and/or literature by populations that have experienced prejudice or oppression in the US. It's important that we hear those voices, and realize that the "greatness" of the past always came with a price for someone.

We don't want to lose what we've learned from the lessons of the past. We just don't.

And I'm glad that I teach skills like critical thinking and that I encourage people to expand their vocabulary and their viewpoint and their perspectives on the world. And that I get to do all of that while also enjoying myself--that the work I do nourishes my own mind and heart and spirit, while it (hopefully) does the same for others.

So here's to the future. We got this.

Thursday, January 5, 2017

Back Around

The holidays have come and gone, and here we are again, starting a new year.

There were some high points during my time away, and some lows. Like many, I was a bit depressed by the prospect of 2017, for all of the (to my mind very obvious) reasons that have made it a depressing one for many.

But now that I'm back home, and the holidays are over, and I'm in it ("it" being 2017), I'm determined to make the best of things and not waste a lot of time wishing (or worrying--unless and until it becomes absolutely necessary to do so, that is).

So I've been spending the past couple of days getting back into the swing of writing, and I'm pleased to say that it's going well. I've mapped out what I need to get done over the next few weeks, before classes start, and so far, I'm staying on-task (and thus, on schedule).

I don't always do New Year's resolutions, unless I feel that the preceding year has been marked by a glaringly bad behavior or mindset in need of correction, and this year, I didn't really feel compelled to commit to any sweeping changes.

But I did decide that I want to manage my time on social media and email a bit better and, along with that, to carve out more time to reflect on and, when necessary, adjust the pace I set for myself.

So, for example, instead of beating myself up for not getting in shape right away! or not finishing that article right away! I'd like to commit to taking the time to think about why I might not be motivated to work out when I usually enjoy swimming or biking (is my body trying to tell me something right now, and if so, isn't that okay, really?) and why I might not be able to get past the writing "snarl" that has me entangled.

To help me achieve that goal, I've adopted a policy about when I will (and will not) be available via email and how much time I'll spend on social media (and what I'll do when I'm on there). My New Year's resolution (such as it is) is to stick to those choices, so that I can have my evenings and my weekends to myself, and feel better about the things that I accomplish--not harried or "behind" or stressed out.

Because the thing is, if a professor doesn't set boundaries, s/he can end up working constantly--there's always someone who needs something and always something that needs doing. Something to read, something to write, someone to respond to.

And that's not a bad thing, except that it makes it much harder to conceive of a professor's job as a job with clear boundaries that determine when s/he is "on the job" or "off the clock."

I started thinking about this after I began reading The Slow Professor: Challenging the Culture of Speed in the Academy (2016) by Maggie Berg and Barbara K. Seeber. Berg and Seeber examine the ways in which academia now compels those within it to move quickly and efficiently--a "corporate" mindset that is not necessarily conducive to deep thought or probing insights.

In fact, it's a mindset that many successful corporations are now moving away from, precisely because it is not conducive to thought or reflection, skills that are as necessary in business as they are in the pursuit of book-learning.

As Berg and Seeber--and many, if not most intellectuals employed within academia--acknowledge, jobs in higher education are positions of privilege. We have the option of not only thinking, but actually saying, "Hey, wait a minnit" and arguing for a different pace, something that many people in many jobs do not have.

But perhaps they should have that option. Because really, how rich do we all want to be? How much money do we want to be clutching when the Grim Reaper comes knocking?

Fast food isn't better for us--quite the contrary. Fast cars are fine, but you can only go so fast (outside of the Indy 500) before you're going to end up facing a problem. And if you quickly accomplish 8 million tiny tasks, someone out there is (quickly) going to find another 500 or so that you can do as well, since you "have the time."

At some point, I think we can easily lose sight of what we're rushing around for and why. If we want to spend time with loved ones, we need to slow down and pare back--to live and think more deliberately about what we want to do and when.

And if doing so leads to being branded a loser or a slacker, we sort of need to be "okay" with that.

Because success in our careers is a good thing, but only if we actually have the time to stop and smell the roses that we've so assiduously planted along the way.

And time spent with loved ones doesn't always have to be about getting things done or "catching up" or "keeping up." It could just be about... being.

For my part, I've decided to commit to looking and thinking more deliberately this year--to taking the time and effort to move more slowly, when I feel that I'm getting caught up in the rat-race that is our world.

So with any luck, it will be a year of pondering and savoring the food for thought that 2017 offers.

Wednesday, December 21, 2016

What Goes Around...

Yesterday was marked by a surprising little blast from the past.

Late in the afternoon, I was in my kitchen, singing along with the iPod and making homemade condensed cream of chicken soup. (It's quite delightful.)

So first off, here's the recipe for the soup, and here's the song. (Let it never be said I don't have my priorities straight when it comes to providing relevant details about my day.)

I turned around as I was singing for my supper, and behold, what did I see? The woman from about five years ago--the one who hassled me on my blog for... a year, I think it was?... because she thought I was trying to steal her boyfriend. 

She was parked in her car by my mailbox. Lights on, motor running, just sitting there.

Hunh.   (That's exactly what I said at that precise moment: "Hunh.")

I waited to see what she'd do. Eventually, she drove to the end of my street, paused for a minute or two, then turned the car around and quickly left. (I live on a dead-end street. She drives a big white Jeep. Navigating the turn-around takes a minute.)

As I was busily thinking "hunh," I decided to ask my neighbor if maybe a friend of her daughter's had been visiting or something, even though I had a pretty strong suspicion that I knew exactly what I had just seen, although I really didn't know why I would have seen it.

So I sent my neighbor a message asking her precisely that, and mentioned the big white Jeep. And then I opened my Facebook feed and saw that as I'd been writing my message, my neighbor had posted a picture of the note she'd just received in her mailbox.

From that woman.

See, the thing is, nearly two months ago now, the woman was campaigning for public office--again--and she stopped by my neighbors' house, looking for their vote.

Apparently, that didn't go well. The woman tried to hide the fact that she was not only a Trump supporter, but a delegate for Trump at the RNC last summer. She did this, I can only suppose, because she wanted to get a foot in the door when it came to getting my neighbors' votes.

My neighbors do NOT like Trump. (Nor do I, actually.) (Obviously.)

So yeah, the integrity thing with this woman? Not so much.

Long story short, when this incident occurred a couple of months ago now, my neighbors told her to leave, she argued with them, they told her to LEAVE, she argued with them. So they forcefully told her to get off their property, and then she (finally) left.

The point of her anonymous note yesterday? Well, it seems she had dropped by to gloat over the fact that Trump had won.

So yeah, the maturity thing with this woman? Not so much.

Thanks to the sheer coincidence of the whole thing, I knew she was the one who'd left the note: I'd seen her. And I'd noticed her, because of all of those delightful, pithy, pissy comments and random insights about the nature of love, life, and human relationships that she'd offered up on my blog back in the day.

So I told my neighbors what I knew and what I'd seen.

And they contacted the police.

Which means that, at this point, if she shows up at my neighbors' again, she can be arrested. The police advised me, via my neighbors, to please call them if I see her sitting in her car outside my house again, so we can all arrange it so that she can't hang out on our street anymore.

The moral of the story, boys and girls? Well, honestly, I think there are two morals here.

First, "once an asshole, always an asshole."

I mean, seriously. Do you really have no life and no hobbies and nothing better to do than sit in a parked car at sundown the week before Christmas, penning anonymous notes to people who are more or less total strangers to you, just to gloat about a Trump victory?

And using an exclamation-point smiley-face in said note, no less. (I didn't know anyone over the age of 13 still used those after 1985. Color me duly informed.)

The other moral of the story? "You reap what you sow." Big league (and/or bigly, depending on your hearing).

Because back in the day, this woman put her hot little fingers to the computer keyboard every chance she could, deliberately trying to sow a whole lot of anger and animosity and chaos in other people's lives.

She went out of her way to try to destroy a friendship of mine. And then she went out of her way to try to make me feel even more unhappy than I already was at the time, because of my godson's death.

But here we all are, five years later. My friend is my friend again, and has been for a couple of years now. He was actually over for dinner the other night, and it's probably a good thing she didn't happen to bump into him while he was here, because I think he may have had a few choice words for her at this point.

But as I told him last night, it's just not worth the effort. 

She's simply reaping what she tried to sow in other people's lives, in her own life. Because what goes around, comes around.

And these days, a whole lot of anger and animosity and chaos seem to be coming her way on a regular basis. To such an extent that I actually find myself feeling sorta sorry for her (most of the time) now. And I've even begun laughing about her antics a lot more--and a lot more heartily and happily--than I did five years ago.

And that's a good feeling: to be able to look back at a shit-storm someone tried to spin your way, and just shake your head and... laugh.

I suspect she'd insist that we're all delusional and she's better and smarter and happier than the average bear, but "the lady doth protest too much, methinks."

Because the fact of the matter is, if you feel the need to drive to someone's house, sit in your car, and take the time to write a gloat-note, it means you're really terribly insecure.

And that you never actually feel like a winner, even when it looks like maybe you've won, somehow.

And really, that's just sad.

Tuesday, December 20, 2016

Snarl

Every now and then, I like to check out books about writing.

Whether it's a book about the writing process or about how to be a more productive writer or about how a well-known writer thinks about writing (to wit, my October blog post about Stephen King's On Writing: A Memoir of the Craft), I like to have the chance to think about this activity that shapes such a big part of my life.

This weekend, I stumbled upon Hillary Rettig's The Seven Secrets of the Prolific: The Definitive Guide to Overcoming Procrastination, Perfectionism, and Writer's Block (2011). While I'm not sure I'd whole-heartedly recommend it to seasoned writers--there was a lot in it that wasn't really applicable to me--I did like the way that she approached one of the most fundamental impediments to writing:

Procrastination.

Rettig argues that procrastination is a state of "disempowerment" that stems, not from any "intrinsic deficiency or deficit on your part" (1), but from outside forces that operate as "obstacles" ("an activity or circumstance that competes with your writing for time and other resources") or "triggers" ("feelings that interfere with your ability to write") (2).

If we're under-productive writers, Rettig argues, it's not because we're lazy or lack willpower. These are merely "symptoms" of a state of under-productivity, but the judgmental and moralistic labels we assign to these symptoms are crippling.

For Rettig, this is what separates the prolific from the under-productive. Prolific writers are kinder to themselves. They attribute a lack of productivity to the obstacles or triggers that disempower them and set about finding ways of solving these problems.

More importantly, prolific writers (according to Rettig) do their best to prevent the kinds of obstacles or triggers that inhibit their productivity. In particular, they don't succumb to perfectionism because "it's mainly perfectionism-fueled fear (or terror, really) that fuels procrastination" (6).

I think Rettig has an excellent point. In my own experience, writers who want to write, but can't, are often unable to move past a few basic roadblocks. In many cases, they think they have a project, but they're not sure if it's "good enough," so they wait for conclusive proof that it is (or will be) before they start writing.

The problem is, there's no such proof available. At least, not until you actually begin writing and the project begins to take shape. But even then.

Because ye gods, early drafts can be terrible things. Just appalling. I mean... you don't want to see what kinds of things can end up on a first draft.

And that can be extremely discouraging. I know this because I too have produced what Anne Lamott refers to as "shitty first drafts." Oh, so many, and oh, so much... shit.

Because that's the way of it. Every now and then, you'll have what seems like an epiphany and a really great sentence or paragraph will descend from your brain to the page via your fingertips, but even then, you may eventually have to face the fact that, although it has its own measure of greatness, it actually doesn't really belong in the thing that you're writing at the time.

Sigh.

When you can't write--or can't get started writing--you typically think of this as a "block" (i.e., the famous "writer's block" that every writer has felt and feared).

Rettig offers a useful way of rethinking this obstacle and the feelings that accompany it: it isn't really a "block," it's a "snarl"--"it's a giant spaghetti snarl with at least a dozen (or, more likely, two or three dozen) 'strands,' each representing a particular obstacle or trigger" (8).

Needless to say, as a knitter, I liked the idea of the "snarl" as opposed to the "block." Because snarls can be exasperating and look like the end of the world in the world of wool and other fibers, but snarls can, in fact, be undone. As Rettig points out, "[t]he fact that your block is really a snarl is great news because a snarl can be untangled far more easily than a monolith scaled or chiseled" (8).

And this rethinking and reimagining of what it is that's impeding your progress is a great way of managing--if not overcoming--it. As Rettig argues, "the shortest route ... to maximum productivity is to work patiently within your human limitations so that you have a chance to regain your confidence and focus" (50).

So, instead of beating yourself up for all of the writing you haven't done, or lamenting the fact that you awoke with a scorcher of a headache and therefore didn't get any writing done yet again, the idea is to acknowledge the limits and "work patiently" within them.

This was a wonderful reminder for me, because as I've acknowledged repeatedly this year, I've not been particularly happy with my own level of writing productivity, particularly here on my blog. I've gotten other things written, but they've gone much, much more slowly than I would like.

And I've kicked myself for that, both publicly and privately. So Rettig's book was a reminder that I need to stop doing that--as she points out, "[f]ormulations such as 'The project was a total disaster,' 'I'm a total loser,' and 'It's going to take a million hours to edit this thing' are not helpful" (29).

I've said many of those things over the past year, many times. But since reading Rettig's book, I've been trying to catch myself when I do this and remind myself to "problem-solve" instead: what's the problem I'm facing and what can I do to unravel the snarl a little?

And I try to take a moment to remember the things I have accomplished. Because this too is a dilemma: even when you're productive, if you're a perfectionist, the things you accomplish don't seem like enough at the moment when they're achieved.

You've been trained not to rest on your laurels, so you don't. Which isn't "bad," per se--one could argue that it's a way to foster humility. But it also isn't "good," really, because you spend so much time lamenting the things you haven't done and so little time acknowledging the things that you have completed, that you end up misperceiving your own efforts and achievements.

You snarl at yourself, because all you remember are the snarls.

But there's more to the fabric of writing, and if you can--as Rettig argues--"lose [yourself] nonjudgmentally" in your work, you will join the ranks of the prolific and the productive writers of the world.

For my part, that is precisely where I hope to be next year at this time.

Monday, December 12, 2016

The Season

Well, the end of the semester is upon us, which means that I'll be doing a whole lot of writing and a fair amount of grading in the next few weeks, now that classes are essentially done.

But this time of year also means that I can get my craft thing on like there's no tomorrow. So that's what I've been doing.

A LOT of cookies have been made. I'm not sure who's going to eat all of them, because my cats don't seem all that interested in them. So I guess it's up to me...

I've also been knitting and knitting and knitting. Because if it's going to be cold and dreary and a little snowy, and then get even colder and drearier (and perhaps snowier?), then that's just what has to happen.

I finished one little gift, and since I'm quite certain the person who will be receiving it doesn't read my blog, I'll post a couple of pictures.

It's a scarf: an easy knit that opens out into a lacework pattern when you wet-block it, like so:


I confess, it looks kinda cool on the blocking squares, doesn't it? But no, it can't simply remain there. It must go out into the world, sorta like so:

I know the picture leaves a lot to be desired, but trust me, single-handedly figuring out how to photograph a 5-foot long scarf on a cloudy winter's morning (read: no natural light anywhere, really) is just not part of my skill set.

And I didn't feel like trying, let's be honest. I made the scarf; that counts for a lot.

Because while working on the scarf, I also decided to experiment and try making a few Christmas ornaments for my tree. Specifically, I decided I wanted to try to knit a couple of snowflake patterns, and see how they turn out.

Well, they turned out just fine (look to your right), but knitting them takes a while and involves a lot of stitch markers and near-insanity, so that's why, of the two pictured, only one is actually knitted (the one on the left).

The other one is crocheted. That went about a bazillion times faster, so that's what I opted to do when the madness came upon me and I decided to make a few more.  Like so:

In the picture on the left, they're drying after being soaked in cornstarch.

It's a way to stiffen them, so that they hang like ornaments, rather than folding or flopping like doilies.

There are several ways to stiffen a snowflake (okay, that sounds odd, but bear with me): you can use glue... yes, I was a bit skeptical about that as well, but yes, you can.

Or, you can use cornstarch, which is what I did. The advantage to glue is, it's permanent. The disadvantage to glue is, it's permanent.

With cornstarch, there's a remedy if something happens to the ornament--and I'm envisioning something analogous to what happens when you neatly pack away Christmas lights and the cords spend the summer having group sex (or something), so that they're hopelessly entangled when the Yuletide season rolls around the following year.

With corn-starched ornaments, if there's a problem--if they get bent while being packed away, or a little grubby or whatever--you just wash 'em out and re-starch 'em.

The starching is a sticky mess: I just went old school and added a tablespoon of it to a half-cup of water, boiled and stirred, let it cool a bit and then plunked those bad-boys in. You have to be careful, because you don't want to mess with boiling hot cornstarch: it will both stick to your skin and burn you.

In short, if you're not careful, it will burn you and then keep on burning you. Just like some people... but I digress.

To keep my strength up while being crafty, I came upon a wonderful recipe for garlic rosemary chicken with cranberries--because I need to supplement the cookies with a bit more nutrition, I think. So this was that, going into the oven:

Again, my apologies to the vegetarians and vegans out there, but I do have to say, this recipe was quite good.

And I hate to say it, but I'm looking for a reason to make it again soon, and I haven't even finished all the leftovers from the first time around.

But before I do that, I need to actually do a bit of work.

Because although the semester is rapidly winding to a close and classes are basically over, it ain't over until it's over. And it ain't over just yet.