Tuesday, June 28, 2011

Social Intelligence

In a January 2011 article entitled "Facebook posts mined for court case evidence," Brian Grow notes a growing trend in U.S. courts: using Facebook postings as evidence in personal injury litigation.

Two weeks ago, the Federal Trade Commission determined that "Social Intelligence," a company that does background checks of social media for corporations screening job applicants, is in compliance with the Fair Credit Reporting Act.

Social Intelligence scours the web for the "online presence" of job applicants and stores the information found on Facebook, Twitter, and elsewhere--for seven years.

Under the terms of the Fair Credit Reporting Act, a job applicant must have agreed to allow the potential employer to perform a background check and must be notified if s/he is denied a job on the basis of any of the material found online or on social media.  (See Leslie Horn's article, "FTC-Approved Company Will Save Dirt from Your Facebook Profile for 7 Years," PCMag, June 20, 2011).

I have mixed feelings about this, of course.  I like free speech, and I think people should be able to express themselves whenever they want, however they want.

They may say things I don't like and don't agree with and wish they wouldn't say, but as long as their speech doesn't fall within the recognized exceptions to our First Amendment freedoms, then I say, live and let live.

That said, I think technology and electronic media have complicated people's perceptions of the line between public and private.

I am the first to admit that, if you're a friend of mine and you seriously piss me off, you may get some pretty darn salty language directed your way.

In such cases, I don't mince words, even over email.

That said, it's pretty hard to seriously piss me off, and on the rare occasions when I have fired off in writing at someone I was involved in a personal relationship with, I have always done so with a full awareness of the potential consequences.

I realize they may share my emails with anyone and everyone and that, taken out of context, my words won't make me look good.

If I trust someone with an expression of my emotions (both the good and the bad), I can only hope that, in the end, my trust in their character and their integrity isn't misplaced.

Sometimes, though, it is.  This is a painful realization for anyone to come to.

At such moments, I also realize that the follow-up email in which I sincerely apologized for my angry words won't even be mentioned, and it definitely won't be eagerly and voyeuristically circulated in the way that the original email is sure to be.

In short, I have to assume that I will wake up one morning and this will be plastered all over the front page of the newspaper, alongside my picture, and I'll have to deal with the fallout.

And then I make my decision about whether or not to send that person my thoughts and give them the gift of my trust.

If I send it, it's because I believe that we all have the right to express our feelings, our opinions, and our personalities, in our own ways and in our own words.  And we shouldn't always have to worry about whether or not other people are going to "like" it.   

I do it because I believe that anyone I have openly cared about who ultimately chooses to use my words against me is telling me a lot about 1) how much they value personal relationships in general, 2) how much they value me in particular, and 3) how much they truly value freedom of speech and expression.

I refuse to live in fear.  

Dictatorships don't emerge out of the actions of a powerful leader or a failing government.  They emerge as a result of people ratting out former friends or digging up dirt on current neighbors or making personal quarrels the substance of public actions and discussions.

In the 1969 French documentary Le chagrin et la pitié (The Sorrow and the Pity), Marcel Ophüls retrospectively examines the social and political mechanisms at work during the Nazi occupation of France.

One of the most interesting parts of the film occurs when he confronts individuals with evidence that they publicly denounced other people to the authorities simply because they had a personal ax to grind. 

So on this level and for these reasons, I object to using online materials in job application screenings.  Although I understand from a pragmatic perspective why businesses might find resources like Social Intelligence appealing, on a very basic level, I don't like it.

At the same time, I think people need to realize that frequent use of social media fosters an illusion of privacy that never actually exists.

The "Privacy" settings on Facebook are meaningless.  I used to block people, until I realized that there isn't really any point.  "Blocking" only works when someone is actually in Facebook; materials are still available via Google and other search engines and webpages and social media postings are regularly cached. 

So even if you take it down, it's still out there somewhere and someone can always find it if they know where and how to look. 
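
If you want to see just how persistent a "deleted" page can be, here is a quick, minimal sketch (purely my own illustration; the URL being checked is made up) that asks the Internet Archive's Wayback Machine whether an archived copy of a page still exists:

```python
# Minimal sketch: check whether a public archive still holds a copy of a page.
# The URL below is hypothetical; swap in any address you like.
import requests

def find_archived_copy(url):
    """Return the closest archived snapshot of a URL, if one exists."""
    resp = requests.get(
        "https://archive.org/wayback/available",
        params={"url": url},
        timeout=10,
    )
    resp.raise_for_status()
    closest = resp.json().get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        # The archived copy lives on, even if the original page is gone.
        return closest["url"]
    return None

if __name__ == "__main__":
    copy = find_archived_copy("http://example.com/some-deleted-post")
    print(copy or "No archived copy found (this time).")
```

Even when this particular check comes up empty, search engine caches and screenshots can preserve the same material elsewhere.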

I think of it this way:  if you run into the street with your pants down, screaming at various passersby, your neighbors are going to see it, and you know that.

(At least, if you're reasonably sane, you do.)

You also know that your neighbors may film it.  Or they may call a friend and say, "You gotta see this...".  Or they may tell your pastor next Sunday at church.  Or they may email a long-lost friend, tweet a buddy and compose a status update about your actions.

The Internet is the electronic street.  Facebook is the street.  Twitter is the street.  You're sitting in your own room, feeling safe and secure and assuming you're communicating "privately," but you might as well be standing on the street, screaming.

There is no reasonable expectation of privacy online or in a public forum such as a social media site.

Electronic communications are always public.  So if you choose to go public with words of anger or outrage, you need to realize that those communications are out there and won't go away.

In a way, that's a good thing.  People should be able to be themselves and speak their minds and words should have a certain resonance and staying-power.

So I guess what I'm saying is, if Anthony Weiner had tweeted me, I would have said, "Oh, holy cow!", deleted the tweet and simply moved on with my life.

Yes, he's a public figure.  Yes, his behavior is inappropriate.  Yes, he absolutely needed to resign.  And no, he shouldn't tweet such things to other people.

But at the end of the day, I think about his wife.  She didn't do anything wrong.  And the public glee surrounding her husband's downfall has to hurt her.

A lot. 

I know I've posted this statement before, but it's been on my mind a lot for the past several days.  In her letter to the House Un-American Activities Committee in 1952, writer Lillian Hellman refused to be intimidated by a Congressional subpoena.

Instead, she wrote, "to hurt innocent people... is, to me, inhuman and indecent and dishonorable."

I think we all need to think more and more carefully about what constitutes humanity and decency and honor, both on the web and off.

I know that from now on, I plan to.

Friday, June 17, 2011

Catching Up

It's been a busy week-- and yes, I still need to post sometime about the Progressive movement.  I haven't forgotten, I've been reading and thinking...

But next week is looking even busier, so I may have to make it a post for the future.

I found an interesting and useful resource this week called "OpenCongress."  If you go to the site, you can check out bills that are pending: you can read the actual bill, not just the media hype surrounding it, since that sometimes (okay, often) grossly misrepresents the actual legislation under consideration.

For example, I read one account that said that under the proposed S.978, web-streaming that violates copyright would be punishable by "10 years in prison," a punishment similar to the sentences doled out to "child molesters." 

But if you look at the actual bill, it states that it could carry "a possible penalty of up to 5 years in prison."  And there are a lot of other elements that get left out of the media coverage.  (Not surprisingly.)

What's also interesting is the tab labeled "Money Trail."  You can follow the funds, see who proposes the bill, what special interests support it, etc. etc.

It's sponsored by the Sunlight Foundation, which is also dedicated to government transparency and accountability and offers a wide range of links to online resources about local, state, and federal government and its policies (or lack thereof).

My only other excitement this week seems to have been installing vertical blinds, and that wasn't very exciting, I must say.  I'm in one of those phases Virginia Woolf described as "wool gathering"--not writing a lot, but reading and thinking and letting the mental landscape lie fallow for a bit.

A new crop of ideas is in store, just planting the seeds for now.  More later, of course...

Thursday, June 16, 2011

"Break Your Heart"

I was always a fan of Natalie Merchant's work.
Don't spread the discontent, don't spread the lies
Don't make the same mistakes with your own life
And don't disrespect yourself, don't lose your pride
And don't think everybody is gonna choose your side...
Mutual respect isn't supposed to be contingent upon total agreement.

Honesty isn't optional.

And people aren't here just for our own use and benefit.

So many people try to be so much better than that.

In the end, those people end up gravitating towards one another and their energy becomes a force all its own.  And the others are eventually just left behind, in the same place they've always been. 

My best friend found out yesterday that the parents and the coach of her other son's baseball team took up a collection so that she wouldn't have to pay his hotel fee to stay overnight if the team went to the playoffs.  They had also made arrangements so that he would be able to travel with the coach's family, if my friend was unable to take him.

"The everyday kindness of the back roads more than makes up for the acts of greed in the headlines."  ~Charles Kuralt

Sunday, June 12, 2011

Defining Spirituality

I've often had conversations with people about religion, faith and spirituality--a fact which may seem odd, coming from an atheist.

As I've tried to explain (sometimes very cryptically), I see religion, faith and spirituality as different things.

In my opinion, religion is an organized system of beliefs and rituals that offers those who believe a sense of existential coherence and the opportunity to participate in a community of shared values.

Faith, on the other hand, I see as a belief in a higher deity or original, originating and overarching purpose for human existence, however you choose to define that entity or purpose.

Many people of faith do not necessarily adhere to or participate in a particular religion, and many people who self-identify as "religious" are not necessarily what I would call people of "faith."

What many find odd is that, although an atheist, I would still describe myself as "spiritual." 

Interestingly, sociologist Elaine Howard Ecklund recently discovered that there are many natural and social scientists who also describe themselves as "spiritual" atheists.  

In her essay, "Finding the Sacred in the Secular," posted on Big Questions Online on May 25, 2011,  Heather Wax interviews Dr. Ecklund, who describes the mindset of scientists who do not believe in God but who simultaneously insist "that there is something out there that's larger than themselves that has a hold on them."

The spiritual atheists that Ecklund interviewed tended to separate themselves from the general public's perception of "spirituality as being synonymous with a belief in God."  Instead, they seek a spirituality that is consistent with science.

According to Ecklund, "These are people who really prize rationality. So they don’t want to do anything that seems irrational, but they can’t stop seeing in their own minds and in their experiences that there seems to be something out there beyond themselves."

In particular, Ecklund notes, "They didn’t want to be doing something that was inconsistent with their identity as a scientist. So they didn’t want to be a scientist in one part of their life and then have this other kind of loosey-goosey spirituality over in this other side of their life."

Ultimately, Ecklund found that while atheist scientists who do not identify themselves as "spiritual" would generally downplay the importance and relevance of questions about "the meaning of life," atheist scientists who identify themselves as spiritual
"would talk about how they found awe and beauty in nature, they found awe in the birth of their children, they found awe in the very work that they do as scientists. They just couldn’t see that as being explained only by science—there has to be something else out there beyond themselves. But then they did not see that as being God, or needing to name it as theism of any sort."
I often wonder about how the ways in which we identify ourselves in terms of religious belief, faith and/or spirituality affect our daily perceptions of reality.

As human beings, I think we have an inherent tendency to be incredibly self-absorbed on a pretty regular basis.  So it's no surprise that the Screaming Mimi I mentioned in a previous posting is still out there--with no sense whatsoever of the irony of her position as someone who staunchly advocates individual freedoms--rudely and repeatedly telling everyone to "stop praying."

To date, I have resisted the admittedly mischievous (but, in my opinion, very understandable) impulse to tell her that the Soviets would have loved both her attitude and her moxie.

I will continue to refrain, but God knows, it's hard.  (Sorry, I couldn't resist that one.)

Like everyone else's, my life, my voice mail, and my inbox are regularly littered with people obsessing and depressing, with litanies of complaints about things like weight gain, grades, courses, kids, co-workers, "injustice" (broadly defined but usually limited to a personal sense of having been individually wronged somehow), bad dates, male pattern baldness and menopause.

Some of those complaints even originate with me.

And then there is my best friend.

12 hours after returning home from an 8-hour stint at the oncology clinic (on her birthday, no less), where a routine check of her son's blood counts and a necessary platelet transfusion turned into an emergency CT scan and fears that he was bleeding into his brain, and 48 hours after she received news that his stage 3 brain cancer is spreading in spite of six grueling months of radiation and chemo treatments, she wrote in her online journal at CaringBridge:
"But we are not quitters and we just have to pull ourselves up and carry on.  Life is never certain.  Every day is a gift, is it not?  And its our job to live in the here and now & be the best we can be.  Thank you for listening, caring, and for reading to the end." 
She is not religious.  She is not a person of faith.  Like me, she identifies herself as an atheist.

But, I would argue, she is very spiritual.  And to me, her spirit is more admirable than many expressions of faith and religious dogma.

Her words also represent a kinder and more humane attitude than what is typically associated with the term "atheism."  It's definitely not the attitude of that self-important, loud-mouthed, Royal-Pain-In-The-Ass I described above, for example.

(Sorry, it just slipped out.)

My friend's children are Catholic, because her husband is.  So although she doesn't "believe" in the same way that others around her do, she is nevertheless comfortable letting others believe what they believe.  In spite of her own personal convictions, she is willing to acknowledge that, as William James said, "No one has insight into all the ideals."

In "Celebrate the Myriad Ways," Richard T. Hull describes the experience of his son's accidental death from a drug overdose and the response of friends, family and the community at large to that death.

If you scroll through the article and read the next-to-last paragraph, I think you'll be surprised--and in some cases, shocked--at the kinds of comments that Hull and his wife received.

Presumably well-intentioned people can nevertheless be sadly oblivious.

I think Hull's response, however, speaks to the heart of what distinguishes faith, religion and spirituality.  Although not a man of faith or religion, his words are, in my opinion, deeply spiritual.
"We could have taken offense at much of what went on at the service and much of what was said to us before or afterwards. But it occurred to us that uncritically accepting the outpouring of others’ consolations was the essence of what it is to be a humanist: one who seeks to understand and celebrate the myriad ways in which humans try to deal with the tragedies and stresses of life. Secular humanists hold that this life is all we have...".
As Hull acknowledges, however, this viewpoint is not the end of one's emotional and spiritual engagement with others because
"with that recognition comes another: that humans must have, within the limits of mutual respect, the right to live this life under whatever structure of beliefs makes it tolerable and gives it meaning."
In the end, Hull describes how he and his wife 
"decided to let all these reflections on life from the myriad of traditions and beliefs that give rise to them be expressed without criticism from us. We were just grateful for the love and concern that prompted those remarks that, taken literally and personally, could have destroyed many friendships."
In the end, even in their overwhelming grief, they chose the spiritual meaning of others' (in some cases, incredibly insensitive) remarks over the literal and the personal, because it offered a chance to connect with others in kindness, compassion and tolerance.

It gave them an opportunity to continue to see the good in others--and in life itself--in spite of their own, very significant and very senseless, personal tragedy.

If that is not the essence of spirituality, I don't know what is.

Friday, June 10, 2011

You

It's my best friend's birthday, and we got some bad news yesterday about her son's brain cancer.

So I don't feel much like writing today.

Thursday, June 9, 2011

Purity

It's hard to be discouraged about anything when you spend a little time reading about the human brain--not only what it's capable of, but what it does simply on a daily basis, in the most mundane circumstances, just hanging around.

I will never again feel like an underachiever.  Do you have any idea how many neurons need to be fully engaged and firing at lightning-fast speeds simply for me to yawn and stare out the window?

A LOT.

I've been reading John J. Medina's book Brain Rules.  It's a fast read with all kinds of interesting brain anecdotes and information.

Like the fact that, if you're going to drive and talk on the cell phone, you might as well consider yourself driving while intoxicated.

That's how well cell-phone talkers perform on tests of driving ability.  ALL of them and all of the time.  So if you're doing it, no, you're not one of the anointed, special people who simply can, somehow.

You're just in serious denial and you're jeopardizing the rest of us. 

It's a well-documented fact: the brain cannot multi-task when it comes to paying attention.

The brain processes input sequentially.  Yes, you can walk and chew gum or sing and stir soup.  But those activities aren't processed in the same way as an activity--like driving or reading--that requires paying attention to input.

And no, you can't change the way your brain processes tasks: as Medina points out, "We are biologically incapable of processing attention-rich inputs simultaneously" (85). 

That means you, me, all of us.

We're "BIOLOGICALLY INCAPABLE."  So give it up, all of you driving, cell-phone-talking proponents of your own superior brain-activity.

Unlike schmucks like me, who freely admit that we can't pay attention to two things at once, all you're doing is increasing the likelihood that you'll get into an accident.

As in, 10x more likely. 

I mean, really.  Use your brain.  And use it the way God and Nature intended: by sequentially processing attention-rich inputs.

What is also really fascinating to me is what's called "the binding problem."  Medina describes a woman who suffered a stroke that left her incapable of using written vowels.  She would write sentences that would have only the consonants of words.

But what's even more bizarre is the fact that she would leave spaces where the vowels were supposed to be.

This means that her brain had stored the knowledge of vowels in one region, but had stored the information about where vowels should go in words in another area entirely.  She knew where the vowels should go, but she no longer knew how to fill them in because she no longer knew what written vowels were (Medina, Brain Rules, 105).

This example highlights the binding problem.  If all kinds of bits of information are scattered throughout the brain, how do we perceive the world as a unified and continuous whole?  As Medina notes, "We have no idea how the brain routinely and effortlessly gives us this illusion of stability" (105).

Add to this the fact that, as Medina postulates in Brain Rule #3, "Every brain is wired differently."  Medina uses a really helpful analogy:
We have the neural equivalents of large interstate freeways, turnpikes, and state highways.  These big trunks are the same from one person to the next, functioning in yours about the same way they function in mine.  So a great deal of the structure and function of the brain is predictable... (63)
But when it comes to "the smaller routes--the brain's equivalent of residential streets, one-laners and dirt roads ...the individual patterns begin to show up.  Every brain has a lot of these smaller paths, and in no two people are they identical.  The individuality is seen at the level of the very small, but because we have so much of it, the very small amounts to a big deal" (63).

But NO, you still can't drive and talk on the cell phone.  Those are activities carried out via the neural freeway, not your individual dirt roads.

I think the brain's uncanny abilities are implicitly showcased really well by this March 31, 2011 article by Roy Furchgott in The New York Times Business Section about OnStar and voice recognition software installed in cars: "GADGETWISE; Checking In With Facebook, Both Hands on the Wheel."

Setting aside the fact that I really don't know what to make of someone who feels that they have to check Facebook or upload a status update from their car while driving (especially since checking Facebook while driving = talking on a cell phone while driving = downing a bottle of Jim Beam while driving), I think the fact that we can't get voice recognition software right says a lot about what our brains do so simply on a daily basis--namely, make sense of the world around us in myriad ways and at super-fast speeds.

A friend of mine had voice-recognition software that was supposed to enable her to make calls from her car simply by saying "Call home" or "Call Katie." Supposedly, the driver could also play music: just name that tune, and the software would play your CD or MP3 or whatever for you.

Basically, my friend discovered that even when she enunciated the words, "CALLL  HOOOMMMME" as loudly and distinctly as she possibly could, it would respond, "I do not understand this command."

She wasn't sure which was more frustrating, being unable to get the thing to phone home or being unable to listen to Aerosmith during a long commute.

She confessed that she and her husband began telling it, "GO FUCCCKK YOURRSELLLFF," just because.

She said, "I know it's childish, but that stupid thing is really frustrating."

I think the functioning of the human brain has a beauty and a purity unmatched by technology.  We need to take the time to watch ourselves in action and simply marvel at what we do.

So, as Alanis advises, "Let's grease the wheel over tea/ Let's discuss things in confidence/ Let's be outspoken, let's be ridiculous, let's solve the world's problems...".

Love you when you dance.


Tuesday, June 7, 2011

Palin's Reverie

As another public figure that we couldn't avoid hearing about for months on end once sang, "Oops, I did it again."

The media is currently full of Sarah Palin's gaffe.  If it was a gaffe.  (I think it was.)  (And no, I don't expect her to ever admit it.)

I have a link to The Center for Public Integrity's iWatch on my blog, and they did a Fact Watch of "Sarah Palin's Twist on Paul Revere."  I think it offers a balanced statement about what she got "right" and what she got "wrong," and why "wrong" may not be "wrong," per se; it may simply be a "twist" on Revere that is ultimately rather severe.

Sarah Palin is, for me, just another Donald Trump.  She seeks publicity, at the expense of the GOP (and, in her case, The Tea Party as well), and I don't think she really cares a whole lot about political viability.  I think it makes perfect sense that the two would share a pizza: both seem to me to wholeheartedly believe in the notion that there's no such thing as bad press.

What's more interesting to me is the way in which this debate about Palin's twisted take on American history highlights the issues involved in online collaboration.

After yesterday's posting about contemporary information overload and its effect on nuance and depth ("Screening"), I did what I always do:  I went out and read a whole bunch of stuff that totally disagrees with what I said.

In particular, I looked at Douglas Thomas & John Seely Brown's book, A New Culture of Learning.  Thomas and Seely Brown argue that there has been a shift in the way in which information and learning are obtained: we no longer live in the "stable infrastructure" of education replicated in the classrooms of America.  Instead, we now operate within a "fluid infrastructure where technology is constantly creating and responding to change" (17).

Thomas and Seely Brown suggest that, in the wake of eroding boundaries between "public" and "private," we need to rethink the differences between the two in ways that accommodate contemporary technologies and their effects.

They argue that one of the results of the changes brought about by technology is a shift in the relative functions of "the personal" and "the collective."

According to Thomas and Seely Brown, "collectives are plural and multiple.  They also both form and disappear regularly around different ideas, events, or moments.  Collectives, unlike the larger notion of the public, are both contextual and situated, particularly with regard to engaging in specific actions" (57).

Thus, Thomas and Seely Brown argue that, "[t]hroughout life, people engage in a process of continuous learning about things in which they have a personal investment" (57).

In a very dramatic sense, the Internet has enabled that engagement to expand exponentially.  One of the benefits of social networking media and online interaction is that they enable the formation (and, when they are no longer useful and informative, the dissolution) of collectives that offer "a nearly infinite set of resources that any individual can selectively tap into and participate in as part of his or her own identity" (59).

As you've no doubt heard, Wikipedia had to lock down the Paul Revere page after Palin's comment, because it was flooded with individuals who may or may not have been trying to alter the page to fit her remarks.  Because Wikipedia is precisely the kind of collective that Thomas and Seely Brown describe, it possesses a certain vulnerability: it can be "invaded" by all kinds of personal, editorial contributions that may or may not be made in a spirit of shared, collective learning.

Perhaps not surprisingly, many (if not most) academics HATE Wikipedia.  The haters are legion: some of them outright prohibit students from using it as a source.

Personally, I think Wikipedia is an interesting phenomenon, and while I acknowledge its vulnerabilities, I do think it has a very decided place in the digital age, and that place can be a useful one, when it is tempered with scholarly common sense.

I tell students they can certainly start with Wikipedia when they're thinking about an issue (I often do that myself, actually), but they shouldn't simply start and end there.

And I stress the fact that they need to keep their critical and evaluative antennae up at all times, with all sources, whether they appear in print or digitally.

A good Wikipedia entry, like any good entry in any credible scholarly resource, always points you outward and elsewhere.  It tells you the sources of its information so that you can backtrack and check on what it has told you, it highlights the issues that are still open to interpretation and debate, and it usually generates more questions than answers.

What Wikipedia doesn't do, of course, is require a Ph.D. or an M.A.--or any kind of college credentials, for that matter--of its editors.  Obviously, this is going to bother quite a few of those people out there who went to all the trouble and expense of getting the darned things.

But really, any source that treats a complex, multifaceted problem or event as something that it has definitively "solved" or "defined" should be looked at thoughtfully and critically, in my opinion, regardless of who's writing it.

[For example, a debate still exists as to whether prions are the culprit in the transmission of rare brain diseases known as "transmissible spongiform encephalopathies" (TSE's), even though the man who developed the theory won a Nobel Prize for his work.  (See "The Prion Heretic," Science 27 May 2011: Vol. 332 no. 6033 pp. 1024-1027.)]  

These kinds of debates about "facts" are rife in academia--even in the sciences.   Ultimately, they constitute the lifeblood of intellectual inquiry.

The problem that academics seem to have with digital sources that embrace the input of the masses and disregard academic credentials--and this is a problem that isn't really addressed by Thomas and Seely Brown--is that it is possible to create a purportedly "informative" world of completely useless relativism, a digital forum where everything is only as "correct" as its strongest advocate and where every opinion, insight or spin on an issue is treated as valid, even if it is factually incorrect or was debunked years ago.

Academics have standards for evaluating scholarly credibility and reliability; their greatest fear is a world in which there are no such standards.

Wikipedia's response to the Paul Revere crisis has been interesting.  I think it speaks to an understanding of the benefits of the collective digital environment outlined by Thomas and Seely Brown.

Wikipedia has "Five Pillars" or basic operating principles that indicate an awareness of the inherent vulnerabilities of information gathered in a digital collective.  Thus, they make an attempt at avoiding the pitfalls of a collaborative digital environment.

When Wikipedia pages become subject to what is called "edit warring"--that is, " a series of back-and-forth reverts" in which one editor changes content on a Wikipedia entry, another editor "reverts" or changes it back, and this process continues several times over the course of a day--the warring editors can be blocked and Wikipedia administrators can opt to "protect" the page.

When that happens, the editors need to collaboratively work out their dispute and come to an agreement about the content of the page itself.

In the case of Paul Revere's page, this is currently occurring on a "Talk" page.
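
For anyone curious about the mechanics, a page's protection status is publicly visible through the MediaWiki API.  Here is a minimal sketch (my own illustration, not anything from Wikipedia's policy pages; the page's protection settings today may well differ from what they were during the edit war) that asks the English Wikipedia what restrictions are currently in place on a given page:

```python
# Minimal sketch: query the English Wikipedia's API for a page's protection settings.
# "Paul Revere" is just the example from this post; any title will work.
import requests

def get_protection_info(title):
    """Return the current protection entries for a page on en.wikipedia.org."""
    resp = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={
            "action": "query",
            "prop": "info",
            "inprop": "protection",
            "titles": title,
            "format": "json",
        },
        timeout=10,
    )
    resp.raise_for_status()
    pages = resp.json()["query"]["pages"]
    page = next(iter(pages.values()))
    # Each entry says what is restricted ("edit", "move"), who may still do it
    # ("autoconfirmed", "sysop"), and when the restriction expires.
    return page.get("protection", [])

if __name__ == "__main__":
    for entry in get_protection_info("Paul Revere"):
        print(entry)
```

An empty list simply means the page is currently unprotected.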

The assumption is that both the general public's media-spawned interest and any Palinists' personal investment in the collective Wikipedia page will die down over time, allowing those with a collective and personal interest in Paul Revere and his life and actions to reach a consensus and create a collaborative page that will strive to be 1) historically accurate and verifiable, 2) politically neutral,  and 3) comprehensive.

In short, they want to create something that is accurate, informative, interesting, and useful--as all good research always is, whether it is conducted individually or collaboratively and whether it is presented digitally or in print.

Monday, June 6, 2011

Screening

I decided I couldn't live without seeing pictures and updates from the little girls I used to babysit for, so I reactivated my Facebook account, but eliminated everything except what directly involves them.

They were the original reason I signed on, so I decided to pare back to my initial purpose.  After all, I shouldn't let other people ruin what started out as a very good thing for me.

It's actually quite nice: now, when I open Facebook, I can simply see how they're both doing.  And I see nothing else.  And no one can "friend" or contact me anymore.

This is good.  This works for me.

I'm also in a good mood because nothing works like running off to the Berkshires on a beautiful weekend and spending the entire time hearing about how beautiful and smart and funny you are, and how you don't look 42 years old at all.

I highly recommend it.  It's good for the soul (and the ego, of course).

What was interesting about the Berkshires (in addition to the fact that it's a beautiful location and walking around is like walking through American Literature 101--Melville, Hawthorne, Wharton, and Edna St. Vincent Millay all hung out there, to name only a few) was that you simply can't get cell phone coverage or GPS in a lot of places.

This was good timing for me, because I've been reading Hamlet's BlackBerry by William Powers.  Powers examines the electronic over-saturation we're experiencing in the digital age by looking at the ideas and experiences of great writers and thinkers, including (of course) Shakespeare, Plato and Thoreau, and by describing his own experiences as someone who is highly connected to technology.

Powers argues that, in all of our connectedness, we've lost sight of the fact that, in order to make sense of experience, we have to disconnect from time to time, simply to process what we've seen, read, heard and experienced.

If we simply go from Google search results to link to link to email to Tweet to Facebook posting to YouTube video, and if we do this for hours at a time sometimes, we never delve beneath the most superficial surface of anything.  We live in a world of constant distraction and inattention, and we lose the ability to focus and think through the images and ideas we're presented with.

Maybe it isn't about diagnosing everyone with ADD.  Maybe it's about shutting off the computer or the smart phone, so that we can relearn how to experience depth and nuance.

I was at a party for an entire afternoon when I was in the Berkshires, and NO ONE checked their cell, no one "needed" to Google anything, and no one sat staring at a screen (in fact, no one so much as glanced at one).  They simply couldn't.

And because they knew they couldn't, they didn't.

No one suffered in the slightest.  Conversation never lagged.  Everyone paid attention to one another.  We talked and listened and thought about what was said.

I think that Powers' ideas about disconnecting and achieving depth are interesting.  I'm generally pretty disconnected, most of the time, and I think that's why I have such awkward experiences when I connect or encounter the Continually-Connected.

I still think that anyone who sits and stares at their phone or checks email while in the presence of another human being is rude, avoidant and/or self-involved.

You're not that important.  You just aren't.

(Neither am I, by the way.) 

I still think Tweets are beyond absurd.  If an experience can be captured in 140 characters, I'm pretty sure you can either 1) tell me about it the next time you see me --if you even remember it, or 2) skip it entirely.

Not all of life is meant to be documented electronically.  Even interesting people have mundane lives and activities that we don't need to hear about.

What makes people's lives and thoughts interesting is what they value and give dedicated attention to, over days, weeks and months.  This can't be captured moment-by-moment, and it doesn't need to be.

According to Twitter's webpage, "the real magic of Twitter lies in absorbing real-time information that matters to you."  But information isn't important simply because it occurs in real-time, and it only matters to you because it has depth and connection to your own life over a span of time.

If it has that, you should invest more than 140 characters' worth of attention in it.

If two people are talking and a cell phone rings, it is no longer considered a social faux pas to answer it.  But five years ago, if someone had done this, the other person would have been insulted and annoyed, and everyone would have agreed it was quite rude.

My current policy is, if I mention something interesting and you immediately grab your smart phone to Google it right there in front of me, I'm going to leave you to your blessed Googling and go have an actual conversation with an actual human being, since that's why I'm on the planet.

Or at least I think that's why I'm here...

I think we have to start questioning our own sense of self-importance.  Our uses of technology often demonstrate a lack of depth, but they are also accompanied by a lack of self-awareness and humility (both qualities associated with depth, in my opinion).

All this said, I should probably point out that, in my own personality tests, I tend to fall in the "Idealist" category of the four Keirsey temperaments: I'm what's known as an INFJ ("Introverted Intuitive Feeling Judging") personality.

So at the end of the day, I think maybe it's just that screen-relationships are simply not for me.

My time in the Berkshires helped drive that point home, in the best possible way.

Friday, June 3, 2011

Get Together

I've mentioned before that I often have unusual "to-do" lists.

So, the one that I had for this week included making jam, mowing the lawn, pressure-washing and waterproofing the deck, and finding out more about the history of the libertarian movement in the US.

I should explain.  I've often read postings and blogs and articles by various libertarian people and entities, and I've been quite puzzled about what exactly they "want," per se.

I wanted to find out what the movement has meant, done and advocated, historically, because I really do want to know what they're all about.

After all, it's probably only a matter of time before someone calls me one, so I need to be ready.  (I'm already liking it, though, because it's got almost all of the word "liberty" in it.)

I've refrained from asking any of the people whose posts I've read, because there is a terrible tendency in online communications to assume that questions imply distaste or dissent.

There are apparently no innocent, unbiased or purely informative questions to be asked anymore, and no one has the slightest qualms about providing links to incredibly unreliable sources.

I really wish people would be more aware of the fact that blog postings on major news sites are not "news" or information; they're blogs: hence, opinions.  Blogs can incorporate factual information, obviously, and it's apparent when they do because there are clear references in the body of the post to the sources of the information and statistics cited.

If there aren't any such references, you might as well be citing me as an authority on the issues.  (And no one should be doing that.)   

Anyway, I'm pretty thick-skinned, but I know enough to know that I really don't want to have to go through the day knowing that someone has at one point called me a "one-dick pony" or a "sheep" or told me to "keep drinking the Kool-Aid" and speculated about my mother's mating habits with regard to animals, all because I asked a question about what exactly "libertarians" are advocating.

I should explain: on one of the pages I was reading, one person did in fact call another one of the posters who disagreed with him a "one-dick pony."  This led to rampant speculation about how many penises a pony should have and what on earth the person had meant.  It eventually devolved into extended name-calling that equated everyone's intelligence with someone else's private parts (human and equine).

I was so traumatized just watching the exchange unfold that it took me a full 36 hours to realize that the person had meant to call the other individual a "one-TRICK pony"--that is, someone who only has one particular strength or advantage to display (from the days when there were "dog-and-pony shows" or small traveling circuses where a trained animal was the main attraction).

Clearly, no one on the page was familiar with the phrase (including--and especially--the person who used it), and although it was a really interesting mistake, all things considered, I wasn't about to log on and let everyone know that.

It was best left alone.

And in case you're wondering, it's an English professor liability: whenever I see someone say something incomprehensible, no matter how outlandish, I just have to figure out what they may have meant by it.  I simply can't let it go.

So while I'm not saying the phrase "one dick pony" was at the forefront of my thinking for the past two days, I will admit that I felt much better when, in the midst of waterproofing the deck, I finally figured it out.

In any case, on one of the websites, a person confessed extreme confusion about the difference between "libertarianism," "communism," "socialism" and "Marxism."

I'll admit, I gave a gentle smile imagining Ron Paul and the members of The Cato Institute and its adherents having two cows and a chicken at seeing the term "libertarianism" included in the same sentence with "communism," "Marxism" and "socialism."

The person was answered politely, by someone who suggested that s/he was perhaps confusing "libertarianism" with "liberalism."

But actually, the questioner wasn't wrong to be confused, and s/he wasn't confusing "libertarianism" with "liberalism" when s/he linked it with socialism and communism.

There are libertarian socialists out there: Noam Chomsky is one of the best-known.

Basically, libertarians advocate upholding principles of individual freedom (or "liberties," hence the term) and minimizing (or eliminating entirely) the reach and influence of government.

If you're thinking, "Well, but hey, that kinda sounds like me..." it's probably because it does, to some extent. "Libertarianism" is an overarching term encompassing a broad spectrum of beliefs, many of which are essential to many Americans' sense of national self-definition.  Most of these values were considered "liberal" when they originated in the 18th century.

Within that overarching spectrum, however, are included both minarchists and anarchists and a very wide range of political ideas and positions.  Some "libertarians" advocate free market capitalism and support private property, others do not.  Minarchists believe state control should be limited (think "minimal") and that it should simply protect individuals from crime and aggression (i.e., anything that would infringe upon their own individual liberties or those of another).  Anarchists believe there should be no government at all.

The reason the person was confused about the distinction between libertarianism and socialism or communism is that, in the mid-1820s, Josiah Warren, one of the best-known proponents of "sovereignty of the individual" and the "first" American anarchist, was among the initial adherents of the communist philosophy of Robert Owen.  Warren initially participated in the effort to establish a commune in New Harmony, Indiana.

When the commune failed, Warren became an advocate of individualism and insisted that any attempt to function within a society represents an unfair and unnecessary curbing of "natural" individual sovereignty.

Warren's thinking gets complicated, because on the one hand, he advocates the free market system of capitalism, but without any government.  On the other hand, he advocates limiting "cost" to the actual amount of labor used to produce an item.

So, the items you buy in a store, Warren would argue, should only cost what it cost to create them and bring them to market (plus a small surcharge for the store's overhead).   He actually ran a store on these principles: the "currency" used involved an exchange of equivalent labor for items sold "at cost," meaning their price was what it cost, in labor and materials, to produce them.

This is the basic principle of "mutualism": labor alone determines cost and value.

Although mutualists like Warren advocate a free market and individual liberties and oppose government intervention, they also oppose earning income from rent, loans or investments, since there is no labor involved in these activities.

So you can see how the "libertarian" label, when applied to Warren, comes to cover a wide range of economic practices and political ideals, some of which would sound decidedly "socialist" to a contemporary reader.

In addition, Warren did participate in the founding of two communes in the 1840s and '50s: Utopia, Ohio, and "Modern Times" (Brentwood, NY), both of which advocated complete individual freedoms (no police, no courts), and both of which utilized a free market system based on the exchange of labor.

And although Warren did not ultimately support Robert Owen's theory and application of communism in the United States, the two remained lifelong friends.

This is what I find most interesting about the history of various movements and countercultures in the US: they are often intertwined in ways that our contemporary "understanding" of them fails to acknowledge or appreciate.

Thinkers and philosophers disagreed, sometimes strongly. They did not, however, simply resort to constantly calling each other "douchebags."

I think that anyone (and everyone) who wants to achieve a social or political purpose abandons that purpose when s/he resorts to name-calling.  When was the last time you listened to the ideas of someone who had just called you a slew of insulting names?

Exactly.

For me, the distinction lies between the individuals who want (or need) to be "right" and those who want (or need) to be heard.  If you have a message, you think about the best possible means of communicating with and convincing others.

If you resort to calling other people names, you abandon the message in favor of upholding the value of your own status as "messenger."

Thursday, June 2, 2011

Blindness and Significance

I recently read William James' essays, "On A Certain Blindness in Human Beings" and "What Makes A Life Significant."

Both lectures were published in 1899, in James' Talks to Teachers on Psychology: and to Students on Some of Life's Ideals.

In "On A Certain Blindness," James examines "the blindness with which we all are afflicted in regard to the feelings of creatures and people different from ourselves."

I think a lot of James' observations have relevance for how we think about the things that matter to us and how we perceive the efforts of others.

In "What Makes A Life Significant," James argues that in dealing with others, "The first thing to learn ... is non-interference with their own peculiar ways of being happy, provided those ways do not assume to interfere by violence with ours."

I think it is sometimes difficult not to interfere (or advocate interference), particularly when we see someone living a life or following a course of action that we judge to be somehow "wrong," when viewed from our own perspective.

What I like about James' advice, though, is that it is a reminder to at least attempt non-interference--to value the impulse to stand back for a moment before leaping into the existential fray.  

In many cases, even if we simply paused before interfering in the lives of others, we would significantly change the way in which we interact with everyone around us.

According to James, "No one has insight into all the ideals. No one should presume to judge them off-hand. The pretension to dogmatize about them in each other is the root of most human injustices and cruelties, and the trait in human character most likely to make the angels weep."

I like this: "No one has insight into all the ideals."  It's so true.  

I think the only way we can begin to attain insight into some of them--a mere handful over a lifetime, if we're lucky--is to listen to others.

And we can only listen when we refrain from constant interference.

At the end of "On A Certain Blindness," James advises his listeners to "...tolerate, respect, and indulge those whom we see harmlessly interested and happy in their own ways, however unintelligible these may be to us. Hands off: neither the whole of truth nor the whole of good is revealed to any single observer, although each observer gains a partial superiority of insight from the peculiar position in which he stands." 

In this essay, James notes that, in general, "We are stuffed with abstract conceptions, and glib with verbalities and verbosities; and in the culture of these higher functions the peculiar sources of joy connected with our simpler functions often dry up, and we grow stone-blind and insensible to life's more elementary and general goods and joys."

However, "[t]o be imprisoned or shipwrecked or forced into the army would permanently show the good of life to many an over-educated pessimist."

I think of this quite a bit, when I hear people complaining about their own particular circumstances or hardship, or when I find myself kvetching more than usual.

We all have a right to complain, and some more than others.

But I often wonder, if all of what we have in a given moment were swept away or transformed into something we hadn't even remotely conceived of, what would that lost moment look like from our new perspective?

Would it seem so bad?

We live in a world of over-educated pessimists; I think it's the brand of individuality that contemporary culture rewards and applauds.

I wonder, though, what the under-educated optimists of the world can teach us.

In "What Makes A Life Significant," James wonders, "if we cannot gain much positive insight into one another, cannot we at least use our sense of our own blindness to make us more cautious in going over the dark places?"

I think we can.

I think that, on a daily basis, we need to learn and relearn the value of caution.  We need to constantly remind ourselves that we may at any given moment be treading blindly through someone else's darkness.

For me, this is the essence of respect for others.

Ultimately, this is my favorite observation in "What Makes A Life Significant":
"In God's eyes the differences of social position, of intellect, of culture, of cleanliness, of dress, which different men exhibit and all the other rarities and exceptions on which they so fantastically pin their pride, must be so small as practically quite to vanish; and all that should remain is the common fact that here we are, a countless multitude of vessels of life, each of us pent in to peculiar difficulties, with which we must severally struggle by using whatever of fortitude and goodness we can summon up. The exercise of the courage, patience, and kindness, must be the significant portion of the whole business."
Fortitude and goodness; courage, patience and kindness.

The common fact that here we are.

In the end, this is the significant portion of the whole business we call life.