(Video) How to Call BS on Social Media Gurus

Thanks to the folks at SMC Seattle, the video from last month’s talk is now up.

Some background and disclaimers:

  1. It was a rowdy venue, with an open bar at a circus-themed hall (Hale’s Palladium room). I had a heckler of sorts at about the 8-minute mark, which I dealt with abruptly to prevent more of it from happening. Apologies if I come off as brash – the content is brash enough.
  2. A drinking game for my talk was announced during my introduction by Brian, my SMC host, right before the video starts. It wasn’t my idea, and I found out about it when the audience did.
  3. There is some use of the word BS – very mildly NSFW.
  4. There is a typo in the opening credit; I’ve already let the SMC Seattle folks who produced the video know.
  5. The slides and references for the talk can be found here.

SMC Seattle May Event: How to Call BS on a Social Media Guru from SMC Seattle on Vimeo.

Why You Are Not an Artist

Picasso. Van Gogh. Beethoven. Hendrix. O’Keeffe. Kurosawa. Kahlo. Kafka. Magritte. Bukowski. These are people most agree are worthy of the title Artist.

But what if you met them before they were famous? For much of their own lives, certainly for Van Gogh and Kafka, they didn’t have fame. And in the case of many Artists, including Magritte and Bukowski, their work wasn’t widely accepted until late in their careers. We’d need some other criteria than success to identify them for what they are.

My argument is that for most creative people reading this post, whatever criteria we’d invent for what an Artist is, few of us would meet that bar. The risks they were willing to take to achieve their artistic visions were well beyond ours. You can choose to be an artist, but if you’re employed as a “creative” you probably haven’t made that choice.

To be an Artist requires a specific intent. An intent that nearly everyone with a full time job does not have while doing that job. You might be an artist in your spare time, but that’s something else entirely.

While you might have grand aesthetics in your work, or amazing skills that seem magical to others, that is artistry.  How you employ those talents determines whether you are an Artist or not. And sure, you might be the best in your field of designing websites or selling cars, but that’s mastery. The big question is this: your mastery of skill is used in service of what? A corporation? Some stockholders? Customer satisfaction goals? These might be honorable pursuits, perhaps noble in some sense, but that’s not enough to call it art.

Think of Guernica, The Seven Samurai, The Mona Lisa, or your favorite, most powerful work by your favorite Artist. What is it about their work that impacts you? It’s more than just talent. It’s something to do with the aims they used their talent for. We wouldn’t put a box of laundry detergent (except perhaps for Warhol), or a piece of business software, however wonderfully designed they might be, in the same class of creative effort. We all know there is a distinction, however fuzzy or personal, between one kind of thing and another.

I think to call someone an Artist means they have a higher purpose beyond commerce. Not that they don’t profit from their work, or promote themselves, but that the work itself has spiritual, philosophical, emotional or experiential attributes as central goals. An artist’s work is about an idea, a feeling, or an exploration of a form, framed more by their own intuitions, than the checklists and protocols of bureaucracies and corporations.

In simple terms there are three points:

  1. An Artist is committed to their ideas in ways most people are not. An Artist will risk many things: wealth, convenience, popularity, fame, or even friends and family, to protect the integrity of their ideas. If you’re not risking anything, and mostly doing what you are told, you’re probably not an artist.
  2. This means anyone who constantly sacrifices their own ideals, and regularly makes major compromises to satisfy the inferior opinions of ‘superiors’ they do not respect, cannot sincerely call their work art. And therefore, cannot call themselves Artists.
  3. An Artist would be willing to sign their name on what they give to the world. Are you proud of what your company makes? Does it go out the door with even half the soul you put into your designs? If you ship things to the world that are beneath your own bar, can you call it art in the same way you would if it met that bar?

The definition game rarely leads anywhere. You can find many different definitions for the words art, artist and artistry to support any point of view, as it’s an active area of debate. But my favorite definition of artist is:

  1. A person who creates, by virtue of imagination, talent or skill, works of aesthetic value, especially, but not limited to, the fine arts.
  2. A person who creates art (even bad art) as an occupation

If you make paintings, movies, novels or similar things, of course you’re an artist. Even if your work isn’t good (however we determine that, since it’s subjective), and even if you do it part time, or have never been paid a dime for your art, you still qualify.

But if you work for clients/bosses in the making of things that you yourself would not consider art, or are beneath your own standard, or that you blame others you work with for ruining, you are not an Artist. You are an employee. You are being paid to give someone else authority over your creative decisions. This can involve inspiration, effort, sacrifice, passion, brilliance, and many other noble things, but it’s not the same as being an Artist.

As Charles Eames (see video below) said:

Artist is a title that you earn, and it’s a little embarrassing to hear people refer to themselves as an artist… it’s like referring to themselves as a genius.

Related:

Ode to the Waitress at Newark Airport

You can tell much about a person by how they wait tables. The worse the place, the more you can tell. Do they take pride in their work in spite of their surroundings? Transcending the sullen and dull with a mote of warm light? Or, as my TSA associates did, let it be known, in every single movement, how much they loathe their working lives, and you for spending your moments with them? St. Christopher, in my rendering, does not look kindly upon their souls.

Yet, tonight, at the Sam Adams bar, by gate 62, at Newark Airport, was a bright-eyed, brown-haired wonder of service, with a killer smile, who transformed my dull hours waiting for a long flight into the extraordinary. I watched her dance between tables, charming and serving, making the stress and sadness that accumulates in these hopeless places melt away. There is a rare kind of wonder in watching joy and proficiency combine. Much too young to call me hon’ (as in honey), she did it anyway, which had its own unintentional charm.

There should be a shrine in your honor in every airport bar. A beacon of hope for the huddled masses stuck for far too long in the interstitial purgatory of airports, places so bereft of life that they are driven to drink in such unfortunate confines.

I deem you Nina, the innocent unknowing savior of my pre-red-eye flight evening, as the new patron saint of the weary traveler. May all who endure the red-eye, the flight delay, and all mindless time in soulless places, learn the wonders of your charms.

The deception of “Studies say”

In my recent talk on Calling BS on Social Media Gurus, I pointed out how studies are frequently misrepresented by experts, journalists and even the researchers themselves.

My point is not that you should never believe anyone. It’s just that you should ask some basic questions before you do.

Here’s a recent example:

The Washington Post reported on a study published last week by The University of Michigan about college students and empathy. The article is titled College students losing their sensitive side.

Their second paragraph makes this claim:

“A new University of Michigan study has found that since 2000, college students have become less empathetic.”

The problem is this isn’t quite what the research says.

And the Washington Post article is likely based on the press release from the University of Michigan (not an uncommon thing, but a surprise to those who don’t know this).

The actual research is just a survey, much like the ones you avoid filling out if you can. The survey asks a bunch of questions about students’ own perceptions of their empathy. This is not the same as their actual empathy, which is hard to measure, unless during the study you can throw some wounded puppies their way and see how the students behave, an experiment hard to do for 14,000 students.

Some headline stories, such as this one from The Globe and Mail, lead with the claim today’s students are 40% less empathetic than previous generations. What does that even mean?  That they’d let 40% more puppies die? No. It means on a survey 40% of them answered one radio box to the left, instead of to the right, or something similarly abstract and detached from a specific situation where empathy might apply.
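To see how a modest shift in survey answers becomes a dramatic headline number, here’s a quick back-of-the-envelope calculation. The scores below are invented purely for illustration; they are not from the actual study:

```python
# Hypothetical averages on a 1-5 empathy survey scale
# (made-up numbers, purely to show the arithmetic).
mean_earlier = 3.5  # average self-reported score, earlier cohort
mean_2010 = 2.1     # average self-reported score, later cohort

# A 1.4-point shift on a 5-point scale becomes...
drop = (mean_earlier - mean_2010) / mean_earlier
print(f"{drop:.0%} lower")  # prints "40% lower"
```

A shift of a little over one answer choice per question is enough to produce a “40% lower” headline, which is why knowing the underlying scale matters before you repeat the claim.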

So what the study actually says is this: students’ perceptions of their own empathy, as framed by a survey, are lower than in previous generations.

Measuring someone’s perception is not the same as measuring their behavior. You might think you’re cool, but really you’re a jackass. Or vice-versa. But this article suggests perception of yourself and reality are the same thing (There is one referenced study that claims a correlation, but it’s behind a paywall, which I perceive as annoying).  It is entirely possible there are other reasons for the change in data. Perhaps students are more self-aware and honest in 2010 about their selfishness than previous generations? Seems possible. It’s certainly a question worth asking.

Konrath, one of the researchers, is quoted as saying “We found the biggest drop in empathy after the year 2000… college kids today are about 40 percent lower in empathy than their counterparts of 20 or 30 years ago, as measured by standard tests of this personality trait.”

The key phrase is “standard tests of this personality trait”. By standard, she probably means these surveys, and all surveys have known problems, biases and limitations (the specific personality test is apparently the Davis Interpersonal Reactivity Index). Standard tests are limited – very limited. They might be the best tools we have, but when a limited tool is used to make a general claim, it’s less based on science than opinion, especially when a scientist is asked to explain “why”. The studies are rarely designed to explain why, but that doesn’t stop many of these articles and experts from theorizing on why.

And to be completely fair – even if there is good reason to make these claims, it doesn’t seem the journalists and reporters writing about these claims have done much work to verify that’s true. Also, everyone is entitled to an opinion. But quoting a scientist’s opinion is typically framed as science, rather than as opinion.

If nothing else, take the actual survey yourself – you’ll see what the students in the survey saw, and when you’re done you’ll be scored against the actual data. Cool. And I suspect you’ll feel more aware of what studies and claims might really mean.

For reference, the same basic story that appeared in the Washington Post also appeared on:

  • MSNBC
  • USnews (they did better ‘Today’s College Students More Likely to Lack Empathy’)
  • Psychology Today
  • LA Times (actually talks about the study)
  • SfGate – This opinion piece suggests our critical view of youth is perennial

But none provide a link to the actual paper presented last week. I dug around for a half-hour and found a PDF of their summary from Sara Konrath’s site. I’d really love to see a rule where any article referring to any study or research must include a link to the actual research.  It’s rarer than you’d think.

Stop Reinventing Wheels – BusinessWeek

My latest article for Business Week is now up:

Stop trying to Reinvent the Wheel:

Right now, in meetings at corporations around the world, the wise are suffering. They are trapped in rooms where debate rages over how to solve a problem. The rub is that the problem has already been solved, just not by someone in the room—and solutions from outside are ignored. This is the disease known as “NIH,” or “Not Invented Here” syndrome, and it’s alive and well in 2010. Despite our many technological advancements in communication, none have eliminated this perennial waste of time. Why is this problem so hard to shake? Will we always be confronted with people who insist on reinventing wheels?

Read the full article here.

Finger on nose: making fast group decisions

Often it doesn’t matter what you decide, as long as you decide quickly and do something. Would you stand in front of a train debating whether you should jump left or right? Of course not. But often we hesitate in the face of similarly meaningless decisions for no good reason.

Life is filled with little situations where someone has to do some annoying job or another, but as no one wants to do it, long debates ensue over who should do it and why. Examples include picking up a bar tab, putting out the trash, or a thousand other little things.


My favorite solution to these decisions is what’s called Finger on Nose, Not It or Nose Goes.  The rules are simple.

Whenever there is a task no one wants to do, the following rules apply:

  1. At any time, anyone can put their finger on their own nose.
  2. This signals everyone paying attention to do the same.
  3. The last person to put their finger on their nose loses, and gets assigned the task.

It’s very simple and works fast.

While the decision is being discussed, quietly put your finger on your own nose and look at a friend. They’ll copy you, others will follow, and soon the person paying least attention is the last one, and gets stuck with whatever it is.

By this point everyone is usually laughing, increasing the odds the work at hand will actually be done. And next time, the loser of this round will be first to start the game to get revenge, ensuring the tradition continues.
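Just for fun, the rules can be sketched as a tiny simulation. The names and reaction times here are made up; the point is simply that the slowest to react loses:

```python
import random

def nose_goes(players):
    """One round of Nose Goes: everyone touches their nose after a
    random reaction delay; the last (slowest) player gets the task."""
    # Each player reacts after a random delay -- the person paying
    # the least attention reacts last, and loses.
    reactions = {p: random.uniform(0.2, 3.0) for p in players}
    return max(reactions, key=reactions.get)

loser = nose_goes(["Ann", "Ben", "Cal", "Dee"])
print(f"{loser} takes out the trash")
```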

I don’t know where this game started – but I learned it eons ago and I’ve taught it to many people. If you’ve heard of it before, or know the backstory, leave a comment. The web provided The Nose Goes rules page, the Nose Goes Facebook group, and the nose variant of the card game Kings.

Next time you have a decision to make about who does something, give it a try.

Calling BS on Social Media Gurus (Video & Slides)

On Tuesday night May 25th I spoke at SMCSeattle, at Hale’s Brewery. They asked me to do a talk based on two posts: questioning gurus, and calling BS on social media.  This was awesome. How many organizations invite outsiders to poke holes in the rhetoric of their field’s experts? Very courageous to do this and I wish more groups did. Kudos to SMCSeattle.

Video:

Slides:

Slightly edited to better reflect what I said without my voice talking over them.

References from the talk, Listed by slide #:

3. Related posts: How to detect BS, Calling BS on a Guru, Calling BS on Social Media

5. Snake Oil drawing, Fast twitter profits (from list of web scams)

6. You too can be a social media guru, www.72ave.com

9/10. Guru defined at Wikipedia

11. Buddha quote

12. How to Lie with statistics, Notes on Cognitive Bias

16. BP press releases – Fascinating example of PR’s function being detached from reality

25. Victorian Internet (Amazon), Victorian Internet (Wikipedia)

26. The Strength of Weak ties (PDF) (wish I had a better link, sorry). This paper is often referred to as evidence of social media’s power, but the paper was published in 1983. If you want to claim “radical transformative fundamental shifting blah blah” then you should have recent research as the backbone to your claims. If you don’t, stop with the “radical shifting blah blah” talk.

27. Summary of Dunbar’s number, the actual paper, and a talk by Dunbar about Facebook

30. What the F**K is social media

31. Social Media Revolution (Refresh), Notion of Hobson’s choice. The video is fun to watch – however there are assumptions suggested by how some facts are presented. And if someone makes misleading claims about the past, I am unlikely to trust their claims about the present and the future.

32. Industrial Revolution – My point is if you compare almost anything to the industrial revolution, and you actually look up what the industrial revolution meant to people’s lives, it sounds like hype. But the bet is most people won’t sit down and study the Industrial revolution, so your claim can get people excited, based on their ignorance. Advertising works on these principles, but wise viewers/readers should work to defuse them.

34. Age Distribution of the World’s Population (PPT from the PRB)

35. Age Distribution in the U.S. since 1950 (Wikipedia)

36-38. NYT chart on U.S. Consumption this century

40. No one has one million followers, Anil Dash

43. Twitter data analysis, and from Mashable (The general point is there is data suggesting Twitter usage is much softer than many people assume).

44. Simple language gets shared more on Facebook – My use of this example is that it’s someone claiming science without giving sufficient information for someone else to repeat the study and support the claims. He might be doing science, but we can’t verify based on the limited data he provided. I’m criticizing any expert that reposts this and calls it science, without asking basic questions. I asked Dan myself and he responded quickly, and he deserves praise for this – he added more details to his posted methodology (See slide 51). But why was I (likely) the first to ask? An expert should both ask this of other experts, and expect them to ask him/her.

47. Empirical Research – My understanding is that science (in part) hinges on repeatability and sharing details – there can’t be secret sauce. If you want to claim secret sauce, don’t call it science.

The tradeoff of the hyperlink

Nicholas Carr’s recent post on delinkification explores whether we’d be better off if we didn’t use hyperlinks in-line.  He’s hitting on an old issue among the hypertext crowd, as various kinds of hypertext systems, from Apple’s Hypercard, to Hyper-G, have explored the pros and cons of the entire concept of hypertext.

In ancient times, I did some rudimentary studies on the effect of links on reading in 1995 for IE 1.0 and IE 2.0 – I recall many preceding academic studies on hypertext that went  further than we did (oddly enough, they’re hard to find on the web – still looking). Some of that data will be too crusty to apply to today’s web, but some of it is entirely relevant. I’ll follow up if I can find the good stuff.

A few points worth adding to the debate:

  1. The skill of the author is missing from the conversation. The better the writer, the better the job he/she does at anticipating questions or making sure the links are worth the cognitive cost of forcing the user to decide to click or stay. In similar fashion we can criticize paragraphs, semi-colons, (parentheticals), fonts, bold/italics, blog templates and many factors that we know impact people’s ability to read, as when they are used poorly they do create problems for readers. All choices writers make have cognitive tradeoffs. Readability, a simple filter that makes pages easier to read, is a surprisingly good alternative to many website and blog designs, but for the better writers on the web, it takes away more than it gives.
  2. This echoes the debates about footnotes and endnotes. I’ve done anecdotal research on this, and in reading this 30+ comment thread of people’s impressively specific preferences, I concluded there is no final answer. It’s too personal, and often people’s feedback hinges on the endnote/footnote style of the last book they read. It’s easy to forget there are many ways footnotes are used, much like links, some better and some worse. Some uses, in some situations, earn their cognitive costs more than others.
  3. Good browsers should apply preferences for links, including what Carr describes (holding all links until the end). Markup languages are supposed to allow the browser to choose how to present various things, including links. If the reader wants to view all the links at the end, or on the side, or automatically go and pre-load pages, they should all be part of what a browser does to create a good reading experience. This does create conflicts of artistry (should my words appear as I want?) but the spirit of HTML/CSS or any markup language is to give control to readers as well as writers.
  4. Tabbed browsing changes the risks. For those users who use them, it gives an alternative. I know I and other tab users open links from an article in tabs as I go, and let them wait until I finish the article (or until I get stuck on a fact/reference I hope is addressed in a link).
  5. If minimalism for reading is ideal, web site design is a factor too. Even Carr’s site has a top navigation section, and a sidebar with various links and images of books to be clicked on (not that this negates his argument – less distractions are less distracting). Images are possibly more of a drag on cognitive load than a single hyperlink, and it wouldn’t be hard to do research to find out (I suspect in a reading comprehension comparison of Readability vs. most website designs, Readability wins).
  6. Perception of credibility. Forget the reality – in some cases links show the possibility the writer has done their homework. In a glance I can see the link density of a page – too much and I might pass, but none at all, and I might wonder if the writer has thought much about the topic, since they didn’t bother to show they’d found a reference to support or counter their own claims.

Why Creatives Are Confused

The Ad Contrarian has this great little post about Why Creatives Are Confused. Here’s an excerpt:

When they say advertising is an art, their clients say it’s a business. When they say it’s a business, their clients say it’s an art. When they finally get something good produced, it fails. When they produce mundane crap, it works. When their friends like it, their clients hate it. When their clients like it, their friends hate it. They are encouraged to be collaborative. But the more people touch their work, the worse it gets. They are counseled against becoming prima donnas. But they see that the people who get good jobs are often disagreeable monsters.

If they weren’t confused they’d be crazy.

It’s a provocative set of observations – and reading it gave me three divergent feelings.

  1. Life is Confusing. I’m empathetic to creative confusion. Many, if not most people on the planet, are confused. And some are so confused that they don’t know why they’re confused. I feel confused by the many paradoxes I find in life, whether I’m creating things, or just trying to fit in and find my way. Creatives are often more personally invested in their work, so their confusion is more personally painful. But look at your coworkers and friends: nearly everyone is confused in one way or another.
  2. Life is more confusing when you (think you) have little power.  Most ‘creatives’ don’t have much power. Count how many approvals you need to get something out the door – if it’s more than 1, you are not an Artist. You are an artist, being paid to make ‘art’ other people are paying you for the right to reject, or change, or ruin. By law you are free to make whatever you want and try to sell it yourself. If you do, you’ll have more pride in your work, but less money in your pocket – a trade Artists have made throughout history. Art/Design school rarely includes skills for making, or succeeding at, these choices, or even putting in your mindset that the trade-offs are yours to make.
  3. Being an expert comes with traps. The more of an expert you are, the more time you will tend to spend with people who know nothing about your craft. The more senior a creative you become (Art Director, Senior Creative Lead Dude), the more time you will spend in rooms with powerful people who know nothing about creativity. This means the people most interested in you will understand little about your craft, and expect you to work with clients who understand even less. You’d be crazy not to expect to be confused when you are paid the most for your expertise by people who don’t know enough not to squander it.

(found via Cameron Moll)

Does having a big ego help achieve goals?

In a series of posts, called readers choice, I write on whatever topics people submit and vote for. This week’s reader’s choice post:

Does having a big ego help? My short answer is any energy you have can be used to help you.

Nietzsche had this notion that there are positive and negative energies you can use to motivate yourself. If you’re afraid of being picked on at school, it might motivate you to learn karate. Or if you love food, you might be inspired to do the work needed to become a chef. Love, hate, angst, curiosity, fear and even competition are all possible motivators to achieve something.

To achieve any goal involves effort, and effort involves converting a feeling into action. It’s one thing to feel inspired or enraged, but what do you do with that emotional energy? Are you able to convert it into actions you’re proud of? That ability to convert explains achievement. Some people get a lot of mileage out of a small amount of emotional energy. Others seem to have massive quantities of emotional energy, but it never goes anywhere productive. Having a big ego, if managed well, can be a useful source of energy in achieving things. To do difficult work requires fuel, and ego can burn quite well.

Fundamentally, anyone who isn’t dedicating most of their waking hours to helping others has a big ego. We are mostly self-motivated, with ambitions primarily about satisfying our own needs, and worries mostly centered on ourselves (even our desire to help others can be motivated by wanting to feel good about ourselves). Buddhism, in part, is about learning to diminish our egos – and Buddhists believe this is the path to enlightenment. I think they’re probably right. The problem is you’d need to escape many aspects of American culture in order to do it, but that’s a topic for another day.

Generally when we say someone has “a big ego” we really mean their ego is out of control, and gets in the way. They talk down to people, treat others as inferior, and their sense of self makes them unpleasant to work with. We may actually have bigger egos than they do, but we manage them better. Someone with “a big ego” is likely someone who, for whatever reason, is not self-aware enough to realize their lack of respect for others. Or worse, realizes it but either doesn’t care, or takes pleasure in making people feel bad. When I meet people like this I feel sad – they’re lost. And somewhere deep down inside they probably know it. Odds are good they’re taking it out on everyone else, as they don’t have the courage or the tools to focus that energy on improving themselves.

The goal should be to have a high opinion of yourself, and a high opinion of others at the same time. Some might define this as a healthy psychology. I admit I have a big ego in the sense I’m confident I can do some things well. Is that ambition? Confidence? Whatever word you’d prefer, it’s all tied to the ego.  I do my best not to allow that sense of self to violate someone else’s sense of self. I fail now and then of course, but I’m aware and sensitive to this notion. I also know to grow I have to do things I’m not confident in. I have to put myself in situations where I’m forced to say “I don’t know” or “I’m afraid” and go to places an ego dominated mind would never have the courage to go. Being humble is healthy and creates opportunities you never find if you insist on only doing things you’re confident in.

There are many famous people who achieved big goals with big egos, and were awful to their families, their friends, and their employees. There are also many famous, successful people who were/are miserable despite all their success. Read a biography of Edison, Ford, J.P. Morgan, or any current master of the universe, and you’ll hear similar stories. In chasing an abstract dream (wealthiest man, best athlete, most famous X, etc.) they sacrificed some very obvious and available necessities for a happy, fulfilling life (friends, family, health, community, self-knowledge).

If happiness and fulfillment is the true goal, a big ego alone doesn’t seem to be the way to get it.

In the end, I think of Nietzsche, or what I recall of his notion – the more aware of our feelings we are, and the better we are at converting those feelings into useful, positive actions, the healthier and more successful we’ll be.

Related:

Worst examples of social media BS?

I’ve been looking for choice examples of inflated claims and promises among people working in social media, for my talk next Tuesday, Calling BS on social media gurus.

If you’re someone who believes in the value of Twitter, social media or even web 2.0, who do you think is giving you a bad name? Or making bad arguments?

I’m hoping if you know of a link, blog post or video I should see, you’ll leave a comment.

Disclaimer: I’m just getting started – so far not all of these are bad or even misleading, but they are interesting examples as points of reference. I’m trying to find a wide range to look at and examine.

1. Among other videos and papers I’ve been watching, this popular one is a highly produced, and often fascinating combination of statistics (his blog does list references – kudos) – but I’ll be looking at how they’re used in misleading ways.

2. Joe McCarthy put together this simple collection of stat inflation, where he did some poking around into the papers referenced by articles proclaiming social media growth:

Social Media Adoption By U.S. Small Businesses Doubles Since 2009
Small Businesses Mainly Use Social Media to Identify and Attract New Customers

And in his analysis, it appears they included email as social media in their categorization.

3. Brian Solis’s Unveiling the new Influencers, is a dazzling piece of writing, but has more hyperbole per paragraph than I’ve seen in some time. He’s completely entitled to his opinion, but writing like this is so speculative despite its bold claims, that it deserves attention.

4. This is more in the web 2.0 category, but it was one of the first. It’s quite interesting and well done, and stays clear of misleading stats or other rhetorical tricks. But it still takes several positions and assumptions worth questioning:

What I’m looking for:

This list so far is pretty tame. Do you have other favorites, both honest and manipulative, you want to make sure I study and include in the talk? Videos, papers, blog posts, anything goes. Thanks for the help.


My surprising inspiration (death!)

My friend and artist Teresa Brazen ran an interesting project, called “What moves you?” She asked various people what gets them going. Here’s what I wrote:

I know this sounds morbidly strange but when I’m bored or frustrated or out of sorts, thinking about dying has a surprisingly positive effect on me. When I realize I will die someday, and try to visualize it, imagining the notion this will all be over, my senses vibrate in a way I can’t explain. It’s a crapshoot to be alive at all, and here I am, born at a time and place where I have a million choices, I can read any book, see any movie, visit any art, make, do and feel more things than 99% of all humans that have ever lived, it’s all there waiting for me, right NOW. Confronting the notion of the end of my own life, as far away as I’d like that to be, is one of my most reliable ways to feel moved in the present. To sit and watch TV or wallow in my own hubristic complaints seems unbelievably dumb. And I don’t like to feel dumb.

Kafka wrote “The meaning of life is that it ends.” Every one of my choices matters because I won’t have them forever. Jim Morrison said “I want to get my kicks in before the whole shithouse goes up in flames” and Horace wrote “Carpe diem!” If I’m not getting what I want out of my life while I’m alive, or giving to those in need or those I care about before I kick the bucket, when the fuck do I expect to do it?

So there it is. I confess, I’m moved by the idea of my own death. I want to die regret-free, and the thought of confronting my last moments and having to justify being bored with my own life to myself as I die compels me to make, and passionately appreciate, my choices in the now.

What moves you? Leave a comment. If I’m moved by what moves you, I promise I’ll reply :)

How to call BS on a social media guru (event)

Next week I’m speaking at Seattle’s social media club, on How to call BS on a social media guru.

The basic idea was this: what happens if you combine How to call bullshit on a guru with Calling bullshit on social media, and make a talk out of it?

Event details:
Date – May 25, 2010
Time – 6-9 p.m.
Tickets – $15 includes a drink and appetizers (cash bar will be available)
Register – http://smcseamay.eventbrite.com/
Location – Hale’s Palladium (http://halesbrewery.com/Palladium.htm) – 4301 Leary Way NW, Seattle, WA 98107

I’ll be putting the talk together over the next few days, but if you were to give advice to someone about this topic, what would you advise?

How can someone who doesn’t know much about social media tell the difference between someone credible and someone selling snake oil?

Specifically, I’m looking for examples of the most egregious hype you’ve seen on the web.

How I found my passion

In a series of posts, called Readers’ Choice, I write on whatever topics people submit and vote for. If you dig this idea, let me know in the comments, and submit your ideas and votes.

Travis submitted the following question, and with 30 votes it’s this week’s topic:

What specific, objective things can you do to find your “passion”? Assuming it’s possible to make a career out of pursuing your passion(s), how do you narrow that down to one or a few things?

This is a great but strange question. I don’t think there is a way to do this, and if there is, I don’t know it. Most people seem passionless about their work, much less their lives, don’t you think?

Looking backwards, I see I tried different things. Over the course of my life I’ve tried to spend more time doing things I liked. The magical part is that twice I’ve managed to find a way to make a living doing something I’m passionate about (first with software, now with writing).

My first deep love was baseball. But by age 10 I discovered basketball, and loved that more (much less standing around waiting). It was the defining passion of my life until I was 19. Why basketball? I was athletic, I was competitive, I was better than my friends, and I discovered for the first time that hard work paid off – basketball provided an endless (at the time) path of improvement if I worked hard. But had my brother never made me play basketball at age 8, I never would have discovered this thing I love(d).

I started writing in my junior year of high school. We had a poetry month in English class and I wrote poetry. Turned out I loved writing it – some of it didn’t suck. And I kept writing it on my own after the class ended. Freshman year in college we had to keep a journal for a philosophy course, so I did, and by the end of the course I enjoyed it enough that I’ve kept it up since. These two experiences were pivotal in my becoming a writer. Had I not been exposed twice, I probably would not be writing this right now.

My story with software is simpler. I was smart. My dad got us a computer when I was 12 or 13. I liked it and it made sense to me, so when I wasn’t playing basketball I was on it for hours every day. I majored in Computer Science because I found it interesting and it made sense. I liked designing things, and got very lucky – I was hired at Microsoft to lead teams and design stuff. I managed to get on the IE 1.0 team, which turned out to be kind of important. I was passionate about it because it was fascinating, I was young and had power, it was thrilling to feel smart with other smart people, and I convinced myself it mattered to the world. But after a decade those passions changed, or changed shape, so I left.

I’ve never believed in the idea of a calling. I really hate that idea. Most people can be good at many different things, and live happy lives in many different ways. If you want to find your passion, I’d say put yourself in different situations, with different people, and see how each makes you feel. Pay attention to yourself and write down your responses so you’ll remember. Some of it will bore you. Some of it you’ll hate. But with each experience you’ll have a clearer sense of who you actually are, what you actually care about, and what you’re good at doing.

There are at least four piles of things in the world for you:

  • Things you like/love
  • Things you are good at
  • Things you can be paid to do
  • Things that are important

But only you can sort out which things go in which piles – or, hopefully, find the things that go in all four.

I think growing up we’re fed so many stories about what we’re supposed to like, or enjoy, or find pleasure in, and only some of that turns out to be true. It’s implied you need a great career to be happy, but honestly most people seem pretty damn miserable, including those with fancy careers. You can’t be passionate if you’re living your parents’ dream and not your own, and unless you go out on your own for a while, you are likely trying to live someone else’s dream.

My advice is simple: Pick something. Do it with all your heart. If you can’t keep your heart in it, do something else. Repeat. Few people have the courage to do this, even for a year, much less a lifetime. But my suspicion is if you ask passionate people how they make choices, this is what you’ll hear.

Why the world is a mess: a theory

I’m not looking to pick a fight here about whether the world is a mess or not. I agree with Penn Jillette – the trend line is positive. But there is a basic observation here about why anything at all involving people might be fucked up: families, groups, companies, countries, cultures, etc.

1. People don’t listen.

I don’t mean that their ears aren’t working; I mean it’s rare for person A to genuinely try to understand what person B is trying to say. Instead they’re waiting for their chance to speak. And the fact that people aren’t listening makes the person speaking feel they’re not being heard. So they talk louder and make more noise. But talking louder mostly makes people want to listen to you less, so a vicious cycle ensues, leading to anger, rage, and rash acts, all motivated primarily by the absence of acknowledgment, not the facts being argued.

If ever you meet an angry person, odds are good they’re seeking to be heard, to be acknowledged or validated in some very simple way, and don’t know how to get it, so they’re acting out. It’s amazing how people’s behavior changes when they simply feel someone is truly listening.

2. People don’t read.

I have a short blog post, called How to Write a Book, that basically says writing a book is work, and like all work you just go and do it. This post gets thousands of views daily. It currently has 420+ comments and generates lots of email, much of it in the form of “I don’t want to do the work. Can you tell me how to get around the work?” Which is mystifying. I’m not saying people shouldn’t look for shortcuts, but if you read even one paragraph of the post, it’s clear I’m the last guy in the world to ask. Yet they do. Why? They’re not reading, at least not in any sense of the word that involves thinking.

There is another mental process that seems like reading but is really skimming: looking for the single thing you’re hoping to find, rather than trying to understand what the writer was trying to express and perhaps change your thinking about something. And in the case of my post, even though the thing people seek is not in the article, they ask for it anyway, despite how absurd it is to expect an answer. They’d rather take the time to write a pointless question than read.

To spin the theory around into a conclusion:

  1. If people listened to each other, there would be less anger and unrest.
  2. If people read more carefully, even just a little, they’d be more likely to get what they want, as there’s a chance they’d recognize they’re looking for the wrong thing.

There’s an assumption in our culture that with all the TV shows, and books, and websites, we’re all reading more and listening more, but I doubt it. It’s become increasingly acceptable not to be listening (e.g. staring at your laptop or phone in meetings) and not to be reading (racing to skim as many emails or blog posts as possible in an hour). And I bet that in any culture – a team, a family, a country – where there is more real listening and real reading, people are happier and more successful at achieving things that matter.

But I’ve yet to see someone monetize listening or reading. So the whirlwind of commerce naturally encourages less listening and less reading, but more of everything else.

What do you think?

How Private are Facebook Executives?

Thanks to the New York Times, I got a chance to ask Elliot Schrage, VP of Public Policy at Facebook, about his own settings on FB.

I’d like to ask Elliot, and all the senior staff at Facebook, what are the privacy settings for their own personal Facebook accounts? Can you share the settings (not your personal data, obviously) with the NYT and Facebook users? Scott Berkun, Seattle

Not surprisingly, Facebook senior staff reflect a broad cross section of preferences for sharing and privacy. Because my role is more public, there’s already lots of information about me on the internet over which I have no control on Wikipedia, in news stories and blogs and in other places. These sources include lots of information I might prefer to have private, such as my e-mail address, but I don’t have the power to prevent that information from being available online or in a search index. Perhaps as a result, I use my Facebook profile for more personal information, and take advantage of our controls to target what I share. I’m open to accepting Friend requests from acquaintances and messages from everyone, but I generally restrict my sharing to Friends and members of the Facebook network at work.

Mark takes a different view. He’s more restrictive about which friend requests he accepts, but he’s more willing to share information about himself and what he’s up to with anyone who visits his profile. You can see how my and Mark’s profile differ by checking them out. The settings of other members of our senior management team generally fall somewhere between Mark’s and mine.

Hmmm. I thought I’d asked a pretty tight question, but somehow I don’t feel I got a solid answer. What should I have asked?

You can see the full NYT article here, with questions for him from others. The comments thread is pretty harsh.

Obama vs the iPad (information overload)

At a recent speech at Hampton University, President Obama had this to say about our web 2.0 information age:

With iPods and iPads and Xboxes and PlayStations – none of which I know how to work – information becomes a distraction, a diversion, a form of entertainment, rather than a tool of empowerment, rather than the means of emancipation.

There is simply no worse argument for or against something than the fact that you’ve never used the thing.

And although I agree with him, without trying something yourself it’s silly to have confidence in your ability to discern its value. I criticize Twitter, but I use it just the same, to see if maybe I’m wrong.

For decades we bet that the information age would come and solve all our problems. The information age is here, and it has solved some problems, but:

  1. There are some problems information does not solve
  2. There are some problems created by our easy access to information

I can’t blame an appliance if I choose not to put it down. But the simple fact is that just as America is physically obese because of bad eating habits (34% of Americans are obese), we are mentally overloaded because of bad information habits. I can’t blame Apple, any more than I can blame McDonald’s. They are corporations and their prime directive is to sell. The problem is us, not the devices.

I’ve written about this before in Attention and Sex and, more recently, in The Cult of Busy, where I point to our misguided cultural value around busy people as the cause. The web, the iPad, or whatever is next is just another way for us to manifest that (misguided) value. It’s no longer hard to seem busy; it’s incredibly easy, and it signifies nothing.

Dan Lyons at Newsweek had this to add:

Remember when computers were supposed to save us time? Now it seems just the opposite. The Internet just keeps giving us more ways to do nothing.

We have more information than ever before. We’re never away from it. The air around us fairly hums with it. Computers are all around us too—they’re on our desks, in our pockets, on our coffee tables.

And yet I can’t shake the sense that we are all becoming stupider and stupider—and that we are, on average, less well informed today than we were a generation ago.

Information is cheap. Entertainment is cheap. Social interaction online is cheap. Which raises the question: what is not cheap? What does not change in the face of new media? If the problem of information access has been solved, which it largely has, what are the real problems we need to solve? Whatever they are, they’re the real things that matter – it’s just harder and harder to get down to the core, given how awash we are in irrelevance.

Do sci-fi movies impact the future?

Today I was reading yet another article about how a movie, in this case Iron Man 2, shows a possible future for computing. I think it’s kind of silly to put much faith in devices designed for movies, but that’s not a surprise since I think the future of UI will be boring.

My point is simply that filmmakers use technological ideas in movies to serve narrative and stylistic purposes. The devices are designed to look cool when someone else uses them, rather than for actual use. Things designed to be used by actual people 100 times a day tend to be boring, because they should be comfortable, simple, and natural, and not cause repetitive stress injuries. But that makes for boring style, which filmmakers rarely want. Flying cars, jet packs, virtual reality headsets, AI, and voice recognition – sci-fi staples for years – have little practical market value, despite how cool it is to watch characters in movies use these things.

But in an attempt to call BS on myself, I wondered this: have design or software ideas from sci-fi movies ever become successful real products?

I’m not saying movies haven’t inspired people. Sure they have. Star Wars inspired me to draw, to play with Lego, and to think about all sorts of things. But I never actually turned something I saw in Star Wars into a successful real-world product. And given the lack of working X-wing fighters and lightsabers I’ve seen in the last 30 years, I’m assuming no one else has either. And that’s my hypothesis: not that sci-fi movies don’t have a purpose, just that they’re lousy at predicting the future of anything, much less product design.

Even if there are some examples (Star Trek communicators = cell phones comes to mind), the odds seem bad: sci-fi movies are awful predictors. That’s my bet, but prove me wrong.

The question: can you think of any specific design / UI / software / computer thing that was shown in a movie, and later invented for real, and became successful? Let’s make a list.