Book smarts vs. Street smarts

In a series of posts called reader's choice, I write on whatever topics readers submit.

[Note: Polarizing questions are silly since rarely in life do you face truly binary choices. Both is often an option. But they are fun, so please assume someone took my lunch and refused to give it back until I picked a side. Also see: The false dichotomy of false dichotomies]

There is no doubt in my mind street smarts kicks book smarts ass. To be street smart means you have situational awareness. You can assess the environment you are in, who is in it, and what the available angles are. Being on the street, or in the trenches, or whatever low to the ground metaphor you prefer, requires that you learn to trust your own judgment about people and what matters. This skill, wherever you develop it, is of great value everywhere in life, no matter how far from the streets you are.

Perhaps most important, being street smart comes from experience. It means you’ve learned how to take what has happened to you, good or bad, think about it, and learn to improve from it. The prime distinction between street smarts and book smarts is who is at the center of the knowledge. On the street, it’s you. In a book it’s you trying to absorb someone else’s take on the world, and however amazing the writer is, you are at best one degree removed from the actual experience. Street smarts means you’ve put yourself at risk and survived. Or thrived. Or have scars. You’ve been tested and have a bank of courage to depend on when you are tested again. Being street smart can lead to book smarts, as the street smart sense what works and what doesn’t, and adapt accordingly.

Book smarts, as I’ve framed it, means someone who is good at following the rules. These are people who get straight A’s, sit in the front, and perhaps enjoy crossword puzzles. They like things that have singular right answers. They like to believe the volume, and precision, of their knowledge can somehow compensate for their lack of experience applying it in the real world. Thinking about things has value, but imagining how you will handle a tough situation is a world away from actually being in one (As Tyler Durden says in Fight Club – “How much can you know about yourself if you’ve never been in a fight?”).

Like the stereotypical ROTC idiot in war movies (e.g. The Thin Red Line, Aliens) who outranks the much more competent and experienced, but less well-pedigreed sergeant, the book smart confuse pretense with reality, and only learn of the difference when it is too late. Or worse, even after the fact, they insist on seeking out more books and degrees rather than recognizing they are trying to improve the wrong skills: they are half blind by their own choice, since they insist on looking at the world with only one eye.

I say all this as someone who has a deep love for books, and who has some degree of what might be called book smarts. But it’s that knowledge, used in service of street smarts, that best explains whatever I’ve achieved in life.

 

Obama, Palin and teleprompters

I saw on CNN today more about Sarah Palin’s use of handwritten notes on the palm of her hand. This story is stupid and pointless. It’s just as dumb as the people who criticize Obama for using teleprompters while using teleprompters themselves, which includes nearly every newscaster on every TV network (ironically, Sarah Palin criticized Obama on this too).

In speaking, the ends justify the means. The average speaker sucks. The average politician isn’t much better. If using index cards, crib sheets, teleprompters or whatever allows you to be good, then use it. If it gets in the way, then don’t. End of story.

Some people say it’s bad form to use notes. To them I say you’re a nitpicking jackass.

Should Americans get more vacation?

In a series of posts called reader's choice, I write on whatever topics people submit and vote for. If you dig this idea, let me know in the comments, and submit your ideas and votes.

This week’s reader’s choice post: What’s the impact of 60 hour work weeks and only 2 weeks of vacation on American companies? (submitted by Lynn – thx!)

The running joke at any big corporation is the phrase ‘work/life balance‘. Anywhere that needs to make a special phrase like this is by definition a place populated by workaholics. You’d never hear people talk about ‘work/breathing’ balance, or ‘work/clothing’ balance, because work never puts a supply of oxygen or a shirt on your back in question, unless you’re a workaholic naked astronaut or something.

It’s interesting how we Americans are fond of taking pride in our freedoms, yet when it comes to time off we are among the least free in the Western world. It’s typical in Europe to get 4-6 weeks off, commonly taken in the summer. This explains, in part, why Europeans have a deeper sense of their own culture, as they actually have time to learn, experience and enjoy the parts of life not spent in front of keyboards or in meetings.

Frankly, hours are a lousy way to measure value. If I can do great work in 5 hours, work my peers at best do in 10, that’s not my problem. I should be rewarded for results, not how much time it took me to get them. A good manager knows this. Good companies know this too. My best managers made clear they didn’t care about the HR policies for time off, or hourly reporting. They knew I’d be motivated to work hardest for them if, after I got my stuff done and had done it very well, I was free to do as I wished. (Oddly, in cultures like this, I tended to stay late and keep working because I enjoyed my work so much.)

The impact of the 60 hour work week, or any rigidly defined number of hours, is that smart people loaf around. Rather than be efficient, clever, and wise, and go home, people feel obligated, and in some cases are rewarded, to linger, to pretend, and to give pretense about how long it takes to actually do things. This is all kinds of bad. We should reward people who kick significant ass and then go home. Early. Not those who pull all-nighters for things that were never that complex to begin with. All sorts of goodness happens when managers learn to reward results, not effort. And this starts by getting past the stupid pretense of effort known as hours.

Miserly vacation limits are juvenile, short-term thinking. They assume that time off is bad for the company, and put faith in the notion that doing things outside of work is an indulgence. God bless the Puritans, as we are still victimized by the prudish stink of their ideals. We want to be whole people, and being whole means having an identity beyond work. We are more than our jobs. Two weeks of vacation makes a bet that employees won’t be around that long, so why invest in their long term happiness? If they burn out, it’s not our problem. That’s what two weeks of vacation says to me.

A major reason I quit my job in 2003 was to have complete control over my TIME. The only measure of life you cannot get more of. I did not want some corporate policy, written by someone I’d never meet, defining how most of my waking hours on planet earth would be spent. The older I got, the clearer it became that I’d rather make less money and take on more risk than willingly give away control over MOST OF MY LIFETIME. Especially if the thing I was spending all that time making was mediocre, forgettable and far from what I’d call reaching for my best possible work. But enough about me.

Certainly for any creative field, which many knowledge worker type companies claim they are, time away from work is where much creative growth happens. It’s away from work that people have new experiences, see new places, ask new questions, and learn to appreciate the life they’re working so hard to get. When people return from vacation they are better people, not worse (explaining the wise philosophy of rock star web firm Jackson Fish). And they bring new energy, perspective and ideas back into the company, all things that are essentially priceless.

The objections to more time off typically are:

  • I didn’t get it, so why should you? This is bad arguing. Just because something sucked in the past doesn’t justify it sucking now. A tradition of suffering and stupidity isn’t worth defending.
  • If people get more vacation our projects will die! Good managers manage. They can handle working around people’s vacations just as they do already. And of course when to take time off should always be a negotiation between the boss and the worker. Somehow in the U.S. we all know Thanksgiving to New Year’s is a dead zone. Yet we’re still here.
  • This will mean the end of the world! Yes, the sun will explode and we will all die someday, but this has nothing to do with how much vacation we get or don’t. In fact, should the Vogons arrive after you finish reading this post and announce the destruction of the earth, I’m certain that near the top of your list of gripes would be wishing you had used more of your vacation, and had been granted more to use.

My bet is, in a well run company with a good manager, if you:

  1. Drop the 40/50/60 hour a week expectation. Treat people like adults.
  2. Clarify the results you want from your staff
  3. Increase people’s vacation days by 50 to 100%
  4. But, and here’s the rub, demand everyone still do the same amount of work they already do every calendar year

You can pull this off without any noticeable decrease in performance. I’d even bet you might see some increases in work quality, as people have real motivation, are freed from the pretense of being busy, and will love their lives so much more and bring some of that love to work with them every day. Why not try this as an experiment for a year?
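
To put rough numbers on the bet (my own back-of-envelope arithmetic, not data from any study, and the 250 working days per year figure is an assumption):

```python
# Back-of-envelope arithmetic (my assumptions, not data from any study):
# a US full-timer with 2 weeks off works roughly 250 days a year.

def required_daily_gain(extra_vacation_days: int, working_days: int = 250) -> float:
    """Percent increase in daily output needed to keep annual output constant."""
    return (working_days / (working_days - extra_vacation_days) - 1) * 100

print(round(required_daily_gain(5), 1))   # 50% more vacation (+5 days) -> 2.0
print(round(required_daily_gain(10), 1))  # 100% more vacation (+10 days) -> 4.2
```

In other words, even doubling a standard two weeks of vacation only asks for about 4% more output per working day, which seems well within reach for people no longer pretending to be busy.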

Other variables worth trying:

  • Let employees choose salary increases vs. more time off. I understand the cost to a company to have people on salary who aren’t working. Fine, come up with a number and let the employee decide if they want raw income increases, or time off increases. Put the equity they’ve earned into their hands and see what they do with it. Then it is truly up to them, without the bean counters complaining.
  • Or work the other way. I never understood why workers can’t give up some % of their salary for additional time off if they want it. Unpaid vacation should be part of every serious company’s benefits plan. It’s a win-win.
  • Stop hiding behind sick days: I don’t understand the accounting, but I’m sure some bean counter has done the math. People don’t use all their sick days, so the more you can push days off into that pile, the better it is on some spreadsheet. “Personal days” and other crap are sneaky ways to attempt to influence behavior. Be on the level.
  • Sabbaticals make sense. Part of why I quit Microsoft in 2003 was that I knew I needed a few months to figure things out. At one point I’d have preferred to stay with the company, at no pay, just to give me some security and the option to stay while I mulled it all over. But this required secret handshakes with executives that I never learned. It made my choice easy: I quit.

It’s surprising, but few companies I’ve heard of have ever experimented with different approaches to vacation and unpaid leave. If you know of examples and case studies, please leave a link.

So what do you think? Am I insane? Has being independent warped my demented brain? Or is there plenty of room for more time off without betraying the bottom line?

Related: See my essay, work vs progress.

Twitter reconsidered

I wrote a post in June of 2009 called Calling Bullshit on Social Media. The goal of the post was to put twitter, and facebook, into an honest perspective, given all the hype and idiocy surrounding the phrase social media. It was picked up all over, as echo-chamber articles about social media often are, and has well over 100 comments and links to it.

In the six months since then my use of twitter has increased, warranting a follow up post.

I don’t retract what I said – but now I have more experience to explore similar points.

Stats: I’ve been on twitter for 7 months (@berkun). My follower count has doubled to 3000+ since the above post, while I’m still following about the same, ~350. Total tweet count is 1500+, an average of about 7 tweets a day (although I’m not on every day).

  • Despite its problems, the fact is people who like spreading information use twitter. I’m an independent writer and need all the mediums I can find to spread my work. My blog has thousands of subscribers, my books have sold thousands of copies, but posting a link to new writings on twitter spreads faster and, seemingly, wider. Even if all the criticisms are true, the people currently on twitter are people who like to spread things. And they do. It’s heavily populated by people who like to forward, email, tweet, post, blog, telegraph and anything else. I’m sure my traffic and book sales have benefited from being active on twitter. And I’m grateful for readers on twitter, just as I am for the blog and the books.
  • Twitter is fun in bursts and handy on the road. There is a breezy, sarcastic, side-comment-rich flavor to twitter, which makes it enjoyable if you’re on it enough. This seems possible only if you’re staring at a monitor most of the day, which many are. But if you’re not, twitter won’t make much sense. I doubt taxi cab drivers or anyone working retail will ever be a strong part of the mix. Twitter is fun as a break, as an aside, but if you show up expecting an event it doesn’t make much sense. If you travel often, a decent following guarantees someone can recommend something you need in that place, which is handy and life affirming in a good Samaritan kind of way. But it’s still something a concierge could do nearly as well.
  • It’s clear many people are free (or distracted) much of the time. It is amazing how quickly, during the work day, I see things retweeted, or get comments on my blog posts that originated from twitter. I’m grateful for this of course. It’s awesome and empowering. But what’s curious is the twitter crowd seems to have notifications for everything on all the time. Someone needs to do some ethnography on the daily work habits of twitter users, but by observation there are many who jump in and out many times in a half hour, suggesting they’re jumping in and out of their actual work frequently.
  • Some of the positives are artifacts of the new. In the early days of email, it was amazing who you could get to answer you. This was, in part, because few were using it. Some of the thrill of twitter, where you can chat with various famous people, will decline as usage grows. It’s more an artifact of new media than the medium of twitter itself. It’s still a new frontier, and some of its charm will decline with each wave of mainstream users, in similar fashion to how email and the web changed as their small town frontier charm faded. It’s easy for megacorp to seem authentic on twitter when there’s one guy online representing them. But when there is a team of 30 doing it, with the inevitable policies and protocols, it will feel like something more familiar, and less interesting.
  • 140 characters actually does prevent discourse. Twitter is great for snarky jokes, and for pointing people at things, but is a disaster for deep conversation. You haven’t had the full twitter experience until you’ve stumbled into an argument with someone who is incomprehensible and angry, and seems to find you equally incomprehensible and angry, even though, outside of twitter, you are neither. Direct messages are just as bad. I wish twitter were attached to a private chat feature free of the 140 limit, so attempts at deep conversation, or arguments where both sides don’t get the context of the other, could migrate and thrive, run their course, and then return people back to twitter.
  • You can easily spot people confusing life with a popularity contest. It doesn’t take long to realize many people with huge followings have nothing to say. There are some good reasons people follow others, but bad ones too. Mostly it’s easy to figure it out. Some of it is taste. Some of it is not. If the signal to noise ratio is off, look elsewhere. If someone feels slimy, they probably are. You often can tell if someone is being genuinely nice, or is just trying to manipulate you into some kind of reciprocation. I try to say hi to people who mention my work, simply because I’m sincerely grateful. But sorting out people’s intentions on twitter isn’t much different than the rest of life.
  • Twitter breaks often. It’s disturbing how often twitter acts strange, is broken in major ways, or doesn’t work at all. It’s understandable for something new, or experimental, but twitter is neither. The client apps are unreliable, and need major UI help. I’ve reverted to using the web page, which sounds primitive to twitter die-hards, but it’s the fastest and most reliable interface there is.
  • The elements needed most in this age are clear communication, patience, and wisdom, which are all in short supply. All media depends on the minds of the people who use it, and twitter is definitely a reminder that many folks either: a) don’t read what they link to, b) don’t understand what they read, or c) don’t really care and just like pushing bits around. I don’t blame twitter for this. Twitter spreads misinformation just as quickly as real information, simply because people do. No technology can ever distinguish between a lie and the truth. However, twitter is faster and sloppier, which has advantages, but also has natural disadvantages. It doesn’t reward the patient and thoughtful. It’s definitely not a tool for encouraging thinking, questioning, or introspection (the innovation I am waiting for), as the spreading of links is not quite the same thing. It’s quite possible twitter makes those three things harder, given how tempting twitter makes it to just read the next link.

In summary, I’m a reluctant, cautious fan. I don’t expect anything to radically change anything else, but it’s sensible for me to use any new media that helps spread my work.

I don’t believe the hype, but I do see results for some of the things I need to do to be successful. I do get pleasure now and then in connecting with new people I don’t know, or joking with folks I’ve met on the road.

If you use twitter, has your opinion of it changed over time? And if you haven’t tried it, what would it take to give it a serious spin?

My talk at Google HQ, on Confessions

I wrote a few weeks ago about how my talk at Google about Confessions was the toughest room I’d had all year (with photos, and countermoves). The room was adjacent to a noisy cafeteria, at lunchtime, and exposed to a busy hallway of folks on their way to and from the cafeteria: bad news all around.

The funny thing is, video is flat. It evens out the highs and lows, and the audio track isn’t mixed: there’s only what you hear through my mike. If I didn’t tell you, you’d never know how tough that room was, and how much of an impact the background noise had on me, and the vibe in the crowd.

Below is a picture I took of the reverse view, where you can see the cafe behind, and the hallway on the left, 10 minutes before the talk began.

[photo: google-tour-reverse]

Well, enough whining. I’d be a hypocrite if I didn’t show videos of talks I didn’t enjoy. So for your curiosity, here’s the video of the talk. The Q&A, which is better, starts at 34:00.

Why Job Interviews are Flawed

Why does anyone believe they are good at interviewing people? If you ask experienced managers you’ll often hear a set of well-worn pet theories about how they interview people. Yet most interviewers, no matter how experienced they are, make instinctive judgments based on cognitive and other biases. And then afterwards, when reporting on the qualities (or lack thereof) of the candidate, they back-fill logical reasons to support the intuitive responses that they’re in denial about. It’s no surprise that most job interviews are deeply flawed and unfair experiences.

Contributing factors include:

  1. Few are self-aware enough to sort out their biases. Few people possess the self-awareness to realize why they instinctively like or do not like someone they’ve just met. And even fewer, especially in business and engineering circles, feel comfortable with their feelings. It’s considered unacceptable to say ‘the guy did well but I didn’t like him for reasons I can’t explain’. It’s much easier to hide that feeling inside unfair judgments, using whatever flavor of corporate jargon can be found in the official hiring criteria (Lacked intellectual horsepower, couldn’t deal with ambiguity, didn’t know the secret handshake, etc.)
  2. Talking about doing is not doing. Most interviewers focus on trying to extract a prediction about someone’s ability by having them talk about their ability. This is ridiculous. Could you evaluate an NFL running back by asking them questions about how they run? (e.g. “I run really really fast”, “Great, you’re hired.”). Better interviewers work hard to put candidates in problems and situations like the real ones they’ll face, and watch. They collaborate on real problems during the interview, as that’s what much of work is. Over time they’re able to calibrate what it means for a candidate to do well, given real problems, in an hour. But this requires skill and patience few interviewers have. And even when they do, the candidates are in an awkward and artificially stressful environment that does not approximate real work well, unless the interviewer is diligent about compensating for these issues (tip: hiring candidates for a trial project is often a better use of everyone’s time. Get a sample of them actually doing the job).
  3. Interviews work better as a filter. The job interview loop is more effective at eliminating bad candidates than identifying good ones. The bet is by the end only good candidates remain, but that’s not true. Like bacteria responding to antibiotics, strains of bad candidates that are immune to your process survive as well, and are hard to distinguish from good ones. The process can be prone to false negatives too (people who get rejected but would have thrived).
  4. Recommendations are underestimated. Since interviews are mostly bullshit, it makes sense to put more weight on a recommendation from a trusted person (not necessarily the names on the candidate’s resume) who has worked with the candidate somewhere else. They have first-hand experience of the millions of things that can only be witnessed outside of the interview room. If you trust them, and they trust the candidate, that may have more predictive ability than 60 awkward minutes in your office.
  5. No one else saw what happened. Interviewers are free to lie and distort, intentionally or not. All interviewers are free to invent pet theories on which questions work best, or how good they are at extracting the value of a candidate. They are the only record of what they asked, how they asked it, and how the candidate performed. If they have bad habits that bias the candidate, no one will ever know, as the candidate has almost no ability to report on the interviewer. Every interview is a cat and a mouse trapped in a room, and the mouse is motivated to do whatever it can to survive the cat, no matter how cruel or unfair the cat is.
  6. We never go back a year later and evaluate. The hiring loop at nearly all companies is broken, as there is no feedback loop. No one forces you to go back 6 months or 2 years later and see how many of the hire decisions you made worked out well, and how many of the people you rejected kicked ass at other companies with similar cultures and needs. With no data, the value of any interview process is guesswork, not rigor (a sketch of such a feedback loop follows below).
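
Here’s a minimal sketch of what that missing feedback loop might look like, assuming you logged each loop’s verdict and later recorded how the person actually performed. The structure and field names are hypothetical, my invention for illustration; I know of no company that records this:

```python
from dataclasses import dataclass

@dataclass
class InterviewOutcome:
    candidate: str
    predicted_hire: bool   # the interview loop's verdict at the time
    performed_well: bool   # observed 6-24 months later, here or elsewhere

def accuracy_report(outcomes: list) -> None:
    """Compare interview verdicts against later real-world performance."""
    hired_but_struggled = sum(
        1 for o in outcomes if o.predicted_hire and not o.performed_well)
    rejected_but_thrived = sum(
        1 for o in outcomes if not o.predicted_hire and o.performed_well)
    print(f"false positives (hired, struggled): {hired_but_struggled}")
    print(f"false negatives (rejected, thrived): {rejected_but_thrived}")
```

Even a crude tally like this would turn an interview process from folklore into something you could actually improve.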

Despite my affinity for this theory, I believe groups that take interviewing seriously, and leaders who reward interviewers for putting more time and careful thought into interviews, end up with better teams. The choice to hire someone is the most important decision you make that month, or year, and the wise know this. At a start-up it can make or break the company. And the more seriously people take the process, even if it’s flawed, the higher the odds they’ll recognize the natural shortcomings above and invest in minimizing them.

While I don’t think all interviews are a crapshoot, I agree with the Winchester theory – in most interviews, most of the time, it’s mostly bullshit as a tool for truly evaluating how well a person would perform in the job, even if the people doing the interviews don’t intend it to be.

The real story is in who you recruit to interview in the first place. Better candidates in, better candidates out: see my essay how to interview and hire people. What is your take on interviewing? How do you work against the Winchester theory?

[This post was inspired by Royal Winchester who, when asked about interviews, offered: they’re mostly bullshit]

Woody Guthrie’s New Year’s Resolutions

I’m often lazy with resolutions. Some years I give myself until the end of January to sort out exceptional goals for the new year. In studying what researchers say works best, I stumbled across this list.

Apparently Woody Guthrie, one of my heroes, had a different problem. He didn’t seem to do too well with setting clear priorities. He lists 33 things for the year 1942. Inspiring, given the results, nevertheless.

 

Guthrie’s New Year’s resolutions (from the official site), 1942:

  1. Work more and better
  2. Work by a schedule
  3. Wash teeth if any
  4. Shave
  5. Take bath
  6. Eat good – fruit – vegetables – milk
  7. Drink very scant if any
  8. Write a song a day
  9. Wear clean clothes – look good
  10. Shine shoes
  11. Change socks
  12. Change bed clothes often
  13. Read lots good books
  14. Listen to radio a lot
  15. Learn people better
  16. Keep rancho clean
  17. Don’t get lonesome
  18. Stay glad
  19. Keep hoping machine running
  20. Dream good
  21. Bank all extra money
  22. Save Dough
  23. Have company but don’t waste time
  24. Send Mary and kids money
  25. Play and sing good
  26. Dance Better
  27. Help win war – beat fascism
  28. Love Mama
  29. Love Papa
  30. Love Pete
  31. Love Everybody
  32. Make up your mind
  33. Wake up and Fight

Also see: New Year’s resolutions that work

Challenging Newton’s Apple

Recently The Royal Society put up on the web their copy of the best evidence in the world for the fabled story of Newton watching an apple fall. NPR picked up the story here.

Most Americans know the violent version of the story, where Newton actually gets hit on the head by the apple. This bit was 100% definitely a fabrication, added to the story more than 100 years after Newton’s death (likely by Isaac Disraeli).

But what about the actual event? Was Newton’s entire theory of gravity inspired by an apple? Mostly no. It’s a subject I researched heavily for chapter 1 of The Myths of Innovation. Newton was not famous for discovering gravity, but for writing the Principia Mathematica, which explained, mathematically, how universal gravity functioned. It took him many years to write this book, despite the legend of his life centering on a singular moment of insight.

The primary evidence for the “apple event” is in a book The Royal Society posted titled Memoirs of Isaac Newton, written by his friend William Stukeley. But it’s a dubious record of facts:

  • Stukeley was Newton’s admirer and interviewed Newton in 1726. Newton died in 1727.
  • Stukeley published his ‘Memoirs of Newton’ in 1752 at the earliest, meaning Newton, and many of Newton’s contemporaries, never saw the book or confirmed its facts.
  • Biographers, certainly in the 1720s, were not objective reporters checking facts. They were often fans of their subjects, as Stukeley was of Newton.
  • At the time Stukeley and Newton talked they were sitting under apple trees.
  • The ‘event’ Newton supposedly described to Stukeley happened 60 years earlier.
  • There are few other first-person sources anywhere, in Newton’s journals or other biographies, for ‘the event’. (Please comment if I’m wrong.)

In James Gleick’s excellent biography, Isaac Newton, he strongly suggests Newton offered the story as a metaphoric anecdote, a way to express his curiosity about the world, rather than as a literal tale about a specific, singular moment that redefined his view of things.

Now my point is not to say epiphanies never happen. Instead it’s that they rarely eliminate the hard work and risk required to manifest the idea in the world. Even if Newton’s apple event took place as the legend described, Newton still worked for years to complete his theory on gravity. The moment did not spare him from years of effort to manifest the idea born from that moment.

Frankly, I don’t trust Stukeley, even though he was apparently a good friend of Newton’s. I wouldn’t trust any biographer/friend interviewing someone famous late in their life who somehow manages to be the only one told a story about something that happened decades ago, a story the famous person never mentioned in any of their own extensive journals and writings, or in interviews with other people. I can guess Stukeley wanted Newton to look good. He also wanted his book to be read (though the publishing history of the memoir is unclear). And in the spirit of those two things, some exaggeration of facts and conversion of abstract anecdotes into real specific events would not be surprising.

An article at The Independent, one of the few pieces this week to do any research at all, offers this report from an expert at the Royal Society, which owns the manuscript:

“Newton cleverly honed this anecdote over time,” said Keith Moore, head of archives at the Royal Society. “The story was certainly true, but let’s say it got better with the telling.” The story of the apple fit the idea of an Earth-shaped object being attracted to the Earth. It also had a resonance with the Biblical account of the tree of knowledge [only the word fruit is used in Genesis, but Western imagery has made the apple icon in the tale], and Newton was known to have extreme religious views, Mr Moore said.

I’m surprised that in the history of science so few people have raised any questions at all. I’d love to see the web help me round out the facts, and find experts and others familiar with the sources. Spread the word.

How to Stop Overcommunication

Spolsky’s latest piece is about Brooks’s law, and how adding people to projects can make them worse.

For those unfamiliar with it, Brooks’s law states that when you add a person to a project, you geometrically increase the amount of communication people have to manage, suggesting it’s a bad idea. While I agree with the law, there are important exceptions I’ve identified – depending on who the person in question is (elite or bozo), how good they are at jumping into tough territory (ninja or bozo), and how much they already know about the project (familiar or bozo newbie). Spolsky’s points are generally sound, but I believe there’s a deeper cause for over-communication.
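
The arithmetic behind the law’s communication claim is standard combinatorics (my illustration, not Spolsky’s): a team of n people has n(n-1)/2 possible pairwise communication paths, so each added person grows the coordination burden quadratically, not linearly.

```python
def communication_paths(n: int) -> int:
    """Unique pairwise communication paths on a team of n people (n choose 2)."""
    return n * (n - 1) // 2

for size in (3, 5, 10, 11, 45):
    print(size, communication_paths(size))
# 3 -> 3, 5 -> 10, 10 -> 45, 11 -> 55, 45 -> 990
```

Going from 10 to 11 people adds 10 new paths; a 45-person committee has nearly a thousand.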

The reason committees are so miserable to work with is that authority is distributed across a large number of people. This makes everyone feel like everyone needs to know about everything. And worse, people fight in the backroom to obtain control over the committee, so the visible authority and the real authority can be far apart.

Over-communication is a symptom of lack of clarity over power. If you want better communication, clarify the following:

  • Who is the single person who has decision making authority for decision X
  • Who should have input into that decision
  • Who should be informed when the decision has been made

This sets everyone’s expectations for who needs to know what. It reduces the endless forwarding of FYI material in the hopes someone might need it.

The person with decision making authority should be collaborating with others, and can delegate their authority, but no one should ever be confused about who has the power to make the call.
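
One concrete, and entirely hypothetical, way to write that down is a minimal decision record; the structure below is my own sketch, not a standard and not anything Spolsky proposes:

```python
from dataclasses import dataclass, field

@dataclass
class DecisionRecord:
    """Makes power explicit: one decider, named advisers, a notify list."""
    decision: str
    decider: str            # the single person empowered to make the call
    input_from: list = field(default_factory=list)  # consulted before deciding
    inform: list = field(default_factory=list)      # told once it's decided

record = DecisionRecord(
    decision="Which database to use for project X",
    decider="jane",
    input_from=["dev lead", "ops lead"],
    inform=["everyone else on the team"],
)
```

The point isn’t the code; it’s that the decider field holds exactly one name.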

45 people cannot effectively make a decision together. But 44 people can counsel one wise, empowered person to make a more effective decision.

Like Spolsky, I agree things would be better if there were 5 people in the room, instead of 45, but the clear distribution of power is the problem I’d solve first.

Countdown to 1000 posts

This post, according to WordPress, is #952. I have about 50 more to go to hit the 1000 mark.

Since my posts tend towards new material, rather than just a link and a sentence, this is a shitload of words.

I’m grateful to all the folks who subscribe, read, forward, comment and even snark here, as this blog has been a critical part of my successful independent life so far.

I’d like to do something fun here when I hit post #1000. Open to suggestions – leave ’em in the comments. Thanks.

How to be passionate (when you open your mouth)

Vijay recently asked in the comments on a recent talk:

Thank you for a great presentation.  I noticed that your energy was explosive and there was absolutely no point in the presentation where I could detect a lull. I am interested in learning if you have any secrets or techniques in maintaining the focus of not just the audience, but also yourself as I often space out even when I am working on something that I am passionate about.

Explosive energy makes me think of being a drummer in Spinal Tap. Perhaps I should tone it down.

There are four things going on.

  1. My life is at stake. I have bet I can make a living on my ideas and my ability to express them. I have no guarantees, no salary and no pension. Every time I write a blog post or a book, or give a talk, I’m basically an entrepreneur. I’m not half invested. This isn’t a side project. THIS IS IT. I need people to buy my books, hire me to speak, and to tell others about me. When you’ve invested your heart in something, it’s much easier to appear passionate about it, because you are.
  2. I believe what I say. I really hate phony people. I hate people who water things down, intentionally mislead, or pretend they care about things they don’t. How much of what is said at work do people truly believe or care about? I think very carefully, and long, about most of what I create, and so when the time comes to give a presentation, or write a book, my points are things I truly believe.  And I’ve worked hard to make them concise. I’m not holding much back because I know it’s easier to get excited about things you deeply believe, especially if they’ve been boiled down to their essence. If you asked me to talk about my favorite tax software, or which 401k forms I liked the most, passion would be hard to find.
  3. I’ve extended my range. If you can only play one note on your guitar, you can’t do very much. Musicians, especially singers, practice to extend their range. Most speakers have a narrow range. They only know how to get from volume level 4 to 5. If you practice, and listen to other great speakers carefully, you’ll notice how wide their range is. They can whisper (volume level 2) or almost holler (volume level 7). You also have a range of gestures, and postures, and facial expressions. The wider your range the more tools you have to express passion, or curiosity, or humor, or anything. You extend your range through practice and coaching. I never want to be too passionate, as it’s easy to sound like a preacher on cocaine or Billy Mays. Instead my goal is to be at a high level of enthusiasm without crossing over into annoying.
  4. I have great respect for anyone who voluntarily listens to me. Speaking and writing are very subjective, and I know that reasonable people might not like me, or what I have to say. But their sense of how much energy and effort I put in is something undeniable. I never want to be dismissed by people for not being sincere. They can hate me, prove me wrong, heckle me, whatever, but at the end of the day I don’t want anyone leaving the room, or finishing one of my books, feeling like I gave half an effort. Frankly, any speaker is burning way more calories per second than any listener, but most listeners forget this: it’s a consumer’s market when it comes to things to consume.

Hope that helps. Let me know if it doesn’t.

For reference, here’s me speaking at Ignite:

The Limits of Innovation

My post on why the future of UI will be boring has upset people. I clarified that although my prediction is depressing in a way, I don’t make it because I like it. It’s simply that I believe the most honest view of the future is that there are limits to progress in everything; we just don’t know what they are.

One of the best critiques of what I wrote is from Baldur Bjarnason:

What I’m arguing is that automobiles are specialized machines and that automobile users (AKA drivers) have years of highly specific practice which creates a lock-in quite unlike that you see in computing. Computers are different sort of tools with different sort of uses. Umbrella handles and door knobs do not require skill or training and have converged on a simplistic optimal design.

The lack of change in the industries you cited have nothing to do with what might hold back innovation in computing. That is, they did not prove your point in any way. They were non sequiturs.

They are non sequiturs only superficially. I believe there is a pattern of limits on change that affects most industries, ideas, and things. Physical interfaces in particular have natural limits on how they are likely to work. It’s possible to overcome them, but it’s unlikely.

Here are some more seeming non-sequiturs, but bear with me for a moment.

There is a reason all airplanes have wings. The contact point, or interface, between the plane and the air has certain unavoidable properties. Although wings have changed much in 100 years, they are fundamentally the same kind of interface model, or metaphor. Why? The fundamental principles of physics at work have not changed. If someone asked, as Gruber did, “why have we had the same interface model for airplanes for 50 years?” I’d give the same answer. And yes, I know we have helicopters. Arguably a chopper blade is a kind of wing, but that’s a thin, ha ha, pun intended, argument. A better argument is this: have they replaced airplanes?

And even if we invented a superior wing that was all loops and circles and made of lightweight edible alien space cheese/steel, how long would it take for the majority of airplanes to use that new design? 10 years? 30? It takes a long, long time for people to replace things in sufficient quantities to become the dominant design. It’s expensive too. And it’s not great for the environment to replace working things with new things that are only marginally better. It may blow your mind to consider the history of innovation in electric plugs: why so many that do the same thing? The social costs of change outweigh the thrill of “innovation”.

Cars and trucks have wheels, a wheel being perhaps the most efficient way to bring power from the engine into reliable contact with a road. It’s a safe bet, given the fundamental nature of roads and engines, that wheels will be with us for a long time, even as cars change greatly in other ways. Wheels are about 3000 years old. I’m doubtful we will innovate our way out of using wheels for most heavy moving objects. The wheels might be made of unobtanium, or fried chicken, or there might be two wheels or 40, but wheels they will be.

When it comes to desktop computing, the human body is the interface. Given the constraints of our fingers, hands, elbows, and eyes, and our need to use character based languages, plus the dominant activity of reading and writing when at our desks, there are constraints on what kinds of designs can work well. Minority Report UI and VR are unlikely to ever be dominant for ergonomic reasons (I’m still looking for any evidence from makers of these things that they are ergonomically sound) – fun to watch, fun to play with, good for niche situations, but if you did it for hours a day your elbows would melt. They also don’t do much to improve email, or reading web pages.

Odds are good most change will occur in places where there are fewer physical limits.

I’d concede there is more room for UI innovation in mobile computing, or even in gaming, than desktop computing, simply because unlike the desktop, we’ve spent much less effort designing for those physical environments and interactions. There is way more uncharted territory to cover, but even there, the human body is one unavoidable constraint to big ideas. New things born from dreams inevitably interact with the limitations of hands and eyes, constraining the range of designs likely to be successful.  And even if those mobile and gaming devices see radical new discoveries, only some will transfer back to the different constraints of desktop computing.

Shell wrote, in the comments:

Will we still be using keyboards in 20 years time? Probably, in some form. We also still use pens and paper after 1000s of years too.

The UI of a pen is a great example. The reason why pens are the same has less to do with what’s possible technologically, or that there aren’t amazing ideas out there, and more to do with the limitations of good interfaces for human hands. The highest-tech pen I’ve seen, the LiveScribe, which has a ridiculous number of interesting ideas in it, has the same core physical interface and interaction model as the ball point pen I used in high school.

And the last argument for the limits of innovation has to do with human nature. Why we choose to adopt things is not a logical process, and is fueled by culture, psychology, timing, and a dozen factors, many of which have little to do with new idea X being better than old idea Y in technological or design terms. Those are terms technologists and designers obsess about, despite history’s strong suggestion that those factors are overestimated in their role in what becomes dominant, and when.

Consider the simple question of what it would take for Apple, Microsoft and Logitech, three companies that dominate the business of keyboards, mice, and GUIs, to abandon those well understood designs and businesses in favor of something new. Or for a new venture to choose to compete against the entrenched powers of those three. And then to be successful at it beyond a niche capacity. The human elements around adoption of innovation are just as formidable as the technological or design ones, especially when we’re talking about the wholesale replacement of one metaphor for doing things with another.

Progress is great. Show me something better and I’ll champion it with all my heart. Explain to me the problem we need to solve and I’ll advocate for its elimination at the top of my lungs. I’m thrilled to see experiments, and risks, and people who say damn the odds, I’m building it anyway. I’m a damn-the-odds kind of guy. But before we herald anything as the next whatever, let’s be honest about what we have, what the real problems are, what’s involved in change, and what’s likely to happen. I can’t see any other credible way to improve the odds of progress.

Live, Free webcast TODAY on speaking

(Updated: this is today. It’s free. If you’re in Seattle you can come down to creativetechs and watch live.)

This should be fun – the folks at creativetechs have a fantastic tech setup for webcasts which promises to be higher quality than most. If you saw the last webcast and enjoyed it, please help spread the word.

You can tune in from anywhere in the world, or if you’re in Seattle come on down and watch live.


Learn inside tricks and hear entertaining stories that will make you more persuasive and less nervous when you speak.


CrowdSourcing Star Wars: A New Hope

This is uber-clever – Star Wars Uncut. They divided up the original Star Wars movie into 15-second chunks, and anyone can sign up to recreate the video for any section.

Star Wars Uncut

The UI is a little funky, but click on any scene with a blue box, and watch the version someone made. If you want to grab a scene, click the Find a scene button near the top left.

It’s a total patchwork – some are reenactments. Some are hand drawings. Some are downright strange. But I will definitely give it a watch when it’s done.

I suspect it will be hilarious for 5 minutes, interesting for 10, and then I’ll skip around for a minute, and then do something else.

How do you get motivated?

Note: In a series of posts, now called reader's choice, I answer reader questions.

I’m going to cheat here, as I wrote a nice tasty essay on this very question: How to stay motivated – give it a spin. I think you’ll like it.

I think “how to stay” is a better question, since I know many people who are great at starting something, but once the initial wave of enthusiasm wanes, and the easy/fun parts are done, their interest fades. For me I gain motivation by being committed for the long haul. I don’t care if I get a bad review, or a tough thread of comments on a post, as long as I learn something. I don’t care if I fail, provided I grow. I’m focused on the 50 or 80 year old version of me and how I’ll feel when I look backwards. Given that view, many of the things that upset or discourage other people seem to have slightly less impact on me.

I work hard to put things in the long view. A paragraph isn’t just a bunch of sentences, it’s part of a book, or a body of work. Just as a brick isn’t just a pile of mud, it’s part of a cathedral, or a school, or a monument to some great cause. I have an empty shelf on my bookcase for my books. Filling that shelf is my life goal. If ever I’m confused about how to prioritize work or why I’m working, there it is. On a personal level I work on the elimination of distraction theory. It’s not so much about whether I’m motivated or not, it’s how good I am at keeping myself from other things. Motivation isn’t a problem you have if you are starving and need to eat, or are cold and need shelter. You just do it because it must be done.

This sounds tough, and it is. I don’t know any novelist or marathon runner who debates every time they start whether they’re going to do it or not. They try to reach a point where it’s assumed they’ll do it, even when they’re not motivated. Discipline and motivation are tightly coupled for me. One big trick was to quit my job. If you have to do X to make a living, motivation becomes less of an issue. It simply IS. If I want to keep writing, I have to write. End of story. When it comes to tasks that are “hard”, like writing, I eliminate other variables. I close the door, I close the web browser, and promise myself I will either sit and write, or stare at a mostly blank monitor for an hour.

Given the choice, much like starving, eventually my mind would prefer to actively write, rather than sit and stare at nothing. So I write. I also believe in the theory of daily practice. Anything truly important is worth doing once every day, even if just for 5 minutes. If I’m actively writing a book, I must look at it or work on it once every day. Then I never have to worry about thinking about it. I just do it, in the same way you go to the bathroom or eat meals. It just IS. 5 minutes of doing is much better than an hour of thinking about doing. Rituals of this kind are good as they spare you the burden of inventing reasons every day.

If you make a New Year’s resolution, part of it has to be to do that thing once a day. Another trap is the zero-sum problem: when someone tells me they have a wish, or a New Year’s resolution, I ask what they’re taking off of their plate to make room in their life for this new thing. Maybe it’s less TV, or less aimless web browsing, but motivation is easier if the choices are clearer. If you don’t make room, you’re letting your motivations compete with each other, and that can often have the side effect of negating them completely. Right now I need to follow my own advice. When I’m between books I’m all over the place and it takes weeks to find my center and rhythm again. But I see this more as a problem of discipline rather than motivation.

Anyway, do check out the essay How to Stay Motivated.

A rant about women

Clay Shirky has a rant up about women and the gender-based expectations of behavior. The general theme is how he’s noticed men are typically more self-aggrandizing than women, and that women would be better served if they represented themselves more aggressively.

As Shirky’s work usually is, it’s a good read. The comments are rough, as folks are responding less to his sentiment (there is an unfair problem that should be fixed) and more to the specifics of his arguments, which, given the title, are easily read in a gender-biased way.

But there’s one passage that bounced around in my mind more than others:

And it looks to me like women in general, and the women whose educations I am responsible for in particular, are often lousy at those kinds of behaviors, even when the situation calls for it. They aren’t just bad at behaving like arrogant self-aggrandizing jerks. They are bad at behaving like self-promoting narcissists, anti-social obsessives, or pompous blowhards, even a little bit, even temporarily, even when it would be in their best interests to do so. Whatever bad things you can say about those behaviors, you can’t say they are underrepresented among people who have changed the world.

First and foremost, I have met plenty of women who act like self-promoting narcissists. I’d never say this was a common trait, but I have definitely seen it. Someone in the comments suggested going backstage at a beauty show or field hockey game. I’ve seen it in every job I’ve had, both in men and in women.

Second, I’m not sure changing the world is worth the price of having to work with a person of any gender who is a pathological asshole or pompous blowhard.

Third, the correlation between being an arrogant self-aggrandizing jerk and changing the world doesn’t mean the jerkiness is the cause of the impact. In most cases it’s their unwillingness to compromise, combined with superior ideas, that does the work. You don’t need to be a narcissist, you just need to be confident and effective. Being a jerk is perhaps the worst way we have to be confident and effective.

Also, there is changing the world vs. being famous for changing the world. The latter is the one where all the self-aggrandizement and narcissism matters more.

I think it’s a fair bet to say most people, most of the time, struggle with the line between self-promotion and self-respect. Much of the advice on how to balance this applies well to both genders I think.

This isn’t to say that women do not have unique challenges, or a more difficult line to walk; it seems clear they do.