
Alan Cooper

Alan Cooper is known for his role in humanizing technology through his groundbreaking work in software design. Widely recognized as the “Father of Visual Basic,” Mr. Cooper is the author of the books About Face 3: The Essentials of Interaction Design and The Inmates Are Running the Asylum: Why High-Tech Products Drive Us Crazy and How to Restore the Sanity. As founder of Cooper, a leading interaction design consultancy, he created the Goal-Directed® design methodology and pioneered the use of personas as practical interaction design tools to create high-tech products that address users’ needs.

Gold rush

I've been watching the new hit TV show "Gold Rush," about amateur gold miners in Alaska and the Yukon. Their struggle to find gold reminds me of the quest for innovation in technology companies. It's interesting to compare the two quests.

Illustration by Scott Cooper

In Gold Rush, a semi-documentary, semi-reality show, big, burly men battle the elements (and sometimes each other) to find gold in the endless miles of wilderness in the 49th state. These days gold is around $1500 an ounce, so a couple of handfuls is all these guys need to have a successful mining season.

Often, all a new technology company needs to become a juggernaut is a couple of handfuls of invention, a few ounces of insight. Google, for example, didn't invent search; they simply added the brilliantly simple idea of ranking search results by the number of references they found. Building their massive search engine and finding a way to monetize their service remained a huge task, but the innovation was just a single nugget.

Ironically, the Gold Rush miners almost never work directly with gold. The big problem in gold mining isn't the gold itself, it's dealing with everything that isn't gold. All of their attention and equipment is focused on the not-gold. While they dream of a few handfuls of yellow metal, their day-to-day world is dominated by countless tons of everything else. For the miners to collect a few ounces of gold, these tough, XXL guys have to bulldoze acres of forest, pump rivers of water, dig tons of rock, and move mountains of dirt. They need giant tractors and huge excavators. They need rock and sand sifting machines the size of houses. They also have to contend with trees, wild animals, harsh weather, cash flow, fickle girlfriends, and internecine friction.

Most of what goes on in innovative companies is the simple hard work of designing, coding, and deploying software. It's the quotidian blocking and tackling of everyday business: finding bugs, getting the pixels right, answering the phone. A single seed pearl of a bright idea can occupy a technical team for a year or more, building software and shoveling an endless wilderness of bits. Regardless of the creative brilliance, building a company or a product is mostly just hard work.

The Alaskan gold is just lying there, pure, untarnished, ready to be picked up and sold. They don't have to coerce or cajole it. They don't need to identify or interpret it. Gold is easy to spot, but it rarely comes in a big, fortune-making nugget. It comes in millions of tiny flakes, deposited over the millennia in ancient stream beds.

Innovation is often the same, made up of thousands of tiny shards of creativity. Like gold, creativity rarely comes in giant dollops of obviousness. It tends to arrive in many tiny increments, only the whole of which adds up to something revolutionary. So, while the miners have to discard ten-ton boulders, the gold flakes hiding underneath must be handled with exquisite delicacy.

Like mining gold, the quest for innovation is dominated by what isn't innovative. Mostly it's cubicles of conventional work, and it's easy for the delicate innovation to be inadvertently smashed by some hard-rock business process. Just like gold mining, business demands a deft combination of brute force and subtle precision, of massive infrastructure and sensitive awareness.

If you visit a gold mine, you won't see very much gold. If you visit a very innovative company, you won't see crowds of shock-haired Albert Einsteins riding around on Segways reinventing the space-time continuum. You'll see teams of young men and women working hard at mostly mundane tasks, moving mountains of information, winnowing their way to something of immense value. What lurks there is a respectful awareness of the unique nature of creativity, and of how to nurture it. Managers who want innovation don't need to demand it; they merely need to not let the mountain-moving of commerce obscure the precious, delicate dust of invention.


The upper bounds to quality

The digital age changes our notions of quality and, in particular, our notions of the limits to quality. Generally, there are two limits to quality. The first limit is your imagination: if you are innovative, you can increase quality in many creative ways. The second limit is what the customer will pay for: if your product is priced too high, even if its quality is superb, you won't sell many.

These two limits to quality have existed since a caveman traded away a stone knife for the first time. The more it costs to make a product, the higher the price you must charge for it. Economists call the tension between cost and price "elasticity."

The elasticity concept has been around since that caveman, but in the digital age, its power is draining away. That's because the notion that price is dependent on cost is an assumption based on the character of industrial manufacturing.

Actually, there are two distinct components to cost: fixed and variable. Variable costs are those tied to each individual product you make; they include the raw materials, labor, and transportation of each object. Fixed costs are all the other costs that cannot be tied directly to a unique object; typically, these include design, engineering, marketing, and administrative costs.

In the industrial age, just as in the days of the caveman, the variable costs were a much larger portion of the total cost than were the fixed costs. That's because design and administration are cheap compared to purchasing, transporting, and transforming steel, aluminum, glass, plastic, and energy. A washing machine, for example, might have taken a dozen engineers six months to design, but building it took tons of steel, hundreds of people, railroads, mines, and factories.

For the washing machine company, elasticity was strong because the variable costs were far, far greater than the fixed costs. The clever business person always paid more attention to driving costs down than to raising quality, simply because cost reduction had such powerful, direct downward leverage on price. Certainly higher quality exerted an upward pressure on sales, but it was offset by the need to raise prices to pay for it. Most customers choose a value compromise, where quality is adequate and price is low. This strong elasticity cemented into business thinking the industrial age idea that quality is expensive. But that relationship has now changed, and quality is no longer so expensive.

The digital age has inverted the relationship between fixed and variable costs. Fixed costs are now usually greater than variable costs, and this dramatically changes the role of price elasticity. When a product is made out of bits, there is no cost to purchase, transport, or transform anything! There are few or no variable costs that can be tied to each individual object for sale. Yes, transforming bits into coherent software is expensive, but it isn't a variable cost. The expense of design and programming is the same regardless of how many copies you sell.
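To make the arithmetic concrete, here is a minimal sketch, with invented numbers, of how average unit cost behaves under the two regimes. As volume grows, the amortized fixed cost vanishes, so the software product's unit cost collapses toward its near-zero variable cost, while the washing machine's cost floor stays high.

```python
# Toy illustration of fixed vs. variable costs (all figures invented).
def unit_cost(fixed: float, variable: float, units: int) -> float:
    """Average cost per unit: amortized fixed cost plus per-unit variable cost."""
    return fixed / units + variable

for units in (1_000, 100_000, 10_000_000):
    washer = unit_cost(fixed=2_000_000, variable=300.0, units=units)   # steel, labor, freight
    software = unit_cost(fixed=2_000_000, variable=0.05, units=units)  # bandwidth, support
    print(f"{units:>10,} units: washer ${washer:,.2f}, software ${software:,.4f}")
```

No matter the volume, the washer's price can never fall below its $300 of variable cost, but the software's floor is effectively zero, which is why "free" is a viable price for digital products.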

When price elasticity weakens, the upper boundaries to quality relax and take on a different character than in industrial times. When variable costs drop to insignificance compared to fixed costs, it means that price can drop to insignificance, too. This can be seen clearly in today's market where the most successful companies, such as Google, Facebook, and Twitter, provide their products for free.

When price doesn't dominate the purchase decision, quality does. When every company's offering is free or nearly so, the customer is free to choose based solely on the quality of his or her experience in using the product.

The two limits to quality are still there, but in the industrial age, cost held your imagination in check. In the digital age, your imagination is free to expand without limit. It really doesn't matter how much time, money, effort, or imagination you invest in your digital product, as long as what you make delights your customers. They will certainly be able to afford it, so you just have to make them want it.

In the digital age the upper bounds to quality are only the upper bounds of your imagination. If you and your colleagues can think more creatively and innovate more effectively than your competition, you will succeed. The more desirable your product is, the less each day of invention will have cost you. In other words, your costs shrink to insignificance when you drive your desirability way up. Really clever post-industrial managers don't pay much attention to costs. Instead they exhort their people to better and more desirable creativity. That is the path to post-industrial success.


Descent into irrelevance

Microsoft's upcoming OS release, Windows 8, will finally replace a vital component that has remained largely unchanged for the last 30 years. It is the BIOS, and it has faithfully performed a simple but essential function: isolating the operating system from its underlying hardware. It quarantines all hardware-dependent code in one location, with a publicly defined interface available to the rest of the operating system. BIOS is an acronym for "Basic Input Output System." It was invented way back in the 1970s by the brilliant computer scientist Gary Kildall, and was one of the more important conceptual breakthroughs that led to the success of the personal computer. Bill Gates copied the BIOS idea in MS-DOS and built his company on its strength.

Thirty years is a long time even for brilliant software, and the PC BIOS has become both a security liability and a performance limiter. It is past time for a replacement, and the upcoming Windows 8 will rely on UEFI instead of the BIOS.

The Unified Extensible Firmware Interface, or UEFI, is much smarter than the old BIOS and, in particular, it can detect boot-time malware. Of course, one way to define "malware" is "any other vendor's product." Microsoft has said that it will not use the UEFI to block legitimate software from other companies, but fears are rising in the industry that it will do just that.

Because most of the UEFI needs to be implemented by hardware vendors, it will be written and deployed by third-party developers, not Microsoft itself. If these third parties want to earn Microsoft's compliance certification, they must follow stringent guidelines. If those guidelines are followed, operating systems other than Windows 8 will not be allowed to boot up. In other words, if your computer, pad, or mobile ships certified for the Microsoft OS, it will not run Linux or any other vendor's OS.
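The mechanism at issue is straightforward to sketch: secure-boot firmware keeps a database of trusted signing keys and refuses to hand control to any bootloader it cannot verify against one of them. The Python below is a purely conceptual illustration, not real UEFI firmware code; the key names are invented, and an HMAC stands in for the public-key signature verification a real implementation would use.

```python
# Conceptual sketch of a secure-boot check (not actual UEFI firmware code).
# A real implementation verifies RSA/X.509 signatures; an HMAC stands in
# here so the example is self-contained and runnable.
import hashlib
import hmac

TRUSTED_KEYS = {"certified-oem-key": b"oem-secret"}  # invented key database

def sign(key: bytes, bootloader: bytes) -> bytes:
    """Stand-in for a real cryptographic signature over the bootloader."""
    return hmac.new(key, bootloader, hashlib.sha256).digest()

def firmware_boot(bootloader: bytes, signature: bytes) -> str:
    """Hand control to the bootloader only if a trusted key verifies it."""
    for name, key in TRUSTED_KEYS.items():
        if hmac.compare_digest(sign(key, bootloader), signature):
            return f"booting (verified by {name})"
    return "boot refused: untrusted bootloader"

certified = b"certified os loader"
other = b"other vendor's loader"
print(firmware_boot(certified, sign(b"oem-secret", certified)))  # boots
print(firmware_boot(other, sign(b"some-other-key", other)))      # refused
```

The industry's fear, then, is about who controls the key database: if only keys approved under Microsoft's certification guidelines are enrolled, and owners cannot add their own, no other vendor's OS can ever boot.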

This is not a particularly onerous limitation for about 99% of the human race. Very few people want to mess around with their computer at the operating system level. It's complicated, dangerous, and unnecessary to do so unless you are a programmer. Ah, but if you are a programmer, it raises a significant question.

Programmers may not be large in number, but they are certainly large in influence. In the 1990s Microsoft rose to overwhelming dominance of the industry for the simple reason that it catered to the needs of programmers. What programmers believe is true affects what other people in the software industry believe, and they, in turn, influence everyone else. If programmers didn't believe in Microsoft, then Windows would rapidly lose its hegemony as a platform.

In the last few years I've seen a remarkable thing: development shops using Linux hosted on Apple computers instead of Windows machines. I wrote about this almost a year ago on my personal blog. If I were Microsoft, I'd be very worried about losing influence in the developer community. Yet, with UEFI, it seems Microsoft is making it problematic to run Linux on Windows hardware, and this may alienate even those programmers loyal to the Windows platform.

I'm sure that executives within Microsoft look warmly on the UEFI as a powerful mechanism for combating what they view as competition. Too bad for Microsoft that the programming community doesn't see it that way. This move could be Microsoft administering its own coup de grâce, sending its remaining stalwarts into the arms of Apple and accelerating its descent into irrelevance.

(Thanks to @BobMacNeal for technical editing)


The inside view and the outside view

It's easy for business people to forget about the great difference between the inside view and the outside view. That is, the experience customers have with software systems is enormously different from the experience business people have deploying those systems. This means that making an otherwise good business decision about software systems can have terrible, unforeseen consequences.

Netflix just learned this lesson the hard way. It doesn't take a rocket scientist to see the progression from VHS tapes to DVDs to streaming video. Netflix built its business by renting DVDs when the competition was still renting clunky VHS tapes. Just a few months ago, the company decided it was time to get a similar head start on the next new technology, but they failed to look at the outside view when they crafted their solution.

They split off the portion of the company that provides streaming video from the older, DVD-supplying part. From the inside of the company, this looked like a really good idea and, from that perspective, it was. It allowed Netflix to offer their streaming video service to customers unencumbered by the older technology. The problem is that this doesn't reflect the point of view of their customers, the outside view.

My wife and I have been happy Netflix customers since they started. We rent DVDs and also stream video from them. As my wife so succinctly said, "I want to go to Netflix to get movies, not to one company for DVDs and another for streaming video." Her sentiments neatly encapsulate the outside view: subscribers think about Netflix as a provider of motion picture entertainment, not as a provider of some particular media.

Netflix learned a hard lesson in the importance of looking at things from the user's perspective, rather than just from their own internal one. This little hiccup has cost them 810,000 subscribers, and their market cap has dropped by over a quarter in just the last three months.

In the old days when the variable costs of manufacturing dominated income statements, what was good for the company was usually good for the customer. Today, when the experience of people is far more important than the cost of raw materials, business managers need to focus on their users, their employees, and their stakeholders, and not on their internal business processes. The way to success is by making customers happy, even if it means more work inside the company's walls. The only way to please people is by carefully studying their outside point of view.


The pipeline to your corporate soul

As a business person, you may consider your software to be an operational tool, part of the sales or operations of your organization. But to your customers, it is a pipeline to your corporate soul. The behavior of your software indicates what is really valuable, what is truly important to your company, and there is really no way to hide.

Websites let your customers access your products and services, but as a side effect, they also access your corporate values. If your website is clumsy or slick, easy or confusing, it tells them a story.

Most clients hire Cooper to solve superficial problems. When they first approach us, they ask us to help make their websites “be more friendly” or their software “easier to use.” Sometimes they just want us to “make it pretty.” In every case, we find that hard-to-use, unfriendly, or even just ugly software is a symptom of deeper problems within the organization.

The culture of fear

Franklin Delano Roosevelt, in his inaugural address as the 32nd President of the United States, uttered his now-famous phrase “The only thing we have to fear is fear itself.” How right he was.

He further identified his target as “nameless, unreasoning, unjustified terror.” He spoke early in 1933, during the darkest days of the American Depression, when millions were out of work, no safety nets existed to help them, and there was no recovery in sight. What’s more, the specter of European Nazism, with its saber rattling and strident, irrational racism, was waxing. In the face of these actual reasons to be afraid, Roosevelt fingered the real danger: irrational fear; fear for its own sake; being afraid simply because it’s easier than not being afraid.

Largely, the nation heeded Roosevelt’s admonition. We refused to succumb to fear; the economy recovered, we vanquished our foes, and we emerged as the world leader for the rest of the 20th century.

Unfortunately, in the 21st century, we have quite failed Roosevelt. We have become a terrified nation and live in a culture of fear. We act afraid and we let baseless fear drive our choices. Mutual trust is the basis of civilization, and our nameless, unreasoning, unjustified terror is unraveling the fabric of our society.

Back to the future with bookstores

The old saying that history repeats itself seems to hold true in the recent history of bookselling.

When the big chain stores of Borders and Barnes & Noble moved into town, the local independent bookstores all quaked in fear or squawked in high dudgeon about how the soulless giant franchises were ruining the business.

Borders failed to compete with Amazon and has since filed for bankruptcy

But the chains taught the independents a valuable lesson: some books are commodities. The price and availability of New York Times bestsellers were more important than the sales clerk's expertise.

The weaker independents closed their doors while the big chains grew fat and happy. The surviving independents continued to disparage the big chains, but the chains delivered a better experience. They added cafes, benches where you could read for hours, and offered a much larger selection of books.

Then the World Wide Web came along, and after some initial jockeying for position, Amazon emerged as the Internet bookseller to beat. Now the shoe was on the other foot. The big chains squawked in righteous rectitude about how they couldn't compete with a company that didn't need to invest in bricks and mortar.

But Amazon taught the chains a valuable lesson: all books are commodities if you already know what book you want; purchasing online was easier, and online vendors could stock far more titles. What's more, the supporting information on the Web was far more valuable than anything a harried, youthful sales clerk could offer.

Both Borders and Barnes & Noble took huge body blows as the new business model assaulted them, but the Web delivered a better experience. Barnes & Noble created their own online presence and has managed to stay in the game. Borders, however, not only failed to grasp their role in the brick-and-mortar world, but also foolishly gave their online business to Amazon, and so filed for bankruptcy last month.

You can't save your way to innovation

What's wrong, you might argue, with keeping costs down? Quite a bit, it turns out. If your objective is to design a product people want to use, or to invent something brand new, you must embark on a journey of creativity and innovation. That might seem like normal, everyday business, but don't make the mistake of trying to run your creative organization like a conventional one.

Business sage Peter Drucker asserted that creative employees "are not labor, they are capital." This has profound implications for the way you should manage and account for your business. As Drucker also asserted, "What is decisive in the performance of capital is not its costs, but its productivity."

In other words, if there is something you can do to enhance the creative abilities of your people, it doesn’t really matter how much it costs, or how long it takes. If it results in a successful invention, or a compelling design, that’s what really counts.

Business people trained in industrial age thinking cut costs from force of habit. After all, expense reduction was an excellent strategy when manufacturing costs were dominant; they are easy to measure, and reducing them provides instant benefits. In the post-industrial age, manufacturing costs are neither dominant nor elastic, so reducing them reduces your quality without improving your desirability. Today, trying to make your product cheaper just makes it frustrating to use and unlovable, without making it any cheaper to buy. It’s no longer a valid competitive strategy.

Subject: Error message when I try to save my PowerPoint

When you work at a design company you are surrounded by designers. They are intelligent and perceptive, have a great sense of humor, and often indulge in good-natured ribbing. They also have Photoshop skills.

There's always laughter in the hallways, funny pictures on the walls, and occasionally the funny pictures make it to email. Here's an amusing exchange that took place on an internal email thread a couple of weeks ago.

One of our smart designers, Golden Krishna, was rebuffed by Microsoft Office with a particularly unhelpful error message. Grabbing a screen image, he clipped it, pasted it into an email, and posted it to his colleagues. Several members of the staff immediately pounced.

Here's what Golden first posted:
PowerPoint found an error that it can't correct. You should save presentations, quit, and then restart PowerPoint.

He was being sardonic, pointing out just how unhelpful error messages can be. Another smart designer, Glen, responded immediately with this doctored error message:
PowerPoint is a piece of crap. You should stop using PowerPoint.

And that opened the floodgates.

Innovation is a waste disposal problem

“The way to have good ideas is to have lots of ideas.” That’s one of my favorite axioms and, in my experience, it is universally true. I have many ideas, every day, and some of them are very good. Mostly, though, they are bad.

A small fraction of my good ideas made it to market, but time spent on good ideas is never wasted. There’s always abundant insight to be gleaned from working on a promising thought, and sometimes working on an auspicious idea can lead to other, even better ones.

I’ve wasted plenty of time, though, pursuing my bad ideas. The time and attention I’ve invested in bad ideas in the quixotic hope that they will somehow morph into good ones has been, by far, my biggest waste. Not only did it cost me time and effort, but I could have been working on something much better instead.

Pursuing bad ideas instead of good ideas is a significant and largely hidden problem of innovation. Economists call the waste “opportunity cost”: the cost of what you didn’t do while you were busy doing something else. That is, what good idea did you ignore while you were busy working on a bad idea? I would argue that opportunity cost is the most expensive cost in all of business.

The obvious solution is to only invest time on good ideas, but that isn’t a realistic solution because of the conundrum of innovation:

Bad ideas often look really good in the beginning, and that’s when good ideas almost always look bad.

For example, the people who worked hard on the Microsoft Zune really thought at the time it was the best music player ever, and many observers of Google a decade ago thought it was just another silly Web startup with an equally silly name.

Frankly, it’s really difficult to find good examples of this phenomenon because of some very powerful cognitive illusions. In hindsight, all good ideas look good and all bad ideas look bad, even though this is not at all the case in the heat of the moment.

Change is good only when it's great

I just changed from Wintel machines, which I've used for over 20 years, to a Mac. I had dragged my feet with Office 2003 so long that people were starting to notice. I could no longer put off upgrading to the "new" Office interface.

Yes, I do not like the ribbon, but that really wasn't the problem. The real problem was that the changes Microsoft made to the Office Suite accomplished nothing and yet came at a high cost.

The new Office UI is very different but not better. That is a complaint only old farts make (because they know the old ways), so Microsoft can just move ahead ignoring it. I wrestled with it for a while, and then I figured, if I have to learn something new, why not learn Mac Keynote? I tried it and found it was a modest improvement over PowerPoint, but it didn't aggravate me as much because, unlike with PowerPoint, I no longer expected it to behave like the old version.

Pip Coburn, in The Change Function, says that users will change when the benefit of changing is greater than the perceived pain of making the change. That's the operative element here. There was no benefit and lots of pain. Microsoft didn't improve PowerPoint; they just moved the deck chairs around. That's pathetic, and not the behavior of a market leader. FAIL.
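Coburn's rule is simple enough to state as a predicate. Here is a toy encoding, with invented scores for the ribbon transition; the point is just that a change offering zero perceived benefit fails the test no matter how small its pain.

```python
# Toy encoding of Coburn's change function (scores are invented).
def will_adopt(perceived_benefit: float, perceived_pain: float) -> bool:
    """Users change only when the benefit of changing outweighs the pain."""
    return perceived_benefit > perceived_pain

print(will_adopt(perceived_benefit=0.0, perceived_pain=4.0))  # the ribbon: False
print(will_adopt(perceived_benefit=7.0, perceived_pain=4.0))  # a real improvement: True
```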

Just for the record, I reject the argument that it is a zero-sum game between experienced and new users. That trade-off does exist, but only when physical manipulation is involved, such as in twitch games, aircraft controls, and the like. Good UI is, in general, good for experts and beginners alike.

I do not believe Microsoft's assertions that the ribbon is easy to learn. If you feed someone rotten fishheads for a while, then switch them over to a diet of fresh fishheads, they will be happier. You can then tout the statistical "fact" that "users prefer fresh fishheads," even though the truth is that they HATE fishheads. That, I believe, is how Microsoft gets its rationale for UI changes.


Will Ford learn that software isn't manufactured?

Ford Motor Company has just convincingly demonstrated that being an excellent industrial manufacturer doesn’t automatically make you an excellent maker of digital technology. Despite Ford’s improvements in manufacturing quality, their overall ratings fell precipitously this year due solely to the poor software interaction on their dashboards. A recent article in the New York Times discusses Ford’s plummeting user rankings this year, placing the blame on their new touch screen interface.

MyFord Touch on the new Ford Edge—heavily criticized in J.D. Power's research and "frustrating" according to Consumer Reports.

According to the article, J.D. Power, the auto industry arbiter, dropped Ford’s ranking from 5th to 23rd place, and subsidiary Lincoln’s from 8th to 17th. J.D. Power acknowledges that both Ford’s and Lincoln’s fit and finish are excellent. It was the “annoying” behavior of their driver-facing interactive systems that caused their ratings to plummet. Other reviewers concur; Consumer Reports yanked its “Recommended” rating from Ford’s new 2011 Edge model.

If users could lead innovation, they wouldn't be users

The recent post on Co.Design by Jens Skibsted and Rasmus Hansen, “User-Led Innovation Can't Create Breakthroughs,” echoes something I have said many times, and I agree with their conclusions.

The article has caused a furor in the interaction design community mostly because it has been misunderstood. That is less surprising when you realize many interaction design practitioners misunderstand their own practice.

Many professional interaction designers and other practitioners interpret the phrase “user-centered design” to mean they should ask the user what to make. This is not what the phrase means, and misconstruing it that way can lead to tremendous misdirection and waste. This is the error that Skibsted and Hansen highlight.

There is a large and growing body of evidence that users don’t know what they want, don’t know what the medium is capable of delivering, and are quite incapable of imagining something new, useful, desirable, or innovative. What’s more, there is ample evidence that users are entirely ignorant of their inabilities, yet will happily give their flawed answers with unequivocal emphasis.

Rather, it is the job of the integrated development team or, in a siloed world, the interaction designer, to answer the question of what to make. Of course, the designer should avail himself or herself of all the intelligence available, which will naturally include observations and interviews of users. But the results of those interviews are to the design solution as grapes are to wine: raw materials that must be transformed by expertise into a palatable product. The apparent conflict of interviewing users yet not following their suggestions confuses many undertrained practitioners, and their resultant miscues are what the authors rail against, as well they should.

I have addressed the dilemma of asking the users in both of my books, in presentations, and in various posts over the years. One of the most accessible is a brief and impromptu interview that was recorded several years ago. I had just delivered a talk at the Patterns and Practices conference in Seattle, when @scobleizer (Robert Scoble) poked his camcorder in my face and asked me several questions. The resulting mini-interview has been widely viewed on the Internet. What I said to Robert was very similar to the assertions by Skibsted and Hansen.

The Fast Company article has generated a furor on the interaction designer chat boards because its wording is broad enough to be interpreted as a slam at the usefulness of all interaction design. I certainly disagree with that interpretation, but practitioners bring this criticism on themselves by their own less-than-rigorous practice.

Ikea and Apple may not ask their users what they want, but they sure work diligently to understand what their users want. There is a world of difference between the two.


Integrating solve and do

The industrial age divided our world into white-collar and blue-collar workers. Those with white collars went to college, worked in an office, solved problems, and made decisions. Those with blue collars went to high school or trade school, worked in a factory, performed work, and followed orders. “Solve” was separated from “do”.

But in our contemporary world of knowledge work, very little can be teased apart into separate “solve” and “do”. Today, doing is an integral part of solving, and solving is an integral part of doing. We are all “no-collar” workers: smart, well-educated, solving problems, and performing work.

The successes of state-of-the-art practitioners in both agile development methods and UX have given rise to a desire for more effective collaboration. Once a programmer has seen what well-applied agile methods can accomplish, she soon begins to yearn for a better user-facing strategy. Likewise, once a designer has seen what good interaction design methods can accomplish, he soon desires to work with a strong development team open to collaboration.

Many advanced thinkers have already tried a first order solution: agile programmers have requested input from designers, and UX designers have attempted to squeeze their work into the timeboxed cycles of agilistas. The results have been promising, tantalizing, but somehow not quite there yet. It has become clear that while both UX and agile are effective methods, a combined “agile-UX” method will be something different from—something beyond—a simple addition of the two.

This past weekend, in a creatively messy office space in Tribeca, two dozen such advanced thinkers got together for the third installment of a working group dedicated to addressing this worthy challenge.

The group, which calls itself the “Agile UX Retreat”, consists of about 50 people, who borrow time away from work to participate. The group actively seeks and invites promising newcomers, but there is a core of twenty or so who attend every meeting.

At last week’s third meeting, the group shifted into a higher gear and made remarkable progress. They weren’t so much “post-agile” or “post-UX” as they were “post-doctrine” and “post-hostility”. The thinking, speaking, and exchange of ideas, vision, and practice were not only of remarkable quality but consistently transcended their component pieces. Beyond talk of design or agile, beyond talk of design and agile, the talk was of what the organization can be—and must be—when everyone in it is committed to the principles of user-centered, collaborative, iterative teamwork.

Much of the gruntwork of figuring out how this new organization works is being done by the “lean startup” folks, spearheaded by the likes of Eric Ries and Steve Blank. Lean startup has really only been practiced in tiny little startup companies, where, when you talk to the “product owner”, you really are talking to the product owner. While this is only a microcosm of the larger corporate reality, it is a valuable test-tube in which experiments can be conducted. In other words, we can learn a hell of a lot about how to run a big company by seeing how this stuff works in little companies.

But the core ideas of lean startup aren’t so much new as they are simply the beliefs of agile and UX, brought together effectively in a business context. Half of the lean startup’s principles are bedrock to agilists, while the other half are foundational to user experience professionals.

Not just the designers and not just the programmers, but everyone has to center their work on satisfying the customers. You need to have sublime confidence that the only way to deliver a successful product or service is by first delivering some version that is wrong, or at least, not quite right. That is, success on the first try is not within the capabilities of humans. Iteration and incrementing are integral parts of a post-industrial approach to product development.

At the union of “solve” and “do” we find the definition of the twenty-first-century business. Even though we are still at the beginning of this journey, it’s clear that we are finally on the right track.


Things I learned at Agile Up To Here

(This was originally published on Playwell, Alan's personal blog.)

Elisabeth Hendrickson has recently opened a new test-and-development training facility in Pleasanton, CA, called Agilistry. It’s bright and airy, well-lit and well-stocked, and it feels like home the minute you walk in. To publicize her new facility, she very generously hosted a week-long intensive learning exercise.

She invited eleven people with widely varied skill sets, backgrounds, and interests. She challenged them to build a website in five days using the best practices of interaction design, agile programming, and test-driven development. We christened it “AgileUpToHere” (#au2h), and it exceeded everyone’s expectations (you can see our results here).

Since it was my 15-year-old homophone web site being rebuilt, I nominally played the role of product owner, but I was also an observer, an instigator, a goad, and a participant. It’s hard to remember when I had so much fun or learned so much. If you want to learn to be great, I strongly recommend Elisabeth and Agilistry.

Things I learned:


  1. After 25 years, it’s time to lose the Windows computer and get a Mac.

  2. Good agile developers are self-confident; confident enough to trust interaction designers to do interaction design without distrustful oversight.

  3. There are lots of programmers who understand that relational databases are not the only approach to solving problems.

  4. It is time to build software.

  5. Test-driven development isn’t fully understood. In fact, software testing isn’t fully understood.

  6. When even the leanest developer in the room sees really high quality BDUF (big design up front) for the first time, they get all woo-woo and want some for themselves.

  7. Getting good software built demands the contributions of many different personalities, competencies, and roles, most of which are new and as-yet ill-defined.

  8. Two programmers pairing can create more and better code in less time than one programmer can (I already knew this, but it’s always good to see it in action).

  9. Even this jaded old fart can still get excited about changing the world.

  10. There are many undiscovered and unfilled product niches on the Web, and one of them is “quality”.

  11. People want a leader with a vision.

  12. Elisabeth Hendrickson (@testobsessed) is a magical woman. To paraphrase Tom Robbins, “she’s been around the world eight times and met everybody twice.” Like a great chef or symphony conductor, Elisabeth knows how to combine the unexpected to create the sublime. She brought together a dozen people from all over the country, each with different skills, background, desires, and expectations, and then she blended them together into a cohesive, happy, effective team.

  13. The pre-written code I arrived with was called “legacy” with a grimace, and was quarantined until discarded. Moral: non-TDD (test-driven development) code is properly regarded as a ticking time bomb.

  14. For interaction design, you can’t have too many whiteboards, made from porcelain-coated steel and firmly mounted to the wall. For agile development, that isn’t such a big deal.

  15. Story-mapping is a major component of the bridge between interaction design and agile development.

  16. Story-tracking software isn’t quite there yet.

To Pivot, Or Not To Pivot

At the recent Startup Lessons Learned conference in San Francisco I learned a new buzzword for a very old concept. When a startup company discards Plan A and moves on to Plan B, it is called a “pivot.”

Pivoting has some nuanced meaning that differentiates it from simply changing directions. Pivoting is a seek-the-light strategy and it is not seen as fixing a problem. What you were doing might have been good, but what you pivot to do will be better. In fact, a startup that doesn’t pivot can be suspected of rigidity.

When I started my first company way back in 1975, I did contract programming. That lasted less than six months before I pivoted to building turnkey accounting systems. Before I had delivered my first (and only) turnkey accounting system, I had pivoted to selling just the software without the system. Each new business model was better than the last.

When a mature company changes its business model, there are costs associated with it, not the least of which is the dislocation to its people, who were probably very comfortable doing what they used to do. In a small, young startup, the costs are insignificant, and there are few if any extra, comfortable people to be dislocated.

Some people have expressed their doubts about the methods espoused at the SLL conference, and some are surprised to find me enthusiastic about them. But it’s important to understand that it isn’t a one-size-fits-all world. In a four-person web startup it isn’t unreasonable to pivot once a month. In a four-thousand-person mid-size manufacturing company it is insane. Just because methods work, it doesn’t mean that they work everywhere.

Overall, the SLL conference wasn’t so much about entrepreneuring as it was a celebration of today’s incredible web-based entrepreneuring environment. The barriers to entry today are so low that they approach zero.

When the cost to play the startup game is next to nothing, the cost of making mistakes is tiny, too, as is the cost to pivot. Therefore, there is little pressure to be correct or even to have a good idea. You can just keep having and trying ideas at little or no cost, and eventually one of them will be good enough for you to build a business. You can pivot your way to success instead of tediously crafting your way there.

The interesting point to ponder is whether this current web-based startup environment will be around more than a couple of years. Is it a brief anomaly, or is it the new business-as-usual? And should you be pivoting towards it? Should I?


Common ground

The biggest problem in software today is that programmers and designers simply don’t work well together. They certainly want to, but each craft sees the problem from its own point of view and, with the best intentions, tries imposing its methods on the other group. But even the agile developers’ sharpest tools aren’t going to work well for designers and, likewise, even the designers’ sharpest tools aren’t going to help programmers.

The solution will be to find some common ground where each craft is open to the best contributions of the other, without either side being forced to sacrifice their inherent strengths.

I believe that the solution, like most big things, will be relatively simple in concept, yet getting there from here won’t be easy.

Today, most of the pathologies of both designers and programmers can be traced to their mutual lack of experience working together. Most programmers will tell you their biggest problem is coping with rapidly changing requirements. Most designers will tell you their biggest problem is unresponsive programmers.

In the modern, agile world, programmers defend themselves against changing requirements by showing customers the program as often as possible, and by being able to make rapid changes to suit the customers’ expressed needs.

Interaction designers defend themselves against uncooperative programmers by doing ever more detailed design and documenting it with greater accuracy, detail, and precision.

But modern, agile programmers can work so flexibly that they don’t need all of that detailed and precisely written design. If designers could just blend into the development team, they could communicate their design directly without the overhead of documentation. They could provide a kind of just-in-time design service to the programmers.

On the other hand, interaction designers can master the driving principles of even the most complex domain so that programmers don’t need to make all of those changes. With a comparatively brief and inexpensive field study, designers can vanquish the changing requirements problem almost completely.

Ironically, the common ground for agile developers and interaction designers is one where the major problem faced by each craft separately is largely solved by the simple presence of the other craft, working collaboratively at a peer level.

That’s really good news for cost-conscious business people (now that’s redundant). Having designers and developers collaborate is very economical. Most of the cost of interaction design is incurred in the documentation and communication of that design. Similarly, most of the cost of software development is incurred in traversing blind alleys while trying to elicit useful guidance from stakeholders. Effective collaboration simultaneously eliminates the two most expensive parts of product development, while driving quality—and user desirability—through the roof.


An Insurgency of Quality

Dave Hussman, one of the leaders of the post-agile movement, recently hosted a one-day conference on the topic of “Redesigning Agility” and invited me to give a plenary talk. The focus of both the conference and my talk was how to integrate agile development with interaction design. I was very pleased with how things went.

Here you will find the complete text of my talk, entitled “An Insurgency of Quality”, along with all of the slides I showed. I made a few ad libs, but mostly stayed with the script to ensure that my message would not be misunderstood.

The conference, called “Code Freeze” (due to it being January in Dave’s home town of Minneapolis), was sold out and the audience was razor sharp. The attendees were developers; that is, mostly programmers, but with lots of designers, coaches, testers, and managers, and not a few who wore several of those hats.

This talk is a complement to one with the same title I delivered at the IxDA's Interaction08. That one was directed at designers; this one is for developers.


My vision of Agile

Lots of ivory tower software experts cheerfully follow their own muse, but in the world of business, the dreams of money-makers are usually in conflict with the dreams of geeks.

In the business world, software developers have always been the whipping boy. In commerce, the delivered software never matched the envisioned software, and the technologists got the blame. Executives have always been unhappy about their inability to effectively direct and exploit software development. The only tool that seemed to get results for managers was to keep programmers on a ridiculously short leash by allocating resources in tiny increments. The results weren’t good, but they tended to prevent colossal disasters, which was, apparently, good enough for business.

Over many years, in self-defense, programmers increasingly hunkered down to protect themselves. They aggressively lowered the expectations of their managers. They tried to commit to the least possible performance to avoid blame. All they really accomplished was to avoid good performance.

Is incremental design the wave of the future?

An old friend and former client — let's call him "Paul L." — sent me an email question the other day, asking “Is incremental design the wave of the future, or just a flash in the pan?”

When I finished my response, I thought that it might be of interest to a wider audience. Here is the exchange.

Alan,

My wife teaches computer science at Menlo College. She has been teaching software engineering based upon the traditional cycle of specification, design, development, testing.

I have seen some Research Channel TV shows that talk about incremental design. Google and Intuit are two companies that seem to be having some success with ID.

My wife and I have been talking about ID as compared to the traditional methods. During the discussion I thought about my friend, the Design Guru. I am wondering if you have any thoughts on ID. Is it the wave of the future or just another flash in the pan?

Thanks,
Paul

Paul,

As agile methods take over the programming world (and they will), EVERYONE else will adjust accordingly. The old paradigm of everyone hunkering down and protecting their turf from everyone else is what gave rise to the "traditional cycle" (which is, by the way, uniquely ill-suited to software construction and design).

The new (agile) paradigm isn't at all defined yet, but it characteristically includes a) Generation Y programmers; b) a refreshing belief in the potential for change; c) the understanding that satisfying human users requires special efforts and probably special skills; d) a belief that software should be built in continuous increments; e) a corresponding belief that everything else in the world relating to software would benefit from such continuous increments; f) that building software is a team endeavor; and g) that nobody has solved these problems before.

Thinking outside the inbox

There’s a meme floating around the interWeb called “Inbox Zero,” the gist of which is that we should not be slaves to our email. That’s a fabulous sentiment and I agree wholeheartedly.

Merlin Mann, the creator of Inbox Zero, has some truly excellent advice on how to think about your email, your inbox, and yourself. In particular, not feeling guilty about deleting messages or sending terse, one-line replies are golden rules. Not to put too fine a point on this, but I agree without reservation with the principles and practices of Inbox Zero.

Yes, and.

I believe that Inbox Zero is a human operational method for dealing with fundamental shortcomings in the software we are forced to use. The very fact that we have an “Inbox problem” is prima facie evidence that the software bringing our email to us isn’t really designed with our goals in mind.

Predictably Irrational

Behavioral economist Dan Ariely’s delightful first book, Predictably Irrational, heaps yet one more shovel of dirt onto the fresh but deep grave of traditional, rationalist assumptions about human behavior. The book is a simple, personal, easy-to-read account of Ariely’s research conducted over the past 15 or so years. This research was conducted at his various host universities, all of them paragons of ivy-covered scientific rigor, including MIT, Stanford, the University of Virginia, and the University of California at Berkeley.

The clear and inevitable conclusion of his dozens of research papers summarized in this book is simple: humans don’t make rational decisions. What’s more, the irrationality of their choices isn’t random, but can be predicted and measured. While many of the experiments deal with choices regarding cash, several of them cleverly divorce themselves from money to clearly demonstrate that the goofy human behavior is human-related, not cash-related.

He identifies several predictable forces that act upon humans during decision making, causing them to make irrational choices. These include the distorting effect of similar, but slightly inferior, products offered for sale; the distorting effect of simply thinking about numbers; the distorting effect of items offered for free; the distorting effect of sexual arousal; social norms, ownership, procrastination, self-control, clinging to options, expectations, and being observed.

Uncle Bob, craftsmen and the Agile Manifesto

Bob Martin’s rousing keynote speech at the Agile08 conference in Toronto entitled “Quintessence” proposed a small but significant addition to the Agile Manifesto, a seminal document in the programming world. Uncle Bob, as he is affectionately called, proposed adding the assertion that we value “Craftsmanship over crap” to the manifesto. The idea is excellent, and the wording bold, but it isn’t quite a complete sentiment, and Uncle Bob addressed this issue in his blog.

Bob Martin at Agile 2008

Shortly after delivering his speech, Uncle Bob stated in his blog,

The problem with my proposal is that it is not a balanced value statement. In the other four statements we value the second item. We just value the first item more. But in my proposed addition, we simply don’t value crap at all.

He goes on to propose rewording it as “Craftsmanship over Execution,” but admits that it still doesn’t capture his meaning precisely. He then asks the blogosphere for help. My response follows.

Alan on the radio

By day, Brad Brooks is a technology executive in Vancouver, BC. By night, he is a popular local talk radio host. Brad recently read my book, The Inmates Are Running the Asylum, and became a convert to the concepts I wrote about a decade ago.

He quickly asked to interview me on his show. Brad clearly sees the problem and its solution, and the interview neatly recaps the basic ideas in the book. The sad thing is that so little has changed. It all just means that we have to continually beat the drum for design; otherwise we will drown in hard-to-use high-tech products.

You can listen to the interview on the Brad Brooks Show Web site.


Why I read my speech at Agile08

Some attendees at the recent Agile08 conference were put off when it appeared that I was reading my speech rather than delivering it offhand. (If you're interested, you can find my slides and speaker's notes here.)

It’s true; I was reading my speech.

When I speak to groups of interaction designers or business people I often address them extemporaneously. It’s a style I enjoy very much and feel that I can do well.

However, the Agile08 audience demanded special treatment. Not only was it large, but it consisted primarily of programmers, agile coaches, and product managers. These professionals are bright, knowledgeable, critical, and opinionated. They do not suffer fools gladly. I was coming to them as something of an outsider, not having programmed for a living in years, and never having programmed in a canonically agile shop.

Alan's keynote at Agile 2008

I was asked by the leadership of the Agile 2008 Conference to give the closing keynote address at their annual conference in Toronto. The audience at Agile08 consisted of about 1500 programmers, engineers, product managers, and others involved in the creation and deployment of software primarily using Agile methods.

My belief in the value of detailed written design has often led enthusiasts of Agile to assume that I am an adherent of the obsolete, and justly maligned, waterfall method of software construction. I was pleased to have this opportunity to state my position with clarity and precision, not to mention making the case for effective collaboration between interaction designers and Agile programmers.

Here are the slides and accompanying speaker's notes for my talk. Some in the audience noticed that I was reading from my notes during the presentation. If you're interested in why I chose to do this, read this Journal entry.

If you would like to discuss this presentation with me, either post a comment to the Cooper Journal blog or email me directly at alan@cooper.com.


Whither interaction design consulting firms?

Is interaction design done by consultants or employees?

When Cooper was launched as an interaction design consulting firm in 1992, the answer to this question wasn't at all clear. However, as the 90s drew to a close, I confidently predicted that the bulk of the world's interaction design would be done by consultants. I based this conclusion on the proliferation and success of interaction design consulting firms. I assumed that the industry would follow the model of building architecture, where major design projects are typically performed by outside consultants, and architects on corporate staff act primarily as liaisons and project managers. And for the first few years of the 21st century my prediction appeared correct. Today, I wonder if I called it wrong.

More and more I see corporations both large and small with their own in-house interaction design staffers. In fact, in a broad sense, my company competes with our own clients for qualified designers. There are still many successful interaction design consulting firms, but I see an ever increasing number of design projects handled completely by internal design talent, and successfully at that.

This, of course, brings up the thoughtful question of "Whither interaction design consulting firms?" What will their role be in the next decade? Will the pendulum swing the other way, and clients find that it is less expensive to hire designers on a project-only basis instead of keeping them on staff full time? Or will the consultants find themselves working only on fringe projects that are too large, too small, too complex, or too unique?

I don't yet know the answer to these questions, but I'm leaning towards the idea of an ever-more specialized role for interaction design consultancies. What do you think?


Design engineering: the next step

Software construction is slow, costly, and unpredictable.

Slow is a big problem because markets and technologies are always in rapid flux and our ability to maintain pace is critical to our competitiveness.

Costly haunts business people because every precious dollar spent on software design and construction is typically a dollar that is difficult—if not impossible—to directly attribute to a measurable benefit.

Unpredictable is by far the nastiest of these three horsemen of the software apocalypse. Unpredictable means 1) you don't know what you are going to get; and 2) you won't get what you want. In badness, slow and costly pale in comparison to unpredictable. While well-tempered business people are loath to part with either time or money unnecessarily, exchanging time and money for an asset that generates an offsetting future flow of revenue is the essence of business, and "slow" and "costly" are relative terms. If something costs millions and takes years it can still be considered excellent business if the return is tens of millions annually over dozens of years. However, exchanging time and money for something that doesn't generate an appropriate flow of revenue is bad. Very bad.
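To put that trade in numbers, here is a minimal sketch with invented figures: a slow, costly, but predictable project still pays handsomely, while the same spend with no offsetting revenue is simply a loss. That asymmetry is why unpredictability, not cost, is the nastiest horseman.

```python
# Toy return-on-investment arithmetic (all figures invented).
def net_gain(cost: float, annual_return: float, years: int) -> float:
    """Lifetime return minus cost, ignoring discounting for simplicity."""
    return annual_return * years - cost

# "Slow and costly" but predictable: $5M spent on construction,
# returning $10M a year for 20 years, is still excellent business.
print(f"${net_gain(cost=5e6, annual_return=10e6, years=20):,.0f}")  # $195,000,000

# Unpredictable: the same $5M with no offsetting revenue is a pure loss.
print(f"${net_gain(cost=5e6, annual_return=0.0, years=20):,.0f}")   # -$5,000,000
```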

About Face 3: Foreword

The industrial age is over. Manufacturing, the primary economic driver of the past 175 years, no longer dominates. While manufacturing is bigger than ever, it has lost its leadership to digital technology, and software now dominates our economy. We have moved from atoms to bits. We are now in the postindustrial age.

More and more products have software in them. My stove has a microchip in it to manage the lights, fan, and oven temperature. When the deliveryman has me sign for a package, it's on a computer, not a pad of paper. When I shop for a car, I am really shopping for a navigation system.

More and more businesses are utterly dependent on software, and not just the obvious ones like Amazon.com and Microsoft. Thousands of companies of all sizes that provide products and services across the spectrum of commerce use software in every facet of their operations, management, planning, and sales. The back-office systems that run big companies are all software systems. Hiring and human resource management, investment and arbitrage, purchasing and supply chain management, point-of-sale, operations, and decision support are all pure software systems these days. And the Web dominates all sales and marketing. Live humans are no longer the front line of businesses. Software plays that role instead. Vendors, customers, colleagues, and employees all communicate with companies via software or software-mediated paths.

2nd edition foreword excerpt: The Inmates Are Running the Asylum


In my recent travels I have noticed a growing malaise in the community of programmers. Sadly, it is the best and most experienced of them who are afflicted the worst. They reflect cynicism and ennui about their efforts because they know that their skills are being wasted. They may not know exactly how they are misapplied, but they cannot overlook the evidence. Many of the best programmers have actually stopped programming because they find the work frustrating. They have retreated into training, evangelism, writing, and consulting because it doesn't feel so wasteful and counterproductive. This is a tragic and entirely avoidable loss. (The open-source movement is arguably a haven for these frustrated programmers—a place where they can write code according to their own standards and be judged solely by their peers, without the advice or intervention of marketers or managers).

Programmers are not given sufficient time, clear enough direction, or adequate designs to enable them to succeed. These three things are the responsibility of business executives, and they fail to deliver them for preventable reasons, not because they are stupid or evil. They are simply not armed with adequate tools for solving the complex and unique problems that confront them in the information age. Now here I am sounding like I'm slamming people again, only this time businesspeople are in my sights instead of programmers. Once again, to solve the problem one must deconstruct it. I'm questing after solutions, not scapegoats.

The origin of personas

The Inmates Are Running the Asylum, published in 1999, introduced the use of personas as a practical interaction design tool. Based on the single-chapter discussion in that book, personas rapidly gained popularity in the software industry due to their unusual power and effectiveness. Had personas been developed in the laboratory, the full story of how they came to be would have been published long ago. But their use evolved over many years, both in my own practice as a software inventor and architectural consultant and in the consulting work of Cooper designers, so that full story has never been told. Since Inmates was published, many people have asked for the history of Cooper personas, and here it is.

In their book Fire in the Valley, authors Paul Freiberger and Mike Swaine credit me with writing the "first serious business software for microcomputers" as far back as 1975. Like so much software of the time, it was terribly hard to use, and its real power was in demonstrating that making software easy to use was harder than everyone thought. Despite my commitment to making software more user-friendly, it wasn't until 1983, and about 15 major business and personal applications later, that I began to develop a more effective approach.

A breath of fresh air

When all you have is a hammer, everything looks like a nail. If you have never seen a wrench or a screwdriver you will have a hard time seeing what you need, even once you discover that your hammer does not work very well on bolts or screws. This makes it hard to break away from tools that do not serve you. Under pressure, companies tend to fall back upon what they know, so they often end up trying to solve problems with the same tools that got them into trouble in the first place. When this tactic threatens to choke an organization, we call it "breathing your own exhaust."

Right now, many companies see an opportunity to approach product creation from a fresh perspective. With the frenzied dot-com "business model" no longer a distraction, and the recession apparently easing, these companies are looking for ways to benefit from their painful experiences and create a better crop of products and services. They want to nurture customer loyalty by building products that please their customers, rather than following fads or stacking up long lists of features that no one really wants. Everyone knows pleasing customers is the right thing to do, but how do you really do it?

Navigating isn't fun

The artless Websites created during the Web's infancy were of necessity built only with simple HTML tags, and were forced to divide up their functionality and content into a maze (a web?) of separate pages. This made a navigation scheme an unavoidable component of any Website design, and of course, a clear, visually arresting navigation scheme was better than an obscure or hidden one. But many Web designers have incorrectly deduced from this that users want navigation schemes. Actually, they'd be happy if there were no navigation at all.

Today more than ever: the lost chapter of The Inmates Are Running the Asylum

Computers and their software participate in every aspect of the precious and delicate relationship between the company and the customer. The typical customer first learns about your product from an email advertisement or a computerized mailing. He visits your website to find out more about it. He buys it from your online store, and you ship it to him by FedEx, where he uses software to track it on his PC. Once delivered, even if your product isn't 100% software, it very likely has some silicon intelligence inside it. When your customer can't figure out how to work it, he calls your company on the telephone (which is itself a computer). The first thing he hears is the stilted and artificial voice of your automated call distribution system, instructing him "for technical assistance, press one now." Finally, the software puts him in contact with a real human, only for him to find that this person, trained by software, is merely echoing instructions from a problem-report database program running on his computer!

The second-order effects of wireless

Even though "wireless" is the hot buzzword on the lips of every high-technologist, the effects of the technology hold far more interest than does the technology itself.

Wireless freedom is intriguing: It isn't hard to imagine a world of perpetually perambulating people with cell phones clamped to their ears and styli firmly gripped in their fingers doing at the cinema or the next table over at Il Fornaio what they could formerly do only at their desks.

But this flexibility to work where you want is just the first order of change wrought by these new tools. Far more interesting are the second-order effects: those unintended consequences of a new technology that often have a more powerful impact on society than the more obvious first-order changes.

The iteration trap

High-tech companies are in a hurry—as well they should be—but many hurt themselves by trying to move products out the door too quickly. I often hear executives repeat homilies like "Ship early, ship often," and "Launch and learn." They assume that there is no penalty for simply slapping something together, shipping it, and then upgrading their product or site in a rapid iteration cycle. Unfortunately, there is a big, hidden cost associated with this tactic.

Rapid development environments like the World Wide Web have promoted the idea of simply iterating many versions of a product or service until something works. Arguably, the Web is in its nascent stage and companies are still experimenting to see what works and what doesn't, yet this should not be an excuse for iteration without planning, nor should "speed to market."

The perils of prototyping

Which is harder to change: a program with 1000 lines of code or a 1000 square foot slab of concrete? The concrete is ten inches thick and has steel reinforcing rods criss-crossing within it. Every cubic foot of it weighs almost 100 pounds. The software has almost no physical existence at all. It weighs nothing. It consumes no space. A few microamps and those bits flip from zero to one without a second glance. The answer to my question seems a simple one, doesn't it?

Which is the best medium for designing software: Visual Basic or a sharp pencil and a couple of sheets of paper? Visual Basic is a powerful, flexible integrated development environment. It is on its way to becoming the most widely used language ever. It has won every industry award there is. Paper is not interactive. Paper offers no palette of pre-made controls. It just lies there and you have to do all of the work. The answer to my question seems a simple one, doesn't it?


Contact

To work with us

tel: +1 415.267.3500
Talk to the man
Want a direct line to the big guy? Here's your conduit. Alan Cooper: alan@cooper.com

Careers

Cooper is always on the lookout for the best and brightest talent. Feel free to take a look at our current career opportunities.

Site

To send feedback about our site, drop a note to our web team. An actual human will respond.

Cooper

100 First Street
26th Floor
San Francisco, CA 94105
tel: +1 415.267.3500
fax: +1 415.267.3501