Michael Mandel, chief economist at Business Week, believes that innovation is "the only game in town." In this time of financial meltdown and economic crisis, he believes that the only way we can pull out of this mess is for innovation to prevail in our culture.
I agree with his intention, but most of his suggestions left me scratching my head. Michael suggests that we need new economic policies (tax incentives, etc.) to encourage companies to be more innovative. He also suggests prizes, and encourages new technology development. While these suggestions can't hurt, and may even help, I find that a lack of incentives or ideas is not the problem, and I don't think they will have the deep, meaningful impact that is needed.
In my experience, the problem lies in the fact that most companies don't know what to do to be more innovative. They know that in order to grow or be more profitable they will need to innovate. That is a great incentive. There are also more than enough ideas to go around. What is lacking is that most companies do not know how to guide innovation efforts in a way that will be valuable in the market. They also have a difficult time managing innovation efforts within cultures that need to manage predictable processes and outcomes. To be more innovative, companies need to find ways to reward behaviors that encourage innovation, without discouraging the maintenance of business. To be more innovative, companies need to be encouraged to take on projects whose outcomes cannot be defined before the project is started. To be more innovative, companies need to learn to identify problems before they search for technologies. Otherwise we end up with solutions in search of problems. These are difficult behaviors to manage from within organizations that need to reward reliable delivery of products and services.
We are a community of innovators. What would you suggest to Michael to answer the question: What is necessary for America to become more innovative in the future? How can we help him?
I'll share a quote today that's relevant to my last post. Enjoy the weekend.
"Not everything that counts can be counted; and not everything that can be counted, counts"
-- Albert Einstein
I was advising a client on how to collect some qualitative and quantitative research. He wanted to combine all the questions into one survey. I cautioned him about collecting too many qualitative responses, and suggested that we may want to do two separate studies.
"Won't that take forever?" he asked. I responded that it wouldn't take that long, because we wouldn't need many responses in the qualitative part. "But that's not statistically significant!" was his reply. He had the understanding that no matter what type of research you're doing, statistical significance meant that you needed at least 500 respondents. We then had a good conversation about what statistical significance means.
Statistical significance is a term used to describe the confidence level with which you can use the results of your study to project how the broader population will respond. For example, if you are doing a quantitative study to find out how many people identify positively with your brand, you may collect 1000 responses recruited to represent proportions of the population. If 800 of those people identify positively with your brand, you may say with a high degree of confidence that approximately 80% of the population identifies positively with your brand. There are charts that can tell you exactly what that degree of confidence is, and that is your measure of statistical significance.
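For readers who want to see the arithmetic behind that example, here is a minimal sketch (my addition, not from the original post) using the standard normal approximation for a sample proportion. The function name and the 95% z-value are my assumptions for illustration:

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """Margin of error for a sample proportion, using the normal
    approximation. z=1.96 corresponds to ~95% confidence."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# 800 of 1000 respondents identify positively with the brand
p_hat = 800 / 1000
moe = margin_of_error(p_hat, n=1000)
print(f"{p_hat:.0%} ± {moe:.1%}")  # roughly 80% ± 2.5%
```

In other words, with 1000 respondents you could say you are about 95% confident the true population figure lies within a couple of percentage points of 80%, which is exactly the kind of confidence statement those charts encode.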
But let's say that you want to understand the associations with your brand more deeply. You want to know how it makes people feel. You can't ask this type of information in a quantitative survey without making some assumptions, so you decide to do some in-depth interviewing to find out how your brand makes consumers feel. In each interview, you need to have time for a longer conversation about the consumer's values. Even if you talk to a large group of people, if you don't go deeply enough to get an understanding of their values, your confidence level for understanding what those values are is pretty low.
Following this example through, suppose that in interviews with 10 people who love your brand, we find out that it gives them a greater sense of control than your competitors' brands. These consumers may not have said this directly, but in 10 interviews, we were able to actively think about what we learned, and draw this conclusion with a high level of confidence.
Would we project this conclusion onto the population at large? No. While we have a high level of confidence that we know the issue, we don't have a high level of confidence that this issue is true for the broader population. For that we need a quantitative study with a large sample size.
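To give a feel for what "a large sample size" means in that quantitative follow-up, here is a hedged sketch (my addition, using the standard worst-case formula for a proportion; the function name and defaults are my assumptions):

```python
import math

def required_sample_size(moe=0.03, p_hat=0.5, z=1.96):
    """Respondents needed so a proportion estimate lands within
    +/- moe at ~95% confidence (normal approximation).
    p_hat=0.5 is the conservative worst case."""
    return math.ceil((z ** 2) * p_hat * (1 - p_hat) / moe ** 2)

print(required_sample_size())  # 1068 respondents for ±3% at 95%
```

This is why 10 in-depth interviews can surface the issue, but projecting it onto the population takes a survey of roughly a thousand people.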
What does the term "statistical significance" mean at your company? Remember that it is a measure of your level of confidence that you have the right answer. This will vary based on what you are trying to learn, the size of your total population, and how deep you need to dig for answers. If you are doing very in-depth qualitative research, remember this: a large sample size may actually be less statistically significant, because spreading your effort across so many shallow conversations means you can't be confident you found the right answer in the first place.
My current client owns some of the world's largest online consumer communities in niche enthusiast segments. Here are a few things I've learned about doing consumer research within established online communities. I hope you find them useful.
1) Get introduced to the community by the founder. Have a profile, and let everyone know who you are and what you'll be doing. Be transparent about this.
2) Remember that you're not really "one of them". You may be welcome, but you are their guest.
3) If you're working with a passionate community, you can actually disrupt some of the traditional in-depth research techniques, and learn some very deep information very quickly.
4) Still, there are some things you need to be with people, in person, to learn. This will vary with each community.
5) Make surveys as much like an informal interview as possible. Make the questions informal, and communicate as similarly to the way they communicate on the site as possible.
6) When executing a survey, remember that consumers hate pop-ups. Don't you?
7) Long questionnaires feel smarmy. You know, the ones with multiple matrix tables that expect consumers to know the name of every feature on the site? Yeah, those.
8) I'm sure no one reading this would ever do either of the previous two things, but let's say there's a prior agreement with a third party, and you have one on your site. Make sure the community knows that it didn't come from you, and that you wouldn't do that to them. Graciously collect all complaints about it.
9) In global communities, be careful with how you use incentives. Rules vary by country, and international members could feel left out.
10) Remember that passionate communities LOVE their site. If they honestly believe you are working to make it better, they will bend over backward to help you. Authenticity and genuine interest will be your most valuable tools.
Think about the last focus group you ran or attended. Be honest with yourself. Why did you have the focus group? And what, exactly, did you learn from it?
I always tell my clients that focus groups serve a real and valuable purpose, and that they must be used judiciously to make sure we're using the right tool for the task at hand. But then I noticed that I'm using them less and less frequently. They don't seem to be very helpful in learning deeply about consumers' lives; one-on-one interviews are much better for that. They also don't seem to be very helpful in evaluating new product concepts, as group-think often obscures real opinions. Add to that, the numbers are too small to use them to quantitatively infer the behavior of the larger population with a high degree of confidence.
So why are we using them? I had one client tell me that she used focus groups because it was an easy way to get others in the company to participate. They could drop in and out from behind the glass, and at least have some exposure to their consumers. Another client told me that they are used in her company because they were an "accepted" method of gathering market research, and they could easily obtain the funding to run them.
I can see using focus groups when I have an idea of what I want to learn, and I want to collect some basic information to help me to know which areas I'd like to probe more deeply. It's a safe, middle of the road tool. What I'm finding, however, is that as we become more focused on who our consumers are and we become more adept at using online tools to collect this basic learning, the focus group is becoming less relevant for me.
I'm curious to know if others are having similar experiences. Is there greater value we should be getting from focus groups? Or are they a tool that will become less relevant in the future?
I talk a lot about consumer insight: how to learn from consumers, how to derive insights, and how to translate them into useful criteria to guide decision-making. I realized that I don't talk so much about how this connects to the design process, and I'll be mixing in more of that from now on.
I've often heard clients talk about holding back on the constraints because they don't want to hinder the creative process. While the intention is good, nothing could be further from the truth. The creative process depends upon constraints. Figuring out how to manage constraints is what creativity is all about. Having a blank slate to design whatever inspires you is what fine art is all about. It may be fun and interesting, but it most likely won't help to achieve your business goals.
Next time you're working with a designer, remember that it's your job to let the designer know about all the constraints to the process ahead of time. Along the way some of these constraints may be challenged or made irrelevant, and that's part of the creative process at work. If you don't do this, the designer will create their own constraints, and what gets designed may not be relevant to your business at all. At that point everyone's time has been wasted.
Also remember that you don't need to decide what the answer is, and have the designer just draw it up and make it pretty. Design is about problem solving. Problem solving needs constraints. Otherwise it's just decoration, and that's a different task altogether.
I was looking over the last few blog posts and realized that the real purpose behind many of our innovation processes is to help us to work around traditional corporate reward systems. Defining Active Thinking is a way to ascribe value to a process that many clients undervalue. Brendan commented that mind-mapping tools are useful because they help to make thought processes more visible, and what's visible is more likely to be rewarded.
Organizations have good reasons for rewarding tangible, predictable processes. Their main businesses typically revolve around providing high quality, consistent, reliable products or services. The problem arises when they apply the same reward systems when trying to innovate. If you haven't read Steven Kerr's article "On the Folly of Rewarding A, While Hoping for B," it is well worth taking a few minutes to review it. It's old, but the message is still fresh, and is something that is consistently overlooked.
I do believe that our tools and processes for innovation are useful. It is important to make the innovation process as consistent with our clients' processes as we reasonably can. If it's too foreign or scary then innovation will never happen. But sometimes the fit is just too forced. In those cases, we may be better served to point out the obvious, and define a reward system that will enable the right work to be done.