By Paul Reali
In the creativity world, we celebrate failure.
This can be difficult to grasp, especially for those companies (um, most of them) that have a history of punishing failure. Failure, mistake, error—these are all part of the creative process. Contrary to popular belief, rare is the idea that pops into the brain as a fully formed, shovel-ready solution. The bulk of the creative process is the hard work of turning that half-formed, half-baked, half-witted idea into a workable, novel, implementable solution. Along the way are mistakes, and missteps, and restarts, and outright failure.
There is no creativity without risk, and no risk without the possibility of failure; therefore, there is no creativity without failure.
It might be natural, then, for us to celebrate the failures of Jonah Lehrer.
In case you’re not familiar with the case, Jonah Lehrer is a creative guy, a writer who mostly deals with science and the brain. He is the author of the best-sellers Proust Was a Neuroscientist and Imagine: How Creativity Works, and he was a columnist for The New Yorker. It recently came to light that he had fabricated quotes in Imagine and had been reusing his old work for new New Yorker columns. He has since been fired from the magazine, and had his book recalled, if you can imagine such a thing.
But before any of this came to light, I and many of my creativity colleagues had a beef with Jonah Lehrer. In Imagine, and in several widely circulated articles, blog posts, and interviews based on the book, Lehrer assailed brainstorming, one of my community’s favored tools, as not working. It wasn’t that he called brainstorming into question; that’s been going on for years. Rather, we felt that: a) his conclusion was unsupportable and ridiculous (he did not say, for instance, that other methods might work better than brainstorming, but that brainstorming does not work); b) he selectively chose and interpreted data to fit his conclusion; c) he misunderstood brainstorming in the first place; and d) his claims got a lot of attention. (The best critique I’ve read of Lehrer’s conclusions and how he used the data comes from Scott Berkun.)
I (and, I suspect, many of my creativity colleagues) want to celebrate the failures of Jonah Lehrer, but not in the usual way—not celebrating the failure that is an inevitable byproduct of the creative process. No: if we celebrate, it’s of the “it couldn’t have happened to a nicer guy” variety. If we celebrate, it’s because the guy who maligned and misunderstood creativity did so while misusing his own creativity.
Creativity, as noted at the beginning of this piece, is mostly about hard work, so let’s look at Lehrer’s mistakes through that lens.
Creativity is cognitive, as is the creative act of writing. Lehrer’s job, like that of other non-fiction writers (think, for example, of Malcolm Gladwell, with whom Lehrer is often compared), is the cognitive hard work of learning a great deal about a subject and then distilling it clearly for his readers. There are conclusions to draw, connections to be made, evidence to present, positions to argue.
In that mission, here is what Lehrer did: he fabricated Bob Dylan quotes, selected questionable research, and misinterpreted that research, all in order to support his pre-selected points of view. Add to this what he was accused of at The New Yorker: he reused his earlier work and called it new. (This has been called “self-plagiarism,” a nonsense phrase.) What binds all these crimes together?
Laziness. If the data doesn’t support your conclusion, then change your conclusion, even if it means rewriting or doing more research. If the subject did not say what you wanted him to say, find someone else to say it, or rethink your conclusion. If you are hired to do original work at your job, go and take the time to do it.
Is Jonah Lehrer actually physically lazy? I don’t know. Maybe he was intellectually lazy. Maybe he was rushed. Maybe he was feeling the pressure of one anointed as a young genius. Whatever it was, the result was not one of those failures that inevitably come with being creative; no, it was a human’s failure to respect the creative process, a failure to do all the rigorous work the creative process requires.
All creative persons experience failure. But in the end, this one is tragic for Jonah Lehrer. This is not one to celebrate.
Posted: 9 months ago
By Paul Reali
They are practically mantras in the business world these days: “we need to be more creative,” and “we need to be more innovative.” Which, just on the surface, tends to frighten people.
“You want me to…what?”
It’s frightening, or at least off-putting, because people don’t quite know what it means. Or, more to the point, they don’t know what is expected of them. Creative…how? Innovate…what?
Most of us don’t have experience in being innovative. Most of us do have experience in being creative, but we probably don’t recognize it as such. Many think of creativity as meaning artistic, or simply off-beat. But whatever the reasons one might be intimidated by the demand to be creative and innovative, the truth is likely very simple:
We simply don’t have enough experience with it.
Here’s a parallel example. I regularly meet people who feel shock and awe when they learn that I, for a good part of my living, stand up in front of groups and actually speak to them. And enjoy it. “I could never do that,” they invariably say. My answer is always the same: “Of course you could.” The only difference between me and any of them (and maybe you) is that they are not experienced in speaking to large groups, and I am. They (and maybe you) simply haven’t spent enough time doing it, have not spent time identifying the skills required, have not worked at developing those skills.
Which brings us back to creativity and innovation.
At Innovation Bound, people often say to us, “I’m not creative.” To which we answer: “Of course you are.” We are creative beings, we humans. The very act of speaking—to even just one person—is a creative act. What most of us don’t have is experience being deliberately creative, or in purposely developing an innovative solution. Just as with public speaking, most of us haven’t had the chance to practice, and don’t know what skills are required, or how to develop those skills. Therefore, it seems impossible: “I could never do that.”
Many of our posts here deal with the skills needed for creative thinking and innovation, and how to develop those skills. Here, in this post, let’s focus on that other aspect: practice. And to remove the fear, let’s lower the stakes to zero.
Here are two no-risk ways to try out some of the basic skills of creativity and innovation: solve someone else’s problem, and be creative in small ways, with small things.
Solve someone else’s problem.
Now, I don’t mean you should impose yourself on someone else, as in, “you know what your problem is?” I mean examine some problem external to yourself, and use creative thinking to address it. The thinking will never go beyond yourself, but that’s entirely the point.
Here’s an example. You can practice reframing, an essential creativity skill, by looking at a problematic situation (someone else’s, that is) and reframing it. To reframe a problem, restate it in multiple ways, in statements that begin with “How to…” or “How might…” What you’ll often find is that the problem being addressed is the wrong one. Let’s try one. Toys “R” Us continues to struggle. It appears from the outside that the toy giant sees its problem as “How to compete with Walmart?” They probably can’t. So what can they do? They could reframe their problem: how might we bring more people into our stores? How might we make our stores a destination retailer, like American Girl? How might we increase our physical presence, but in a low-cost way? Each of these reframes leads to a different possible set of solutions. (To see the entire creative process applied to the problems of a fictional struggling toy retailer, check out my new book, Creativity Rising.)
Here’s another example. For any problem you see around you—say, your city school district’s battle with truancy in high schools—reframe the problem (e.g., how to make kids want to go to school), then list as many possible answers as you can, trying for 25, or even 50, different answers.
Be creative in small ways, with small things.
Little things happen all the time that require a little creative thinking. Try to get in the habit of thinking: how might I fix this? Here’s a real-life example. About a dozen members of my family were out to dinner recently at a favorite Italian restaurant, and realized too late that they had wandered into Frank Sinatra night. The singer was authentic, but too loud for our mother. She wanted to know: how can we get him to turn the sound down? My sister went at the problem another way: how to make my mother comfortable? The obvious answer was earplugs, but no one had earplugs in pocket or purse. What’s in my purse, my sister thought, that could be used as earplugs? My mother’s sons could not have solved this problem, but her daughter did: she disassembled a tampon, and made my mother cotton earplugs.
So that’s your mission: you can practice being just a little creative and innovative, without taking any risk, at any time you want. Solve someone else’s problem, and solve some of your own—small ones, in small ways. And over time, when someone asks you to “be creative” or “be innovative,” you’ll be able to say, “let’s go.”
Stretch the brain, and it never returns to its former shape.
Posted: 10 months ago
By Paul Reali
Here’s a question to ponder: will there be a Research in Motion in five years? How about three years? Because I don’t want to bury the lead, let me provide my answer to that question: yes, there will be a RIM…as a business unit of Microsoft.
Now, let’s explore why it might play out this way.
The backstory on RIM is straightforward. To simplify: with its BlackBerry mobile phone, RIM created, and for a good while dominated, the market for secure, in-your-pocket corporate email. The telephone part was not the key to its success; it was the traveling email client, which integrated nicely with corporate email servers and networks. The iPhone (the far sexier device, let’s be honest) was not seen by RIM as a threat. The iPhone, they reasoned, was a consumer phone. As long as they continued to provide the integration and backbone, they were safe.
The widespread and well-publicized RIM system failures tarnished the company’s reputation, certainly, but the greater damage was this: it provided the impetus for RIM’s corporate customers to begin investigating other options. Once I.T. departments determined if and how they could support other devices, the tide had turned.
Here’s the problem that RIM can’t seem to fix, or fails to understand: for the user, it’s all about the phone. And for corporations which have to decide which devices to support, it’s all about the user. (I will acknowledge here that the user, typically, seems to be the last priority for corporate I.T. departments. But even I.T. eventually cedes to the will of the masses, and of their bosses.)
RIM has attempted to provide phones that are equivalent to the Android and Apple smartphones, but its devices have been critical and commercial failures. The company still has its strengths, including many customers who support no other mobile device platforms; dominance in emerging markets; and the (currently, for a few more minutes) unchallenged BlackBerry Messenger application. (Apple offers iMessage, which, like BBM, works only within its own universe.)
Perhaps it’s time for RIM to reframe the problems and the opportunities, and to ask better questions about the road ahead.
I suspect that the question RIM is asking itself, however, is how might we survive at all? And if that’s the question, the answer is: be acquired by Microsoft. Microsoft could make RIM into a very profitable division, and allow The House That Gates Built to move beyond its own failures in the phone arena.
RIM could survive on its own, too. But only if they are asking the right questions, and finding the right answers.
Posted: 11 months ago