NextStage will be posting some of its research here (as noted in NextStage Evolution Research Brief – The Importance of Brand as it Relates to Product v Feature Diversity and MarketShare). We normally apply our research methodology — one familiar to anyone doing psych, social, anthro or language research — to any engagement.
One thing we're repeatedly told is that our problem-solving methods are unique and very different from what everyone else does, so we decided to offer our methodology's high-level form here.
The methodology is simple, adaptable and expandable. What is offered here is a core that can be used in any discipline with little modification.
- Background Study
When presented with a challenge or a question to be answered or investigated, learn as much as possible about everything that's been done before, regardless of how seemingly irrelevant to the task at hand. Be thorough, be detailed. Find out what's failed and why. Find out what got close to solving the problem or answering the question and why it didn't go all the way.
Learn this. Study the background even if it seems obvious; in fact, if it seems that obvious to you, have someone unfamiliar with this particular paradigm do the study, because you're missing something if you think the background is obvious. Study the personalities, the models, the methods, the politics, everything.
If you're not willing or don't have the time, don't take on the research, project or task.
- Necessary Data
This one, we'll admit, causes people the most concern. Many people attempt to solve problems with either available data or easily obtainable data.
Stop. Go no further until you honestly answer this question:
What data — existing or not, obtainable or not — best solves the problem, answers the question or furthers the research?
Talking with researchers and analysts worldwide, we find the above question is the greatest stumbling block. People defer to the data currently on hand, data previously made public by other researchers, data that current methods make easy to obtain or collect, and so on.
However, “ease of collection” or “prevalence of availability” should not be equated with “solves the problem”, “answers the question” or “furthers the research”. Agreed, it would be great if the exact data that would do all three was there for the taking, and yes, solution vendors make wonderful cases for their data collection methods.
The first challenge to solving any problem, answering any question, etc., is to determine what kind and how much data is necessary to provide a solution or answer. Find out how to measure what you really need to measure to solve what you really need to solve and you're 90% of the way to answering the question, solving the problem or furthering the research (there are two corollaries to this and they go into the third step in this research model).
I've seen research come to a halt until the investigators could determine what they really needed to measure to answer what they really needed to answer. Take your time at this stage. I've heard “We have money to do it over but not enough to do it right the first time,” and while I know several businesses accept that concept, and while I agree with Jeff Bezos' “Anything worth doing is worth doing poorly,” I don't believe or accept that these two statements are congruous at all.
- Equals Must Be Equals
The number of times projects fail, results are erroneous or research flounders due to people forgetting the simple rule that “1=1” is staggering. Using an online analytics term, “clicks” here must mean the same thing and be measured the same way as “clicks” there.
The first corollary is:
Make sure you're measuring what really needs to be measured.
The second is:
Units must be the same, and have the same meaning, on both sides of the equation.
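The second corollary can be sketched in a few lines. Everything here is hypothetical (the source names, the numbers, the conversion): two analytics feeds that both report “clicks” but define and denominate them differently.

```python
# A minimal sketch of the "1=1" rule: before putting two measures on opposite
# sides of an equation, align both the unit and the meaning.

# Hypothetical source A: total click events, counted per day.
source_a_total_clicks_per_day = 1200

# Hypothetical source B: unique clicking visitors, counted per week.
source_b_unique_clicks_per_week = 2100

# Step 1: align the unit (per week -> per day).
DAYS_PER_WEEK = 7
source_b_unique_clicks_per_day = source_b_unique_clicks_per_week / DAYS_PER_WEEK

print(source_b_unique_clicks_per_day)  # 300.0

# Step 2: the unit now matches, but the meaning still doesn't. Total events
# versus unique visitors: 1200 vs 300 is still not "1=1". Deduplicating
# events requires visitor-level data; no multiplier can fix a definition
# mismatch.
```

The point of the sketch is that the arithmetic in step 1 is the easy half; step 2, aligning what the word “clicks” means in each source, has no mechanical fix.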
The “1=1” requirement most often fails because people mix categorical, rank and metric analysis techniques, measures and methods as if one were identical to the others, when in fact they are distinct.
And if you're not sure of the differences, I'm sorry, you should not be doing research. NextStage is often called in to help businesses make sense of some research they performed or contracted with another group, and more often than not the solution comes from clearing up categorical, rank and metrical overlaps. Categorical, Rank and Metric are basic measurement concepts. I explain them briefly in The Social Conversion Differences Between Facebook, LinkedIn and Twitter – Providence eMarketing Con 13 Nov 2011. Learn them and learn them well.
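A small sketch of why the distinction matters, using hypothetical survey numbers: rank (ordinal) data has an order but unknown spacing between values, and treating it as metric can make two very different audiences look identical.

```python
# Categorical: labels with no order. Rank (ordinal): ordered, but the spacing
# between values is unknown. Metric: ordered with meaningful, equal intervals.
from collections import Counter
from statistics import mean

# Hypothetical 1-5 satisfaction responses (rank data: the "distance" from
# 1 to 2 need not equal the distance from 4 to 5).
audience_a = [1, 1, 1, 5, 5, 5]   # polarized
audience_b = [3, 3, 3, 3, 3, 3]   # uniformly neutral

# Treating ranks as metric collapses the difference: both means are 3.
print(mean(audience_a), mean(audience_b))  # 3 3

# Counting the responses (a categorical treatment of the frequencies)
# reveals it: no one in audience A is actually "neutral".
print(Counter(audience_a))  # Counter({1: 3, 5: 3})
print(Counter(audience_b))  # Counter({3: 6})
```

The mean is only meaningful when the intervals between scale points are equal, which is precisely what rank data does not promise; summaries appropriate to the measurement level keep the information the mean throws away.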
To that end, the market is contributing to poor research models and measurement methodologies. The number of solution providers promoting self-serving “equations” as industry-problem solvers is staggering: equations that
- change critical term and KPI definitions midstream,
- measure irrelevant (at worst) or very loosely correlated (at best) elements and profess a one-to-one correspondence between measurement and claim,
- include data having no direct relevance to the problem simply because it's easily obtained, and
- make up their own KPIs and claim relevance.
Companies can find vendors whose definitions make the companies' failures look good, and aren't we all a little tired of naked emperors?
People who know me or NextStage know we love quotes. Here are some our researchers keep on their walls for easy reference:
- We are continually faced by great opportunities brilliantly disguised as insoluble problems. – Lee Iacocca
- Simple solutions to complex problems are often wrong. – Jeanne Ryer
- The cause of a problem is the system that produced it. – Tom Bigda-Peyton
- Judgement consists not only of applying evidence and rationality to decisions, but also the ability to recognize when they are insufficient for the problem at hand. – Tom Davenport
- Should you encounter a problem along your way, change your direction, not your destination.
- For every complex problem there is an answer that is clear, simple, and wrong. – H L Mencken