
Traditional vs Modern Market Research: What Actually Works Today

Before his new fitness-tracking product went to market in 2019, a founder I was consulting with spent four months and nearly $40,000 on traditional market research. He commissioned a research firm. They conducted focus groups in three cities. They mailed structured questionnaires to 600 respondents. They produced a polished 80-page report with charts, demographic breakdowns, and a clear conclusion: the market was ripe, the concept resonated, and the audience showed very high purchase intent.
He launched. Within seven months, the product had failed.


Afterward, we tried to work out together what had gone wrong, and the answer was uncomfortable but instructive. The focus group participants had behaved the way focus group participants typically behave: optimistically, sociably, unconsciously trying to please the researchers running the session. They said they would buy the product. Saying yes was easy and natural in a safe, moderated discussion, surrounded by others nodding with interest. Actually pulling out a credit card six months later, in the privacy of their own lives, with competing products on offer and a price tag that stung more than it appealed, was a different story.



The study measured intention. It never measured action. In market research, those are not the same data.
That experience began my more serious education in the difference between research methods that produce comfortable confidence and research methods that produce correct predictions. This guide is everything I have learned about traditional market research, modern market research and, most importantly, when each actually works.

"Traditional" can mean many different things, and market research is no exception.
As a general term, traditional market research refers to a familiar category of methods that were developed in the mid-twentieth century and that companies relied on for their big product, marketing, and strategic decisions through the 1970s, 1980s, and 1990s.

What We Mean When We Say Traditional Market Research 

Traditional research is defined by its environment: people are asked to deliberately report their preferences, behaviors, and intentions in a setting the researcher controls. Focus groups are a qualitative method that brings a small group of people together in a moderated setting to discuss a concept, product, or message in real time. Telephone and mail surveys put a standardized questionnaire in front of a large sample and compile the responses into a statistical picture of a population's attitudes. Face-to-face interviews give researchers more time with each respondent and room to probe answers in depth. Ethnographic observation places the researcher in respondents' natural environment to observe behavior directly rather than rely on self-report.
These approaches were developed because they were the best available answer to a particular problem: how to learn about large, complicated markets before digital behavioral data existed at any meaningful scale. For decades they did just that, and in some cases they still do. Traditional research methods are not inherently flawed; they simply are not always matched to the right research questions, or used with enough awareness of the systematic biases that make their results less reliable than the confidence of a polished research report would suggest.

The Biggest Problems With Traditional Research That Most Reports Don't Discuss

Many in the research industry have long been aware of the drawbacks of traditional methods, but those drawbacks are seldom emphasized in client reports, because research firms are paid to deliver insight and confidence, not to undermine confidence in the methods they are being paid to apply.
The greatest limitation is the mismatch between what people say they will do and what they actually do: the gap between stated preference and revealed preference. Over recent decades, behavioral economists including Daniel Kahneman and Richard Thaler have demonstrated that humans are systematically unreliable forecasters of their own future behavior. We grossly overestimate how much we will use the gym membership we are paying for. We underestimate how much the convenience of an old product will inhibit our willingness to try a new one, even when we say in research that we like the new one. And in a focus group room, as in groups generally, we give socially desirable answers; it is a deeply ingrained human trait that does not switch off in a research setting.
This is compounded by interviewer effects. Who asks a question, how it is worded, the order of the questions, and the answers given to earlier questions can all influence the responses. Professional researchers are trained to reduce these effects, but they cannot eliminate them. All traditional market research data passes through a researcher-respondent filter that introduces some systematic distortion.
A third limitation, seldom discussed straightforwardly, is sample size and representativeness. A focus group of eight people is not a market. A 600-person survey might be sufficient for some population-level metrics, but it is too small to say anything reliable about a specific psychographic segment. Many polished research reports project more confidence than their samples warrant, and clients untrained in research methodology usually have no way to judge the extent of the overconfidence.

Where Traditional Research Still Genuinely Earns Its Place

These constraints do not make traditional market research techniques obsolete: they are simply misapplied when used as catch-all methods instead of being matched to specific questions.
Traditional research still earns its place in qualitative discovery, the stage where you learn what questions to ask before you can design a quantitative study: learning the language of a market, exploring its emotional landscape, and understanding what even its most articulate members struggle to express. A good in-depth interview with a willing, reflective buyer can uncover the language, metaphors, and emotional associations a market attaches to a problem, things that cannot be gleaned from behavioral data. Digital data can show you exactly where in an onboarding flow users drop out; it cannot tell you, on its own, what they felt or thought at that moment. A competent researcher who can interview those users can get much closer to that understanding.
Concept testing is another area where traditional research remains relevant. Products that are genuinely new, doing something that does not yet exist in the market, have no behavioral analog. Behavioral data tells you what people are doing with what already exists; by definition, it cannot tell you how they will respond to something that does not. For truly novel concepts, structured attitudinal research, applied with a clear understanding of its limitations, is still needed to gauge market appetite.
The third context in which traditional research methods still matter is regulated industries. In healthcare, financial services, and government, research is often required to follow a validated framework with documented processes, and traditional methodologies satisfy those requirements.

The Dawn of Modern Market Research – What Really Changed

Market research has changed more in the past 15 years because of data availability than because of methodological innovation. As human activity moved online, researchers did not just get smarter methods; they gained access to an enormous amount of naturally occurring behavioral data that made it possible to ask and answer kinds of research questions that simply were not answerable before.
A user typing a Google search is exposing a real need in its most unfiltered form: no researcher framing the question, no social pressure toward a particular answer, none of the rational self-editing that happens in a structured research environment. When a customer reads an Amazon review and then leaves without buying, Amazon captures a behavioral data point that no survey could ever create. When a SaaS user drops out of an onboarding flow at step three after spending four minutes on step two, that sequence of actions signals friction, confusion, or an unmet expectation that no focus group participant talking about a hypothetical experience could have accurately anticipated.
Modern market research is largely about reading these naturally occurring behavioral signals, and reading them with enough sophistication to extract meaningful insight. It encompasses search demand analysis, social listening, behavioral analytics, digital ethnography, A/B testing, and an ever-expanding library of AI tools that can process quantities of behavioral data that would have been computationally impossible a decade ago. All of these methods share one key strength that addresses traditional research's worst weakness: they focus on what people do, not on what people say they would do.

Search Demand Analysis – Hearing Needs in People's Own Words
Of all the modern market research tools I have adopted in recent years, search demand analysis has been the most consistent winner in insight delivered per dollar invested. It is also the tool I see most underused by businesses still oriented toward traditional research methods.
In a search engine, people state their needs, problems, and questions as plainly and honestly as they can, in a way almost no other research environment replicates. Someone who types "why does my lower back hurt after sitting" into Google is describing a real problem, in their own words, with no researcher there to shape the framing. Someone searching for the "best project management software for small teams under $50 per month" is telling you what they want from a solution, how much they are willing to pay, and where they are in the buying journey. That is market intelligence at enormous scale, available to anyone with access to search analytics tools.
I have applied search demand analysis in many scenarios, including helping a client size the market opportunity for a new online learning product in 2023. The traditional approach would have been to recruit a sample of prospects and ask whether they were interested in taking, and paying for, a structured online course. Instead, search analysis revealed which topics within specific learning areas were seeing rising search interest, what questions users had at each point in their learning journey, the natural language users employed to describe their own skill gaps, and what searches containing pricing terms revealed about the market's price sensitivity. The behavioral data was more honest, more specific, and more actionable than any stated-intention survey, and it cost a fraction of a conventional research engagement.
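To make the idea concrete, here is a minimal sketch of the kind of query-mining this involves. The queries, the intent-stage taxonomy, and the `classify`/`price_ceiling` helpers are all hypothetical illustrations, not a real keyword dataset or a standard taxonomy; in practice the queries would come from a search analytics export and the codebook would be far richer.

```python
import re

# Hypothetical search queries, as if exported from a keyword research tool.
queries = [
    "why does my lower back hurt after sitting",
    "best project management software for small teams under $50 per month",
    "project management software pricing comparison",
    "how to organize tasks for a remote team",
]

# Rough markers for stages of the buying journey (illustrative only).
STAGES = {
    "problem_aware": ["why", "how to"],
    "solution_compare": ["best", "vs", "comparison"],
    "price_sensitive": ["under $", "pricing", "cheap", "free"],
}

def classify(query: str) -> list[str]:
    """Return every intent stage whose markers appear in the query."""
    q = query.lower()
    return [stage for stage, markers in STAGES.items()
            if any(m in q for m in markers)] or ["unclassified"]

def price_ceiling(query: str):
    """Extract a stated budget like 'under $50' if one is present."""
    m = re.search(r"under \$(\d+)", query.lower())
    return int(m.group(1)) if m else None

for q in queries:
    print(classify(q), price_ceiling(q), "|", q)
```

Run over thousands of real queries, even a crude classifier like this surfaces how demand splits across journey stages and where stated budgets cluster.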

Social Listening – Reading the Conversation Your Market Is Already Having

Social listening is the practice of collecting and analyzing the publicly available conversations happening on social media platforms, review sites, forums, and communities, not to track brand mentions, but to learn what words, feelings, and worries a market uses when it talks about the problems your product or service solves.
The very reason focus groups are limited is what makes social listening so valuable: you are listening to people talking to each other, not to researchers. A person complaining in a Reddit thread, recommending something in a Facebook group, or asking for advice on a professional forum is speaking in their natural voice, motivated by real experience rather than by a desire to be helpful to a research process. That authenticity is what makes the conversation valuable data.
For a client in the home organisation category, I ran a social listening project to understand why their trial-to-purchase conversion rate was lower than expected. Instead of conducting user interviews (which would have produced after-the-fact explanations of completed behaviour, colored by memory and social presentation), I systematically read and coded conversations across three relevant subreddits, two Facebook groups, and the review sections of competing products on Amazon over a three-week period.
That analysis revealed a pattern that had never appeared in the trial-to-purchase data on its own. People loved the design and the initial experience, but after the trial period they were unsure whether they were using the product correctly. The conversion problem was never about value or price; it was about confidence. And it emerged not from asking people what they thought, but from listening to what they were already telling each other. A simple in-app guidance sequence aimed at that confidence gap increased trial conversions by 31% within 90 days.
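The coding step above can be sketched in a few lines. The comments and the `CODEBOOK` below are invented stand-ins for the real subreddit and review data; in a real project the codebook is developed iteratively while reading the raw conversations, and much of the coding is done by hand before any automation.

```python
from collections import Counter

# Hypothetical comments standing in for forum and review conversations.
comments = [
    "love the design but honestly not sure I'm using it right",
    "looks great, just wish there was a guide for setup",
    "returned mine, couldn't figure out the right way to use it",
    "price felt fair, no complaints there",
]

# Hand-built codebook mapping themes to trigger phrases (illustrative).
CODEBOOK = {
    "confidence_gap": ["not sure", "figure out", "using it right", "guide"],
    "price_concern": ["price", "expensive", "cost"],
    "design_praise": ["design", "looks great"],
}

def code_comment(text: str) -> set[str]:
    """Assign every theme whose cue phrases appear in the comment."""
    t = text.lower()
    return {theme for theme, cues in CODEBOOK.items()
            if any(c in t for c in cues)}

theme_counts = Counter(t for c in comments for t in code_comment(c))
total = len(comments)
for theme, n in theme_counts.most_common():
    print(f"{theme}: {n}/{total} comments ({n / total:.0%})")
```

The payoff is the frequency table at the end: when "confidence_gap" dominates "price_concern" across hundreds of coded comments, you have a behavioral explanation the funnel data alone could never give you.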

Behavioral Analytics – Your Own Data Has the Answer

No business should outsource market research, whether traditional or modern, primary or secondary, without first exploiting the research resource most businesses grossly underutilize: the behavior of their own customers and prospects across their own digital touchpoints.
Website analytics, product usage data, email engagement metrics, support ticket patterns, and sales funnel drop-off rates are all market research. They tell you, in behavioral detail, what your real customers and prospects actually do: what they click, what they ignore, how long they spend with which content, and when they do and do not decide to purchase. This data is more truthful than any survey, more specific than most focus groups, and more immediately actionable than most research deliverables.
One major reason it goes unused is organisational: the data is typically stored in different systems, owned by different teams, and combining it into meaningful market insight requires analytical skills and cross-functional access that many businesses lack. It is also partly a research culture issue; the inclination to outsource research rather than analyze internal data is deeply ingrained in organizations built around the traditional research model.
One client I work with runs a business generating behavioral data on roughly 40,000 monthly active users. Before we worked together, their product decisions were informed by quarterly customer surveys and occasional user interviews; their behavioral data was used only for operational reporting. We redirected analytical focus to that data, building user journey maps from real clickstreams, segmenting users into behavioral cohorts, and identifying the behavioral signals that predicted long-term retention versus early churn. The resulting product insights were meaningfully more accurate and more specific than anything the survey data had produced.
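A funnel drop-off analysis, the simplest version of this kind of work, can be sketched as follows. The `journeys` data and step names are hypothetical; in practice they would come from an analytics event export.

```python
# Hypothetical event logs: each value is the set of onboarding steps
# a user completed, as reconstructed from clickstream events.
journeys = {
    "u1": ["step1", "step2", "step3", "step4"],
    "u2": ["step1", "step2"],
    "u3": ["step1", "step2", "step3"],
    "u4": ["step1"],
    "u5": ["step1", "step2", "step3", "step4"],
}

FUNNEL = ["step1", "step2", "step3", "step4"]

def funnel_counts(journeys, funnel):
    """Count how many users reached each step of the funnel."""
    return [sum(step in path for path in journeys.values())
            for step in funnel]

counts = funnel_counts(journeys, FUNNEL)
# Report each step's reach and its conversion from the previous step.
for (step, n), prev in zip(zip(FUNNEL, counts), [counts[0]] + counts[:-1]):
    print(f"{step}: {n} users ({n / prev:.0%} of previous step)")
```

The step with the sharpest percentage drop is where qualitative follow-up (session recordings, interviews) earns its keep, because the numbers say where but not why.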

A/B Testing – The Research Method That Replaces Assumption With Evidence

A/B testing may be the most powerful research technique available to digital businesses, and it differs from everything else in this guide: it does not study behavior, it creates it. Rather than asking people what they would prefer, or observing what they do in existing environments, an A/B test puts one version of an element in front of part of a real audience and a different version in front of another part, then measures the actual behavior each version produces.
The power of the method is that it answers the question market research is ultimately trying to answer, "which of these will perform better with my real audience?", by letting the real audience answer it through behavior rather than through prediction or recall. The headline that 70% of survey respondents say they prefer is not necessarily the headline that will convert best. An A/B test removes the guesswork and produces concrete data.
I have watched confidently presented traditional research get overturned by A/B testing so many times that I now treat every piece of attitudinal research, traditional or modern, as a hypothesis to be tested rather than a conclusion to be implemented. A client in the subscription services space had run customer surveys that produced a new pricing page design they believed customers would find clearer and more persuasive. An A/B test against the original design showed the survey-derived version converting 12% worse. The customers had been honest about the aesthetic they preferred. They were simply wrong in predicting how that preference would affect their purchasing behavior under real conditions.
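Deciding whether an A/B result like that 12% gap is real or noise comes down to a significance test. Here is a minimal sketch of a two-proportion z-test using only the standard library; the conversion counts are invented for illustration, and real programs usually also fix sample size in advance and check test duration.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the normal CDF, built from math.erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical result: original page (A) vs survey-derived redesign (B).
z, p = two_proportion_z(conv_a=480, n_a=6000, conv_b=420, n_b=6000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these invented numbers the redesign converts at 7.0% versus 8.0% for the original, and the p-value falls below 0.05, so the drop would be treated as a real effect rather than sampling noise.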

The Promise and Reality of AI-Powered Research Tools

Artificial intelligence has entered the market research toolkit, and some of its applications are genuinely transformative while others are not; practitioners need to be more critical about which is which.
The genuinely valuable use cases for AI in market research are those where the limiting factor used to be computational: processing volumes of unstructured qualitative data that would previously have required weeks of human analysis. AI text analysis can work through thousands of customer reviews, support tickets, or social media posts to identify sentiment trends, recurring themes, and word clusters at a scale no team of human analysts could match. This extends the reach of qualitative research to data volumes that were previously practical only for quantitative work.
Predictive modeling, finding the combination of product features, messaging, or customer traits that predicts success, is another genuine improvement over traditional approaches. If your business has a large enough behavioral dataset, you can train models that identify conversion drivers with a precision that survey-based research cannot approach.
Where AI research tools are less revolutionary than advertised is anywhere they are sold as a replacement for human judgment, meaning, and context. If an AI tool tells you that sentiment around your brand is "67% positive" based on social media data, that number is only as good as whether the sentiment classifier was trained on data reflecting how your particular audience expresses sentiment in your particular category. Without that validation, the number is precise without being accurate, and precise inaccuracies can be more damaging than acknowledged imprecision.
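The validation step is cheap to sketch: hand-code a small sample yourself and measure how often the tool agrees with you before trusting its aggregate percentages. The labels below are invented, and the 80% threshold is a working assumption of mine, not an industry standard.

```python
# Hypothetical spot-check: an AI tool's sentiment labels vs. hand-coded
# labels for the same eight posts from your own audience.
ai_labels   = ["pos", "pos", "neg", "pos", "neu", "neg", "pos", "neu"]
hand_labels = ["pos", "neg", "neg", "pos", "neg", "neg", "pos", "neu"]

agreement = sum(a == h for a, h in zip(ai_labels, hand_labels)) / len(ai_labels)
print(f"agreement with hand coding: {agreement:.0%}")

# Working rule of thumb (an assumption, not a standard): below ~80%
# agreement, treat aggregate sentiment figures as directional only.
if agreement < 0.8:
    print("low agreement: validate or retrain before reporting percentages")
```

Eight posts is far too few for a real audit, of course; the point is that even a few hundred hand-coded items, a day of work, tells you whether "67% positive" deserves any weight.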

The Hybrid Research Approach – How I Actually Design Research Projects Today

After years of applying both traditional and modern methods to different research questions across industries, I no longer think of market research as traditional versus modern. I think of it as a sequencing and combination problem: deciding which method to use at which stage of a research question, and how to combine results so that two or more methods together provide insight that neither would alone.
My research design process always starts the same way: I examine behavioral data before commissioning any primary research. What existing behavioral data (search trends, analytics, social listening, sales funnel data) can already speak to the question we are trying to answer? It often turns out that some part of the research question can be answered with information that already exists, which lets primary research resources be targeted at the gaps where behavioral data genuinely cannot reach.
The next step is qualitative discovery, usually in-depth interviews or digital ethnography rather than focus groups, to build a rich understanding of the emotional and experiential landscape that behavioral data can reveal but cannot explain. Why are users bouncing at this stage of the funnel? What does the social conversation around this category reveal about how people think about the problem? This qualitative phase produces hypotheses, not conclusions.
Those hypotheses are then validated quantitatively at scale, either through A/B tests in which behavioral outcomes are the verdict, or through survey research designed specifically to test the hypotheses the qualitative work produced, rather than to generate open-ended insight. The quantitative phase supplies confidence; the qualitative phase has already supplied meaning.
This sequential process takes more planning than a one-shot research project, but it delivers insight that is more accurate, more specific, and more directly actionable than any single method can provide.

Why Research Quality Matters More Than Research Quantity

One of the most pervasive misconceptions in business research culture is that more is better: more surveys, larger samples, more focus groups, more reports, and therefore more confident decisions. Across the many research projects I have been exposed to, this does not hold up. The businesses that make the best research-informed decisions are those that invest in research quality, not research quantity.
One good behavioral analytics project that traces the decision journey of your real customers is worth more than three focus groups asking customers what they might do if they were buying. A targeted social listening analysis that reveals the real language and emotional needs of a specific customer segment is worth more than a large survey that quantifies surface-level attitudes without illuminating the motivation behind them.
The qualities that determine research value (methodological rigor, fit between research design and research question, avoidance of systematic bias, analytical sophistication) are not visible in the polish of the final deliverable. A 100-page research report and a 15-page behavioral analysis can differ wildly in actual insight quality, depending on how each was designed. Learning to judge research quality rather than research quantity is one of the most valuable skills a business leader can bring to research investment decisions.

My "What To Do If You Start Your Business Today" List of Market Research Projects

The first thing I would tell them is to work with the data they already have before spending money to generate new data. The most underutilized market research most businesses already pay for is their website analytics, CRM data, support ticket patterns, sales team conversations, and product usage data. Deriving insight from existing behavioral data is usually faster, cheaper, and more accurate than generating new attitudinal data to answer the same questions.
The second is to choose the research technique that fits the question, not the one they are comfortable with or the one that looks like "serious" research. A question about which marketing copy resonates with your audience is better answered by multivariate testing than by focus groups. A question about why users are not completing a particular product action is better answered by session recording analysis than by a satisfaction survey. A question about the size of an addressable market segment is better answered by search demand data than by a stated-interest survey.
The third is to treat every research output, traditional or modern, qualitative or quantitative, attitudinal or behavioral, as an input to judgment rather than as an answer. Research reduces uncertainty; it does not remove it. The businesses that use market research most successfully are those that recognise this distinction and keep their own judgment as the final mechanism that integrates research insight into business decisions.

The Research Truth Both Sides of the Debate Have Missed
The traditional-versus-modern argument in market research is, at its crudest, a debate between people who trust what people say and people who trust what people do. Each camp is right about what the other gets wrong. Traditional researchers are right that behavioral data, while necessary, is insufficient: it cannot explain motivation, cannot predict responses to truly novel ideas, and cannot substitute for the depth of understanding good qualitative research produces. Modern researchers are right that stated preferences do not reliably predict actual behavior, that the volume and authenticity of digital data often exceed anything a structured research environment can obtain, and that the cost per insight of many modern methods compares very favorably with traditional ones.
Both camps believe the important question is which approach is better. What they miss is that this is the wrong question. The right question is: which technique, or which combination of techniques, in what sequence, best fits the research question this particular business faces at this particular moment? There is no universal answer. Answering it requires judgment, methodological literacy, and a realistic appreciation of what each tool in the research kit measures and what it systematically misses.
Developing that judgment is the real craft of market research, and it is work that no traditional research certification, and no infatuation with the newest analytics platform, will do for you.

I have first-hand experience designing and testing market research projects in B2C and B2B spaces from 2016-2026, including product launches, content research, pricing studies, and conversion optimization projects across e-commerce, SaaS, professional services, and consumer goods verticals. Every case study presented here is a real project with a real result.

Meet the Author
Dev Manu Dhiman
I am an online content professional and blogger who shares useful information, resources, and advice to improve your life online. I publish only carefully chosen content, based on extensive research across thousands of tools, platforms, and resources. My goal is to solve the problems people run into online and to help you succeed at whatever you are trying to do, whether that is building a website, exploring digital opportunities, or making blogging something you enjoy.