Not all analysts are alike. So how do you spot a good one?

Dictionary.com defines analyst succinctly as "a person who analyzes or who is skilled in analysis." For me, a good analyst is able to separate the signal from the noise, and also knows what kinds of signals to look for as well as where, when, how and why to look for them. It's more than number-crunching, as one can have excellent math and computer programming skills and still be a lousy analyst. It does require critical thinking, which Dictionary.com defines as "disciplined thinking that is clear, rational, open-minded, and informed by evidence." Without doubt, critical thinking is necessary in order to be a good analyst, but particular skills and experience are also required.

What are some of these skills? Naturally, in marketing research, knowledge of marketing is crucial, and this, plus some background in the social and behavioral sciences, can distinguish a marketing scientist from a statistician. Perhaps it's like the difference between a pitcher and a thrower in baseball. I feel analysts should also look outside their own specialization for ideas and techniques from other fields. Software skills of many kinds, beyond those required of a typical businessperson, are also needed but, like mathematics, these are means and not ends. Interpersonal and communication skills are necessary for most positions these days and, increasingly, are even being asked of robots, so, like critical thinking, they are necessary but not sufficient.

It's hard to overrate the value of experience, especially in a field such as statistics.  Formal university-level stats training lays the necessary foundation but there is a great deal of material and little time, so that foundation can be rather thin.  Moreover, in these classes, the examples shown seldom bear much resemblance to real-world marketing research problems, which come in a wide assortment of sometimes odd flavors. The University of Hard Knocks is where the lasting lessons are taught.

Personality and outlook are also important. A good analyst should be a curious sort and have a hunger to understand what makes things tick. They need to move fluidly between the abstract and the concrete and make connections between the two that are understandable and pertinent to their clients. They must avoid oversimplifying while, at the same time, aiming for simplicity. Tolerance for ambiguity, obviously, is helpful! Integrating various sorts of data and information into decisions - including expert opinion - is increasingly both feasible and obligatory for decision makers, and analysts should not be overly focused on a single data source, data type or analytic method. Curiosity will come in handy here too.

What about imagination and creativity?  Both are indispensable, I feel.  However, good analysts also need to be wary of falling into the trap of assuming that an exotic theory is necessarily a valid one.  Simple answers, even if boring, are more likely to be true than elaborate ones.  As Nicolaus Copernicus put it, "We thus follow Nature, who producing nothing in vain or superfluous often prefers to endow one cause with many effects." They also avoid confusing the possible with the plausible and the plausible with fact.

A good analyst must also be able to deal with complexity and uncertainty, however, and be able to think in terms of conditional probabilities: for example, if A and B happen, what are the probabilities of C and/or D happening?  While it is true that we all must sometimes make quick go/no go decisions, analysts need to step back and figure out how the various pieces of the puzzle fit together and the implications for decision-makers.  Being able to do these things is one of the reasons they're hired in the first place!
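
To make the conditional-probability idea concrete, here is a minimal sketch in Python. The events (an email open, a click-through, a purchase) and the figures are entirely hypothetical, not taken from any study; the point is simply the chain of conditioning the paragraph describes.

    # Hypothetical funnel: A = opens the email, B = clicks through, C = purchases
    p_a = 0.60           # P(A), an assumed open rate
    p_b_given_a = 0.30   # P(B | A), click-through given an open
    p_c_given_ab = 0.10  # P(C | A and B), purchase given open and click

    # Chain rule: P(A and B and C) = P(A) * P(B | A) * P(C | A and B)
    p_abc = p_a * p_b_given_a * p_c_given_ab
    print(f"P(open, click and purchase) = {p_abc:.3f}")  # 0.018 with these assumed figures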

Similarly, they need to keep an open mind but avoid trying to be different just to be different, which is another flavor of conformity anyhow. Analysts do need to be quite risk-tolerant, since they are often under pressure to try out a new idea quickly when more familiar approaches don't work for the problem at hand. Yet, at the same time, good analysts must pay attention to detail and be able to endure the drudgery that still consumes much of their time. Being a good analyst is like a tightrope walk...

Here are a few more thoughts on what I think makes a good analyst:

  • They are careful about making assumptions (especially that they have been clearly understood!) and try to put themselves in the shoes of the eventual users of their analyses. They are hard on themselves: "Am I really asking the right questions and do I really have the right answers?" They know that even the fanciest analytics can't save them from shoddy thinking.
  • They never assume their data are clean or that the data they have are all they'll need for the project they're working on.
  • When working cross-culturally, assumptions can be especially dangerous, and we all should be aware that TV, the World Wide Web and Beyoncé have not made everyone on this planet the same.
  • Sampling is a technical subject that many marketing researchers do not enthuse over, yet I am unaware of any type of research, including qualitative and Big Data analytics, for which it is irrelevant. Though this may come as a surprise to some of you, quite a few self-described data scientists have little understanding of sampling and, in my opinion, this affects the quality of their work. (A small illustration follows this list.)
  • Experimental designs and causal analysis are huge topics and have relevance to qualitative researchers as well as to quant guys like me. This is another weak area of data science, which tends to stress data management and predictive analytics (e.g., for cross-selling). Though causation is very difficult to prove - some would say impossible - marketers usually need insights into why consumers behave the way they do. Simply being able to predict their behavior is seldom enough.
  • A good analyst will be aware of the risks of data dredging and know that if they indulge in it they might unearth a land mine instead of precious metal. The Improbability Principle, by David Hand, explains in some detail why our data can blow up on us. Hand is an emeritus professor at Imperial College and past president of the Royal Statistical Society, and I find this book helpful even when reading a newspaper or watching a documentary on TV. (A quick simulation of dredging appears after this list.)
  • They will be on the lookout for confirmation bias. It's quite natural to search for, interpret or recall information in a way that confirms our beliefs.
  • They recognize that statistical models are simplified representations of reality: "Essentially, all models are wrong, but some are useful," in the famous words of statistician George Box. Because of this, it's not at all uncommon for two or more models to fit the data equally well yet still be inadequate, or to suggest different interpretations and courses of action. It's rare that good analytics can be done just "by the numbers."
  • How the model is built is part of the model. Mechanistic approaches are confining, and good analysts don't let point-and-click software do their thinking for them. That can be hazardous to the health of their careers as well as bad for their clients' business. Likewise, they don't stick blindly to the default settings of statistical or machine learning software. It's not only that these settings may not be "best"; they might be completely wrong for the task.
  • A good analyst has many tools in the toolbox and frequently uses them in combination rather than sticking with the one they are most comfortable with. Client needs and the data drive their selection of tools, not their personal preference or commercial interests.
  • The following observation by Richard McElreath in his book Statistical Rethinking will make perfect sense to a good analyst: "...statisticians do not in general exactly agree on how to analyze anything but the simplest of problems. The fact that statistical inference uses mathematics does not imply that there is only one reasonable or useful way to conduct an analysis. Engineering uses math as well, but there are many ways to build a bridge."
  • Though it's now fashionable to bash p-values and Null Hypothesis Significance Testing (NHST), I feel each has its place. They arose from the need to discipline ourselves and to minimize subjectivity (and politics!) in decision-making. Identifying the core business issues is an essential first step, and NHST provides one framework analysts can use for this. Basing decisions on significance tests alone, however, is unwise, and the pioneering (if prickly) statistician R. A. Fisher counseled against this practice. A good analyst understands this. (The last sketch after this list touches on why.)
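
As a small illustration of the sampling point above, here is a hypothetical Python sketch. The population, the "convenience" mechanism and all figures are invented for the example; the only point is that how you sample can matter more than how much data you collect.

    import numpy as np

    rng = np.random.default_rng(7)

    # Invented population of monthly spend, skewed the way spend data usually are
    population_spend = rng.lognormal(mean=3.0, sigma=0.8, size=100_000)

    # Simple random sample of 500 customers
    srs = rng.choice(population_spend, size=500, replace=False)

    # "Convenience" sample in which heavy spenders are more likely to be reached
    weights = population_spend / population_spend.sum()
    convenience = rng.choice(population_spend, size=500, p=weights)

    print(f"True mean spend:      {population_spend.mean():.1f}")
    print(f"Simple random sample: {srs.mean():.1f}")          # close to the truth, on average
    print(f"Convenience sample:   {convenience.mean():.1f}")  # typically biased upward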
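
Next, a quick, hypothetical simulation of data dredging: screen a pile of purely random "predictors" against a purely random outcome and count how many look "significant" at the conventional 0.05 level. None of the variables mean anything, yet a handful will appear to.

    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(42)
    n_respondents, n_predictors = 200, 100

    outcome = rng.normal(size=n_respondents)                      # pure noise
    predictors = rng.normal(size=(n_respondents, n_predictors))   # more pure noise

    false_finds = 0
    for j in range(n_predictors):
        r, p = pearsonr(predictors[:, j], outcome)  # correlation and its p-value
        if p < 0.05:
            false_finds += 1

    print(f"'Significant' correlations found in pure noise: {false_finds} of {n_predictors}")
    # Expect roughly 5 by chance alone -- land mines, not precious metal.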
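
Finally, a minimal sketch of why significance tests alone shouldn't drive decisions. The two "ad concepts" and their scores are simulated; the difference between them is deliberately tiny, yet with a large enough sample a t-test will tend to flag it as statistically significant. The effect size, not just the p-value, has to enter the decision.

    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(1)

    # Simulated 0-10 purchase-intent scores for two ad concepts (made-up numbers)
    concept_a = rng.normal(loc=7.00, scale=1.5, size=20_000)
    concept_b = rng.normal(loc=7.05, scale=1.5, size=20_000)

    t_stat, p_value = ttest_ind(concept_a, concept_b)
    diff = concept_b.mean() - concept_a.mean()

    print(f"p-value:         {p_value:.4f}")            # likely "significant" at this sample size
    print(f"Mean difference: {diff:.2f} scale points")  # around 0.05 -- commercially trivial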

I'm writing as a marketing researcher mainly focused on quantitative research, but I feel many of these skills and the general mindset I've described also apply to qualitative marketing researchers, as well as to analysts in other disciplines. Not everyone will agree with everything I've written, of course, and what I've left out for space reasons or simply because it didn't occur to me would fill many blogs, I'm sure! This has just been a quick snapshot of a complicated subject and I hope you have found it interesting and helpful.

Notes:

  1. Though their framework is not universally accepted among scholars, Geert Hofstede and his colleagues present some useful perspectives on how businesspeople can deal with cross-cultural differences.
  2. If you have an interest in sampling and want to go beyond what is covered in introductory textbooks on marketing research and statistics, Sharon Lohr has written a book entitled Sampling: Design and Analysis that is detailed but not dreary.
  3. Experimental and Quasi-Experimental Designs (Shadish et al.) is one book I wish all marketing researchers would read or at least have on their shelf as a reference. If you're very ambitious and mathematically inclined, Judea Pearl's writings may also be of interest. Pearl has challenged much of statisticians' conventional wisdom regarding causal analysis.

By Kevin Gray, Cannon Gray.

http://www.kdnuggets.com/2017/04/gray-makes-good-analyst.html