There’s no doubt that data has become big.
The amount of information that can be gathered and processed has expanded far beyond the reach of anything that could have been imagined before the development of digital technology. What’s more, the pace at which our data grows is itself accelerating beyond any practical measure.
Big Data is undeniably useful for large organizations seeking to manage systems of marketing and management that have become too complicated and vast for any human mind to comprehend. Artificial intelligence systems are sifting through data to bring astonishing efficiency and optimization.
The very power of Big Data, however, is creating new challenges, as systems of automated analytics are becoming so powerful that no human being can understand how they do what they do. The problem created when Facebook’s AI bots began speaking to each other in a language no human could immediately understand wasn’t that computers were plotting to take over the world. Rather, the incident was just the latest signal of a dangerous divide between data systems and the people who seek to profit from them.
As we gather value from things that we don’t understand, our sense of alienation grows. Increasing numbers of people work at companies where the human beings in management can’t fully explain how decisions are being made, as A/B tests deliver answers without explanations. Even as computers make increasingly accurate predictions, human beings are left ignorant of the reasons those predictions make sense. They receive information without understanding.
When corporate decisions are made in a black box, the sense of common purpose that once brought employees and consumers together around compelling brand identities fades. Even as efficiency grows, the cultural context that made the efficiency matter in the first place crumbles.
This crisis of disengagement in commercial culture can’t be blamed solely on the development of Big Data. The bigger problem is that the development of algorithmic information processing hasn’t been matched with an investment in methods of human insight.
Current discussions about research methods are dominated by quantitative standards. Research teams focus on finding and analyzing accurate information in order to solve problems that have been tightly defined to meet objective criteria. Sometimes, however, accuracy of information isn’t the point of research.
As the power of data has grown, the strength of qualitative research in the business world has atrophied. Under the focus on efficiency and optimization as measures of success, qualitative research has been pressured to become quicker, cheaper, and more standardized: a pale imitation of quantitative approaches. The output of qualitative researchers has thus grown thin at the very historical moment when it ought to be growing thick.
The idea of a thick approach to business is gaining traction as the thinning of corporate culture spreads. Tricia Wang has advocated the articulation of a theory of Thick Data, and the Manifesto of Beautiful Business urges professionals to embrace “thick presence rather than lean distribution.”
Thickness has negative connotations, to be sure. Thick texts use language that is difficult to penetrate. A thick person is unable to understand even the simplest truths.
More positively, thickness has a heft to it. The use of the term “thick” among researchers can be traced back to Clifford Geertz, who advocated “thick description”: close attention to the multiplicity of meanings contained even within simple gestures. A thick approach to research has a narrower breadth than what quantitative approaches can achieve, but it penetrates deeply wherever it looks, elaborating the full context of an experience that a data-driven approach would reduce to a single number.
Qualitative research serves a different purpose than the quantitative methods of Big Data: While quantitative research finds its power through strict definitions of objectivity, the purpose of qualitative research is to explore the influence of subjectivity. What’s more, the best qualitative research seeks to involve members of research teams in that subjectivity, so that, rather than remaining removed from the research process and unmoved by it, they are transformed by it. Qualitative research methods become more powerful the more that researchers become personally involved in them.
Qualitative research is worth doing not just because it explores the thickness of human experience, but because it is itself a form of thick experience. The purpose of research, after all, isn’t merely to gain information. Research is conducted in order to create change. The most important kind of change takes place for the people involved in the research process.
Qualitative research is, in its highest form, a ritual experience that separates members of the research team from the restrictions of their ordinary lives, disorients them from their habitual ways of thinking, and exposes them to a rich display of unfamiliar symbolic meaning. Truly in-depth qualitative methods immerse researchers in situations that are thick with significance. They require researchers to undergo ordeals of suffering, and demand the sacrifice of cherished assumptions about the world. In exchange, however, they provide researchers with the boon of insight, from which fruitfully disruptive innovations can be constructed.
The kind of transformation that results from a thick research experience goes far beyond what we traditionally think of as learning. Recent research from the Basque Centre on Cognition, Brain, and Language, for example, found that as two people enter into conversation, their brainwaves gradually move into synchrony with each other. Thus, in-depth, one-on-one interviews of the sort conducted by dedicated qualitative researchers build a kind of intuitive empathy with their subjects that a quantitative survey can’t deliver. The more the thick interactions of qualitative research are replaced with chatbots and data-mining, the less emotionally engaged consumers will be with the brands that seek their attention.
The risk of confidence in quantitative data is that, while it is mathematically reliable, it is culturally disconnected. Numbers can be manipulated with confidence because they are simple, but the qualitative objects and experiences that those numbers represent are rarely so simple. While a number represents a definite value, qualitative concepts are always deeper than they at first appear. Even the simplest of qualitative ideas carries with it a historically rooted and culturally specific constellation of associations, many of which are merely hinted at in typical conversation, and some of which rarely emerge into conscious awareness at all. It’s easy to predict what will happen when you put two numbers together. When you put two symbols together, though, something unexpected is likely to take place.
Neither the quantitative perspective nor the qualitative perspective is superior to the other, but each has its proper uses, and excessive reliance on either leads to disaster. Qualitative research without quantitative testing can lead to business strategy derived more from bias than from true opportunity. Quantitative analysis without qualitative grounding, on the other hand, can make businesses exceptionally precise without having any idea what they’re being precise about.
The best approach to research isn’t in one direction or another, but in balance, thick and thin.