May 18, 2011

Social influences kill the wisdom of the crowd

By John Timmer

The "wisdom of the crowd" has become a bit of a pop cliché, but it's backed up by real-world evidence. When groups of people are asked to provide estimates of obscure information, the median value of their answers will often be remarkably close to the right one, even though many of their answers are laughably wrong. But crowds rarely act in the absence of social influences, and some researchers in Zurich have now shown that providing individuals information about what their fellow crowd-members are thinking is enough to wipe out the crowd's wisdom.

As the authors of the paper point out, the wisdom of a crowd is actually a statistical phenomenon. Many people have some very rough sense of a given value—in this case, things like the length of Switzerland's border with Italy or its murder rate—but don't know the precise figure. As long as their answers are just as likely to be above or below the actual value, the mis-estimations should cancel each other out. "The wisdom of crowds effect works if estimation errors of individuals are large but unbiased such that they cancel each other out," as the authors put it. That places the mean of the answers somewhere in the neighborhood of the correct one. In some cases, the crowd will actually do better than a group of experts.
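
The cancellation argument is easy to see in a simulation. Below is a minimal Python sketch, not from the paper; the true value, noise level, and crowd size are invented purely for illustration. Each simulated panelist makes a large but unbiased error, and the crowd's aggregate still lands much closer to the truth than a typical individual does.

```python
# Minimal sketch (illustrative numbers only): large but unbiased individual
# errors largely cancel out when the guesses are aggregated.
import random
import statistics

random.seed(1)

TRUE_VALUE = 734.0      # hypothetical quantity, e.g. a border length in km
CROWD_SIZE = 1000

# Each guess is noisy but symmetric around the true value (unbiased).
guesses = [TRUE_VALUE + random.gauss(0, 300) for _ in range(CROWD_SIZE)]

typical_individual_error = statistics.median(abs(g - TRUE_VALUE) for g in guesses)
crowd_error = abs(statistics.median(guesses) - TRUE_VALUE)

print(f"typical individual error: {typical_individual_error:.1f}")
print(f"crowd (median) error:     {crowd_error:.1f}")
```

The median is used as the aggregate here because it is robust to the handful of laughably wrong answers mentioned above; with unbiased errors, the mean behaves similarly.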

The authors, however, realized that most decision making takes place in a social environment—people talk among themselves, compare answers, and get various forms of feedback. So, they decided to test how these processes might influence the wisdom of a crowd. To do so, they asked panels of a dozen people the same question five times, allowing the participants to change their answers each time. To make sure the answers were serious attempts at getting it right, financial rewards were offered based on the accuracy of the guesses.

In the control case, the one where the crowd was most likely to be wise, the participants were simply reminded of their previous answers before being given the question again. In the other cases, individuals were given the chance to see what the rest of the panel was thinking. Before re-asking the same question, the participants were given either the aggregate of the answers provided by their fellow panelists, or the individual answers provided by each of the panel members. This let everyone adjust their next answer based on the answers provided by the rest of the panel.

Compared to the control setup, the additional information changed the crowd's collective behavior dramatically. In what the authors term the "social influence effect," the panels that were provided with information about their peers quickly narrowed their focus onto a fairly limited set of values, meaning the diversity of their answers decreased. In contrast, the control group retained its initial diversity throughout the repeated rounds of questioning.

Worse still, the panels that were provided with social information narrowed in on answers that were more likely to be wrong. The "range reduction effect" occurred when the estimates converged on a region farther from the true value, pushing the correct answer to the periphery of the range of estimates provided by the participants. This becomes a problem if people are trying to get a broad estimate of the correct value; the range that the social panels produce will be centered on the wrong value, and might be so distant from the correct one that it's excluded entirely.
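
A toy simulation makes both effects easier to picture. The sketch below is not the authors' model; the panel size is borrowed from the article, but the true value, the initial bias, and the "move halfway toward the group mean" update rule are assumptions chosen only to illustrate the mechanism. A small panel starts out noisy and collectively off-target; each round of aggregate feedback shrinks the spread of answers without making the group any more accurate, and the true value can end up outside the narrowed range entirely.

```python
# Toy model (not the study's): repeated aggregate feedback shrinks the
# diversity of answers without improving the group's accuracy.
import random
import statistics

random.seed(3)

TRUE_VALUE = 734.0   # hypothetical quantity the panel is estimating
PANEL_SIZE = 12
ROUNDS = 5
PULL = 0.5           # fraction of the gap to the group mean closed each round

# Give the initial guesses a collective bias, as small panels often have.
guesses = [TRUE_VALUE * 1.4 + random.gauss(0, 250) for _ in range(PANEL_SIZE)]

for r in range(1, ROUNDS + 1):
    spread = statistics.stdev(guesses)
    median_error = abs(statistics.median(guesses) - TRUE_VALUE)
    covers_truth = min(guesses) <= TRUE_VALUE <= max(guesses)
    print(f"round {r}: spread={spread:7.1f}  median error={median_error:6.1f}  "
          f"range contains true value: {covers_truth}")
    panel_mean = statistics.mean(guesses)
    # Social influence: every panelist revises toward the group's aggregate.
    guesses = [g + PULL * (panel_mean - g) for g in guesses]
```

In this toy model, everyone moves toward the same mean, so the spread of answers drops by half each round while the group's central estimate barely moves: agreement accumulates around a value that never became any more accurate.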

These two effects concern the statistical distribution of the answers provided by the crowds. The authors, however, also detected a purely social influence, which they termed the "confidence effect." As the range narrows and more of the participants find themselves close to the typical answer of their fellow panelists, their confidence that their own answer is roughly correct goes up. In other words, when someone sees that the rest of the crowd is giving an answer close to their own, it gives them greater confidence that their answer is likely to be right.

How important are these effects? Although popular culture has adopted the phrase "wisdom of the crowd" to apply to anything that involves more than a handful of people, the original description of a crowd's wisdom made it clear that it only applied to a limited number of question types and circumstances. So, finding additional limits really shouldn't be much of a surprise. It does, however, serve as an added caution that, just because there's a crowd involved, we shouldn't assume that anything that comes out of the crowd is wise. As the authors note, we seem to have a tendency to make exactly that assumption. "Opinion polls and the mass media largely promote information feedback and therefore trigger convergence of how we judge the facts," they conclude. "The wisdom of crowd effect is valuable for society, but using it multiple times creates collective overconfidence in possibly false beliefs."
