
When Confidence Limits Attention: New Evidence on Belief Updating

Paying attention to information is essential for good decisions, from personal finance to public policy. A new study shows that people often ignore useful information not only because it is costly to process, but also because they are too confident in what they already believe.

The paper, “Overprecision and (Ir)rational Inattention”, by Ciril Bosch-Rosa, Muhammed Bulutay, and Bernhard Kassner, asks a simple but important question: does overconfidence distort how much attention people pay to new information? The authors focus on “overprecision,” a form of overconfidence in which people overestimate how accurate their own beliefs are.

The study matters because economists often explain inattention in informationally rich environments as “rational”: people choose not to process all available information because doing so takes time and effort, and the benefit may not be worth the cost. The new research challenges this view by showing that part of inattention reflects biased confidence rather than sensible cost-benefit trade-offs.

The authors begin by developing a formal model of belief updating. In this model, people combine what they already believe with new signals, such as fresh data or news. If they think their prior beliefs are very precise, they see less value in new information and therefore update less. The model predicts that higher information costs reduce updating, that higher overprecision reduces updating, and that the two reinforce each other: at any given information cost, overprecise individuals cut back on updating even more.
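The updating logic can be illustrated with a minimal sketch, assuming standard precision-weighted (inverse-variance) Bayesian updating; the function and variable names below are illustrative and not taken from the paper:

```python
def update_belief(prior, signal, tau_prior, tau_signal):
    """Combine a prior belief with a new signal, weighting each by its
    precision (inverse variance). The more precise the prior is perceived
    to be, the less weight the signal receives."""
    w_signal = tau_signal / (tau_prior + tau_signal)
    return prior + w_signal * (signal - prior)

# A calibrated person and an overprecise person see the same signal.
prior, signal, tau_signal = 40.0, 50.0, 1.0
calibrated = update_belief(prior, signal, tau_prior=1.0, tau_signal=tau_signal)   # 45.0
overprecise = update_belief(prior, signal, tau_prior=4.0, tau_signal=tau_signal)  # 42.0
```

Because the overprecise person overstates the precision of their prior (here, 4.0 instead of 1.0), they move only 2 units toward the signal, while the calibrated person moves 5.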

The experiment

To test these ideas, the researchers ran a large online experiment with a representative sample of adults in Germany. Participants were shown historical photographs of groups of people and asked to estimate the average age of those in the picture. They also reported how large they expected their error to be. This allowed the researchers to measure overprecision as the gap between expected and actual errors.
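The overprecision measure described above can be sketched as a one-line computation; the numbers and names here are hypothetical, chosen only to illustrate the gap between expected and actual errors:

```python
def overprecision(expected_error, actual_error):
    """Gap between the error a participant actually makes and the error
    they expected to make. Positive values mean the participant expected
    to be more accurate than they turned out to be."""
    return actual_error - expected_error

# A participant expects to be off by 2 years but is off by 6.
score = overprecision(expected_error=2.0, actual_error=6.0)  # 4.0
```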

In a second stage, participants were shown partial information about the ages in each picture. They could then revise their initial estimates. For some participants, the information was easy to read. For others, it was made harder by adding many irrelevant words, increasing processing difficulty without changing the quality of the information.

What the researchers found

People who were more overprecise updated their beliefs less when they saw new information. In other words, those who thought their initial estimates were very accurate were less responsive to evidence that could improve them.

The information-cost treatment behaved as expected. When the information was harder to process, participants adjusted their beliefs less, made larger errors, and spent more time looking at the information. This confirms that higher processing costs reduce attention and updating.

A key question was whether overprecision and information costs interact. The authors predicted that overconfident people would be especially inattentive when information is hard to process. The main, pre-registered analysis did not find this interaction. However, a more detailed follow-up analysis showed that overprecise participants were indeed less responsive to new information when processing costs were high.

This pattern supports the idea of “irrational inattention.” Unlike standard rational inattention, which reflects a sensible response to high informational costs, irrational inattention arises because people misjudge how accurate their existing beliefs are. As a result, they undervalue new information and allocate too little attention to it.

The study has important implications beyond the laboratory. In many real-world settings, people face complex information, from financial advice to health warnings. If they are overconfident in what they already know, making information cheaper or more visible may not be enough to change their behavior.

The findings also matter for economic modeling. They suggest that attention is shaped not only by external costs but also by internal biases. Ignoring this can lead to an overly optimistic view of how people respond to incentives and information.

Finally, the research shows that feedback can reduce overprecision, at least in the short run. This points to a possible role for targeted interventions that correct biased confidence, alongside policies that reduce information complexity.

In sum, the study highlights that inattention is not always rational. Overconfidence can make people ignore valuable information, even when it is easy to access. Recognizing this “irrational inattention” is crucial for designing better policies, better communication, and better economic models.

To the Study


About the Authors

Ciril Bosch-Rosa
Research Assistant in Macroeconomics at Technische Universität Berlin and researcher at the Berlin School of Economics. His work focuses on experimental economics, behavioral economics, surveys, finance and banking.

Muhammed Bulutay
Post-doctoral researcher at the Alfred Weber Institute for Economics at Heidelberg University. His research interests include behavioral economics, information economics, and monetary policy.

Bernhard Kassner
Economist, previously at Ludwig-Maximilians-Universität München. His research interests lie at the intersection of behavioral economics, financial market regulation and public finance, as well as competition economics.