In January 2011, Barbara Mellers was appointed as the 11th Penn Integrates Knowledge Professor. Mellers, a globally influential scholar of decision making, is the I. George Heyman University Professor. This appointment is shared between the Department of Psychology in the School of Arts and Sciences and the Department of Marketing in The Wharton School.
Mellers’ research examines the factors that influence judgments and decisions, including emotions, self-interest, past mistakes, sensitivities to risk and perceptions of fairness. She has authored nearly 100 articles and book chapters, co-edited two books and serves on numerous prestigious editorial boards. She served as president of the Society for Judgment and Decision Making, held a five-year National Science Foundation Presidential Young Investigator award and has received major research support from the NSF.
She earned a B.A. in psychology from the University of California, Berkeley, in 1974, followed by an M.A. (1978) and a Ph.D. (1981) in psychology from the University of Illinois at Urbana-Champaign.
Pavel Atanasov, J. Witkowski, Barbara Mellers, Philip Tetlock (Under Review), The person-situation debate revisited: Forecasting skill matters more than elicitation method.
Philip Tetlock, Lu Yunzi, Barbara Mellers (2022), False Dichotomy Alert: Improving Subjective-Probability Estimates vs. Raising Awareness of Systemic Risk, International Journal of Forecasting.
E. Karger, J.T Monrad, Barbara Mellers, Philip Tetlock (Under Review), Reciprocal scoring: A method for forecasting unanswerable questions.
Katherine L. Milkman, Dena Gromet, Hung Ho, Joseph S. Kay, Timothy W. Lee, Predrag Pandiloski, Yeji Park, Aneesh Rai, Max Bazerman, John Beshears, Lauri Bonacorsi, Colin Camerer, Edward Chang, Gretchen B. Chapman, Robert Cialdini, Hengchen Dai, Lauren Eskreis-Winkler, Ayelet Fishbach, James J. Gross, Samantha Horn, Alexa Hubbard, Steven J. Jones, Dean Karlan, Tim Kautz, Erika Kirgios, Joowon Klusowski, Ariella Kristal, Rahul Ladhania, George Loewenstein, Jens Ludwig, Barbara Mellers, Sendhil Mullainathan, Silvia Saccardo, Jann Spiess, Gaurav Suri, Joachim H. Talloen, Jamie Taxer, Yaacov Trope, Lyle Ungar, Kevin Volpp, Ashley Whillans, Jonathan Zinman, Angela Duckworth (2021), Megastudies Improve the Impact of Applied Behavioural Science, Nature, 600, pp. 478-483.
Abstract: Policy-makers are increasingly turning to behavioural science for insights about how to improve citizens’ decisions and outcomes. Typically, different scientists test different intervention ideas in different samples using different outcomes over different time intervals. The lack of comparability of such individual investigations limits their potential to inform policy. Here, to address this limitation and accelerate the pace of discovery, we introduce the megastudy—a massive field experiment in which the effects of many different interventions are compared in the same population on the same objectively measured outcome for the same duration. In a megastudy targeting physical exercise among 61,293 members of an American fitness chain, 30 scientists from 15 different US universities worked in small independent teams to design a total of 54 different four-week digital programmes (or interventions) encouraging exercise. We show that 45% of these interventions significantly increased weekly gym visits by 9% to 27%; the top-performing intervention offered microrewards for returning to the gym after a missed workout. Only 8% of interventions induced behaviour change that was significant and measurable after the four-week intervention. Conditioning on the 45% of interventions that increased exercise during the intervention, we detected carry-over effects that were proportionally similar to those measured in previous research. Forecasts by impartial judges failed to predict which interventions would be most effective, underscoring the value of testing many ideas at once and, therefore, the potential for megastudies to improve the evidentiary value of behavioural science.
Ike Silver, Barbara Mellers, Philip Tetlock (2021), Wise teamwork: Collective confidence calibration predicts the effectiveness of group discussion, Journal of Experimental Social Psychology.
Ville Satopää, Marat Salikhov, Philip Tetlock, Barbara Mellers (2021), Decomposing the Effects of Crowd-Wisdom Aggregators: The Bias-Information-Noise (BIN) Model, International Journal of Forecasting.
Christopher Karvetski, Carolyn Meinel, Daniel Maxwell, Lu Yunzi, Barbara Mellers, Philip Tetlock (2021), Forecasting the Accuracy of Forecasters from Properties of Forecasting Rationales, International Journal of Forecasting.
PSYC 253 Judgments and Decisions
PSYC 600 Proseminar in Judgments and Decisions
MDS 521 Judgments and Decisions
MKTG 211 Consumer Behavior
MKTG 960 Seminar in Consumer Behavior
This course is concerned with how and why people behave as consumers. Its goals are to: (1) provide a conceptual understanding of consumer behavior; (2) provide experience in applying buyer behavior concepts to marketing management decisions and social policy decision-making; and (3) develop analytical capability in using behavioral research.
MKTG2110002 (Syllabus)
The purpose of this course is to provide a solid foundation for critical thinking and research on the judgment, decision-making and choice aspects of consumer behavior. There is a focus on how people process information when making judgments and choices and how the processes of judgment and choice might be improved. Topics of discussion include rationality, judgment under uncertainty, judgment heuristics and biases, risk taking, dealing with conflicting values, framing effects, prospect theory, inter-temporal choice, preference formation, and the psychology of utility. The focus will be on the individual decision-maker, although the topics will also have some applicability to group and organizational decision-making and behavioral research methodologies.
MKTG9500301 (Syllabus)
Choice of half or full course units each semester, covering a range of subjects and approaches in academic psychology.
PSYC6000303 (Syllabus)
New Wharton research shows that when people are trying to solve problems, the most effective team discussions happen when participants know what they know – and what they don’t. (Knowledge at Wharton, 1/2/2020)