Source: UNIVERSITY OF ARKANSAS submitted to NRP
UNDERSTANDING NUTRITION THROUGH MACHINE VISION
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
ACTIVE
Funding Source
Reporting Frequency
Annual
Accession No.
1032227
Grant No.
2024-67023-42545
Cumulative Award Amt.
$648,358.00
Proposal No.
2023-10720
Multistate No.
(N/A)
Project Start Date
Aug 1, 2024
Project End Date
Jul 31, 2027
Grant Year
2024
Program Code
[A1641] - Agriculture Economics and Rural Communities: Markets and Trade
Recipient Organization
UNIVERSITY OF ARKANSAS
(N/A)
FAYETTEVILLE, AR 72703
Performing Department
(N/A)
Non Technical Summary
Healthy food choices are central to nutritional security, but these choices require individuals to strike a complex balance between present and future interests. Policies to encourage healthier eating must therefore consider the food characteristics to which people give greater attention when making these intertemporal decisions and the psychological mechanisms through which those characteristics influence the decisions. In this project, we propose to study emotion as one such psychological mechanism. We propose to utilize cutting-edge machine vision and artificial intelligence technologies to evaluate the role of attention and emotion in food choice. These technologies use cameras to track subjects through a physical space, measure attention through eye-tracking (what information is currently being processed), and measure emotion through facial micro-expressions (how that information affects a subject's emotional state). We will investigate the role emotions play in food-choice decisions. We are also interested in how the relationship between emotion, attention, and food choice is affected by whether the food-choice environment is digital (e.g., online shopping) or physical (e.g., a grocery store). We will use both a computer laboratory and a more naturalistic, simulated shopping experience to test these questions.
Animal Health Component
33%
Research Effort Categories
Basic
34%
Applied
33%
Developmental
33%
Classification

Knowledge Area (KA)   Subject of Investigation (SOI)   Field of Science (FOS)   Percent
607                   5010                             3010                     50%
609                   6010                             3010                     50%
Goals / Objectives
We propose to utilize cutting-edge machine vision and artificial intelligence technologies to evaluate the role of attention and emotion in food choice. These technologies use cameras to track subjects through a physical space, measure attention through eye-tracking, and measure emotion through facial micro-expressions. We propose three key research questions: 1) How do emotions mediate food-choice decisions? 2) How well do emotion and attention predict food choices in a natural environment? 3) How does emotion interact with context to influence food choice? Our proposed design evaluates these questions in two phases: first in a computer laboratory and subsequently in a more naturalistic, simulated shopping experience. We aim to identify attention and emotion variables that mediate the relationship between nutritional attributes and food choice. We will also observe novel details about food-choice behaviors by combining advanced technology with our controlled, yet naturalistic, shopping experience. We further propose to explore how measures of emotion predict heterogeneity in responses to randomized interventions. Finally, we will evaluate the role of emotion in determining how behavior is influenced by the food-choice environment.
Project Methods
Computer Science Methods: Our project will take existing computer vision algorithms that detect facial micro-expressions and track eye gaze and tailor them to a food-choice environment.

Experimental Methods to Elicit Food Choice: Subjects will participate in food-choice experiments in which nutrition interventions (such as information, context, or price) are randomly varied. The randomization methodology is straightforward: a predetermined fraction of subjects will receive each intervention, and these subgroups will be assigned randomly. These food-choice experiments will take place digitally in front of a computer and in person in the aisles of a simulated grocery store.

Statistical Methods to Analyze Data: We will use mediation analysis to determine the role that emotion plays in determining how food choices respond to food characteristics. Through this process, we will determine the "total effect" that food characteristics have on food choices and decompose it into a "direct effect" and an "indirect effect" that is mediated through emotion, following the mediation analysis of Baron and Kenny (1986). We will perform this analysis for a suite of food characteristics and six emotions. Specifically, for each food characteristic, we will identify the emotions that mediate its effect on food choice. In this way, we can separately categorize food characteristics based on the emotions most closely associated with them. For each emotion, we can establish its impact on food choice through the regression coefficient. We can transform this slope into an elasticity by rescaling it using the mean level of each emotion and the mean of food choices. We will estimate the causal impacts of our experimentally randomized interventions through a straightforward comparison of means. We will estimate the role emotion plays in mediating these treatment effects by including interaction terms to determine whether the effectiveness of interventions is predicted by the emotions triggered.
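The Baron and Kenny (1986) decomposition described above can be sketched in a few regressions. The following is a minimal illustration on simulated data; the variable names, data-generating coefficients, and the use of statsmodels are illustrative assumptions, not the project's actual data or code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 500

# Simulated data: a randomized food characteristic (e.g., a nutrition label),
# an emotion score (the mediator), and a food-choice outcome. The true
# coefficients here are arbitrary, chosen only for illustration.
characteristic = rng.integers(0, 2, n)
emotion = 0.5 * characteristic + rng.normal(0, 1, n)
choice = 0.3 * characteristic + 0.4 * emotion + rng.normal(0, 1, n)
df = pd.DataFrame({"characteristic": characteristic,
                   "emotion": emotion,
                   "choice": choice})

# Baron & Kenny steps:
# (1) total effect of the characteristic on choice
total = smf.ols("choice ~ characteristic", df).fit()
# (2) effect of the characteristic on the mediator (emotion)
med = smf.ols("emotion ~ characteristic", df).fit()
# (3) direct effect, controlling for the mediator
direct = smf.ols("choice ~ characteristic + emotion", df).fit()

total_effect = total.params["characteristic"]
direct_effect = direct.params["characteristic"]
indirect_effect = med.params["characteristic"] * direct.params["emotion"]

print(f"total    = {total_effect:.3f}")
print(f"direct   = {direct_effect:.3f}")
print(f"indirect = {indirect_effect:.3f}")

# Elasticity: rescale the emotion slope by mean emotion and mean choice
elasticity = direct.params["emotion"] * df["emotion"].mean() / df["choice"].mean()
print(f"emotion elasticity of choice = {elasticity:.3f}")
```

In a linear model with a single mediator, the total effect equals the direct effect plus the indirect effect exactly, which provides a useful internal check on the decomposition.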

Progress 08/01/24 to 07/31/25

Outputs
Target Audience: Nothing Reported

Changes/Problems: Nothing Reported

What opportunities for training and professional development has the project provided? Our computer science graduate student applied to a conference to present the progress he made on the expression-detection model for this project. Unfortunately, he was not accepted, but the experience of assembling a competitive application will help him in the future. Our economics graduate student recently joined the project and has been improving his programming skills in order to contribute to the development of our survey tools.

How have the results been disseminated to communities of interest? Nothing Reported

What do you plan to do during the next reporting period to accomplish the goals? Our primary goals moving forward are (in order): 1) to integrate and test the eye-gaze and food-choice procedures; 2) to continue to pilot this integration using human subjects; 3) to perform the analysis of the pilot data to prepare our data-cleaning and data-analysis procedures; 4) to capture data from our surveys that we can use to calibrate the expression-detection model; 5) to pilot the emotion-induction tasks to train and calibrate our expression-detection model; and 6) to pilot the food-choice tasks with both eye-gaze and expression-detection capture. We are optimistic that we can accomplish tasks 1-3 in the Fall 2025 semester and may be able to accomplish task 4 in that semester as well. Concretely, PI Luu and a graduate assistant have been working on programming our software. PIs Brownback and McFadden are developing the food-choice surveys, while PI Li has been developing the emotion-induction procedures. These efforts will be integrated as they mature throughout the Summer and Fall 2025 semesters.

Impacts
What was accomplished under these goals? Our focus during this period has been on developing the software infrastructure required to test our key questions. There are two primary goals for this software development:

1) Programming the eye-gaze tracking to understand focus and attention. This must be carefully calibrated to ensure data quality, must integrate with our food-choice surveys, and must output data that are interpretable and analyzable.

2) Training the expression-detection model to evaluate subjects' responses to nutrition information. This requires developing the framework of the model, which must then be calibrated using reinforcement learning based on data from human subjects.

For objective #1, we have programmed eye-gaze tracking software complete with a calibration procedure, an integration with our survey software, and a CSV output file that can be analyzed with any common statistical software. Additionally, we have prototyped our food-choice survey and are preparing to test how well our eye-gaze tracking software captures attention to different pieces of information on the survey screen. In particular, we are concerned about the precision of our software because of the potential for excessive measurement error.

For objective #2, we have programmed the framework of the expression-detection model and have integrated it into the survey software. However, we have made limited progress on the calibration step, as it must wait for our survey infrastructure to be fully developed and launched to collect the expressions of human subjects. We have designed an emotion-induction task that we plan to use as a second source of variation in emotion (and, consequently, expression). We hope this will assist in calibrating our expression-detection model without relying on subjective human feedback.
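A CSV of gaze samples like the one the tracker outputs can be summarized into attention measures by mapping each gaze point to a screen region. The sketch below is purely illustrative: the column names ("timestamp_ms", "x", "y"), the area-of-interest layout, and the pixel coordinates are all assumptions, not the project's actual output schema.

```python
import pandas as pd

# Hypothetical areas of interest (AOIs) on a food-choice survey screen,
# given as (x0, y0, x1, y1) pixel rectangles. Layout is invented for this sketch.
AOIS = {
    "nutrition_label": (0, 0, 400, 300),
    "price":           (0, 300, 400, 600),
    "product_image":   (400, 0, 1200, 600),
}

def classify_aoi(x, y):
    """Return the name of the AOI containing the gaze point, or None."""
    for name, (x0, y0, x1, y1) in AOIS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def dwell_times(gaze: pd.DataFrame) -> pd.Series:
    """Total milliseconds of gaze per AOI, attributing each inter-sample
    gap to the AOI of the current sample."""
    gaze = gaze.sort_values("timestamp_ms").copy()
    gaze["aoi"] = [classify_aoi(x, y) for x, y in zip(gaze["x"], gaze["y"])]
    gaze["dt"] = gaze["timestamp_ms"].diff().fillna(0)
    return gaze.groupby("aoi")["dt"].sum()

# A few synthetic samples at roughly 60 Hz (16 ms apart):
samples = pd.DataFrame({
    "timestamp_ms": [0, 16, 32, 48, 64],
    "x": [100, 120, 500, 520, 110],
    "y": [100, 110, 200, 210, 400],
})
print(dwell_times(samples))
```

In practice the same per-AOI dwell times would be merged with the survey's choice data, giving one attention measure per piece of on-screen information per subject.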

Publications