Quantifying the Person, the Environment, and Their Interaction
The ongoing work of Marc Berman and his Environmental Neuroscience Lab (ENL) examines the external environment by combining human behavioral data with computational models and methods, including machine learning and edge sensing, to study how social and physical surroundings affect behavior. Their goal is to more fully understand which features of the physical environment (e.g., natural fractals) lead to improvements in memory and attention, as well as to identify other manipulations that increase brain efficiency.
Among the ENL’s current projects, a primary focus is understanding how different qualities of physical spaces shape interactions between individuals in different neighborhoods, illuminating factors that contribute to positive or negative social interactions. Berman and his team are working to develop edge-based sensors that can quantify the quality of social interactions in real-world settings. To investigate how individuals judge the quality of social interactions, study participants view short videos of social interactions while moving a continuous slider to reflect how well they believe the interaction is going. By analyzing these rating data with a feature-based approach, and then training an interpretable AI model on those features, Berman and his lab can begin to understand how humans assess the quality of social interactions.
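As an illustrative sketch of the kind of feature-based analysis described above, one could fit an interpretable model (here, ordinary least squares, so each learned weight is directly readable) to continuous slider ratings. The features and values below are hypothetical stand-ins, not the lab's actual data or method:

```python
import numpy as np

# Hypothetical per-clip features (e.g., smiling, interruptions, mutual gaze)
# and the mean slider rating each clip received; all values are illustrative.
rng = np.random.default_rng(0)
features = rng.random((60, 3))             # 60 clips x 3 coded features
true_weights = np.array([0.7, -0.5, 0.3])  # assumed ground-truth influences
ratings = features @ true_weights + rng.normal(0, 0.05, 60)

# Interpretable model: ordinary least squares, so each recovered weight
# indicates how a feature pushes perceived interaction quality up or down.
X = np.column_stack([features, np.ones(60)])  # add an intercept column
weights, *_ = np.linalg.lstsq(X, ratings, rcond=None)
print(weights[:3])  # close to the assumed weights [0.7, -0.5, 0.3]
```

Because the model is a transparent weighted sum, the sign and magnitude of each weight can be read off directly, which is the sense in which such a model "explains" how raters judge interactions.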
In addition, the ENL team is measuring how the visual features of a place impact behavior, assessments that have historically been difficult to make accurately. In an ongoing study focused on Chicago, human participants rate thousands of Google Street View images on features including environmental transparency and aesthetic preference. These ratings form a “training dataset” from which a computer vision AI model can learn to assign transparency or preference scores to over 200,000 images representing every 50-meter street segment of Chicago. By averaging the scores of images within the same neighborhood, ENL will create citywide neighborhood maps of human environmental perceptions which, in conjunction with data provided by the Chicago Police Department, will allow the evaluation of whether places with certain human-perceived visual features tend to experience more crime.
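The final aggregation step, averaging image-level model scores into a neighborhood-level map, can be sketched as follows. The neighborhoods and scores here are invented for illustration:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical model outputs: (neighborhood, transparency score) pairs,
# one per street-view image; values are illustrative, not actual ENL data.
image_scores = [
    ("Hyde Park", 0.82), ("Hyde Park", 0.78),
    ("Loop", 0.55), ("Loop", 0.61), ("Loop", 0.58),
]

# Aggregate image-level predictions into a neighborhood-level map.
by_neighborhood = defaultdict(list)
for neighborhood, score in image_scores:
    by_neighborhood[neighborhood].append(score)

neighborhood_map = {n: mean(s) for n, s in by_neighborhood.items()}
print(neighborhood_map)  # Hyde Park ≈ 0.80, Loop ≈ 0.58
```

A map like this can then be joined against other spatial datasets (e.g., crime records) keyed on the same neighborhood identifiers.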
Emerging work from the ENL includes a novel approach called explainable AI through modularity (XAIM), co-led by Berman and Hank Hoffmann in Computer Science. In XAIM, explainability is built into models from the beginning, rather than added post hoc, using a modular combination of approaches from computer and social science. While modern machine learning and AI models can make remarkable predictions, generate human-like content, and identify subtle patterns in data, the way these models arrive at their decisions is currently opaque. For subjective judgments this is problematic, because models can leak unknown systematic biases into their predictions. In XAIM, domain knowledge is used to select a set of objective feature components based on a predictive hypothesis; a larger model is then composed by combining those features in a fundamentally explainable way to produce a subjective output. The approach also allows those features to be fed back into the model itself, making the AI more self-aware and able to “reason” about its own decision-making. This work is applicable across many domains of science that use AI, and could help researchers build theories by understanding how XAIM models arrive at their solutions.
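One way to picture the modular composition described above, as a hedged sketch rather than the actual XAIM implementation (the feature modules, weights, and names here are hypothetical): each transparent module computes one objective feature, and a simple weighted combiner produces the subjective score, so every prediction decomposes into inspectable per-feature contributions.

```python
# Illustrative sketch of the modular idea: transparent feature modules
# plus an explainable combiner, so each prediction can be decomposed.

def edge_density(image):    # stand-in objective feature module
    return image["edges"]

def green_fraction(image):  # stand-in objective feature module
    return image["green"]

MODULES = {"edge_density": edge_density, "green_fraction": green_fraction}
WEIGHTS = {"edge_density": -0.4, "green_fraction": 0.9}  # assumed weights

def predict_with_explanation(image):
    """Return a subjective score plus each feature's signed contribution."""
    contributions = {name: WEIGHTS[name] * fn(image)
                     for name, fn in MODULES.items()}
    return sum(contributions.values()), contributions

score, why = predict_with_explanation({"edges": 0.5, "green": 0.7})
print(round(score, 2), why)  # 0.43, with per-feature contributions
```

Because the combiner is itself interpretable, the `why` dictionary is a faithful explanation of the prediction rather than a post-hoc approximation, which is the contrast the XAIM approach draws with black-box explainability methods.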
Project Team
Marc Berman, Professor & Chair, Psychology, and co-Director of C3S2, directs the Environmental Neuroscience Lab, where he examines how external environmental factors such as greenspace, urbanness, and disorder impact human behavior and brain functioning. He is an expert in fMRI, multivariate statistics, Bayesian statistics, and computational modeling. He and his lab develop quantitative models that explain the relationship between human behavior, brain functioning, and the environment. He currently leads a large NSF grant with Hoffmann (and Catlett) using XAIM to quantify the nature of social interactions from sensor data.
Collaborators:
UChicago: Wilma Bainbridge, Luís Bettencourt, Monica Rosenberg, YC Leong, Susan Goldin-Meadow, James Evans, Hank Hoffmann, Sanjay Krishnan, Nick Feamster, Howard Nusbaum, Ed Vogel, Sarah London, Sarah Keedy, Akram Bakkour
Sian Beilock (Dartmouth), Ethan Kross (Michigan), James Gross (Stanford), Kim Doell (University of Vienna), Greg Bratman (Washington), Tor Wager (Dartmouth)

