I have been asked on more than one occasion how I took a path from film theory to Human-Robot Interaction (HRI). To me, the connection seems very clear—both paths of inquiry are scaffolded by an interest in human communication through and with technologies. Münsterberg argued that the arrangement of events in film correlates with the way we think, and I believe there is truth in that theory. That idea, in part, ignited my interest in human communication via technologies years ago, when I began looking at film critically as a cultural medium of expression and interpretation.
Later, my graduate work in Human-Computer Interaction (HCI) and my professional experience in Web development dovetailed—on the job, I conducted usability testing of sites and Web-based software—and I repeatedly observed people describing personal relationships with, and reactions to, information presented on the computer screen. In my earliest years in Web development, part of my job was working on Y2K preparation and response for an organization in the banking sector. That era demonstrated to me how user-centered design and development challenges connect to the emotional dimensions of how people interact via the Web. I began to think about factors like the importance of building and maintaining (or repairing) user trust in online experiences. Over the years, my academic interests moved from human interaction with Web-based experiences to human interaction with robots and other embodied AI. To me, embodied AI offers some of the most interesting proving ground for media, communication, and cultural theories.
My primary research goal is to investigate human-technology emotional attachment and trust, focusing on how these factors influence user decision-making in human-technology teamwork and collaborative situations. More broadly, I am interested in people and their experiences interacting with robots and other embodied AI; in the cultural influences and narratives that shape user expectations of technology interactions; and in users' reciprocal influence on technology design and the cultural narrative around that technology. The methods and strategies I use to explore these topics are therefore human-centered, rooted in social science ways of understanding individuals and their relationships to others and to the world around them.
AI development is moving quickly, yet human-centered research in the field is still emerging. This gap may have an enormous impact on product effectiveness in contexts such as medical, defense, space exploration, or humanitarian relief use and training scenarios. Developing AI without attending to human factors can also carry a heavy ethical cost, given the roles AI plays and how it will continue to influence the human experience. My research investigates the human side of human-AI teamwork and suggests practical applications based on these findings to support successful task, mission, and project outcomes.
Contact me to inquire about my teaching, research, or consulting approach; to discuss a specific problem you are trying to solve; or to invite me to submit a proposal.
Münsterberg, H. (1916). The photoplay: A psychological study (A. Longhurst & A. Feilbach, Eds., 2005). Project Gutenberg EBook #15383. New York: D. Appleton & Company.