“Is data objective or subjective?”
In my last year at Rice University, I enrolled in a data science ethics course. It was my first engineering course focused on the societal impacts of emerging technologies. Until this point, my work in social justice was largely separate from my technical work as a student of materials science and engineering.
So, when my professor posted that question about data, I thought, this is what I signed up for! During that discussion, I described how the way questions are asked, as well as who is doing the asking, can make collected data meaningless. I showed how data could then be processed and manipulated in algorithmic models in ways that perpetuate inequities. I pointed out that even what we choose to study in the first place is influenced by economic, social and political factors.
The overwhelming majority of the class, however, said that because data points themselves were numbers, observed facts, they had to be objective. I seemed to be the only one concerned with subjectivities around data collection and interpretation, and I worried about my peers’ lack of awareness of how bias could creep into their designs and exact real harm on people. The process through which facts are developed matters.
I left the class disillusioned and even more uncertain about my identity as an engineer.
Feeling like they don’t fit into the culture is one of the primary reasons why women, people of color, LGBTQ+ individuals and other systemically marginalized people drop out of STEM fields. We feel out of place because of the disconnect between our desire to use technology to address inequities and engineering curricula that give little attention to social consciousness or awareness of social issues. If we do not change the way we teach engineering, we will continue to lose talented people from a diversity of backgrounds whose ideas could solve a host of the world’s problems.
Our work as engineers directly affects people through shaping the distribution of wealth, power and privilege in society, and it’s time we acknowledge that.
Yet one of the first things I was taught as an engineering trainee was how to be an “objective” practitioner, suppressing personal experience, bias and emotion about research questions and processes. Such objective reasoning has served the mostly white, male field well by applying dispassionate focus to research problems and lending credibility to results from one practitioner to another. But it has also, by design, perpetuated injustices. Outside our narrow fields exist people embedded in complex social systems, and those systems will inevitably affect the metrics we study and bias the solutions we seek to create if they are not taken into consideration from the beginning.
When we teach engineering problems now, we ask students to come to a single “best” solution defined by technical ideals like low cost, speed to build and ability to scale. This way of teaching primes students to believe that their decision-making is purely objective because it is grounded in math and science. It reflects technical-social dualism: the idea that the technical and social dimensions of engineering problems are readily separable and remain distinct throughout the problem-definition and solution process.
Nontechnical parameters such as access to a technology, cultural relevancy or potential harms are deemed political and invalid in this way of learning. But those technical ideals are at their core social and political choices determined by a dominant culture focused on economic growth for the most privileged segments of society. By choosing to downplay public welfare as a critical parameter for engineering design, we risk creating a culture of disengagement from societal concerns among engineers that is antithetical to the ethical code of engineering.
In my field of medical devices, ignoring social dimensions has real consequences. Pulse oximetry devices, vital during COVID-19 for monitoring lung function, have been shown to be significantly less accurate in measuring the oxygen levels of Black patients than those of white patients. The technology behind the monitors had been studied and developed with majority-white patient groups and failed to correct for melanin’s absorption of visible light. Similarly, most FDA-approved drugs are incorrectly dosed for people assigned female at birth, leading to unexpected adverse reactions, because such people have been inadequately represented in clinical trials.
Beyond physical failings, subjective beliefs treated as facts by those in decision-making roles can encode social inequities. For example, spirometers, routinely used devices that measure lung capacity, still have correction factors that automatically assume smaller lung capacity in Black and Asian individuals. These racially based adjustments derive from research done by eugenicists who believed racial differences in lung function were biologically determined and who considered nonwhite people inferior. The machines ignore the influence of social and environmental factors on lung capacity.
Many technologies for systemically marginalized people have not been built because they were not deemed important, such as better early diagnostics and treatments for diseases like endometriosis, which afflicts 10 percent of people with uteruses. And we hardly question whether devices are built sustainably, a neglect that has contributed to a crisis of medical waste and to health care accounting for 10 percent of U.S. greenhouse gas emissions.
Social justice must be made core to the way engineers are trained. Some universities are working on this. Last year the University of Michigan committed to teaching equity-centered engineering to every undergraduate student. As faculty reimagine their curricula, they should ground them in three key learning objectives: (1) What is it that engineers do? (2) How should we behave as professional engineers on diverse teams? (3) What is the historical context of engineering; that is, how did we get to where we are right now? Engineers taught this way will be prepared to think critically about which problems we choose to solve, how we solve them responsibly and how we build teams that challenge our ways of thinking.
Individual engineering professors are also working to embed societal needs in their pedagogy. Darshan Karwat at Arizona State University developed activist engineering to challenge engineers to acknowledge their full moral and social responsibility through practical self-reflection. Khalid Kadir at the University of California, Berkeley, created the popular course Engineering, Environment, and Society, which teaches engineers to engage in place-based knowledge—an understanding of the people, context and history of a place—so they can design better technical approaches in collaboration with communities.
When we design and build with equity and justice in mind, we craft better solutions that respond to the complexities of entrenched systemic problems. I imagine a better future where we claim our subjectivity by sharing and celebrating our lived experience and how that motivates our work—and hold ourselves accountable to making engineering work for all.