About

After the Cambridge Analytica scandal, the tech industry went through an ethical reckoning. This thesis for the design engineering program at Harvard synthesizes and distills the cultural discourse around tech ethics into a speculative platform that pragmatically addresses the problem. Although speculative, the platform is not a video but an interactive digital prototype that simulates an object and context in a fictional near-future. The undertaking of this project was also a methodological synthesis of wicked problems, design thinking, and data engineering. Methods developed through this thesis were later applied to the USCIS and generalized into a studio curriculum at the University of Michigan.

Problem

It is how we design product, not what product we design, that leads to ethical problems.

Solution

Origo connects the product team with the information they need to anticipate and mitigate ethical problems at the point of design.

1
From 2016 to 2018, I worked in Silicon Valley at an enterprise data management company that competed with AWS.

The Cambridge Analytica data scandal broke in April 2018. My Statement of Purpose was written in November and December 2017. At the time of writing, I saw the theoretical problem of tech ethics as an interesting intellectual exercise.

The theoretical framing was possible due to my undergraduate training in architecture and design at UC Berkeley. The program has a history of interdisciplinary engagement with social justice and wicked problems.

My continued time in practice led me to focus the thesis on data ethics.
Statement of Purpose

The core problem I wrote about in the SOP, and the subject of this thesis, is the "inscription errors of unconscious human bias."
Photo: Xinhua / Barcroft Images
2
The problem of "tech ethics" is very broad. Since I wanted to design a platform intervention, I approached the problem atomically by reducing system components to their smallest parts. Instead of looking at system outcomes and outputs through an enumeration of risks and benefits, I looked at system inputs to guide the research and development of my problem statement.

Starting at the macro-level, I needed to understand the range of tech's ethical problems from multiple perspectives. I decided to collect case studies from media articles due to their public-facing nature and contribution to mass understanding of the discourse. I then structured the articles to identify the stakeholders that needed to be accounted for in the platform design.
Historical Analysis of Silicon Valley Disruption
Living and working in Silicon Valley from 2009 to 2018, I had a front-row seat to the tech industry's innovation and disruption.

By the time I left, Silicon Valley politics had become national and international politics.

The timeline shows a sample of major events during this time, including my own company's acquisition by Arm Holdings as part of their device-to-data IoT strategy.
Learning 1
There are four major stakeholders involved in tech ethics: 1) venture capitalists, 2) employees / workers, 3) consumers, and 4) regulators.

"Cities" is not a stakeholder, but technology impacts urban space.
3
The primary characters exerting influence on the problem of tech ethics are Big Tech and the media.

Given the history of yellow journalism, the media is not an impartial reporter. It often lacks the domain knowledge to understand the root causes of problems and is incentivized to sensationalize. What the public receives from the media are observations of the effects of mechanical failures, not the mechanical failures themselves. As a designer, it is critical to distinguish between the two in order to mitigate internal bias. Scapegoating is unethical.

At the same time, the media has immense power to create change by shaping a shared cultural consciousness. Just as it is important to assess the failures of tech, it is equally important to understand the failures of the changes introduced in response to media reporting. A systems map synthesized the literature review.
Systems Map of Media Effects
Media has the capacity to provide information at the collective level, but it can also cause hysteria. Both have increased over time as tech draws national attention.

The information reported usually concerns outcomes and effects, not causes, which means public literacy of technology does not increase. Attitudes toward tech also become increasingly negative, which creates a risk of scapegoating.

As governments respond to problems, the regulations are not designed with technology in mind. A case in point is the consent banner now on every website, which does not benefit the public and exists only so companies can protect themselves.

Since there is an obvious problem, people want to step in and help. Others, including academics, want to step in and capitalize on a hot moment (Floridi). This creates an influx of information and can lead to ethics-washing (Floridi). Technologists may be caught in a vicious cycle of scapegoating, which is, ironically, unethical.

Misfit regulations can also lead to infrastructural bloat, which can impact future use cases in unknown ways, particularly in domains such as healthcare and climate change.
Learning 2
Media articles create vicious cycles of scapegoating, misinformation / misinterpretation, and infrastructure bloat.
4
Once I identified the range of stakeholders, I needed to zoom in and understand one in depth. I chose to research the worker / employee experience since they are the ones directly involved in building the products. A solution targeting this context would impact everything downstream.

I needed to understand their mental model of the situation, so I conducted semi-structured interviews and reviewed literature to understand how employees framed the causes of ethical problems.

But what is an ethical problem?

Based on the historical analysis, there is a broad range to what is understood to be an "ethical problem." I needed to understand how "ethics" was identified in this specific context. As an abstract concept, I knew a singular definition would be diffuse and not useful; I was not trying to be a philosopher. Pragmatically, the definition and types of ethics do not matter. For the purposes of designing an intervention, how a human identifies and finds an ethical problem is the design requirement that needs to be accounted for.
Learning 4
Standards and policies codify ethics and are where people usually start their validation process.
Learning 3
Identifying what is ethical requires an internal moral compass, which depends on personal experience.
Employee Mental Modeling
5
In addition to understanding the mental model, I needed to understand the actions employees took and what happened as a result. This would help identify gaps.
Learning 7
Technologists believe they are responsible, but they lack adequate tooling for assessing impact.
Learning 6
Managers are point persons for ethical concerns.
Existing Employee Reporting Experience
6
Finally, using employee reports of problems as a framing lens, I telescoped out to create a systems map. I traced and mapped the socio-technical environment of production based on my interview notes with stakeholders.

System consequences, inefficiencies, and failures were identified, structured, and contextualized.
Organization System Map
I selected a Series B start-up in the middle of scaling as a model for my systems map. Agile management, design thinking, and quality management frameworks were synthesized into a diagram.

Following Conway's Law, I assessed for gaps in information and communication, then identified where and how bias and insensitivities might enter the system to cause ethical problems. I also identified bugs, deviations, and inconsistencies as a source of ethical risk due to their impact on the user experience.
Learning 9
Information and communication silos create time pressures that make it difficult to assess for ethical problems.
Learning 8
There is a lack of procedures for dealing with changing regulations in frontier-domain products (such as biotechnology). This creates complexity for quality management teams: quality is not fully defined and/or has a shifting definition.
7
As I went through the research, I continued to refine my problem statement.

The final form is as shown.

Problem Statement

It is how* we design product, not what product we design, that leads to ethical problems.

* how = people + process

8
In keeping with a systems approach, the "process," the "people," and the relationship between the two have been researched. However, "ethical problem" is still extremely broad and dependent on individual evaluation.
Learning 11
Ethics exceeds compliance and includes culture and community.
Learning 10
Assessing the potential ethical consequences of a product feature is a team sport.
Ethics Synthesis Diagram
Ethics is a concept understood and manifested in different ways; some manifestations are implicit, others explicit. Understanding of what is ethical is also not evenly distributed, as individuals have their own perspectives and understandings of the world. The culture of an organization also varies, as leaders influence individuals by modeling appropriate behavior.
9
With the components of the problem identified and structured at all levels of detail, I synthesized everything into a hypothetical task analysis with pain points that would serve as the backbone of the platform.

I then roughly designed the objects in accordance with the pain points.
Task Analysis & Object Design
10
While this project uses familiar product design methods, it is conceived as a philosophical application of world-building.

While the problems are real, I avoided designing for existing buyers or users. Purely contextually driven, I wanted to design the platform spatially rather than functionally to demonstrate how something can be born for more than one purpose. Reflecting democratic aspirations, the traditional protagonist / user becomes a generating void.

Closer to research-based art, the project is a parafiction. Blending fiction and fact, it presents itself as an engineered fact. In its deception, it aims to ask the audience why such a product cannot exist in order to stage a confrontation between the fact of how things can be better and the fact of how things continue to get worse. It presents a socio-technical solution to a cultural debate.
Mixed Methods for Designing Parafictions
As a parafiction, the design can engage with problems at a technical fidelity that matches the discourse it hopes to influence, making it credible to a broader audience. While still narrative-driven, the way the story is told exceeds cinematic speculative design, which provides a compelling narrative surface but is unhinged from the technical present.

As an interactive object from an alternate future, the platform could transition into the real provided a buyer or user comes into being. In this way it also engages in design futures and strategic forecasting. Not cordoned off as an alternate reality, its persistence in meatspace retains a meaningful change-making ability.
11
I developed the brand of a parafictional platform that synthesized the research and operationalized the hypothetical task analysis into a workflow.
A Platform for Ontological Design
The platform is designed to be applicable to any domain problem.

In the following prototype, it addresses the domain of digital education (MOOCs), but it could also be applied to affordable housing, criminal justice, and other domains.

Vision Statement

Origo believes the future of tech ethics lies in creating a continuous feedback loop between software development and regulatory legislation while enabling a plurality of stakeholders to participate in the construction of data models without needing to be "technical." By democratizing the construction of data models and grounding ethics in everyday context and operations, teams can spot, assess, and mitigate ethical risks in a sandboxed environment while collaboratively sharing their learnings with lawyers and regulators. Technology can be designed to be safe, and regulations can update closer and more elegantly to the speed of innovation.

Mission Statement

Origo connects the product team with the information they need to anticipate and mitigate ethical problems at the point of design.

Reasons to Believe

1

Lawyer-in-the-Loop
Base requirements are collected directly from lawyers so the product team can bootstrap their data model and design ethically without costly refactors.


2

Legible Ethical Value

Protocols that become design requirements can be created and reused by lawyers, traced, and audited at scale so that the product team's work is visible and of consumer value.


3

Input Diversity

Context on design problems is collected through references representing a spectrum of temporal change and a diversity of stakeholders.


4

Product Knowledge Management

References and requirements are collected and managed in direct relation to the product data model so that context is never divorced from object.

Protocol Conceptual Model
The concept of an ethical protocol generalized the different ways that ethics manifested in textual documents into a common object that could bridge between stakeholders in different institutions.

Ethical protocols would also normalize the set of words and their referents into a coherent namespace by pairing lexons (internal vocabulary) with keywords (external vocabulary). As a data object that packages relevant information, a protocol can be traced, and a digital space is created for institutional stakeholders to conduct data governance.
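To make the pairing concrete, here is a minimal sketch of an ethical protocol as a typed data object. The field names (lexicon, requirements, auditTrail) and the example content are illustrative assumptions, not the prototype's actual schema.

```typescript
// A minimal sketch of an ethical protocol as a data object; field names
// and example values are hypothetical, not the platform's actual schema.

// A lexon pairs a team's internal vocabulary with an external keyword
// so both sides of the platform resolve to the same referent.
interface Lexon {
  internalTerm: string;    // vocabulary used inside the product team
  externalKeyword: string; // vocabulary used by lawyers / regulators
  definition: string;
}

// The protocol packages requirements with provenance so it can be
// traced and audited across institutions.
interface EthicalProtocol {
  id: string;
  jurisdiction: string;      // e.g. "EU", "California"
  lexicon: Lexon[];          // normalized namespace for this protocol
  requirements: string[];    // design requirements derived from legal text
  sourceDocuments: string[]; // citations back to statutes or standards
  createdBy: string;         // lawyer who authored the protocol
  auditTrail: { actor: string; action: string; timestamp: string }[];
}

// Example: a hypothetical protocol derived from a consent regulation.
const consentProtocol: EthicalProtocol = {
  id: "proto-001",
  jurisdiction: "EU",
  lexicon: [
    {
      internalTerm: "user_opt_in",
      externalKeyword: "informed consent",
      definition: "Affirmative, revocable agreement to data collection.",
    },
  ],
  requirements: ["Consent must be revocable from any screen."],
  sourceDocuments: ["GDPR Art. 7"],
  createdBy: "lawyer@example.com",
  auditTrail: [],
};
```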
12
Data objects were designed for interaction based on stakeholder analysis.
Protocol Coordination
As a common object, protocols act as a bridge between stakeholders in different institutions.
13
All the stakeholders interacting with the problem domain were structured, and new emergent roles were identified.

Stakeholder Map & Role-Based Design Futures
A new role was created for the primary user / change-maker: a lawyer from the future. The platform sought to create communication channels for this lawyer to interact with the product design team.
14
Throughout the project, I prototyped both the backend and the frontend of the platform to create its underlying data model / API and all workflows / interactions.
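As a gesture toward what such a backend might look like, here is a minimal sketch of one route serving protocols between the two sides of the platform. The Express framework, the /protocols endpoint, and the in-memory store are assumptions made for illustration, not a record of the prototype's actual stack.

```typescript
// A minimal sketch of a two-sided platform backend, assuming Express and
// a hypothetical /protocols route; not the prototype's actual API.
import express from "express";

interface Protocol {
  id: string;
  jurisdiction: string;
  requirements: string[];
}

const protocols: Protocol[] = []; // stand-in for a real database

const app = express();
app.use(express.json());

// Lawyers publish protocols...
app.post("/protocols", (req, res) => {
  const protocol: Protocol = { ...req.body, id: `proto-${protocols.length + 1}` };
  protocols.push(protocol);
  res.status(201).json(protocol);
});

// ...and product teams pull the ones relevant to their jurisdiction.
app.get("/protocols", (req, res) => {
  const { jurisdiction } = req.query;
  res.json(
    jurisdiction
      ? protocols.filter((p) => p.jurisdiction === jurisdiction)
      : protocols
  );
});

app.listen(3000);
```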
Provisioning Flow for a Two-Sided Platform
Landing Page w/ Onboarding Resources
Onboarding is critical to the success of employees / users and organizational leadership; it provides steps for transitioning into the product.

The landing page of Origo bootstraps the requirements-gathering process by asking users to define their jurisdiction, anchoring the product in a context and culture.

It then creates a boundary object by visualizing the data model, and directs users to a glossary to give them a common vocabulary for discussion.
Documentation and Training
The glossary lays out my synthesis of over five years of practice-based research on designing data models, all brought into the creation of this parafictional platform for ontological design.

For clarity and ease of learning, the glossary eschews coining new marketing terms for what has already been described in other fields, producing instead a unified vocabulary.

This aids in the creation of a shared multi-disciplinary framework, reduces silos, improves collaboration, and helps in the search for more information.
Emergent Domain Ontology
The lawyer develops a domain ontology of concepts. The ontology unifies namespaces across multiple concepts. In a unified view, it also counters the problem of jurisdictional fragmentation that comes with intersecting local, state, and federal laws by serving as a boundary object at the macro-level.

The ontology can also provide a view into optimizing laws created for a common problem by detecting patterns and reverse salients across and within jurisdictions. Over time, data governance can be complex without being complicated.

As feedback loops with product teams are maintained, the ontology can also serve as a data-collection mechanism for experiments that might be generalized into new laws.
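To illustrate how such an ontology might unify namespaces across jurisdictions, the sketch below maps one concept to jurisdiction-specific terms and requirements. The concept, jurisdictions, statutes, and the cross-jurisdiction comparison are hypothetical examples, not content from the actual prototype.

```typescript
// A minimal sketch of a domain ontology entry that unifies one concept
// across jurisdictions; terms and statutes are illustrative assumptions.
interface JurisdictionBinding {
  jurisdiction: string;   // e.g. "EU", "California"
  localTerm: string;      // how the concept is named in that body of law
  statutes: string[];     // source laws or standards
  requirements: string[]; // obligations the concept carries there
}

interface OntologyConcept {
  concept: string;                 // the unified, macro-level name
  bindings: JurisdictionBinding[]; // jurisdiction-specific namespaces
}

// Example: "personal data" fragments across jurisdictions; the ontology
// lets a lawyer compare bindings and spot patterns or reverse salients.
const personalData: OntologyConcept = {
  concept: "personal data",
  bindings: [
    {
      jurisdiction: "EU",
      localTerm: "personal data",
      statutes: ["GDPR Art. 4(1)"],
      requirements: ["lawful basis for processing"],
    },
    {
      jurisdiction: "California",
      localTerm: "personal information",
      statutes: ["CCPA §1798.140"],
      requirements: ["right to deletion"],
    },
  ],
};

// A simple cross-jurisdiction view: which requirements appear everywhere?
const sharedRequirements = personalData.bindings
  .map((b) => new Set(b.requirements))
  .reduce((acc, set) => new Set(Array.from(acc).filter((r) => set.has(r))));
console.log(sharedRequirements);
```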
Collaborative Data Design
Where the ontology exists for lawyers at the macro-scale, the data model exists for product teams at the mezzo-scale and provides a space for designing data representations where questions of identity, meaning, and semantics need to be discussed.

Given that much of the cost of UX design is due to architectural changes needed because of impedance mismatches, the data model also aids engineers in planning and making trade-offs during product development.

By making a technical object legible, non-technical stakeholders can participate in problem-spotting and designing data.

By abstracting data away from interface, designers can make more conscious design decisions.
Namespace Systems Map
The systems map traces information from outside to inside the product at the global, local, and structural levels to show how feedback loops can be constructed to counter those created by the media, while working cooperatively with governance.
Phase 1: Gather Context
Phase 2: Share Protocols
Phase 3: Develop Product Ethically
Data Model Specifications
The devil is in the details, and the same is true for data ethics. The visualization surfaces properties of the data that are relevant to decision-making.

By surfacing this within the product, it also passively educates less data-savvy stakeholders on the nuances of data.
Surface Modeling Data
Many activities occur around the data model, so different views of its surface are created for different interactive purposes.
Data Properties
Visualizing the data model also creates a simple chart of all its keywords.

Properties of the keywords relevant to ethical decision-making are also visualized / surfaced.
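A minimal sketch of what such a keyword chart might surface, assuming hypothetical property names (sensitivity, identifiability, retention, linked protocols); the properties the actual prototype visualizes are not enumerated here.

```typescript
// A minimal sketch of keyword properties a data-model chart might surface
// for ethical decision-making; property names are illustrative assumptions.
type Sensitivity = "public" | "internal" | "confidential" | "restricted";

interface KeywordProperties {
  keyword: string;              // field name in the product data model
  dataType: string;
  sensitivity: Sensitivity;     // how harmful disclosure would be
  identifiesPerson: boolean;    // direct or indirect identifier
  retentionDays: number | null; // null = no defined retention policy
  protocolIds: string[];        // protocols that govern this keyword
}

// Flag keywords that carry risk but are not yet covered by any protocol,
// so the team can see where ethical review is missing.
function ungovernedRisks(keywords: KeywordProperties[]): KeywordProperties[] {
  return keywords.filter(
    (k) =>
      (k.identifiesPerson || k.sensitivity === "restricted") &&
      k.protocolIds.length === 0
  );
}

// Example usage with a hypothetical MOOC data model.
const sample: KeywordProperties[] = [
  {
    keyword: "learner_email",
    dataType: "string",
    sensitivity: "confidential",
    identifiesPerson: true,
    retentionDays: null,
    protocolIds: [],
  },
];
console.log(ungovernedRisks(sample)); // -> learner_email needs review
```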
15
For platform enablement, I developed an ethical maturity model to help users make sense of which ethical problems are easier or harder to approach. The maturity model also aligns with the team's product roadmap.
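As an illustration of how a maturity model can sit alongside a roadmap, the sketch below orders levels from easiest to hardest problems a team can take on; the level names and criteria are assumptions, not the model developed in the thesis.

```typescript
// A minimal sketch of an ethical maturity model as ordered levels; the
// level names and criteria are hypothetical, not the thesis's actual model.
interface MaturityLevel {
  level: number;      // higher = harder ethical problems are within reach
  name: string;
  criteria: string[]; // what a team must have in place at this level
}

const maturityModel: MaturityLevel[] = [
  { level: 1, name: "Compliant", criteria: ["Known regulations mapped to protocols"] },
  { level: 2, name: "Traceable", criteria: ["Data model keywords linked to protocols"] },
  { level: 3, name: "Anticipatory", criteria: ["Risks assessed at the point of design"] },
  { level: 4, name: "Participatory", criteria: ["Lawyers and regulators in the feedback loop"] },
];

// Align the maturity model with a product roadmap: list the ethical
// problems a team is ready to take on at its current level.
function problemsInReach(currentLevel: number): MaturityLevel[] {
  return maturityModel.filter((m) => m.level <= currentLevel);
}

console.log(problemsInReach(2).map((m) => m.name)); // ["Compliant", "Traceable"]
```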
16
The thesis was presented in May 2020, over Zoom, during the COVID-19 lockdown.

Attendees:
Dr. Michael Smith (Advisor)
Andrew Witt (Advisor)
Dr. Tom Dayton (Advisor)

Dr. Martin Bechthold (MDE Program Director)
Dr. Mary Tolikas (MDE Program Director)

Bryan Boyer (Future UT Faculty Director)
17
After graduating, I was hired at the University of Michigan to teach their design research studio. The ontological data design methods developed as a part of this thesis formed the basis of that curriculum.
View Design Research Studio

I also presented my thesis work and its curricular evolution at various venues both inside and outside the University of Michigan.
BAC Lecture,
Data: Between Fact and Fiction
October 2024
A Final Note: Beyond Governance
Ethical product development is an emergent and ongoing process. What is good or bad, better or worse, right or wrong changes over time as culture and norms change. Because the process is inherently non-deterministic and stochastic, it is necessary to think of ethics beyond governance, as a set of values that should be brought forth in the world. The latter needs to come first; compliance, though necessary as a mechanism of enforcement, encourages laziness. Human-centered designers know the pitfalls of a checklist.

As an executive I interviewed said succinctly: ethics is a choice. It is not abstract but manifested in the everyday actions of people saying words and making things with decisions. Ethics is embodied. Those who objectify ethics in judgment ossify its evolution and risk becoming the very problem they hope to prevent. While this is a project about words, it starts with process, with the how, because this is what gives ethics movement. The words will always be incomplete; they will always betray you. They betray you faster when the speaker is unreliable and when the thing is invisible. Designers of the built environment know that a map is not the world.

Community, on the other hand, is deeply human. Creating it requires critical thinking and dialogue, the same basic skills needed for a functioning democracy. The words need to connect and land. If the search for an ethics is a pursuit of justice, this project stakes a claim that justice is a living thing: it emerges in and through the words we share with each other. May it always bend towards good.