
Section 1: Introduction
There is a persistent myth that technology is neutral in our daily lives. The idea that systems, machines, and algorithms operate on pure logic, free from human interference, in a sterile and objective manner, is just that: an idea. Every tool we use and every object we see, from something as mundane as a chair to an advanced computer program, has been shaped by the decisions of its creators: architects, designers, and engineers. These choices, though often unintentional or invisible, are woven deeply into the fabric of our world and the very structure of the environment around us. They define who is included and who is excluded.

Something as simple as a chair may seem like an innocent object, but if it was designed with an average male body in mind, it marginalizes every other body it doesn't fit. A camera that does not recognize darker skin tones isn't broken or malfunctioning; it is working just as it was built to, trained on data that never included everyone who might end up using it. Likewise, a predictive algorithm that over-polices certain neighbourhoods is not broken; it is reflecting the inequalities embedded in the information it was built on.

The further we advance and venture into the metaverse and the world beyond our bodies, the harder it becomes to notice the human decisions behind its creation. Data are mistaken for truth, and interfaces are treated as nature rather than structure. Behind every algorithm is a designer, a dataset, and an agenda; every system is a mirror of the world as someone sees it. When these systems are scaled up, deployed, and automated, their biases are amplified across the whole world, without question. Mortal.fr is a project developed to illustrate that bias is not a mere glitch in an otherwise perfect system; bias is the system.
This capstone will unpack how even the most 'rational' or 'objective' systems are deeply rooted in cultural assumptions, social norms, and racism. The interactive portion of this capstone is designed to look and feel like a unique spirograph generator, yet it also reveals how constrained our expression becomes when filtered through the predefined rules and limited affordances of algorithms. This paper asserts that all systems, from furniture to algorithms, are inherently biased because they are created by humans and thus embedded with our own ideas about the world. By analyzing these seemingly arbitrary structures, we can better understand the subtle ways design encodes power and perception.
Section 2: Furniture to Frameworks: Human Intent in Physical Design
Bias does not begin with code; it begins with design. Everyday objects we take for granted, such as door handles, plastic chairs, and traffic lights, carry assumptions about who is meant to use them and in what way. One of the more obvious examples is the ubiquitous monobloc plastic chair.

Photo by Andreas Sütterlin, ndion: https://ndion.de/en/monobloc-what-matters-is-that-you-sit/
It is far and away one of the most mass-produced pieces of furniture on the planet, found in cafes, weddings, and backyards across continents, countries, and cultures. It transcends borders in every possible way and has found a place in nearly everyone's life. Because of that sheer ubiquity, we grant it an illusion of neutrality and normalcy; we see it everywhere we go. Yet its standardized dimensions assume a 'normal' user: an able body of average size and weight, sitting upright without the need for assistance. For anyone outside that narrow norm, such as an elderly, disabled, short, or large-bodied individual, the chair isn't just a chair but a piece of hostile design. When we zoom out and look further, we see this everywhere. A famous example of intentional bias in design is Robert Moses' low overpasses on Long Island and in greater New York City, built too low for public buses and thereby blocking access for racialized communities. Design doesn't just restrict movement in a literal sense; it dictates who is and isn't accommodated. Designers make active decisions, whether consciously or unconsciously, and the existence of their bias must be observed and understood.
Section 3: Algorithmic Systems and Their Ghostwriters
The same way that ghostwriters shape a book behind the scenes while the author's name appears on the cover, algorithms have ghostwriters too: the training data that silently define what they can see, do, and recognize. On their own, algorithms are nothing but abstract logic. Only when we introduce vast amounts of data, collected in the real world and curated by humans, do they truly begin to function. In the case of facial recognition algorithms, major swaths of the population are overlooked. A well-documented account of this is a TED Talk by Joy Buolamwini, a researcher and scientist at the MIT Media Lab, who found that Black faces are often either not seen by computers at all, or seen far too much, as in policing and surveillance software.

This became a major issue while I was developing Mortal.fr. Originally, the capstone was intended to be a facial-recognition-based spirograph generator, designed to extract emotions, symmetry, and other physical and social characteristics from a snapshot of a person's face. This is where I ran into a roadblock. The facial recognition API I was using worked well on me, a white man with conventional facial features, but when I tested it on friends who didn't share those characteristics, the system would either crash or return incorrect and confusing outputs. Faces weren't detected, features were missed, and the emotional readings were often pure nonsense. That is when it dawned on me: I wasn't the only author of this project. While I designed the UI, coded the logic, and wrote the majority of the program, I was completely at the mercy of the invisible data that trained the tools I was using. There were data in there that I could not see, modify, change, or extract, leaving me at the mercy of an unknown and unverifiable ghostwriter whose biases contradicted my own intentions.
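The failure mode described above can be sketched in a few lines of JavaScript, the language of the interactive portion. The API shape here is entirely hypothetical, since the actual library is not named in this paper; the point is only that code written against a detector that was never trained on everyone will crash the moment it meets a face its ghostwriter never saw:

```javascript
// Hypothetical shape of a face-detection result; the real API used in
// Mortal.fr's prototype is not shown here, so these names are illustrative.

// Naive version: assumes the detector always finds exactly one face.
// When the training data never covered a user's face, `detections` comes
// back empty and this line throws, which is how the prototype crashed.
function describeFace(detections) {
  return detections[0].emotion;
}

// Guarded version: treats "no face found" as a real, reportable outcome
// instead of an impossible one, and flags low-confidence results rather
// than silently trusting them.
function describeFaceGuarded(detections) {
  if (!Array.isArray(detections) || detections.length === 0) {
    return "no face detected";
  }
  const face = detections[0];
  if (typeof face.confidence === "number" && face.confidence < 0.5) {
    return "uncertain detection";
  }
  return face.emotion ?? "unknown emotion";
}
```

The guard does not fix the underlying bias; it only makes the ghostwriter's blind spots visible instead of letting them surface as a crash.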
This is where I decided to move my project from art into the realm of social commentary, aimed squarely at the very thing I was making.

Photo by Google AI, Google: https://store.google.com/intl/en/ideas/real-tone/
Three years ago, Google announced a new feature for its Pixel line of phones called 'Real Tone,' aimed at producing colour-accurate, photorealistic images of people of colour. The announcement was both a step forward and an exposure of a truth many were unwilling to notice: that something as mundane as a camera, a device supposedly designed to capture the objective moment, had never been objective at all. Cases like these expose a deeper truth. Algorithms aren't just mirrors of the world around us; they actively shape it. They are bound by the data they were trained on, and more often than not, the world captured in that data is one of exclusion.
Section 4: Interface as Ideology
When we interact with screens, on our phones, computers, and other devices, we rarely question them. We click buttons, drag sliders, and swipe left, right, up, and down, and we do this instinctively. But what feels 'natural' or 'intuitive' is rarely neutral. Every interface is an experience designed from the ground up with constraints, assumptions, and perceptions of how people, the end users, will consume what we make. These are ideological architectures, shaping and limiting our very interactions. Consider the standardization of design in modern social media applications. Horizontal flows, dropdown menus, and icons meant to represent universals like sharing, editing, liking, and commenting all sit within a left-to-right format. These conventions were created largely by and for Western, able-bodied, neurotypical people, and so only their worldview is represented in the design. Someone using Arabic on their phone can encounter massive issues within an application whose developer did not account for mirrored, right-to-left layouts; Apple recently published a WWDC22 resource on optimizing applications for exactly this scenario, one that is often overlooked despite being part of so many lives.

In her book Design Justice, Sasha Costanza-Chock argues that “designers tend to assume the user has broadband internet access, unless it is specified that they don’t; that the user is straight, unless it’s specified that the user is LGBTQ; that they are cisgender, unless it’s specified that they are nonbinary and/or trans*; that they speak English as a first language, unless it’s specified otherwise; that they are not Disabled, unless specified that they are; and so on” (Costanza-Chock 47–48). The end result is not just exclusion within these systems, but an outright erasure of difference and an enforcement of uniformity for the sake of aesthetics.
With accessibility often overlooked, scripts forgotten, and languages removed, it is not unreasonable to assert that UI and UX can be, and often are, as oppressive as a chair that does not fit or a camera that does not see.
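To make the right-to-left example concrete, here is a minimal layout sketch in JavaScript; no real framework's API is being quoted, and the function names and numbers are my own illustration. A single hard-coded assumption, that a label anchors to the left edge, is all it takes to break a mirrored layout:

```javascript
// Naive layout: the label is hard-anchored to the left edge, silently
// assuming every reader scans left to right. The container and label
// widths are accepted but never consulted.
function labelXNaive(containerWidth, labelWidth, margin) {
  return margin;
}

// Direction-aware layout: the same call mirrors itself for right-to-left
// scripts such as Arabic or Hebrew, anchoring to the opposite edge.
function labelX(containerWidth, labelWidth, margin, direction) {
  if (direction === "rtl") {
    return containerWidth - labelWidth - margin;
  }
  return margin; // "ltr" and any unspecified default
}
```

For an Arabic-speaking user the naive version places every label on the wrong side of the screen, not because the code malfunctions, but because the assumption baked into it was never theirs.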

Mortal.fr ties into this directly, commenting on the existence and limitations UI places on us. It looks like a playful, entertaining spirograph generator, and to a degree it behaves like one. The user is presented with an illusion of choice: a grid of characters, letters, and numbers, some bold and some not, plus the buttons 'Generate' and 'Clear.' These elements suggest freedom, agency, and control, yet there is none within the walled garden I have constructed. The matrix offers only characters from a Latin, Canadian-English-centric alphabet; the style is dictated by me; the numbers follow the same logic. No user can go beyond what I have made possible, and yet it looks like an interactive, personalized system. That illusion is furthered by the webcam input for colour selection, which is likewise deceptive in nature. The design of the applet helps conceal all of this. Originally I planned a clean, minimalist, modern UI, something reminiscent of Apple, but this evolved into a Matrix-like design. With a black background and neon-green text, the new UI gives the impression of hacking, of breaking the rules and going beyond, again a subversion. Underneath, there is no hacking, no rules being broken, no protocol strained. The interface reflects my decisions and my ideas about how my tools should be used; the point was to project an assumption of openness while enforcing strict, fixed limits. Just like algorithms, interfaces are mirrors, not of the participant but of the designer and their assumptions about the end user. Design is not just what something looks like, but what it permits.
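The core of such a generator can be sketched with the classic hypotrochoid equations. This is a minimal illustration, not the actual Mortal.fr source: the parameter names and the quantization step are my stand-ins for the real generator's hidden constraints, showing how every user 'choice' can be silently snapped onto a grid the designer chose:

```javascript
// Classic hypotrochoid: a pen fixed at distance d from the centre of a
// wheel of radius r rolling inside a ring of radius R.
function spirographPoint(R, r, d, t) {
  const k = (R - r) / r;
  return {
    x: (R - r) * Math.cos(t) + d * Math.cos(k * t),
    y: (R - r) * Math.sin(t) - d * Math.sin(k * t),
  };
}

// The walled garden in miniature: whatever the user asks for is snapped
// to the nearest value the designer decided to allow.
function quantize(value, allowed) {
  return allowed.reduce((best, v) =>
    Math.abs(v - value) < Math.abs(best - value) ? v : best
  );
}

// Sample the full curve as a list of points ready to be drawn.
function spirographCurve(R, r, d, steps = 720) {
  const points = [];
  for (let i = 0; i < steps; i++) {
    points.push(spirographPoint(R, r, d, (i / steps) * 2 * Math.PI));
  }
  return points;
}
```

Two users who believe they chose different radii can end up quantized onto the same allowed value and produce near-identical outputs, which is exactly the "why does my output look like someone else's?" effect the project stages.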
Section 5: The Aesthetics of Trust
Oftentimes we are told to trust the system, and when systems look sleek, modern, and minimalist, that trust is far easier to give. The rise of modern UI aesthetics, from Google's Material Design and Material You to Apple's rounded corners and industrial typefaces, has shaped how we relate to technology on an emotional level. We have been conditioned to associate 'clean' design with safe systems, things that won't harm us. We expect fairness from rounded buttons and smooth animations, but 'good' design is not a moral compass; it is a mask. What looks open and objective can still hide layers upon layers of deception, surveillance, and control. Take, for instance, the UI of ChatGPT. Developed and designed by OpenAI, ChatGPT has a sleek, modern, minimal interface built to encourage continuous conversation between the user and the AI client. The experience feels like texting a friend, modelled on a messaging-style UX, but underneath, ethical issues arise: environmental costs, disinformation, and copyright disputes all stem from this one system, yet using it feels intuitive and quick. This aesthetic is not accidental. In her book Atlas of AI, Kate Crawford dismantles the myth of clean tech: “Data centers are among the world’s largest consumers of electricity. Powering this multilevel machine requires grid electricity in the form of coal, gas, nuclear, or renewable energy” (Crawford 43).

Photo by Demilked: https://www.demilked.com/facebook-server-farm-arctic-lule-sweden/
We assume that because a program looks minimal and clean, it must be exactly that, but when you account for the massive toll it takes on the environment, the illusion begins to break. The friendliness of UI becomes a kind of digital greenwashing, hiding the reality of massive server farms and the exploitation of underpaid labour, including the countless human moderators required for the algorithmic responses we receive within milliseconds of asking a question. This is what made me rethink Mortal.fr. Originally I imagined it with a clean, legible layout, but that felt dishonest. The project wasn't meant to be frictionless; it was meant to be confrontational, at least to a degree. The cyberpunk, hacker-style aesthetic is there to flip the logic of trust in UI. Instead of soothing the user, Mortal.fr asks them to squint and to guess, to become slightly disoriented or even disillusioned with the entire process. Clean design tells you that a system works, that it is simple, safe, and obvious, but that can be, and often is, a lie. Mortal.fr is designed to remind users that even when the system works, it may not be working with you or for you, or for anyone at all.
Section 6 (Opinion): Breaking the Loop and Designing Against the System
So what can we do? If every system is built with bias, every dataset incomplete, every interface an ideological enclosure, and every algorithm a mirror of its creator, what is left to do? How do we build systems that reflect us all? Is that even possible? In my opinion we can't, but that doesn't mean we can't attempt to be more inclusive. As humans we will make mistakes and errors much of the time, and that's fine so long as those mistakes are acknowledged and used as building blocks for learning. One potential answer lies in the field of critical design, which challenges the preconceived notions we hold about the everyday objects around us and helps us ask the questions traditional design is there to suppress. A somewhat odd example of this is Balenciaga's Spring 2023 show at the New York Stock Exchange, where Demna Gvasalia took traditional silhouettes and layered them with BDSM and leather gear, a visual provocation aimed squarely at the masochistic, capitalist spectacle of Wall Street.

Photo by Balenciaga: https://youtu.be/KP1OvbFB3wo
The show featured enormous oversized tees, jackets, and pants, with skin-tight leatherwear poking through, all in an effort to challenge the norm and explore other ideas. The clothes were not meant to flatter or conform, but to do something deeper. Demna wasn't designing for utility; he was designing for critique, questioning the rules by exaggerating them. Mortal.fr operates similarly. The interactive portion of my project is not a precise tool, nor is it efficient. It doesn't generate error-free or editable spirographs. It offers no feedback, no personalization, and no way to share anything with anyone. It barely explains itself, and all of that is on purpose. The project is a refusal of seamless interaction, meant to make you question why you cannot do whatever you want to do. Why does the colour feel off? Why does my output look similar to someone else's? It offers just enough interactivity to sustain the illusion of choice, with just enough limitation to prompt the deeper question. I am asking you to see the borders of the system and not forget them. Maybe we can't build a system that reflects everyone, but we can build one that shows its own limits and refuses false promises, one that leaves space for users to ask, 'Who made this, and why does it work this way?'
Section 7: Conclusion
Technology is not neutral, and it never will be. From the chairs we sit on to the algorithms that dictate what we see as we swipe through TikTok for hours on end, everything has been designed with someone in mind, and oftentimes that someone is not you. Mortal.fr is a critical and analytical response to this: a confrontation, not a solution. It does not pretend to have the answers, but it doesn't pretend to be universal or frictionless either. It is a system like every other system, operating within rules and boundaries, and it refuses to pretend otherwise. In bringing this to light, Mortal.fr exposes bias not as a glitch in the system but as a blueprint of who we are, inviting you to pause and look beneath the surface, to keep an open mind and a critical eye, and to question everything. We may never build systems that are perfect, but we can attempt to build ones that are honest, ones that admit that even computers have their limits. Only when we begin to unravel what systems are and how they behave and work can we truly understand the world around us, how systems shape it, and how we shape them.
Acknowledgments
I would like to thank Ben Fiebert and Janine Latus for their thoughtful feedback and assistance with grammar and copy editing during the writing process.
I am also grateful to Mauro Carignano for contributing the music that accompanies the interactive portion of the project.
Theoretical References
Andrius. (2016, October 3). First ever glimpse into Facebook’s massive Arctic Server Farm. Demilked. https://www.demilked.com/facebook-server-farm-arctic-lule-sweden/
Apple Inc. (2022, June 7). What's new in SwiftUI - WWDC22 - Videos. Apple Developer. https://developer.apple.com/videos/play/wwdc2022/10052/
Balenciaga NYC Show Spring 23 Collection. (2022). YouTube. Retrieved March 22, 2025, from https://youtu.be/KP1OvbFB3wo.
Costanza-Chock, S. (2020). Design justice: Community-led practices to build the worlds we need. The MIT Press.
Crawford, K. (2022). Atlas of AI: Power, politics, and the planetary costs of Artificial Intelligence. Yale University Press.
Google. (n.d.). Skin tone representation with real tone photography. Google. https://store.google.com/intl/en/ideas/real-tone
How I’m Fighting Bias in Algorithms. (2016). TED. Retrieved March 22, 2025, from https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms.
Sanchez, A. (2024, October 18). Design: Monobloc. what matters is that you sit. ndion. https://ndion.de/en/monobloc-what-matters-is-that-you-sit/
Technical References
“CSS: Cascading Style Sheets.” MDN Web Docs, Mozilla, https://developer.mozilla.org/en-US/docs/Web/CSS. Accessed 22 Mar. 2025.
Google Fonts. Google, https://fonts.google.com. Accessed 22 Mar. 2025.
“HTML: HyperText Markup Language.” MDN Web Docs, Mozilla, https://developer.mozilla.org/en-US/docs/Web/HTML. Accessed 22 Mar. 2025.
iam-robin. Severance Interface. GitHub, 2022, https://github.com/iam-robin/severance-interface. Accessed 22 Mar. 2025.
ivelasq. Severance. GitHub, 2022, https://github.com/ivelasq/severance. Accessed 22 Mar. 2025.
“JavaScript.” MDN Web Docs, Mozilla, https://developer.mozilla.org/en-US/docs/Web/JavaScript. Accessed 22 Mar. 2025.
p5.js Reference. Processing Foundation, https://p5js.org/reference. Accessed 22 Mar. 2025.
Pole11. Spirograph. GitHub, 2023, https://github.com/Pole11/spirograph. Accessed 22 Mar. 2025.
rspt. Processing Spirograph. GitHub, 2017, https://github.com/rspt/processing-spirograph. Accessed 22 Mar. 2025.
seedcode. SpirographN. GitHub, 2021, https://github.com/seedcode/SpirographN. Accessed 22 Mar. 2025.