When Technology Goes Wrong: Technical Debt vs Ethical Debt

Why some problems hurt companies, while others hurt people

The Two Types of Tech Problems

When we build technology fast, we create two kinds of problems:

Technical Debt = “We know this isn’t perfect, but we’ll fix it later”

  • Companies choose quick solutions to meet deadlines
  • They plan to improve it when they have time
  • Usually hurts the company’s efficiency

Ethical Debt = “We didn’t think about who this might hurt”

  • Companies assume their technology is harmless
  • They don’t realize the damage until it’s too late
  • Usually hurts vulnerable people who had no say in building it

Why This Matters More Than Ever

Technology today is different. It’s not just tools we use occasionally—it’s everywhere:

1. Deep in Our Lives

  • Social media decides what news we see
  • Algorithms choose who gets job interviews
  • Apps track our every move

The Problem: When biased systems are deeply embedded, they can reinforce discrimination in hiring, housing, and healthcare.

2. Incredibly Complex

  • Multiple systems talking to each other
  • Hard to predict what will happen
  • No one person understands the whole thing

The Problem: When something goes wrong, it’s nearly impossible to figure out why or who’s responsible.

3. Hidden from View

  • Cloud services mask environmental costs
  • Data collection happens in the background
  • Decisions are made by invisible algorithms

The Problem: We can’t fix what we can’t see.

4. Making Decisions Alone

  • AI systems approve loans automatically
  • Recommendation algorithms choose content
  • Automated systems flag people as “risky”

The Problem: Machines make life-changing decisions without human oversight.
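Mechanically, the fix for this last problem is not exotic. Here is a minimal sketch, with entirely hypothetical thresholds and function names, of one common pattern: let the machine decide routine cases, but route high-stakes or low-confidence cases to a human reviewer.

```python
# A minimal sketch (hypothetical API, assumed thresholds) of one antidote
# to fully automated decisions: route hard cases to a human instead of
# forcing the model to decide everything.
from dataclasses import dataclass

@dataclass
class Decision:
    outcome: str  # "approve", "deny", or "needs_human_review"
    reason: str

def decide_loan(model_score: float, loan_amount: float) -> Decision:
    # Thresholds below are invented for illustration only.
    if loan_amount > 100_000:
        return Decision("needs_human_review", "high-stakes amount")
    if 0.4 < model_score < 0.6:
        return Decision("needs_human_review", "model is not confident")
    if model_score >= 0.6:
        return Decision("approve", f"score {model_score:.2f}")
    return Decision("deny", f"score {model_score:.2f}; applicant may appeal")

print(decide_loan(0.55, 20_000))  # routed to a human: the model is unsure
```

The specific numbers don’t matter; the point is that “human oversight” can be an explicit branch in the decision logic rather than an afterthought.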

Real Examples

Healthcare Algorithm (Ethical Debt):

  • System was supposed to identify sick patients who needed extra care
  • It looked “fair” because it used past healthcare spending as a proxy for medical need
  • But due to systemic inequities in access, Black patients were sicker than white patients at the same level of spending
  • Result: Very sick Black patients were denied care they desperately needed (a minimal sketch of this failure mode follows)
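The mechanism is simple enough to simulate. In the sketch below, the numbers, the 0.6 “access gap,” and the 10% enrollment cutoff are all invented for illustration; nothing here comes from the actual study. It shows how a model can be perfectly accurate about spending and still under-select the group whose need has been historically underfunded.

```python
# A minimal, hypothetical simulation of proxy-label bias: two groups with
# identical medical need, but group B historically generates lower costs
# at the same level of sickness. Ranking by cost then under-selects B.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# True medical need is identically distributed in both groups.
group = rng.choice(["A", "B"], size=n)          # "B" stands in for the underserved group
need = rng.gamma(shape=2.0, scale=1.0, size=n)  # same distribution for everyone

# Assumed access gap: group B's spending is 60% of group A's at equal need.
access = np.where(group == "B", 0.6, 1.0)
spending = need * access + rng.normal(0, 0.1, size=n)

# The "algorithm": rank patients by cost (here, a perfectly accurate cost
# signal) and enroll the top 10% in an extra-care program.
threshold = np.quantile(spending, 0.90)
enrolled = spending >= threshold

for g in ["A", "B"]:
    mask = group == g
    print(f"group {g}: share enrolled = {enrolled[mask].mean():.1%}, "
          f"mean need of enrolled = {need[mask & enrolled].mean():.2f}")
```

Run it and group B both gets fewer program slots and has to be sicker, on average, to win one. Optimizing the proxy (cost) quietly betrayed the goal (care).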

Zoom Security (Technical Debt → Ethical Debt):

  • Company rushed to market without proper security
  • Knew it wasn’t perfect but planned to fix it later
  • When the pandemic hit, security flaws exposed students to harassment and hate speech
  • Technical shortcut became an ethical crisis

How to Think About Solutions

Ask the Right Questions

Before Building:

  • Who might be harmed by this?
  • What could go wrong?
  • Are we including affected communities in the design?

After Deploying:

  • Who is actually being affected?
  • Are outcomes fair across different groups?
  • How do we know if something’s going wrong?

Ongoing:

  • What can’t we see that we should be monitoring?
  • Who should have a say in how this technology is used?
  • How do we fix problems when we find them?

Build Better Systems

Include People Early:

  • Talk to communities who will be affected
  • Test for bias and discrimination (a concrete sketch follows this list)
  • Plan for things going wrong
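What does “test for bias” look like concretely? A basic pre-deployment check, sketched below with a made-up helper and toy data, is to compare error rates across groups, for example how often each group’s genuine positives get missed:

```python
# A minimal sketch of a pre-deployment bias test: compare false negative
# rates across groups on labeled evaluation data. The function name and
# the toy rows are illustrative assumptions.
from collections import defaultdict

def false_negative_rates(rows):
    """rows: iterable of (group, true_label, predicted_label), labels 0/1."""
    fn = defaultdict(int)   # missed positives per group
    pos = defaultdict(int)  # actual positives per group
    for group, label, pred in rows:
        if label == 1:
            pos[group] += 1
            if pred == 0:
                fn[group] += 1
    return {g: fn[g] / pos[g] for g in pos}

# Toy evaluation set: (group, truth, model output)
eval_rows = [
    ("A", 1, 1), ("A", 1, 1), ("A", 1, 0), ("A", 0, 0),
    ("B", 1, 0), ("B", 1, 0), ("B", 1, 1), ("B", 0, 0),
]
print(false_negative_rates(eval_rows))
# {'A': 0.33..., 'B': 0.66...}: group B's real positives are missed twice as often
```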

Stay Vigilant:

  • Monitor outcomes across different groups (see the sketch after this list)
  • Have real humans review important decisions
  • Create ways for people to appeal automated decisions
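Monitoring outcomes across groups can start as something this simple: a recurring check of each group’s approval rate that raises a flag when one group falls far behind. The sketch below is illustrative (group names and counts are invented); the 80% threshold borrows the “four-fifths rule” from US employment-discrimination practice, but the right threshold for any given system is a policy decision, not a constant.

```python
# A minimal sketch of ongoing outcome monitoring: compute the approval
# rate per group and flag any group whose rate falls below 80% of the
# highest group's rate. All names and numbers are illustrative.

def disparate_impact_alert(approvals_by_group, threshold=0.8):
    """approvals_by_group: {group: (approved_count, total_count)}."""
    rates = {g: a / t for g, (a, t) in approvals_by_group.items()}
    best = max(rates.values())
    flagged = {g: r for g, r in rates.items() if r < threshold * best}
    return rates, flagged

rates, flagged = disparate_impact_alert({
    "group_A": (480, 1000),  # 48% approved
    "group_B": (310, 1000),  # 31% approved
})
print(rates)    # {'group_A': 0.48, 'group_B': 0.31}
print(flagged)  # {'group_B': 0.31}: below 80% of group_A's rate, needs review
```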

Take Responsibility:

  • Don’t hide behind “the algorithm decided”
  • Be transparent about how systems work
  • Fix problems quickly when they’re discovered

The Hard Truth

Some problems can’t be solved with better ethics alone:

  • Market failures need regulation
  • Fundamental design flaws need different technology
  • Social inequities need broader social change

Ethics helps us see and understand problems, but fixing them requires action across society.

Questions We Must Answer

About Responsibility:

  • When a biased algorithm denies someone a job, who’s accountable?
  • Should companies pay for the harm caused by their technical shortcuts?

About Fairness:

  • Is it okay for technology to work better for some people than others?
  • How do we balance innovation speed with thorough safety testing?

About Control:

  • When should humans be required to make decisions instead of machines?
  • How much automation is too much?

About the Future:

  • What ethical problems are we creating today that we don’t see yet?
  • How do we build technology that serves everyone, not just those who can afford it?

The Bottom Line

Technical debt asks: “What’s this going to cost us?” Ethical debt asks: “Who’s going to pay?”

Too often, the people who pay for ethical debt are those who had no voice in creating it. As technology becomes more powerful and widespread, we can’t afford to discover our ethical problems after people have already been harmed.

The goal isn’t perfect technology—it’s responsible technology. Technology that acknowledges its power, considers its impact, and takes responsibility for its consequences.

The choice is ours: Will we build technology that serves all of humanity, or just the people building it?


What ethical questions do you think we should be asking about the technology in your life?

References

Petrozzino, C. (2021). Who pays for ethical debt in AI? AI and Ethics, 1, 205–208. https://doi.org/10.1007/s43681-020-00030-3