#2 The foundations of humane technology: respecting human nature

I’m taking, at my own pace, the free online course “Foundations of humane technology” by the Center for Humane Technology (which I first heard of after watching the excellent documentary “The Social Dilemma”). These are my notes.


  1. Module 1 – setting the stage
  2. Module 2 – respecting human nature
  3. Module 3 – minimizing harmful consequences
  4. Module 4 – centering values
  5. Module 5 – creating shared understanding
  6. Module 6 – supporting fairness & justice
  7. Module 7 – helping people thrive
  8. Module 8 – ready to act

Technology vs. human nature

As humans, we inherit evolutionary conditioning that developed over millions of years.

Center for Humane Technology

🤔 Not to freak out entirely just yet, but my brain just leapt to this thought: When I think of the damage done over just ten years of asocial behaviour on so-called social media, I really worry about evolutionary conditioning and its possible acceleration in the face of individualism and instant gratification.

⚠️ Heuristics (mental shortcuts for making decisions, often fast decisions on incomplete data) can harden into cognitive biases (such as social conformity bias, where our opinions are shaped by the actions of others), which in turn can be manipulated and exploited for profit or power.

Instead of exploiting these biases for short-term profits, humane technology uses them as design constraints for wise choices.

Center for Humane Technology

An interesting overconfidence bias is the above-average effect: most people rate themselves as above average, which of course cannot be true of everyone. Because of it, we believe we cannot easily be manipulated, influenced, or shaped. The truth is that we make decisions based on context and emotion, however great we (think we) are.

For example:

  • What we think is a fair price depends on the first price we hear. This is known as price anchoring.
  • If someone has just read a paragraph containing the word “Ocean” and you then ask them to name their favorite laundry detergent, they’re more likely to choose “Tide.” This is known as subliminal cueing.
  • A nauseating environment inclines us toward harsher moral judgments.

🤔 Personal reflection: hijacked biases

Are there any instances in the last day or two you can think of when your brain’s heuristics may have been hijacked, by technology or anything else?

Yes. It happens at least once a day, often more: I pick up my smartphone to do something specific, and seeing a particular app icon or a notification totally hijacks my focus. Most of the time I don’t even accomplish the thing I had set out to do until after I’ve put the phone down and something else reminds me of it. It makes me wonder how many things I ended up not doing at all 😂

Badge earned: Respect human nature

Design to enable wise choices, NOT to exploit human vulnerabilities

Human vulnerabilities and tendencies need to be factored in as we study user behavior. Ignoring values risks trading what’s right for what works.

When a machine learning algorithm can profile someone, identify their psychological weaknesses, and flood them with information tailored to shift their behavior, manipulation of the many by the few becomes possible in a new and terrifying way.

Center for Humane Technology

Good practice: technologies or products should be capable of the following (see the sketch after this list):

  • Receiving information from users about their experience
  • Learning how the technology does or doesn’t support user values
  • Evolving towards increasing alignment between platform design and user well-being
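To make this feedback loop concrete for myself, here is a minimal, hypothetical Python sketch of the receive → learn → evolve cycle. Everything in it (the FeedbackEntry record, the supports_values flag, the realign_design helper) is my own invention for illustration, not something from the course or any real product API.

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class FeedbackEntry:
    """One piece of user feedback about their experience (hypothetical)."""
    user_id: str
    supports_values: bool  # did the feature support what this user values?
    comment: str


def realign_design(feedback: list[FeedbackEntry]) -> str:
    """Learn from collected feedback and suggest the next design step."""
    # Receive: without feedback there is nothing to learn from.
    if not feedback:
        return "no signal yet: keep collecting feedback"
    # Learn: measure how often the technology supported user values.
    aligned = sum(1 for entry in feedback if entry.supports_values)
    ratio = aligned / len(feedback)
    # Evolve: steer the design toward alignment with user well-being.
    if ratio < 0.5:
        return f"misaligned ({ratio:.0%} positive): redesign before adding features"
    return f"aligned ({ratio:.0%} positive): iterate, and keep listening"


# Usage: run the loop every release cycle, not just once.
entries = [
    FeedbackEntry("u1", supports_values=False, comment="notifications pull me away"),
    FeedbackEntry("u2", supports_values=True, comment="focus mode helps me"),
]
print(realign_design(entries))
```

The point of the sketch is the shape of the loop, not the naive metric: feedback flows in, the product team learns from it, and the design evolves toward user well-being rather than away from it.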

To “give people what they want” is often an easy way to avoid moral responsibility for exploiting human desire.

Center for Humane Technology