#8 The foundations of humane technology: Ready to act

I’m taking, at my own pace, the free online course “Foundations of Humane Technology” by the Center for Humane Technology (which I first heard of after watching the excellent documentary “The Social Dilemma”). These are my notes.


  1. Module 1 – setting the stage
  2. Module 2 – respecting human nature
  3. Module 3 – minimizing harmful consequences
  4. Module 4 – centering values
  5. Module 5 – creating shared understanding
  6. Module 6 – supporting fairness & justice
  7. Module 7 – helping people thrive
  8. Module 8 – ready to act

This module addresses how to make lasting changes, building on the tenets of humane technology and highlighting the most pressing issues and stakes around building technology that interacts with human attention.

Graphic summarising the 6 tenets of humane technology, which are the titles of the course modules

Watch and listen to Randima Fernando, co-founder of the Center for Humane Technology, introduce the course on the Foundations of Humane Technology (1:43):

Introduction to the course in less than 2 minutes: Foundations of Humane Technology

The most pressing issues and stakes

As a summary of all I’ve learned in the course modules, here are the most meaningful elements (to me):

  • Technology divides us, distracts us from meeting our biggest global challenges, and undermines our ability to solve problems.
  • Human biases and vulnerabilities helped us survive in ancient times, but they no longer serve us today.
  • We can and should build better tech. Tech that does not leverage human vulnerabilities. Tech that honors attention as sacred, and that centers on individual and societal thriving.
  • A shift is needed from “design for conversion” to “design to enable wise choices” or “design for collaboration and sense-making”.
  • Humane Technologists have a role (a duty?) to proactively design toward human thriving.
  • Be values-driven, and metrics-informed.
  • Recognise that tech isn’t, and cannot be, neutral.
  • Don’t fix tech with more tech. Instead, create fewer harms, even when fixing the problems with yet more tech looks like an alluring business or market opportunity.
  • Fix the causes, fix the crises, re-condition habits, establish trust.

How to make lasting changes?

Now that you’re informed, determined, and ready to act, here is what to expect and how to prepare:

  • You will meet substantial resistance. The parties that are winning have a lot to lose (in the short term).
  • The good news is: you’re not alone! Others are fighting extractive technology.
  • You will also meet substantial resistance where egos are concerned (yours included).
  • The good news is: it’s easier to encourage change when framing it in terms of 1) a spirit of service (because it connects with our desire to help, to be charitable), and 2) openness to learning (because when we recognise that we don’t have all the answers, our sense of self is less rigid – which is the politically correct way of saying we’re not arrogant pricks).
  • Make bold choices that give up short-term security and the established definition of success.
  • Change the definition of success. Rally people around it.
  • Success is living in a world aligned with our values, with humanity, and with long-term thriving.
  • Humane Technologists succeed by rallying others and proving people can be more balanced, more fulfilled, and more impactful.
  • Tell your story (there are 3 stages to storytelling according to movement theorist Dr. Marshall Ganz: the challenge, the choices, the outcome).
  • Find collaborators and experiment until you get a first win, and perhaps even… recognition!
  • [optional] Become the champion of the new definition of success.

🤔 Personal reflection: Telling your story

What is a story that shapes your path as a humane technologist? What is a challenge you’ve faced, a choice you’ve made, and an outcome that others might learn from?

The only relevant story I can think of, since I’m not a technology builder, is how I reformed the team I lead at work. Eight years ago, I was promoted to lead the team I was on. It wasn’t a career goal, and I had not been paying attention to any of the leadership considerations. But once in place, I knew that making small adjustments might have a positive impact. I was aware of my colleagues’ strengths and weaknesses, and there were territories I was keen for us to explore. I wanted to achieve three things: that we do the things that need to be done, that we enjoy ourselves by doing them well, and that we take on select longer-term projects that prepare us for bigger things and are less reactive in nature. So I focused on the “what”, “why” and “how”, provided some guiding principles, built a common sense of our purpose and capabilities, and broke down our projects by affinities so that everyone’s strengths were put to the best possible use. There is flexibility in who exactly does what and how it is done, because we focus on outcomes over tasks and we’ve established some level of redundancy. A few years on, we are inspired to grow our own expertise and learn new skills, and we are autonomous, efficient and credible.

Badge earned: Ready to act

#7 The foundations of humane technology: Helping people thrive

I’m taking, at my own pace, the free online course “Foundations of Humane Technology” by the Center for Humane Technology (which I first heard of after watching the excellent documentary “The Social Dilemma”). These are my notes.


  1. Module 1 – setting the stage
  2. Module 2 – respecting human nature
  3. Module 3 – minimizing harmful consequences
  4. Module 4 – centering values
  5. Module 5 – creating shared understanding
  6. Module 6 – supporting fairness & justice
  7. Module 7 – helping people thrive
  8. Module 8 – ready to act

This module explores the ultimate goal of humane technology: respecting human nature, minimizing harmful consequences, centering values, creating shared understanding, and supporting fairness and justice all converge toward the goal of helping people thrive.

🤔 Personal reflection: Thriving

What does thriving mean to you? Imagine a thriving world, locally and globally. What role do persuasive technologies like social media play in that world?

Thriving ranges from succeeding, to growing, to enjoying oneself, in the pursuit of something of one’s choice. It encompasses all the stages between setting a goal and reaching the outcome. Social media can be both a blessing and a curse because it can inspire, motivate, and inform, but also distract, confuse, or harm.

What considerations and design choices enable thriving?

  • Aim for durable benefits.
  • Support people in relating to pain in ways that enable learning, resilience, personal agency.
  • Develop people’s capacity for: gratitude, creativity, learning.
  • Help people adapt to change and respond wisely to life’s circumstances, rather than react compulsively.
  • Simplify features, journeys.
  • Respect the natural flow of human life, respect natural human boundaries.
  • Help people figure out what wise action to take next, recognising that we can change how we contemplate the past and how we approach the future, but “right now” can’t be changed, and is therefore an opportunity to place our attention and energy on the right solutions.
  • Self-improve, or engage in personal development yourself, because the more we understand ourselves, the better we can be of service to others.
  • Prompt to care, be kind and considerate. Design for a kinder world.
  • Learn from the top five regrets expressed by the dying:
    • I wish I’d had the courage to live a life true to myself and not what others expected of me.
    • I wish I hadn’t worked so hard.
    • I wish I’d had the courage to express my feelings.
    • I wish I’d stayed in touch with friends.
    • I wish I’d let myself be happier.
  • Consider a “regret metric” to assess the performance or relevance of a technology or feature (a toy sketch of such a metric follows this list).
  • Mind the bigger lens, not just individuals’ considerations or short-term interests.
  • Remember that design choices must fit within the ecological ceiling and the social foundation that ensures that no one is left falling short of life’s essentials (such as food, water, shelter, electricity, safety.) (cf. donut model)
  • Remember that addiction narrows the range of pleasurable experiences, making it harder to enjoy not just the object of the addiction but everything else. This results in the addiction(s) becoming less and less rewarding, and leads people to feel increasingly empty.
  • Re-balance user engagement (simple consumption) and creativity: there is more reward in the latter, even though the external validation that likes and shares afford has some value.
  • Shift paradigm: from “compete for attention” to “attention is sacred”.
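To make the regret idea actionable, here is a toy sketch (in TypeScript) of what a “regret metric” could look like, assuming a hypothetical post-session survey; the field names and the threshold are my own illustrative assumptions, not anything prescribed by the course:

```typescript
// Toy sketch of a "regret metric", assuming a hypothetical post-session
// survey. Types, names, and the 25% threshold are illustrative assumptions.

interface SessionFeedback {
  userId: string;
  minutesSpent: number;
  feltTimeWellSpent: boolean; // e.g., answer to "Was this time well spent?"
}

// Share of sampled sessions that users themselves regretted; lower is better.
function regretRate(feedback: SessionFeedback[]): number {
  if (feedback.length === 0) return 0;
  const regretted = feedback.filter((f) => !f.feltTimeWellSpent).length;
  return regretted / feedback.length;
}

const sample: SessionFeedback[] = [
  { userId: "a", minutesSpent: 12, feltTimeWellSpent: true },
  { userId: "b", minutesSpent: 47, feltTimeWellSpent: false },
  { userId: "c", minutesSpent: 5, feltTimeWellSpent: true },
];

// Flag a feature for review when more than a quarter of sampled sessions
// are regretted (an arbitrary threshold, for illustration only).
if (regretRate(sample) > 0.25) {
  console.log("Feature flagged: users regret the time they spend here.");
}
```

What I like about this framing is that the signal comes from users’ own retrospective judgment rather than from time spent, which can be a misleading proxy for thriving.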

Donut model: Path to regenerative systems (sustainability)

I’m digressing to focus on this aspect from Module 3, “minimizing harmful consequences”, which I had neglected to highlight last year when I took the module, but which resonates particularly today.

Graphic of the donut model

If we want (duh!) to create economic systems in service of ecological, cultural, and psychological regeneration, Kate Raworth’s model of Donut Economics describes how economic systems must be productive enough to meet human needs and contained enough to avoid exceeding ecological limits.

There is growing recognition that humans have gone beyond our ecological and social limits. There is increasing pressure to shift the structure of markets and create space for radically new kinds of regenerative technology to emerge. But there is incredible resistance, dammit.

Technology can deliver a twisted version of thriving

“We value creators and want to create a place where they can show off their brilliance, grow their audience, and make real money.”

Often used by products like TikTok

“We’re a place for fun, inspiration, and ideas that matter. We show people the most relevant content so that they can enjoy themselves and lead more informed and meaningful lives.”

Often used by products like YouTube/Google

“We help small business owners find customers who’ll love their products so that their businesses can grow and thrive.”

Often used by products like Instagram

🤔 Personal reflection: Examining flawed stories

Technology companies often have compelling stories about how their products contribute to human thriving. In what ways are they authentic? In what ways do they mimic thriving in toxic ways? Then consider how a product you’ve worked on or use may seek to help people thrive. Does that product define thriving accurately?

There is truth in the statements, such as “show off”, and lies, such as “lead more informed and meaningful lives”. The rest appeals to shallow but powerful temptations. The empty promises may come true for a lucky few. The rest can get stuck in the maelstrom of YouTube-like recommendations and the infinite rabbit hole of “more like this”.

Most free sites and social networks are guided by their own sustainability and therefore prioritise users’ actual benefits last. The exceptions are products built by independent developers and/or developed as free and open source software, by people who are, most of the time, already more aware of and sensitive to ethics. More and more people seem to realise that if a product is free, they are the product; and therefore there is a teeny bit more interest in supporting developers who make a living out of developing software for people, to fulfill specific outcomes.

Design for thriving: Principles

Paradigm shifting:

  • From “Compete for attention” to “Attention is sacred”
  • From “Center engagement and growth” to “Center individual and societal thriving”

1. Protect attention at all costs

“Relentless hijacking of attention is a direct assault on autonomy (and on thriving).”

Center for Humane Technology

Attention is sacred. Like our time, attention is finite. Consider the following design choices (a minimal sketch of helpful friction follows the list):

  • Design to minimize attention-grabbing
  • Design to add delays or helpful friction to keep attention and intention aligned
  • Increase users’ control over their own attention
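As an illustration of the second bullet, here is a minimal sketch of helpful friction for a hypothetical web sharing flow; `shareWithFriction` and the plain browser `confirm()` dialog are illustrative stand-ins, not any platform’s actual API:

```typescript
// Minimal sketch of "helpful friction": slow down an attention-driven action
// just enough to keep attention and intention aligned. Hypothetical flow.

async function shareWithFriction(
  articleUrl: string,
  hasBeenOpened: boolean
): Promise<void> {
  // Friction #1: nudge users to read before sharing.
  if (!hasBeenOpened) {
    const readFirst = window.confirm(
      "You haven't opened this article yet. Read it before sharing?"
    );
    if (readFirst) {
      window.open(articleUrl); // read first; the user can share later, deliberately
      return;
    }
  }
  // Friction #2: a short delay, so the share stays a deliberate act
  // rather than a reflex.
  await new Promise((resolve) => setTimeout(resolve, 1500));
  console.log(`Shared: ${articleUrl}`);
}
```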

2. Design for intentionality

“Thriving requires us to reflect on, set, and then act on our intentions.”

Center for Humane Technology

Because sometimes the conditions for meaningful rest are better than those for optimum engagement, consider the following design choices for the user experience:

  • Prioritize users’ intention over capturing their attention, e.g., choosing the right default incentives (for example, on iOS, a long tap on the Facebook app icon will show a menu offering to “write post”, “upload media”, “search”.)
  • Understand the difference between “honoring intentions” and “manufacturing desire”.
  • Design your homepage in a way that it pulls towards what the users intend.

Badge earned: Help people thrive

#6 The foundations of humane technology: Supporting fairness and justice

I’m taking, at my own pace, the free online course “Foundations of Humane Technology” by the Center for Humane Technology (which I first heard of after watching the excellent documentary “The Social Dilemma”). These are my notes.


  1. Module 1 – setting the stage
  2. Module 2 – respecting human nature
  3. Module 3 – minimizing harmful consequences
  4. Module 4 – centering values
  5. Module 5 – creating shared understanding
  6. Module 6 – supporting fairness & justice
  7. Module 7 – helping people thrive
  8. Module 8 – ready to act

Justice is the degree to which citizens can practically experience democratic values such as freedom of speech, due process, and voting. A “capability approach”, as described by professors Martha Nussbaum and Amartya Sen, enables the realization or fulfillment of these values: for example, it is physically and socially safe to exercise free speech, and voting stations are accessible and easy to get to.

Humane Technologist checklist

Accept uncertainty

  • Any complex system/tool comes with consequences that are hard to predict, and sometimes hard to understand.
  • Acknowledge your limitations with humility.

Anticipate unexpected outcomes

  • Acknowledge that harms are complex, shifting, difficult to predict.
  • Strive to find and address externalities that deepen inequalities.

Form accountable relationships with users

  • Communicate in good faith.
  • Seek knowledge in particular about and from groups affected by the technology.
  • Create space for listening and mutual understanding.
  • Understand how your technology is used.
  • Mind that some communities have a greater vulnerability to harm.
  • Include the most impacted groups in the design process.

I was very pleased to discover in the list of takeaways and resources that the Center for Humane Technology recommends W3C’s free Digital Accessibility Foundations course as a way to create more inclusive digital products and services.


🤔 Personal reflection: Core societal capabilities

What is your vision of a just democracy in the digital age? How have you seen technology enhance and degrade the capabilities of yourself and others?

A just democracy, in an era where technology both brings people together and divides them, is one which prioritises common interests over individual ones and actually allows people to exercise their rights and make a difference. Ideally, the digital age is harnessed to solve hard problems effectively. I see, at the same time, a plethora of ways to express ourselves, but a scarcity of actual means to make a difference, at least at a pace that is encouraging.

🤔 Personal reflection: Relationships with those impacted

Consider a product you’ve worked on. Ask yourself about the relationships you have with those affected by this technology. (If you don’t work on products, try to put yourself in the shoes of the builders/operators of a technology you regularly use.) Do we hear and respond to those who are directly impacted by our product?

Yes, although it may take several user reports to illustrate a given impact. Sometimes addressing issues is deferred until other priorities are crossed off, unless the issues warrant being put at the top of the priority list.

Do those impacted by our product have a way to communicate their experience to decision-makers in our organization?

Yes, our process is pretty open, and ways to give feedback are prominently displayed.

Are there voices that should be better represented in our design process? Are we uniquely positioned to build trust with groups of people whose perspectives are missing?

Yes. Once input is welcomed or sought, and a conversation is established, it is up to the stakeholders to inspire or grant trust. Good faith and reasonable due process are key ingredients that enable issues to be triaged and resolutions to be understood.

Badge earned: Support fairness and justice

#5 The foundations of humane technology: Creating shared understanding

I’m taking, at my own pace, the free online course “Foundations of Humane Technology” by the Center for Humane Technology (which I first heard of after watching the excellent documentary “The Social Dilemma”). These are my notes.


  1. Module 1 – setting the stage
  2. Module 2 – respecting human nature
  3. Module 3 – minimizing harmful consequences
  4. Module 4 – centering values
  5. Module 5 – creating shared understanding
  6. Module 6 – supporting fairness & justice
  7. Module 7 – helping people thrive
  8. Module 8 – ready to act

“Humanity’s biggest problems require a collective intelligence that’s smarter than any of us individually, but we can’t do that if our tools are systematically downgrading us.”

Center for Humane Technology

A shared understanding helps with making sense of and processing noisy information to inform better choices. For example, it’s one’s ability to assess the trustworthiness of claims, catalogue the values held by a community, or find common ground across multiple viewpoints. It can be achieved through responsible journalism, the scientific method, and nonviolent communication.

This module focuses on social media in particular because it warps reality and thus undermines our capacity for shared understanding and effective collaboration.

Beware the common social media distortions

These distortions, or side effects of social media and information technologies, are manipulation artifacts that people are more or less aware of, and which alter the way people grasp concepts and understand reality.

Engaging content distortion

People’s curated best moments distort our view of their lives, and even our views of our own lives, because social media encourages a race to the virality lottery. When the attention landscape is flooded with engaging content, everyone else must mimic it to be heard and seen.

Moral outrage distortion

Algorithms optimize for moral outrage: negativity and inflammatory language spread more quickly than neutral, nuanced, or positive content. This affects television and the news, which must keep up to stay relevant. As a result, negative caricatures are disproportionately represented, and it feels like there is more outrage than there actually is. Public discourse dominated by artificial moral outrage makes progress impossible.

“Extreme emotion” distortion

This is leveraging subconscious trauma or fears by feeding people content they can hardly look away from.

Amplification distortion

This is when social media trains, and rewards, users who present more extreme views, which then get amplified.

Microtargeting distortion

This is the ability to deliver a specific resonant message to one group while delivering the opposite message to another. If these groups don’t overlap much, the contradictory messaging becomes invisible and social conflict can be orchestrated.

“Othering” distortion

This relates to the “fundamental attribution error”, a cognitive bias whereby we are more likely to attribute our own errors to our environment or life circumstances, and others’ errors to their personality. Social media provides ample evidence (which can be, and usually is, strung together by algorithms) of the bad behaviour of particular groups of “others”.

Disloyalty distortion

More than a distortion, it’s really a side effect whereby you may be attacked by members of your own group when you express compassion for, or try to understand, another group.

Information flooding distortion

For example, fake accounts and bots that make topics trend, and thus influence what people are exposed to (hence altering their reality), can be engineered very easily.

Weak-journalism distortion

More than a distortion, it’s a side effect of the tension between substance and attention-grabbing headlines, forcing news organisations to invest less in depth and nuance.

🤔 Personal reflection: Distortions

Think of your work, or consider technology products you regularly use in your personal or professional life. Where might they have contributed to or amplified any of the distortions? What design features can increase or combat distortion in our shared understanding?

I have fallen more or less strongly for most of the distortions, having been an early adopter of the mainstream social media platforms. After about 15 years of being subjected to the manipulations, I developed a late but exponential awareness which spurred me to reclaim my mental health and my social and societal sanity. I consider myself in recovery, and find it more useful to protect myself first, and then the people in my immediate circles, by choosing whether to let the outside signal and noise creep onto my radar and by approaching information with care. I believe that, at scale, three-pronged methods combining education on bias, design choices that optimise for people’s benefit, and corporate communication may contribute to reducing distortions and improving shared understanding.

Rebuilding shared understanding

“Democracy cannot survive when the primary information source that its citizens use is based on an operating model that customizes individual realities and rewards the most viral content.”

Center for Humane Technology

How can design choices steward trust and mutual understanding? How to get to a healthier information ecosystem?

Fight the race for human attention
  • Helpful friction (such as sharing limits, prompting to read before sharing, or prompting to revise language when harmful language is detected) can have notable results. According to Twitter, 34% of the people who were prompted to revise their language did so, or refrained from posting.
  • Optimize algorithms to create more “small peak” winners instead of a small number of “giant peak” ones (see the ranking sketch below).
  • Apply to online spaces the lessons from physical spaces and how their regulation works (such as blocking political ads, or limiting their targeting precision, around elections.)
  • Build empathy by surfacing people’s backgrounds and conditions. Encourage curiosity, not judgement or hate.
  • Invite one-on-one conversations over public broadcast.
  • Provide avenues for de-escalation of online disagreement.
Allow for addressing crises
  • Assume that crises will happen and plan for rapid response.
  • Seek to identify negative externalities among non-users to grow the ability to make design choices that contribute to a more resilient social fabric.
  • Maintain cross-team collaboration so that no bad design decisions are made in isolation. (e.g., are the teams which may be at the origin of harmful features sufficiently in touch with those that mitigate or fix those features?)
  • Enable “blackouts” where features that may create harms are turned off during critical periods (e.g., in the lead-up to an election, turning off recommendations, microtargeting, trends, ads, and autocompletion suggestions.)
Heal the years of toxic conditioning and mental habits, recover and re-train
  • Call out the harms so that there is mutual recognition. Public educational materials like the documentary “The Social Dilemma” should be distributed.
  • Teach/learn to cultivate intellectual humility, explore worldviews, reject the culture of contempt.
  • Rehumanize, then de-polarize.
  • Design to reach consensus in spite of / in harmony with differences.
  • Build smaller places for facilitated conversations where participants don’t compete for the attention of large audiences, but can see and enjoy others’ humanity and rich diversity.
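Circling back to the “small peak” bullet in the first list above: one plausible reading is to damp raw engagement in ranking so that relevance matters more than sheer virality. The sketch below is a toy formulation under that assumption; the score formula and field names are invented for illustration and are not the course’s (or any platform’s) actual algorithm:

```typescript
// Toy ranking that favors many "small peak" winners over a few "giant peaks":
// damp engagement with a logarithm so relevance dominates the score.

interface Post {
  id: string;
  relevanceToUser: number; // 0..1, from whatever matching signal exists
  engagementCount: number; // likes, shares, comments
}

function score(post: Post): number {
  // log1p flattens the advantage of viral giants: going from 1,000 to
  // 1,000,000 engagements adds far less than going from 0 to 1,000.
  return post.relevanceToUser * Math.log1p(post.engagementCount);
}

function rankFeed(posts: Post[]): Post[] {
  return [...posts].sort((a, b) => score(b) - score(a));
}

const feed = rankFeed([
  { id: "viral", relevanceToUser: 0.4, engagementCount: 1_000_000 },
  { id: "niche", relevanceToUser: 0.9, engagementCount: 800 },
]);
// "niche" outranks "viral": 0.9 * log1p(800) ≈ 6.0 vs 0.4 * log1p(1e6) ≈ 5.5
console.log(feed.map((p) => p.id)); // ["niche", "viral"]
```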

Mind the (perception) gap

A perception gap is the body of false beliefs about another party and what that party believes. It leads to polarization. For instance, the perception gap leads to the incorrect belief that people hold views that are more extreme than they actually are. The negative side effect is that people see each other as enemies. Very engaged parties want to win over the others at all costs, while the exhausted majority simply tunes out. Both outcomes are negative for shared understanding (or for democracy, in the case of political polarization.)

Bridging the perception gap is a first step to minimizing division, and a step toward willingness to find common ground, overcome mistrust, and make progress.
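As a rough illustration of how such a gap could be quantified (my own toy formulation, not the course’s or any published study’s methodology): compare what one group estimates the other side believes with what that side actually reports.

```typescript
// Toy perception-gap measure: the distance between the estimated and the
// actual prevalence of a view on "the other side". Survey fields are
// hypothetical; real studies use carefully designed questionnaires.

interface BeliefSurvey {
  estimatedPrevalence: number; // share of the other side assumed to hold an extreme view, 0..1
  actualPrevalence: number; // share who actually report holding it, 0..1
}

function perceptionGap(s: BeliefSurvey): number {
  return Math.abs(s.estimatedPrevalence - s.actualPrevalence);
}

// Example: people estimate 55% of the other side holds an extreme view,
// while only 30% actually do, a 25-point perception gap.
console.log(perceptionGap({ estimatedPrevalence: 0.55, actualPrevalence: 0.3 })); // ≈ 0.25
```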

🤔 Personal reflection: The perception gap

The degree of perception gap matches the degree of dehumanization of the other side: the higher the perception gap, the more likely someone is to find the other side bigoted, hateful, and morally disgusting. How might you be able to reach out to someone unlike you and better learn about their perspective on key issues? How might the “perception gap” concept and its measurement inform the development of technology that creates shared understanding?

Independent and neutral parties who, at critical times like election periods or social upheaval, shed nuanced light on the gap might bridge polarised parties. Otherwise, it takes a great deal of zen, because there is a cost in time and energy, and a personal psychological risk, in approaching people who may not see or recognise the good faith of a first step, or be willing to find common ground. Surfacing elements of the perception gap, to the extent that it is known or can be deduced, and displaying these as helpful indicators, would go a long way toward recalibrating one’s perspectives. Proprietary platforms may not always suggest varied sources of information to put forward, but anything that compromises between integrity and profit is a step towards progress.

Badge earned: Create shared understanding