I made straight lines & cover page labels for my #reMarkable

2023-11-16 update: since release 3.8, which happened yesterday on my tablet, straight lines are now available as a new feature!

I recently acquired a second-hand e-ink tablet. The reMarkable 2 comes with few but focused features, optimized mainly for efficient note-taking, and for some sketching.

For the latter, the only assistance available is a few templates that provide guide lines, and the possibility to work with layers. In both cases, the handling tools consist of a couple of erasers and a selection tool which lets you resize, rotate, copy, and paste (it only works for what you put on “paper” with the “pen”, not for what you typed as text). No warping, no inversion, no tool to create any common shape or draw a straight line.

Yet it knows of straight lines because when you use the highlighter on a PDF or EPUB file, it can “snap to text” and your highlighter strokes are transformed into straight lines.

I don’t know how others manage when they prefer not to “jailbreak” (for lack of a better term) their tablet. I don’t care whether I can display a custom image while my tablet is sleeping, but I do care about straight lines and scalable shapes. So I drew some and made a PDF of the pages.
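If you would rather generate such a template page programmatically than draw it by hand (I hand-drew mine), here is a minimal sketch assuming Python with the reportlab library; the page size, margins, line thicknesses, and output file name are arbitrary choices of mine, not anything the reMarkable requires.

```python
# Sketch: generate a one-page PDF of straight-line "templates" to copy from.
# Assumes the reportlab package (pip install reportlab); A4 and the margins
# below are arbitrary choices, not reMarkable requirements.
from reportlab.lib.pagesizes import A4
from reportlab.pdfgen import canvas

width, height = A4
c = canvas.Canvas("line-templates.pdf", pagesize=A4)

# Horizontal lines of increasing thickness, stacked down the page.
y = height - 72
for thickness in (0.5, 1, 2, 4, 8):
    c.setLineWidth(thickness)
    c.line(72, y, width - 72, y)
    y -= 36

# A few vertical lines of different thicknesses.
x = 72
for thickness in (0.5, 2, 8):
    c.setLineWidth(thickness)
    c.line(x, 72, x, y - 36)
    x += 36

# One rectangle.
c.setLineWidth(1)
c.rect(width / 2, 72, 150, 100)

c.showPage()
c.save()
```

The resulting PDF can then be imported onto the tablet like any other document and used as a source page to copy shapes from, as described below.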

How I use them

Note: the illustrations are post-processed with a filter that gives them a slight background tint and alters the colours (the eggplant colour should in fact be black, and the red is in fact much more vivid).

Horizontal and vertical lines of various thicknesses and lengths, and one rectangle

I made horizontal lines of varied thickness and length, a few vertical lines too, and a rectangle.

When I need a line, I navigate to this page in my templates folder, use the selection tool to copy it, navigate to my destination page, and tap the pen. Then I drag it where I want, stretch or shrink it, and rotate it if I need to. And I repeat as often as needed.

Oval black label with hand-written text in white reading: Notes & thoughts

For this cover page label, I used one of the black oval shapes I hand-drew, copied it with the selection tool, navigated to my notebook page, pasted it, and gave it the size I wanted.

Then I added a new layer, chose the calligraphy pen with a thick nib and white ink, and wrote. The layer protects the oval if you erase or if you select and move your words.

Black rectangles of various sizes stacked on top of each other with white hand-writing inside to look like a cover

These are exactly the same instructions as for the oval label, except that I selected all the black boxes of various sizes and used the medium calligraphy nib.

In this particular notebook, I used the same cover page for each of the modules. I duplicated the first one, moved it to the right place, selected the layer where I had written, and made the changes.

Large red circle and red outer outline within which is hand-written in white: My evil schemes, and in block black letters underneath: Book 42, year 2023

This is page 14 of the PDF I made. I duplicated the whole page and moved it to serve as the cover page of a new notebook. I could have selected the shape, copied it, and pasted it elsewhere, but I wanted the circles in exactly the same place.

#8 The foundations of humane technology: Ready to act

I’m taking, at my own pace, the free online course “Foundations of Humane Technology” by the Center for Humane Technology (which I had heard of after watching the excellent documentary “The Social Dilemma”). These are my notes.


  1. Module 1 – setting the stage
  2. Module 2 – respecting human nature
  3. Module 3 – minimizing harmful consequences
  4. Module 4 – centering values
  5. Module 5 – creating shared understanding
  6. Module 6 – supporting fairness & justice
  7. Module 7 – helping people thrive
  8. Module 8 – ready to act

This module addresses how to make lasting changes, building on the tenets of humane technology and highlighting the most pressing issues and stakes around building technology that interacts with human attention.

Graphic summarising the 6 tenets of humane technology which are the titles of the course modules

Watch and listen to Randima Fernando, co-founder of the Center for Humane Technology, introduce the course on the Foundations of Humane Technology (1:43):

Introduction to the course Foundations of Humane Technology in less than 2 minutes

The most pressing issues and stakes

As a summary of all I’ve learned in the course modules, here are the most meaningful elements (to me):

  • Technology divides us, distracts us from meeting our biggest global challenges, and from our ability to solve problems.
  • Human biases and vulnerabilities helped us survive in ancient times, but they do not serve us today.
  • We can and should build better tech. Tech that does not leverage human vulnerabilities. Tech that honors that attention is sacred, and that centers on individual and societal thriving.
  • A shift is needed from “design for conversion” to “design to enable wise choices” or “design for collaboration and sense-making”.
  • Humane Technologists have a role (a duty?) to proactively design toward human thriving.
  • Be values-driven, and metrics-informed.
  • Recognise that tech isn’t, and cannot be, neutral.
  • Don’t fix tech with more tech. Instead, create fewer harms, even if fixing the problems has the allure of more business and market opportunities.
  • Fix the causes, fix the crises, re-condition habits, establish trust.

How to make lasting changes?

Now that you’re informed, determined, and ready to act, here is what to be aware of and what to be prepared for:

  • You will meet substantial resistance. The parties that are winning have a lot to lose (in the short term).
  • The good news is: you’re not alone! Others are fighting extractive technology.
  • You will meet substantial resistance as far as egos are concerned (yours as well.)
  • The good news is: it’s easier to encourage change when framing it in terms of 1) a spirit of service (because it connects with our desire to help, to be charitable), and 2) openness to learning (because when we recognise that we don’t have all the answers, our sense of self is less rigid, which is the politically correct way of saying we’re not arrogant pricks).
  • Make bold choices that give up short-term security and the established definition of success.
  • Change the definition of success. Rally people around it.
  • Success is living in a world aligned with our values, with humanity, and with long-term thriving.
  • Humane Technologists succeed by rallying others and proving people can be more balanced, more fulfilled, and more impactful.
  • Tell your story (there are 3 stages to story-telling according to Movement Theorist Dr. Marshall Ganz: the challenge, the choices, the outcome)
  • Find collaborators and experiment until you get a first win, and perhaps even… recognition!
  • [optional] Become the champion of the new definition of success.

🤔 Personal reflection: Telling your story

What is a story that shapes your path as a humane technologist? What is a challenge you’ve faced, a choice you’ve made, and an outcome that others might learn from?

The only relevant story I can think of, since I’m not a technology builder, is how I reformed the team I lead at work. Eight years ago, I was promoted to lead the team I was on. It wasn’t a career goal, and I had not been paying attention to any leadership considerations. But once in place, I knew that making small adjustments might have a positive impact. I was aware of my colleagues’ strengths and weaknesses, and there were territories I was keen for us to explore. I wanted to achieve three things: that we do the things that need to be done, that we enjoy ourselves by doing them well, and that we take up select new longer-term projects that prepare for bigger things and are less reactive in nature. So I focused on “what”, “why” and “how”, provided some guiding principles, built a common sense of our purpose and capabilities, and broke down our projects by affinities so that everyone’s strengths were put to the best possible use. There is flexibility in who exactly does what and how it is done, because we focus on outcomes over tasks and we’ve established some level of redundancy. After a few years, we’re inspired to grow our own expertise and learn new skills; we’re autonomous, efficient, and credible.

Badge earned: Ready to act

#7 The foundations of humane technology: Helping people thrive

I’m taking, at my own pace, the free online course “Foundations of Humane Technology” by the Center for Humane Technology (which I had heard of after watching the excellent documentary “The Social Dilemma”). These are my notes.


  1. Module 1 – setting the stage
  2. Module 2 – respecting human nature
  3. Module 3 – minimizing harmful consequences
  4. Module 4 – centering values
  5. Module 5 – creating shared understanding
  6. Module 6 – supporting fairness & justice
  7. Module 7 – helping people thrive
  8. Module 8 – ready to act

This module explores the ultimate goal of humane technology: respecting human nature, minimizing harmful consequences, centering (on) values, creating (fostering) shared understanding, and supporting fairness and justice all converge towards helping people thrive.

🤔 Personal reflection: Thriving

What does thriving mean to you? Imagine a thriving world, locally and globally. What role do persuasive technologies like social media play in that world?

Thriving ranges from succeeding, to growing, to enjoying oneself, in the pursuit of something of one’s choice. It encompasses all the stages between setting out a goal and the outcome. Social media can be both a blessing and a curse, because it has the ability to inspire, motivate, and inform, but also to distract, confuse, or harm.

What considerations and design choices enable thriving?

  • Aim for durable benefits.
  • Support people in relating to pain in ways that enable learning, resilience, personal agency.
  • Develop people’s capacity for: gratitude, creativity, learning.
  • Help people adapt to change and respond wisely to life’s circumstances, rather than react compulsively.
  • Simplify features, journeys.
  • Respect the natural flow of human life, respect natural human boundaries.
  • Help people figure out what wise action to take next, recognising that we can change how we contemplate the past and how we approach the future, but “right now” can’t be changed, and is therefore an opportunity to place our attention and energy on the right solutions.
  • Self-improve, or engage in personal development yourself, because the more we understand ourselves, the better we can be of service to others.
  • Prompt to care, be kind and considerate. Design for a kinder world.
  • Learn from the top five regrets expressed by the dying:
    • I wish I’d had the courage to live a life true to myself and not what others expected of me.
    • I wish I hadn’t worked so hard.
    • I wish I’d had the courage to express my feelings.
    • I wish I’d stayed in touch with friends.
    • I wish I’d let myself be happier.
  • Consider a “regret metric” to assess the performance or relevance of a technology or feature.
  • Mind the bigger lens, not just individuals’ considerations or short-term interests.
  • Remember that design choices must fit within the ecological ceiling and the social foundation that ensures that no one is left falling short of life’s essentials (such as food, water, shelter, electricity, safety.) (cf. donut model)
  • Remember that addiction narrows the range of pleasurable experiences, making it harder to enjoy not just the object of the addiction but everything else. This results in the addiction(s) becoming less and less rewarding, and leads people to feel increasingly empty.
  • Re-balance user engagement (simple consumption) and creativity: there is more reward in the latter, even though there is value in the external validation that likes and shares afford.
  • Shift paradigm: from “compete for attention” to “attention is sacred“.

Donut model: Path to regenerative systems (sustainability)

I’m digressing to focus on this aspect of Module 3, “minimizing harmful consequences”, which I had neglected to highlight last year when I took the module but which resonates particularly today.

Graphic of the donut model

If we wanted (duh!) to create economic systems in service to ecological, cultural, and psychological regeneration, Kate Raworth’s model of Donut Economics describes how economic systems must be productive enough to meet human needs and contained enough to avoid exceeding ecological limits.

There is growing recognition that humans have gone beyond our ecological and social limits. There is increasing pressure to shift the structure of markets and create space for radically new kinds of regenerative technology to emerge. But there is incredible resistance, dammit.

Technology can deliver a twisted version of thriving

“We value creators and want to create a place where they can show off their brilliance, grow their audience, and make real money.”

Often used by products like TikTok

“We’re a place for fun, inspiration, and ideas that matter. We show people the most relevant content so that they can enjoy themselves and lead more informed and meaningful lives.”

Often used by products like YouTube/Google

“We help small business owners find customers who’ll love their products so that their businesses can grow and thrive.”

Often used by products like Instagram

🤔 Personal reflection: Examining flawed stories

Technology companies often have compelling stories about how their products contribute to human thriving. In what ways are they authentic? In what ways do they mimic thriving in toxic ways? Then consider how a product you’ve worked on or use may seek to help people thrive. Does that product define thriving accurately?

There is truth in the statements, such as “show off”, and there are lies, such as “lead more informed and meaningful lives”. The rest appeals to shallow but powerful temptations. The empty promises may come true for a lucky few. The rest can get stuck in the maelstrom of YouTube-like recommendations and the infinite rabbit hole of “more like this”.

Most of the sites and social networks that are free are guided by their own sustainability and therefore prioritise users’ actual benefits last, unless they are developed by independent developers and/or as free and open source software by people who, most of the time, are already more aware of and sensitive to ethics. More and more people seem to realise that if a product is free, they are the product; and therefore there is a teeny bit more interest in supporting developers who make a living out of developing software for people, to fulfill specific outcomes.

Design for thriving: Principles

Paradigm shifting:

  • From “compete for attention” to “attention is sacred”
  • From “center engagement and growth” to “center individual and societal thriving”

1. Protect attention at all costs

“Relentless hijacking of attention is a direct assault on autonomy (and on thriving).”

Center for Humane Technology

Attention is sacred. Like our time, attention is finite. Make design choices or consider the following:

  • Design to minimize attention-grabbing
  • Design to add delays or helpful friction to keep attention and intention aligned
  • Increase users’ control over their own attention

2. Design for intentionality

“Thriving requires us to reflect on, set, and then act on our intentions.”

Center for Humane Technology

Because sometimes the conditions for meaningful rest are better than those for optimum engagement, make design choices or consider the following for the user experience:

  • Prioritize users’ intention over capturing their attention, e.g. by choosing the right default incentives (for example, on iOS, a long tap on the Facebook app icon shows a menu offering to “write post”, “upload media”, or “search”).
  • Understand the difference between “honoring intentions” and “manufacturing desire”.
  • Design your homepage in a way that pulls users towards what they intend.

Badge earned: Help people thrive

#6 The foundations of humane technology: Supporting fairness and justice

I’m taking, at my own pace, the free online course “Foundations of Humane Technology” by the Center for Humane Technology (which I had heard of after watching the excellent documentary “The Social Dilemma”). These are my notes.


  1. Module 1 – setting the stage
  2. Module 2 – respecting human nature
  3. Module 3 – minimizing harmful consequences
  4. Module 4 – centering values
  5. Module 5 – creating shared understanding
  6. Module 6 – supporting fairness & justice
  7. Module 7 – helping people thrive
  8. Module 8 – ready to act

Justice is the degree to which citizens can practically experience democratic values such as freedom of speech, due process, and voting. A “capability approach”, as described by Professors Martha Nussbaum and Amartya Sen, means enabling the realization or fulfillment of these values: for example, it is physically and socially safe to exercise free speech, or voting stations are accessible and easy to get to.

Humane Technologist checklist

Accept uncertainty

  • Any complex system/tool comes with consequences that are hard to predict, and sometimes hard to understand.
  • Acknowledge your limitations with humility.

Anticipate unexpected outcomes

  • Acknowledge that harms are complex, shifting, difficult to predict.
  • Strive to find and address externalities that deepen inequalities.

Form accountable relationships with users

  • Communicate in good faith.
  • Seek knowledge in particular about and from groups affected by the technology.
  • Create space for listening and mutual understanding.
  • Understand how your technology is used.
  • Mind that some communities have a greater vulnerability to harm.
  • Include the most impacted groups in the design process.

I was very pleased to discover in the list of takeaways and resources that the Center for Humane Technology recommends W3C’s free Digital Accessibility Foundations course as a way to create more inclusive digital products and services.


🤔 Personal reflection: Core societal capabilities

What is your vision of a just democracy in the digital age? How have you seen technology enhance and degrade the capabilities of yourself and others?

A just democracy, in an era where technology brings people together but also divides them, is one which prioritises common interests over individual ones, and which actually allows people to exercise their rights and make a difference. Ideally, the digital age is harnessed to solve hard problems effectively. At the same time, I see a plethora of ways to express ourselves, but a scarcity of actual means to make a difference, or at least not at a pace that is encouraging.

🤔 Personal reflection: Relationships with those impacted

Consider a product you’ve worked on. Ask yourself about the relationships you have with those affected by this technology. (If you don’t work on products, try to put yourself in the shoes of the builders/operators of a technology you regularly use.) Do we hear and respond to those who are directly impacted by our product?

Yes, although it may take several user reports to illustrate a given impact. Sometimes addressing an issue is postponed until other priorities are crossed off, unless it warrants being put at the top of the priority list.

Do those impacted by our product have a way to communicate their experience to decision-makers in our organization?

Yes, our process is pretty open and the ways to give feedback are prominently displayed.

Are there voices that should be better represented in our design process? Are we uniquely positioned to build trust with groups of people whose perspectives are missing?

Yes. Once input is welcomed or sought and a conversation is established, it is up to the stakeholders to inspire or grant trust. Good faith and reasonable due process are key ingredients that enable issues to be triaged and resolutions to be understood.

Badge earned: Support fairness and justice