Day-to-day work at W3C

Last week, an author writing an article for IEEE asked me for a high-level introduction to the World Wide Web Consortium and what its day-to-day work looks like.

Most of the time when we get asked, we pull from boilerplate descriptions, and/or from the website, and send a copy-paste and links. It takes less than a minute. But every now and then, I write something from scratch. It brings me right back to why I am in awe of what the web community does at the Consortium, and why I am so proud and grateful to be a small part of it.

Then that particular write-up becomes my favourite until the next time I’m in the mood to write another version. Here’s my current best high-level introduction to the World Wide Web Consortium and what its day-to-day work looks like, adorned with home-made illustrations I showed during a conference talk a few years ago.

World Wide What Consortium?

The World Wide Web Consortium was created in 1994 by Tim Berners-Lee, a few years after he had invented the World Wide Web. He did so in order for the interests of the Web to be in the hands of the community.

“If I had turned the Web into a product, it would have been in people’s interest to create an incompatible version of it.”

Tim Berners-Lee, inventor of the Web

So for almost 28 years, W3C has been developing standards and guidelines to help everyone build a web that is based on crucial and inclusive values: accessibility, internationalization, privacy and security, and the principle of interoperability. Pretty neat, huh? Pretty broad too!

From the start W3C has been an international community where member organizations, a full-time staff, and the public work together in the open.

Graphic with illustrations showing that the public and members contribute to 52 work groups, and that 56 people in the w3c staff help create web standards of which there were 400 at the time I made this drawing
W3C Overview

The sausage

In web standards folklore, the product, web standards, is called “the sausage”, tongue in cheek. (That’s one of the reasons we made black aprons with a white embroidered W3C icon on the front, as a gift to our Members and group participants when a big meeting took place in Lyon, the capital of French cuisine.)

Since 1994, W3C Members have produced 454 standards. The most well known are HTML and CSS, because they have been so core to the web for so long. In recent years, in particular since the Covid-19 pandemic, we’ve heard a lot about WebRTC, which turns terminals and devices into communication tools by enabling real-time audio and video. Other well-known standards include XML, which powers vast data systems on the web; WebAuthn, which significantly improves security when interacting with sites; and the Web Content Accessibility Guidelines, which put web accessibility to the fore and are critical to making the web available to people of all abilities.

The sausage factory

The day-to-day work we do really consists of setting the stage to bring various groups together, in parallel, to progress nearly 400 specifications (at the moment), developed in over 50 different groups.

There are 2,000 participants from W3C Members in those groups, and over 13,000 participants in the public groups that anyone can create and join and where typically specifications are socialized and incubated.

There are about 50 people on the W3C staff, a fourth of whom dedicate time as helpers who advise on the work and technologies, and ensure easy “travel” on the Recommendation track for the groups which advance web specifications following the W3C Process (the steps through which specs must progress).

Graphic showing a stick figure with 16 arms and smaller drawings of stick figure at a computer, stick figure talking to people, and stick figure next to documents. The graphic lists nine different roles: super interface, representation of w3c in groups, participation and contribution, technical expertise, mastering the process, creation of groups and their management, liaison with other technical groups, being consensual.
Role of the W3C staff in work groups

The rest of the staff operate at the level of strategy setting and tracking for the technical work; soundness of the technical integrity of the global work; meeting the particular needs of industries which rely on the web or leverage it; and integrity of the work with regard to the values that drive us: accessibility, internationalization, privacy and security. Finally, they recruit Members, do marketing and communications (that’s where I fit!), run the events where the work groups meet, and provide general administrative support.

Graphic with stick figures representing Tim Berners-Lee, the CEO and the team, and four areas of help: support, strategy, architecture & technology, industry, project.
W3C team

Why does it work?

Among the unique strengths of W3C are our proven process, which is optimized to seek consensus and aim for quality work, and our ground-breaking Patent Policy, whose royalty-free commitments boost broad adoption: W3C standards may be used by any corporation, by anyone, at no cost. If they were not free, developers would ignore them.

Graphic showing the steps from an idea to a web standard
From an idea to a standard

There are other strengths, but in the interest of time I’ll stop at the top two. There are countless stories and many other facets, but those will be for another time.

Sorry, this turned out to be a bit long; it’s hard to do a quick intro when there is so much work. If you’re still with me (hi!), did you learn anything from this post?

Book: “Persuasion” by Jane Austen

Cover of the book showing a painting of a young woman reading next to an elderly woman

Ah, the wit of Jane Austen is sharp in this novel, and enjoyable as always. 

It is laid out a lot like a theatrical play. Many of the scenes could be staged as they are, except perhaps the long walks and the beach strolls.

This novel is her shortest, I think. However, I was struck by the over-abundance of the word “and”.

I may return to this post with more to add. I only just finished it after starting it yesterday, and I may need to let it sink in.

2022-07-27 update: “and” appears 2802 times (*) across 227 pages (24 chapters). That’s 12.3 per page.

(*) After starting to underline them in my book, I found it tedious and unreliable, so I found an HTML version of the book, stripped it of non-novel cruft using Emacs, and piped a word count to a grep, embracing the nerddom. I then ran a better grep command (**), which my colleague Bert Bos supplied and explained, because the simpler one would find hand, grand or wander, but not And.

 (Wed, 27 Jul 2022 01:11:29 CET)-(koalie@gillie:~:)$grep and /Users/koalie/Library/Mobile\ Documents/com\~apple\~CloudDocs/Downloads/Persuasion\,\ by\ Jane\ Austen.html | wc -l

(**) (Wed, 27 Jul 2022 07:11:31 CET)-(koalie@gillie:~:)$grep -E -i -o '\band\b' /Users/koalie/Library/Mobile\ Documents/com\~apple\~CloudDocs/Downloads/Persuasion\,\ by\ Jane\ Austen.html | wc -l

(where -E = enable extended regexps, -i = case-insensitive, -o = print every occurrence on a separate line, \b = word boundary) [Thanks Bert!]
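To see the difference the two commands make, here is a tiny self-contained sketch with a made-up sample sentence (not a line from the novel); only the flags are the real ones from Bert’s command:

```shell
# Toy illustration of why the word-boundary flags matter.
text='And she wandered the sand, hand in hand, and smiled.'

# Naive count: number of LINES containing "and" anywhere, so "wandered",
# "sand" and "hand" make the line match, and it counts only once per line.
naive=$(printf '%s\n' "$text" | grep -c 'and')

# Better count: -o prints each match on its own line, \b anchors the word
# edges (so "hand" no longer matches), and -i also catches "And".
better=$(printf '%s\n' "$text" | grep -E -i -o '\band\b' | grep -c '')

echo "naive=$naive better=$better"   # prints: naive=1 better=2
```

The naive version both over-matches (hand, sand, wandered) and under-counts (one hit per line, and no capitalized And), which is why the totals from the two commands can differ so much over a whole novel.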

#2 The foundations of humane technology: respecting human nature

I’m taking, at my own pace, the free online course “Foundations of Humane Technology” by the Center for Humane Technology (which I heard of after watching the excellent documentary “The Social Dilemma”). These are my notes.

  1. Module 1 – setting the stage
  2. Module 2 – respecting human nature
  3. Module 3 – minimizing harmful consequences
  4. Module 4 – centering values
  5. Module 5 – creating shared understanding
  6. Module 6 – supporting fairness & justice
  7. Module 7 – helping people thrive
  8. Module 8 – ready to act

Technology vs. human nature

As humans, we inherit evolutionary conditioning that developed over millions of years.

Center for Humane Technology

🤔 Not to freak out entirely just yet, but my brain just leapt to this thought: When I think of the damage done over just ten years of asocial behaviour on so-called social media, I really worry about evolutionary conditioning and its possible acceleration in the face of individualism and instant gratification.

⚠️ Heuristics (shortcuts for making decisions, sometimes fast decisions on incomplete data) may become cognitive biases (such as social conformity biases, where our opinions are shaped by the actions of others), which can be manipulated and exploited for profit or power.

Instead of exploiting these biases for short-term profits, humane technology uses them as design constraints for wise choices.

Center for Humane Technology

An interesting overconfidence bias is the above-average effect: many people think they’re above average, which of course is statistically impossible. Because of it, we believe we cannot easily be manipulated, influenced, and shaped. The truth is that we make decisions based on context and/or emotions, however great we (think we) are.

For example:

  • What we think is a fair price depends on the first price we hear. This is known as price anchoring.
  • If someone has just read a paragraph containing the word “Ocean,” and then you ask them to identify their favorite clothing detergent, they’re more likely to choose “Tide.” This is known as subliminal cueing.
  • A nauseating environment inclines us toward harsher moral judgments.

🤔 Personal reflection: hijacked biases

Are there any instances in the last day or two you can think of when your brain’s heuristics may have been hijacked, by technology or anything else?

Yes. It happens at least once a day, and more, that I pick up my smartphone to do something specific and seeing a particular app icon, or a notification, totally hijacks my focus. Most of the time I don’t even accomplish that specific thing I had set out to do until after I’ve put the phone down and something else reminds me of what I had intended to do. It makes me wonder how many things I ended up not doing after all 😂

Badge earned: Respect human nature

Design to enable wise choices, NOT to exploit human vulnerabilities

Human vulnerabilities and tendencies need to be factored in as we study user behavior. Ignoring values risks trading what’s right for what works.

When a machine learning algorithm can profile someone, identify their psychological weaknesses, and flood them with information tailored to shift their behavior, manipulation of the many by the few becomes possible in a new and terrifying way.

Center for Humane Technology

Good practice: technologies or products should be capable of:

  • Receiving information from users about their experience
  • Learning how your technology does or doesn’t support user values
  • Evolving towards increasing alignment between platform design and user well-being

To “give people what they want” is often an easy way to avoid moral responsibility for exploiting human desire.

Center for Humane Technology

#1 The foundations of humane technology: setting the stage

I’m taking, at my own pace, the free online course “Foundations of Humane Technology” by the Center for Humane Technology (which I heard of after watching the excellent documentary “The Social Dilemma”). These are my notes.

  1. Module 1 – setting the stage
  2. Module 2 – respecting human nature
  3. Module 3 – minimizing harmful consequences
  4. Module 4 – centering values
  5. Module 5 – creating shared understanding
  6. Module 6 – supporting fairness & justice
  7. Module 7 – helping people thrive
  8. Module 8 – ready to act

Setting the stage (notes from 2022-07-10)

“The real problem of humanity is the following: we have paleolithic emotions, medieval institutions; and god-like technology.”

Dr. E.O. Wilson, Sociobiologist

🤔 Personal reflection: taking stock of our current situation

Looking at the UN’s Sustainable Development Goals, which of these goals are most important to you? Which challenges represented by these goals have you experienced personally (directly or indirectly)? Which challenges have you been spared from? Which issues have affected your loved ones or communities you care about?

Climate action as well as peace, justice and strong institutions are the most urgent goals to me, because I believe they have, respectively, the potential to give humanity the time it needs and the framework it requires to make significant progress on many of the other UN sustainable development goals.

All of the issues are truly important. As a middle-aged white European woman, I am fortunate enough to have been born in a place and an era where I don’t suffer from poverty or hunger, or lack of clean water. I have received a quality education, I am healthy, and I live in a country where a health system and well-being are key benefits.

Personally, I am directly and negatively impacted by gender inequality. Not as much as the generations before mine, and hopefully more than the generations to come. I teach my teenage son values which put humans first and which recognize that women and men are equally important. I made choices for myself and for him that speak to responsible consumption, and I hope and expect he will follow a similar or better path, if not lead others to build a better world.

What role should technology play in achieving the goals you care about most?

People rely more and more on technology, which is a blessing and a curse, because science without conscience can have disastrous consequences. There is hope where technical choices are the individual’s, but that is not always the case. Technology can empower people through participatory governance. Safe-fails and careful design and planning are key. Technology may play an essential role in each and every one of the UN’s sustainable development goals. But before we get there, there needs to be a broader realization of the stakes, and the global awareness and will to put humans first.

Our shared understanding of the world is being polluted just like the Earth. The information we use to inform our choices is being mediated by technology designed to maximize engagement, not collective wisdom.

Center for Humane Technology

Systems thinking (notes from 2022-07-15; 2022-07-17)

Systems thinking is the ability to see the problems and solutions in relationship to one another rather than as isolated concerns.

Persuasive technologies use scientifically tested design strategies to manipulate human behavior towards a desired goal like increasing time on site or user engagement. The creation and amplification of these technologies have been called “the race to the bottom of the brain stem.”

Ledger of Harms by the Center for Humane Technology lists the harms to society that technology platforms have created in the race for human attention, due to immense pressure to prioritize engagement and growth:

  • The Next Generations :: From developmental delays to suicide, children face a host of physical, mental and social challenges. Exposure to unrestrained levels of digital technology can have serious long term consequences for children’s development, creating permanent changes in brain structure that impact how children will think, feel, and act throughout their lives.
  • Making Sense of the World :: Misinformation, conspiracy theories, and fake news. A broken information ecology undermines our ability to understand and act on complex global challenges.
  • Attention and Cognition :: Loss of crucial abilities including memory and focus. Technology’s constant interruptions and precisely-targeted distractions are taking a toll on our ability to think, to focus, to solve problems, and to be present with each other.
  • Physical and Mental Health :: Stress, loneliness, feelings of addiction, and increased risky health behavior as technology increasingly pervades our waking lives.
  • Social Relationships :: Less empathy, more confusion and misinterpretation. While social networks claim to connect us, all too often they distract us from connecting with those directly in front of us, leaving many feeling both connected and socially isolated.
  • Politics and Elections :: Propaganda, distorted dialogue & a disrupted democratic process. Social media platforms are incentivized to amplify the most engaging content, tilting public attention towards polarizing and often misleading content. By selling micro targeting to the highest bidder, they enable manipulative practices that undermine democracies around the world.
  • Systemic Oppression :: Amplification of racism, sexism, homophobia and ableism. Technology integrates and often amplifies racism, sexism, ableism and homophobia, creating an attention economy that works against marginalized communities.
  • Do Unto Others :: Many people who work for tech companies — and even the CEOs — limit tech usage in their own homes. Many tech leaders don’t allow their own children to use the products they build — which implies they’re keenly aware that the products from which they make so much money pose risks, especially for young users.

Humanity has always faced difficult challenges, but never before have they been able to so quickly scale up to create truly global threats.

Center for Humane Technology

The leverage points framework

Inspired by Dr. Donella Meadows’ “Leverage Points: Places to Intervene in a System,” the Center for Humane Technology developed a simplified model of leverage points for intervening in the extractive tech ecosystem. (3-minute video explaining the levers and giving simple examples.)

Leverage increases from left to right on the framework, as does the difficulty of implementing changes.

  1. Design Changes: These are adjustments that technology companies themselves make in the visual design and user experience of their platforms.
  2. Internal Governance: These changes are implemented by decision-makers within platforms to shift how internal systems and structures operate.
  3. External Regulation: This occurs when legislators or regulators pass laws that set requirements related to safety, transparency, or interoperability with competing platforms, or create liabilities for unsafe business practices or harms.
  4. Business Model: These changes shift the fundamental operations and profit structures of a firm, for example, a company could move to a subscription model with a sliding scale to ensure broad access.
  5. Economic Goal: Redefining economic success can radically alter how systems behave.
  6. Operating Paradigm: Paradigm changes are the highest leverage point and generally the most difficult to shift. They occur when there is a widespread change in core beliefs, values, behaviors, and operating norms.

We cannot meet our biggest global challenges if our technology distracts us, divides us, and downgrades our collective ability to solve problems.

Center for Humane Technology

🤔 Personal reflection: The ledger of harms

Did any of these harms factor into your decision to take this course? Why do they matter to you personally? Are any of these harms new to you and worth deeper contemplation?

I am acutely aware of all of these harms to society. While I have personally experienced only some of them first-hand, I have witnessed all of them unfold increasingly over the past decade or two, in the three decades that have passed since the invention of the web by Tim Berners-Lee, who then created the World Wide Web Consortium (W3C), where I work, with a mission to develop the open web standards that make the web more powerful and helpful to society.

My growing frustration and feeling of helplessness in the face of market interests factored into my decision to take a course on approaching humane technology. W3C and its stakeholders are gradually, albeit slowly, moving from purely technical standards work to value-based systems thinking. The technical work is already anchored around values that put people first: “User needs come before the needs of web page authors, which come before the needs of user agent implementers, which come before the needs of specification writers, which come before theoretical purity.” (This is called the Priority of Constituencies.) I am watching the social aspect of our work with hope, as there is growing interest among our stakeholders in making “sustainability” a resident among the wide horizontal review groups, joining the long-term subjects of web accessibility (primarily for disabled people), internationalization (for all of the languages and scripts of the Earth), and security and privacy (for everyone).

Showing people that humane technology is possible can help transform consumer distrust of existing platforms into consumer demand for something else.

Center for Humane Technology

We need humane technology that:

  • Respects, rather than exploits, human vulnerabilities.
  • Accounts for and minimizes negative consequences.
  • Prioritizes values over metrics.
  • Supports the shared understanding necessary for effective democracies.
  • Promotes fairness and justice.
  • Helps humans thrive.

🤔 Personal reflection: acting with leverage

How can your own work help to push a leverage point in a meaningful way?

At my level (which isn’t that of a product designer, or technical architect), I strive to influence my peers and colleagues by ensuring we keep being guided by a strong moral and ethical compass, by reminding others of relevant aspects, and in some cases by making proposals for internal discussions.

Where do you see opportunities to push on other leverage points?

The decisions I take for myself are more and more informed by my values, and manifest in whether I choose to support, or not support, certain types of activities, businesses, practices, etc. I evangelize when I can. I teach my teenager the values which I believe make the world a better place. I try to lead by example. Beyond this, I believe that one’s exercising democratic preference is the next best leverage point that is available to me.

A common pattern in the tech industry is thinking that systemic problems have algorithmic solutions. Sometimes that doesn’t work, per the following xkcd illustration.

Badge earned: setting the stage