Take the story about "addictive tech." At its root, the idea that tech companies can compel us to use their products, and that we never become inured to their tricks, is just the mirror image of the self-congratulatory story tech tells itself about its power to shape our behavior.
"Good Intentions, Bad Inventions," a new pamphlet by Amanda Lenhart and Kellie Owens of Data & Society, takes aim at this story and three others, painting a more nuanced (and evidence-based!) picture of what's wrong with tech and how to improve it.
The addictive-tech story is grounded in a kind of evolutionary-psych fable about our caveman brains meeting high technology. That fable is just...a story, devoid of any evidence.
The evidence for tech addiction is thin, contradictory, and controversial. Individually, we each have different degrees of susceptibility, and it's possible to have a pathological reaction to almost anything, but that's a story about people and their vulnerabilities, not about technologies.
Meanwhile, marginalized groups often benefit from new technologies, and the increased access and visibility they afford.
The authors recommend expanding "research to include the needs and values of a broader range of users, including youth, communities of color, and other historically marginalized populations."
They call for more diverse tech workplaces, and for paying diverse workers for any extra work they do to make products better for a broader range of users.
The next story they take on is the idea that bad tech should be fixed with good tech: the seductive neutrality of a tech fix belies the complexity of systemic problems, and asking the same people who built a broken tool to fix it just compounds the error.
So rather than paternalistic "choice architectures" and "nudging," figure out who's missing from your design process, and how their absence creates blind spots in your product design.
"Sometimes, the right choice might be to not build a technology at all."
Look beyond tech to the policy sphere: maybe the way to fix a problem is to advocate for a law or regulation.
The third story is "engagement" - the near-universal metric by which tech companies measure the success of their products. Again, measuring the number of hours a user spends or how often they visit your product can seem like an objective way to measure success.
But this elides the subjective experience of users and the cultural context of their use. Metrics can annihilate the experience of outlier users: focusing on the average can blind you to the needs of (and harms to) "atypical" users.
The authors exhort technologists to measure success based on values, not numbers; to divide users into subgroups that you separately evaluate; and to incorporate qualitative investigations into quantitative analysis.
Finally, they tackle the digital detox: "Our health and well-being depend on spending less time with screens and social media platforms."
After all, it's natural that parents want to know how much screen time their kids should have.
The authors point out that it's far better to consider what KIND of screen time your kids are putting in than merely how much time they're spending with their screens.
Screen time can be beneficial or harmful, and for kids who are struggling, it can be a "release valve." But when we see a kid who's having a hard time and spending lots of time online, it's easy to assume the screen is causing the problem rather than managing it.
The authors tell us not to try to infer what users want by observing their use of your platform, but rather to ask users what they want, and to pay attention to non-quantifiable outcomes like "value, pleasure and joy."
"Use ethics teams, diverse product teams, and qualitative social science to broaden the values that guide the design of new products."
I mean, who can argue with that?
@pluralistic thank you for this. As a social worker for families raising youth with behavioral issues, this is a constant issue. Especially now with distance learning and screen burnout.
@pluralistic there is indeed a //severe// lack of ethics in tech
its like jeff goldblums "just because you can, doesnt mean you should"
@pluralistic "figure out who's missing from your design process, and how their absence creates blind spots in your product design."
hell yea! diversity brings diverse experiences, which help with problem solving and it frustrates me that tech is insular
thats not healthy
@pluralistic See also: ads, MtG, Pokemon, and fkn _baseball cards_.
The social network of the future: No ads, no corporate surveillance, ethical design, and decentralization! Own your data with Mastodon!