Newsletter #5: 2023-01-12

Hi everyone, hope the first week of class is going alright! So I don’t know about you, but it’s easy to get caught up in the AI part of AICC. It’s the field that’s massively shifting and changing, the thing that’s alternately exciting and scaring the pants off everyone, and the one where months’ worth of papers get released every day.

In some ways, though, it’s the “cultural computing” side where I’m a bit more naturally comfortable. So let me share more of my thoughts and some of the various directions this field can go.

First, I personally see “cultural computing” as being – very broadly – the study of computation as a situated thing. In other words, it’s taking the philosophical stance that:

  • Technologies are products of people at a particular point in time and thus carry personal and cultural values into how they’re built.
  • These values in turn shape how the technology is used and influence the people using the technology.
  • This back-and-forth interplay means that technologies are capable of being a source of cultural hegemony as well as of cultural preservation and resistance.

And this is a stance to take! I think it’s very intuitive but that doesn’t mean it can’t be argued with.

The other stance I tend to take is that “technology” should be thought of really broadly. I kinda crib from Ursula Franklin (Franklin (1999) The Real World of Technology, House of Anansi), who argues that a technology isn’t just the literal thing but all the cultural apparatus of how it’s used and understood. This makes sense, right? Like, the difference between footstool-as-technology and sitting-stool-as-technology isn’t in the ability to fasten four legs to a flat surface; it’s in the understanding of how it’s used and how that shapes what makes it well- or ill-suited for the task.

Okay, a more involved example: smartphone-as-technology is not just the individual technologies of a capacitive touchscreen, cellular internet, and such, but rather the entire culture of always-on internet connectivity, of app stores, of only lightly configurable operating systems, etc.

From this perspective, any technology carries with it a bunch of cultural assumptions and norms. And that’s not inherently a bad thing! Part of the point, from this perspective, is that there is no inherently neutral ground to be found in the creation of any technology, but that understanding this means we can be freer to intentionally ground the things we make in the thoughts and needs of a diverse set of voices.

The phrase “democratization of technology” gets used a lot, but to me the best sense of it is the idea that one can make the creation of technology a participatory democracy, one where everyone doesn’t just have access but actually has the ability to contribute.

Now, an obvious objection might be something like “But how can you have a single technology that reflects everyone’s cultural and personal needs?” and, honestly, my position is that you can’t. I don’t think you can create universal solutions, and that’s okay if it means you focus instead on creating tools that can be adapted to particular community needs. The term often used for that kind of technology is “convivial”, a term popularized (maybe even coined, though I don’t think so) by Ivan Illich in his book Tools for Conviviality (Illich (1973) Tools for Conviviality, Harper & Row, New York). It’s a great little book, really more of a pamphlet than anything else, and while I don’t agree with everything in it, it’s a very provocative text that delves into the dangers of systems that grow so big that they’re outside of any group’s control. In other words, tools and technologies that become so big and so entangled in our lives that we no longer meaningfully consent to using them, nor do we have a choice in how they’re used.

Again, smartphones! They’re great. They’re amazing. Devices with computational power inconceivable in the 20th century, with permanent internet connections that operate at absurd speeds, packed with enough sensors to be a research station, and fed location data by a permanent network of satellites so accurate that we need both special and general relativity to make the proper calculations. Wow!
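Just to underline how wild that last bit is, here’s a back-of-the-envelope sketch of those relativistic clock corrections in Python. The orbital numbers are rough textbook values (nothing from an actual GPS spec), so treat it purely as an illustration:

    import math

    # Rough constants in SI units; approximate textbook values.
    C = 2.998e8            # speed of light, m/s
    GM_EARTH = 3.986e14    # Earth's gravitational parameter, m^3/s^2
    R_EARTH = 6.378e6      # Earth's equatorial radius, m
    R_ORBIT = 2.656e7      # GPS orbital radius (~26,560 km), m
    DAY = 86400            # seconds per day

    # Special relativity: the satellite moves fast, so its clock runs slow.
    v = math.sqrt(GM_EARTH / R_ORBIT)          # orbital speed, ~3.9 km/s
    sr_rate = -(v ** 2) / (2 * C ** 2)         # fractional slowdown

    # General relativity: the satellite sits higher in Earth's gravity well,
    # so its clock runs fast relative to one on the ground.
    gr_rate = (GM_EARTH / C ** 2) * (1 / R_EARTH - 1 / R_ORBIT)

    net = (sr_rate + gr_rate) * DAY * 1e6      # microseconds per day
    print(f"SR effect:  {sr_rate * DAY * 1e6:+.1f} us/day")
    print(f"GR effect:  {gr_rate * DAY * 1e6:+.1f} us/day")
    print(f"Net drift:  {net:+.1f} us/day")
    print(f"Ranging error if ignored: ~{net * 1e-6 * C / 1000:.0f} km/day")

Run that and you get a net drift of roughly +38 microseconds per day which, at the speed of light, works out to kilometers of position error per day if nobody corrects for it.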

But do we even get a meaningful choice in using one anymore? And what does that mean for people who can’t afford these little miracle machines, when more and more of the services we use day-to-day require having them? When our bosses and coworkers can assume that we’re reachable and able to check our email no matter where we are? Or, more darkly, when companies can harvest identifiable information about us from the apps we’re forced to use just to engage with the world?

As a side note, the other author I’d recommend, even though he’s such a curmudgeon that he makes me look like a sunny techno-optimist, is Neil Postman, particularly his book Technopoly. Again, I don’t agree with everything, but if you haven’t read it before it’s a pretty easy read that’s broadly about the idea that technologies that improve some things are not unalloyed goods in all things.

Okay, so far I’ve been talking about the broader world of tech criticism, but there’s a whole other world out there when it comes to cultural computing, sometimes called “ethnocomputing” after the older field of “ethnomathematics”.

So what’s ethnomathematics? If you’re like me, the name sounds a little odd, but it’s really the study of the formulations and performances of mathematics outside the somewhat narrow story we usually tell: Greek academics, then mathematicians in the Middle East and North Africa building off of the Greeks, then European adoption and extension of their work in the mid-second millennium. I feel like it’s only in my lifetime that we’ve even started commonly acknowledging the second step of that over-simplified story. And it’s still an oversimplified story!

Now, there’s a caveat here that not all of the field is about that. There are parts that have, shall we say, bad vibes, like when Wikipedia says that one definition of ethnomathematics equates the practice and study of mathematics outside of the predominantly Western tradition of symbolic reasoning and proof with the informal ways of understanding mathematics that children use. I don’t think I’m editorializing much to say “yikes” there.

For the interesting parts of ethnomathematics, you basically can’t go wrong with anything Swapna Mukhopadhyay has touched. She’s very cool. Her research was cool. I got to hear her give a lecture some years ago and she was awesome. She used to be a professor at PSU but she’s retired now.

For example, she edited this really cool volume by a bunch of different authors about how mathematics has historically been expressed in cultures that have been marginalized: Alternative forms of knowing (in) mathematics: Celebrations of Diversity of Mathematical Practices, Springer Science & Business Media.

I love the title of this “Alternative forms of knowing (in) mathematics” because I think it gets into the heart of the matter: epistemology.

Every scientific community, including mathematics – which, yeah, I count as a science, it’s just not (okay, not entirely) an empirical one – operates with an epistemology, a way of determining what can be known and how we know it. This epistemology isn’t static, either. Inventions like double-blind studies or modern statistical measures like p-values are developments in the epistemology of the sciences. But it’s not like the universe handed us the correct epistemology; there are cultural values encoded in it as well!

This isn’t just highfalutin’ (to be overtly southern for a moment) talk, either. A concrete place where the cultural situatedness of epistemology matters is education! This is where the rubber meets the road in the work of the old Epistemology and Learning research group at MIT – later absorbed into what’s become the lightly-disgraced MIT Media Lab. One of the groundbreaking works here was Sherry Turkle and Seymour Papert’s paper Epistemological Pluralism and the Revaluation of the Concrete (if you haven’t read it, it’s not a long read but it’s very good), which argues for a framework of education that doesn’t prioritize any particular way of building knowledge and instead allows students to build their own frameworks of understanding. It’s a paper about the implicit hegemony of education as a technology and, to tie things back around, about the values that are implicit in how we design programming environments and languages. While not specifically cultural computing per se, this paper and the approach their lab took have influenced a ton of work in constructionist education, which in turn intersects with a lot of attempts at culturally relevant computing and the design of tools and curricula that take into account very culturally specific ways of learning. A famous example is this old paper by Eglash et al.:

Eglash, Bennett, O’Donnell, Jennings & Cintorino (2006) Culturally Situated Design Tools: Ethnocomputing from Field Site to Classroom, American Anthropologist (also available at researchgate.net).

It explored the idea of creating frameworks for learning coding – basically little blocks-based tools a la Scratch – that were useful for creating specific kinds of artifacts that youth already had a connection to.
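To give a flavor of the computation sitting under tools like that: one of the better-known Culturally Situated Design Tools simulates cornrow braids by repeatedly applying a scale-rotate-translate step to a basic plait shape. Here’s a rough sketch of that underlying idea in plain Python – the function name and parameters are my own stand-ins, not anything from the actual tool:

    import math

    def cornrow_curve(steps=20, start=(0.0, 0.0), heading=0.0,
                      plait_length=10.0, scale=0.95, turn_degrees=8.0):
        """Toy version of the iterated transform behind cornrow-style curves.

        Each step places one 'plait', then shrinks (scale), rotates
        (turn_degrees), and translates forward along the current heading.
        Returns the list of plait positions.
        """
        points = [start]
        x, y = start
        angle = heading
        length = plait_length
        for _ in range(steps):
            # translate: move forward by the current plait length
            x += length * math.cos(math.radians(angle))
            y += length * math.sin(math.radians(angle))
            points.append((x, y))
            # rotate and scale: each successive plait is a bit smaller and
            # turned a bit, which produces the characteristic spiral curve
            angle += turn_degrees
            length *= scale
        return points

    if __name__ == "__main__":
        for px, py in cornrow_curve():
            print(f"{px:7.2f} {py:7.2f}")

The hook, as I read the paper, is that the scaling and turning aren’t abstract parameters: they map onto things a braider actually controls, which is exactly the kind of connection between heritage artifacts and formal computation the paper is about.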

More recent work along these lines comes from the absolutely incredible Vernelle Noel and her design systems for the kind of wire sculpture used in Carnival in Trinidad & Tobago. As I understand it, part of the motivation for this work was that, as someone from there, she was concerned about wire sculpture becoming an endangered art as the old masters of the craft aged, and so part of what she did was create a situated computational design tool to help youth understand wire-bending as a craft, passing its cultural and technical knowledge on to future generations.

Here’s one of her more recent papers on the topic, which seems like a great summary of both the social and technical aspects of this research.
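As I understand it, her work builds on shape grammars: you start with some basic shapes and a set of rules for rewriting them, and designs get generated by repeatedly applying the rules. Here’s a very hedged toy sketch of that general idea – my own miniature illustration of a grammar-style generator, not anything from her actual tools, and the rule names are made up:

    import random

    # Toy grammar-style generator: "shapes" are just labeled symbols here,
    # and each rule rewrites one shape into several others. Real shape
    # grammars match and replace actual geometry, but the
    # generate-by-rewriting structure is the same.
    RULES = {
        # hypothetical rules, loosely evoking wire-bent forms
        "frame": [["spine", "spine", "hoop"]],
        "spine": [["segment", "spine"], ["segment"]],
        "hoop":  [["arc", "arc", "arc", "arc"]],
    }

    TERMINALS = {"segment", "arc"}

    def generate(shape, max_depth=6, rng=random):
        """Recursively expand a shape until only terminal pieces remain."""
        if shape in TERMINALS or max_depth == 0 or shape not in RULES:
            return [shape]
        expansion = rng.choice(RULES[shape])   # pick one applicable rule
        pieces = []
        for part in expansion:
            pieces.extend(generate(part, max_depth - 1, rng))
        return pieces

    if __name__ == "__main__":
        random.seed(3)
        print(" + ".join(generate("frame")))

What makes her version situated rather than generic generative design, again as I understand it, is that the rules themselves come out of working with actual wire-benders and their craft knowledge, not out of a designer’s head.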

Again, I cannot stress enough how cool I think this is at all levels, both in terms of what she’s literally done and in the way it models what a non-hegemonic relationship with technology and code could be.

Bringing it back around, she’s created a convivial relationship between generative design and this artistic practice. I don’t know if Illich would have appreciated it immediately but I don’t think it would have been hard to convince him, either.

Now I haven’t talked about how any of this intersects with AI because I was leaving that for the end here. In some sense, this entire thing is an explanation for why I said last time:

How much objectivity is even possible? If a chatGPT system provides an answer is that really “the” answer? There are obviously a lot of questions where “the” answer is very subjective, depending on assumptions and axioms that can dramatically change interpretation of facts.

This gets to the question of “the view from nowhere”, whether there can be a point of view that is inherently objective. And, even further, whether or not you believe such a thing can exist, it’s somewhat obvious that the idea of objectivity has been weaponized to marginalize others.

And this isn’t an idle question either: it affects the very nature of what we do in this space. I think the two extremes are either that we try to eliminate bias so that a future GPT-6 is a completely objective source of accurate information, or that we completely abandon the project of LLMs for information management.

Or, with the language we now have in our hands: is using an LLM like ChatGPT compatible with epistemological pluralism, or is it a hegemonic reification of a dominant epistemology, now taking the form of an objective “model” rather than a culturally situated artifact?

What would a culturally situated version of LLM tools even look like? Certainly, some people think it might be possible! If you have thoughts on how to do it you could probably steer the direction of an entire burgeoning world of technology.

So, that’s enough from me this time for sure but I hope this was an interesting thought dump about cultural computing and how it might be tied to the future of AI.