The new black

The new AI ethics course is now officially underway – actually, we’re close to the halfway mark already, with three out of eight lectures done. I’ve been chiefly responsible for all three, which has kept me thoroughly busy for pretty much all of March, and I’ve seldom felt as deserving of the upcoming long weekend as I do right now. Zoom lecturing, which I had my first taste of in the autumn term, still feels weird but I’m getting used to it. Typically none of the students will have their camera on, and it’s hopeless to try to gauge how an audience of black rectangles is receiving you unless they go to the bother of using reactions. Perhaps a year of online classes hasn’t been enough time for a new culture of interaction to emerge organically – or perhaps this is the new culture, but that sounds kind of bleak to me and I hope it’s not true. 

I’m sure I could have done some things better to foster such a culture myself; I’m fully aware that I’m not the most interactive sort of teacher. On the other hand, I’m firmly of the opinion that teaching applied ethics without having any ethical debates would be missing the point, so we’ve been trying to come up with various ways to get the students sharing and discussing their views. We’ve had some success with supplementary sessions where a short presentation expanding on a minor topic of the main lecture seeds a discussion on related ethical issues, and there has also been some action on the Zoom chat, especially during last week’s lecture on controversial AI applications. It helps that there are many real-world controversies available for use as case studies: people will often have a gut reaction to these, and by analysing that it’s possible to gain some insight into ethics concepts and principles that might otherwise remain a bit abstract. 

Although the course has been a lot of work, some of it in evenings and weekends, it’s also been quite enjoyable, not counting the talking-at-laptop-camera-hoping-someone-is-listening part. Ethics isn’t exactly my bread and butter, so preparing materials for the course has required me to learn a little bit about a lot of different things, which suits me perfectly – I’m a bit of a junkie for knowledge in general, and I’ve never been one to focus all my efforts on a single interest. My eagerness to dabble in everything has probably worked to my disadvantage in research, since we’re way past the days when one person could be an expert in every field of scholarship, but I think it serves me well here. At the same time, the mental stimulation I’ve been getting from looking into all these diverse topics has given me all sorts of ideas for new papers I could write. The most laborious part of the course is over for me now, with my co-lecturer plus some guests taking over most of the remaining lectures, so I may even have the time and energy to actually work on those papers after I’ve had a bit of R&R.

In my latest lecture I talked about the relationship between AI and data. Here I was very much on home ground, since pretty much my whole academic career has revolved around this theme, so it wasn’t hard to come up with a number of fruitful angles to look at it from. I ended up using the ever-popular “new oil” metaphor for data quite a lot; I actually kind of hate it, but it turns out that talking about the various ways in which data is or isn’t similar to oil makes a pretty nifty framing device for a lecture on data ethics. Data is like oil in that it’s a highly valuable resource in today’s economy, it powers a great many (figurative) engines, and it needs to be refined in order to be of any real value. On the other hand, data is not some naturally occurring resource that you pump or dig out of the ground: it’s created by people, and often it’s also about people and/or used to make decisions that affect people, which is where data ethics comes in. 

None of these are very original observations, I’m afraid, but perhaps it’s good to say them out loud all the same. If I do have a more novel contribution to add, it might be this: both oil and data have generated a lot of wealth, but over time we have come to regret using them so carelessly. With oil, we are working to reduce our dependence by adopting alternatives to petroleum-based energy sources and materials, but with data, I’m not sure the idea of an alternative even makes sense, so it looks like we’re slated to keep using more and more of it. That makes it ever more important that we all learn to deal with it wisely – individuals, enterprises and governments alike. The economic value of data is well established by now, so maybe it’s time to pay more attention to other values?