Over the past few years, Federica Russo, Eric Schliesser, and Jean Wagemans have worked together on TEEXAI, a project funded by the UvA Research Priority Area “Human(e) AI”. One result of this joint project is the article “Connecting ethics and epistemology of AI”, which has just been published in the journal AI & Society.
The need for fair and just AI is often tied to the possibility of understanding AI itself, in other words, of turning an opaque box into a glass box that is as inspectable as possible. Transparency and explainability, however, pertain to the technical domain and to the philosophy of science, thus leaving the ethics and epistemology of AI largely disconnected. To remedy this, we propose an integrated approach premised on the idea that a glass-box epistemology should explicitly consider how to incorporate values and other normative considerations, such as intersectional vulnerabilities, at critical stages of the whole process from design and implementation to use and assessment.

To connect the ethics and epistemology of AI, we perform a double shift of focus. First, we move from trusting the output of an AI system to trusting the process that leads to the outcome. Second, we move from expert assessment to more inclusive assessment strategies, aiming to facilitate both expert and non-expert assessment. Together, these two moves yield a framework usable by experts and non-experts alike when they inquire into relevant epistemological and ethical aspects of AI systems. We dub our framework ‘epistemology-cum-ethics’ to signal the equal importance of both aspects. We develop it from the vantage point of the designers: how to create the conditions to internalize values into the whole process of design, implementation, use, and assessment of an AI system, in which values (epistemic and non-epistemic) are explicitly considered at each stage and inspectable by every salient actor involved at any moment.
Russo, F., Schliesser, E., & Wagemans, J. H. M. (2023). Connecting ethics and epistemology of AI. AI & Society: Knowledge, Culture and Communication. https://doi.org/10.1007/s00146-022-01617-6