Supervisor: Martin Kampel
Care work in long-term care (LTC) is considered a genuinely human-centred activity, requiring empathy, emotional investment, physical encounters and intimate, trust-based relations between caregivers and care recipients. AI technologies are being introduced into this professional field to assist care workers in their daily activities and to provide an additional measure of care for clients. This has changed the provision of care, affecting caregivers and recipients alike.
So far, little research has been done on the biases that emerge from AI in this field and on the risks that algorithmic governance of care poses to the profession. Unfair decisions based on data generated by AI technologies can go unnoticed when different big data sets are linked, leading to ethical and social issues in LTC.
- Overview of existing techniques and solutions for LTC
- Evaluate and Enhance Tools for Bias Assessment in Care AI Models: Investigate existing tools and methodologies for detecting and assessing biases in AI models used in LTC settings. Enhance these tools to better capture the unique dynamics of care work.
- Application of Tools on Real LTC Data: Deploy the tools on real-world data from LTC settings. This involves analyzing data patterns and AI decisions to identify potential biases and their sources.
- Pre-processing for Bias Mitigation: Pre-process the data to mitigate underlying discrimination. This includes refining data collection and data selection processes to ensure a fair and balanced representation of diverse care scenarios.
- Model Training and In-Processing for Fairness: Adapt AI model training processes to actively reduce bias. Implement and test algorithms that are designed to learn in a way that minimizes discriminatory patterns.
- Post-Processing Analysis of AI Models: Treat AI models as ‘black boxes’ in the post-processing phase to evaluate their outputs independently. Analyze decisions made by the AI systems to identify any residual biases or ethical issues.
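To make the bias-assessment task concrete, the sketch below computes two standard group-fairness metrics, demographic parity difference and disparate impact, on synthetic binary decisions. The decision scenario and attribute names are hypothetical, not drawn from any real LTC data set.

```python
import numpy as np

# Hypothetical example: binary decisions (e.g., "allocate an extra care visit")
# for clients split by a binary protected attribute; names are illustrative only.
rng = np.random.default_rng(0)
group = rng.integers(0, 2, size=1000)  # protected group: 0 or 1
decision = (rng.random(1000) < np.where(group == 0, 0.6, 0.4)).astype(int)

def demographic_parity_difference(y_pred, group):
    """Difference in positive-decision rates between the two groups."""
    return y_pred[group == 0].mean() - y_pred[group == 1].mean()

def disparate_impact(y_pred, group):
    """Ratio of positive-decision rates; values far from 1 signal bias
    (the '80% rule' flags ratios below 0.8)."""
    return y_pred[group == 1].mean() / y_pred[group == 0].mean()

print("demographic parity difference:", demographic_parity_difference(decision, group))
print("disparate impact:", disparate_impact(decision, group))
```

Libraries such as Fairlearn and AIF360 ship these and many further metrics; enhancing such tooling for care-specific dynamics is exactly the gap the task above names.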
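For the pre-processing task, one well-known technique is reweighing (Kamiran &amp; Calders), which assigns each training instance a weight so that the protected attribute becomes statistically independent of the label. A minimal sketch, assuming binary labels and groups given as numpy arrays:

```python
import numpy as np

def reweighing_weights(y, group):
    """Reweighing (Kamiran & Calders): instance weights that make the
    protected attribute statistically independent of the label.
    Assumes every (group, label) combination occurs at least once."""
    w = np.empty(len(y), dtype=float)
    for g in np.unique(group):
        for c in np.unique(y):
            mask = (group == g) & (y == c)
            expected = (group == g).mean() * (y == c).mean()  # joint prob. if independent
            observed = mask.mean()                            # actual joint frequency
            w[mask] = expected / observed
    return w
```

Training any model that accepts per-sample weights on these values equalizes the weighted positive-label rate across groups, before any model is fitted.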
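One way to realise the in-processing task, sketched here under the assumption of a simple logistic model, is to add a penalty on the gap in mean predicted scores between groups directly to the training loss. The function name and penalty form are illustrative, not a prescribed method:

```python
import numpy as np

def fair_logreg(X, y, group, lam=1.0, lr=0.1, epochs=500):
    """Logistic regression trained with an extra penalty lam * gap**2, where
    gap is the difference in mean predicted scores between the two groups.
    Illustrative in-processing sketch; assumes a binary protected attribute."""
    w = np.zeros(X.shape[1])
    b = 0.0
    m0, m1 = group == 0, group == 1
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        # gradient of the standard log-loss
        g_w = X.T @ (p - y) / len(y)
        g_b = (p - y).mean()
        # gradient of the fairness penalty lam * gap**2
        gap = p[m0].mean() - p[m1].mean()
        s = p * (1.0 - p)  # derivative of the sigmoid w.r.t. the logit
        d_gap_w = (X[m0] * s[m0, None]).mean(axis=0) - (X[m1] * s[m1, None]).mean(axis=0)
        d_gap_b = s[m0].mean() - s[m1].mean()
        w -= lr * (g_w + 2.0 * lam * gap * d_gap_w)
        b -= lr * (g_b + 2.0 * lam * gap * d_gap_b)
    return w, b
```

Raising `lam` trades predictive fit for a smaller score gap between groups, which is the accuracy-fairness trade-off such in-processing algorithms must navigate.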
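For the post-processing step, treating the model as a black box amounts to auditing only its outputs against ground truth, separately per group. A sketch comparing true- and false-positive rates (an equalized-odds check), with hypothetical array inputs:

```python
import numpy as np

def groupwise_error_rates(y_true, y_pred, group):
    """Black-box audit: per-group true- and false-positive rates of a model's
    binary decisions. Large gaps between groups indicate residual bias
    (an equalized-odds violation), regardless of how the model works inside."""
    report = {}
    for g in np.unique(group):
        m = group == g
        report[int(g)] = {
            "tpr": float(y_pred[m & (y_true == 1)].mean()),
            "fpr": float(y_pred[m & (y_true == 0)].mean()),
        }
    return report
```

Because the audit needs only predictions and outcomes, it applies equally to proprietary care-AI systems whose internals are inaccessible.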