All star
Presented in a satirical manner, a ‘regular’ Alexa engages with people in a sarcastic tone. I altered the behaviour of Amazon’s intelligent assistant Alexa so that it responds sarcastically, in order to make manifest the human values and biases embedded in technological artefacts. The goal of this piece is to discredit the claimed neutrality of algorithms and to accentuate instances of algorithmic discrimination.
During this process, I became acutely aware of the complexity of designing an apparatus that performs within ethical and feminist norms. Most importantly, I saw first-hand how this object was constructed to give specific answers to specific questions; or, by the same logic, to avoid replying to particular questions, for example, questions of a political or religious nature.
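The kind of scripting described above can be sketched as an AWS Lambda handler for a custom Alexa skill: each recognised intent is mapped to a fixed sarcastic reply, and intents the system is built to avoid are deflected. This is a minimal illustrative sketch, not the artist's actual code; all intent names and replies are assumptions.

```python
# Hypothetical sketch of an intent handler for a sarcastic Alexa skill.
# Intent names and reply texts are illustrative assumptions.

SARCASTIC_REPLIES = {
    "WeatherIntent": "Oh, sure, because looking out the window is so hard.",
    "TimeIntent": "It's time you got a watch.",
}

# Intents the system is deliberately built to avoid answering.
AVOIDED_INTENTS = {"PoliticsIntent", "ReligionIntent"}

def build_response(text, end_session=True):
    """Wrap text in the Alexa Skills Kit JSON response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": end_session,
        },
    }

def lambda_handler(event, context=None):
    """Entry point AWS Lambda calls with the incoming Alexa request."""
    intent = event["request"]["intent"]["name"]
    if intent in AVOIDED_INTENTS:
        # Deflect rather than answer, mirroring the commercial device.
        return build_response("I'd rather not get into that.")
    reply = SARCASTIC_REPLIES.get(intent, "Wow, what a fascinating question.")
    return build_response(reply)
```

The point of the sketch is how visible the editorial hand becomes: every reply, and every refusal, is an explicit authored choice.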
Akin to Dunne and Raby’s experiments with robots (2007), I argue that building a ‘robot’ that functions in an unprecedented way, subverting its own nature as an assistant, is one approach to emphasising the embedded ideological character of design. This approach intends to expose the algorithmic mechanism of such technology, encouraging people to reflect on biases in algorithmic systems by interacting with my own biased system.
Interactive installation, interactive audio
Speculative and critical design
Amazon’s Alexa, Amazon Web Services, Arduino