All Student Vote (Summer 2023)
Warwick Against Unethical Scientific Research
For the President, VP Education, VP Postgraduate, and VP Welfare and Campaigns to lobby against any research into ABA at the University and against all research into the use of AI for security and defence purposes, and for the officers mentioned to seek clarity on the ethical frameworks used for research taking place across the University, so that the University does not fund or support research into medical practices such as ABA that stigmatise and harm marginalised groups.
This Union notes:
- There are several current and previous research projects at the University which are problematic in nature. Of particular concern, having come to light in recent months, is research taking place within two centres: the Centre for Educational Development, Appraisal and Research (CEDAR) within the Faculty of Social Sciences and the Signal and Information Processing Lab (SIP) within the Department of Computer Science.
- CEDAR is a participant in the Sharland Foundation Developmental Disabilities ABA Research and Impact Network. This network attempts “to increase the reach and impact of Applied Behaviour Analysis (ABA)” by bringing together “an extended network of like-minded research practitioners”.
- ABA is a therapeutic intervention often used to modify the behaviours of neurodivergent and/or mentally ill people. It has been associated with PTSD amongst patients, and a report by a committee of the Oireachtas (the legislature of the Republic of Ireland) found that therapies such as ABA “cannot uphold the United Nations Convention on the Rights of Persons with Disabilities principles of autonomy, dignity, right to identity and freedom from non-consensual or degrading treatment”. Autism@Warwick have urged CEDAR to review their promotion of ABA-based practice.
- The Signal and Information Processing Lab has undertaken, and is undertaking, a number of collaborative research projects into AI, often funded by the defence sector, which are likely to give repressive governments here and around the world even stronger tools with which to monitor and control resisting populations.
- For example, the SIP worked on a $15m collaborative project into voice recognition led by Verint Systems, an Israeli cyber-security company which has sold surveillance technology to the governments of Azerbaijan and Indonesia, where it was used to monitor and imprison political dissidents and members of the LGBTQ community. The project was run with other partners including the French arms company Airbus and the Police Service of Northern Ireland, both of which have practices associated with human rights violations.
- Other projects from SIP include projects which use deep learning to “identify small faces in videos and cluster them by similarity” funded through the Ministry of Defence, and another which aimed to advance the use of deep learning “to match identities across modalities and databases, for example, using voice to match a face and vice versa.”
- The proliferating use of AI in the security and defence sector has been widely condemned on the grounds that AI often reproduces existing forms of bias and discrimination, such as racism and sexism. The UN Committee on the Elimination of Racial Discrimination found that “the increasing use of facial recognition and other algorithm-driven technologies for law enforcement and immigration control risks deepening racism and xenophobia and could lead to human rights violations.”
This Union Believes:
- All research has ethical considerations. This is particularly true when research will go on to have a broader impact in the world. Warwick has a responsibility to uphold strong ethical standards, particularly because of its strong research culture. Both ABA research and SIP research can function to strengthen the ability to control and repress marginalised populations, whether autistic children or migrants seeking to cross borders.
- ABA is a therapeutic approach which medicalises and stigmatises autism, and treats autistic behaviours as needing to be corrected, effectively pushing for autistic people to “normalise” themselves rather than advocating for a society which includes them. Our University should not be carrying out research into this practice, which has produced trauma in patients that can be characterised as abuse.
- Coupled with an increasingly repressive government which has already suppressed our right to protest, the misuse of AI presents a dystopian future for freedom in Britain, and around the world as these technologies are shared amongst governments and private security companies.
- The Union has also recently passed motions in opposition to University partnerships with companies involved in the arms trade and research into autonomous weapons. We as a student body will not sit idly by whilst the University facilitates research that perpetuates harm in a world marred by discrimination and violence.
This Union Resolves:
- For the President, VP Education, VP Postgraduate, and VP Welfare and Campaigns to lobby:
- Against any research into ABA at the University.
- Against all SIP projects into the use of AI for security and defence purposes, e.g. facial recognition, voice recognition, and behavioural monitoring.
- For the officers mentioned to seek clarity on the ethical frameworks used for research taking place within SIP and CEDAR and across the University, so that:
- AI research at the University does not increase the capacity of the UK government, or of other governments and private security agencies around the world, to monitor citizens and repress protest.
- The University does not fund or support research into medical practices such as ABA that stigmatise and harm marginalised groups.