GUCU Data & Democracy Reps
October 23, 2019
The attempt to introduce SEAtS displays two tendencies on the part of college management which are likely to cause even more problems in the future, as more data analytics are introduced into college systems:
- tech solutionism – an assumption that having digital systems will solve problems, presented in a way that erases complexity
- lack of critical tech awareness – an active ignoring of the fact that tech systems always have unintended consequences, are likely to amplify existing problems, and may introduce new ones
In response to the recent reply from David Oswell, these concerns can be expressed as two questions (plus supporting points):
1. What is the evidence that diverting this amount of resource into attendance monitoring will have better outcomes than spending much less on an internal solution and putting the rest of the resource into direct support (e.g. counsellors)?
- student support services and teaching staff are already stretched; where is the plan to ensure that additional needs resulting from SEAtS can be properly addressed?
- SEAtS shouldn’t be presented as having the benefit of ‘saving (admin) time’ if the idea is to respond to newly identified demands
- what credible evidence is there that SEAtS attendance monitoring reduces student suicides?
- what additional data would need to be collected, or what additional analytics carried out, to distinguish people at risk of suicide or serious mental health problems from others with the same attendance patterns, and what would the wider implications of that be?
2. Where is the evidence that SEAtS won’t exacerbate existing problems like racism or other systemic prejudices, especially given the alarming findings of the recent ‘Insider-Outsider’ report and the statement agreed by SMT and GARA? David’s email accepts that there is “integrated bias prevalent in digital systems”, but there is no attempt to substantively address this. How are concerned parties supposed to have confidence in a consultation brief that asks more questions about ‘perceptions’ than about actual structural racism and how it might interact with technical systems?
- data protection doesn’t fully protect people from the unequal impacts of data processing systems
- saying that data about protected categories is not collected via SEAtS doesn’t alleviate possible harms in either direction:
  1. harms that occur because of already existing inequalities in students’ situations
  2. the ability to combine SEAtS data with other data to datamine in ways that may also have unequal impacts
- how was the brief to the consultants shaped by the report on racism on campus and by the issues raised by GARA?
- the terms of reference, as presented by David, seem disconnected from messy everyday contexts and mostly concerned with communications strategy.
It seems likely at this stage that college will force through some implementation of SEAtS, as they have certainly signed the contract. However, they won’t get a blanket roll-out: Computing are likely to continue using registermate, and Art will do their own thing. So this may be a long-term campaign. In any case, this is likely to be round one in a longer set of struggles involving technical systems, data and algorithms.