
PI alerts regulator about the use of algorithms by the UK Government and their impact on migrants

August 15, 2025


Use case and large-scale processing, including of sensitive data, are not justified as necessary and proportionate

We found that the Home Office had seemingly failed to carry out necessity and proportionality assessments for either tool, examining whether the processing of personal data by the tools is needed to achieve a legitimate aim and is not unnecessarily intrusive. First, there was no justification for their use, and neither tool appeared subject to limitations on its use, according to the documents we have reviewed. Second, there appears to be no mechanism to limit the volume and categories of personal data processed, namely the input data used by each tool to generate its decisions. In our assessment, the stated objectives fail to justify the large-scale data processing that IPIC and EMRT are seemingly used for, which is therefore unnecessary and disproportionate.

The failure to minimise the amount of data processed and to justify the use of the tools themselves is further concerning given that these tools are processing vast amounts of sensitive, so-called ‘special category data’, such as health information, vulnerabilities, ethnic or racial origin and genetic or biometric data for the purposes of unique identification, as well as records of criminal conviction. Such data is afforded additional safeguards as its misuse can lead to discrimination prohibited under human rights instruments and contrary to constitutional protections of non-discrimination. In the case of these tools, the Home Office does not appear to have considered the risks associated with processing such data, in particular in such a large volume, and we believe it has fallen short of its obligations to safeguard the rights and interests of individuals.

Data being re-purposed to facilitate the tools

The data fed into the tools for decision-making, such as an individual’s name, gender, nationality, date of birth and other data associated with their immigration case as outlined above, includes data from varied and broad datasets (e.g. ‘detention details’), which in several cases may have been collected for wholly different purposes.

The re-purposing of datasets outside their original or primary purpose to generate automated recommendations is incompatible with the data protection principle of purpose limitation. This principle requires that personal data be collected for a specific, clearly defined purpose and not re-used in a manner incompatible with that purpose. Any use of that data for an incompatible purpose must be supported by a new lawful basis and an updated data protection impact assessment, which does not seem to have occurred here.

This concern is compounded by the fact that the retention of certain data is unjustified and in breach of the storage limitation principle, which requires that personal data be kept only for as long as is necessary to achieve the purpose for which it was collected, after which it should be deleted or anonymised. In the case of IPIC, some data may be retained for dubious purposes for at least 5 years after all other relevant personal data has been deleted. Data processed under the EMRT appears to have no retention limits at all. The mere availability of this data raises concerns about future mission and function creep, where information collected for one purpose is gradually used for unrelated purposes without proper legal authority.

Discriminatory impacts resulting from the design and use of business rules

From the information gathered, it appears that the current uses of IPIC and the EMRT may be adversely impacting individuals who are subject to those tools.

IPIC’s automated prioritisation function enables case filtering based on nationality, which may result in direct discrimination because it can introduce biases and lead to individuals being treated less favourably solely on the basis of a protected characteristic. In addition, the use of ‘associations data’ — linking individuals to others who have interacted with the criminal justice system — may result in indirect discrimination, as it can disproportionately affect certain groups who are overrepresented in law enforcement datasets due to structural inequalities. These features risk disproportionately targeting certain nationalities and/or ethnic groups without apparent adequate safeguards or justification.


