Invasion of Privacy: AI Facial Recognition Being Used in Public
By Louis Findlay
Photo by Igor Omilaev via Unsplash
As of early 2026, Police Scotland is in the advanced stages of planning to introduce Live Facial Recognition (LFR) technology, despite strong opposition from human rights groups and ongoing political, legal, and ethical debate.
The technology was originally intended to be in use by 2026 under the force's “Policing 2026” strategy; however, intense public resistance may see the project postponed.
The plans first surfaced in August 2025, when Police Scotland confirmed its intention to move ahead with LFR, aiming to use it to identify suspects in serious crimes, such as violent or sexual offences, and to find missing people.
Rather than holding a traditional consultation, Police Scotland is presenting the rollout as an “ongoing public conversation” about the ethics of the technology and the safeguards put in place.
The Scottish National Party (SNP) passed a resolution in October 2025 opposing the use of LFR unless it's authorised by specific legislation passed by Holyrood, citing it as a “radical departure” from policing by consent.
Police Scotland is framing this as a new security measure to make the country safer, but how safe are we?
With our faces scanned day after day in the search for “criminals” or illegal immigrants, a database is being built that tells whoever runs these systems where we were and at what time, which completely violates our privacy in public.
Independent research has also shown that facial recognition is prone to bias and error – especially higher misidentification rates for women and people of colour. Critics warn this can lead to wrongful stops, harassment or even wrongful arrests.
The technology can make errors, and when mistakes disproportionately affect minority groups, the impact isn't just technical – it's social and discriminatory.
Critics argue that LFR is mass surveillance by default, capturing biometric data on everyone who walks past cameras – whether they're suspects or not. This fundamentally alters expectations on privacy in public life.
Many rights organisations, such as Big Brother Watch, describe it as "dangerously authoritarian" and a "surveillance crisis", breaking down freedoms rather than protecting them.
This is also an example of a government being told by its people that they do not want something to happen in their country, yet pushing ahead with it regardless.
Another drawback of using AI in policing is cost: the hardware AI needs to operate, such as RAM, graphics cards and motherboards, together with the immense amounts of water data centres consume, costs the government a great deal, and the electronics used in AI data servers are becoming very expensive to buy.
RAM prices have surged to extremes, with some DDR5, DDR4 and general DRAM modules rising by anywhere from 100% to more than 500% between late 2025 and early 2026.
Mark Grayson, 19, said: “It's a complete violation of our privacy, and I will start wearing masks if they do that.”
Rama Yuda, 26, said: “It could lead to everyone being tracked all the time.
“We can't go in public without being seen on one of these facial recognisers.”
Jasmine Wilson, 20, said: “It makes me uncomfortable knowing my face is on some database somewhere. I don't like it and find it unnecessary.”
Ross Brown, 20, who approved of the facial recognition system, said: “If it can catch illegal immigrants and criminals, then I don’t really care.
"I don't have anything to worry about; I've done nothing."