Surveillance & Mental Health, Predictive Policing and the State’s Data Accountability

Anthea Indira Ong
Aug 6, 2019

Parliamentary Speech for Home Team Science & Technology Bill, 5 August 2019

Introduction

Mr Speaker, I stand in support of this Bill. I commend the bringing together of more than 1,000 research scientists, engineers and technical staff from the Home Team departments under the new Agency, which will focus on developing cutting-edge technologies such as detection and surveillance capabilities, forensics, and robotics and unmanned systems to safeguard Singapore in this digital age.

However, as we continue to amass the latest and increasingly sophisticated technologies in surveillance and data analytics in our homeland security arsenal, I would like to take this opportunity to highlight three areas of concern for this House and the Ministry to consider.

First, we must ask ourselves: what is the impact on society's prevailing mindset and mental health when we live under the constant gaze of surveillance, with security technologies becoming chillingly ubiquitous?

Second, as we increase the use of artificial intelligence and surveillance in a preventive manner, whether against criminal acts or terrorist threats, what are the implications of algorithmic biases and unwarranted systemic discrimination for our multiracial and multicultural society? How do we maintain a resilient and cohesive society with unbiased policing as we fight terrorism with new technologies?

And lastly, how are we creating a culture of respect for privacy and accountability, when the sheer number of surveillance and data analytics methods used by government agencies makes it difficult for individuals to control the use of their data?

The Paradox of Physical Security versus Psychological Safety

Mr. Speaker, the proliferation of smart technologies is a double-edged sword. Technologies that allow the government to monitor, analyse and manage behaviour to fight crime, detect threats and keep us safe are also the same technologies that may paradoxically give us less psychological safety and have detrimental consequences on our society’s mental wellbeing.

Let's take the example of the $7.5m "smart" lamp post project awarded to ST Engineering in October last year, which includes the installation of artificial intelligence (AI)-based facial recognition surveillance cameras on 110,000 lamp posts around Singapore, among other smart features.

We already have an extensive network of security cameras, but the use of more sophisticated technologies on these cameras means information can now be shared between agencies. According to a report by The Straits Times, the new camera systems can, for instance, analyse faces (down to race, gender and age) to catch speeding e-scooter riders. With this surveillance information, government agencies can increase their situational awareness, detect potential problems and respond quickly to incidents such as unruly crowds, train breakdowns and traffic congestion. It has been shown that the fear and uncertainty generated by surveillance inhibit activity more than any direct action by the police, which is clearly a good outcome for law and order. Yet, Mr. Speaker, we must not forget that the well-being of our citizens goes beyond physical safety and security.

A 2018 study by the Digital Ethics Lab at Oxford University found that surveillance is linked to increased levels of stress, fatigue and anxiety. This constant state of being monitored creates an environment in which one's sense of personal control is greatly diminished: we behave in the desired way just in case we are being watched. Another study, by Duke University in 1966 and still cited in articles and books on privacy today, showed that the shrinkage of the free space in which a person can be "off-stage" and simply be themselves, rather than striving to appear respectable by society's standards, may increase the frequency of anxiety and withdrawal from social roles, both of which are signs of mental illness.

Other studies have also argued that the long-term damage to one's psyche from prolonged surveillance creates a culture of self-censorship and apprehension, which clearly inhibits creativity and innovation, the very bedrock of the future of our economy and our Smart Nation vision.

Mr. Speaker, I may now sound like a broken tape recorder, but I cannot emphasise enough the urgent need to make mental health a national priority, especially given the upward trend of one in seven Singaporeans experiencing a mental health condition in their lifetime, according to the 2016 Singapore Mental Health Study published in 2018.

Unless the mental well-being of our citizens is a top priority in policy narratives, we may never intentionally and adequately address the trade-offs between the security imperative and the potentially suffocating psychological effects of living under the gaze of the authorities. Again, take the example of the 100,000 POLCAMs: how do we decide how much surveillance is enough when we already enjoy a desirably low crime rate? What is truly necessary, and when does it become surveillance overreach, especially considering the financial and social costs?

I urge the Ministry and the new Agency to be mindful of the delicate balance needed between keeping our streets safe and safeguarding the psychological resilience of our people, so as to maintain a strong social compact. I would like to ask the Minister to clarify whether an additional clause can be provided in section 5(2) for the Agency, in performing its functions, to have regard, in addition to the current provisions, to the social and psychological consequences of these scientific and technological advances on our citizens.

Maintaining a cohesive, multicultural society through unbiased policing

Mr Speaker, this brings me to my second concern: unbiased policing with predictive technologies. While predictive policing through AI may be the future of data-driven law enforcement, promising convenience and accuracy in the face of potential threats, we cannot ignore the risk that unwarranted systemic biases may lead to inadvertent discrimination against some communities. Predictive policing risks reinforcing known biases in law enforcement. Bias may also lead to the over-policing of certain communities, heightening tensions, or, conversely, to the under-policing of communities that may actually need law enforcement intervention but do not feel comfortable alerting the police.

Facial recognition systems tend to disproportionately misidentify women and people from ethnic minorities. These systems are neural networks trained on vast datasets of faces drawn from different groups. If the training data consists for the better part of male subjects from the ethnic majority, which is statistically more likely, the system will be more accurate for that group and less accurate for others.
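
To make the mechanism concrete, here is a minimal sketch in Python, with entirely made-up numbers and group labels rather than any agency's actual system or data, of the kind of per-group error audit that can expose such disparities before deployment: if one group's misidentification rate is materially higher than another's, the training data or model needs rebalancing.

```python
# A minimal, hypothetical sketch (illustrative data only): auditing a
# face-matching model's error rate per demographic group. A real audit would
# use a proper evaluation set and the vendor's actual model; this only shows
# the arithmetic of comparing error rates across groups.
from collections import defaultdict

# Each record: (demographic_group, was_a_true_match, model_said_match) -- made up.
trials = [
    ("majority_male",   True,  True),  ("majority_male",   False, False),
    ("majority_male",   True,  True),  ("majority_male",   False, False),
    ("minority_female", True,  False), ("minority_female", False, True),
    ("minority_female", True,  True),  ("minority_female", False, False),
]

counts = defaultdict(lambda: [0, 0])  # group -> [errors, total trials]
for group, truth, predicted in trials:
    counts[group][0] += int(truth != predicted)  # count a misidentification
    counts[group][1] += 1

for group, (errors, total) in counts.items():
    print(f"{group}: {errors}/{total} misidentifications ({errors / total:.0%})")
```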

Let me share two examples, Mr. Speaker:

  • Last year, Amazon's facial recognition technology falsely matched 28 members of the US Congress with people who had been arrested for crimes, disproportionately misidentifying African-Americans and Latin-Americans. This raises the concern that face surveillance used by governments can fuel discriminatory surveillance and target racial minorities.
  • These harmful effects are already being felt in China, where facial recognition software targets domestic minority populations, particularly the Muslim Uighur minority, in China's "war on terror". "Anomalies" in their behaviour that can trigger suspicion include "dressing in an Islamic fashion and failing to attend nationalistic flag-raising ceremonies". This is worrying, as it treats culture, religion and ethnicity as legitimate identifiers of threat.

In Singapore's multicultural context, inadvertent algorithmic bias can put our harmonious social fabric at risk. No community should be perceived to be singled out. Can the Minister please clarify what safeguards and mitigating measures we have in place to avoid such misidentifications and algorithmic biases?

An Informed Citizenry: Cultivating a Culture of Respect for Privacy & Government Accountability

Last but not least, Mr Speaker, please let me share my third concern on data privacy and accountability.

An article by TODAY in 2016 claimed that Singaporeans saw national security as more important than privacy. Nevertheless, this does not diminish the value of privacy or the need for the government to remain accountable to its citizens. Though there may be a prevailing view that Singaporeans are not that concerned about privacy, the fact that the Personal Data Protection Commission received over 1,600 complaints on data protection issues in 2018 alone, as Minister Iswaran told this House, shows that privacy is valued in Singapore.

Convenience cannot be the sole consideration in data collection; citizens deserve to know the purpose of collection and how their data is being used. Therefore, if we want to create a culture of respect for privacy, the way the government handles individuals' data must set the example.

While the government has in place the Public Sector (Governance) Act, under which government agencies may share relevant data only pursuant to data-sharing directions issued by the relevant Ministers, citizens may not be aware of what these directions entail or how their data is being shared between agencies. When the public is aware of the efforts the government takes to safeguard their data, trust in the government is built and the public lives with less fear that their every action is subject to scrutiny. With transparent safeguards, the public can also play the role of a compliance officer, identifying lapses when they occur. Hence, I urge the government to consider informing the public of what kinds of data different government agencies are authorised to access and use, and for what purposes, so as to build a more informed citizenry.

The Public Sector Data Security Review Committee has identified technical, process and people strategies to strengthen our data security regime, given the rising complexity of our systems as society's demand for data grows. It has found that levels of training on data protection vary across the public sector and that many past data incidents were the result of human error. Furthermore, it is not just government systems that are at risk but also the wider ecosystem with which our citizens' data interacts; given the recent data breaches, we need to ensure high data protection standards among third parties as well.

I understand that the Public Sector Data Security Review Committee has until 30 November to submit its findings and recommendations to the Prime Minister. In the meantime, I would like to urge the government to consider constituting an independent oversight body that would evaluate, on an annual basis, whether the various forms of personal data collected through forms or surveillance have been legally and properly collected and managed by public officers. This would include reviewing cases of misuse, each public organisation's data management culture, and whether any changes need to be made to prevent future misuse or breaches.

In the United Kingdom, the Investigatory Powers Commissioner's Office (or IPCO for short) provides independent oversight of the use of investigatory powers by intelligence agencies, police forces and other public authorities. IPCO comprises judicial personnel, scientific experts, inspectors, lawyers and communications experts. In the United States, the Privacy and Civil Liberties Oversight Board reviews information-sharing practices related to efforts to protect the nation from terrorism.

Establishing an oversight body is important as the government gathers ever larger amounts of information through new surveillance measures, including smart lampposts and drone cameras, now being rolled out. The body's report should be made public to ensure state accountability and to demonstrate the government's respect for data privacy.

Conclusion

In conclusion, Mr Speaker, it is clear that artificial intelligence, robotics, the Internet of Things and 5G are changing, and will continue to change, the face of law enforcement. It is tempting to brush off the concerns shared above with the rhetoric of "it's better to be safe than sorry" or "if you have nothing to hide, it won't affect you". However, we must ask ourselves what kind of society we want Singapore to become.

Do we want to build a psychologically resilient and informed citizenry that will also be the ears and eyes for one another in these times of increasing security threats, OR a people that depends completely on these ubiquitous electronic eyes on every lamppost, leaving security entirely to the responsibility of the state?

Do we want to build a culture of respect for privacy, starting with the government setting an example for transparency and accountability, that provides the bedrock of our social compact where there is trust between people and faith in the government? Or a culture of fear and distrust that may impede creativity and innovation?

Mr Speaker, answering these questions will help us fortify the Social and Psychological Defence pillars of our Total Defence for a strong and united Singapore, one that is built in partnership between the government and the people.

Thank you.

Anthea Ong is a Nominated Member of Parliament. (A Nominated Member of Parliament (NMP) is a Member of the Parliament of Singapore who is appointed by the President. They are not affiliated with any political party and do not represent any constituency. There are currently nine NMPs in Parliament.)

The multi-sector perspective that comes from her ground immersion of 12 years in different capacities helps her translate single-sector issues and ideas across boundaries without alienating any particular community/group. As an entrepreneur and with many years in business leadership, it is innate in her to discuss social issues with the intent of finding solutions, or at least of exploring possibilities. She champions mental health, diversity and inclusion — and volunteerism in Parliament.

She is also an impact entrepreneur/investor and a passionate mental health advocate, especially in workplace wellbeing. She started WorkWell Leaders Workgroup in May 2018 to bring together top leaders (CXOs, Heads of HR/CSR/D&I) of top employers in Singapore (both public and private) to share, discuss and co-create inclusive practices to promote workplace wellbeing. Anthea is also the founder of Hush TeaBar, Singapore’s 1st silent teabar and a social movement that aims to bring silence, self care and social inclusion into every workplace, every community — with a cup of tea. The Hush Experience is completely led by lovingly-trained Deaf facilitators, supported by a team of Persons with Mental Health Issues (PMHIs).

Follow Anthea Ong on her public page www.facebook.com/antheaonglaytheng
