Emerging Fellows

 


Can liberty be preserved in a surveillance state?

Posted By Ruth Lewis, Friday, October 25, 2019

Ruth Lewis, a member of our Emerging Fellows program, discusses the protection of personal data under surveillance in her tenth blog post. The views expressed are those of the author and not necessarily those of the APF or its other members.

 

What do we make of corporations actively mining our personal data and building digital profiles of us in order to observe our behaviour, predict our needs and nudge us to buy more products? Even more invasive is the retention and use of our information by governments to scrutinize where and when we might commit anti-social acts. Such systems predict our behaviour, or manipulate us into conforming with the dominant governance system, whether through a social credit score or through other, less publicly visible means.

 

The increasingly pervasive practice of ‘predictive policing’ seeks to forecast the risk of criminal activity and acts of terrorism before any crime has taken place. It attempts to predict who the offenders may be, who the victims may be, and where and when the crimes may occur. Profiles are created of individuals, groups and locations deemed to be ‘at risk’ of future crimes, and those identified are subjected to additional surveillance and police intervention to prevent future acts. Predictive policing relies on large-scale data collection and quantitative analysis by artificial intelligence algorithms. Vast pools of stored data are gathered from multiple sources, including historical crime statistics, social media, financial records, CCTV images and the geo-location records of vehicles and mobile phones, much of which would be classified as personal information.
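To make the mechanism concrete, here is a minimal sketch of how a location-based risk score might be derived from historical incident records. The districts, data and scoring rule are invented purely for illustration; real predictive-policing systems combine far more data sources and far more elaborate models.

```python
from collections import Counter

# Hypothetical historical incident records: (district, hour_of_day).
historical_incidents = [
    ("north", 23), ("north", 22), ("north", 1),
    ("south", 14), ("east", 20), ("north", 0),
]

def risk_scores(incidents):
    """Score each district by its share of recorded incidents."""
    counts = Counter(district for district, _ in incidents)
    total = sum(counts.values())
    return {district: count / total for district, count in counts.items()}

scores = risk_scores(historical_incidents)

# Districts with the highest scores would be flagged for extra patrols.
for district, score in sorted(scores.items(), key=lambda item: -item[1]):
    print(f"{district}: {score:.2f}")
```

Even in this toy form, the key point is visible: the score is only as good as the records it is built on, which is where the concerns below begin.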

 

Many concerns have been raised about the use of predictive policing, not least that it applies a ‘technology band aid’ to what are often endemic socio-economic and political problems, without seeking to understand and remedy their root causes. It may also cause more harm than good: profiling techniques built on biased algorithmic training data are applied to marginalised or vulnerable groups in our society, subjecting them to ever greater levels of surveillance within a harmful cycle of confirmation bias.
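That confirmation-bias cycle can be illustrated with a toy simulation. In the sketch below, two hypothetical districts have exactly the same underlying rate of offending but slightly unbalanced historical records; patrols follow the records, and only patrolled incidents get recorded, so the initial disparity is reinforced rather than corrected. All figures are invented.

```python
import random

random.seed(0)

TRUE_RATE = 0.3                  # identical true incident rate in both districts
records = {"A": 12, "B": 10}     # slightly unbalanced historical records

for year in range(10):
    flagged = max(records, key=records.get)   # district the model flags as 'high risk'
    # The flagged district receives most of the patrols.
    allocation = {district: (80 if district == flagged else 20) for district in records}
    for district, patrols in allocation.items():
        # Only incidents that occur where officers are patrolling get recorded.
        observed = sum(random.random() < TRUE_RATE for _ in range(patrols))
        records[district] += observed

# The gap between the districts widens sharply, even though their true rates
# are identical: the biased records reinforce themselves.
print(records)
```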

 

A clear and mature understanding of the societal ‘wicked problems’ that lead to crime, together with strong, transparent and accountable governance over such powerful surveillance tools, must be developed before their widespread deployment and use. The creators of these tools need to make an intentional commitment to eradicating inherent bias. The service providers who feed data into the analytic algorithms need to build in a strong commitment to collective and individual privacy, to personal autonomy over our data, and to transparency about the processes and purposes for which that data is collected.

 

Without this, the more likely outcome is that these surveillance tools will catch the innocent in a wide surveillance net. They will undermine personal privacy and liberty, and our ability to live our own lives within the boundaries of the law without psychological or physical inhibition. We should not have to look constantly over our shoulders, wondering who is watching us and how they might be judging us.

 

Truth becomes distorted in the name of crime prevention, and is sometimes bent toward political or coercive outcomes that skirt legality. There are no future facts, and no way of accurately predicting when someone will break the law. There are, however, inalienable human rights, including the right to privacy in our own lives. There is also the ability to examine where we are now, to foresee where this trend may lead, and to be concerned about the type of society it may create. There is the ability to define the type of future we want to live in, one that seeks justice for all. As a start, we as a society need to demand greater accountability and transparency from our governments and our service providers, to protect our liberty, our privacy and our freedom of current and future expression.

 

Our society’s future aspirations extend beyond the terrestrial realm. How should we think about space travel and off-world habitation? As an extension of our current terrestrial culture, with its inherent injustices? Or can we envisage a space of liberty and humanity? And how free would that off-world society be, when mere survival will necessitate extreme co-dependence?

 

© Ruth Lewis 2019

Tags:  data  liberty  rights 

 

Comments on this post...

Tim Morgan says...
Posted Friday, October 25, 2019
This really comes down to establishing a new privacy architecture in society. We can create the technologies & laws to build that privacy architecture, but we will need to get very specific about what dynamic we want surrounding privacy. Privacy used to mean closing a door. Now it means figuring out how much control we desire over passively & actively collected data. Attempts to implement legal & technological privacy measures have to be based on a larger values framework identifying what data about me is "mine", and what isn't. "What do we want out of privacy?" is the key question to me.

Stephen Aguilar-Millan says...
Posted Saturday, October 26, 2019
The younger British Royals are an interesting case here. On the one hand, they crave celebrity and seek publicity. On the other, they find the Press to be intrusive and upsetting. It seems to me that you can't have it both ways and win. Do you want privacy? Or do you want celebrity? The two are mutually exclusive unless you have a completely sycophantic Press, which is what they want.

Ruth Lewis says...
Posted Sunday, October 27, 2019
Hi Tim, please see the developing IEEE P7000 series Standards, where this privacy and ethical architecture is being developed.  I have been involved in this initiative for 2 years now.  https://ethicsstandards.org/p7000/

Ruth Lewis says...
Posted Sunday, October 27, 2019
 Also see ISO/IEC initiatives in this area of Ethical AI, which I am also involved in through Standards Australia. https://www.iso.org/committee/6794475.html