Friday, January 30, 2015

Visualising Machinic Surveillance

Steve Mann sends his greetings from the Spaceglasses.com headquarters in Silicon Valley, California, and from Stanford University, where he has just done some veillametrics:
http://wearcam.org/metabaq/tei2015dusting/
and
pic.twitter.com/cYQMSvLek5

Very cool pictures that might help improve human-machine understanding and empathy in the field of surveillance, as humans start to appreciate what machines are doing and seeing.

Wednesday, January 28, 2015

POLICY RECOMMENDATIONS FROM SEMINAR 1


1.     As the disciplines continue to intersect, politics students, academics, and professionals need to be technologically aware, just as computer science and engineering students, academics, and professionals need to be politically aware.
2.     Those involved in data creation and storage need to be mindful of the possibility of that data being misused, intercepted, or commodified by others – with or without their consent. Users need to consider how data can be controlled and accessed, and what use can be made of data once created.
3.     Governments must be aware of the implications of outsourcing surveillance to private entities, both in terms of the negative impacts on competition that can result (as highlighted by Ball et al.) and, more broadly, the fact that securitising an activity implicitly renders it dangerous. If private companies are co-opted into a policing function, there are important implications for the private sector and its customer relations.
4.     A deeper engagement with the concept of privacy and what it means in today’s society needs to be undertaken, from political, journalistic, legal and philosophical perspectives, amongst others. Are technological tools to prevent surveillance sufficient to protect privacy, or are we entering an arms race of technological techniques of surveillance and counterveillance (i.e. measures to block any type of watching)?
5.     The extent to which individuals can avoid interference with their privacy in an increasingly technological society, and the extent to which sousveillance can counteract surveillance, are worthy of further in-depth examination. In particular, is there value in sousveillance without meaningful evidence of accountability?

Seminar 1 - SUMMARY


This summary is based on detailed seminar summaries provided by PhD students Abigail Blyth, Aberystwyth University; George Petry, University of South Wales; and Tiewtiwa Tanalekhapat, Aberystwyth University.

Introduction by Dr Vian Bakir and Dr Andrew McStay: The seminar began with an in-depth discussion of three types of transparency: two originally identified by Bentham – Liberal and Radical Transparency – and one coined by McStay that appears increasingly pertinent to today’s society: Forced Transparency, or transparency without consent or choice.

Keynote 1: Professor Steve Mann, University of Toronto, introduced some key concepts, including sousveillance, surveillance, equiveillance, and priveillance. Mann argues that surveillance is hypocritical, centralised, secretive and corrupt, whereas sousveillance has integrity and openness and is distributed. He predicted a future ‘equiveillance point’ at which the number of cameras carrying out surveillance will be matched by the number of cameras used by the public as wearable media carrying out sousveillant activity. This will, he argues, in due course lead to a situation in which everyone wears cameras, thereby negating the need for, and legitimacy of, surveillance. Slides are available at: http://wearcam.org/html5/mannkeynotes/priveillance.htm
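Mann’s predicted crossover can be stated compactly. As a hedged sketch (the notation below is ours, not Mann’s own), let N_sur(t) be the number of cameras performing surveillance at time t and N_sous(t) the number of wearable cameras performing sousveillance; the equiveillance point is then the moment the two counts coincide:

```latex
% A possible formalisation of Mann's "equiveillance point"
% (the notation is ours, not Mann's own).
% N_sur(t):  cameras performing surveillance at time t
% N_sous(t): wearable cameras performing sousveillance at time t
\[
  t^{*} = \min \bigl\{\, t : N_{\mathrm{sous}}(t) = N_{\mathrm{sur}}(t) \,\bigr\}
\]
% Mann's further prediction amounts to N_sous(t) > N_sur(t) for all
% t > t*, at which point, he argues, surveillance loses its rationale.
```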

In the lively Roundtable on Sousveillance that followed, the group discussed whether the continued focus on the visual aspects (i.e. the ‘veillance’ part) of surveillance was legitimate. Recent events such as the Edward Snowden revelations have shown that surveillance encompasses much more, including the examination of mass data – dataveillance – which sousveillance might not be able to fully counter. The discussion then turned to the possibility of sousveillant data transmigrating into surveillant data if it falls into the wrong hands; the commodification of data; and the ‘dark corners of power’ that lie beyond the reach of sousveillance. The legal and technological aspects, including the law’s fairly limited interpretation of the right to privacy, and issues of encryption and ‘making sense’ of big data, were discussed at length.

Keynote 2: Professor Kirstie Ball, Open University, highlighted the blurring of the lines between the state and the private sector in national security matters. To protect national security, the UK Border Agency placed an obligation on airlines to provide a securitised flow of passport data for those booked on, and taking, flights. Reflecting on the concept of transparency arising from the research project, Ball addressed the notion of transparency as veillance, in which transparency has consequently become multilayered and politicised. She argued that to securitise something is to render it dangerous, and that commercial activities are generally shot through with insecurities.

The Roundtable on Surveillance began with a scintillating short presentation on the work of PlanetLabs - https://www.planet.com/ - a satellite imaging company. This further highlighted the role of the private sector in the collection and dissemination of data, and how such companies interact with national legislation in this field. PlanetLabs is based in the USA, and the US Government has the ability both to access its satellite images and to stop it from photographing certain areas of the world. From an intelligence studies perspective, this continues to raise questions about the interaction between governments and private companies when matters of national security are involved. It also highlights the problematic debate witnessed within the UK Intelligence and Security Committee’s Annual Reports on accountability over matters concerning national security. Some discussion ensued on whether the dichotomy between surveillance and sousveillance remains useful, given the uncertainty of data flows regardless of their origin.

Monday, January 19, 2015

Seminar 1 Position Statement – Dr Lina Dencik, Cardiff University - Journalism/Media



Anti-surveillance resistance in a Snowden era
One of the key questions for me in discussions of surveillance in a Snowden era is what anti-surveillance resistance does and should look like. In this debate, Steve Mann has highlighted different trajectories of resistance that include both sousveillance and counter-surveillance as mechanisms for highlighting and circumventing the architecture of contemporary forms of surveillance. Sousveillance makes use of technologies that form part of the broader ecology of ‘veillance’, but uses these technologies to monitor and observe powerful surveillance actors and infrastructures ‘from below’. The notion here is that, as a form of anti-surveillance resistance, such practice will shed light on the prevalence and abuse of surveillance by institutions of power to maintain social control. However, apart from questions regarding the inequality of power in the struggle over the direction and impact of the gaze, what the Snowden leaks have highlighted is that these sousveillance technologies are themselves incorporated into an even broader form of surveillance – what Mann calls ‘gooveillance’ or ‘uberveillance’ – in which all communications, from both above and below, are collected and monitored within a state-corporate regime of governance. In such circumstances, how appropriate is the practice of sousveillance as a form of resistance?

In the months following the Snowden leaks, we have instead seen a steady rise in interest in cryptography and the use of encryption tools as a form of anti-surveillance resistance. These might be considered practices of counter-surveillance, where the aim of resistance lies in circumventing and bypassing surveillance infrastructures altogether, particularly with regard to online communication. Such forms of anti-surveillance resistance have become prominent and are increasingly part of debates on contemporary forms of control and counter-control. However, crucial questions remain regarding the remit and effectiveness of technological responses to surveillance. Even amongst cryptographers and hackers, there is an increasing awareness of the limitations of foregrounding technology in anti-surveillance resistance. Rather, such resistance needs to form part of a broader political movement that challenges the nature and extent of state and corporate powers in authoritarian and liberal democracies alike.

Much of this movement has so far concerned itself with individual rights, and in particular the right to privacy. This holds some promise for challenging surveillance powers outside the technological realm, but the problem with such discourse is that it overlooks a fundamental shift in understandings and practices of privacy – not just politically, but in social and cultural terms – which means that many people do not connect with the terms of such debate. More broadly, framing anti-surveillance resistance around the issue of individual rights does little to illuminate the ways in which surveillance architectures form part of a set of power relations that advance certain interests over others. The challenge, therefore, may lie in integrating anti-surveillance resistance into broader ecologies of political activism that seek to highlight and challenge contemporary forms of exploitation and domination, making anti-surveillance resistance part of a broader social justice agenda.
Making such connections and building such broader movements is, of course, an enormous challenge, but it may be the most appropriate and relevant way of approaching anti-surveillance resistance in a Snowden era. 
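To make the encryption practices discussed above concrete, here is a minimal sketch of end-to-end encrypted messaging, assuming the PyNaCl library (Python bindings to libsodium); it illustrates the general principle rather than any specific tool raised in the debate, and the names and message are invented:

```python
# Minimal sketch of end-to-end encrypted messaging with PyNaCl
# (Python bindings to libsodium). Illustrative only: a real tool
# would also need key verification, forward secrecy and metadata
# protection. All names and messages here are invented.
from nacl.public import PrivateKey, Box

# Each party generates a keypair; only the public halves are exchanged.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob: a network-level observer intercepting the
# ciphertext in transit sees only opaque bytes.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"Meet at the usual place.")

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"Meet at the usual place."
```

Even a sketch like this leaves the metadata – who communicates with whom, and when – fully visible to a network observer, which underlines the point that encryption alone cannot carry the weight of anti-surveillance resistance.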

Thursday, January 8, 2015

Seminar 1: Dr Dave Preskett, BioComposites Centre, Bangor University



I want to discuss the implications of the use of cloud servers for data storage and transfer (e.g. Dropbox), which we use routinely in the BioComposites Centre to share files with industrial partners in project management.  Early versions of this concept often held within their terms and conditions that data uploaded to the cloud servers became the property of the hosting companies themselves.  Apart from the importance of users of such software actually reading the terms and conditions (who ever really does!), in what manner can any assurance of confidentiality offered by cloud servers be “veillanced”?  Similarly, the University announced it had chosen to use the Microsoft Outlook cloud service for all email and calendaring – this was prior to the Snowden disclosures, but despite those disclosures the system was duly taken up.  Are there safeguards in place?
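One partial safeguard, sketched below as an illustration rather than a description of what Dropbox, Microsoft, or the University actually does, is client-side encryption: files are encrypted locally before they are synced, so the cloud host only ever stores ciphertext. This sketch assumes the Python cryptography package and its Fernet recipe; the file names are placeholders:

```python
# Sketch: encrypt a file locally before it is synced to a cloud
# service, so the host only ever stores ciphertext. Assumes the
# "cryptography" package (Fernet: symmetric authenticated encryption).
# File names are illustrative placeholders.
from cryptography.fernet import Fernet

# Generate a key and store it safely; whoever holds the key can read
# the data, while the cloud provider, holding only ciphertext, cannot.
key = Fernet.generate_key()
fernet = Fernet(key)

with open("project_report.docx", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

# This encrypted file is what gets placed in the shared cloud folder.
with open("project_report.docx.enc", "wb") as f:
    f.write(ciphertext)

# A partner holding the same key can later recover the original.
with open("project_report.docx.enc", "rb") as f:
    recovered = fernet.decrypt(f.read())
```

The catch, of course, is key management: the key must be shared with industrial partners over some channel the cloud host cannot see, which relocates the trust problem rather than eliminating it.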

Wednesday, January 7, 2015

Dr Andy McStay, Media Culture, Bangor University: Position statement for the seminar series



Clearly we’re not the first to discuss transparency, but our take on the topic stems in part from a book on privacy, philosophy and new media that I published last year with Peter Lang. This examined privacy in the most diverse manner possible, but in reading up on utilitarianism for one of the earlier chapters, the quote below from the 19th-century philosopher Jeremy Bentham jumped out at me. It comes from his 1834 essay titled Deontology:
A whole kingdom, the whole globe itself, will become a gymnasium, in which every man exercises himself before the eyes of every other man. Every gesture, every turn of limb or feature, in those whose motions have a visible impact on the general happiness, will be noticed and marked down. (Bentham, 1834: 101).
Academics will be well aware of Bentham’s outline for a panoptic prison that required far fewer staff to control inmates than had historically been the case, but this quote from Deontology seems to suggest a different understanding of visibility and watching. It is less repressive than Bentham and Foucault’s prison musings; note, too, the last sentence, with its reference to “general happiness”.
The key point is that transparency for Bentham is a positive concept to promote net societal benefit and general happiness, and not just a punitive measure.
Bentham’s approach to mutual watching and transparency as means of promoting “general happiness” should be seen in the context of openness and accountability, and of Enlightenment doctrine on making all things present so as to generate understanding and make life better. Although today we may read the quote in a very sinister fashion, Bentham’s intention is utilitarian, involving maximum benefit for the many. This led me, via Bentham, to the first two types of transparency:
(1) Liberal transparency – historically, a liberal and enlightenment norm that opens up machinations of power for public inspection, and the use of reason and knowledge as a force for promoting societal net benefit and happiness (Bentham 1834).
(2) Radical transparency – an approach that opens up both public processes and the private lives of citizens for inspection (Bentham 1834; McStay 2014).
A recent utilitarian-inspired philosopher of privacy, Richard Posner, has gone so far as to argue that breaking down privacy domains and promoting transparency may be beneficial both economically and morally.
I think the latter point, on morality, is the more interesting dimension. Whereas a positive account of autonomy and privacy involves control over how we present and stage ourselves, to whom, and in which contexts, this is the very focus of Posner’s indignation. For Posner, privacy is very much connected to the negative ideas of withholding and concealing, particularly in regard to personal uses of information. In building an account of transparency, he points out that seeking to control the flow of information is a wish to control others’ perceptions, and that this is, overall, a bad thing. To quote:
It is no answer that people have “the right to be let alone,” for few people want to be let alone. Rather, they want to manipulate the world around them by selective disclosure of facts about themselves. (1983: 234).
On the types of things we should be more upfront about, Posner’s examples include:
… full disclosure of sexuality, political affiliations, minor mental illnesses, early dealings with the law, credit scores, marital discord and nose-picking (all Posner’s examples).

If the pun can be excused, we can unpick this quote. For example:
-   On sexuality: if people are more transparent this will lead to greater tolerance
-   On mental illnesses: greater public awareness, less prejudice
-   On credit scores: more efficient lending systems
-   On nose picking: Posner’s final example is presumably a point on shame and secrecy about something we all do!
On first consideration this is highly laudable. The problem is that each requires acts of forced openness. This leaves us with the third type of transparency, or what we have termed forced transparency: obligatory openness, or transparency without consent or choice. Formalised, forced transparency is undesirable because resistance becomes tantamount to guilt, and putting transparency into practice would require the use of power, along with the stripping of choice and autonomy.
All of this seemed to the seminar series team to have fairly obvious implications for post-Snowden life, but what we liked about the original Bentham quote is that it allowed for positive conceptions of transparency, for mutual watching, for social benefits in information use, and even for enjoyment of mutual watching – as with social networking. This multidimensional character of transparency seems to fit well with the range of academic, business, artistic and technology-based delegates we have in Seminar 1, who each come with their own perspective on transparency.

Communicating the ideas to the public

One of our challenges across the six seminars is to identify core themes from DATA-PSST! that we want ordinary people to better understand - and then to communicate them in an engaging manner.

Our chosen media form is a short (approx. 10 mins) online documentary. We want it to creatively engage with our core themes, and to be so appealing that people choose to share it around. It'll be trailed by a short Vine clip, to try to drum up greater online sharing through social media.

Any ideas?
Any good examples out there?
Please add via comments...
Thanks

Sunday, January 4, 2015

Seminar 1 Position Statement from Dr Martina Feilzer, Criminology and Criminal Justice, Bangor University


New technologies, wearable gadgets among the most recent, are blurring the boundaries between private and public space, as well as between private and public data. We upload information onto social media platforms from our fitness trackers; we post pictures online including information about where we were, at what time, and with whom. Does this mean we are prepared for this data to be hoovered up and used by authorities, private businesses, or researchers? Health insurance firms in Germany recently (Nov 2014) proposed plans to reward customers who voluntarily upload data from fitness trackers. How would consenting customers feel if the data were passed on to their GPs or other interested parties, or if they were eventually penalised for bad lifestyle choices? The blurring of boundaries between private and public space and data makes it more difficult for individuals to control information about themselves. Data and knowledge about the self have been commodified, making way for new forms of self-regulation, as well as regulation by others. These developments raise questions about current conceptions of veillance and transparency.

Also of interest to me are questions of privacy and image rights, as exemplified in a recent paper by Tatiana Synodinou, Image Right and Copyright Law in Europe: Divergences and Convergences.