Monday, December 22, 2014

Seminar 1 - Position statement from Dr Lynne Williams, Adult Nursing, Bangor University

In healthcare, as in society, surveillance is likely to be perceived negatively. Professional and managerial interpretations of control can be contrasting, and direct monitoring of staff performance in healthcare is often difficult because of the nature of organisational structures and professional autonomy. Performance monitoring is nonetheless mandated through governance arrangements and by an increased propensity for litigation around harm and poor practice (Timmons, 2003). This approach to surveillance naturally divides employees into “good” and “bad”, which in turn shapes behavioural norms (Sewell & Barker, 2006). Caring surveillance is “policing the contractual arrangement between principal and agent to minimize opportunistic behaviour” (Sewell et al, 2012: 191), and is more acceptable to employees and employers as it implies surveillance is undertaken for the greater good (Sewell et al, 2012). Coercive surveillance, on the other hand, is performance measurement “as a case of the few watching the many in the interests of the few” (Sewell et al, 2012: 191), and can lead to resentment among individuals. I ask: can organisational surveillance be simultaneously “caring and coercive”?

Seminar 1 - Position statement from Dr Yvonne McDermott, Law, Bangor University

Sousveillance, or the recording of an activity from the perspective of participants in that activity, clearly brings positive outcomes insofar as it challenges the one-sided nature of surveillance. As sousveillant technologies become more ubiquitous, we are likely to see increasing legal developments to meet the challenges and opportunities that they bring. I propose that there are three main areas of interplay between law and sousveillance, which will be discussed in detail. First, the law relating to privacy or ‘misuse of private information’ might form the basis of challenges to sousveillance. The courts (and in the first instance, the sousveiller) will have to ask: is this information to which a ‘reasonable expectation of privacy’ will attach, and if so, is that outweighed by the sousveiller’s right to freedom of expression? Second, the use of wearable technologies in, for example, cinemas, might give rise to challenges under copyright law. Third, there is a risk that suspicion surrounding sousveillant technologies will give rise to attempts to restrict them, in what Mann dubs ‘McVeillance’, or ‘surveillance, combined with a prohibition on sousveillance’. It is posited that human rights law, and the right to freedom of expression in particular, can be used to shield sousveillance from unjust interference.

Seminar 1 - Position statement from Ronan Devlin (Designer-in-Residence, Bangor University’s Innovation Quarter), Jamie Woodruff (Ethical Hacker), and Dr Gillian Jein (Bangor University)


Veillance

I am working on a project concept, developed in collaboration with academics Andy McStay, Vian Bakir and Gillian Jein, which is to be implemented with ethical hacker Jamie Woodruff and the programming team FELD.

The initial concept, ‘Sousveillance - watching the watchers’, named after a discussion with Vian about the work of Steve Mann, was shortlisted in an international digital art open call by Arts Council England and the BBC’s online art initiative ‘The Space’. The artwork consists of a web application and will employ ethical hacking processes to tap into users’ Facebook, Google and other data streams, re-appropriating their information and extracting moments where the invisible streams of data surveillance technologies intersect with their everyday practices. These largely opaque and invisible territories will then be rendered visible through the creative process: specifically, through their re-appropriation and collation into a kind of visual diary, assembled as a concrete poem in flux, or a typographic mapping of the visible and invisible territories that increasingly constitute the spaces of our everyday lives.

Although working along similar lines, the concept has since developed into a project which addresses data collectivism (big data) as well as surveillance. It is intended to be not just a representation of how we are watched and manipulated via our digital devices, but also a forum for individuals to enter into dialogue with, re-appropriate, and themselves manipulate information formerly employed to assist dominant modes of shaping the public sphere.

Over Christmas I’ve been reading ‘You Are Not a Gadget’ and ‘Who Owns the Future?’ by the American writer and computer scientist Jaron Lanier. Lanier blames utopian naivety and miscalculation in the way the internet was designed for many of today’s societal imbalances. One such mistake was ‘missing’ the fact that the way information has become freely available gives those with the largest computational power a huge advantage over others; we have already seen governments and corporations capitalise on this through surveillance and ‘smart advertising’ manipulation.

One result is that the online economy is now largely an advertising space in which facilitators such as Google and Instagram get rich while the producers of their content - namely us, on whom their billion-dollar valuations are based - go unpaid for the data we supply. Parallel ‘real world’ problems can be illustrated by Walmart, which in the late 80s used statistical analysis of data to predict, within fractions of a penny, its supply chain’s exact bottom line, bargain down retail prices, and so corner the market. And, of course, as we know, what may seem like insignificant data streams intersect with wider rivers of the poverty line and the minimum living wage, resulting, in Walmart’s case, in the reduction of its own customer base’s spending power, as many industries which service Walmart have seen reduced wages and loss of employment.

The problem seems to be that the public at large are viewed as consumers when in fact many of the population are actually producers of content and products. And, not to be depressing, but real-world issues - such as mass unemployment, decreasing wages and lack of job, pension and, by implication, life security - seem increasingly hard to address given the widespread planned automation of agriculture, building, manufacturing and transportation. However, one solution Lanier suggests is to redress the currently expanding rich/poor economic imbalance by radically changing the current online model of producers and consumers, inverting the roles completely. In such a scenario, consumers become creators whose information is no longer free: they are considered producers who get paid for their data or ‘product’ (although we might prefer the word ‘work’ in this instance), and this work is the ‘stuff’ that is then consumed by corporations and governments. Creative activities such as writing, programming, music, art and design can all be considered within this bracket of work. As we all know, creative activity demands time and thought; if we feel pressure to perform creatively outside of ‘paid labour’, then by putting a value on these activities, in the radical sense of altering the philosophical frameworks of online activity, we may go some way towards contracting the wealth gap and achieving greater work-life balance. By the same token, regarding personal data, governments, as consumers of our information, would pay people for its use. Could this be a means to encourage more debate about transparency? Several issues certainly arise here: on the one hand, having to pay for information would discourage governments from maintaining such pervasive surveillance networks.
On the other hand, privatizing the public sphere (even if it is in the hands of the citizen) raises questions as to the validity of our current representative democratic model by drawing democracy down to the level of the individual (i.e. the origins of the Greek democratic organism). One controversial question would be to what extent we trust each other with our information. Would putting a price on our information discourage us from sharing it with less official networks, like our friends, family and colleagues? This is just to say that these are the questions that have informed our thinking about this project thus far.

To return to the intended artwork, Veillance (from Surveillance, from the Latin ‘Vigil’, watchful of and for others), will be an experimental forum through which we can observe how audiences respond, via a palette of editing tools, to the invitation to engage with and re-appropriate their data. The resulting collective tapestry will be an experiential map, illustrative of an alternative visible and invisible public space. Here our daily activities, in conversation with internet technologies, are viewed as acts of production, with creative and knowledge exchange occurring at the intersection of various and varied individual maps.

This work is not an end in itself; it is intended to grow through audience interaction, and our team’s observation of how this manifests will form part of the research and development of a larger academic-led project addressing data mining and surveillance by governments and corporations.

Note: Intellectual property law as a model for personal data use/ownership

Seminar 1 - Position statement from Planet Labs, San Francisco

Planet Labs aims to take a complete picture of the planet every day, with a constellation of over 100 small satellites. The Earth will be imaged at a resolution of 3-5 meters per pixel, allowing objects like cars, roads, trees and houses to be resolved. Planet Labs’ vision is to "democratize" access to imagery of the Earth, allowing all individuals, companies and organizations equal access to monitoring data about the planet.

We believe that this unique data set will transform humanity's understanding of the planet, and create considerable public and commercial value - from monitoring deforestation and polar ice cover, to precision agriculture, mining and pipeline monitoring.
Here's a short TED talk that will help explain Planet's approach.

Seminar 1 - Position statement from Dr. Justin Schlosberg, Journalism & Media, Birkbeck, University of London

In new media, popular and political discourse, transparency is often contextualised as an accountability end in and of itself. According to this logic, power can effectively be contained, monitored and held to account simply by pouring light into the dark corners of its machinations. I have three broad and related concerns with this line of thinking. First, transparency is worth little - in an accountability sense - without the force of publicity. A key question then turns on what kind of information surfaces in the public consciousness, and what role mainstream media and digital intermediaries play in determining that outcome. My second concern is that meaningful accountability is contingent on resolutions. How, then, do we ensure that the visibility of power is not reducible to mere spectacle, and that disclosure triggers the kind of social and political action that leads ultimately to meaningful sanction and reform? My third concern relates to the problem of co-option. The pervasive rhetoric of transparency and the 'end of secrecy' polemic have arguably been exploited by policymakers and the security state to distract from what in reality amounts to an expanding secrecy regime and a growth in 'sofa style' politics. Are we entering a new phase of history in which the processes of decision-making by the powerful are increasingly unrecorded?