Review: Everyware by Adam Greenfield / by Dan Lockton

The cover of the book, in a suitably quotidian setting

This is the first book review I've done on this blog, though it won't be the last. In a sense, this is less a conventional review than an attempt to discuss some of the ideas in the book and synthesise them with points raised in the examination of architectures of control: what can we learn from the arguments outlined in the book?

Adam Greenfield's Everyware: The dawning age of ubiquitous computing looks at the possibilities, opportunities and issues posed by the embedding of networked computing power and information processing in the environment, from the clichéd 'rooms that recognise you and adapt to your preferences' to surveillance systems linking databases to track people's behaviour with unprecedented precision. The book is presented as a series of 81 theses, each a chapter in itself and each addressing a specific proposition about ubiquitous computing and how it will be used.

There's likely to be a substantial overlap between architectures of control and pervasive everyware (thanks, Andreas), and, since Greenfield is an expert in the field, it's worth looking at how he sees the control aspects of everyware panning out.

Everyware as a discriminatory architecture enabler

"Everyware can be engaged inadvertently, unknowingly, or even unwillingly"

In Thesis 16, Greenfield introduces the possibilities of pervasive systems tracking and sensing our behaviour—and basing responses on that—without our being aware of it, or against our wishes. An example he gives is a toilet which tests its users' "urine for the breakdown products of opiates and communicate[s] its findings to [their] doctor, insurers or law-enforcement personnel," without the user's express say-so.

It's not hard to see that with this level of unknowingly/unwillingly active everyware in the environment, there could be a lot of 'architectures of control' consequences. For example, systems which constrain users' behaviour based on some arbitrary profile: a vending machine may refuse to serve a high-fat snack to someone whose RFID pay-card identifies him/her as obese; or, more critically, only a censored version of the internet or a library catalogue may be available to someone whose profile identifies him/her as likely to be 'unduly' influenced by certain materials, according to some arbitrary definition. Yes, Richard Stallman's Right To Read prophecy could well come to pass through individual profiling by networked ubiquitous computing power, in an even more sinister form than he anticipated.
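Profile-triggered restriction of this sort needs remarkably little logic once identity and profile data reach the device. The sketch below is entirely hypothetical (the `Profile` class, flag names and vending rules are my own invention, not anything from the book), but it shows how small the gap is between a networked pay-card and an arbitrary gatekeeper:

```python
# Hypothetical sketch of profile-gated vending: a device refuses service
# based on flags asserted about the user by remote, networked databases.
# All names, flags and rules here are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Profile:
    card_id: str
    flags: set = field(default_factory=set)  # attributes asserted elsewhere

# Arbitrary rules, pushed to the machine by some remote authority:
# item -> profile flags that block the purchase
VENDING_RULES = {"high_fat_snack": {"obese"}}

def allowed(item: str, profile: Profile) -> bool:
    """True unless any blocking flag for this item appears in the profile."""
    return not (VENDING_RULES.get(item, set()) & profile.flags)

alice = Profile("card:1234", {"obese"})
bob = Profile("card:5678")

print(allowed("high_fat_snack", alice))  # False: silently refused
print(allowed("high_fat_snack", bob))    # True
```

Note that the refusal criteria live entirely outside the machine: change `VENDING_RULES` remotely and the same hardware discriminates on a different basis tomorrow.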

Taking the 'discriminatory architecture' possibilities further, Thesis 30, concentrating on the post-9/11 'security' culture, looks at how:

"Everyware redefines not merely computing but surveillance as well... beyond simple observation there is control... At the heart of all ambitions aimed at the curtailment of mobility is the demand that people be identifiable at all times—all else follows from that. In an everyware world, this process of identification is a much subtler and more powerful thing than we often consider it to be; when the rhythm of your footsteps or the characteristic pattern of your transactions can give you away, it's clear that we're talking about something deeper than 'your papers, please.'

Once this piece of information is in hand, it's possible to ask questions like Who is allowed here? and What is he or she allowed to do here?... consider the ease with which an individual's networked currency cards, transit passes and keys can be traced or disabled, remotely—in fact, this already happens. But there's a panoply of ubiquitous security measures both actual and potential that are subtler still: navigation systems that omit all paths through an area where a National Special Security Event is transpiring, for example... Elevators that won't accept requests for floors you're not accredited for; retail items, from liquor to ammunition to Sudafed, that won't let you purchase them... Certain options simply do not appear as available to you, like greyed-out items on a desktop menu—in fact, you won't even get that back-handed notification—you won't even know the options ever existed."

This kind of 'creeping erosion of norms' is something that's concerned me a lot on this blog, as it seems to be a feature of so many dystopian visions, both real and fictional. From the more trivial—Japanese kids growing up believing it's perfectly normal to have to buy music again every time they change their phone—to society blindly walking into 1984 due to a "generational failure of memory about individual rights" (Simon Davies, LSE), it's the "you won't even know the [options|rights|abilities|technology|information|words to express dissent] ever existed" bit that scares me the most.

Going on, Greenfield quotes MIT's Gary T. Marx's definition of an "engineered society," in which "the goal is to eliminate or limit violations by control of the physical and social environment." I'd say that, if we broaden the scope to include product design, and the implication to include manipulation of people's behaviour for commercial as well as political ends, that's pretty much the architectures of control concept as I see it.

In Thesis 42, Greenfield looks at the chain of events that might lead to an apparently innocuous use of data in one situation (e.g. the recording of ethnicity on an ID card, purely for 'statistical' purposes) escalating into a major problem further down the line, when that same ID record has become the basis of an everyware system which controls, say, access to a building. Any criteria recorded can be used as a basis for access restriction, and if 'enabled' deliberately or accidentally, it would be quite possible for certain people to be denied services or access to a building, etc, purely on an arbitrary, discriminatory criterion.

"...the result is that now the world has been provisioned with a system capable of the worst sort of discriminatory exclusion, and doing it all cold-bloodedly, at the level of its architecture... the deep design of ubiquitous systems will shape the choices available to us in day-to-day life, in ways both subtle and less so... It's easy to imagine being denied access to some accommodation, for example, because of some machine-rendered judgement as to our suitability, and... that judgement may well hinge on something we did far away in both space and time... All we'll be able to guess is that we conformed to some profile, or violated the nominal contours of some other...

The downstream consequences of even the least significant-seeming architectural decision could turn out to be considerable—and unpleasant."


Everyware as mass mind control enabler

In a—superficially—less contentious area, Thesis 34 includes the suggestion that everyware may allow more of us to relax: to enter the alpha-wave meditative state of "Tibetan monks in deep contemplation... it's easy to imagine environmental interventions, from light to sound to airflow to scent, designed to evoke the state of mindfulness, coupled to a body-monitor setting that helps you recognise when you've entered it." Creating this kind of device—whether biofeedback (closed-loop) or open-loop—has interested designers for decades (indeed, my own rather primitive student project attempt a few years ago, MindCentre, featured light, sound and scent in an open loop), but when coupled to the pervasive bio-monitoring of whole populations using everyware, some other possibilities surely present themselves.

Is it ridiculous to suggest that a population whose stress levels (and other biological indicators) are being constantly, automatically monitored could equally well be calmed, 'reassured', subdued and controlled by everyware embedded in the environment for this purpose? One only has to look at the work of Hendricus Loos to see that the control technology exists, or is at least being developed (outside of the military); how long before it's networked to pervasive monitoring, even if, initially, only of prisoners? See also this article by Francesca Cedor.

Everyware as 'artefacts with politics'

On a more general 'Do artefacts have politics?'/'Is design political?' point, Greenfield observes that certain technologies have "inherent potentials, gradients of connection" which predispose them to be deployed and used in particular ways (Thesis 27), i.e. technodeterminism. That sounds pretty vague, but it's — to some extent — applying Marshall McLuhan's "the medium is the message" concept to technology. Greenfield makes an interesting point:

"It wouldn't have taken a surplus of imagination, even ahead of the fact, to discern the original Napster in Paul Baran's first paper on packet-switched networks, the Manhattan skyline in the Otis safety elevator patent, or the suburb and the strip mall latent in the heart of the internal combustion engine."

That's an especially clear way of looking at 'intentions' in design: to what extent are the future uses of a piece of technology, and the way it will affect society, embedded in the design, capabilities and interaction architecture? And to what extent are the designers aware of the power they control? In Thesis 42, Greenfield says, "whether consciously or not, values are encoded into a technology, in preference to others that might have been, and then enacted whenever the technology is employed".

Lawrence Lessig has made the point that the decentralised architecture of the internet — as originally, deliberately planned — is a major factor in its enormous diversity and rapid success; but what about in other fields? It's clear that Richard Stallman's development of the GPL (and Lessig's own Creative Commons licences) shows a rigorous design intent to shape how they are applied and what can be done with the material they cover. But does it happen with other endeavours? Surely every RFID developer is aware of the possibilities of using the technology for tracking and control of people, even if he/she is 'only' working on tracking parcels? As Greenfield puts it, "RFID 'wants' to be everywhere and part of everything." He goes on to note that the 128-bit nature of the forthcoming IPv6 addressing standard — giving 2^128 possible addresses — pretty clearly demonstrates an intention to "transform everything in the world, even every part of every thing, into a node."

Nevertheless, in many cases, designed systems will be put to uses that their originators really did not intend. As Greenfield comments in Thesis 41:

"...connect... two discrete databases, design software that draws inferences from the appearance of certain patterns of fact—as our relational technology certainly allows us to do—and we have a situation where you can be identified by name and likely political sympathy as you walk through a space provisioned with the necessary sensors.

Did anyone intend this? Of course not—at least, we can assume that the original designers of each separate system did not. But when... sensors and databases are networked and interoperable... it is a straightforward matter to combine them to produce effects unforeseen by their creators."
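That "straightforward matter" really is straightforward: the join takes a few lines. The datasets, field names and the shared `card_id` key below are my own invention, purely to illustrate the kind of combination Greenfield describes; neither table alone identifies anyone's sympathies and whereabouts, but together they do:

```python
# Hypothetical sketch: two separately innocuous databases, joined on a
# shared identifier, yield name + inferred sympathy + location.
# All data here is invented.

sensor_log = [  # sightings of a payment card in a sensor-equipped space
    {"card_id": "C-101", "location": "plaza", "time": "14:02"},
    {"card_id": "C-202", "location": "plaza", "time": "14:05"},
]

subscriptions = [  # a publisher's subscriber records
    {"card_id": "C-101", "name": "J. Doe", "title": "The Partisan Review"},
]

# The join neither system's designers intended:
by_card = {s["card_id"]: s for s in subscriptions}
matches = [
    (by_card[x["card_id"]]["name"], by_card[x["card_id"]]["title"],
     x["location"], x["time"])
    for x in sensor_log if x["card_id"] in by_card
]

for name, title, location, time in matches:
    print(f"{name} (reads {title}) was at {location} at {time}")
```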

In Thesis 23, the related idea of 'embedded assumptions' in designed everyware products and systems is explored, with the example of a Japanese project to aid learning of the language, including alerting participants to "which of the many levels of politeness is appropriate in a given context," based on the system knowing every participant's social status and "assign[ing] a rank to every person in the room... this ordering is a function of a student's age, position, and affiliations." Greenfield notes that, while this is entirely appropriate for the context in which the teaching system is used:

"It is nevertheless disconcerting to think how easily such discriminations can be hard-coded into something seemingly neutral and unimpeachable and to consider the force they have when uttered by such a source...

Everyware [like almost all design, I would suggest (DL)]... will invariably reflect the assumptions its designers bring to it... those assumptions will result in orderings—and those orderings will be manifested pervasively, in everything from whose preferences take precedence while using a home-entertainment system to which of the injured supplicants clamouring for the attention of the ER staff gets cared for first."
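The orderings in that passage need no elaborate machinery: a single sort key is enough to hard-code a designer's assumptions into an apparently neutral queue. The fields and weighting below are my own hypothetical example (not anything proposed in the book), picking up Greenfield's ER image:

```python
# Hypothetical sketch: a triage queue whose 'neutral' ordering quietly
# encodes its designer's value judgements. All fields are invented.

patients = [
    {"name": "A", "severity": 2, "insured": False, "arrival": 1},
    {"name": "B", "severity": 2, "insured": True,  "arrival": 3},
    {"name": "C", "severity": 3, "insured": False, "arrival": 2},
]

# The assumptions live in this one line: severity first, then
# (disconcertingly) insurance status, then order of arrival.
queue = sorted(patients,
               key=lambda p: (-p["severity"], not p["insured"], p["arrival"]))

print([p["name"] for p in queue])  # ['C', 'B', 'A']
```

To every patient in the queue the ordering looks like plain medical priority; only the one line of code reveals that the insured patient B is seen before the uninsured A.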

Thesis 69 states that:

"It is ethically incumbent on the designers of ubiquitous systems and environments to afford the human user some protection"

and I think I very much agree with that. From my perspective as a designer, I would want to see that ethos promoted in universities and design schools: real, active, user-centred, thoughtful design rather than the vague, posturing rhetoric which so often surrounds and obscures the subject. Indeed, I would broaden the edict further to include affording the human user some control, as well as merely protection—in all design—but that's a subject for another day (I have quite a lot to say on this issue, as you might expect!). Greenfield touches on this in Thesis 76, where he states that "ubiquitous systems must not introduce undue complications into ordinary operations", but I feel the principle really needs to be stronger than that. Thesis 77 proposes that "ubiquitous systems must offer users the ability to opt out, always and at any point," but I fear that will translate into reality as 'optional' in the same way that the UK's proposed ID cards will be optional: if you don't have one, you'll be denied access to pretty much everything. And you can bet you'll be watched like a hawk.

Everyware: transparent or not?

Greenfield returns a number of times to the question of whether everyware should be presented to us as 'seamless', with the relations between different systems not made openly clear, or 'seamful', where we understand and are informed about how systems will interact and pass data before we become involved with them. From an 'architectures of control' point of view, the most relevant point here is made in Theses 39 and 40:

"...the problem posed by the obscure interconnection of apparently discrete systems... the decision made to shield the user from the system's workings also conceals who is at risk and who stands to benefit in a given transaction...

"MasterCard, for example, clearly hopes that people will lose track of what is signified by the tap of a PayPass card—that the action will become automatic and thus fade from perception."

This is a very important issue, and it seems especially pertinent to much in 'trusted' computing, where the user may well be entirely oblivious to what information is being collected about him or her, and to whom it is being transmitted, and, due to encryption, unable to access it even if the desire to investigate were there. Ross Anderson has explored this in great depth.

Thesis 74 proposes that "ubiquitous systems must contain provisions for immediate and transparent querying of their ownership, use and capabilities", a succinct principle I very much hope will be followed, though I have a lot of doubt.

Fightback devices

In Thesis 78, Greenfield mentions the Georgia Tech CCD-light-flooding system to prevent unauthorised photography as a fightback device challenging everyware, i.e. one that will allow people to stop themselves being photographed or filmed without their permission.

I feel that interpretation is somewhat naïve. I very much doubt that a) offering the device as a privacy protector for the public is in any way a real intention on Georgia Tech's part, or that b) members of the public who used such a device to evade being filmed and photographed would be tolerated for long. Already in the UK we have shopping centres where hooded tops are banned so that every shopper's face can be clearly recorded on CCTV; I hardly think I'd be allowed to get away with shining a laser into the cameras!

Although Greenfield notes that the Georgia Tech device does seem "to be oriented less toward the individual's right to privacy than towards the needs of institutions attempting to secure themselves against digital observation," he uses the examples of Honda testing a new car in secret (time for Hans Lehmann to dig out that old telephoto SLR!) and the Transportation Security Administration keeping details of airport security arrangements secret.
The more recent press reports about the Georgia Tech device make it pretty clear that the real intention (presumably the most lucrative) is to use it arbitrarily to stop members of the public photographing and filming things, rather than the other way round. If used at all, it'll be to stop people filming in cinemas, taking pictures of their kids with Santa at the mall (they'll have to buy an 'official' photo instead), taking photos at sports events (again, that official photo), taking photos of landmarks (you'll have to buy a postcard), and so on.

It's not a fightback device: it's a grotesque addition to the rent-seekers' armoury.

RFID-destroyers (such as this highly impressive project), which Greenfield also mentions, certainly are fightback devices, though, and, as he notes in Thesis 79, an arms race may well develop, which will ultimately only serve to enshrine the mindset of control further in the technology, with less chance for us to disentangle the ethics from the technical measures.

Conclusion

Overall, this is a most impressive book which leads the reader, in a very logical style, through the implications of ubiquitous computing and the issues surrounding its development and deployment (the 'series of theses' method helps here: each point is carefully developed from the last, and there's very little need to flick between sections to cross-reference ideas). The book's structure has itself been designed, which is pleasing. Everyware has provided a lot of food for thought from my point of view, and I'd recommend it to anyone with an interest in technology and the future of our society. Everyware, in some form, is inevitable, and it's essential that designers, technologists and policy-makers educate themselves about the issues right now.
Greenfield's book is an excellent primer on the subject which ought to be on every designer's bookshelf.

Finally, I thought it appropriate to dig up that Gilles Deleuze quote again, since it really does seem a prescient description of the possibility of a more 'negative' form of everyware:

“The progressive and dispersed installation of a new system of domination.”