Dust or Magic – Creative work in the digital age; Bob Hughes

Preface to the 2007 edition:
No more books about creativity, thanks; just a world where it is possible.

The lesson that emerges, far too cautiously, from Dust or Magic is that we are in the middle of an undeclared war over what Norbert Wiener called “The human use of human beings”.

Since the book came out in late 1999 I’ve been able to meet and discuss working life with far more people, far more candidly, than I ever could before, and there is no question about it: this is war! On the one hand, we have the people who do the work. We have our own, personal experiences and beliefs about how we work best, which are all amazingly similar and are now confirmed by mountains of the most conclusive research imaginable. Our experiences, and the research, show unequivocally that we do our best work when we’re not feeling threatened or harassed, and are in control of our jobs and lives.

On the other hand, we have the system: capitalism and its latest human manifestation, managerialism. Workers of all kinds are being subjected to more and more control and monitoring, exposed more and more to arbitrary inspection and challenge, and required more and more to account not only for their actions but also for their proposed actions and even for their feelings about their jobs: employers increasingly require you to hold sincere, positive feelings toward them and their customers.

This applies to all workers – that is, to people who actually do something for their living – almost irrespective of salary (although the poorest-paid are in general the most monitored). Increasingly, computers are the means whereby the regime is enforced. This system, whose watchword is “excellence”, seems to have been designed to make good work almost (but not quite) impossible.

The system, it should be said, needs good stuff and does not actually object to it per se. What it really objects to is the situations in which good stuff is produced: situations where people feel secure, valued, and in control. These are precisely the situations where the system’s writ does not run, and they are anathema to it. So the good stuff people do is overwhelmingly done despite the system, in the teeth of its opposition, and in overlooked nooks and crannies of human decency and respect. For anyone who wants to spend their life doing good stuff, this is an important thing to grasp – because until you grasp it you are apt to blame yourself unnecessarily for your failures. And the system will encourage you to do this. It tells you that “you can make it if you try” and you tend to believe it, even when common sense tells you it is a barefaced lie: we cannot all play for Manchester United, or be President of the United States, or beautiful, simply by “trying”. So if you want to do good stuff, the first thing you must do is distinguish situations that permit it from ones that don’t, and try to find ways of turning the ones that don’t into ones that do. You can’t do this without action that’s to some extent political, and based on solidarity: things that the system abhors, and which it has developed a rich array of techniques for discouraging.

The most effective of these techniques is a special kind of Division of Labour, whereby the most dangerous members of the workforce are separated from their fellows by special favours and titles. Thus you, who read this, are perhaps called “creative” by your bosses, and allowed to wear your own clothes to work. Your work will be done in a different and curiously hallowed part of the building, may be slightly less intensely monitored than other people’s, and there will be strong hints that you may at some point enjoy hugely better rewards than the other workers do. You may even have a bar-football machine in the office (but also a strange inhibition about actually using it). It will also be assumed that, because you are so very different from other workers, you scorn the idea of joining a trade union, and are entirely happy to work whole weekends of unpaid overtime, and even brag about it. “Creativity” has become the new Methodism – an ideology that perverts workers into embracing their own humiliation.

 

Computers are all about control – but of what, by whom? They are routinely described as “revolutionary” and their literature is full of talk about “human values” and “human-centered design”. Computers are certainly permeating and affecting human life all over the planet. But what is the “computer-revolutionary agenda” and who, if anyone, is writing it? What great visions do we have about life after a computer revolution?

Ben Shneiderman is justly considered one of the great names in human-computer interface design, but his recent book, “Leonardo’s Laptop”[1], worries me. He starts off by declaring that “The time is right for the high tech world to attend more closely to the needs of humanity.” Absolutely! But he then unfolds a vision of computer use that sounds to me very much like old-fashioned Victorian colonialism, with knobs on. His future-world seems to be a sort of sweetshop staffed by obliging, good-humoured foreigners. Has he got “humanity” confused with “customer”? When you travel, he predicts, “humanity” will be able to choose “local guides whom you hire for their colorful personality or botanical knowledge”. “Humanity” will be supported by kindly natives and applauded by his ever-adoring relatives every step of the way:

Imagine that after a sunrise climb, you reach the summit. You open up your phonecam and send a panoramic view to your grandparents, parents, and friends. They hear the sound of birds, smell the mountain air, feel the wind's coolness, and experience your feeling of success. They can hear each other cheering and point at the birds or click on other peaks to find out more. They remember how, on your last climb, a rockslide brought you unconscious to an emergency room. On that occasion, fortunately, your "World Wide Med" records guided the physician to care for you. She was able to review your medical history, with annotations in her local language helping her to prescribe the right treatment. Today's climb has a happier outcome, which restores everyone's confidence.

I get the feeling this utopia is not for everyone, but rather for distinguished American university professors and a few other high-earners. Shneiderman does not seem aware that a huge proportion[2] of the world’s population has never even made a phone call, let alone used a computer, or that the technology he describes depends on poverty wages and environmental squalor he’d never tolerate himself[3]. We are not told whether the foreign, female physician who sorted him out gets to use this stuff as well, in her “local language” or otherwise, but my guess is perhaps not. It’s a consumer revolution, and the people behind the counter are not in the equation. Perhaps they love their work, perhaps not. They are simply “there”, like Mount Everest. What’s important is the shopping experience.

Michael Dertouzos is another writer mining the same vein. Dertouzos is the hugely respected ex-head of MIT’s Laboratory for Computer Science, so what he says is taken very seriously. He even attends (he tells us) meetings of the World Economic Forum in Davos, where the world’s most powerful businesspeople and politicians decide policy for the rest of us.

In his 2001 book “The Unfinished Revolution”[4], Dertouzos does acknowledge that poverty exists. He even acknowledges that it exists in the USA:

In the US economy, an average of $3,000 in hardware, software, and related services is spent each year per citizen. In Bangladesh it's $1, according to that country's embassy. I suspect that if I could find an "embassy" representing poor Americans, or the poor of any industrial nation, I would get an equally screeching dissonance between information technology expenditures in the ghetto and in the suburbs.

But then he falls for, and recycles, some startling perceptions of the world and its people. There is an assumption, which one hoped had died with the British Raj, that the world’s poor are somehow happy, even fortunate, in their poverty – and also pitifully helpless. He suggests that they do not know how to feed themselves or take care of their own health, and need “our” help – which we supply by kindly allowing them to work for us and sell us things, cheaply.

He envisages globalised internet counselling services, where the supposedly bone-idle, poor-but-happy women of “the East” provide cut-price solace for the tormented rich women (hang on! why only women?) of “the West”:

Older, experienced Indian women could spend a lot of time over the Net chatting with Western divorcées, who could benefit from their advice at costs substantially below the psychologist's counseling fee. The lack of time that characterizes Westerners would be counterbalanced by the plentiful time of people in India.

No suggestion, interestingly, of using poor American women to perform this service. Why not? They're in the right time zone; they even speak the same languages as the divorcées. And heaven knows they need the money! Why couldn't his own country’s famous “trailer-park trash” do this? Dertouzos does not say.

A few pages later he has Chinese companies selling translation services to "Western companies anxious to do business in China." Everybody would benefit. The little Chinese companies get fat on translation fees, and the Western company gets the Chinese market. Sounds fair?

No careers are risked by proposing revolutions like these. They rock no boats and threaten no vested interests. On the contrary, their universities’ corporate sponsors are surely very happy to have these scenarios presented as “the future” (and gratis, by big-name publishers), because it is a future they’d love to cater for; in fact, a future they absolutely need! It involves plenty of bandwidth, plenty of gadgets, plenty for the investors to get excited about, which means plenty of profits.

It is not all acquiescence, though: they can talk tough. Dertouzos chastises “us” for putting up with inferior gadgets, and Shneiderman chastises the self-indulgent developers for “producing tools for themselves, giving little thought to the needs of other users.” This is a common refrain in this genre of writing. It’s all the fault of the techies! The people who do the work have failed us again! Innumerable “usability” careers have waxed fat on this kind of finger-wagging. It’s what the bosses like to hear! No mention of the people who hire the techies, or fire them, set the deadlines, move the goalposts and tell them to get on with it. It’s as if the techies were an autonomous tribe that had somehow swept in from the Steppe and seized power. If only!

 

The world of computer work is often described as a community. The sense of community is certainly one of its most significant and wonderful features. In fact, it’s unlikely we would have the computer at all were it not for its deep, communitarian and even socialist roots. But it is a tragically naive, unorganised community. People still share their ideas and their knowledge – but they have less and less control over the work they do, let alone over its purpose. We are hopeless at defending each other and quite powerless when it comes to securing that impolite-to-mention prerequisite of creative work: enough money to live on.

A great deal is made at times, in the media, of the high salaries commanded by programmers and web designers who have currently-fashionable skills. Absolutely no fuss at all is made about the collapse in those people’s earnings once their skills cease to be fashionable, or their entire industry goes out of fashion. Or about the huge amounts of time people have to spend re-learning their own jobs (or “updating their skills”) – almost always at their own expense, and at their own risk (the new, hard-earned skills may not turn out to be the next big thing after all). Or about the now-endemic practice of getting talented young people to work for nothing, in the hope of a job at the end (it’s called “internship” – which sounds a bit more alluring than “unpaid skivvy”).

IT workers are encouraged to view themselves as an elite. Indeed they are not among the poorest-paid. But earnings are not as high as rumour suggests, and the lifetime-earnings profile for IT workers is now almost exactly that of unskilled manual workers in non-unionised Victorian industries. Your earnings may rise briskly during your twenties, but peak in your thirties, and then decline inexorably. The British trade union-based researcher Mike Cooley predicted this in the 1970s and 1980s (it was already happening then among industrial designers through the introduction of CAD systems)[5]. In the USA, Chris Benner has shown that Cooley’s prediction has come true with a vengeance in the very heartland of the "new economy", Silicon Valley[6].

Benner’s study shows that workers can and must organise. Indeed, workers are already organising – but mainly, so far, through self-help and knowledge-sharing groups, often mediated exclusively through email listservers. These groups are good for helping you upgrade your skills but have little or no bargaining power. Indeed, they are often sponsored by the very organisations against whom workers need protection: the employers. Many new media workers recoil from the idea of taking on the employers. It seems impolite, not to mention risky. But there is nothing wrong with being clearheaded about the employer-worker relationship. Benner’s case studies suggest very strongly that members of groups that deal with these issues candidly and up front enjoy better career prospects. The National Writers’ Union Tech-Writers’ Trade Group and the Graphic Artists’ Guild have had no qualms about using “outmoded” trade-union tactics, and have won solid benefits for their members. But these are exceptions.

New media work is insecure work – like so much of the work in the modern economy, including the low-paid and menial work done by immigrants, women and the poor. Benner’s study focuses on the growing role of “labor market intermediaries” in the new economy. Computer contract workers the world over are now familiar with the names of Manpower, Randstad, Adecco, Kelly and others like them. Notwithstanding the first-name chumminess that characterizes our dealings with them, these companies are there to “commodify” you in a way not even Karl Marx anticipated. You become a mere, machine-readable list of “skills”. From the 1970s, these erstwhile office temp agencies found a lucrative niche providing staff for IT companies that did not want the liability of a permanent relationship with their employees. Now they are doing the same thing, on an even bigger scale, in the rest of the economy (Manpower is at present the world’s biggest employer!), providing low-paid, easy-to-sack contract workers to the construction, hotel, catering, agriculture and almost any other industry where responsibilities to workers might imperil corporate survival or profits.

Flexibilisation is supposedly a response to the “fast pace of progress”. We have come to accept as normal the fact that computer skills are ever-changing, and that computers themselves have a shelf-life approaching that of a supermarket banana, so that we have to buy our machines and software over and over again, as well as relearning our jobs over and over again. But much of this progress is bogus.

Most innovation in industry is now, in fact, sales effort. The paramount impetus for development is not human needs, but the computer companies’ need to stay in business, and to keep growing. If they flag, the stock market will swallow them in one gulp. So there must always be an exciting “next-generation” product in the pipeline. “Research and Development” is now almost universally subordinate to Marketing. Indeed, that has been the accepted wisdom in the USA at least since the 1960s. In the 1970s, starting in the US, university research was also drawn into the sales effort. Researchers are obliged to pursue goals that industry desires, within the time-frames industry needs. We are led to assume that this has, at least, given us a wealth of technology. But has it? No, it has not! The diversity of computers has diminished catastrophically since the market took an interest in them. And these computers are not even cheap: to buy one is to enter into a commitment to buy it all over again in three years’ time, or risk losing all your work and perhaps your livelihood. And that’s to say nothing of their cost to the people who make them: our computers would not be as affordable as they are without the labour of super-exploited, low-waged, non-unionised, female Asian workers, and the uncosted pollution of their living environments.

 

The system we work under is best explained, I think, not as a system for creating the things people need, but as a system of control, which it achieves by creating and maintaining scarcity. It is not simply that capitalism is careless or wasteful – its wastefulness has a patterned and deliberate quality. Surpluses (which corporations are forced to increase year on year, or fall prey to takeovers) cannot simply be returned to the people from whom they were extracted. They must be got rid of somehow, or invested (generating even more surplus)[7]. Boom-bust cycles and wars are the traditional surplus-disposal mechanisms. Recent bloodlettings like the CD-ROM slump of 1996 and the very much bigger dot-com bust of 2000-2001 have probably helped rather than hindered the system. But faced with the overwhelming abundance made possible by newer technologies, another major mechanism has had to be developed and deployed: fashion.

The term “planned obsolescence” entered the general vocabulary in 1960, via the investigative journalist Vance Packard’s exposé of the US auto industry, “The Waste Makers”. He showed that fashion, fuelled by marketing and advertising, was now the driving force behind “innovation” in the auto industry. Indeed, technical innovation was being sidelined and sometimes actively stifled so that maximum revenue could be extracted from existing production lines. Planned obsolescence was first deployed by General Motors as a solution to the problem of “market saturation” (that is, customer satisfaction). It wasn’t just a matter of making cars that fell to bits. Alfred P. Sloan (GM’s boss) recognized and ruthlessly exploited the power of the media to adjust people’s needs and desires, so that they would abandon the perfectly functional devices they had been sold last year for this year’s new model. Public outrage about planned obsolescence was a nine-day wonder. It begat some fairly insipid consumer-protection legislation, but planned obsolescence went from strength to strength and, with the arrival of microprocessors, became an accepted feature of almost everything in modern life. Most obvious is the extremely short shelf-life of the computers we use. A situation in which everybody had a satisfactory computer that would serve their main needs for the foreseeable future would be their manufacturers’ ultimate nightmare.

And computers are only the visible fraction of consumer electronics: entire industries, never mind individual model lines, are now made obsolete on a regular basis – consider the fate of film-based photography, kitchen appliances, video, hi-fi.

Computers have made “obsoleting” a precise art in all industries. For example, the Caterpillar company recently ran an ad featuring a bulldozer that had been discovered on a farm in the Ukraine, having been buried by its owners during the Fascist invasion of 1942. After being cleaned up, it started, presumably good for another 60 years: the definitive example of Caterpillar workmanship. The irony was that anyone who bought a new Caterpillar bulldozer on the strength of that advertisement would be lucky to get even 10 years’ use out of it. Thanks to its computers, Caterpillar can now calculate the effects of stresses, vibration and corrosion on every part of the vehicle, and tune “mean time between failures” to tolerances that were inconceivable in 1942. The vehicle has a design-life, and nothing is used in its manufacture beyond what is needed to reach that design-life. After that, failures can almost be guaranteed to come thick and fast. And electronic components ensure that there is less and less that can be repaired in any kind of vehicle or consumer device.

 

My point is that computer work is not an isolated activity. It is connected to every other activity under the sun – and it has genuinely revolutionary potential. It can be used to leach even more profit from the very poorest. But it can also empower the poorest. Its assimilation into so many aspects of modern life means that one has to be very dull indeed not to notice the connections. This makes computer workers potentially very powerful people – and a definite source of unease to those whose job is control.

For many people, the most politically important aspect of computers is and always has been the empowerment, and the sense of power, one can experience, or lose, through using them. Empowerment is not something that an authoritarian system is easy with: powerlessness and helplessness are what it prefers. And people who can do stuff for themselves are less biddable and more questioning than people who can’t.

This is why people like Douglas Engelbart, whose aim has always been the augmentation of human abilities, have found capital such a wary customer and such a treacherous ally; and why the possibility of controlling and truly owning our computers by programming them for ourselves gets more and more remote; and why democratic public debate about what we would like from our computers is not even on the agenda (despite the fact that at least one fundamental concept, object-oriented programming, came directly from such a debate – in Norway in the 1960s).

Computers are inherently and inescapably political because they are about power. A great deal of energy goes into denying this fact, but the political tradition is a rich one – from the Participatory Design movement, with its roots in trade-unionism in Scandinavia and England in the 1970s, to the free and open-source software movements, and the irrepressible, ubiquitous anarchistic hacker subculture.

Since 1999, some sections of computerdom have once again become overtly and actively political. I am thinking particularly of the Independent Media Centre movement (Indymedia), which began its life during the demonstrations against the World Trade Organisation in Seattle, in November 1999, and has exploded into a huge, self-organised, world-wide phenomenon. It uses low-cost and, where possible, open-source computer technology to challenge the stranglehold of the mainstream media on the news, and to give a voice and a common focus to the disparate groups who oppose the system – on issues such as GM crops, sweatshops, migrants’ rights and, of course, war. Indymedia and the numerous other new activist networks played a major part in mobilizing, co-ordinating and sustaining world-wide popular opposition to the US and British governments’ invasions of Afghanistan and Iraq, in 2001 and 2003. In London in February 2003, two million people marched against the war, and came within a whisker of bringing down the pro-war Blair government. There had never been a demonstration on that scale before, and it could be that this generation of internet-mediated protests has already had a lasting effect.

We need systems that empower the people who have no power – but to do this work properly there has to be a change. We need to start articulating, demanding and fighting for a vision of a world we would not be ashamed to live and work in.

 


[1] Shneiderman, B. (2002). Leonardo's Laptop - Human Needs and the New Computing Technologies. Cambridge, Massachusetts, The MIT Press.

[2] The generally-given figure is half. This, apparently, originates in an Economist article of 1995 and is probably still fairly accurate. See Google Answers: “What percentage of the world has made a phone call?”

[3] For an introduction to the full costs of current computer technology see Mazurek, J. (1999). Making Microchips - Policy, Globalization, and Economic Restructuring in the Semiconductor Industry. Cambridge, Massachusetts, The MIT Press.

[4] Dertouzos, M. (2001). The Unfinished Revolution - Human-Centered Computers and What They Can Do For Us. New York, HarperCollins.

[5] Cooley, M. (1987). Architect or Bee? London, The Hogarth Press.

[6] Benner, C. (2002). Work in the New Economy - Flexible Labor Markets in Silicon Valley. Oxford, Blackwell.

[7] For a very good account of the “surplus-disposal problem” see Baran, P. A. and Sweezy, P. M. (1968). Monopoly Capital - An Essay on the American Economic and Social Order. Harmondsworth, Penguin.