In not much more than 50 years, the word “creative” has been transformed from a fragile attribute of small children and poets into a mighty “engine of economic growth”. How has this happened? Indeed, what has happened?
The fashionable orthodoxy, proclaimed for example by John Howkins (2001) and Richard Florida (2002), is that the “creative industries” are an evolutionary step onward from the “old”, sweat- and skill-based industries, and we need “new rules for a new economy”. But if we look at the scientific literature on creativity, and listen to what so-called “creative types” actually say about their work, it is hard to find a clear basis for this separation.
In my own time as a “creative” in advertising (the 1970s-80s) the term was a bone of bitter contention. It had apparently entered usage in the late 1950s. Today, advertising is seen as one of the first of the “creative industries”, but the new label was not accepted there without question and is still not used without heavy irony. A famously creative ad-man, David Ogilvy, detested the term. He spoke (and his work still speaks) for a great many “creatives” when he wrote:
“The so-called Creative Revolution usually ascribed to Bill Bernbach [founder of Doyle, Dane, Bernbach – DDB] and myself in the fifties could equally well have been ascribed to N. W. Ayer and Young and Rubicam in the thirties” ... “Creativity strikes me as a high-falutin word for the work I have to do between now and Tuesday.” (Ogilvy, 1983)
As for the scientific literature on creativity, this tends to support Ogilvy’s view that it does not differ greatly from other kinds of work; definitions of creativity (for example as offered by Mihaly Csikszentmihalyi - 1975) overlap so extensively with the definitions of skill (for example, as offered by Charles More - 1980) that they might as well be considered part of the same phenomenon. Moreover, major developments in mind-science of the past quarter-century indicate that the separation of intellectual from manual activity (the “Cartesian split”) is a largely bogus enterprise. A succession of works, for example by Richard Gregory (1981, 1998), Francisco Varela (1992), Antonio Damasio (1994) and Joseph LeDoux (1998), have led, as it were, to a “resurrection of the body” in mind-science: thought and physicality are inseparable. It is all work after all, and all attempts to differentiate it and attach different values to its different varieties require serious explanation.
In 1974 the sociologist Michael Burawoy took a job as a machine operator at a Chicago engineering works, and noticed that the labour process he was part of was not entirely organised on the strict, Taylorist lines he had anticipated. It gradually dawned on him that the piece-rate system was organised as a sort of game (called “making out”), and this, rather than the larger realities of profit and exploitation, dominated the entire social milieu. Making out provided engrossing moment-by-moment challenges and satisfactions for the worker; it underpinned the social status system and fuelled conversation within the plant and outside it, creating what Burawoy termed a “coercive culture” where it was impossible to critique one’s work except in terms of the (relatively trivial) game. The vertical conflicts between management and workers had been successfully “lateralised” into worker/worker rivalry. A game, says Burawoy, generates its own needs and means of satisfying them, which although not primary become the prime focus of concern, and so come to seem “natural and inevitable. Alternatives are eliminated or cast as utopian.” (Burawoy, 1979, p 93)
The insights he came away with seem to shed much-needed analytical light on “the creative economy”, offering a radical re-interpretation of “creative” as an addictive new game for workers (and also consumers) to play, “manufacturing consent” (as Burawoy puts it) to their own exploitation.
The crux of the matter is that effective exploitation is not just a matter of maximizing control and eliminating uncertainty; on the contrary: “securing worker cooperation rests on a minimal uncertainty, the possibility that the workers will assert some control over the labor process.” (p 87)
“Creative”, I suggest, works for capital in exactly this way. In the workplace (if not in science) it is above all else a contested term which creates a highly-exploitable “zone of uncertainty”. The contest is not just over the meaning of the word, but over whether what the worker has done is, indeed, creative or not – i.e., is valuable or worthless. The old economy’s skilled worker could settle the matter with a micrometer gauge, showing that the work was indeed correct to within so many thousandths of an inch. The new-style creative worker never knows whether his or her work is worthwhile or not, till the boss, or the client, or a sufficiently large and powerful group of peers, declares it so.
The power of the word “creative” is pretty well summed up in that expression so loved by managers: “I’ll know it when I see it” (or as they still say in the original creative industry, advertising, “the man from Del Monte, he say ‘yes’!”). Indeed, one could go so far as to say that the rise of the concept of a “creative industry” is a stark expression of capitalism’s preference, not for creating value, but for bestowing value on things simply by recognising them (and then, as a reward for this kindly act, taking ownership of them).
I am not being flippant. The calligrapher Edward Johnston (who invented modern sans-serif type for London Transport in the early 1900s, so that people “could see where they were going”) believed that “appreciation, or rather ‘to see that a thing is good’ (v Genesis) is the final creative act; in fact, that a thing is not completely created until it has been appreciated.” (Johnston 1959, p256). Recognition has intense value and therefore power: a kind of act of love which, when turned into a cash value, easily becomes a kind of rape.
This is consistent with the simultaneous rise of the importance of Intellectual Property (IP). Indeed, Howkins (2001) defines “creative industry” as one that creates intellectual property. Perhaps it is all really part of what Vandana Shiva (in “Biopiracy”, 1998, cf Bowring 2003) has called “the second coming of Columbus”: the “discovery” and annexation not only of other people’s lands, rivers and forests, but also of their plants, animals, knowledge and genes.
And it means that “creative” has exceptional disciplinary force. It touches very deeply on what it means to be a human being. The British psychologist Liam Hudson noted in 1966: “In some circles ‘creative’ does duty as a word of general approbation – meaning, approximately, ‘good’.” (Vernon 1970, p 217). But “creative” is a much more potent compliment than mere “good”, and “uncreative” is a much more devastating verdict than “bad”.
Be that as it may... this paper proposes that “creativity” is an important new area of capitalist mischief-making; turning work into a game, the better to divide people, galvanise consumption, and secure consent. I want to elaborate this proposition, first by examining the managerial history of the “c-word”, and then by giving a short account of its impact in my own experience as a “creative worker” (as a copywriter in an assortment of UK advertising agencies in the 1970s and 1980s).
Scientific interest in creativity (in the sense of getting new ideas) began in the late 19th century, for example with the mathematician Henri Poincaré’s influential “Foundations of Science” (1908 – excerpt in Vernon 1970). But apart from some very rare occurrences, the world of business and politics took no interest in the concept until the start of the Cold War. It happened quite abruptly, in the USA, in 1950. Hudson expressed a commonly-held view in 1966, that the cause was
“a diffuse cultural ground-swell, elevating the scientist from the status of technician to that of culture hero; and a more specific concern on the part of the American nation [i.e., the USA] with the state of their armaments industry.” (1966 – in Vernon 1970, p 218)
A key event was J.P. Guilford’s presidential address to the American Psychological Association (APA) in 1950, promoting creativity as a strategic national asset. An early “creativity skeptic”, Robert Weisberg (1986), observed that Guilford was operating outside the academic mainstream on so-called “creativity testing”. During WWII he had been Director of Psychological Research at the Santa Ana Army Air Base, working on the selection and ranking of aircrew trainees. His address to the APA caught the military/industrial establishment in a frenzy of excitement and anxiety following its successes with the Atomic Bomb, radar etc., terrified that the flow of scientific goodies might dry up. The anxiety became public hysteria in 1957, when Soviet engineers launched the first man-made satellite, Sputnik. Writing in 1958 (as the “missile-gap” obsession was taking shape) Guilford explicitly linked the need for research into creativity with the USA's
“mortal struggle for survival of our way of life in the world. The military aspect of this struggle, with its race to develop new weapons and new strategies, has called for a stepped-up rate of invention.” (“Traits of Creativity”, 1959 – in Vernon 1970, p 167)
Guilford planted two other important seeds: the idea that “creative” would be important for maintaining consumption (as in "creative pastimes" etc, to fight “boredom arising from increased leisure time.”); and the "divergent thinking" model of creativity, with its emphasis on “flexibility” and prejudice against “linear” thinking.
The "myth of divergent thinking" was demolished early on by well-designed and successfully-replicated experiments - for example Dunnette, Campbell and Jaastad's dismantling of brainstorming in 1963 - one of many such experiments described by Weisberg (1986, p62). Despite this, divergent thinking became part of the management gospel and still turns up in new forms in new personality tests, for example in Fiona Patterson's "Innovation Potential Indicator" (IPI - Howkins 2001, p16) – a psychometric test currently in favour in the UK – which appears to offer employers a way of sorting the easily-bored, "innovation oriented" and potentially "creative" recruit from the dull, introverted, inflexible types.
The “anti-boredom” role seems to have succeeded in a way Guilford perhaps didn’t anticipate: in helping to create a strong, general prejudice against ways of life that are stable and predictable; to tar-and-feather forces that strive to introduce some stability into people’s lives, such as traditional trade unions and central planning; and to establish the idea of precarity as both exciting and necessary.
So “creativity” went mainstream, with US government funds for research, and cheered on by popular journalism. This version of creativity, which we might call The Guilford Strain, rapidly escaped into the corporate environment.
This “explosion of creativity” seems to have had two phases: from the mid 1950s till the mid 1980s, there was a fairly distinct “creative heartland” consisting of the advertising agencies (which also started calling themselves an “industry” at some point during this period) and parts of the media; and “creative hobbies” like painting by numbers, rug-making from kits, and so on. In the second phase, “creative” radiated into almost every aspect of life until (as Rob Pope puts it, in his very rich and detailed account): “nowadays we can apparently ‘create’ everything from ‘the right image’ to ‘job opportunities’ and ‘a market’.” (Pope 2003, p40). The latter phase coincides closely with three other important phenomena: the collapse of the Soviet Union, the surge of globalisation/offshoring, and the invasion of all industries by electronics. Indeed, since the mid-1980s “creative” and “technology” have come to seem entirely natural companions, with the indicative slogan “unleash your creativity” appearing on all manner of personal computer products and peripherals.
Yet there is evidence that all this talk of “creativity” is not (Florida notwithstanding) matched by any increased, subjective sense of creativity in the workplace. On the contrary, Ewart Keep has found that, in UK National Skills Surveys, the proportion of employees reporting “a great deal of choice” over the way they worked fell from 52% in 1986 to 39% in 2001. Keep observes: “the opportunities for displaying creativity at work appear to be shrinking rather than expanding” (Keep 2002).
Electronics have been used across the whole spectrum of industry to augment the sales effort – by allowing incremental, sales-oriented improvements to and accelerated obsolescence of previously-stable technologies – so it could be said that the technology has allowed advertising to invade everything that moves.
Why should advertising, of all businesses, have been the one that first latched on to “creative”, and even made it its own? Why not an industry that actually creates something? But this is totally consistent with Baran and Sweezy’s account (1966) of the vital role played by the sales effort in modern capitalism. It also fits with and supports a view that capitalism is far less concerned with production per se, or profit, than it is with power. What more natural way to neutralize a dangerously capable workforce than by siphoning off its most articulate and inventive members? What is more:
“The greatest damage done by advertising is precisely that it incessantly demonstrates the prostitution of men and women who lend their intellects, their voices, their artistic skills to purposes in which they themselves do not believe, and that it teaches [in the words of Leo Marx] ‘the essential meaninglessness of all creations of the mind: words, images, and ideas.’ The real danger from advertising is that it helps to shatter and ultimately destroy our most precious non-material possessions: the confidence in the existence of meaningful purposes of human activity and respect for the integrity of man.” (Baran and Sweezy 1966, quoted by Robert McChesney and John Bellamy Foster, 2003 – emphasis added.)
How is the trick achieved? Modern management demands “excellence” and the desired, excellent individuals must be contradictory types: creative thinkers whose rebellious, disrespectful energies must constantly excite but never, ever upset the corporate applecart. This creative stereotype (which no human being can fulfil for long without coming to harm) has become a sort of storm-trooper for labour and management policy since 1950. Where and when was it forged?
One can see it taking shape in the work of the “creativity gurus”, like ad-man Alex Osborn, who seized on Guilford's "divergent thinking" and developed it in yet more management-friendly (and even more scientifically suspect) directions.
Osborn (the "O" in the advertising agency BBD&O) added an important twist to creativity with his idea of “brainstorming” (1959), which rapidly became a management-training industry in its own right. Whereas Guilford was interested in discovering creative individuals, Osborn (and subsequent rivals, like Edward de Bono with his “Lateral Thinking” and W.J.J. Gordon with his “Synectics”) insisted that anybody could be creative -- provided they submitted to the rules of the game. Thus, brainstorming introduced a Calvinistic element: in principle, anyone could be one of the elect, but nobody could be sure that they actually were, until the “day of revelation” (the brainstorming session itself). And on that Day, it would be up to the Manager, or the Group, to decide who was creative, and who was not.
The elements of game-playing, and submission to ritual and quasi-paternal authority, helped foster the idea of work as compulsory fun for infantilised workers, with a gangsterish presumption against those experienced workers who "knew too much" to join in the fun, or to sustain a convincing pretence of doing so.
The "creativity training industry" continues to flourish in the consumer economies. Anna Craft (an educationist) describes a recent shift of focus onto “everyday creativity” by (for example) the UK’s National Advisory Committee for Creative and Cultural Education (NACCCE), which advocates teaching all workers how to be “creative” (Craft, 2003). This might be interpreted as a way of saying “teaching workers to enjoy whatever work they’re given”: a cheap alternative to the Government’s failed “high-skill, high-value economy” project (Lloyd and Payne 2002; Keep 2002; Bolton 2004). And this is linked in with a boom in motivational courses and training in general, which increasingly cover newly-defined "skills", such as team-working, "motivation", and even “creativity”.
I joined my first advertising agency as a trainee copywriter in 1976. I had been trying to make a living as a calligrapher and signwriter and I learned about advertising while helping a commercial artist friend to do "finished art" (camera-ready artwork) for local advertising agencies.
“Finished art” deserves a mention: it was perhaps a good example of the persistence, or re-creation, of skill in a notionally de-skilled industry. Hand-lettering had been a core skill in commercial art, but it had largely been replaced by new technologies including Letraset dry-transfer lettering, the IBM “golfball” typesetter, and “bromides” (quick photographic reduction/enlargement systems for line and halftone artwork). However, the new technologies gave rise to new and interesting skills – "bashing down Letraset" required a well-evolved technique involving adhesive tape, an expensive, coated art-board known as “CS10” and a surgeon’s scalpel. Artwork was assembled with Cow Gum on CS10, with a neat paper cover, and an NGA (National Graphical Association) stamp on the back. If there was no NGA stamp, the ad could not run.
This was generally agreed to be “skilled work”. It was explicitly not what the advertising agencies would call "creative" work (and we never used the word) yet it had all the features of creative work, as described for example by Mihaly Csikszentmihalyi (1975), in terms of knowledge, skill, control and risk-taking, and the subjective pleasure of doing the job. The finished piece was always a delight to the appreciative eye. We often remarked that it had much greater aesthetic merit than the ad itself (more often than not, a scene of retail carnage announcing “Prices slashed!” and “Massive savings!”). We took care that things were trimmed straight and clean with no nasty scabs of extruded Cow Gum. The process-camera wouldn’t have minded, but we were judged by humans and did things properly. It seemed sad that so few people ever saw our work. In an earlier generation people would have seen it all the time: we would have been up on step-ladders, doing it in public, earning small amounts of admiration but, of course, less money. On the other hand, we were not responsible for the ads – we just did the artwork – which was comforting.
Within the advertising agency I found myself embroiled in a welter of undeclared, laterally-displaced conflicts. I learned that I was a “creative”, my boss was the “creative director”, and I was part of a “creative team”. It was difficult to establish trust with one’s fellow team-members because nobody knew for sure what anyone else was earning, and there was continuous rivalry between the various creative teams – mostly over who got to work on the prestige accounts and new-business pitches. But some trust was achieved – largely I think through the intense, rich and often hilarious discussions we had, in the necessary search for agreement and clear criteria in the tricky matters of expression and design.
A creative team consisted of a copywriter and a “visualiser” -- later on the title changed to “art-director” -- and perhaps a junior or two, depending on volume of work on that team’s accounts. A busy retail account might keep a number of visualisers squeaking away with their Magic Markers, while the copywriter agonised over alternative ways of saying “prices slashed”. The visualisers had generally done some sort of art school training but the copywriters had no training at all (and received none, apart from occasional peer-comment). But training was for some reason an alien concept: this was a game in which one supposedly won or failed on one’s own merits, and indeed, progression could be rapid. It was less obvious what happened to people who did not progress – although I do remember an inconclusive conversation once that began “I wonder what happens to old copywriters?”
Each creative team took its instructions from an account executive, who had access to The Client. Account executives were always known as “suits”. The suit would prepare a written brief for each job, which the Creative Director would check and perhaps send back for revision before passing it on to the team, who might also challenge it. The relationship between creatives and suits was plagued by mutual distrust, incomprehension, fear, and contempt – although occasionally friendships would develop, giving rise to the possibility that the new grouping would defect and set up an agency of its own, taking its clients and a few favoured colleagues with it.
After approval by the client, the team’s visuals and copy (a.k.a. “concepts”) went to a place called “Production”, where the creatives' rough efforts were translated into precise, visual specifications for the guidance of the finished artists, typesetters, illustrators, photographers, printers etc who had to turn our concepts into something that could be physically produced and would look OK. Production work was strictly segregated from creative work. It was skilled in the traditional sense: you could assess it by objective criteria (at the most basic, whether the type fitted or not). Production people were sometimes unionised and always went home at 5.30 pm or got paid overtime. Creatives were not unionised - indeed, the idea seemed alien – and stayed at work till 6.00pm or later and never claimed overtime. Looked at logically, production workers had the better deal – but they were very clearly a disparaged class, subservient to the Creative department, which was where the fun was, apparently.
In the “new economy”, there has been quite general recognition of the phenomenon of “work disguised as fun” (e.g., Ross, 2004).
The general idea is that if workers can be persuaded that they are enjoying themselves, and have special privileges that other workers don’t have, they will be content with lower wages. This is probably true (as it was true in the creative department) but there is an additional dimension: the power-asymmetries that lurk in the grey, undefined area where perks of the job blend into “pilferage and the fiddle” (Ditton, 1977).
Creatives had a range of privileges. The most important privilege was a certain, circumscribed freedom to do what you had always wanted to do: drawing or writing – plus other activities that helped you get better at it: arguing, looking through portfolios of photographs, viewing showreels. It blended imperceptibly into debatable areas: wandering off to the shops for the afternoon, ostensibly to look at “the product” through the eyes of “the target market”; spending half the morning in a café because “the ideas flowed better” that way; in some cases, drinking four or five pints of Guinness at lunchtime because “I write my best ads when I’m pissed”.
These privileges perhaps created what Ditton calls “a hedonistic surplus” (a sense of having been rewarded over and above the value of the time stolen from the company); more importantly, they were indeterminate in the same way as the privileges of the rural poor during the period of the enclosures: an action tolerated as mere pilferage one day could be punished as theft the next. They were always available for managerial challenge, and so worked ultimately to the disadvantage of the employee.
A fascinating paper by Michel Anteby (Anteby 2003) discusses the practice of “perruque” (literally, “the wig”: making things for personal use in the firm’s time with the firm’s materials) in a French aerospace factory. This is another category of “indeterminate privilege” very prevalent in the advertising agency – and very definitely nearer to the larceny end of the pilferage/theft spectrum. Anteby observes that perruque was beneficial to the company as well as the workers – for example, improving social cohesion, signalling trust and status, and forging useful friendship- and knowledge-networks that supplemented the narrower, less responsive formal company structure. In the advertising agency “perruque” activities ranged from the theft of art materials and “doing foreigners” (one’s own freelance jobs – possibly for other agencies) to the design of elaborate leaving and birthday cards – and a sort of currency in favours of this kind. In one agency where I worked, there was a brief craze for making scurrilous but unattributable audio-tapes about the detested creative director. These were produced in an ingenious if laborious way: the message was carefully written out backwards so that it could then be read backwards (with some difficulty and helpless hilarity) into a tape-recorder. The tape was then reversed for playback by careful splicing, and left lying around in the boardroom. This sounds like an act of rebellion but if it was, it had little effect. We all felt a great thrill of childish guilt and glee, but the creative director stayed put.
In Burawoy’s machine shop, life was dominated and shaped by the game of “making out”. In the creative department, the equivalent obsession was the constant desire to do “great ads” – which is to say, ads that you would be proud to say you’d worked on, and which would get you a job in a “good” advertising agency. A “good” advertising agency was one where, it was assumed, your creativity would at last be recognised, encouraged and rewarded.
In the late 1970s “great ads” included a famous series of surreal poster ads done for Benson and Hedges cigarettes by Collett Dickenson Pearce (CDP); David Abbott’s ads for Volvo cars and Parker pens; and the Cadbury’s Smash Martians. These were the ads our friends and family thought of, when “advertising” was mentioned. Then, as now, “great ads” were an infinitesimal fraction of all advertising – but we somehow believed that we could and should be doing ads like these. People would throw endless energy into any project that offered the chance of producing ads that would “look good in the book” (portfolio) -- their passport to a good agency – and by the time I left the world of ad-agencies in the mid 1980s, competition for creative jobs was so intense that young and no doubt very talented graduates would work for nothing as “interns”, just so that they could “build up a book” and say they’d worked for Saatchi and Saatchi, DDB, BBH, JWT or what have you.
Every week we pored over the new issue of the advertising magazine Campaign to see who had won what accounts and awards, who had done which great ads in which agencies, and to speculate on the chances of getting a job there. Successful teams were shown in a remarkably formulaic and consistent way, which I am quite amazed to find is still going strong 30 years on. A typical article features a photograph of two bored-looking men staring scornfully at the reader from a risky-looking back-alley. One wears a suit, the other some exclusive-looking casual attire. In the copy, they speak of their steely commitment to producing "great advertising" and their contempt for the mediocrity, complacency, failure to take risks and lazy thinking that are vitiating the industry and endangering its very survival.
The suits were generally thought to be conspiring to prevent us doing great work. They would insist on cramming extra items into an ad – sabotaging its elegant, forceful message – or allow the client to modify a cleverly-wrought headline, or come up with inconvenient facts that needed to be mentioned in the ad, undermining the whole, daring proposition. Or ditch the “creative” campaign we’d worked on in favour of a bold-and-bloody price offer.
Or they would sidle up to us with a sad little brief with no budget and a confused rag-bag of products and ask us to “do something creative” with it or, even worse, “jazz it up a bit” – betraying the full, horrifying extent of their incomprehension. They thought creativity was something we could just sprinkle on the product, like fairy dust.
Viewed as a game system, there was a huge amount going on here – much more than in the relatively straightforward machine-shop where Burawoy worked. Indeed, the “suits and creatives” division seemed to allow management to run two different and sometimes mutually conflicting games simultaneously on the same problems – an outrageous waste of human talent, but clearly a sound business-proposition when talented workers can be persuaded to give so much of their time for nothing, and every producer of goods or services in the economy is obliged to advertise to the limit, or perish.
Burawoy examines the role of the “internal market” in jobs, in helping to maintain consent. In his machine shop this took the form of an evolving system of job-differentiation, pay differentials and incentives for long service, which kept skilled workers from leaving and reduced the scope for industrial action. In the advertising agencies (and apparently in more modern “creative industries”), the internal market was very different.
A striking feature of ad-land’s internal market was a sort of “speciation”: two quite separate populations of workers (“suits and creatives”) that had become so different at the genetic level they could no longer interbreed. It was very rare indeed for a suit to transfer to creative work, or vice versa. Both were exploited. But there were also power-asymmetries between the two groups.
The "suits" included the senior managers, account-handlers, researchers, planners, media-buyers and PR people, and it was quite common for individuals to move up or across the food-chain from any of these quite different starting points. Suits might also "move over to the client side" (or "come in from the client side") – i.e., go and work for a client company as their marketing manager or what have you.
Among the creatives, there was almost no internal career progression. In my experience it was rare for someone to start as a junior and then work their way up to team-head, let alone creative director or managing director. The general idea was that you would “get your book together”, “build up a showreel” and then jump ship to another (better) agency. Appointments were nearly always made from outside, with as much of a fanfare as possible and an eye to the agency’s prestige. A new, junior team joining the agency would be described as “young, raw talent” and perhaps “hungry for awards” (an implied threat to the complacent natives). When making senior appointments, notoriety, fame, exoticness all seemed to be important criteria. And while it could be an excellent move for a suit to “move to the client side”, it was career-suicide for a creative to do the same thing: in-house creatives were assumed to be very inferior life forms with hopelessly low creative standards and skills.
There were further species-barriers within the creative department itself. There was one between writers and art-directors, and a massive one between the creatives and the production people. It was most unusual for an art-director to switch to writing or vice-versa -- and a severe breach of etiquette when a writer attempted to produce his or her own visuals. Likewise, it was very rare for someone from "production" to progress to the creative department, and it would have been an outrage for a production layout-artist to be asked to do a visual.
These demarcations were not the idea of any trade union, and no shop-steward could have enforced them as successfully. They operated at such a deep level of the culture that they did not even need to be mentioned. They meant that quite simple jobs could become much more complicated and take much longer than they need have done. In prewar times (before the creative department existed) the writer's copy would go straight to a "commercial artist", who would do all the visual design work from "the creative bit" through to final type-spec. In my day, all "concepts" had to pass through the intermediation of an art director -- whether talented, talentless, literate or illiterate -- who would strain to impart some creative magic to the subject-matter, whether it was wanted or not.
The worst asymmetry was the asymmetry of truth-telling. For example, the “suit” could be fairly candid about the merits of the product being advertised: he or she could decide not to sell it on its merits and go on price instead. The creative, however, had to find or create some merit in even the most dismal offering, even if the client candidly admitted the product was “crap” – and to do that you had to enter into the delusion heart and soul on a daily basis, and come up with something that, even for five minutes, seemed “great”. The sack awaited anyone who admitted “this is the best we could come up with”. And to get promoted, to get onto the good accounts, to get your next job, you always had to appear to love your work, to seem proud of the ads in your portfolio (and, in many cases, to pretend they were your own) and to believe sincerely that the work you were doing right now was the best fun and the most interesting challenge you’d ever had in your life.
Richard Florida, in his “Rise of the Creative Class”, paints an optimistic picture of a new kind of worker, emerging in all areas of life, who is for some unexplained reason “no longer satisfied” with the regimented, predictable lives people led until as recently as 50 years ago. The reader is coerced into the vision by incessant use of the first person plural: “we pack every second full of creative stimuli” (p14); “we trade job security for autonomy” (p13); “we progress from job to job with amazingly little concern or effort” (! – p7).
He defines “creative” largely by lifestyle, and this brings all manner of other workers into the creative fold: lawyers and accountants (ones who mix work and play; wear no tie, work in the café), hairdressers, restaurateurs (who love cooking). To what extent are these workers like the “creatives” in advertising: insecure workers putting on the bravest possible face; in effect, commodifying themselves?
It is obviously not the case that every one of this new, creative class really is nurturing their creativity and feeling existentially fulfilled every second of the day. The hard data on earnings (for example in Benner, 2002) show that the creative classes have been sold a pup: old, boring job-progression and security till retirement have been traded for a quick buck in your late twenties, and the “fun” of “walking a lifelong tightrope” (Benner). What Florida is actually witnessing is a nightmare, where ever more people are making ever more frantic efforts to present a care-free, in-control, autonomous appearance to their apparently care-free, in-control, autonomous friends and colleagues.
All of which is the inevitable consequence of fixating on the products of creativity while failing to pay attention to the experience of creativity.
Thus, the word "creative" has been purloined by capital, but in a rather silly and one hopes ultimately futile way. It has got the label but not the thing. The experience of creativity either happens or does not happen according to its own rules, which are increasingly well-understood (and, increasingly, understood by workers themselves). It absolutely demands the very conditions that capital is most intent on eliminating: physical and emotional security, abundant knowledge of task and context, accumulated facility at deploying that knowledge, freedom to apply it as one sees fit, an interesting challenge to apply it to, and, usually, a keen-eyed, good-humoured audience for one's efforts. Just about everything one can think of that has value, in whatever sense, starts life that way - be it a theory of matter, a TV ad, or an earthenware pot - so that all well-made things are lurking threats to capital, insofar as they demonstrate to human beings what human beings can do, when they have the chance.
The proper response to the idea of a "creative economy" is probably Gandhi's, apropos of Western Civilisation: "it would be a good idea". And it isn’t impossible to imagine what a creative economy would be like: we cherish what’s left of economies that relied more on human creativity than ours does. The beauty of a hand-built wall or street, or a farm wagon, or piece of furniture – is as much a physical experience as a visual one.
Well laid-out typography and beautifully-lit photographs are pretty good, too – the tragedy being not that “creatives” spend so much time fussing over these things, but that these things can only be done, in the present economy, in special ghettos that are increasingly segregated from everyday life, and dedicated to the furtherance of waste.
They are all intimations, however, of a world in which human life and activity might be a wonderful addition to the environment.
Bob Hughes, September 2006
Anteby, M. (2003). "The 'moralities' of poaching: manufacturing personal artifacts on the factory floor." Ethnography 4(2): 217-239.
Baran, P. A. and P. M. Sweezy (1966). Monopoly capital. An essay on the American economic and social order. New York & London, Monthly Review Press.
Benner, C. (2002). Work in the New Economy - Flexible Labor Markets in Silicon Valley, Blackwell.
Bolton, S. (2004). "Conceptual Confusions: Emotion Work as Skilled Work." In C. Warhurst, I. Grugulis and E. Keep (eds), The Skills that Matter. Basingstoke, Palgrave Macmillan.
Bowring, Finn (2003). “Manufacturing Scarcity: food biotechnology and the life-sciences industry.” Capital and Class 79:107-144
Burawoy, M. (1979). Manufacturing consent : changes in the labor process under monopoly capitalism. Chicago ; London, University of Chicago Press.
Craft, A. (2003). "The Limits to Creativity in Education: Dilemmas for the Educator." British Journal of Educational Studies 51(2): 113-127.
Csikszentmihalyi, M. (1975). Beyond boredom and anxiety : the experience of play in work and games. San Francisco ; London, Jossey-Bass.
Damasio, A. R. (1994). Descartes' error : emotion, reason and the human brain. London, Picador.
Ditton, J. (1977). "Perks, pilferage, and the fiddle: The historical structure of invisible wages." Theory and Society 4(1): 39-71.
Florida, R. L. (2002). The rise of the creative class : and how it's transforming work, leisure, community and everyday life. New York, Basic Books.
Gregory, R. L. (1981). Mind in science : a history of explanations in psychology and physics. London, Weidenfeld and Nicolson.
Gregory, R. L. (1998). Eye and brain : the psychology of seeing. Oxford, Oxford University Press.
Howkins, J. (2001). The creative economy : how people make money from ideas. London, Allen Lane.
Johnston, P. (1959). Edward Johnston. London, Faber and Faber.
Keep, D. E. (2002). "ICT and its Impact on Skills and Creativity - Transformatory Catalyst or Dependent Variable?" [Online]. Available: http://www.terra-2000.org/Terra-2002/Pages/abstract_prague.htm
LeDoux, J. E. (1998). The emotional brain : the mysterious underpinnings of emotional life. New York, Simon & Schuster.
Lloyd, C. and J. Payne (2002). "On the 'Political Economy of Skill': Assessing the Possibilities for a Viable High Skills Project in the United Kingdom." New Political Economy 7(3): 367-395.
McChesney, R. and J. B. Foster (2003). "The Commercial Tidal Wave." Monthly Review 54(10).
More, C. (1980). Skill and the English working class, 1870-1914. London, Croom Helm.
Ogilvy, D. (1983). Ogilvy on advertising. London, Pan.
Pope, R. (2005). Creativity : theory, history, practice. Abingdon, Oxfordshire ; New York, NY, Routledge.
Ross, A. (2004). No-collar : the humane workplace and its hidden costs, Temple University Press.
Varela, F. J., E. Thompson, et al. (1991). The embodied mind : cognitive science and human experience. Cambridge, MA, The MIT Press.
Vernon, P. E., ed. (1970). Creativity : selected readings. Harmondsworth, Penguin Books.
Warhurst, C., I. Grugulis, et al. (2004). The skills that matter. Basingstoke, Palgrave Macmillan.
Weisberg, R. W. (1986). Creativity : genius and other myths. New York, W.H. Freeman.