Music retailing loses its voice…

"HMV" by Margaret Manchee (courtesy of the artist)

The decision last month by HMV (UK) to go into administration is further indication of how traditional bricks-and-mortar music retailing has not managed to keep up with trends. A combination of new technology, changing purchasing habits and industry fragmentation has seen the retail model come unstuck – in a similar fashion to chain-store book retailing.

Few mainstream or high-street music retailers have survived unscathed in recent years: Tower Records and Virgin Megastores have disappeared from all major markets, and Sam Goody has been re-branded in the USA. HMV retains stores in Hong Kong and Singapore, both Tower and HMV stores operate in Japan under local licenses to private equity investors, and Virgin retail still has a presence in France, where it competes with the domestic chain FNAC. Otherwise, it’s mostly local independent and specialist stores that manage to keep going, although in Australia the national chains JB Hi-Fi and Sanity appear to buck the trend.

One reason the major music retailers have not survived is that, in order to grow and diversify their sales turnover, they started stocking books, DVDs, games, merchandise, concert tickets and audio accessories. This meant they reduced the rack space given over to music and, as a result, lost their retailing focus.

Another factor in their demise is that, like their counterparts in book retailing, they became over-reliant on high-volume sales of best-selling product put out by the major music labels, overlooking the fact that average sales for best-selling albums have been declining since the 1980s. They ended up selling fewer copies of each title, and compounded their sales decline by reducing the number of artists, products and genres that they stocked. At the same time, the six major global record labels that dominated in the 1980s have been whittled down to just three – Universal Music Group, Sony Music Entertainment and Warner Music Group. (It’s virtually the reverse of the long tail theory, which has been a contributing factor in the success of Amazon in book and music retailing.)

In contrast, local independent and specialist music stores have kept innovating, and kept abreast of market trends. For example, international Record Store Day each April sees music fans queuing at dawn around the block at their local record store to get their hands on exclusive, limited releases – often produced in the analogue formats of vinyl and cassette.

Relying on the trio of global record labels to supply major new product and maintain best-selling legacy back catalogue meant that the music megastores became totally removed from the development of new artists, new product and new genres. The independent and specialist stores, by contrast, have a vested interest in spotting and supporting new local talent, and in building stronger relationships with their customers – both online and in person – via special promotions, in-store performances and limited one-off releases. The megastores simply lacked the wit, wisdom, flexibility and credibility to deploy creative sales tactics or develop personalised customer experiences.

HMV’s closure in the UK is yet more evidence of how the old world music industry business model has been broken, except for one important area: marketing. The major labels (and an increasing number of independent labels) still have considerable marketing clout. This is a similar story to the book-publishing world, which is likewise dominated by a few global houses. But even with their marketing budgets, the major labels are under threat from viral marketing, social networking and direct-to-consumer distribution.

I would argue that the major labels have always been their own worst enemies. For around 50 years, from the 1940s to the 1990s, the majors tried to control all aspects of manufacturing, distribution, publishing, licensing, sales and marketing; the Virgin and HMV (aka His Master’s Voice) stores had their origins in record labels, and at various times the major labels also developed recorded music technology – HMV and gramophones, Philips and CDs, Sony and the Walkman, etc. Vertical integration is all very well, but unless the content is continuously refreshed, the audience starts to tune out; and the one thing the majors have never been very good at is identifying and nurturing new talent, or spotting and developing new trends in music.

From the 1950s, when Sun Records unleashed rock’n’roll on the world, through to the 1990s, when the Sub Pop label defined the “Seattle sound” of Nirvana and grunge, the majority of interesting new music has been fostered by independent labels – the heyday being the late ’70s and early ’80s, when punk brought the means of production to the participants themselves, allowing musicians to engage directly with their audience without having to be intermediated by the majors. I’m thinking of innovative UK labels like Stiff, Chiswick, New Hormones, Rough Trade, 4AD, Step Forward, Factory, Zoo, Mute, Eric’s, Fast and Postcard. The majors only picked up on this new music once it had been developed, tested and cultivated by the small independent labels.

Having survived the post-punk interregnum of the independent upstarts (often through mimicry and imitation via so-called “boutique” labels launched by the majors themselves), the majors regrouped in the 1980s, only to flounder once more when grassroots music movements like rap, hip-hop, house, electronic and techno emerged in the mid-to-late 1980s.

More recently, the majors overlooked the potential of the Internet and digital music – they failed to embrace the new technology, and instead tried to control and suppress it. Witness the majors’ failed attempts to sell direct to consumers via their proprietary online platforms, the proliferation of different and incompatible digital formats, and the over-zealous digital rights management systems (some of which even locked content after a fixed number of plays!).

Although Apple’s iTunes platform has transformed and opened up the sales and distribution of digital music, the marketing is still dominated by artists whose major labels are willing to buy shelf space and pay for promotional content. In effect, iTunes is the new music megastore.

The latest frontier in digital music is geo-blocking – which means some content on iTunes is not available in all markets, or it is sold at vastly different prices between markets – a practice that also applies to software, films and other digital content, and an issue that is likely to come under regulatory review in the near future.

Where do I see the future of music retailing? Although predictions are incredibly difficult when the whole industry is so fragmented, I think there are three key (but unrelated) themes emerging:

1. Although total CD sales continue to decline, and illegal downloading threatens commercial sales of digital music, sales of vinyl records (both new and back catalogue) seem to be increasing. Some back catalogue titles previously issued by the majors are being licensed to independent labels that restore and curate this content – suggesting that the majors have little interest in their own legacy. Both newly issued and reissued vinyl records frequently come bundled with a copy of the CD, or with access to digital files, and often feature bonus material. To me, this implies that consumers want the “authenticity” of vinyl, along with the artwork, sleeve notes and tactile/contextual experience of the music, but they also want the convenience of portable music. It suggests that well-presented content will generally find a market, as long as the music labels and record stores continue to connect with their audience.

2. TV talent programmes like “American Idol” reinforce a very narrow, shallow and ultimately sterile style of music, delivered via a karaoke production line. This says more about the entertainment industry’s need to sell and cross-promote new talent than about any appetite for investing in original and creative artists or content. Let’s assume that the participation in (and the audience for) these shows is rooted in show biz rather than the music biz – but does anyone really think that any of these latter-day pop idols will ever have a back catalogue to match the likes of David Bowie or Joni Mitchell?

3. Digital music technology means that anyone and everyone with a smartphone or a tablet can make their own music and distribute it via the internet without leaving home, without signing publishing deals, without entering into a recording contract and without paying royalties. A lot of musicians choose to self-release and control all aspects of production, marketing and distribution, bypassing the “traditional” music industry altogether. However, this democratisation of music production introduces a series of paradoxes – the increased quantity of content does not necessarily equate to increased quality; the commoditisation of music reinforces its disposability; and amid all this “noise” it’s increasingly hard for new artists to be heard or discovered. Which is why the major media channels will continue to dominate and influence most of what we get to hear, and control the sales and distribution.

Why Francis Bacon would never be on Facebook

“Champagne for my real friends, real pain for my sham friends” is a dictum widely attributed to the 20th-century artist Francis Bacon, although its origins have been traced to the late 1800s. Whatever its provenance, Bacon is known to have used the phrase frequently in the company of friends and hangers-on in the pubs and clubs of London’s Soho district. It was a sort of rallying cry when he was buying drinks for his companions – some of whom were close friends, while others were mere acquaintances, associates, groupies and antagonists.

Bacon died in 1992, but even if he were alive today, I doubt he would have used Facebook. Not because he was out of touch with popular culture (the collection of source material from his studio attests to his artistic interest in photography, sport, film, magazines, advertising, etc.). No, his antipathy to Facebook and other social media would be based on their inability to distinguish between “real” and “sham” friends. Facebook may allow users to categorize “friends” as Close Friends, Family or Acquaintances, but this is mostly about levels of sharing and frequency of updates; it does not really allow for more subtle categorisation reflecting the different types and varied nature of relationships we have with our professional and personal contacts; nor does it allow us to distinguish between sub-categories (e.g., “friends I’m willing to have dinner with”, “cinema friends”, “family we visit for the holidays”, “Friday night drinks colleagues”, “clients to invite to the cricket”, etc.).

The Internet in general (and social media in particular) is a great leveller, but it has the capacity to reduce all our real-world relationships to a homogeneous mass of digital contacts.

Would you take career advice from a sushi chef?

The cohorts of Baby Boomers who entered the workforce during the latter stages of the Industrial Age represent the last generation who contemplated lifelong employment in the same career, if not in the same organization or even in the same job. Here in the Information Age, with increasing numbers of employees engaged in knowledge work, the notion of a single career for life, let alone a single job for life, is pure fantasy.

In the Information Age, our willingness to embrace career change is as important as our ability to develop and maintain our core technical skills. For example, while we may think it is necessary to become experts in the latest technology, it’s equally important to understand how and why that technology is being deployed in particular situations – this is where the real learning occurs, as both the content and the context for that technical application will inevitably change.

The Agrarian Age helped define the concept of life-long occupations – in agriculture, the military, government service, science and medicine, the trades and professions, and even among unskilled labourers. Think of the workers who toiled their whole lives building the great mediaeval cathedrals, never seeing the final results of their labour, as those major construction projects took several generations to complete.

The Industrial Age ushered in occupations that relied on workers acquiring and applying technical, practical and manual skills that in essence changed very little during their lifetime, particularly on manufacturing production lines. This era also saw the development of the formal workplace and business establishments, in contrast to the largely home-based work patterns of before.

The Information Age continues to see rapid changes in workplace structures, employment patterns and career development. This change demands that knowledge workers constantly improve their skills – keeping up to date with new technology, engaging in the latest management theory, embracing new business models. This continuous learning process is not best served by staying in the same role, the same environment or the same mindset for lengthy periods. Personal change is a surer way of keeping in touch with universal changes.

So for latter-day job seekers who are looking for insights into their own career choices and options, why would they take career advice from someone who has been doing the exact same thing for 50 years or more? I was reminded of this when a recent edition of my high school alumni newsletter reported that a long-serving member of staff had retired after more than 40 years in the job. During my own time at the school, this particular teacher was also the careers adviser, and without meaning to disrespect his teaching abilities, why would anyone take careers advice from someone who had stayed in the same job his whole career?

And yet, who could fail to appreciate the explicit career advice in the critically acclaimed documentary, “Jiro Dreams of Sushi” (made by David Gelb in 2011)?

Jiro Ono has been making sushi for over 70 years, but continues to hone his skills as a sushi shokunin, always seeking perfection, constantly finding new and better ways to create his dishes. As a master sushi chef, Jiro makes sure he knows his suppliers and is familiar with their produce. As a leader he is quick to acknowledge that the food he serves to his customers is the result of much hard work and detailed preparation by his team of chefs. As a teacher, his Michelin 3-Star restaurant also offers lengthy (and highly valued) apprenticeships to aspiring itamae who are willing to dedicate themselves to pursuing their craft.

Even though the daily process of producing the highest quality sushi seems repetitive and even tedious, it is the willingness to face each day as both a new challenge and a fresh opportunity to improve one’s skills that gives Jiro his core purpose and sense of career satisfaction.

From personal experience, my own career development continues to be about defining my core values and improving my skills, understanding how to apply them in new situations, and how to enhance them by learning from colleagues, mentors, clients, suppliers and competitors, or from on-the-job and formal training. Like Jiro the sushi shokunin, I try to make this a daily process, by reflecting on how something can be done better or by understanding how new information can be incorporated into existing solutions.

Many of us working in the Information Age will recognize that we don’t pursue a single, linear career path, but engage in a series of both distinct and overlapping career sequences, connected by a common thread of transferable skills and inter-disciplinary learning applied to new roles, new projects or to new client engagements. Our challenge is to ensure we maintain purpose, relevance and a sense of direction as we navigate our “transactional” careers.

Footnote: The soundtrack for “Jiro Dreams of Sushi” features several compositions by Philip Glass, which seems totally appropriate, on several levels:  Glass, like fellow minimalist John Cage, is attracted to various aspects of Japanese culture; and as a minimalist, Glass’s music is often criticised for being repetitive, even boring – but attentive listening reveals that the repetitions subtly shift, revealing minuscule changes in pattern, rhythm and texture – much like every piece of sushi tastes subtly different.

Is being “creative” more authentic than being “realistic”?

How do we judge something to be authentic in the Information Age? In the 1990s, I worked on several projects to transfer reference books from print to CD-ROM and online formats. Because much of this material comprised official documents of record, the digital versions had to be “authentic” to the hard copy (even though they were being presented in a totally different medium) and employ embedded cross-referencing, indexing and other navigational tools. In short, the digital editions had to have the visual likeness of a microfiche copy, the readability of an e-book, and the functionality of an HTML5 website (if I may be permitted a mixed technical metaphor).

The quandary facing many product developers and content curators these days is: “How far should we go in the pursuit of ‘realism’ (and, by inference, ‘authenticity’) when having to make editorial, creative and technical choices to achieve credible outcomes?” And as consumers, the challenge we face is: “How do we know that what we see, read, hear or experience is an accurate depiction of something that actually exists or once happened/existed, or that it represents a consistent rendering/interpretation of real/imagined/possible events within the context and confines of the media being used?”

The issue is not about “real” in contrast to “virtual”, “original” as opposed to “replica”, “copy” rather than “counterfeit” – and certainly not about “truth” over “fiction”.

I’m not going to dwell on whether our virtual lives are any more/less authentic than our flesh-and-blood existence – that’s a matter of emotional intelligence and self-awareness. I’m not interested in debating the merits of CGI technology in cinema, or questioning the use of auto-tuning in pop music – that’s a matter of aesthetics. And I’m not even going to argue that Photoshop has no place in the news media – that’s a matter of ethics.

I’m more concerned with understanding how technology, combined with content, connectivity and convergence has reshaped the way we engage with new media, to the point that our ability to assess information objectively is impaired, and our experience of authenticity is seriously compromised.

Now for a test: Which of the following statements is the most authentic (or least inauthentic)?

1) “Documentary claims NASA commissioned film director Stanley Kubrick to fake the TV images of the Apollo 11 moon landing”

2) “Pop singer Beyonce mimes to the national anthem at President Obama’s Inauguration”

3) “Jane Austen to publish a new edition of “Pride and Prejudice”, featuring Facebook, Twitter and sexting”

4) “Apple Corps announces that The Beatles are reforming, and will be performing their 1967 album “Sgt Pepper” live on tour”

OK, before dissecting the answers, I confess that one of these scenarios is totally made up – although, as we shall see, all of them have some basis in “reality”, and each of them presents a different dimension of “authenticity”.

1) Moon Landing: A few years ago, a documentary by William Karel called “The Dark Side of the Moon” suggested that NASA had indeed faked the Apollo 11 broadcasts. This story was based on an actual conspiracy theory that the TV images were a hoax, giving some credence to the notion that the Americans never went to the moon. The documentary uses a combination of recycled/re-contextualized archive footage and scripted interviews featuring real people playing themselves and professional actors playing fictional characters. To add credibility to the hoax theory that NASA commissioned Stanley Kubrick to shoot the fake moon landing in a studio, Karel involved Kubrick’s widow and other former colleagues. However, the names of the fictional characters are taken from characters in Kubrick’s own films. There are also bloopers and out-takes from the “interviews”. So, by the end of the film, it should be clear that the whole thing is a clever set-up – except that for some moon landing sceptics, “The Dark Side of the Moon” has lent support to their conspiracy theory. Recently, Gizmodo posted a brilliant rebuttal to the hoax theorists – namely, that neither NASA nor Kubrick could have faked the moon footage in 1969 because the required technology didn’t exist at that time… I guess we’ll call this one an authentic/fictional mockumentary based on a real/imagined conspiracy theory concerning an alleged/improbable hoax.

2) Beyonce: It was revealed that Beyonce lip-synced her rendition of the national anthem, but she was miming to a real recording that she made with the actual US Marine Band the day before. Apart from the ongoing debate about whether pop singers do/don’t or should/shouldn’t mime during live performances (and let’s not get into the use of pre-recorded backing tracks…), the issues here are three-fold:

a) Does it make any difference to our experience of the event? (Probably not – anyone who heard Meat Loaf perform at the AFL grand final a while back probably wishes he HAD been lip-syncing…)

b) Is Beyonce the first performer to mime at the Inauguration? No, and she won’t be the last, so big deal (Pre-recorded material is often used in these situations to compensate for bad weather, poor acoustics or possible technical hitches).

c) Does it make for a less authentic event? Possibly, but as others have pointed out, the President had taken the official oath the day before, and the outdoor event was more of a ceremony.

So, I’ll just label this an innocently staged event incorporating a well-intentioned fabrication designed to give the public what they want.

3) Jane Austen: This particular example of literary license does not involve the posthumously discovered work of a 19th century novelist. It doesn’t feature a 21st century medium channelling words from a dead writer. It doesn’t even concern the literary conceit of a contemporary author attempting to re-imagine a sequel to a classic work by an illustrious predecessor. (Although similar publishing events to all three scenarios have occurred in recent memory, so each of them is theoretically possible.) Instead, this refers to the forthcoming third “Bridget Jones” novel by Helen Fielding. It is generally acknowledged that Jane Austen’s “Pride and Prejudice” was a reference point for Fielding’s first novel “Bridget Jones’s Diary”. The latter is neither a pastiche nor a parody of Austen, but does use similar themes and scenarios from “Pride and Prejudice” and places them in a contemporary context. Given that Fielding has recently been quoted as saying she has an interest in internet dating, it’s not too far-fetched to suggest that her characters will be busily sexting each other after a long session in the local wine bar. Let’s put this one in the category of artistic homage, respectfully and authentically executed with due deference to its literary source material, and with a keen awareness of contemporary mores.

4) Sgt Pepper: OK, I admit that this scenario is totally fake, but it provides for some interesting hypotheses on how it might be done (assuming today’s technology, so no time-travel involved).

First, some background: as part of the music industry’s infatuation with managing and curating its back catalogue, there has been a noticeable trend for artists to tour and perform entire “classic” albums live on stage. This phenomenon reached its zenith last year, when German electronic band Kraftwerk performed a different album from their back catalogue at eight consecutive concerts staged in New York’s Museum of Modern Art. (Actually, I recall seeing Pink Floyd in 1977, when they played their two most recent albums, “Animals” (1977) and “Wish You Were Here” (1975), in full and in the exact same sequence as the original LPs…)

Second, until cryonics and human cloning are a scientific certainty, I won’t be suggesting that we exhume the two members of The Beatles who are no longer with us, or grow a couple of replicants. Equally, I’m not interested in whether the 2009 interactive video game, “The Beatles: Rock Band” allows me and my friends to re-enact the experience of being The Beatles playing on stage – it’s not the same as a live concert performance.

Instead, here’s how the “Sgt. Pepper” album might be brought to life:

a) The surviving members of The Beatles recruit some colleagues to make up the numbers (cf. The Who, Rolling Stones, etc.)

b) A Beatles tribute band is hired to recreate the album faithfully and in its entirety (cf. too many examples to mention – there’s even a new trend to recreate “classic” rock concerts on their relevant anniversary. Meanwhile, American band Devo formed a new “version” of themselves, called Devo 2.0, comprising young unknown musicians – tellingly, this venture was a collaboration with Disney)

c) Use holograms to substitute for the missing members of the original line-up (well, holograms aren’t yet viable, but ghostly projections are a possibility – cf. deceased rapper Tupac – and the music business has produced various examples of posthumous, exhumed and recreated material featuring dead pop stars, The Beatles included)

d) Send out a team of replica android Beatles to perform on stage (cf. Kraftwerk – again)

Except that the “real” Beatles abandoned live performances in 1966, and thus never performed any of the songs from “Sgt. Pepper” live in public (in fact, “Sgt. Pepper” is generally considered to be the first example of a rock album created totally within a studio environment, and never conceived of as a live experience, even though it is a loosely defined concept album featuring a fictional “live” band – how post-modern can you get?). Hence, any attempt to stage or recreate a live concert of “Sgt. Pepper” as performed by The Beatles, even if it is plausible, would have to be considered totally inauthentic. But with imagination (and a little help from our friends?) we can always dream…

POSTSCRIPT: After posting this article, I came across the following insight into the creative process by novelist William Boyd:

“…the best way to arrive at the truth is to lie – to invent, to fictionalize. The curious alchemy of art – rather than the diligent assembling of documentary fact – can be a swifter and more potent route to understanding and empathy than the most detailed photographs or the most compendious documentation. You have to do your homework, sure – authenticity has to be striven for – but in the end it is the fecundity and idiosyncrasy of the novelist’s imagination that will make the thing work – or not.” [Taken from Boyd’s anthology of non-fiction writing, “Bamboo” (2005)]
