Crown Court TV

Since studying Law at university, I sometimes wonder whether I’d ever get selected for Jury Service; surely the defence (or even the prosecution) would object to anyone who had more than a rudimentary knowledge of the law, because of the potential to influence the other members of the jury during their deliberations?

Apart from participating in a police identity parade (an extra-curricular activity of my Criminal Law course), and aside from representing a couple of clients at employment and social security tribunals (through voluntary work), my only involvement with court hearings has been to prepare case papers (take witness statements, issue summonses, draft client briefs) on behalf of local councils, and to appear as a witness in some of those proceedings.

I graduated in Law 40 years ago, and although I never intended to become a solicitor or barrister, I am still fascinated by the legal process, and by court proceedings themselves. Hence, I have something of a weakness for police procedurals and courtroom dramas on TV. Of course, not all courtroom proceedings are that riveting – out of curiosity, I once popped in to London’s Royal Courts of Justice, and was rather surprised to see a leading Judge appear to fall asleep during a case he was hearing…

One British TV series from the 1970s and 1980s, “Crown Court”, stands apart from its peers in the way it presented court cases in a realistic and non-sensational fashion. First, its somewhat dry approach to criminal court proceedings means that it tends to be less judgemental than more dramatic productions. Second, the focus on what happens within the courtroom itself means we get to see and hear only what is presented to the jury. There are no sidebars, no ex parte applications in judges’ chambers, and rarely any last-minute evidence or surprise witnesses. By removing the traditional drama, and presenting just the facts and the witnesses’ own evidence, we have only as much information about the case as the jury does in order to reach their verdict.

In some ways, “Crown Court” was a public information service. It was broadcast in the wake of significant changes in the Criminal Law system in England and Wales, and at a time of growing suspicion of police corruption (notably within the Met’s infamous Flying Squad). Also worth bearing in mind is the fact that TV cameras were not allowed into real court rooms, so it was a way to show the public how justice was being administered in their name, and what to expect should they have to appear in court, as defendant, witness or jury member.

The other fascinating aspect of “Crown Court” is the roll-call of actors, writers, directors and producers who subsequently became regulars on British TV. In that regard, it resembled an on-air repertory theatre, similar to the leading soap operas of the day, recalling an era of public broadcasting that has largely disappeared.

Next week: BYOB (Bring Your Own Brain)

Musical Idolatry

As a rebooted version of “Australian Idol” appears on network television, I can’t decide whether programs like this are a result of the current state of the music industry, or the cause of the industry’s malaise…

I’ll admit upfront that I know I’m not the target demographic for these shows (Idol, Voice, Talent…), so I’m not even going to comment on the quality of the musical content or the presentation format.

Before we had recorded music or broadcast radio, the industry relied upon song writers selling sheet music, in the hope their compositions would get performed in theatres and concert halls – and audiences would want to buy copies of the songs to perform at home.

Radio largely killed the music hall, and together with the advent of the 7″ vinyl record, it eventually displaced the reliance on sheet music sales. From the early 1960s onwards, we also saw more artists writing, performing and recording their own material, which transformed both music publishing and the record industry itself.

Although record labels still exist as a means to identify, develop and commercialise new talent, only three of the so-called major labels have survived – a process of industry consolidation and M&A activity that began in earnest in the 1980s – ironically, a period now regarded as a “Golden Age” of pop music.

A key legacy of the punk movement of the 1970s was a network of independent music labels, distributors, publishers and retailers – along with a strong DIY ethic of self-released records and independent fanzines, thanks to lower production costs and easier access to manufacturing and distribution.

Now, there is more new music being released than is humanly possible to listen to. It is relatively quick and simple to produce and release your own music – record on a home laptop (even a tablet or smart phone will do), upload the finished mp3 files to user-accessible platforms such as Bandcamp and SoundCloud, and promote yourself on social media. However, without significant marketing dollars to buy an audience, those hoping to become an overnight viral sensation may be disappointed. And even if you do manage to get traction on one of the global streaming platforms, the income from digital plays is a fraction of what artists used to earn from physical sales.

So that’s how the major labels (and some of the larger independents) still manage to dominate the industry: they have the budget to spend on developing new talent, and they have money for marketing campaigns (and possibly to influence those streaming algorithms). Plus, they have access to a huge back catalogue that they can carry on repackaging at a fraction of the original production costs.

It’s also true, however, that the shorter shelf-life of many newer artists means that labels don’t have such an appetite for long-term development plans, where they are willing to nurture a new talent for several years, before expecting a return on their initial investment. Just as with fast fashion, the pop music industry has become hooked on a fast turnover of product, because they know only a fraction of new releases will ever become a hit, and they have to keep feeding the beast with new content.

Which brings me back to programs like Idol. First, it’s one way for the music industry to fast-track their next success. Second, it literally is a popularity contest – the industry gets an idea of what the public likes, so they can pre-determine part of their release schedule. Third, hosting these contests on commercial TV means advertising dollars and sponsorship deals can help defray their A&R and marketing costs (or, at least help them to prioritise where to spend their money).

But let’s not pretend that these singing shows are anything more than televised karaoke. Performers don’t get to play their own songs, or even play any instruments (as far as I can tell). The program content relies on cover versions – usually songs that are well-known, and therefore already road-tested on the audience. Plus, by choosing to perform a particular song, a contestant may hope to win by association or identification with the successful artist who originally recorded it. But contestants are not free to choose whatever song they like – my understanding is there are only 1,000 (popular) songs to choose from, just like karaoke.

In pretending to discover new talent, the industry is, in part, simply hoping to re-release songs from its back catalogue, albeit with a new face on the record. Through the restrictive format of these programs, the industry is not discovering new musicians or finding new song writers and composers, and it’s certainly not forging any new direction in music, because of its reliance upon an existing formula, and its dependence on a very specific (and somewhat narrow) strand of pop music.

Next week: Eat The Rich?

Brexit Blues (Part II)

Brexit finally came into effect on January 31, 2020, with a transition period due to end on December 31, 2020. It’s still not clear whether key issues such as the post-Brexit trade agreement between the EU and the UK will be completed by then (a major talking point being imports of American chlorinated chicken…). Nor is it clear which other areas of EU laws and standards will survive post-transition. Both issues continue to cause uncertainty for British businesses and local governments that have to operate within and enforce many of these rules. Add to that the recent UK storms and floods, the post-Brexit air of racism and xenophobia, plus the coronavirus outbreak and the resulting drag on global markets and supply chains, and maybe the UK will run out of more than just pasta, yoghurt and chocolate. Perhaps those promised post-Brexit savings of £350m a week really will need to be spent on the National Health Service…

The “Vote Leave” campaign bus, 2016 (Image sourced from Bloomberg)

The seeds of the Brexit debacle were sown in David Cameron’s speech of January 23, 2013. As I wrote last year, that speech set in motion a series of flawed processes. Despite the protracted Brexit process, it’s now unlikely that the decision to leave will be reversed, especially as the opposition Labour Party has just been trounced at the polls. Instead, Labour continues to beat itself up over the failure of its outgoing leadership either to make a solid case in support of the Remain vote in the 2016 Referendum, or to establish and maintain a clear and coherent policy on Brexit right up to the December 2019 General Election. The Conservative Party under Boris Johnson has a huge Parliamentary majority, a fixed 5-year mandate, and a general disregard for traditional cabinet government and the delineation of roles between political advisors and civil servants. We have already seen that any form of dissent, or even an alternative perspective, will not be tolerated within government or within the Tory party, let alone from independent and non-partisan quarters.

Since that fateful speech of January 2013, it’s possible to follow a Brexit-related narrative thread in film, TV and fiction. Not all of these accounts are directly about Brexit itself, but when viewed in a wider context, they touch on associated themes of national identity, democracy, political debate, public discourse, xenophobia, anti-elitism, anti-globalism, and broader popular culture.

The earliest such example I can recall is Brian Aldiss’s final novel, “Comfort Zone” (published in December 2013), while the first truly “Brexit Novel” is probably Jonathan Coe’s “Middle England” (November 2018). Somewhat to be expected, political thrillers and spy novels have also touched on these themes – Andrew Marr’s “Children of the Master” (September 2015, and probably still essential reading for Labour’s current leadership candidates); John le Carré’s “A Legacy of Spies” (September 2017); John Simpson’s “Moscow, Midnight” (October 2018); and John Lanchester’s “The Wall” (January 2019). (For another intriguing and contemporary literary context, I highly recommend William Gibson’s introduction to the May 2013 edition of Kingsley Amis’s “The Alteration”. Plus there’s an essay on the outgoing Labour leader in Amis junior’s collection of non-fiction, “The Rub of Time”, published in October 2017.)*

Elsewhere there have been TV dramatisations to remind us how significant, important and forward-looking it was when the UK joined the EEC in 1973 – most notably the chronicling of the Wilson and Heath governments as portrayed in “The Crown”. Even a film like “Darkest Hour” reveals the love-hate relationship Britain has had with Europe. More distant historical context can be seen in films like “All is True” and “Peterloo”.

No doubt, Brexit will continue to form a backdrop for many a story-teller and film-maker for years to come. And we will inevitably see recent political events re-told and dramatised in future documentaries and dramas. Hopefully, we will be able to view them objectively and gain some new perspective as a result. Meanwhile, the current reality makes it too depressing to contemplate something like “Boris Johnson – Brexit Belongs to Me!”

*Postscript: hot off the press, of course, is “Agency”, William Gibson’s own alternative reality (combining elements of the “Time Romance” and “Counterfeit World” referenced in “The Alteration”) – I haven’t read it yet, but I’m looking forward (!) to doing so…

Next week: Joy Division and 40+ years of Post-Punk

Blipverts vs the Attention Economy

There’s a scene in Nicolas Roeg’s 1976 film, “The Man Who Fell To Earth”, where David Bowie’s character sits watching a bank of TV screens, each tuned to a different station. At the same time he is channel surfing – either because his alien powers allow him to absorb multiple, simultaneous inputs, or because his experience of ennui on Earth leads him to seek more and more stimulus. Obviously a metaphor for the attention economy, long before such a term existed.

Watching the alien watching us… Image sourced from Flickr

At the time in the UK, we only had three TV channels to choose from, so the notion of 12 or more seemed exotic, even otherworldly. And of those three channels, only one carried advertising. Much the same situation existed in British radio, with only one or two commercial networks alongside the dominant BBC. So we had relatively little exposure to adverts, brand sponsorship or paid content in our broadcast media. (Mind you, this was still the era when tobacco companies could plaster their logos all over sporting events…)

For all its limitations, there were several virtues to this model. First, advertising airtime was at a premium (thanks to the broadcast content ratios), and ad spend was concentrated – so adverts really had to grab your attention. (Is it any wonder that so many successful film directors cut their teeth on commercials?) Second, this built-in monopoly often meant bigger TV production budgets, more variety of content and better quality programming on free-to-air networks than we typically see today with the over-reliance on so-called reality TV. Third, with less viewing choice, there was a greater shared experience among audiences – and more communal connection because we could talk about similar things.

Then along came cable and satellite networks, bringing more choice (and more advertising), but not necessarily better quality content. In fact, with TV advertising budgets spread more thinly, it’s not surprising that programming suffered. Networks had to compete for our attention, and they funded this by bombarding us with more ads and more paid content. (And this is before we even get to the internet age and time-shift, streaming and multicast platforms…)

Despite the increased viewing choices, broadcasting became narrow-casting – smaller and more fractured viewership, with programming appealing to niche audiences. Meanwhile, in the mid-80s (and soon after the launch of MTV), “Max Headroom” is credited with coining the term “blipvert”, meaning a very, very short (almost subliminal) television commercial. Although designed as a narrative device in the Max Headroom story, the blipvert can be seen as either a test of creativity (how to get your message across in minimal time); or a subversive propaganda technique (nefarious elements trying to sabotage your thinking through subtle suggestion and infiltration).

Which is essentially where we are in the attention economy. Audiences are increasingly disparate, and the battle for eyeballs (and minds) is being fought out across multiple devices, multiple screens, and multiple formats. In our search for more stimulation, and unless we are willing to pay for premium services and/or an ad-free experience, we are having to endure more ads popping up during our YouTube viewing, Spotify streaming or internet browsing. As a result, brands are trying to grab our attention at increasing frequency, and for shorter, yet more rapid and intensive periods. (Even Words With Friends is offering in-game tokens in return for watching sponsored content.)

Some consumers are responding with ad-blockers, or by dropping their use of social media altogether; or they want payment for their valuable time. I think we are generally over the notion of giving away our personal data in return for some “free” services – the price in terms of intrusions upon our privacy is no longer worth paying. So, brands are having to try harder to capture our attention, and they need to personalise their message to make it seem relevant and worthy of our time – provided we are willing to let them know enough about our preferences, location, demographics, etc., so that they can serve up relevant and engaging content to each and every “audience of one”. And brands also want proof that the ads they have paid for have been seen by the people they intended to reach.

This delicate trade-off (between privacy, personalisation and payment) is one reason why the attention economy is seen as a prime use case for Blockchain and cryptocurrency: consumers can retain anonymity, while still sharing selected personal information (which they own and control) with whom they wish, when they wish, for as long as they wish, and they can even get paid to access relevant content; brands can receive confirmation that the personalised content they have paid for has been consumed by the people they intended to see it; and distributed ledgers can maintain a record of account and send/receive payments via smart contracts and digital wallets when and where the relevant transactions have taken place.
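For the technically curious, the flow described above can be sketched as a toy ledger. This is a minimal, hypothetical illustration in Python (not based on any real blockchain platform – the class and field names are invented): the viewer discloses only the attributes they choose, the brand receives a hashed proof-of-view standing in for a smart-contract confirmation, and the ledger accumulates the micro-payments owed to the viewer.

```python
import hashlib
import time
from dataclasses import dataclass, field

# Hypothetical sketch of an "attention ledger": a viewer shares only selected
# attributes, a brand gets a verifiable receipt that its ad was viewed, and a
# micro-payment to the viewer is recorded. All names here are invented.

@dataclass
class AttentionLedger:
    entries: list = field(default_factory=list)

    def record_view(self, viewer_id: str, ad_id: str,
                    disclosed: dict, payment: float) -> str:
        # Proof-of-view: a hash of viewer, ad and timestamp stands in for the
        # confirmation a brand would receive from a smart contract.
        ts = time.time()
        receipt = hashlib.sha256(f"{viewer_id}:{ad_id}:{ts}".encode()).hexdigest()
        self.entries.append({
            "receipt": receipt,
            "ad_id": ad_id,
            "disclosed": disclosed,   # only what the viewer chose to share
            "payment": payment,       # micro-payment owed to the viewer
        })
        return receipt

    def total_owed(self) -> float:
        # Sum of all micro-payments recorded so far.
        return sum(e["payment"] for e in self.entries)

ledger = AttentionLedger()
receipt = ledger.record_view("viewer-01", "ad-42",
                             disclosed={"country": "AU"}, payment=0.002)
print(len(receipt), ledger.total_owed())  # prints "64 0.002"
```

In a real deployment the receipt and payment would live on a distributed ledger and be settled via a digital wallet; here a simple in-memory list stands in for both.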

Next week: Jump-cut videos vs Slow TV