Pop in Perpetuity

Exactly a year ago, I blogged about ageing rockers and their propensity to continue touring and recording. This past weekend I experienced two events that offered almost polar-opposite approaches to how musicians can perpetuate their “live” legacy. (Of course, in theory, their recordings will last forever, in physical, digital and streaming formats – as long as the equipment, technology and platforms survive…)

On the one hand, there was the Sun Ra Arkestra, who, since their founder’s death in 1993, have continued to play the music of Sun Ra, respecting the sound, format and spirit of the original band formed in the 1950s. Some of the current band members played with Sun Ra himself, so there is a thread of continuity that connects us back to the past. But even as these surviving members depart this world, the music of Sun Ra will live on in concert form through subsequent generations of players. This type of perpetuity is not uncommon among bands of the 60s, 70s and 80s, although usually there is at least one original band member performing, or members who overlapped with the band’s founders. (Some notable exceptions: Soft Machine, who continue performing and recording, but whose last original member left nearly 50 years ago; and Faust, who split into at least two separate bands that still tour and record under the same name.)

On the other hand, there was the high-tech concert presentation by the late composer and performer Ryuichi Sakamoto, entitled KAGAMI. This involved the use of AR headsets and a 3D avatar of Sakamoto, captured in sound and vision performing a selection of his music at a grand piano. The audience, initially seated in a circle around the virtual performance area in order to acclimatise to what they were seeing, was invited to move around the avatar, and even peer into the open grand piano. Two things were striking: first, the 360-degree image was very impressive in its level of detail; second, even if someone was standing between the viewer and the avatar zone, the headset still presented the image of Sakamoto seated at the keyboard. The technology not only captures a digital visualisation of the pianist in action, it also reproduces the notes he played, as well as the tonal expression, timbres, resonances and acoustics of the physical instrument. While the hi-fi audio was superior to the somewhat dated CGI, the latter will no doubt improve, as will the slightly clunky and heavy headsets – the 50-minute duration is probably the most I could have endured.

Neither of these concert formats is superior to the other. Both are authentic in their own way, and true to the artistry of the musicians they celebrate. Of course, if we end up using AI to compose “new” music by Sakamoto, that may undermine that authenticity. But given Sun Ra’s origin story, I wouldn’t be surprised if he started beaming his new works from Saturn.

Ticket scalpers? Blockchain could fix that!

Music fans of a certain age and demographic have been complaining loudly about the use of “dynamic pricing” when trying to buy tickets for their favourite band’s highly anticipated reunion tour. (There must be a pun in there about “Don’t book online in anger”?)

Part of the rationale given for using a demand-based pricing system is to disincentivise scalpers. The higher the cost of the ticket in the primary market (not the same as the ticket’s face value), the smaller the potential mark-up in the secondary market. Except that some tickets with a face value of $150 were priced at $450 at the box office, only to be re-advertised in the secondary market for several thousand dollars. In other words, the touts have simply increased their margins, in response to the so-called dynamic pricing mechanism.

Without offering any sort of apology or mea culpa, the said band have now announced additional tour dates, tickets for which will be allocated and sold via a form of ballot. Stop me if you think I’m being cynical, but the speed with which dates were added to an existing tour itinerary suggests that the band knew there would be excess demand, because it’s not that easy to reserve major (and highly profitable) venues, even 12 months in advance. And if they can run a ballot system now, why couldn’t they have done so in the first place?

All of which simply shows how out of touch bands like this are with technology and market dynamics. In short, ticket sales and allocations could have been achieved far more equitably if the band and their promoters had chosen to use blockchain, crypto and web3.0 solutions.

Here’s a simple list of options that could have been used:

1. Issue all tickets as NFTs (non-fungible tokens)

2. Limit the number of tickets per digital wallet and/or the number of wallets per ticket buyer

3. Ensure the use of soul-bound tokens to link wallet ownership and ID to specific individuals (to limit the number of tickets per wallet, and to limit the resale of tickets)

4. Run social media campaigns, quests and airdrops to allocate and distribute tokens that entitle holders to a place in the ticket queue – e.g., the more active a wallet holder is in the band’s fan community, the higher their chance of securing a priority place in the ticket queue

5. Pre-publish the expected ticket price ranges, and enable wallet holders to vote on the minimum/maximum price they would be willing to pay (using something like Snapshot)

6. Cap the amount an NFT-based ticket can be sold for in the secondary market, or write the token smart contract to allocate a percentage of the resale value as a commission to the ticket issuer (see the illustrative sketch after this list)
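
To make the last couple of options a little more concrete, here is a rough sketch of the kind of rules a ticketing contract could encode: a per-wallet purchase cap (option 2), plus a resale price cap and an issuer’s commission on any resale (option 6). It is written in plain Python purely for illustration, not as an actual on-chain contract, and all the names (TicketContract, MAX_PER_WALLET, RESALE_CAP_MULTIPLE, ISSUER_ROYALTY) are invented for this example.

```python
# Illustrative sketch only - not a real smart contract. All names here
# (TicketContract, MAX_PER_WALLET, RESALE_CAP_MULTIPLE, ISSUER_ROYALTY)
# are invented for this example; an on-chain version would typically be
# written in a contract language such as Solidity.

class TicketContract:
    MAX_PER_WALLET = 4          # option 2: cap on tickets per digital wallet
    RESALE_CAP_MULTIPLE = 1.1   # option 6: resale capped at 110% of face value
    ISSUER_ROYALTY = 0.10       # option 6: 10% of any resale goes to the issuer

    def __init__(self, face_value: float):
        self.face_value = face_value
        self.owners: dict[str, int] = {}  # wallet address -> tickets held

    def mint(self, wallet: str, quantity: int) -> None:
        """Primary sale: issue tickets as tokens, enforcing the wallet cap."""
        held = self.owners.get(wallet, 0)
        if held + quantity > self.MAX_PER_WALLET:
            raise ValueError("wallet cap exceeded")
        self.owners[wallet] = held + quantity

    def resell(self, seller: str, buyer: str, price: float) -> float:
        """Secondary sale: enforce the price cap and return the issuer's cut."""
        if self.owners.get(seller, 0) < 1:
            raise ValueError("seller holds no ticket")
        if price > self.face_value * self.RESALE_CAP_MULTIPLE:
            raise ValueError("resale price exceeds the permitted cap")
        if self.owners.get(buyer, 0) + 1 > self.MAX_PER_WALLET:
            raise ValueError("buyer's wallet cap exceeded")
        self.owners[seller] -= 1
        self.owners[buyer] = self.owners.get(buyer, 0) + 1
        return price * self.ISSUER_ROYALTY  # commission remitted to the issuer
```

The point is simply that these constraints can be enforced in code at the moment of transfer, rather than relying on the goodwill of the box office or the secondary market.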

Of course, the UK competition regulators are taking a close look at this ticketing fiasco, to see if so-called dynamic pricing breached fair trading or other consumer protection laws. If punters were not aware that they might have to pay far more than the advertised or face value of a ticket, this would appear to be unfair and unconscionable conduct. It’s potentially a form of under-quoting – advertise the ticket at an artificially low price, then force buyers to pay well over the face value at the actual point of sale (under the guise of “market demand”), knowing full well that the fans had little or no choice in the matter.

One final thought – knowing the volatile history of this band, the chances are that the concerts (or at least some of them) may be cancelled. Hopefully, the ticket agent and box office operators won’t be counting the advance ticket sales as recognised revenue; rather, they will be required to hold the funds in a verified escrow account until the performances are delivered and the ticket revenue is actually earned… (again, something that could easily be factored into a smart contract – no release of funds until the loud-mouth sings?).
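
For what it’s worth, the escrow idea can also be expressed as a few lines of programmable logic. Again, this is only an illustrative Python sketch, not a real smart contract – the names (TicketEscrow, deposit, confirm_performance, release, refund) are invented for the example, and in practice the “performance delivered” signal would have to come from an agreed oracle or attestation.

```python
# Illustrative sketch only - an escrow rule of the kind described above,
# written in plain Python rather than as a real smart contract. All names
# (TicketEscrow, deposit, confirm_performance, release, refund) are invented.

class TicketEscrow:
    def __init__(self):
        self.held: dict[str, float] = {}  # concert id -> funds held in escrow
        self.performed: set[str] = set()  # concerts confirmed as delivered

    def deposit(self, concert_id: str, amount: float) -> None:
        """Advance ticket money goes into escrow, not straight to revenue."""
        self.held[concert_id] = self.held.get(concert_id, 0.0) + amount

    def confirm_performance(self, concert_id: str) -> None:
        """An agreed oracle or attestation marks the show as having taken place."""
        self.performed.add(concert_id)

    def release(self, concert_id: str) -> float:
        """Funds flow to the promoter only once the show has been delivered."""
        if concert_id not in self.performed:
            raise RuntimeError("no release of funds until the loud-mouth sings")
        return self.held.pop(concert_id, 0.0)

    def refund(self, concert_id: str) -> float:
        """If the concert is cancelled, escrowed funds go back to ticket holders."""
        if concert_id in self.performed:
            raise RuntimeError("show already delivered; nothing to refund")
        return self.held.pop(concert_id, 0.0)
```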

Next week: Cooking the books?

Album Celebrations

When the first 12″ long-playing vinyl record was issued in 1948, did any record label expect that the format would still be in use nearly 80 years later? The death of the 33rpm disc has been predicted many times, based on industry events and cultural trends that were expected to render vinyl albums obsolete. Music cassettes, CDs, MiniDiscs, mp3s, 7″ 45rpm singles, home taping, downloads and streaming were all seen as existential threats to the album. Yet, despite reaching near extinction in the 1990s, vinyl albums (both new releases and back catalogue) are currently enjoying something of a revival.

This resurgence of interest in albums can be attributed to several factors: baby boomers reliving their youth; Gen X/Y/Z watching shows like “Stranger Things”; the box set, reissue and collector market; retro fashion trends; and a desire for all things analogue, tactile and physical (in contrast to the vapidity of streaming…).

Streaming has definitely changed the way many people listen to music, to the extent that albums have become deconstructed and fragmented thanks to shuffle, algorithms, recommender engines, playlists and a focus on one-off songs and collaborations by today’s popular artists. By contrast, most albums represent a considered and coherent piece of work: a selection of tracks designed and sequenced to be heard in a specific order, reflecting the artist’s creative intention or narrative structure. Streaming means that the artist’s work is being intermediated in a way that was not intended. You wouldn’t expect a novel, play or film to be presented in any old order – the author/playwright/director expects us to experience the work as they planned. (OK, so there are some notable examples that challenge this convention, such as B. S. Johnson’s novel “The Unfortunates” or the recent “Eno” documentary.)

Thankfully, classic albums are now being celebrated for their longevity, with significant anniversaries of an album’s release warranting deluxe reissues and live tours. This past weekend I went to two such events. The first was a concert by Black Cab, marking 10 years since the release of their album “Games Of The XXI Olympiad”. Appropriately, the show fell on the same day as the opening of the Paris Olympics, and the band started with a brief version of “Fanfare for the Common Man”. The second was part of the 30th anniversary tour for “Dream It Down”, the third album by the Underground Lovers. As well as getting most of the original band members back together, the concert also featured Amanda Brown, formerly of The Go-Betweens, who played on the album itself. (Also on stage was original percussionist Derek Yuen, whose day job is designing shoes for the Australian Olympic team…)

It’s hard to imagine we will be celebrating the date when an artist first dropped a stream on Spotify….!

[This year also marks the 40th anniversary of the release of “Pink Frost”, the break-through single by The Chills, New Zealand’s finest musical export. So it was sad to read of the recent passing of their founder, Martin Phillipps. The Chills were one of many Antipodean bands that always seemed to be playing in London in the late 1980s, often to much larger audiences than they enjoyed at home. Their classic early singles and EPs are once again available on vinyl. Do yourself a favour, as someone once said!]

Next week: A postscript on AI

Some final thoughts on AI

Last week, I attended a talk by the musical polymath Jim O’Rourke on the Serge Paperface modular synthesizer. It was part memoir, part demonstration, and part philosophy tutorial. At its heart, the Serge is an outwardly human-controlled electronic instrument, incorporating any number and combination of processors, switches, circuits, rheostats, filters, voltage controllers and patch cables. These circuits take their lead from the operator’s initial instructions, but then use that data (voltage values) to generate and manipulate sound. As the sound evolves, the “composition” takes on the appearance of a neural network as the signal is re-patched to and from each component, sometimes with random and unexpected results – rather like our own thought patterns.
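
For anyone who has never seen a modular system, the following toy sketch (plain Python, emphatically not an emulation of the Serge – all the module names are invented) tries to capture the idea: each module transforms a control voltage, the “patch” is simply the order of the cables, and a single random source is enough to make the end result hard to predict.

```python
# A toy sketch of the patching idea - emphatically not an emulation of the
# Serge. Each "module" maps an input voltage to an output voltage, and a
# patch is just the ordering of the cables (here, a list of functions).
import math
import random

def oscillator(voltage: float) -> float:
    """A crude 'oscillator': output depends on the incoming control voltage."""
    return math.sin(2 * math.pi * voltage)

def filter_module(voltage: float) -> float:
    """A crude 'filter': attenuate the signal."""
    return voltage * 0.5

def random_source(voltage: float) -> float:
    """A random source that nudges the voltage unpredictably."""
    return voltage + random.uniform(-0.2, 0.2)

# Re-ordering this list is the equivalent of re-patching the cables;
# the same starting voltage can produce very different results.
patch = [oscillator, random_source, filter_module, oscillator]

signal = 0.3  # the operator's initial instruction (a control voltage)
for module in patch:
    signal = module(signal)
print(signal)
```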

But the Serge is not an example of Artificial Intelligence, despite its ability to process multiple data points (both sequentially and in parallel) and notwithstanding its level of unpredictability. On the other hand, that unpredictability may make it more “human” than AI.

My reasons for using the Serge as the beginning of this concluding blog on AI are three-fold:

First, these modular synthesizers only became viable with the availability of transistors and integrated circuits that replaced the valves of old, just as today’s portable computers rely on silicon chips and microprocessors. Likewise, although some elements of AI have been around for decades, the exponential rise of mobile devices, the internet, cloud computing and social media has allowed AI to ride on the back of their growth and into our lives.

Second, O’Rourke referred to the Serge as being “a way of life”, in that it leads users to think differently about music, to adopt an open mind towards the notion of composition, and to experiment knowing the results will be unpredictable, even unstable. In other words, suspend all pre-conceptions and embrace its whims (even surrender to its charms). Which is what many optimists would have us do with AI – although I think there are still too many concerns (and too much potential for great harm) for us to get fully comfortable with what AI is doing, even if much of it may actually be positive and beneficial. At least the Serge can be turned off with the flick of a switch if things get out of hand.

Third, as part of his presentation O’Rourke made reference to Steven Levy’s book, “Artificial Life”, published some 30 years ago. In fact, he cited it almost as a counterpoint to AI, in that Levy was exploring the interface between biological life and digital DNA in a pre-internet era, yet his thesis is arguably even more relevant now that AI neural networks have become a reality.

So, where do I think we are in the evolution of AI? A number of clichés come to mind: the genie is already out of the bottle, and like King Canute we can’t turn back the tide, but like the Sorcerer’s Apprentice maybe we shouldn’t meddle with something we don’t understand. I still believe that the risks associated with deep fakes, AI hallucinations and other factual errors – which will inevitably be repeated and replicated without a thought to correct the record – represent a major concern. I also think more transparency is needed as to how LLMs are built and the data they are trained on, as well as disclosure of when AI is actually being deployed and what content has been used to generate the results. Issues of copyright theft and IP infringement are probably manageable with a combination of technology, industry goodwill and legal common sense. Subject to those legal clarifications, questions about what is “real” or original and what is “fake” or artificial in terms of creativity will probably come down to personal taste and aesthetics. But expect to see lots of disputes in the field of arts and entertainment when it comes to annual awards and recognition for creativity and originality!

At times, I can see that AI is simply a combination of mega databases, powerful search engines, predictive tools, programmable logic, smart decision trees and pattern recognition on steroids, all aided by high-speed computer processing and widespread data distribution. At other times, it feels like we are all being made the subject matter or inputs of AI (it is happening “to” us, rather than working for us), and in return we get a mix of computer-generated outputs with a high dose of AI “dramatic licence”.

My over-arching conclusion at this point in the AI journey is that it resembles GMO crops – unless you live off-grid and all your computers are air-gapped, every device, network and database you interact with has been trained on, touched by or tainted with AI. It’s inevitable and unavoidable.

Next week: RWAs and the next phase of tokenisation