Back in the USA

Happy Independence Day!!! This post has turned out to be quite timely, as I’ve just come back from a trip to the USA, following a hiatus of 4 years. Two weeks is hardly enough time for a full evaluation, and I spent most of the time in Colorado and New Mexico, with a few days in San Francisco at the end – but it was enough to gain a few significant impressions.

I hadn’t known what to expect, in a post-Trump, post-pandemic and post-Roe v. Wade landscape. Nowhere on my (limited) itinerary would be considered MAGA territory, so it was difficult to get a balanced perspective. If anything, my experiences simply confirmed that America remains a complex, at times contradictory, and very often deeply divided society. And, just like the Presidential electoral process, it remains perplexing to outsiders.

For example, I’d been warned about the fentanyl zombies, violent crime and homeless camps on the streets of San Francisco. Even friends who are long-term residents of the city warned that the Downtown area around Union Square is “a bit rough”. Since I had planned to stay just south of Union Square, I must admit to some apprehension before I left Australia.

Another San Francisco resident I contacted before my trip had complained that: “[T]he news has done a number on San Francisco. While we have the same homeless problems as other large cities, it’s not as bad as it’s made out in the news. We are facing some impacts from companies buying at the height of the market and now backing out, which is why you are seeing some bankruptcies (and there will be [a] few more). But overall, the city is doing well.”

On the other hand, the guide on my day-long walking tour explained that, of the 62 major cities in the USA, San Francisco has been the slowest to recover from the disruption of the pandemic. They also felt that California, and San Francisco in particular, had not done a very good job of implementing the legalisation of recreational cannabis use. Because even licensed businesses have trouble getting banked, all that cash in the system is a target for criminal activity.

In the event, my stay in San Francisco passed without incident. Yes, there was plenty of evidence of homelessness, drug and mental health issues, and that’s without having to venture into the Tenderloin district. However, there were also plenty of domestic and overseas tourists visiting the city. True, the main business district felt much quieter than on my previous visits, as people continue working from home – some companies have moved out and shops have closed down as a result. But elsewhere, the city felt normal, with people in their local neighbourhoods going about their daily routines. I was surprised to see so many people wearing face masks, but given that a number of public testing facilities were still operating in parts of the city, it suggests that COVID is still prevalent.

Prior to San Francisco, I had spent time in Denver, Boulder and Santa Fe. Admittedly I was there on a weekend, but Denver’s Downtown area felt very quiet and hollowed out. Even the 16th Street Mall lacked vibrancy (maybe the major street works were a factor?), although there were more signs of life around the River North Arts District, Coors Field and of course Ball Arena, as the Denver Nuggets took out their first NBA title.

The City of Boulder is a curious blend of old (gold rush) money, new (tech-flavoured) money, progressive politics (rainbow flags everywhere) and counter-culture lifestyles (“do you want CBD sprinkles on your espresso?”). I was treated to a ticket to see ’90s country star Mary Chapin Carpenter playing at the historic Chautauqua Auditorium – an artist experiencing something of a comeback, and one who manages to express liberal and inclusive values via a very conservative musical format. The support act was a musician whom I’d never heard of before, and who sang with a typical American country music twang – but when she spoke, she revealed herself to be Australian, and promptly played a song about the Melbourne streets of Fitzroy and Collingwood (this was also the song that was adapted as the theme to the “Wallander” TV series…). Separately, I was invited to dinner at Flagstaff House, one of Boulder’s best restaurants (with spectacular views). As a bonus, Boulder’s Pearl Street hosts an excellent record store, a couple of decent book shops, and cafes where it’s possible to get both a coffee and a glass of wine at 5pm…

Higher up and further south, Santa Fe was something of a revelation. Not knowing what to expect, I thoroughly enjoyed my few days there: from a walking tour of the historic town centre, to sampling the wines of Gruet and D.H. Lescombes; from the numerous art galleries and museums to the Sunset Serenade of the Sky Railway; from the adobe-inspired architecture to the excellent food served everywhere. I learned more about American history in the 3-hour guided tour than in a whole year of history lessons at my high school in England. The latter had largely focused on the events leading up to the American War of Independence (and was taught mainly from a British perspective, of course). The contemporary walking tour, by contrast, provided a longer and more complex narrative that covered the key phases of New Mexico’s history: First Nations settlement, Spanish conquest, US annexation, Civil War intervention, and finally Statehood in 1912. Oh, and the plot to assassinate Trotsky and the nuclear tests of the Manhattan Project along the way. (NB – the Santa Fe tourism app was an invaluable guide to planning my itinerary.)

As much as I enjoy spending time in America, I can’t help observing that for what is ostensibly a secular country, religion plays a dominant, and at times domineering, role in political and public affairs – starting with the Federal motto of “In God We Trust”. I can’t understand why a Constitution and Bill of Rights that dis-established the Anglican Church (thereby separating Church from State), and which enshrine both the right to practise a religion and the freedom to adhere to no religion, have allowed certain religious tenets to impinge upon the rights and freedoms of others. A century ago it was the Temperance movement, more recently it was the Pro-Life camp, and now a range of issues (human evolution, flat-earth theories, gender diversity, sexual orientation, critical race theory, etc.) have many conservatives and fundamentalists working in league to dictate the public debate and constrain freedom of expression on such topics – meanwhile, it seems impossible to have a reasoned and mature discussion about gun control in the USA. Go figure!

Next week: Music streaming is so passé…

“The Digital Director”

Last year, the Australian Institute of Company Directors (AICD) ran a series of 10 webinars under the umbrella of “The Digital Director”. Despite the title, there was very little exploration of “digital” technology itself, but a great deal of discussion on how to manage IT within the traditional corporate structure – as between the board of directors, the management, and the workforce.

There was much debate on things like “digital mindset”, “digital adaptation and adoption”, and “digital innovation and evolution”. During one webinar, the audience were encouraged to avoid using the term “digital transformation” (instead, think “digital economy”) – yet 2 of the 10 sessions had “digital transformation” in the title.

Specific technical topics were mainly confined to AI, data privacy, data governance and cyber security. It was acknowledged that while corporate Australia has widely adopted SaaS solutions, it lacks depth in digital skills; and the percentage of the ASX market capitalisation attributable to IP assets shows we are “30 years behind the USA”. There was specific mention of blockchain technology, but the two examples given are already obsolete (the ASX’s abandoned project to replace the CHESS system, and CBA’s indefinitely deferred roll-out of crypto assets on their mobile banking app).

Often, the discussion was more about change management, and dealing with the demands of “modern work” from a workforce whose expectations have changed greatly in recent years, thanks to the pandemic, remote working, and access to new technology. Yet these themes have been with us ever since the first office productivity tools, the arrival of the internet, and the proliferation of mobile devices that blur the boundary between “work” and “personal”.

The series missed an opportunity to explore the impact of new technology on boards themselves, especially their decision-making processes. We have seen how the ICO (initial coin offering) phase of cryptocurrency markets in 2017-19 presented a wholly new dimension to the funding of start-up ventures; and how blockchain technology and smart contracts heralded a new form of corporate entity, the DAO (decentralised autonomous organisation).

Together, these innovations mean the formation and governance of companies will no longer rely on the traditional structure of shareholders, directors and executives – and as a consequence, board decision-making will also take a different form. Imagine being able to use AI tools to support strategic planning, proof-of-stake mechanisms to vote on board resolutions, or consensus protocols to conduct AGMs.
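
As a purely hypothetical illustration (not a description of any existing platform or product), here is a minimal Python sketch of how stake-weighted voting on a board resolution might work – the member names, token balances and quorum threshold are all invented for the example.

    # Hypothetical sketch of stake-weighted (proof-of-stake style) voting
    # on a board resolution; all names and balances are invented examples.
    from dataclasses import dataclass, field

    @dataclass
    class Resolution:
        title: str
        votes_for: float = 0.0
        votes_against: float = 0.0
        voted: set = field(default_factory=set)   # members who have voted

    class StakeVoting:
        def __init__(self, balances, quorum=0.5):
            self.balances = balances               # member -> token stake
            self.total = sum(balances.values())
            self.quorum = quorum                   # fraction of stake that must vote

        def vote(self, res, member, in_favour):
            if member in res.voted:
                raise ValueError(f"{member} has already voted")
            stake = self.balances.get(member, 0.0)
            if in_favour:
                res.votes_for += stake
            else:
                res.votes_against += stake
            res.voted.add(member)

        def result(self, res):
            turnout = (res.votes_for + res.votes_against) / self.total
            if turnout < self.quorum:
                return "no quorum"
            return "carried" if res.votes_for > res.votes_against else "defeated"

    # Example: votes are weighted by stake, not counted per head.
    voting = StakeVoting({"alice": 600, "bob": 250, "carol": 150})
    res = Resolution("Approve FY24 budget")
    voting.vote(res, "alice", True)
    voting.vote(res, "bob", False)
    print(voting.result(res))                      # "carried" – 600 for vs 250 against

In a real on-chain implementation the balances would come from a token ledger and the tally from a smart contract, but the principle – weighting votes by stake rather than one-member-one-vote – is the same.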

As of now, “Digital Directors” need to understand how these emerging technologies will disrupt the boardroom itself, as well as the very corporate structures and governance frameworks that have been in place for over 400 years.

Next week: Back in the USA

BYOB (Bring Your Own Brain)

My Twitter and LinkedIn feeds are full of posts about artificial intelligence, machine learning, large language models, robotics and automation – and how these technologies will impact our jobs and our employment prospects, often in very dystopian tones. It can be quite depressing to trawl through this material, to the point of being overwhelmed by the imminent prospect of human obsolescence.

No doubt, getting to grips with these tools will be important if we are to navigate the future of work, understand the relationship between labour, capital and technology, and maintain economic relevance in a world of changing employment models.

But we have been here before, many times (remember the Luddites?), and so far, the human condition means we learn to adapt in order to survive. These transitions will be painful, and there will be casualties along the way, but there is cause for optimism if we remember our post-industrial history.

First, among recent Twitter posts there was a timely reminder that automation need not equal despair in the face of displaced jobs.

Second, the technology at our disposal will inevitably make us more productive, as well as reducing mundane or repetitive tasks and even freeing up more time for other (more creative) pursuits. The challenge will be in learning how to use these tools efficiently and effectively, so that we don’t swap one type of routine for another.

Third, there is still a need to consider the human factor when it comes to the work environment, business structures and organisational behaviour – not least personal interaction, communication skills and stakeholder management. After all, you still need someone to switch on the machines, and tell them what to do!

Fourth, the evolution of “bring your own device” (and remote working) means that many of us have grown accustomed to having a degree of autonomy in the ways in which we organise our time and schedule our tasks – giving us the potential for more flexible working conditions. Plus, we have seen how many apps we use at home are interchangeable with the tools we use for work – and although the risk is that we are “always on”, equally, we can get smarter at using these same technologies to establish boundaries between our work/life environments.

Fifth, all the technology in the world is not going to absolve us of the need to think for ourselves. We still need to bring our own cognitive faculties and critical thinking to an increasingly automated, AI-intermediated and virtual world. If anything, we have to ramp up our cerebral powers so that we don’t become subservient to the tech, to make sure the tech works for us (and not the other way around).

Adopting a new approach means:

  • not taking the tech for granted
  • being prepared to challenge the tech’s assumptions (and not being complicit in its in-built biases)
  • questioning the motives and intentions of the tech developers, managers and owners (especially those of known or suspected bad actors)
  • validating all the newly-available data to gain new insights (not repeat past mistakes)
  • evaluating the evidence based on actual events and outcomes
  • and not falling prey to hyperbolic and cataclysmic conjectures

Finally, it is interesting to note the recent debates on regulating this new tech – curtailing malign forces, maintaining protections on personal privacy, increasing data security, and ensuring greater access for those currently excluded. This is all part of a conscious narrative (that human component!) to limit the extent to which AI will be allowed to run rampant, and to hold tech (in all its forms) more accountable for the consequences of its actions.

Next week: “The Digital Director”

Crown Court TV

Ever since studying Law at university, I have sometimes wondered whether I’d ever get selected for Jury Service; surely the defence (or even the prosecution) would object to anyone with more than a rudimentary knowledge of the law, because of their potential to influence the other members of the jury during deliberations?

Apart from participating in a police identity parade (an extra-curricular activity of my Criminal Law course), and aside from representing a couple of clients at employment and social security tribunals (through voluntary work), my only involvement with court hearings has been to prepare case papers (take witness statements, issue summonses, draft client briefs) on behalf of local councils, and to appear as a witness in some of those proceedings.

I graduated in Law 40 years ago, and although I never intended to become a solicitor or barrister, I am still fascinated by the legal process, and by court proceedings themselves. Hence, I have something of a weakness for police procedurals and courtroom dramas on TV. Of course, not all courtroom proceedings are that riveting – out of curiosity, I once popped in to London’s Royal Courts of Justice, and was rather surprised to see a leading Judge appear to fall asleep during a case he was hearing…

One British TV series from the 1970s and 1980s, “Crown Court”, stands apart from its peers in the way it presented court cases in a realistic and non-sensational fashion. First, its somewhat dry approach to criminal court proceedings means that it tends to be less judgemental than more dramatic productions. Second, the focus on what happens within the courtroom itself means we get to see and hear only what is presented to the jury. There are no side bars, no ex-parte applications in judges’ chambers, and rarely any last-minute evidence or surprise witnesses. By removing the traditional drama, and presenting just the facts and the witnesses’ own evidence, we have only as much information about the case as the jury does when reaching its verdict.

In some ways, “Crown Court” was a public information service. It was broadcast in the wake of significant changes in the Criminal Law system in England and Wales, and at a time of growing suspicion of police corruption (notably within the Met’s infamous Flying Squad). Also worth bearing in mind is the fact that TV cameras were not allowed into real courtrooms, so the series was a way to show the public how justice was being administered in their name, and what to expect should they have to appear in court as a defendant, witness or jury member.

The other fascinating aspect of “Crown Court” is the roll-call of actors, writers, directors and producers who subsequently became regulars on British TV. In that regard, it resembled an on-air repertory theatre, similar to the leading soap operas of the day, recalling an era of public broadcasting that has largely disappeared.

Next week: BYOB (Bring Your Own Brain)