Customer Experience vs Process Design

Why is customer experience so poor when it comes to process design? Regardless of the product or service, it can be so frustrating to deal with onboarding, product upgrades, billing, payment, account updates and customer service. Banks, telcos, utilities and government services are particularly bad, but I am seeing more and more examples in online marketplaces and payment solutions.

Often, it feels like the process design is built entirely according to the providers’ internal operating structures, and not around the customer. The classic example is when customers have to talk to separate sales, product, technical support and finance teams – and none of them talk to each other, and none of them know the full customer or product journey end to end.

Even when you do manage to talk to human beings on the phone, rather than a chatbot, as a customer you have to repeat yourself at every stage in the conversation, and you can end up having to train front-line staff on how their products actually work, or what the process should be to upgrade a service, pay a bill or troubleshoot a technical problem.

You get the impression that many customer-facing team members never use their own services, or haven’t been given sufficient training or information to handle customer enquiries, and don’t have adequate authority to resolve customer problems.

On many occasions, I get the customer experience equivalent of “computer says ‘no’…”, when it appears impossible to navigate a particular problem. The usual refrain is that the “system” means things can only be done a certain way, regardless of the inconvenience to the customer, or the lack of thought that has gone into the “process”.

As I always remind these companies, a “process” is only as good as the people who design, build and operate it – and in blaming the “system” for a particular failing or inadequacy they are in effect criticising their own organisations and their own colleagues.

Next week: App Overload


No-code product development

Anyone familiar with product development should recognise the image below. It’s a schematic for a start-up idea I was working on several years ago – for an employee engagement, reward and recognition app. It was the result of a number of workshops with a digital agency covering problem statements, user scenarios, workflow solutions, personas, UX/UI design and back-end architecture frameworks.

At the time, the cost quoted to build the MVP was easily 5-6 figures – and even to get to that point still required a load of work on storyboards, wireframes and clickable prototypes…

Now, I would expect the developers to use a combination of open-source and low-cost software applications to manage the middleware functions, spin up a basic cloud server to host the database and connect to external APIs, and commission a web designer to build a dedicated front-end. (I’m not a developer, programmer or coder, so apologies for any glaring errors in my assumptions…)
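To make that concrete, here is a minimal sketch, in TypeScript, of the kind of middleware “glue” I have in mind: one function that receives a sign-up event, persists it via a hosted database’s REST API, and then calls an external notification service. Every URL, key and field name below is a hypothetical placeholder, not a reference to any particular vendor.

```typescript
// Minimal middleware "glue" sketch: receive an event, persist it via a
// hosted database's REST API, then notify an external service.
// All URLs, keys and field names are hypothetical placeholders.

interface SignupEvent {
  email: string;
  plan: string;
}

const DB_URL = "https://db.example.com/v1/records";      // hosted database (placeholder)
const NOTIFY_URL = "https://notify.example.com/v1/send"; // external API (placeholder)
const API_KEY = process.env.API_KEY ?? "";

async function handleSignup(event: SignupEvent): Promise<void> {
  // 1. Persist the record in the hosted database.
  const dbRes = await fetch(DB_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${API_KEY}` },
    body: JSON.stringify({ table: "signups", data: event }),
  });
  if (!dbRes.ok) throw new Error(`DB write failed: ${dbRes.status}`);

  // 2. Trigger a welcome message via the external notification API.
  const notifyRes = await fetch(NOTIFY_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${API_KEY}` },
    body: JSON.stringify({ to: event.email, template: "welcome" }),
  });
  if (!notifyRes.ok) throw new Error(`Notification failed: ${notifyRes.status}`);
}

// Example usage with a dummy event:
handleSignup({ email: "new.user@example.com", plan: "free" }).catch(console.error);
```

The point is how little code there is: the heavy lifting sits inside the hosted services at either end of the chain.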

The growth in self-serve SaaS platforms, public APIs and low-cost hosting solutions (plus the plethora of design marketplaces) should mean that a developer can build an MVP for a tenth of the cost we were quoted.

Hence the interest in “low-code/no-code” product development, and the use of modular components or stacks to build a range of repetitive, automated and small-scale applications. (For a dev’s perspective check out Martin Slaney’s article, and for a list of useful resources see Ellen Merryweather’s post from earlier this year.)

There are obvious limitations to this approach: anything too complex, too custom, or which needs to scale quickly may break the model. Equally, stringing together a set of black boxes or off-the-shelf solutions might not work if there are unforeseen incompatibilities or programming conflicts – especially if one component is upgraded, and there are unknown inter-dependencies that impact the other links in the chain. This means the product development process will need a layer of code audits and test environments before anything is deployed into production.
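One cheap way to catch those incompatibilities before they reach production is a small suite of contract tests that pin down exactly what each black-box component is expected to return, run on every build. Here is a minimal sketch in TypeScript using Node’s built-in test runner; the vendor endpoint and the response fields are hypothetical.

```typescript
// Contract test sketch: pin down the behaviour we rely on from a
// third-party component, so an upgrade that changes it fails the build
// before reaching production. Endpoint and response shape are placeholders.
import test from "node:test";
import assert from "node:assert/strict";

const PRICING_URL = "https://vendor.example.com/v2/price?sku=basic"; // hypothetical

test("pricing component still returns the fields we depend on", async () => {
  const res = await fetch(PRICING_URL);
  assert.equal(res.ok, true, "endpoint should respond with 2xx");

  const body = (await res.json()) as Record<string, unknown>;

  // The downstream billing step reads exactly these fields; if the vendor
  // renames or retypes them in an upgrade, this test fails first.
  assert.equal(typeof body.amount, "number");
  assert.equal(typeof body.currency, "string");
});
```

Run as part of the deployment pipeline (via node --test, after compiling or through a TypeScript loader), a failing assertion flags a vendor-side change before it silently breaks the other links in the chain.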

I was reflecting on the benefits and challenges of hermetically sealed operating systems and software programs over the weekend. In trying to downgrade my operating system (so that I could run some legacy third-party applications that no longer work thanks to some recent systems and software “upgrades”), I encountered various challenges, and it took several attempts and a couple of workarounds. The biggest problem was the lack of anything to guide me in advance – nothing to warn me that by making certain changes to the system settings, or configuring the software a certain way, this app or that function wouldn’t work. Also, because each component (the operating system, the software program and the third-party applications) wants to defend its own turf within my device, they don’t always play nicely together when the end user wants to deploy them in a single environment.

App interoperability is something that continues to frustrate when it comes to so-called system and software upgrades. It feels like there needs to be a specialist area of product development that can better identify, mitigate and resolve potential tech debt, as well as navigate the product maintenance schedule in anticipation of future upgrades and their likely impact, or understand the opportunities for retrofitting to keep legacy apps current. I see too many app developers abandoning their projects because it’s just too hard to reconfigure for the latest system changes.

Next week: Telstar!


“There’s a gap in the market, but is there a market in the gap?”

As a follow up to last week’s post on business strategy, this week’s theme is product development – in particular, the perennial debate over “product-market fit” that start-up businesses and incumbents both struggle with.

Launch it and they will drink it… (image sourced from Adelaide Remember When via Facebook)

The link between business strategy and product development is two-fold: first, the business strategy defines what markets you are in (industry sectors, customer segments, geographic locations etc.), and therefore what products and services you offer; second, to engage target customers, you need to provide them with the solutions they want and are willing to pay for.

“Product-market fit” is a core challenge that many start-ups struggle to solve or even articulate. A great product concept is worth nothing unless there are customers who want it, in the way that you intend to offer it, and in a way that aligns with your go-to-market strategy.

I appreciate that there is an element of chicken and egg involved in product development – unless you can show customers an actual product it can be difficult to engage them; and unless you can engage them, how can they tell you what they want (assuming they already know the answer to that question)? How often do customers really say, “I didn’t know I needed that until I saw it”? (Mind you, a quick scan across various crowd-funding platforms, or TV shopping channels, can reveal thousands of amazing products you didn’t know you couldn’t live without!) Of course, if your product development team can successfully anticipate unmet or unforeseen needs, then they should be on to a winner every time! In fact, being ahead of the curve, and understanding or even predicting the market direction is a key aspect of business strategy and product development for medium and long-term planning and forecasting.

Then there is the “build it and they will come” strategy. A bold move in most cases, as it involves upfront deployment of capital and resources before a single customer walks through the door. The image above is the only visual record I can find of a soft drink marketed in South Australia during the late 1960s and early 1970s. And you read the label correctly – a chocolate-flavoured carbonated beverage (not a chocolate milk or soy concoction). It was introduced when the local manufacturer faced strong competition from international soft drink brands. No doubt it was designed to “corner the market” in a hitherto under-served category, and to diversify against the competitors’ strongholds in other product lines. Likewise, it was launched on the assumption that people like fizzy drinks and people like chocolate, so hey presto, we have a winning combination! It was short-lived, of course, but ironically this was also around the time that soft drink company Schweppes merged with confectionery business Cadbury, and commentators joked that they would launch a chocolate soda, or a fizzy bar of chocolate…

With data analysis and market research, it may be possible to predict likely successes, based on past experience (sales history), customer feedback (solicited and unsolicited) and market scans (social, business and technology trends). But obviously, past performance is no guarantee of future returns. In my early days as a product manager in publishing, we had monthly commissioning committees where we each presented our proposals for front-list titles. Financial forecasts for the new products were largely based on sales of the relevant back catalogue, and on customer surveys. As product managers, we got very good at “reading” the data and presenting the facts that best suited our proposals. In fact, the Chairman used to say we were almost too convincing, and that it became difficult to second-guess our predictions. With limited production capacity, it nevertheless became imperative to prioritise resources and even reject some titles, however “convincing” they seemed.

Then there is the need to maintain a constant pipeline of new products, to refresh the range, retire under-performing products, and respond to changing market conditions and tastes. In the heyday of the popular music industry, from the 1960s to the late 1990s, the major record labels reckoned they needed to release 20 new song titles for every hit recording. And of course, being able to identify those 20 releases in the first place was a work of art in itself. For many software companies, the pipeline is now based on scheduled releases and regular updates to existing products, including additional features and enhancements (particularly for subscription services).

An important role of product managers is knowing when to retire an existing service, especially in the face of declining or flat sales. Usually, this involves migrating existing customers to a new or improved platform, with the expectation of generating new revenue and/or improving margins. But convincing your colleagues to give up an established product (and potentially upset current customers) can sometimes be challenging, leading to reluctance, uncertainty and indecision. In a previous role, I was tasked with retiring a long-established product and moving the existing clients to a better (but more expensive) platform. Despite the naysayers, our team managed to retire the legacy product (resulting in substantial cost savings), and although some clients chose not to migrate, the overall revenue (and margin) increased.

Finally, the reduced cost of technology and the abundance of data analytics mean it should be easier to market-test new prototypes, run proofs-of-concept, or A/B test different business models. But what that can mean for some start-ups is that they end up trying to replicate a winning formula simply in order to capture market share (and therefore raise capital), and in pursuit of customers they sacrifice revenue and profit.
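For what it’s worth, the arithmetic behind a simple A/B test is not complicated. Below is a minimal sketch, in TypeScript, of a two-proportion z-test on conversion rates; the visitor and conversion numbers are invented purely for illustration.

```typescript
// Two-proportion z-test sketch for an A/B experiment on conversion rates.
// Returns the z statistic; |z| > 1.96 suggests a real difference at ~95% confidence.
function abTestZ(convA: number, visitorsA: number, convB: number, visitorsB: number): number {
  const pA = convA / visitorsA;                       // conversion rate, variant A
  const pB = convB / visitorsB;                       // conversion rate, variant B
  const pooled = (convA + convB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / se;
}

// Made-up numbers: variant B converts 120/1000 visitors vs A's 100/1000.
const z = abTestZ(100, 1000, 120, 1000);
console.log(`z = ${z.toFixed(2)} -> ${Math.abs(z) > 1.96 ? "significant" : "not significant"} at 95%`);
```

In this example the apparent 2% uplift is not statistically significant at the 95% level (z is about 1.43), which is exactly the trap: without enough traffic, a “winning” variant may just be noise.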

Next week: Who fact-checks the fact-checkers?


30 years in publishing

It’s 30 years since I began my career in publishing. I have worked for two major global brands, a number of niche publishers, and now I work for a start-up. For all of this time, I have worked in non-fiction – mostly professional (law, tax, accounting), business and financial subjects. I began as an editor in London, became a commissioning editor, launched a publishing business in Hong Kong, managed a portfolio of financial information services for the capital markets in Asia Pacific, and currently lead the global business development efforts for a market data start-up in blockchain, crypto and digital assets. Even when I started back in 1989, industry commentators were predicting the end of print. And despite the best efforts of the internet and social media to decimate the traditional business models, we are still producing and consuming an ever-growing volume of content.

The importance of editing and proofreading still applies to publishing today… Image sourced from Wikimedia Commons.

The first company I worked for was Sweet & Maxwell, a 200-year-old UK law publisher. In 1989, it had recently been acquired by The Thomson Corporation (now Thomson Reuters), a global media and information brand majority-owned by the Thomson family of Canada. When I began as a legal editor with Sweet & Maxwell in London, Thomson still had newspaper and broadcasting interests (the family continues to own The Globe and Mail in Toronto), a directory business (a rival to the Yellow Pages), a travel business (comprising an airline, a travel agent and a tour operator), and a portfolio of publishing brands that ranged from the arts to the sciences, from finance to medicine, from defence titles to reference works.

Thanks to Thomson, not only did I get incredible experience working in the publishing industry, I also got to start a new business in Hong Kong (which is still in existence). This role took me to China for the first time in 1995, including a couple of private lunches at the Great Hall of the People in Beijing. The Hong Kong business expanded to include operations in Singapore and Malaysia – during which time we survived the handover and the Asian currency crisis. I also spent quite a bit of time for Thomson in the USA, working on international sales and distribution, before joining one of its Australian businesses for a year.

Given the subscription nature of law, tax and accounting publishing, many of the printed titles came in the form of multi-volume loose-leaf encyclopedias, which required constant (and laborious) updating throughout the subscription year. In fact, as editors we had to forecast the average number of pages to be added or updated each year. If we exceeded the page allowance, the production team would not be happy; and if the number of updates each year did not match the budgeted number we had promised subscribers, the finance team would not be happy. So we had a plethora of weekly, monthly, bi-monthly, quarterly, semi-annual and annual deadlines and schedules to manage – even today, I recall the immense relief we felt when the CRC (camera-ready copy) for the next release came back from the typesetters, on time and on budget…

This blog owes its title to something that senior Thomson executives liked to proclaim: “Content is King!” We were still in the era of media magnates, when newspapers (with their display and classified advertising) had a licence to print money – the “rivers of gold”, as some called it. But as the internet and online search came to determine how readers discovered and consumed information, the catchcry became “Content in Context!”, as publishers needed to make sure they had the right material, at the right time, in the right place, for the right audience (and at the right price…).

Of course, over the 12 years I was at Thomson, technology completely changed the way we worked. When I first started, editors still did a lot of manual mark-up on hard copy, while other specialists were responsible for technical editing, layout, design, indexing, proofreading and tabling (creating footnotes and cross-references, and compiling lists of legal and academic citations). Most of the products were still in printed form, but this was a period of rapid transition to digital content – from dial-up databases to CD-ROM, from online to web formats. Word processing came into its own as authors started to submit their manuscripts on floppy disk, and compositors leveraged SGML (Standard Generalized Markup Language) for typesetting and for rendering print books as digital documents. Hard to believe now, but CD-ROM editions of traditional textbooks and reference titles had to be exact visual replicas of the printed versions, so that in court the judges and the lawyers could (literally) be on the same page if one party or another did not have the digital edition. Thankfully, some of those constraints disappeared as more content went online – reference works had to be readable in any web browser, while HTML enabled faster search, cross-referencing and indexing thanks to text tagging, Boolean logic, key words and embedded links.
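For readers who never used those early systems, here is a toy sketch, in TypeScript, of what keyword tagging and Boolean logic buy you: an inverted index answers an “AND” query instantly, rather than scanning every page of every volume. The documents and keywords are invented for illustration.

```typescript
// Toy inverted index: the kind of keyword indexing that made online legal
// content searchable far faster than scanning page by page.
const docs: Record<string, string> = {
  "case-101": "contract breach damages remedy",
  "case-102": "negligence duty of care damages",
  "case-103": "contract formation offer acceptance",
};

// Build the index: keyword -> set of document ids containing it.
const index = new Map<string, Set<string>>();
for (const [id, text] of Object.entries(docs)) {
  for (const word of text.split(/\s+/)) {
    if (!index.has(word)) index.set(word, new Set());
    index.get(word)!.add(id);
  }
}

// Boolean AND query: return documents containing every keyword.
function searchAll(...keywords: string[]): string[] {
  const sets = keywords.map((k) => index.get(k) ?? new Set<string>());
  return [...sets[0]].filter((id) => sets.every((s) => s.has(id)));
}

console.log(searchAll("contract", "damages")); // -> ["case-101"]
```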

The second global firm I worked for was Standard & Poor’s, part of The McGraw-Hill Companies (now S&P Global). As with Thomson, when I started at McGraw-Hill the McGraw family were major shareholders, and the group had extensive interests in broadcasting, magazines and education publishing, as well as financial services. But when I joined Standard & Poor’s in 2002, I was surprised that there were still print publications, and some in-house authors and editors continued to work with hard-copy manuscripts and proofs (which they circulated to one another via their in/out trays and the internal mail system…). Thankfully, much of this time-consuming activity was streamlined in favour of more collaborative content development and management processes. And we migrated subscribers from print and CD-ROM to web and online delivery (XML was then a key way of streaming financial data, especially for machine-to-machine transmission).

Working for Standard & Poor’s in a regional role, I was based in Melbourne but probably spent about 40% of my time overseas and interstate. My role involved product management and market development – but although I no longer edited content or reviewed proofs, I remained actively involved in product design, content development, user acceptance testing and client engagement. The latter was particularly interesting in Asia, especially China and Japan. Then the global financial crisis, and the role of credit rating agencies such as Standard & Poor’s, added an extra dimension to client discussions…

After a period as a freelance writer and editor, for the past few years I have been working for a start-up news, research and market data provider, serving the growing audience trading and investing in cryptocurrencies and digital assets. Most of the data is distributed via dedicated APIs, a website, desktop products and third-party vendors. It may not sound like traditional publishing, but editorial values and production processes lie at the core of the business – quality digital content still takes a lot of work to capture, create and curate. And even though the internet gives the impression of reducing the price of online content to zero, there is still considerable value in standardising, verifying and cataloguing all that data before it is served up to end users.

Next week: You said you wanted a revolution?