The Internet is passing through a period of consolidation, as befits an industry that has reached maturity:
1. A small number of mega-players dominate the market: Microsoft, Amazon, Twitter, Apple, Facebook, Google, Yahoo!, PayPal, YouTube and Wikipedia.
2. Product lines are being rationalized, as companies trim their offerings to focus on core business – the latest victim being Google’s Reader tool for RSS feeds.
3. The distinctions between hardware, software, content and apps are blurred because of overlapping services, increased inter-connectivity via mobile platforms, and cloud-based solutions.
4. The business model for Internet access and Web usage is primarily based on data consumption and/or underwritten by third-party advertising. Social Media and search services are often excluded from metered usage, which obscures our understanding of what content actually costs.
5. Since our concept of what constitutes “news” is rapidly being redefined by Social Media, and readers increasingly rely on Social Media channels to access news, it is harder for content providers to charge a premium for value-added information services such as quality journalism and objective news reporting.
I would argue that to rediscover a key purpose of the Net (as a means to send/receive meaningful news and information), we need to reflect on how radio broadcasting repositioned itself when television came along – hence “Steam Internet”.
“Steam Radio” was a term used in 1950s Britain to differentiate sound broadcasting (radio) from audio-visual broadcasts (television). Although somewhat self-deprecating (suggesting something slow and obsolete – echoing the demise of steam railways following the introduction of electric and diesel locomotives), it actually helped to embed specific values and purpose around the role of radio as a simple but effective medium to inform, educate and entertain, despite its apparent limitations.
My interest in radio means that I continue to use it as a primary source of daily news and current affairs, and as a convenient means to access international content. The discipline of radio means that content is generally well structured, the format’s limitations emphasise quality over quantity, and when done well there is both an immediacy and an intimate atmosphere that can really only be achieved by the audio format.
Far from becoming an obsolete medium in the Internet age, the growth of digital stations (as well as Internet radio and mobile-streaming) means that radio is undergoing a renaissance as it increasingly provides very specific choices in content, and offers ease of access without a lot of the “noise” of many news and information websites, with their pop-up ads, unstable video and data-hungry graphics.
Over the past decade, the major growth in Internet traffic in general, and World-wide Web usage in particular, has been driven by Social Media. Yet neither the Net nor the Web was originally designed as a mass-media platform, and the success of a highly interactive, deeply personalized and far-reaching network now threatens the Internet’s viability as a means of effective communication.
As Web content and functionality have become more complex, it has actually become harder and more frustrating to find exactly what we want, because:
- search and retrieval is advertising-driven and based on popularity, frequency and connectivity (rather than on context, relevance and quality);
- content searches reduce everything to a common level of “hits” and “results”; and
- there is little or no hierarchy as to how information and search results are structured (maybe we need a Dewey Decimal system for organising Web content?). This is one reason why Twitter is enhancing its search function by using human intervention (i.e., contextual interpretation) to make more sense of trending news themes.
I’d like to offer a short historical perspective to provide further context for the need for “Steam Internet” services:
Along with bankers and brokers, lawyers were among the first to recognize the importance of dedicated Internet services for transacting data and information. The first on-line information service I ever used was Lexis-Nexis (a research tool for lawyers) when I was a paralegal in the 1980s. Lexis-Nexis is a database that enables users to search summaries, transcripts and reports of relevant court decisions regarding specific points of law. It is a very structured and hierarchical content source. Back then, it was a dial-up service, requiring the user to place the handset of a fixed-line telephone into an external modem that was connected to the computer terminal from which the search was conducted. The reason I remember it so vividly is that the first time I used it, I forgot to specify sufficiently narrow search terms, which meant pages and pages of text being churned out – and probably a bill of over $200, as the service was charged according to the number of results returned and pages printed.
In the mid-1990s, when I was setting up my Internet access, the ISP was owned and run by a university, which made sense given that the Net grew out of the academic world. But even though I had an ISP account, I still had to download, install and configure a graphical browser (Mosaic) to access the Web – or alternatively, I could subscribe to a dedicated dial-up service such as AOL, which offered a limited number of dedicated information services. Otherwise, my Internet access really only supported e-mail via DOS-based applications, and the exchange of files. (This was pre-Explorer and pre-Netscape, and before the browser wars of the 1990s and early 2000s – which continue to this day, with Microsoft copping another EU fine just this month.)
As the Web became more interactive, but also more dependent on “push” content driven by advertising-based search, user experience was enhanced by RSS readers – to get to the information we really needed, and to personalize what content would be pushed to our desktops. When I was demonstrating financial market information services to new clients, the built-in RSS reader was a useful talking-point, because I had configured it to display scores from the English Premier League as well as general news and industry headlines. (There is an urban myth that some of the most popular news screens on Bloomberg are the sports results…)
Just a few years ago, pre-Social Media, there were discussions about building a dedicated, faster, more robust and more secure business-oriented Internet platform, because the popular and public demands placed on the Web were putting an inordinate strain on the whole system. Businesses felt the need to create a separate platform – not just VPNs, but a new “Internet 2” for government, universities and businesses to communicate and interact. In the end, all that has happened is an expansion of the Top-Level Domains (.biz, .mobi), with a continued programme of generic TLDs in the works, but this is simply creating more real estate on the Web, not building a dedicated data- and information-led Internet for business.
At this point, it’s worth reflecting that only last year, France’s Minitel videotex service and the UK’s Ceefax teletext service were both finally decommissioned, each having been in operation for over 30 years. In their prime, these were innovative precursors to the Web, even though neither of them was considered to be part of the Internet. Their relevance as dedicated information services should not be overlooked just because technology has overtaken them; that’s like saying the news media are redundant because their print circulation is in decline.
In conclusion, I’m therefore very attracted to the idea of a Steam Internet which mainly carries news and information services as a way to bring focus and structure to this content.
Declaration of interest: from time to time the author is a presenter on Community Radio, but does not currently derive an income from this activity, so no commercial or financial bias should be implied by his personal enthusiasm for this broadcast medium.