Why we need a “Steam Internet”

1981 Alcatel Minitel terminal
(Photo by Jef Poskanzer – Licensed under Creative Commons Attribution-Share Alike)

The Internet is passing through a period of consolidation, as befits an industry that has reached maturity:

1. A small number of mega-players dominate the market: Microsoft, Amazon, Twitter, Apple, Facebook, Google, Yahoo!, PayPal, YouTube and Wikipedia.

2. Product lines are being rationalized, as companies trim their offerings to focus on core business – the latest victim being Google’s Reader tool for RSS feeds.

3. The distinctions between hardware, software, content and apps are blurred because of overlapping services, increased inter-connectivity via mobile platforms, and cloud-based solutions.

4. The business model for Internet access and Web usage is primarily based on data consumption and/or underwritten by third-party advertising. Social Media and search services are often not counted as part of the usage, thus confusing our understanding of what content actually costs.

5. Since our concept of what constitutes “news” is rapidly being redefined by Social Media, and readers increasingly rely on Social Media channels to access news, it is harder for content providers to charge a premium for value-added information services such as quality journalism and objective news reporting.

I would argue that to rediscover a key purpose of the Net (as a means to send/receive meaningful news and information), we need to reflect on how radio broadcasting repositioned itself when television came along – hence “Steam Internet”.

“Steam Radio” was a term used in 1950s Britain to differentiate sound broadcasting (radio) from audio-visual broadcasts (television). Although somewhat self-deprecating (suggesting something slow and obsolete – echoing the demise of steam railways following the introduction of electric and diesel locomotives), it actually helped to embed specific values and purpose around the role of radio as a simple but effective medium to inform, educate and entertain, despite its apparent limitations.

My interest in radio means that I continue to use it as a primary source of daily news and current affairs, and as a convenient means to access international content. The discipline of radio means that content is generally well structured, the format’s limitations emphasise quality over quantity, and when done well there is both an immediacy and an intimate atmosphere that can really only be achieved by the audio format.

Far from becoming an obsolete medium in the Internet age, the growth of digital stations (as well as Internet radio and mobile-streaming) means that radio is undergoing a renaissance as it increasingly provides very specific choices in content, and offers ease of access without a lot of the “noise” of many news and information websites, with their pop-up ads, unstable video and data-hungry graphics.

Over the past decade, the major growth in Internet traffic in general, and World-wide Web usage in particular, has been driven by Social Media. Yet neither the Net nor the Web was originally designed to be a mass-media platform, and the success of a highly interactive, deeply personalized and far-reaching network now threatens the viability of the Internet as a means of effective communication.

As Web content and functionality have become more complex, it has actually become harder and more frustrating to find exactly what we want, because:

  • search and retrieval are advertising-driven and based on popularity, frequency and connectivity (rather than on context, relevance and quality);
  • content searches reduce everything to a common level of “hits” and “results”; and
  • there is little or no hierarchy as to how information and search results are structured (maybe we need a Dewey Decimal system for organising Web content?). This is one reason why Twitter is enhancing its search function by using human intervention (i.e., contextual interpretation) to make more sense of trending news themes.

I’d like to offer a short historical perspective to provide further context for the need for “Steam Internet” services:

Along with bankers and brokers, lawyers were among the first to recognize the importance of dedicated Internet services for transacting data and information. The first on-line information service I ever used was Lexis-Nexis (a research tool for lawyers) when I was a paralegal in the 1980s. Lexis-Nexis is a database that enables users to search summaries, transcripts and reports of relevant court decisions regarding specific points of law. It is a very structured and hierarchical content source. Back then, it was a dial-up service, requiring the user to place the handset of a fixed-line telephone into an external modem that was connected to the computer terminal from which the search was conducted. The reason I can remember it so vividly is because the first time I used it, I forgot to specify sufficiently narrow search terms, which meant pages and pages of text being churned out – and probably a bill of over $200, as the service was charged according to the number of results returned and pages printed.

In the mid-1990s, when I was setting up my Internet access, the ISP was owned and run by a university, which made sense when we think that the Net grew out of the academic world. But even though I had an ISP account, I still had to download, install and configure a graphical browser (Mosaic) to access the Web – or alternatively, I could subscribe to a dedicated dial-up service such as AOL, which offered a limited number of dedicated information services. Otherwise, my Internet access really only supported e-mail via DOS-based applications, and the exchange of files. (This was pre-Explorer and pre-Netscape, and the browser wars of the 1990s and early 2000s – which continue to this day, with Microsoft copping another EU fine just this month.)

As the Web became more interactive, but also more dependent on “push” content driven by advertising-based search, user experience was enhanced by RSS readers – to get to the information we really needed, and to personalize what content would be pushed to our desktops. When I was demonstrating financial market information services to new clients, the built-in RSS reader was a useful talking-point, because I had configured it to display scores from the English Premier League as well as general news and industry headlines. (There is an urban myth that some of the most popular news screens on Bloomberg are the sports results…)
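Part of what made RSS so effective is that the format is simple enough to consume with a few lines of code. As a minimal sketch of the idea (using only Python’s standard library, and a hand-written sample feed rather than a live URL – the feed content and addresses below are invented for illustration), extracting headlines from an RSS 2.0 document looks roughly like this:

```python
# Minimal RSS 2.0 headline extraction using only the standard library.
# The sample feed below is invented for illustration; a real reader would
# fetch the XML over HTTP (e.g. with urllib.request) on a schedule.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <item>
      <title>Premier League: latest scores</title>
      <link>http://example.com/scores</link>
    </item>
    <item>
      <title>Markets open higher</title>
      <link>http://example.com/markets</link>
    </item>
  </channel>
</rss>"""

def headlines(feed_xml):
    """Return (title, link) pairs for each <item> in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [
        (item.findtext("title", ""), item.findtext("link", ""))
        for item in root.iter("item")
    ]

for title, link in headlines(SAMPLE_FEED):
    print(f"{title} -> {link}")
```

The appeal of the format is precisely its discipline: a feed carries titles and links in a fixed structure, with none of the surrounding page clutter.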

Just a few years ago, pre-Social Media, there were discussions about building a dedicated, faster, more robust and more secure business-oriented Internet platform, because the popular and public demands placed on the Web were putting an inordinate strain on the whole system. Businesses felt the need to create a separate platform – not just VPNs, but a new “Internet 2” for government, universities and businesses to communicate and interact. In the end, all that has happened is an expansion of the Top-Level Domains (.biz, .mobi), with a continued programme of generic TLDs in the works, but this is simply creating more real-estate on the Web, not building a dedicated data and information-led Internet for business.

At this point, it’s worth reflecting that only last year, France’s Minitel videotex service and the UK’s Ceefax teletext service were both finally decommissioned, each having been in operation for over 30 years. In their prime, these were innovative precursors to the Web, even though neither of them was considered to be part of the Internet. Their relevance as dedicated information services should not be overlooked just because technology has overtaken them; that’s like saying the news media are redundant because their print circulation is in decline.

In conclusion, I’m very attracted to the idea of a Steam Internet which mainly carries news and information services, as a way to bring focus and structure to this content.

 

Declaration of interest: from time to time the author is a presenter on Community Radio, but does not currently derive an income from this activity, so no commercial or financial bias should be implied by his personal enthusiasm for this broadcast medium.

Facebook becomes “The Daily Like”?

Facebook has recently announced some changes to its core News Feed application. In short, Facebook subscribers will be able to apply a limited range of filters to their News Feed, by content type and source.

According to Facebook CEO Mark Zuckerberg: “What we’re trying to do is give everyone in the world the best personalised newspaper we can.”

That’s a big call.

First, this “newspaper content” will be sourced from your friends’ activity (including their shared photos), a music feed (largely based on what your friends are listening to), or a feed of content from any page you like or any person you follow.

Second, the updated News Feed design will promote the greater use of images, which will be given more screen space.

And third, the goal is to grow revenue from sponsored posts and targeted advertisements, based on your personal “likes” and other built-in Facebook algorithms like EdgeRank.

For me, whether published in print or on-line, the core prerequisites of a newspaper are:

  • A stated editorial policy, and preferably an independent editorial board 
  • Independent, objective and unbiased fact-based reporting (including fact-checking)
  • Robust journalistic standards
  • Strong editorial quality
  • Clear separation of content type (news, opinion, advertorial, advertising, sponsorship)
  • Adherence to a credible code of practice, especially on media ethics
  • Full declarations of interest by journalists, reporters and opinion writers*

Most newspapers are subject to media regulations or licensing systems around proprietorial “fitness”, ownership control, cross-media assets and censorship. Newspapers are also subject to general laws regarding libel, blasphemy, privacy, incitement, discrimination and copyright.

There are already a number of regulatory reviews of ownership, standards and ethics by mainstream news media in the wake of alleged phone hacking and other malpractices. There is also debate as to whether on-line platforms that carry “news” should be subject to more stringent media regulation.

Apart from some issues with censorship and legal requests to remove offensive material (which apply to all content providers), I don’t see Facebook operating under a formal newspaper regime.

While Facebook may aspire to become “The Daily Like”, I don’t believe it wants to be treated (or taken seriously) as a regular “newspaper”. Otherwise, it would likely have to register as a newspaper in every jurisdiction where its content is accessed, disseminated or uploaded. Perhaps what Facebook really means is that it wants to dominate on-line advertising (using your own content as the bait) while continuing to claim the platform is “free” to end users.

*Declaration of interest: the author is not now on Facebook (reluctantly).

Broadcastr signs off: 9 Challenges for Social Media

Social Media platforms – there seems to be one born every minute. By the time you finish reading this article, another 5 will have been launched somewhere in the world. And probably 5 more will have been shut down.

A recent casualty of what I call the “50 Shades of Social Media” syndrome is Broadcastr, a user-contributed audio content platform for location-based story telling.

In their farewell note to the Broadcastr user community, co-founders Andy Hunter and Scott Lindenbaum stated:

“While we’d love to keep Broadcastr alive, technology requires money, active development, and maintenance. We’re a small team, and, sadly, don’t have the resources to continue development.”

Broadcastr has inevitably lost out to category-killer SoundCloud, an earlier site that dominates Social Media audio content (and is also the likely cause of the wavering fortunes of MySpace). In recent months, iTunes has withdrawn its Ping social networking application for music fans; Webdoc has rebranded itself as Urturn (possibly due to confusion surrounding its name); and Yahoo! has just announced it is withdrawing a number of Social Media products – not forgetting that Yahoo! dumped Buzz, a social news site that was hard to distinguish from Digg. There are even some mutterings that Google+ does not yet justify the hype as a serious Social Media platform to take on Facebook or Twitter.

Even if you are first to market with a new Social Media platform, most sites are just a different (not necessarily better) mousetrap – same bait to tempt you in, same tools to capture your attention. The sheer volume of sites means that they are hard to differentiate from one another – hence the “50 Shades of Social Media” syndrome. Each Social Media site is trying to become THE destination for its target audience, but as The Cure once sang, “In the caves, all cats are grey.” Despite their differences, all Social Media platforms end up looking pretty much the same.

In light of the heated competition for market traction, here are 9 challenges to success in Social Media:

1 There are essentially only 5 types of Social Media platform:

2 There are only a limited number of activities you can do within these sites, such as “like”, “follow”, “share”, “post”, “publish”, “comment”, “recommend” and “tag”.

3 Increasingly, single-purpose or single-interest Social Media sites are attempting to cross over into adjacent domains, in an attempt to build scale and stickiness, and to improve the user experience.

4 This diversification means Social Media sites lose focus, dilute their original offering, and potentially alienate users.

5 Every Social Media platform starts out claiming to be different and offering something unique – but both the content and the business models are relatively easy to replicate, which is why we see multiple variations of the same concept or minor iterations with each new site.

6 As we engage with multiple Social Media platforms, we need our own personal media monitoring and management systems just to keep tabs on everything, especially when sites start to overlap as they encompass richer media formats and enhanced content functionality.

7 Meanwhile, the increasing inter-connectivity between different sites means that as individual users we can multi-channel as if we are our own mini cable networks.

8 But as with cable TV, multi-channelling leads to audience fragmentation and narrowcasting (which in turn has an impact on advertising revenue).

9 The Social Media industry will be subject to further mergers and acquisitions like Facebook’s purchase of Instagram, and consolidation will inevitably result in an oligopoly of dominant players, as happens with all media.

“Everything on the Internet should be free…”

Last week I got into a very heated dinner-party debate with an artist, an academic and a publisher about the economic value of copyright protection in particular, and intellectual property rights in general.

It started with a discussion about file-sharing and illegal downloads, and led to an argument about patenting genomes. I can’t attribute directly, but the gist of the argument was as follows:

1 Copyright and patents do not encourage innovation – they stifle it

2 Intellectual property rights represent a modern phenomenon – ancient societies managed to exist without them

3 Everything on the Internet should be free – and not subject to copyright protection

Let’s agree that formal intellectual property laws are a relatively recent invention – the modern concept of patents emerged in 15th century Europe, and the first British copyright law was passed in 1710. These laws then grew in importance as the printing press spread and the industrial revolution took hold.

I would argue, however, that all civilisations have placed a premium on knowledge, creativity and invention. Regardless of whether this knowledge is based on folklore, scientific experiment, geographical discovery or geological exploration – specific rights, actual economic benefits and certain legal protections have been afforded to those who establish ownership or control of these assets. Examples would include the right to copy ancient manuscripts held in monastic libraries; the monopolies and protection granted to members of craft guilds in plying their skills; the trading rights granted to merchants; and restricting the practice of certain tribal traditions to selected community elders.

Most of these knowledge-based activities involve a high degree of effort, ingenuity and risk-taking – so in return, it was acknowledged there needed to be financial and other rewards to act as incentives. In the case of science and technology, these incentives are often deemed essential to offset the huge capital costs of developing new products and processes. In the case of copyright, the rewards of author royalties and content licensing fees are desirable to encourage people to come up with new ideas and new concepts – even if the purpose is simply to amuse and entertain us.

Of course, the economic rewards need not simply be derived from patents or copyright – tax-breaks for R&D or public grants to fund academic research are some examples of alternative financial incentives for both inventors and people of ideas.

As for the concept that “everything on the Internet should be free”, I am reminded of what I once told a client, who could not understand why access to the on-line version of a printed reference work was costing him more than the “physical” cost of adding a new user log-in and password to our content publishing platform: “OK”, I replied, “you can have all the content for free, but we’re not going to index it, or structure it with headings and sub-headings; we won’t tag it, insert cross-references, or add hypertext links; we won’t even edit it; and finally, we won’t update it every time there is new material.” He soon got the point.