Gaming/VR/AR pitch night at Startup Victoria

Building on the successful format that has been the mainstay of Startup Vic’s regular meetups for the past few years, February’s pitch night kicked off a scheduled programme of thematic events for 2017. First up was Gaming, VR and AR.

Photo by Daniel C, sourced from the Startup Victoria Meetup page

Hosted as usual by inspire9, the event drew a packed crowd, no doubt helped by the impressive panel of judges assembled by the organisers:

Dr Anna Newberry, responsible for driver-assistance technologies at Ford Australia; Stefani Adams, Innovation Partner at the Australia Post Accelerator; Tim Ruse, CEO of Zero Latency; Rupert Deans, Founder and CEO of Plattar; Samantha Hurley, Co-Founder and Director of Marketing Entourage; Gerry Sakkas, CEO of PlaySide Studios; and Joe Barber, a Commercialisation Advisor to the Department of Industry and Science, a Mentor at the Melbourne Accelerator Program (MAP), and an angel investor.

Maintaining the tradition of this blog, I will comment on each startup pitch in the order in which they presented.

Metavents

This niche business offers an event planning app for festivals. At its heart is a tool that allows users to build a 3-D simulation of proposed events, combined with an AI capability to simulate risk management, logistics and team communications, plus a digital time capsule where event attendees can upload photos and other content.

The platform is licensed to event planners and organisers, and charges clients $1 per ticket sold, plus a 2.5% fee on donations and fees for other content and services such as the digital time capsule. In addition, Metavents is building strategic partnerships, and announced a relationship with the Vihara Foundation and its Rock Against Poverty programme from 2018.
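For the arithmetically inclined, the revenue model described in the pitch can be sketched in a few lines. This is purely illustrative, based only on the two fees mentioned on the night ($1 per ticket sold and 2.5% of donations); the function name and figures are my own assumptions, not the company's.

```python
# Hypothetical sketch of Metavents' per-event platform revenue, based only
# on the pricing mentioned in the pitch. Excludes add-on service fees
# (e.g. the digital time capsule), which weren't quantified.

def event_revenue(tickets_sold: int, donations: float) -> float:
    """Platform revenue for a single event."""
    TICKET_FEE = 1.00      # flat $1 fee per ticket sale
    DONATION_RATE = 0.025  # 2.5% commission on donations
    return tickets_sold * TICKET_FEE + donations * DONATION_RATE

# e.g. a 10,000-ticket festival that raises $20,000 in donations
print(event_revenue(10_000, 20_000.0))  # 10500.0
```

On these (invented) numbers, ticket fees dominate: the donation commission contributes only $500 of the $10,500 total, which suggests why volume of ticketed events would matter most to the business.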

All good so far. Then, things got a bit confusing. For example, in addition to festival and event logistics, Metavents claims to offer humanitarian support services in response to natural disasters, and emergency management capabilities for smart cities. There was also talk of a global network (linked to the UN?), and an impact investment fund.

I’m sure I wasn’t alone in thinking that the pitch was a bit disjointed and suffered from a lack of focus. But the pitch did reveal something of the founders’ core passion, and incorporated some impressive graphics – it just felt like a case of form over substance.

Second Sight

Second Sight is a game analytics service that “unlocks the secrets in player data” by enriching existing big data sources with social media interactions. It does this by profiling players based on their behaviours, and providing the resulting insights to game developers and product managers. Focusing on the mobile game market, Second Sight is initially targeting independent developers, and will then move on to corporate game businesses.

Second Sight’s own development path is to build automation tools first, then create a library of tasks and insights. With an estimated 1 million users (based on game statistics), 3 paying clients and another 27 beta clients, this startup is showing some promising market traction. However, there are a number of established competitors, including Omniata (which is more of a general user analytics engine, like Mixpanel or Flurry), GameAnalytics, deltaDNA and Xsolla, some of which offer free user services.

In response to the “ask” ($500k in seed funding in return for 20% equity), the judges suggested that Second Sight might want to address the needs of a specific game sector.

Dark Shadow Studio

This presentation featured an application called Drone Legion, which merges the drone experience with VR. Part simulation game, part training software, it was nice to see a demo of the app running in the background, without detracting from the pitch itself.

A key point made by the presentation is that the Civil Aviation Safety Authority (CASA), which is responsible for regulating drones in Australia, is in danger of falling behind other countries. For example, Drone Legion could be adapted to provide user training, testing and licensing before a customer buys a physical drone.

Although there are drone simulators available via Steam, they are not aimed at the general public. Drone Legion is also compatible with a range of gaming consoles.

The judges felt that this pitch described an individual game rather than a business, and suggested that the founders try to secure funding from HTC or Oculus to build their first game. And given that one of the judges works for Australia Post (ostensibly a logistics company with a growing interest in drone technology….), there was the offer of a personal introduction.

Phoria

Phoria describes itself as an “immersive media business”, offering rapid 3-D visualisation (especially for the property development sector and the built environment), and other services such as digital preservation.

But tonight, the pitch was about a plan to use “VR for social good”. Under the moniker “Dreamed”, Phoria is developing a niche health care solution, designing “patient experiences” that help patients escape their current care or treatment environment.

Predicated on an immersive therapy platform, Dreamed will offer a distribution service for cloud-based content, designed to be used alongside other, related assisted therapies that feature animals, nature and music as stimulants for patient engagement and therapeutic outcomes. While not exactly a MedTech solution, Phoria’s “IP special sauce” is the use of VR as a constant dynamic feedback loop, which presumably learns from and adapts to user interaction and monitoring of appropriate patient diagnostics.

So, who pays for the service? Hopefully, hospitals will, especially if they can demonstrate reduced therapy costs and patient treatment times. (Maybe there will also be a consumer market alongside existing meditation apps?) But with some early-stage and potentially high-profile research underway via the Murdoch Childrens Research Institute, Phoria and Dreamed look to be making steady progress, notwithstanding the normally slow pace of medical research. Key to the research outcomes will be user acceptance and ease of service and content delivery, although a large number of unknowns remain in the context of the medical benefits. Meanwhile, Phoria continues to serve its core property market.

Finally, something I found somewhat surprising: according to the presentation, there is currently no VR content licensing model available. Sounds like a job for a decentralized digital asset management and licensing registry (such as MyBit?).

On the night, and based on the judges’ votes, Phoria took out first place honours.

Next week: The Future of Work = Creativity + Autonomy

 

More In The Moment

In an earlier blog on “being in the moment”, I confessed that I often find the prospect (and practice) of meditation to be daunting and somewhat overwhelming. I forgot to mention that there is a park bench in one of Melbourne’s inner-city gardens which I have found to be a useful starting point. It features a quotation from Dr Ainslie Meares:

“Sit quietly, for it is in quietness we grow”

"clinamen" by Celeste Boursier-Mougenot (2013), purchased by NGV Foundation (Photo © Rory Manchee, all rights reserved)

“clinamen” by Celeste Boursier-Mougenot (2013), purchased by NGV Foundation (Photo © Rory Manchee, all rights reserved)

The significance of this insightful instruction has been driven home by some recent experiences:

  • Through my involvement with the Slow School of Business, I have participated in some Slow Coaching, where I was a Listener. The practice of “deep listening” really does require you to be present in the moment, to focus on what is being said by the Speaker, to observe how it is being expressed, and to give constructive feedback on what you have heard without judging or critiquing. It’s an extension of “active listening”, a technique I learned many years ago as a counsellor helping clients with their consumer debt problems, and I later used it as a manager to provide employee feedback during performance reviews. The key difference is that deep listening is not so concerned with exploring a linear narrative or identifying specific solutions, and is more about giving space to the Speaker to articulate what concerns or issues they are currently facing.
  • At a concert the other week I was struck by the number of people in the audience who were avidly taking photos and videos on their smart phones, or busy talking at the bar rather than appreciating the live performance in front of them. It made me wonder why some people bother going to gigs at all – it often seems like they are not there to watch and listen to the musicians! Apart from being disrespectful to the performers and other members of the audience, the happy snappers and the chatty drinkers can’t really be in the moment because they are too busy trying to capture a transient event for posterity (and who actually watches shaky live concert footage shot on a phone?). Or are they so self-absorbed that they are actually oblivious to what is going on around them?
  • Similarly, last weekend I visited the Twelve Apostles and was dismayed by the ubiquitous selfie-sticks and constant preening and posing at every vantage point. As the sun went down, hardly anyone was actually observing the dusk, let alone being still and listening to the waves below. Instead, everything was being reduced to a diluted digital experience. Again, who goes back and looks at all those photos (and do they do so more than once)? How do these images enhance the experience of simply being there? Did these visitors really appreciate the natural beauty and breathtaking views in front of them? Is a digital camera the only way to interpret the scene for themselves? Is it only “real” when they take a picture? Can it only “exist” as a bunch of pixels?

To underscore quite how significant “being in the moment” can be, I’m reminded of the Above All Human conference in January, where theoretical astrophysicist Dr Katie Mack scared the living daylights out of the audience when she discussed the impact of vacuum decay theory. In (very, very, very) short order, a shift in the current state of the Universe would wipe out life as we know it in a millisecond. It would happen so quickly that no-one would see it coming. The effect would be catastrophic, but we wouldn’t know it was happening. As Dr Mack so eloquently put it, there would be no point in worrying about FOMO, because:

(a) there would be nothing left to be missing out on;

(b) no trace of your existence would remain; and

(c) in any event, there would be no-one left to miss you….

While I understand the need to validate our existence through “capturing the moment”, if we are too preoccupied with taking photos, rather than focusing on our actual presence, we risk surrendering our experience to mere digital simulacra.

Next week: Whose IP is it anyway?

Form over content – when the technology is the product?

This week, I had my first experience of 3D cinema, with the rather amazing “Gravity”.

Image source: gravitymovie.warnerbros.com

I wouldn’t say I’m an instant convert to the 3D format, but I certainly agree with many of the critics – that “Gravity” is not only a film that warrants 3D, it is possibly the best space movie since “2001: A Space Odyssey”. And while the CGI and 3D technologies combine effectively to take a relatively simple story and turn it into an epic, it is not just a case of “form over content” – there is real substance in this film, a great example of using the technology to enhance the audience experience, rather than hoping it can paper over the cracks of clumsy narrative and lame dialogue.

My hesitation in embracing the 3D experience stems from a suspicion that the format dictates the story, that the technology is the product. A few days before watching “Gravity”, I saw posters advertising the new Hobbit movie, “The Desolation of Smaug”. Not content with the “standard” 3D version, there is also a new format “3D HFR” (High Frame Rate, whatever that means). Oh, and for traditionalists there is also “normal” 2D.

Personally, I could never get into the whole Lord of the Rings saga – even as a child, Tolkien’s stories left me cold. So I have not seen any of the films, but when I saw a 3D preview for “Smaug”, my worst suspicions were confirmed: this really is a case of form over content, which is ironic given the legacy of the source material. To me, the 3D images looked like the pages of a children’s pop-up book, because the depth of vision is so poor that the actors look like animated cardboard cutouts badly superimposed on CGI landscapes.

Many contemporary productions, full of CGI and “enhanced” for 3D, make Walt Disney’s Penguin Dance in “Mary Poppins” look far more naturalistic in comparison. And what about “Who Framed Roger Rabbit?” as the pinnacle of live action meets cartoon imagery – surely Jessica Rabbit has more vitality than all the characters in “Avatar” put together?

Technology is a wonderful thing – used creatively and effectively it can deliver fantastic results, making great content even better. But used slavishly, and as an end in itself, it cannot compensate for poor material, and at best becomes a sterile technical exercise.

Is being “creative” more authentic than being “realistic”?

How do we judge something to be authentic in the Information Age? In the 1990s, I worked on several projects to transfer reference books from print to CD-ROM and on-line formats. Because much of this material comprised official documents of record, the digital versions had to be “authentic” to the hard copy (even though they were being presented in a totally different medium) and employ embedded cross-referencing, indexing and other navigational tools. In short, the digital editions had to have the visual likeness of a microfiche copy, the readability of an e-book, and the functionality of an HTML5 website (if I may be permitted a mixed technical metaphor).

The quandary facing many product developers and content curators these days is, “How far should we go in the pursuit of “realism” (and by inference, “authenticity”) when having to make editorial, creative and technical choices to achieve credible outcomes?” And as consumers, the challenge we face is, “How do we know that what we see, read, hear or experience is an accurate depiction of something that actually exists or once happened/existed, or that it represents a consistent rendering/interpretation of real/imagined/possible events within the context and confines of the media being used?”

The issue is not about “real” in contrast to “virtual”, “original” as opposed to “replica”, “copy” rather than “counterfeit” – and certainly not about “truth” over “fiction”.

I’m not going to dwell on whether our virtual lives are any more/less authentic than our flesh and blood existence – that’s a matter of EI and self-awareness. I’m not interested in debating the merits of CGI technology in cinema, or questioning the use of auto-tuning in pop music – that’s a matter of aesthetics. And I’m not even going to argue that Photoshop has no place in the news media – that’s a matter of ethics.

I’m more concerned with understanding how technology, combined with content, connectivity and convergence has reshaped the way we engage with new media, to the point that our ability to assess information objectively is impaired, and our experience of authenticity is seriously compromised.

Now for a test: Which of the following statements is the most authentic (or least inauthentic)?

1) “Documentary claims NASA commissioned film director Stanley Kubrick to fake the TV images of the Apollo 11 moon landing”

2) “Pop singer Beyonce mimes to the national anthem at President Obama’s Inauguration”

3) “Jane Austen to publish a new edition of “Pride and Prejudice”, featuring Facebook, Twitter and sexting”

4) “Apple Corp announces that The Beatles are reforming, and will be performing their 1967 album “Sgt Pepper” live on tour”

OK, before dissecting the answers, I confess that one of these scenarios is totally made up – although, as we shall see, all of them have some basis in “reality”, and each of them presents a different dimension of “authenticity”.

1) Moon Landing: A few years ago, a documentary by William Karel called “The Dark Side of the Moon” suggested that NASA had indeed faked the Apollo 11 broadcasts. This story was based on an actual conspiracy theory that the TV images were a hoax, giving some credence to the notion that the Americans never went to the moon. The documentary uses a combination of recycled/re-contextualized archive footage, scripted interviews featuring real people playing themselves and professional actors playing fictional characters. To add credibility to the hoax theory that NASA commissioned Stanley Kubrick to shoot the fake moon landing in a studio, Karel involved Kubrick’s widow and other former colleagues. However, the names of the fictional characters are taken from characters in Kubrick’s own films. There are also bloopers and out-takes from the “interviews”. So, by the end of the film, it should be clear that the whole thing is a clever set-up – except that for some moon landing sceptics, “The Dark Side of the Moon” has lent support to their conspiracy theory. Recently, Gizmodo posted a brilliant rebuttal to the hoax theorists – namely, that neither NASA nor Kubrick could have faked the moon footage in 1969 because the required technology didn’t exist at that time…. I guess we’ll call this one an authentic/fictional mockumentary based on a real/imagined conspiracy theory concerning an alleged/improbable hoax.

2) Beyonce: It was revealed that Beyonce lip-sync’d her rendition of the national anthem, but she was miming to a real recording that she made with the actual US Marine Band the day before. Apart from the ongoing debate about whether pop singers do/don’t or should/shouldn’t mime during live performances (and let’s not get into the use of pre-recorded backing tracks…), the issues here are three-fold:

a) Does it make any difference to our experience of the event? (Probably not – anyone who heard Meatloaf perform at the AFL grand final a while back probably wishes he HAD been lip-sync’ing…)

b) Is Beyonce the first performer to mime at the Inauguration? No, and she won’t be the last, so big deal (Pre-recorded material is often used in these situations to compensate for bad weather, poor acoustics or possible technical hitches).

c) Does it make for a less authentic event? Possibly, but as others have pointed out, the President had taken the official oath the day before, and the outdoor event was more of a ceremony.

So, I’ll just label this an innocently staged event incorporating a well-intentioned fabrication designed to give the public what they want.

3) Jane Austen: This particular example of literary license does not involve the posthumously discovered work of a 19th century novelist. It doesn’t feature a 21st century medium channelling words from a dead writer. It doesn’t even concern the literary conceit of a contemporary author attempting to re-imagine a sequel to a classic work by an illustrious predecessor. (Although similar publishing events to all three scenarios have occurred in recent memory, so each of them is theoretically possible.) Instead, this refers to the forthcoming third “Bridget Jones” novel by Helen Fielding. It is generally acknowledged that Jane Austen’s “Pride and Prejudice” was a reference point for Fielding’s first novel “Bridget Jones’s Diary”. The latter is neither a pastiche nor a parody of Austen, but does use similar themes and scenarios from “Pride and Prejudice” and places them in a contemporary context. Given that Fielding has recently been quoted as saying she has an interest in internet dating, it’s not too far-fetched to suggest that her characters will be busily sexting each other after a long session in the local wine bar. Let’s put this one in the category of artistic homage, respectfully and authentically executed with due deference to its literary source material, and with a keen awareness of contemporary mores.

4) Sgt Pepper: OK, I admit that this scenario is totally fake, but it provides for some interesting hypotheses on how it might be done (assuming today’s technology, so no time-travel involved).

First, some background: as part of the music industry’s infatuation with managing and curating its back catalogue, there has been a noticeable trend for artists to tour and perform entire “classic” albums live on stage. This phenomenon reached its zenith last year, when German electronic band Kraftwerk performed a different album from their back catalogue at eight consecutive concerts staged in New York’s Museum of Modern Art. (Actually, I recall seeing Pink Floyd in 1977 when they played their two most recent albums, “Animals” (1977) and “Wish You Were Here” (1975), in full and in the exact same sequence as the original LPs…)

Second, until cryonics and human cloning are a scientific certainty, I won’t be suggesting that we exhume the two members of The Beatles who are no longer with us, or grow a couple of replicants. Equally, I’m not interested in whether the 2009 interactive video game, “The Beatles: Rock Band” allows me and my friends to re-enact the experience of being The Beatles playing on stage – it’s not the same as a live concert performance.

Instead, here’s how the “Sgt. Pepper” album might be brought to life:

a) The surviving members of The Beatles recruit some colleagues to make up the numbers (cf. The Who, Rolling Stones, etc.)

b) A Beatles tribute band is hired to recreate the album faithfully and in its entirety (cf. too many examples to mention – there’s even a new trend to recreate “classic” rock concerts on their relevant anniversary. Meanwhile, American band Devo formed a new “version” of themselves, called Devo 2.0, comprising young unknown musicians – tellingly, this venture was a collaboration with Disney)

c) Use holograms to substitute for the missing members of the original line-up (well, holograms aren’t yet viable, but ghostly projections are a possibility – cf. deceased rapper Tupac – and the music business has produced various examples of posthumous, exhumed and recreated material featuring dead pop stars, The Beatles included)

d) Send out a team of replica android Beatles to perform on stage (cf. Kraftwerk – again)

Except that, the “real” Beatles abandoned live performances in 1966, thus they never performed any of the songs from “Sgt. Pepper” live in public (in fact, “Sgt. Pepper” is generally considered to be the first example of a rock album created totally within a studio environment, and never conceived of as a live experience, even though it is a loosely-defined concept album featuring a fictional “live” band – how post-modern can you get?). Hence, any attempt to stage or recreate a live concert of “Sgt. Pepper” as performed by The Beatles, even if it is plausible, would have to be considered totally inauthentic. But with imagination (and a little help from our friends?) we can always dream…

POSTSCRIPT: After posting this article, I came across the following insight into the creative process by novelist William Boyd:

“…the best way to arrive at the truth is to lie – to invent, to fictionalize. The curious alchemy of art – rather than the diligent assembling of documentary fact – can be a swifter and more potent route to understanding and empathy than the most detailed photographs or the most compendious documentation. You have to do your homework, sure – authenticity has to be striven for – but in the end it is the fecundity and idiosyncrasy of the novelist’s imagination that will make the thing work – or not.” [Taken from Boyd’s anthology of non-fiction writing, “Bamboo” (2005)]
