AngelCube15 – has your #startup got what it takes?

Startup Victoria’s first Lean Startup meeting of the year heralded the launch of AngelCube’s 2015 accelerator program (#AC15), for which applications are now open. A good opportunity to check in with previous successful applicants, and find out if your startup is made of the right stuff.

The info evening was hosted by inspire9 and supported by PwC. Nathan from AngelCube kicked off proceedings with a rundown of the accelerator program, the application process, and the type of startups most likely to be accepted.

What does the program offer?

  • A 3-month intensive learning and development experience
  • $20k in funding (in return for 10% of the business)
  • Co-working facilities
  • Working with Lean methodology (focus on Product-Market fit)
  • Access to great mentors and advisers, and early-stage investors
  • Participation in a fundraising roadshow (including time in the US)

Applications are via a form on AngelList, and the closing date is May 10 (but the sooner you can submit, the better). From the hundreds of applications, AngelCube puts together a shortlist of 20, of which no more than 10 are likely to be accepted.

What is AngelCube looking for?

  • Globally scalable tech startups (think beyond Australia!)
  • In-house tech skills/resources (it’s not really a matching service)
  • Great teams (more than the ideas themselves)
  • Customer traction (ideally revenue-generating)
  • Consumer-oriented solutions (rather than B2B)

What has the experience been like for successful graduates?

Three alumni of previous AngelCube programs offered some personal insights, and then participated in a Q&A with the audience of 400:

First up was Peter from Ediply, a service that matches students to the course or university of their choice. Given the growth in education and lifelong learning, and the increasing numbers of students (especially from Asia) looking to study overseas, the business seemed like a natural fit for AngelCube. However, it was still a relatively new or unknown sector in terms of end-user or independent services (rather than in-house marketing and enrolment efforts) – which sort of broke one of AngelCube’s rules for acceptance: no established market. Peter stressed that the main reasons for applying were the need to overcome some development barriers, and to get out of a “Melbourne mindset”.

Ash from Tablo (“YouTube for books”) probably broke another AngelCube rule, in that he was a sole applicant (not part of a team) and he had limited tech resources. AngelCube made him work harder, think big, and keep going – and helped him to become a disruptive force in publishing, with customers in 130 countries collectively publishing 1 million words a day. He’s also closed a C-round of funding, and has some impressive investors on his share register.

Lastly, David from etaskr (“a private label Elance”) had to quit a full-time job with one week’s notice once he got accepted into AngelCube. He even had to Google how to pitch. Plus he came into the program with a totally different idea, got slammed, failed to get customer traction, and ended up pivoting to an enterprise software solution (breaking another AngelCube rule in the process – no B2B, because of the longer sales cycle). Despite having to live on very little money for 6 months (less than $200 pw), the team persevered, and are now starting to get traction, including in overseas markets like Holland. His final words were “risk is not something to fear, but to overcome”.

Q&A with the audience

Most of the questions were about the application process for AngelCube, and how it helped the successful startups, particularly with going global. In large part, this was due to some great networks, access to high-profile connections (“we got to meet the first employees at Yammer!”) and links to some influential investors. There was also some discussion about how to secure your first customers (mainly via social marketing techniques), and the challenge of enterprise sales (“it sucks, because you need 100 different minds to all say ‘Yes!’”).

Finally, for more insights, please visit these links to previous posts about AngelCube and some of the successful applicants.

Next week: Help! I need to get some perspective…

The 3L’s that kill #data projects

The typical data project starts with the BA or systems architect asking: “fast, cheap or good – which one do you want?” But in my experience, no matter how much time you have, or how much money you are willing to throw at it, or what features you are willing to sacrifice, many initiatives are doomed to fail before you even start because of inherent obstacles – what I like to refer to as the 3L’s of data projects.

Image taken from “Computers at Work” © 1969 The Hamlyn Publishing Group

Reflecting on work I have been doing with various clients over the past few years, it seems to me that despite their commitment to invest in system upgrades, migrate their content to new delivery platforms and automate their data processing, they often come unstuck due to fundamental flaws in their existing operations:

Legacy

This is the most common challenge – overhauling legacy IT systems or outmoded data sets. Often, the incumbent system is still working fine (provided someone remembers how it was built, configured or programmed), and the data in and of itself is perfectly good (as long as it can be kept up-to-date). But the old applications won’t talk to the new ones (or even each other), or the data format is not suited to new business needs or customer requirements.

Legacy systems require the most time and money to replace or upgrade. A colleague who works in financial services was recently bemoaning the costs being quoted to rewrite part of a legacy application – it seemed an astronomical amount of money to write a single line of code…

As painful as it seems, there may be little alternative but to salvage what data you can, decommission the software and throw it out along with the old mainframe it was running on!
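To make the salvage idea concrete, here’s a minimal sketch in Python (the fixed-width record layout and field names are invented purely for illustration – real mainframe exports vary wildly): parse the legacy export once, then write it out in a modern, portable format so the data outlives the system it came from.

```python
import csv
import io

# Hypothetical layout of a fixed-width export from a legacy system.
# Field names and column positions are assumptions for illustration.
LAYOUT = [("account_id", 0, 8), ("name", 8, 28), ("balance", 28, 38)]

def parse_fixed_width(line):
    """Slice one fixed-width record into a dict of trimmed fields."""
    return {field: line[start:end].strip() for field, start, end in LAYOUT}

def salvage(legacy_lines):
    """Convert legacy records to CSV so the data survives decommissioning."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=[f for f, _, _ in LAYOUT])
    writer.writeheader()
    for line in legacy_lines:
        writer.writerow(parse_fixed_width(line))
    return out.getvalue()

sample = ["00001234Jane Citizen        0000150.75"]
print(salvage(sample))
```

The point of the sketch is the one-way door: once the records exist in an open format, the old application (and the hardware it runs on) can be retired without taking the data with it.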

Latency

Many data projects (especially in financial services) focus on reducing systems latency to enhance high-frequency and algorithmic securities trading, data streaming, real-time content delivery, complex search and retrieval, and multiple simultaneous user logins. From a machine-to-machine data handover and transaction perspective, such projects can deliver spectacular results – with the goal being end-to-end straight through processing in real-time.

However, what often gets overlooked is the level of human intervention – from collecting, normalizing and entering the data, to the double- and triple-handling to transform, convert and manipulate individual records before the content goes into production. For example, when you contact a telco, utility or other service provider to update your account details, have you ever wondered why they tell you it will take several working days for these changes to take effect? Invariably, the system that captures your information in “real-time” needs to wait for someone to run an overnight batch upload or someone else to convert the data to the appropriate format or yet another person to run a verification check BEFORE the new information can be entered into the central database or repository.

Latency caused by inefficient data processing not only costs time, it can also introduce data errors caused by multiple handling. Better to reduce the number of hand-off stages, and focus on improving data quality via batch sampling, error rate reduction and “capture once, use many” workflows.
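As a rough illustration of “capture once, use many”, here’s a Python sketch (the field names and validation rules are hypothetical): the record is validated and normalised a single time at the point of capture, so every downstream consumer reuses the same clean copy instead of re-keying or re-converting it in a later hand-off.

```python
import re
from datetime import datetime

def capture(raw):
    """Validate and normalise on first touch; reject bad input immediately."""
    if not re.fullmatch(r"\d{8}", raw["account_id"]):
        raise ValueError("account_id must be 8 digits")
    return {
        "account_id": raw["account_id"],
        # Convert the locally-formatted date to ISO once, at entry.
        "updated": datetime.strptime(raw["updated"], "%d/%m/%Y").date().isoformat(),
    }

record = capture({"account_id": "00001234", "updated": "25/02/2015"})

# Downstream consumers (billing, CRM, reporting) all read the same record;
# no overnight batch conversion, no second round of data entry.
print(record["updated"])
```

Each hand-off stage you remove is one less place for a transcription error to creep in, which is the real payoff beyond raw speed.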

Which leads me to the third element of the troika – data governance (or the lack thereof).

Laissez-faire

In an ideal world, organisations would have an overarching data governance model, which embraces formal management and operational functions including: data acquisition, capture, processing, maintenance and stewardship.

However, we often see that the lack of a common data governance model (or worse, a laissez-faire attitude that allows individual departments to do their own thing) means there is little co-operation between functions, additional costs arising from multiple handling and higher error rates, plus inefficiencies in getting the data to where it needs to be within the shortest time possible and within acceptable transaction costs.

Some examples of where even a simple data capture model would help include:

  • standardising data entry rules for basic information like names and addresses, telephone numbers and postal codes
  • consistent formatting for dates, prices, measurements and product codes
  • clear data structures for parent/child/sibling relationships and related parties
  • coherent tagging and taxonomies for field types, values and other attributes
  • streamlining processes for new record verification and de-duplication
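To show how little is needed to get started, here’s a hedged Python sketch of a few of the rules above (the field names and formatting choices are assumptions, not a prescription): normalise each record at entry, then de-duplicate on the normalised values rather than on whatever free text was keyed in.

```python
import re
from datetime import datetime

def normalise(record):
    """Apply consistent formatting rules before a record is stored."""
    return {
        "name": " ".join(record["name"].split()).title(),   # collapse spaces, fix case
        "phone": re.sub(r"\D", "", record["phone"]),        # digits only
        "joined": datetime.strptime(record["joined"], "%d/%m/%Y").date().isoformat(),
    }

def deduplicate(records):
    """Drop records whose normalised name + phone already exists."""
    seen, unique = set(), []
    for r in map(normalise, records):
        key = (r["name"], r["phone"])
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique

raw = [
    {"name": "jane  CITIZEN", "phone": "(03) 9123 4567", "joined": "01/02/2015"},
    {"name": "Jane Citizen",  "phone": "0391234567",     "joined": "01/02/2015"},
]
print(len(deduplicate(raw)))  # the two entries collapse to one
```

Note that the duplicate is only visible *after* normalisation – which is exactly why “do your own thing” departmental data entry produces duplicate customers that no downstream system can reconcile.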

From experience, autonomous business units often work against the idea of a common data model because of the way departmental IT budgets are handled (including the P&L treatment of and ROI assumptions used for managing data costs), or because every team thinks they have unique and special data needs which only they can address, or because of a misplaced sense of “ownership” over enterprise data (notwithstanding compliance firewalls and other regulatory requirements necessitating some data separation).

Conclusion

One way to think about major data projects (systems upgrades, database migration, data automation) is to approach them rather like a house renovation or extension: if the existing foundations are inadequate, or if the old infrastructure (pipes, wiring, drains, etc.) is antiquated, what would your architect or builder recommend (and how much would they quote) if you said you simply wanted to incorporate what was already there into the new project? Would your budget accommodate a major retrofit or complex re-build? And would you expect to live in the property while the work is being carried out?

Next week: AngelCube15 – has your #startup got what it takes?

“Why? Because we’ve always done it this way…”

A couple of blogs ago, one of my regular correspondents kindly laid down a challenge. He suggested that part of the answer to the problem I was writing about (i.e., how to manage data overload) could be found within Simon Sinek’s “Start With Why”.

I’m quite familiar with Sinek’s investigation of “Why?”, but I wasn’t sure it was applicable in the context of my topic. Don’t get me wrong – the “Golden Circle” is a great tool for getting leadership teams to explore and articulate their purpose, and it can help individual business owners to re-connect with the reasons they do what they do.

It can even facilitate new product and service development.

But, I believe it’s harder to apply at an operational or processing level, where the sorts of decisions I was referring to in my blog are typically being made: what tools to use, what systems to adopt, what software to deploy etc.

There are several reasons why organisations do things the way they do them. When undertaking a business process review, I frequently ask the question, “Why are you doing this?”

Here are some typical responses I’ve received (and my conclusions in parentheses):

  • “Because we have to” (compliance)
  • “Because we’ve been told to” (command and control)
  • “Because we’ve always done it like this” (inertia)
  • “Because everyone else is doing it” (cheap/easy/popular)
  • “Because our consultants recommended it” (cop-out)

In one engagement, I had to implement a process change within a publishing team comprising experts (writers) and technicians (editors). The problem was that even though the content was published on-line, most of the production processes were done on hard copy, before the final versions were uploaded via a content management system. The inefficiencies in the process were compounded by a near-adversarial relationship between writers and editors, at times bordering on a war of attrition.

When I asked the team why they worked this way, their responses were mainly along the lines of “command and control” and “inertia”. Behaviours were reinforced by some self-imposed demarcation.

The writers felt it was their role as experts to demonstrate everything they knew about the topic (without necessarily saying what they actually thought); while the editors felt they were required to work within a rigid house style (to the point of pedantry), maintain writing quality (at the expense of timeliness), and to maintain content structure and format (over context and insight).

  • Both sides felt they were meeting the organisation’s purpose: to deliver quality information to their customers to help them make informed decisions.
  • Both believed they were following clear operational guidelines covering production, technical, and compliance requirements.
  • Both were passionate about what they did, and took great pride in their work.

Unfortunately, the procedures which they had each been told to follow were inefficient, at times contradictory, and increasingly out of step with what customers actually wanted.

Based on market feedback, clients told us they:

  • favoured timeliness over 100% perfection;
  • preferred insights over data dumps; and
  • really wanted “little and often” in terms of content updates.

Thankfully, the voice of the customer prevailed, and the introduction of more timely content management processes resulted in frequent updating (via regular bulletins) backed by the “traditional” in-depth analysis.

When starting a change management project, conducting a process review, or undertaking a root-cause analysis, if asking “Why?” doesn’t get you very far in getting to the bottom of a problem, I find that it can help to pose another question: “What would your customers think about this?” For example, if customers knew how many times a piece of data was handed back and forth before their order/request/enquiry was processed, what impression might that give about an organisation?

For most companies, their sense of purpose is driven by a strong or underlying desire to serve their customers better – it’s as simple as that.

Next week: The 3L’s that kill #data projects

The Great #Data Overload Part 3: Differentiating in a #Digital World

Have you noticed that what was once old is new again? In particular, I’m talking about traditional direct marketing techniques, such as door-to-door sales, print circulars, and telephone cold calling. It’s as if businesses realise that to be heard and to get noticed in the digital world, you have to do something different or unexpected, and nobody expects to see a door-to-door salesperson these days!

I mostly work from a home office, and in recent months I have had door-knockers trying to sell me car tyres, energy-saving devices and fire extinguishers. That’s in addition to the telesales calls trying to persuade me to switch phone and utility providers, take out insurance or upgrade my security software (yes, I know that last one is probably a scam). Plus, more and more local businesses and tradespeople are using good old-fashioned leaflets and letter box drops (which is interesting, given that around 58% of local search is done on a mobile device).

Why are some advertisers reverting to this form of direct marketing?

I can think of several reasons:

  • They need to cut through the digital noise and reach their target audience via “novel” promotional tactics.
  • Their products and services are less suited to on-line or in-app purchasing decisions.
  • Their sales activities are focused on acquiring existing customers from competitors, a conversion process more likely to succeed via personal contact.
  • Or simply, the costs make more sense.

Why is it important to differentiate? 

It’s 10 years since “Blue Ocean Strategy” was published, stressing the need to stand apart from your competition (“avoid the shark-infested waters”). The message is even more relevant today: the ubiquity of social media and content marketing platforms means that everyone has access to the same tools, and it’s not that difficult to play technology catch-up. So while there may be good reasons for your business to engage with these channels to market, you also need some alternatives, such as direct customer engagement that is not wholly reliant upon on-line and digital. That’s why some banks are opening more branches as part of their growth and customer acquisition strategy, why some retailers are offering “buy on-line, collect in-store”, and why some service companies are moving to an integrated, end-to-end customer experience, so that customers get the same person helping to resolve their problem from start to finish.

How to differentiate?

Standing out from the crowd (for the right reasons!) is critical to attracting customer attention. Competing on price alone is typically a race to the bottom where nobody wins. Getting noticed, especially when everyone is using the same marketing tools and sales offers, may mean doing something unusual or unexpected (for example, ALDI’s “anti-ads”) as part of your marketing campaign. Or connecting directly with your audience in a way that doesn’t rely on “Likes”, “Shares” or “Follows”.

Sometimes it’s as simple as this leaflet (shown above) found in my letter box the other day. At first, I thought it was a flyer for a local bar. Then, I noticed it was promoting a new smart phone app. On closer inspection, the flyer comprised a printed sheet hand-pasted onto a page torn from a magazine. That’s a lot of manual effort to promote a digital product, but using a lo-tech solution that totally makes sense! (No doubt, it appeals to the hipster crowd, ’cause retro’s cool, right?) So, the element of surprise (if that was the intention) worked – it got my attention because I wouldn’t have expected to receive a leaflet for a new app.*

Next week: “Why? Because we’ve always done it this way…”

Notes

* For an interesting story on the power of the unexpected, see Adam Posner’s talk on customer loyalty programs.