The "Cloud" has gained a lot of momentum this past year, and one place that is readily apparent is in the stock prices of companies making major strategic investments in Cloud technology and associated offerings, backed by aggressive go-to-market plans for those offerings.
To demonstrate this, take a look at the one-year stock price increase of eight major cloud vendors versus the Dow Jones Industrial Average. These eight growth companies were selected because of their software-as-a-service (SAAS) or infrastructure-as-a-service (IAAS) focus. They are Informatica (INFA), Salesforce.com (CRM), Amazon (AMZN), NetSuite (N), Rackspace (RAX), SuccessFactors (SFSF), Akamai (AKAM), and VMware (VMW). These securities have seen on average an 81% price increase over the past year, versus a paltry 6% for the Dow Jones Industrial Average (which at least has gone up).
Will it continue? There is still a long way to go in this space, so it probably will.
The debates rage on about "Public Clouds" and "Private Clouds" and which is more appropriate for serious computing efforts, from business systems to the whole universe of applications.
Most vendors, not surprisingly, line up behind the approach that best suits their product offerings.
For example, the SAAS vendors (Salesforce, NetSuite, SuccessFactors) say that multi-tenant applications are the Cloud, arguing that shared, multi-tenant software resources, including databases, are needed to make the Cloud truly useful. Yet many of these vendors are often criticized for not providing "open" models, so some long-term questions remain. Yes, these Clouds are easy to get into, but how do you get out of them if necessary?
The infrastructure-as-a-service crowd (Amazon's EC2, Google App Engine, Rackspace) will suggest that only infrastructure is the "true" Cloud, meaning that renting clean servers by the minute and storage by the byte represents the original "open" Cloud vision, enabling applications to be moved from Cloud to Cloud without difficulty. However, this is just servers and storage in the end (at least for now), so the user still has to build everything themselves. Fine for some, not entirely useful for most.
And of course the enterprise software folks (Oracle, SAP, IBM) often claim that the Cloud can and should be "Private" because it offers a better security model and can be managed within the organization. This lets them capitalize on the hype of the Cloud without having to change much of their actual offerings. Of course, the challenge with this model is that, without sharing licenses or hardware across organizations, it becomes quite expensive, and quite frankly we have had this model before under other names such as "mainframe," "client-server," and other "in-house" architectures. Sure, there is some incremental innovation and usefulness, but it's not much different from what has always been offered, just another iteration.
So while there are valid use cases for each of the above scenarios, there is one thing I want to point out in the Public versus Private Cloud discussion for businesses unsure which route to take. It goes all the way back to the birth of the Cloud as a concept.
The reason we even have the Cloud in the first place is that heavily-trafficked Web sites such as Google and Amazon found they had to build massive, high performance, scalable systems to be able to handle the processing load at peak times (Amazon at Christmas for example). This meant that during non-peak times, they found themselves with lots of excess, unused computing capacity.
This of course spawned the idea that they could leverage this excess capacity, as well as their expertise in managing high-performance, distributed, "Web scale" computing technology, as an additional line of revenue, and possibly launch a brand new industry of opportunities. Hence, the Cloud was born.
The one key piece of this Cloud concept is "expertise." This is something you get in Public Cloud environments that you don't get in Private Clouds. With a Private Cloud, you get all of the hardware and software (and the corresponding purchased licenses) that you need, but you don't have a team of experts that has been running the platform for years, monitoring, managing, and supporting it in real time while you use it, with full visibility into it as it runs. By definition, then, you don't have engineers supporting the success of your application systems on a minute-by-minute basis.
This real-time team of experts, and their associated expertise developed over time, is something you get inherently in the Public Cloud scenario. The folks who run these systems have as their core mission in life to keep the platform up and running, battle test it over time, improve it, enhance it, test it, analyze operational data, review performance charts, improve and enhance it again, and on and on, day after day.
Although a bit overused, the electric generator is a good example of demonstrating the difference. If you have your own electrical generators powering your home, it doesn't matter that thousands of other people have one just like it in their homes. If it goes down, you are on your own, and it's your responsibility to keep the electricity flowing from room to room. But if you plug into the electric grid run by your local power company, and there is an outage while you are having dinner somewhere, likely it will be fixed before you even get home from the restaurant. And you might not even notice there was a problem since you weren't at home (you were out dining in the "Dinner Cloud" and outsourcing the washing of dishes). This is because the system was monitored, a problem was detected, and a team was ready to spring into action once the outage occurred.
How long would it have taken to call the generator repairman to get him scheduled to come out with a power outage in your own generator? There's a reason electricity grids have evolved the way they have.
Oh, and all of the innovation occurring behind the scenes at the power company on a day-to-day basis? It comes to you automatically, often while you sleep, as opposed to a new giant chunk of hardware arriving every 18-24 months that you have to figure out how to configure and get up and running again.
So how is this relevant to StrikeIron?
Well, the same is true in our case. While we are more the Software-as-a-Service variety of Cloud Computing (in our case, "data-as-a-service"), we recognize that users have a choice in how they obtain the type of functionality we offer. Many of our powerful capabilities, such as the real-time telephone, address, and email verification in our Cloud-managed Contact Record Verification Suite, could also be purchased and brought in-house as software applications and raw data sources, and a similar result could be achieved in terms of better, more usable customer data assets. The approach would just be a heck of a lot different.
In the latter scenario, all of the verification reference data would have to be managed and maintained internally. One would have to acquire the software and data files, and then get the functionality up and running. It would then have to be designed and delivered in such a way to be able to handle the various loads of data verification that might appear from different applications at different times, and often in high volume scenarios. Also, all of the other expertise around availability, testing, updating, and the usual effort associated with in-house solutions would have to be developed internally.
With us, all we do day in and day out is focus on delivering our real-time data verification capabilities to thousands of applications simultaneously, with a very high level of performance, 24x7x365. All you need to do, just like with the electric company, is plug into us. All of the data management, updating, software maintenance, and performance testing and tuning is done by us, with the heavy lifting abstracted away from you.
Since we launched our system in 2005, we have constantly improved our finely-tuned delivery and fault-tolerant capabilities, including load-balancing, high speed data I/O, redundancy, external monitoring, and everything else we have to provide to be able to support our customers and their production applications. And we are getting smarter and better about how we go about it every day. This expertise is something that each and every one of our customers gets to leverage with every single call to our system. This is why we have only had minutes of downtime over the last four years.
So could in-house solutions provide the same end result? Maybe in the sense that yes you could end up with good clean customer data somehow on your own. But at what cost, effort, and with what missed opportunities? Focus on your core business, and leave the external data verification effort to us. We will keep the lights on. Guaranteed.
Customer data is a corporate asset that has value and, if managed correctly, can provide an organization with a sizable return. The challenge often becomes how to measure the value of this electronic asset and quantify its internal rate of return. Imagine for a moment if we could considerably increase customer retention, increase per-customer value, and also increase the total number of customers we have. While different for each organization, calculating this kind of return can be a fun exercise, and I firmly believe one way to achieve these gains is through the customer data asset.
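To make that exercise concrete, here is a minimal back-of-the-envelope sketch. All of the numbers, the function name, and the simple shrinking-base model are illustrative assumptions of mine, not figures from any actual organization; the point is only to show how small simultaneous gains in retention, per-customer value, and customer count compound.

```python
# Hypothetical back-of-the-envelope model of customer data ROI.
# Every figure below is an illustrative assumption, not a benchmark.

def customer_base_value(num_customers, annual_value_per_customer,
                        retention_rate, years=5):
    """Rough total revenue from a customer base over a horizon,
    assuming the base shrinks by (1 - retention_rate) each year
    and ignoring new customer acquisition for simplicity."""
    total = 0.0
    customers = float(num_customers)
    for _ in range(years):
        total += customers * annual_value_per_customer
        customers *= retention_rate
    return total

# Baseline: 10,000 customers, $500/year each, 80% annual retention.
baseline = customer_base_value(10_000, 500, 0.80)

# Better data: modest gains in all three levers at once.
improved = customer_base_value(10_500, 525, 0.85)

uplift = improved - baseline
```

Even with these made-up inputs, a 5% improvement in each lever yields a five-year uplift of roughly 20% over the baseline, which is why treating the customer data asset seriously can pencil out quickly.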
Now, we all know a company's success can often be determined by how well it manages its customer relationships. The raw material for managing these relationships exists in the form of an organization's customer data. In other words, accurate, comprehensive, and complete customer data can help considerably in achieving better customer relationships. After all, how well can you communicate with your customer base if you have poor data to do it with?
As an example, ongoing customer communication such as a newsletter can strengthen ties and loyalty, as well as keep customers informed of positive company developments that make them want to continue to be a customer. However, without having accurate and up-to-date information for this communication (such as accurate physical addresses and current email addresses), this is not possible to do effectively.
Also, something as simple as a post-order product delivery to the wrong place due to an address typo can create considerable customer service problems. We have all experienced the pain of items not shipped correctly to us by an online vendor, and the maddening follow-up process that usually ensues to straighten things out. Simple data verification at the point of sale can drastically reduce the incidence of this.
With better data, call center representatives can respond faster and more effectively to customers when they don't have to spend time correcting customer information. This of course helps them provide better service.
Amazingly, organizations have invested millions of dollars into sophisticated CRM systems such as Salesforce, NetSuite, Oracle, and others, and yet the overwhelming majority of these instances are still plagued by inaccurate, incomplete, and otherwise poor data quality that grinds away at the ROI potential of these kinds of systems.
While analysts' estimates of the cost of poor data quality to an organization vary widely, depending on what is included and the characteristics of the organization, the one thing they agree on is that the cost is large.
The good news is that on the customer data front, the key to better data is to perform validation and data enhancement at the point of data collection, whether it be from a Web form, a call center, a mailing response, or any other way customer information is collected. These days, with XML-based customer data verification services available via simple integration over the Web, this is quite easy to achieve.
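As a rough illustration of point-of-capture validation, the sketch below runs a cheap syntactic pre-check and then posts the record to a verification endpoint. The endpoint URL, field names, and response shape are hypothetical placeholders (not StrikeIron's actual API), and JSON is used here for brevity even though such services were often XML-based.

```python
import json
import re
import urllib.request

# Very loose email shape check: something@something.tld
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def looks_like_email(value):
    """Cheap local pre-check before spending a remote verification call."""
    return bool(EMAIL_RE.match(value))

def verify_contact(record, endpoint="https://verification.example.com/verify"):
    """Send a just-captured contact record to a (hypothetical)
    verification service and return its verdict as a dict."""
    if not looks_like_email(record.get("email", "")):
        # Reject obviously malformed input without a network round trip.
        return {"valid": False, "reason": "malformed email"}
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

In practice a Web form handler would call something like `verify_contact` before writing the record to the CRM, so typos are caught while the customer is still on the page and able to correct them.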
Sure, good customer data can be augmented with back office processing by utilizing additional data sources, but this typically is only successful if the originally-captured data is complete and accurate.
So while the bottom-line impact of poor customer data is not always a straightforward calculation, trusting your intuition that better data equals better customer relationships, and addressing customer data quality early in the data life cycle, can pay significant dividends.
Acquisitions of companies such as Sun, BEA, PeopleSoft, Cognos, Siebel, Business Objects, and countless others over the past few years have created a competition vacuum in the enterprise software space. For example, in the last five years or so, Oracle has spent over thirty billion USD purchasing nearly sixty companies. Microsoft has gobbled up eighty or so, IBM sixty, EMC forty, and Hewlett-Packard approximately thirty-five. And these are just the giants.
The next tier of enterprise software companies also have pretty long lists of recent acquisitions. So one can imagine quite easily that this collective buying spree has created a deep void in the landscape of enterprise software, and as a result creates a tremendous opportunity.
After all, not much has happened in terms of new products and innovation in the space in the past several years, save for a handful of companies such as Salesforce.com and NetSuite and some of the various SAAS and open source models that have emerged. But even much of this is nearing the ten-year mark.
Interestingly, some of the Fortune 500 have annual I.T. budgets north of a billion dollars per year. And those that don't still have budgets that are quite large. This, combined with the fact that many of their primary systems were built and deployed in the 1990s (yes, that's ten to twenty years ago) and are getting a bit "long in the tooth," as they say of aging horses, creates an interesting set of dynamics.
In addition, Cloud infrastructure is maturing and getting more firmly in place with more efficient computing resource and data storage models. It is quickly becoming the seedbed for future enterprise software innovation, not only in new software categories, but also in the traditional categories of business intelligence, analytics, data management, and employee and customer-facing applications.
All of these trends point to a "perfect storm" of opportunity. Their alignment ought to be attractive to a new wave of entrepreneurs that can take advantage of the emerging Cloud Computing trend in new and exciting ways. This will enable a great deal of new innovation in the enterprise/corporate information technology space.
So while much of the technology press is caught up in the Android-iPhone rage, Facebook privacy issues, and the Groupons and Foursquares of the world, many technology veterans are quietly taking notice of this enterprise software void and recognizing the opportunity for what it is.
As one example, Marc Andreessen of Netscape fame has recently indicated that his venture capital firm is investing in a "new wave" of enterprise software companies. Others are sure to follow this trend of focus including both entrepreneurs and investors.
In other words, I don't subscribe to the opinion held by some that enterprise software is dead. So over the next couple of years, I do expect a wave of new enterprise software companies to emerge, setting off another arms race in the corporate I.T. space as organizations battle it out to stay a step ahead of their competition.
Fortunately, companies like StrikeIron with our data-as-a-service external data and data verification components can benefit extensively from this trend, providing important pieces to these emerging applications with ease.
It should be exciting times ahead.