2011 has been the year of the Cloud database. The idea of shared database resources and the abstraction of the underlying hardware seems to be catching on. Just as with Web and application servers, paying as you go and eliminating unused database resources, licenses, hardware, and all of the associated costs is proving to be an attractive enough business model that the major vendors are betting on it in significant ways.
The recent excitement has not been limited to the fanfare around "big data" technologies. Lately, most of the major announcements have centered on traditional relational, table-driven SQL environments, which Web applications use far more widely than the key-value data stores that "NoSQL" technologies provide for Web-scale, data-intensive applications such as Facebook and Netflix.
Here are some of the new Cloud database offerings for 2011:
Salesforce.com has launched Database.com, enabling developers in other Cloud server environments such as Amazon's EC2 and the Google App Engine to utilize its database resources, not just users of Salesforce's CRM and Force.com platforms. You can also build applications in PHP or on the Android platform and utilize Database.com resources. The idea is to reach a broader set of developers and application types than just CRM-centric applications.
At Oracle OpenWorld a couple of weeks ago, Oracle announced the Oracle Database Cloud Service, a hosted database offering running Oracle's 11gR2 database platform, available on a monthly subscription and accessible either via JDBC or its own REST API.
Earlier this month, Google announced Google Cloud SQL, a database service that will be available as part of its App Engine offering based on MySQL, complete with a Web-based administration panel.
Amazon, to complement its other Cloud services and highly used EC2 infrastructure, has made the Amazon Relational Database Service (RDS) available to enable SQL capabilities from Cloud applications, giving you a choice of underlying database technology to use such as MySQL or Oracle. It is currently in beta.
Microsoft also has its SQL Azure Cloud database offering, generally positioned for developers building on the Microsoft stack who want to leverage some of the benefits of the Cloud.
Some of the above offerings have only been announced so far and not actually launched, or have limited preview access available now. In some cases, even the business models have not been completely divulged, or where they have, are very likely to change.
Clearly, a considerable market-share land grab is under way. All of the major vendors recognize that traditional-SQL Cloud storage infrastructure will be an important technology going forward. Adding a solid database layer to the Cloud architecture story seems like an important step in the continuing enterprise and commercial software move to the Cloud, and these new vendor offerings should in turn accelerate this move.
So, is this really the wave of the future? Some of the major questions that will have to be answered involve latency. When a data request has to hop from a client application to the application server, then to the database, and back again, possibly multiple times within a single request, it can result in quite a performance hit. These machines likely sit far apart geographically, which can really slow things down, annoying end users with slow page loads. This is probably why most infrastructure providers realize that they have to make corresponding database capabilities available and accessible natively to reduce this latency. Even so, performance, along with security issues (perceived or otherwise), could still be a significant barrier to mainstream adoption.
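A back-of-envelope sketch (not from any vendor's numbers; the round-trip figures are illustrative assumptions) shows how sequential database round trips compound when the application server and the database sit far apart:

```python
# Estimate user-visible latency for one page request that issues several
# sequential database queries, ignoring server processing time.
# All round-trip times (RTTs) below are illustrative assumptions.

def request_latency_ms(db_queries: int, app_to_db_rtt_ms: float,
                       client_to_app_rtt_ms: float) -> float:
    """One client round trip plus one round trip per sequential query."""
    return client_to_app_rtt_ms + db_queries * app_to_db_rtt_ms

# Database in the same data center: ~1 ms per query round trip.
same_dc = request_latency_ms(db_queries=10, app_to_db_rtt_ms=1,
                             client_to_app_rtt_ms=50)

# Database hosted across the country: ~70 ms per query round trip.
far_away = request_latency_ms(db_queries=10, app_to_db_rtt_ms=70,
                              client_to_app_rtt_ms=50)

print(same_dc, far_away)  # 60.0 vs 750.0 -> an order of magnitude slower
```

Ten queries against a distant database turn a 60 ms page into a 750 ms page, which is exactly why providers want the database hosted next to the application tier.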
Also, most relational database environments in the Cloud offer only a subset of SQL capabilities, which in some cases can be quite limited. For example, many of these Cloud SQL platforms don't support cross-table joins, at least not yet, even though joins are a very common requirement for SQL applications. The lack of support is primarily because joins can consume a lot of resources, another performance killer in shared environments.
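When the platform won't perform the join, the usual workaround is to run two single-table queries and stitch the rows together in application code. The sketch below uses Python's built-in sqlite3 purely as a stand-in for such a Cloud SQL service; the tables and data are made up for illustration:

```python
import sqlite3

# sqlite3 stands in for a Cloud SQL platform that disallows JOINs.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 2, 99.0);
""")

# Query each table independently -- no JOIN is sent to the server.
names = dict(db.execute("SELECT id, name FROM customers"))

# Join in application code via a lookup dictionary.
joined = [(names[cust_id], total)
          for cust_id, total in db.execute("SELECT customer_id, total FROM orders")]

print(joined)  # [('Acme', 250.0), ('Globex', 99.0)]
```

This works for modest result sets, but it moves the join cost (and the extra round trips) into the application, which is part of the performance trade-off described above.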
Once most of this storage and Cloud database infrastructure gets in place however, incorporating more content-oriented data services such as customer data verification will become commonplace and easy to leverage. We may even see them incorporated into the database offerings themselves as they look to differentiate themselves from vendor to vendor. Cloud-based database offerings have the advantage of making much larger libraries of data-oriented add-on capabilities available right out of the box, so the story here is much more than just cost.
While SQL Cloud offering announcements are all the rage in 2011, 2012 will undoubtedly tell the adoption tale. No doubt these offerings will be ideal and cost-effective for many use cases out there. But will demand be large enough quickly enough to support all of these vendors and drive the innovation at a speed that will make these platforms viable in the near future for enterprise and commercial applications? The answer is likely yes, but the next twelve months or so will give us a lot of the supporting data to measure the extent of the trend.
At the CTIA Enterprise and Applications event last week in San Diego, the trade group announced that the total number of text messages sent in the United States increased 16% to 1.138 trillion during the past year.
One of the reasons for this is the continued explosive growth in smartphones. In the past year, the number of active smartphones in the US grew from 61.2 million to 95.5 million, representing 56% year-over-year growth. In fact, the CTIA also reported that there are more wireless subscriptions in the United States than there are people. There are 327.6 million US mobile connections and only 315.5 million people in the country, a 103.9% penetration rate.
Given these statistics, it's no coincidence that the number of SMS messages being sent continues to climb. Text messaging is still the #1 data application on devices and is still the best way to reach someone. People carry their cell phones far more often than other computing devices such as tablets and PCs, allowing an individual to be reached immediately, practically everywhere. Also, enforced regulations and a per-message cost have prevented mobile devices from being inundated with spam as we see with other communications media such as email and instant messaging.
The continued growth of rich applications available on smartphones and their continued penetration will only make them even more pervasive in our lives as time goes on. One can envision a day when smartphones are used for on-the-spot payments, as credit cards, as hotel keys, and maybe even to unlock and drive our cars. We are just at the dawn of the smartphone age. New standards like LTE are also giving these devices an ever-increasing amount of digital bandwidth, pouring more fuel on an already raging fire.
The other key factor for continued SMS growth and its communication supremacy is that there exists a global standard for messaging. This enables phones worldwide, regardless of carrier and in hundreds of countries, to be able to receive and send text messages to each other, and even more importantly receive them from software and Internet applications. This makes SMS text messaging not only ideal, but practically a required feature for opt-in marketing automation, reminders, alerts, shipping notifications, data-driven triggers, and other system-delivered notifications.
On the application side, platform-independent, Cloud-based SMS APIs and Web services are available for easy integration into applications, Web sites, business processes, and other places where tapping into the worldwide network of cell phones is useful, not only with customers and prospective customers, but also within employee networks, work groups, and other organizations where immediate or scheduled notification has significant value. These APIs dramatically reduce the complexity and cost required to SMS-enable just about any application critical to an organization.
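As a minimal sketch of what such an integration looks like, the snippet below posts a message to a Cloud SMS API over HTTP. The endpoint URL and parameter names are hypothetical placeholders, not any real provider's API; the number normalization and request construction are the reusable parts:

```python
import urllib.parse
import urllib.request

# Hypothetical endpoint -- substitute your SMS provider's actual API URL.
SMS_API_URL = "https://api.example.com/v1/sms/send"

def build_sms_request(api_key: str, to_number: str, message: str) -> bytes:
    """Normalize the destination number and URL-encode the POST body."""
    digits = "".join(ch for ch in to_number if ch.isdigit())
    payload = {"apikey": api_key, "to": "+" + digits, "message": message}
    return urllib.parse.urlencode(payload).encode("ascii")

def send_sms(api_key: str, to_number: str, message: str) -> bool:
    body = build_sms_request(api_key, to_number, message)
    req = urllib.request.Request(SMS_API_URL, data=body)
    with urllib.request.urlopen(req) as resp:  # network call; needs a live endpoint
        return resp.status == 200

body = build_sms_request("demo-key", "(919) 555-0100", "Your order shipped")
print(body)
```

Because the transport is plain HTTP, the same few lines work from a Web site, a batch job, or a business-process trigger, which is what makes these APIs so easy to drop into existing systems.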
For a long time to come, SMS will continue to rule as a premier business communication vehicle.
The CTIA Fall event wrapped up in San Francisco this past week and the message was loud and clear. The enterprise is going to be the next hot place for mobile and wireless over the next twelve months and beyond.
In fact, the CTIA even changed the name of the annual event to "Enterprise and Applications" this year, reflecting this enterprise focus and where the organization thinks the largest opportunities will be in 2011.
To provide a foundation for this thinking, here are some of the trends that will drive continued innovation and adoption in enterprise mobile computing going forward:
- Mobile operating system innovation is continuing at a fast pace thanks to competition among Android, Blackberry, Apple's iOS, and Windows Phone 7, and many of the new features have significant enterprise relevance.
- Smartphone innovation is moving beyond consumers and into the enterprise now, as evidenced by new device launches such as Motorola's Droid Pro, an Android device that takes on the Blackberry, currently the enterprise stalwart.
- There are currently 61 million smartphones on US networks, a number growing every day.
- Another $21 billion has been invested in wireless infrastructure by the industry the last twelve months.
- The success of the iPad is launching a new revolution in tablet computing, perhaps even more ideal for the enterprise than smartphones.
- The emergence of cloud computing, a major trend in the enterprise and ideally suited for mobile applications, will drive much enterprise mobile innovation.
- We are inching closer and closer to a 4G world as evidenced by LTE rollouts, where data throughput will be increased considerably, and latency significantly reduced.
- Human barriers to enterprise adoption, such as resistance from IT and executive management, are decreasing as these individuals use wireless devices more and more in their own lives and see the value and potential within the enterprise.
We have witnessed this uptick already at StrikeIron, seeing adoption of our SMS Alerts and Notifications text messaging solution increase significantly in 2010. Text messaging is especially important when communicating with customers, employees, and others whose device types are not known, or where freedom of device choice is encouraged. These types of SMS-based alerts and notifications can dramatically improve customer service, strengthen relationships with customers, and drive new business opportunities. And thanks to Web Services API solutions like ours, it is easy to integrate SMS capabilities into applications, Web sites, business processes, or anywhere else a real-time notification could be beneficial.
2011 should be a very interesting year to watch for mobile computing in the enterprise.
Location-based technology is not new, yet we definitely seem to be getting closer and closer to the proverbial "tornado" as location-savvy technology providers are emerging and innovating at a faster rate than ever before.
The space has really caught fire as of late due to combining location-aware technology with social network applications and the pervasiveness of smartphones. This combination has really provided the foundation for the space to flourish. And it has. Companies like Where.com, PlacePop, Placecast.net, and Urban Mapping are just a few examples of small companies with big plans and with geolocation at the heart of their business models.
Also, user-initiated, location-revealing "checking in" seems to be the latest craze, as Web applications such as Foursquare, Gowalla, and Brightkite compete heavily, enabling individuals to see who they might know in attendance at an event. This can also help in discovering new places and meeting new people. The local business review site Yelp has recently added check-in capabilities as well.
And of course the likely suspects are in on the land rush too. Google's Latitude tells others that you allow to know your current location as you move about. Facebook and Twitter are rolling out location-based features as well, and Yahoo has already done an acquisition in the space with Indonesian-based Koprol (the "Asian FourSquare"). Even location-based games such as Booyah's MyTown are on the upswing.
Now that the technology exists, the value is fairly obvious. Clearly it makes sense for a travel company to provide trip-related content specific to a site visitor's nearest airport. As preseason football magazines discovered long ago with their regional covers, locally relevant headlines are more likely to get attention (and get clicked), especially with sports teams and news. And there are clearly regional differences in product interest. Earthquake preparedness, for example, is probably much more eye-catching in Northern California, while hurricane preparedness will resonate more in the Carolinas.
Remember all of those frequent-membership cards the local sandwich shops hand out so you can get a free sandwich every seventh visit? Unless you are living by the penny, they are just too much of a hassle to carry around and manage. In a geo-savvy smartphone world, however, they should become unnecessary. A store ought to be able to keep track, with permission and incentives of course, of who frequents its stores and their purchasing habits, and reward them appropriately.
The customer loyalty possibilities are endless. Imagine the local ice cream shop being able to determine at the press of a button who its top twenty-five customers are, and hand delivering to these customers on their birthday an ice cream cake based on their flavor history.
Software companies are also using IP address-oriented technology to ensure location-based license compliance. MLB TV is using it to enforce blackout restrictions for watching streaming broadcasts of Major League Baseball. Business Intelligence applications are also integrating these types of products to gather site trending information, geographical response to offers, customer base location, and more. The use case list is practically endless.
However, there are still some challenges that don't make things entirely easy. Most Web applications use IP address-related technologies and APIs to determine location when someone visits a site, and then provide location-specific content to the visitor. This is problematic for browser-based smartphones, however, as they will typically show the IP address of the carrier's hosting servers rather than the location of the smartphone.
For example, the location for a Blackberry will show as Toronto when using standard IP address-based technology, reflecting where the traffic exits in Ontario, home of RIM, the creator of the Blackberry. Of course, this is not useful for location-specific custom content on the device's browser.
Instead, smartphones use carrier-specific location services or built-in GPS to determine location, and therefore require user permission (they are usually prompted) for location to be used. This can make development of location-based applications more complex, with different requirements for different devices, and can require device-specific applications to be built.
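The IP-lookup side of this can be sketched as a simple range-to-place table. The networks and locations below are made-up documentation addresses, purely to illustrate both the happy path and the carrier-gateway pitfall described above:

```python
import ipaddress

# Illustrative lookup table -- real services license databases mapping
# millions of ranges. These networks/places are invented for the demo.
IP_LOCATIONS = [
    (ipaddress.ip_network("198.51.100.0/24"), "Raleigh, NC"),
    # Smartphone traffic often exits through a carrier gateway, so the
    # lookup returns the gateway's city, not the handset's location.
    (ipaddress.ip_network("203.0.113.0/24"), "Carrier gateway - Toronto"),
]

def locate(ip: str) -> str:
    """Return the place associated with the first matching network."""
    addr = ipaddress.ip_address(ip)
    for network, place in IP_LOCATIONS:
        if addr in network:
            return place
    return "unknown"

print(locate("198.51.100.7"))   # Raleigh, NC
print(locate("203.0.113.42"))   # Carrier gateway - Toronto
```

The second lookup shows why an application cannot blindly trust IP-derived location for mobile browsers: the answer is technically correct but useless for personalization.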
Auto-determined location-specific content can also be dangerous with search engine robots if sites are not implemented correctly with location-specific URLs (which sometimes requires performance trade-offs), as robots will only see the content for the location of the servers from which they happen to be crawling, and will only index that much. This can hurt traffic potential.
Also, latitude and longitude coordinates can be obtained from visitor-provided addresses to determine more precise locations. This opens up a whole new slate of use cases, and has actually been around for a while.
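Once an address has been geocoded to coordinates, distance math becomes straightforward. Here is a small sketch using the haversine great-circle formula for the "nearest airport" use case mentioned earlier; the coordinates are approximate and purely illustrative:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3956 * asin(sqrt(a))  # Earth's mean radius ~3956 miles

# Approximate coordinates for two North Carolina airports.
airports = {"RDU": (35.88, -78.79), "CLT": (35.21, -80.94)}

# A visitor geocoded to downtown Raleigh (approximate).
visitor = (35.77, -78.64)
nearest = min(airports, key=lambda code: haversine_miles(*visitor, *airports[code]))
print(nearest)  # RDU
```

With the visitor resolved to roughly a dozen miles from RDU versus well over a hundred from CLT, the travel site knows which airport's content to serve.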
All in all, despite some of the challenges that exist, powerful location-based technologies will continue to become increasingly sophisticated, and location-based applications will become more and more a part of our lives. Look for a lot to happen in this space in the next 12-24 months.
There are several great reasons for integrating live foreign currency exchange rates into applications and Websites, especially since it is so easy and cost-effective to do.
For example, Melissa Smith of retailcustomerexperience.com reported this month that online retailers are seeing as much as a 25% gain in online sales by showing prices in local currencies, utilizing a visitor's browser or IP address to determine location.
Also, companies with a global presence can track expenses such as media costs and the corresponding sales revenue associated with those media costs using current exchange rates, especially when the expenses are being paid out of foreign accounts. Using accurate daily rates prevents these costs and sales numbers from being misleading due to global currency swings, or specific currencies trending up or down over certain periods of time. This also enables decision makers to better gauge success or failure with international advertising campaigns.
This is not only true of advertising costs; any international sales or accounting reports gain a degree of consistency when they are unified using a single, accurate currency rate. This is important because fluctuations between the US Dollar and the Euro, for example, can be 5% or more in a given month (especially lately), and 20% in a year. The same is true of many other currency pairs.
One possible solution is to manually obtain current foreign exchange rates from the Web and plug them into your Website content management system on a periodic basis. However, this can be a hassle, requires manual work, and if not done often enough can lead to serious accuracy problems.
Another is to screen-scrape rates from various Websites via a script of some kind. This may cause legal complications, as well as run the risk of scripts breaking and having to be re-implemented when the source Websites change.
Other vendors offer currency rate tables via CSV files for purchase, but these also require the rates to be stored, managed, and maintained, and can add significant process complexity to application or Website development cycles.
The best way to integrate foreign exchange rates (as with most data that changes frequently) is to utilize a SOAP or REST-based Web service where the current rate is retrieved wherever and whenever it is required for a calculation. This ensures the greatest possible accuracy, and eliminates manual processing and the costs associated with maintaining, storing, and updating currency rate tables. It also requires no hardware or software to be purchased, and essentially enables the plucking of currency rates from the "Cloud" when required. It is benefits like these that are causing the current surge in cloud computing.
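A minimal sketch of this pattern is below. The REST endpoint URL and JSON response shape are hypothetical placeholders, not a specific vendor's API; the conversion itself is just arithmetic applied at the moment the rate is needed:

```python
import json
import urllib.request

# Hypothetical REST endpoint -- substitute your rate provider's actual URL.
RATE_URL = "https://api.example.com/rates?from={src}&to={dst}"

def fetch_rate(src: str, dst: str) -> float:
    """Pull the current exchange rate from the Cloud (network call)."""
    with urllib.request.urlopen(RATE_URL.format(src=src, dst=dst)) as resp:
        return float(json.load(resp)["rate"])

def convert(amount: float, rate: float) -> float:
    """Apply an exchange rate, rounding to cents for display."""
    return round(amount * rate, 2)

# With a fetched USD->EUR rate of, say, 0.7321 (illustrative):
print(convert(199.99, 0.7321))  # 146.41
```

Because the rate is fetched at calculation time rather than cached in a table, the displayed price tracks the market without any maintenance on the application side.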
And since this approach uses an API, the currency rates can be integrated into anything that can consume a Web service, including popular SAAS applications such as Salesforce.com, ecommerce applications such as Magento, and smartphone platforms such as the iPhone and Android. And, depending of course on the platform, the integration can be achieved with just a few simple lines of code.
StrikeIron's Foreign Currency Rates Web Service API carries current exchange rates for over 160 currencies, updated every thirty minutes throughout each business day. These rates are aggregated from a variety of global banks and currency markets. Historical rates back to 2004, based on the London close, are also provided.
There is a lot of commerce occurring around the globe. When it is this simple, straightforward, and cost-effective to integrate foreign currency rates into any application, Website, or business process, using them ought to fall into the "no-brainer" category.
Acquisitions of companies such as Sun, BEA, Peoplesoft, Cognos, Siebel, Business Objects, and countless others the past few years have created a competition vacuum in the enterprise software space. For example, in the last five years or so, Oracle has spent over thirty billion USD purchasing nearly sixty companies. Microsoft has gobbled up eighty or so, IBM sixty, EMC forty and Hewlett-Packard approximately thirty-five. And these are just the giants.
The next tier of enterprise software companies also have pretty long lists of recent acquisitions. So one can imagine quite easily that this collective buying spree has created a deep void in the landscape of enterprise software, and as a result creates a tremendous opportunity.
After all, not much has happened in terms of new products and innovation in the space in the past several years, save for a handful of companies such as Salesforce.com, NetSuite, and some of the various SAAS and open source models that have emerged. But even much of this is nearing the ten-year mark.
Interestingly, some of the Fortune 500 have annual I.T. budgets north of a billion dollars per year, and even those that don't still have budgets that are quite large. This, combined with the fact that many of their primary systems were built and deployed in the 1990s (yes, that's ten to twenty years ago) and are getting a bit "long in the tooth," as they say of aging horses, creates an interesting set of dynamics.
In addition, Cloud infrastructure is maturing and getting more firmly in place with more efficient computing resource and data storage models. It is quickly becoming the seedbed for future enterprise software innovation, not only in new software categories, but also in the traditional categories of business intelligence, analytics, data management, and employee and customer-facing applications.
All of these trends point to a "perfect storm" of opportunity. Their alignment ought to be attractive to a new wave of entrepreneurs that can take advantage of the emerging Cloud Computing trend in new and exciting ways. This will enable a great deal of new innovation in the enterprise/corporate information technology space.
So while much of the technology press is caught up in all of the Android-iPhone rage, Facebook privacy issues, and the Groupons and Four Squares of the world, quietly many technology veterans are taking notice of this enterprise software void and recognizing the opportunity for what it is.
As one example, Marc Andreessen of Netscape fame has recently indicated that his venture capital firm is investing in a "new wave" of enterprise software companies. Others, both entrepreneurs and investors, are sure to follow.
In other words, I don't subscribe to the opinion held by some that enterprise software is dead. So over the next couple of years, I do expect a wave of new enterprise software companies to emerge, setting off another arms race in the corporate I.T. space as organizations battle it out to stay a step ahead of their competition.
Fortunately, companies like StrikeIron with our data-as-a-service external data and data verification components can benefit extensively from this trend, providing important pieces to these emerging applications with ease.
It should be exciting times ahead.