The 2012 UP-START Cloud Awards (@UP_Con) has named StrikeIron as a finalist for the Best Cloud Broker Award, recognizing the company for its advancement and contribution to the cloud computing business and technology communities. StrikeIron is one of five finalists.
Other enterprise companies competing for awards include Salesforce.com, Dell Cloud Applications, Amazon Web Services, Rackspace and Apple, to name a few.
A group of cloud experts will determine the winners in each of 22 categories by tallying votes using the Borda count election method. Final voting will take place during the Third Annual Global UP 2012 Cloud Computing Conference, which is December 12, 2012, at the South San Francisco Conference Center, with more than 150,000 attendees expected.
“We are thrilled to add another award to our portfolio,” says Justin Helming, StrikeIron’s VP of Marketing. “StrikeIron is the best cloud broker service, because our IronCloud™ platform has been proven over nearly 10 years for thousands of customers, who have successfully processed billions of transactions.”
StrikeIron was founded in 2003 by Bob Brauer (founder of DataFlux), Richard Holcomb (co-founder of HAHT Software and Q+E Software), and Robert Dale. As a result, StrikeIron has developed a solid infrastructure and has won many accolades, including recognition from Inc. Magazine, Business Leader Magazine, Gartner and Forrester Research.
As organizations move applications to the Cloud where it makes sense to do so, they should recognize that this is an ideal opportunity to improve the value of the underlying data assets that feed these applications. After all, any system, Cloud or otherwise, is only as good as the data within it.
A "move to the Cloud" provides a unique opportunity both to ensure existing data is of the highest possible quality and to install mechanisms that govern all future data entering the system so it is accurate, current, and complete. This is especially true if data is also being moved from an existing internal database to a Database-as-a-Service (DBaaS) product like SQL Azure or Amazon RDS, or to a database such as MySQL or SQL Server running on top of a Cloud service from Amazon, Microsoft Azure, Rackspace, or any other Cloud platform.
As data is moving from its source database, where it currently exists, into its target Cloud database, you can take advantage of this ideal time to:
- Ensure all physical addresses are valid, accurate, current and complete
- Ensure all email addresses are live, working email addresses that have not been disabled or changed (otherwise, you could find yourself on spam lists simply by trying to contact your customers)
- Ensure all telephone numbers are valid, accurate, and current
- Ensure all data fields are consistent in content and individual data elements are unambiguous, making data analysis and the emerging field of data science much more effective
- Fill in all missing data where possible
- Eliminate duplicate contact and customer records
- Incorporate any other data-specific business rules and requirements that make sense for your organization
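The checklist above can be sketched as a simple cleansing pass applied to records on their way to the target database. This is a minimal illustration with made-up normalization rules; a real pipeline would call validation services for addresses, emails, and phone numbers rather than rely on local heuristics:

```python
import re

def clean_record(record):
    """Normalize a single contact record during migration (illustrative rules only)."""
    cleaned = {k: (v.strip() if isinstance(v, str) else v) for k, v in record.items()}
    # Normalize phone numbers to digits only so duplicates with different formatting match
    if cleaned.get("phone"):
        cleaned["phone"] = re.sub(r"\D", "", cleaned["phone"])
    # Lowercase email addresses for consistent comparison
    if cleaned.get("email"):
        cleaned["email"] = cleaned["email"].lower()
    return cleaned

def migrate(records):
    """Cleanse and de-duplicate records as they move from source to target."""
    seen, result = set(), []
    for rec in records:
        rec = clean_record(rec)
        key = (rec.get("email"), rec.get("phone"))
        if key in seen:            # eliminate duplicate contact records
            continue
        seen.add(key)
        result.append(rec)
    return result

rows = [
    {"email": "Pat@Example.com ", "phone": "(919) 555-0100"},
    {"email": "pat@example.com", "phone": "919-555-0100"},  # same contact, different formatting
]
print(len(migrate(rows)))  # the two variants collapse to one record: 1
```

The point is that the migration step is the one moment every record passes through a single choke point, which is exactly where rules like these belong.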
Also, the wise organization puts real-time data quality and data enhancement mechanisms in place at the points of data collection, such as a data entry form or a Web-to-lead process, to ensure that all new data coming into a system is of the highest possible quality. This also prevents degradation of data over time, so the same set of issues does not recur a short time later, leading to more cleansing effort and cost downstream.
A significant part of the success of any Cloud initiative revolves around cleansing existing data during migration, getting real-time data quality mechanisms in place, and establishing an ongoing data management plan with metrics and goals for going forward. Don't let rare application migration opportunities such as this go to waste.
Organizations collect leads in many different ways. Leads are collected when prospects respond to offers, fill out forms in websites, or send email inquiries. Leads can occur simply by word of mouth or other lead capturing tools as well.
Once a database of leads has been created, what can be done to improve the value of this asset that helps drive all future sales? How can the data be leveraged to optimize the time of the sales organization before they receive these leads?
There are three main ways to improve the value of lead data:
The first is to eliminate bogus or phantom leads. One way to do this is by validating the existence of email addresses for leads. This not only removes obvious fake email addresses, but also filters email addresses from domains that are not operational and have been turned off or cannot accept email. Other data such as phone numbers or mailing addresses can also be used for lead validation. Lead filtering such as this can prevent the sales organization from wasting valuable time on these phantom leads. It can also reduce spend for marketing campaigns and keep an organization off of spam lists.
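A first-pass phantom-lead filter might look like the sketch below. The regular expression and deny-list entries are illustrative placeholders, not a real reference set; a production check would also verify the domain's mail servers and mailbox status via a validation service:

```python
import re

# Loose syntactic check plus a small deny-list of domains treated here as fake
# or non-operational. These entries are examples only.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
DENY_LIST = {"example.com", "test.com", "mailinator.com"}

def looks_like_phantom(email):
    if not EMAIL_RE.match(email or ""):
        return True                      # malformed address, e.g. "asdf"
    domain = email.rsplit("@", 1)[1].lower()
    return domain in DENY_LIST           # domain cannot or will not receive mail

leads = ["jane@acme.io", "asdf", "bot@mailinator.com"]
real = [e for e in leads if not looks_like_phantom(e)]
print(real)  # ['jane@acme.io']
```

Even a crude filter like this keeps obviously unreachable leads out of the sales queue; the real gains come from live verification of the remaining addresses.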
Second, improving the quality of lead data is also important. This means ensuring that phone numbers are correct, that addresses are accurate and free of typos, that any missing information like zip codes or area codes is added, and that anything else that would require members of the sales team to waste time chasing down a lead is corrected. Incomplete or inaccurate information, and the time lost to it, hinders a sales force from understanding and meeting the needs of actual prospects.
Third and finally, appending additional information to each lead can provide data points that can be used for segmentation, personalization, localization, and other forms of targeted marketing. For example, appending latitude and longitude data can help with market visualization and opportunity identification. Combining that data with census and other demographic information helps score leads so sales can prioritize opportunities.
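As a rough illustration of scoring with appended data, the sketch below weights a lead by its distance to a sales office (computed from appended latitude/longitude) and by appended company size. All weights, thresholds, and coordinates are invented for the example:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def score_lead(lead, office):
    """Toy scoring rule: closer leads and larger companies rank higher."""
    distance = haversine_miles(lead["lat"], lead["lon"], office["lat"], office["lon"])
    proximity = max(0.0, 1.0 - distance / 500.0)   # no proximity credit past 500 miles
    size = min(lead.get("employees", 0) / 1000.0, 1.0)
    return round(0.6 * proximity + 0.4 * size, 3)

office = {"lat": 35.78, "lon": -78.64}                   # Raleigh, NC (illustrative)
lead = {"lat": 35.23, "lon": -80.84, "employees": 400}   # Charlotte, NC
print(score_lead(lead, office))  # a value between 0 and 1
```

The specific formula matters far less than the principle: appended fields turn an undifferentiated list into something a sales team can rank.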
Sales leads are the lifeblood of revenue generation. Smart organizations recognize this and treat them accordingly.
2011 has been the year of the Cloud database. The idea of shared database resources and the abstraction of underlying hardware seems to be catching on. Just as with Web and application servers, paying as you go and eliminating unused database resources, licenses, hardware, and all of the associated cost is proving attractive enough that the major vendors are betting on it in significant ways.
The recent excitement has not been limited to the fanfare around "big data" technologies. Lately, most of the major announcements have centered on traditional relational, table-driven SQL environments, which Web applications use far more widely than the key-value data stores that "NoSQL" technology employs for Web-scale, data-intensive applications such as Facebook and Netflix.
Here are some of the new Cloud database offerings for 2011:
Salesforce.com has launched Database.com, enabling developers in other Cloud server environments, such as Amazon's EC2 and the Google App Engine, to utilize its database resources, not just users of Salesforce's CRM and Force.com platforms. You can also build applications in PHP or on the Android platform and utilize Database.com resources. The idea is to reach a broader set of developers and application types than just CRM-centric applications.
At Oracle Open World a couple of weeks ago, Oracle announced the Oracle Database Cloud Service, a hosted database offering running Oracle's 11gR2 database platform available in a monthly subscription model, accessible either via JDBC or its own REST API.
Earlier this month, Google announced Google Cloud SQL, a database service that will be available as part of its App Engine offering based on MySQL, complete with a Web-based administration panel.
Amazon, to complement its other Cloud services and highly used EC2 infrastructure, has made the Amazon Relational Database Service (RDS) available to enable SQL capabilities from Cloud applications, giving you a choice of underlying database technology to use such as MySQL or Oracle. It is currently in beta.
Microsoft also offers its SQL Azure Cloud database, generally positioned as suited for applications built on the Microsoft stack by developers who want to leverage some of the benefits of the Cloud.
Some of the above offerings have only been announced so far and not actually launched, or have only limited preview access available now. In some cases the business models have not been completely divulged, and even where they have, they are very likely to change.
Clearly, a considerable market-share land grab is under way. All of the major vendors recognize that traditional SQL Cloud storage infrastructure will be an important technology going forward. Adding a solid database layer to the Cloud architecture story seems like an important step in the continuing enterprise and commercial software move to the Cloud, and these new vendor offerings should in turn accelerate this move.
So, is this really the wave of the future? Some of the major questions that will have to be answered include those around latency. When data requests have to hop from a client application to the application server, then to the database, and back to the server and client, possibly multiple times within a single request, the result can be quite a performance hit. These machines likely sit far from each other geographically, which can really slow things down, annoying end-users with slow page loads. This is probably why most infrastructure providers realize they must make corresponding database capabilities available and natively accessible to reduce this latency. However, performance, along with security issues (perceived or otherwise), could still be a significant barrier to mainstream adoption.
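A quick back-of-envelope calculation shows why those extra hops matter. The latency figures below are illustrative assumptions, not measurements:

```python
# Back-of-envelope latency math (illustrative numbers only).
# Assume a cross-data-center round trip costs ~40 ms, a co-located hop ~1 ms.
queries_per_page = 10  # a single page load often issues many database queries

remote_db_ms = queries_per_page * 40   # app server and database in different regions
local_db_ms = queries_per_page * 1     # database co-located with the app server

print(remote_db_ms, local_db_ms)  # 400 vs 10 ms of database time per page load
```

Under these assumptions, geography alone adds roughly 0.4 seconds to every page, which is why providers push to host the database next to the application tier.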
Also, most relational database environments in the Cloud expose only a subset of SQL capabilities, and in some cases they can be quite limited. For example, many of these Cloud SQL platforms don't support cross-table joins, at least not yet, even though joins are a very common requirement for SQL applications. The lack of support is primarily because joins can consume a lot of resources, another performance killer in shared environments.
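For readers unfamiliar with the feature in question, here is a small cross-table join of the kind some of those platforms lack, run against an in-memory SQLite database purely for illustration:

```python
import sqlite3

# Two related tables joined on a foreign key: the everyday SQL pattern that
# some early Cloud SQL offerings could not execute.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1, 99.0), (11, 1, 25.0), (12, 2, 40.0);
""")
rows = conn.execute("""
    SELECT c.name, SUM(o.total)
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # [('Acme', 124.0), ('Globex', 40.0)]
```

Without join support, an application has to either denormalize its schema or stitch result sets together in application code, both of which add complexity.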
Once most of this storage and Cloud database infrastructure is in place, however, incorporating more content-oriented data services such as customer data verification will become commonplace and easy to leverage. We may even see such services incorporated into the database offerings themselves as vendors look to differentiate. Cloud-based database offerings have the advantage of making much larger libraries of data-oriented add-on capabilities available right out of the box, so the story here is much more than just cost.
While SQL Cloud offering announcements are all the rage in 2011, 2012 will undoubtedly tell the adoption tale. No doubt these offerings will be ideal and cost-effective for many use cases out there. But will demand be large enough quickly enough to support all of these vendors and drive the innovation at a speed that will make these platforms viable in the near future for enterprise and commercial applications? The answer is likely yes, but the next twelve months or so will give us a lot of the supporting data to measure the extent of the trend.
At the very heart of any CRM system is the data within it. To communicate effectively with customers and prospects requires a high level of current, accurate data about each and every contact record in the system. The same is true in order to have the best possible business intelligence as the foundation of decision making.
There are traditional software methods for achieving high-quality data, but they are typically expensive and time-consuming to implement, and even more costly to maintain. A "Data-as-a-Service" (DaaS) approach provides an effective model for incorporating real-time data validation checks into a CRM system, including verifying the accuracy of email addresses, phone numbers, and mailing addresses. A Cloud solution like DaaS allows easy implementation because the integration is completed in advance by the vendor, hence the "turnkey" notion.
This Web-based approach also eliminates the need to update the reference data sources that serve as the basis of these validation checks. The frequent updates occur at a master data center that is pre-wired into the solution via the Cloud. The reference data is accessed over the Internet as needed, rather than stored as a separate, full copy maintained at every site where the software is in use. Since reference data updates are automatic in the world of DaaS, there is no risk that the reference data ages and gradually loses its usefulness, as often happens with on-premise software solutions.
Finally, Web/Cloud-based solutions have the advantage of usage-based business models rather than large, upfront software investments that carry a high degree of risk and typically include implementation costs. The DaaS business model enables a grow-as-you-succeed approach, which is generally best for an organization.
In the case of Oracle CRM On Demand, StrikeIron has collaborated with our partner ActivePrime to help deliver this type of turnkey, pre-integrated solution, improving the quality and usability of data within Oracle CRM On Demand. Within this solution, available now, a single click validates the critical points of contact data via the Web, with all reference data updated automatically. This enables a much better foundation of high quality data on which to operate a sales force or marketing organization, ultimately resulting in a much better CRM ROI. You can see the screen shots below for email address validation, phone number validation and mailing address validation.
It has been five years since Oracle CRM On Demand was released and now finally there is an integrated real-time data quality solution that is easy to turn on and put to use.
Acquiring "qualified" leads from a third party source can be of considerable value and a great time saver. High quality, well-targeted leads can drive and grow a business substantially in just about any vertical, especially in the era of global commerce and short sales cycles.
However, there are countless companies offering these targeted leads, and unfortunately these leads and the supporting data for each lead are of greatly varying quality. Poor quality leads can include invalid or missing contact information, contact data (including email addresses) that is no longer current, or other types of data inaccuracies.
Purchased leads with less-than-perfect contact information can render many of these individuals or organizations unreachable, and therefore a waste of important resources, including the cost of acquiring the leads in the first place. This can substantially hurt the ROI of what might otherwise be a good campaign. In addition, knowing the quality level of leads can help score a particular campaign so it can be tweaked, improved, or even discontinued. If there is no handle on the level of lead quality for a given campaign, analysis can become a great source of frustration, especially when it comes time to report on campaign success metrics.
One thing that can be done to minimize the impact of poor lead quality is to validate the quality of leads received from any vendor, especially early on in the vendor relationship, but also on an ongoing basis. This type of quality testing is important as you employ multiple lead vendors, compare results across similar campaigns, and build lasting relationships with the vendors that provide the best lead data to drive your campaigns.
Fortunately, validating address data, ensuring correct telephone data, and verifying that email addresses are still enabled and actively receiving email can easily be achieved with solutions that leverage the benefits of the "Cloud". Solutions such as StrikeIron's Contact Record Verification Suite and its lead validation APIs employ multiple, constantly updated reference data sources and algorithms to ensure the highest possible lead quality, and can quickly and easily be integrated into any business process, Web site, or application that utilizes lead data. No software or management of reference data is required, as all lead verification occurs in real time via the Web. This easy, cost-effective process can go a long way in optimizing campaigns, comparing vendors, and achieving a successful, efficient marketing department that drives great value for the rest of the organization.
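Ongoing vendor comparison can be as simple as tracking each vendor's validation pass rate over time. The sketch below uses a trivial stub in place of a real verification API call:

```python
def vendor_quality(leads, is_valid):
    """Per-vendor share of leads that pass validation, for comparing lead sources."""
    totals, valid = {}, {}
    for lead in leads:
        v = lead["vendor"]
        totals[v] = totals.get(v, 0) + 1
        valid[v] = valid.get(v, 0) + (1 if is_valid(lead) else 0)
    return {v: round(valid[v] / totals[v], 2) for v in totals}

# Stub validator standing in for a real verification service call.
check = lambda lead: "@" in lead.get("email", "")

leads = [
    {"vendor": "A", "email": "a@x.com"},
    {"vendor": "A", "email": "bad"},
    {"vendor": "B", "email": "b@y.com"},
]
print(vendor_quality(leads, check))  # {'A': 0.5, 'B': 1.0}
```

Even this simple ratio, computed consistently across vendors and campaigns, gives the comparison baseline the paragraph above describes.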
Leverages Informatica Cloud to enable high quality, verified contact record data via StrikeIron’s Contact Record Verification Suite
Amazon's new SES (Simple Email Service) product is a scalable, transaction-based offering for programmatically sending large amounts of email. This is accomplished using Amazon's Web-scale architecture, and it is especially suited to applications that already use EC2 (server rental) and S3 (storage rental). By utilizing SES you are essentially leveraging the "Cloud" to send emails from applications and Web sites rather than investing in your own software and hardware infrastructure to do so. As with most Cloud services, this substantially reduces cost and complexity; in this case only a simple API call is required. There is no network configuration or email server setup involved.
However, there are some significant restrictions to consider that Amazon has imposed on the user. The SES service will only let you start out with a limited quota until you build a "good reputation" within their system. The initial limit is 200 emails per day. This will increase substantially once you build your reputation within Amazon’s service.
One criterion used to "build your reputation" within Amazon is the number of bouncebacks: emails that could not be delivered because the email address is invalid or has been disabled. Having a clean, verified email list prior to and during your ongoing use of the service is extremely important to minimize the number of bouncebacks you receive. If your use of SES returns a large number of bouncebacks from non-working email addresses, your quota will not be raised and you may be disqualified from using the system. Multiple bouncebacks can really hurt your reputation and prevent you from fully maximizing Amazon's SES product.
Fortunately, you can use another Cloud-based service (available from StrikeIron) to verify the validity of an email address before using Amazon's SES service (or any other email service). It is another simple API check that indicates whether a given email address is valid (an actual non-intrusive, real-time check across the Web, without ever sending an actual email). This is exactly one of the primary uses of StrikeIron's Email Verification Service: building email service provider reputation by significantly reducing bouncebacks and staying off of spam lists, which can kill your ability to communicate electronically with customers and prospective customers.
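A pre-send verification step might be wired in as sketched below. The `verify_email` stub stands in for a real verification API, and its rule is purely illustrative; the point is the shape of the flow, not the check itself:

```python
def verify_email(address):
    """Stub standing in for a real-time verification API call (e.g. a DaaS service)."""
    # Illustrative rule only: real services check syntax, domain, and mailbox status.
    return "@" in address and not address.endswith("@invalid.example")

def send_campaign(addresses, send):
    """Send only to addresses that verify, keeping the bounceback rate low."""
    sent, skipped = [], []
    for addr in addresses:
        (sent if verify_email(addr) else skipped).append(addr)
    for addr in sent:
        send(addr)   # in practice, the SES send call would go here
    return sent, skipped

outbox = []
sent, skipped = send_campaign(
    ["ok@example.com", "gone@invalid.example"], outbox.append
)
print(sent, skipped)  # ['ok@example.com'] ['gone@invalid.example']
```

Filtering before the send call, rather than reacting to bounces afterward, is what protects the sending reputation the quota system rewards.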
As email technology becomes more sophisticated, so should those of us who make use of it, especially when doing so is this easy. The business upside can be dramatic and provide great results for companies.
There are three primary points of communication with customers and potential customers: the physical (mailing) address, the email address, and the telephone number. And often there is more than one of each.
All businesses aren't the same, but in general, how important is it to communicate regularly with customers and contacts? What value can you place on the accuracy of data about your customers? Does it mirror the value of the customers themselves?
Most would agree that these contact data points are important enough to warrant resources dedicated to keeping this contact information current, accurate, and complete. After all, these are the gateways to those who drive the bottom line. Can you afford for this information to be wrong or incomplete?
So what are some of the threats to "Big Three" accuracy?
One threat is that email addresses change regularly, often resulting in existing email addresses being disabled. This can happen when someone changes jobs or leaves a company. And in an era when, once the spam kings get hold of an email address, 95% or more of the mail it receives can be spam, email addresses are sometimes changed just to escape this electronic deluge of junk email.
Also, at least 40 million Americans change their mailing address at least once each year, and this usually results in one or more phone numbers being changed. And of course with the skyrocketing popularity of smartphones, keeping up with a contact's various telephone points of contact can be a bear.
These are just a few examples of how contact data can degrade over time.
Take these "facts of life", combine them with the large number of typos that occur as these data elements are collected, especially over the Web, and you have a recipe for a significant data accuracy problem.
Getting the "Big Three" right isn't always easy, but in most cases, investing effort and resources in this issue, along with applying solutions designed to solve these kinds of problems, can pay significant dividends, both short-term and long-term. Focusing on these three primary points of contact, and greatly improving the validity and accuracy of that information, can go a long way toward getting the results you are looking for when communicating with customers and potential customers.
And of course, perhaps our Contact Record Verification Suite can help. We'd be happy to talk with you about it and help address your particular situation. After all, that's what we do every day.