Capturing customer information at the point-of-sale (POS) is a critical ingredient in creating repeat business. At the same time, speed at checkout matters to both the customer experience and store operations.
A solution many retailers are implementing today is finding addresses by phone number at the POS terminal, so the customer service representative only needs to ask the customer for a phone number. Solutions like StrikeIron's Phone and Address Lookup Service then find and quickly retrieve the corresponding address via an instantaneous Cloud lookup. The service can even append email addresses and other demographic and socioeconomic data for later segmentation.
For example, a 1,200-store automotive products and service company uses StrikeIron's Reverse Phone and Address Lookup Advanced Service to append contact information at the point-of-sale. The customer service representative enters the customer's phone number in the POS terminal, and the solution finds the matching address. This expedites data entry and ensures the data entered is accurate.
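To make the flow concrete, here is a minimal sketch of what happens at the terminal. The lookup table and record fields below are hypothetical stand-ins for the Cloud service, which a real integration would call over its Web Service API:

```python
def normalize_phone(raw: str) -> str:
    """Strip formatting so '(919) 555-0142' and '9195550142' match."""
    return "".join(ch for ch in raw if ch.isdigit())

# Hypothetical records standing in for the cloud reverse-lookup service.
LOOKUP = {
    "9195550142": {"name": "Pat Jones", "address": "12 Elm St, Cary, NC"},
}

def find_address_by_phone(raw_phone: str):
    """Return the contact record for a phone number, or None if unknown."""
    return LOOKUP.get(normalize_phone(raw_phone))

# The rep types whatever the customer says; formatting doesn't matter.
record = find_address_by_phone("(919) 555-0142")
print(record["address"])  # 12 Elm St, Cary, NC
```

The normalization step is the practical detail: the rep can key the number in any format, and the lookup still resolves to the same record.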
In this use case, the company's customers provide mobile numbers so they can be contacted when the work on their car is complete. The solution therefore needed to cover both landlines and cell phones.
We would like to hear from you. Are you collecting customer information during checkout? If not, why not? If so, what has been your experience?
Have we gotten to the point where almost everything delivered over the Internet is considered "Cloud"? In fact, now it doesn't even have to be delivered over the Internet, as "Private Cloud" has quickly become a catch-all phrase for marketing older software and hardware products while simultaneously jumping on the Cloud bandwagon. After all, almost every IT management survey these days identifies the Cloud as the key computing platform for quite some time to come. Many vendors, sometimes deceptively, are rewriting collateral to match the trend.
While some will say that there is no true definition of Cloud and they can therefore "Cloud-promote" as they please, there are things an astute buyer should look for when determining whether a product or service is actually "Cloud-like", especially since Cloudwashing may well reach its crescendo this year.
For example, there should be a multi-tenant architecture allowing the underlying software resources to be shared transparently. There should also be a usage-based metering and billing business model, so that the cost of resource utilization matches actual use across a community of users.
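As a toy illustration of the metering side (the tenant names and per-call price here are made up), usage-based billing simply means each tenant of the shared service pays for exactly the calls it makes, nothing more:

```python
from collections import defaultdict

class UsageMeter:
    """Toy usage-based metering: each tenant of the shared service is
    billed only for the calls it actually makes."""

    def __init__(self, price_cents: int):
        self.price_cents = price_cents
        self.calls = defaultdict(int)   # tenant -> call count

    def record_call(self, tenant: str) -> None:
        self.calls[tenant] += 1

    def invoice_cents(self, tenant: str) -> int:
        return self.calls[tenant] * self.price_cents

meter = UsageMeter(price_cents=5)   # 5 cents per lookup (made-up price)
for _ in range(300):
    meter.record_call("acme")        # heavy user
meter.record_call("tiny-co")         # light user

print(meter.invoice_cents("acme"))    # 1500 -> $15.00
print(meter.invoice_cents("tiny-co")) # 5    -> $0.05
```

The heavy user and the light user share the same underlying infrastructure, but each bill reflects only that tenant's actual use.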
There should also be linear scalability, a virtualized infrastructure, and abstraction away from the underlying hardware and software complexity. In other words, the less complexity exposed to the user, the more "Cloud-like" a given software service is likely to be. As an example, our IronCloud platform, which delivers commercial data-as-a-service products, is a usage-based billed service that adheres to all of these principles.
Even though it is true that there are scenarios where a "Private Cloud" makes sense, it should still adhere to the above characteristics of a Cloud, even if it is limited to a single organization and resources are being shared across that organization in a true multi-tenant sense. But if you start to hear terms like "upgrade path", "required hardware", "required OS", "appliance", and so forth, the word "Cloud" has very likely been applied quite liberally to describe the product or service.
So the next time someone says "to the Cloud!", make sure that is where you are really headed.
Free Webinar from Informatica and StrikeIron on new Data Quality features in Informatica Cloud
Cary, NC and Redwood City, CA, February 16, 2012: Informatica and StrikeIron announced that they will host a new webinar titled “Validate Addresses and Emails in the Cloud”.
This webinar will focus on why data quality is important for your data migration and integration projects and what Contact Validation capabilities are available in the Informatica Cloud Winter 2012 release.
Speakers Darren Cunningham, VP of Marketing for Informatica Cloud, and Bob Brauer, Founder and Chief Strategy Officer of StrikeIron, will host this webinar on February 21, 2012 at 10:30 AM Pacific Time (1:30 PM Eastern Time).
Specifically, the attendees of this webinar will learn:
- Why data quality is critical
- How data migration and integration tasks provide the perfect opportunity to improve data quality using the Contact Validation capabilities in Informatica Cloud Winter 2012
- The Data Quality Solutions available in Informatica Cloud Winter 2012, including Email Verification, Address Validation, Geocoding, and Do Not Call List Verification
- How to use the Contact Validation capabilities in Informatica Cloud Winter 2012
To register for the free, one-hour webinar, please click on the following link: http://info.informaticacloud.com/forms/Webinar_CVS?elq=ab380da54aea4647a9fdc7ac816556d7?PRSI
Since the emergence of the pure "Cloud," the focus across the industry has primarily been on Infrastructure-as-a-Service. Servers are rented by the minute, procured and released when needed, and storage is made available on-demand. These usage-based business models have been very financially attractive for all kinds of scenarios, especially because most of the details enabling these resources are abstracted away from the recipient, or actual user, of these capabilities. Cost-effective + easier = significant adoption.
Amazon's EC2 and S3, Microsoft's Azure, and Google's App Engine and Cloud Storage offerings are all examples of this "by-the-minute" server and "by-the-byte" storage trend. Now, all kinds of advanced features such as control over load-balancing, physical server location, and other fine-tuning capabilities are being offered by each of the major vendors as they look for differentiation in the market.
However, one interesting thing is that most of the Cloud vendors began their Cloud offerings with bucket-oriented, key-value pair approaches to data storage, primarily to handle many types of unstructured data. Table-driven, SQL-based applications were all but considered dinosaurs by many, giving way to an era of "NoSQL" - all the rage just a short while ago.
Fast forward a couple of years, and all of the major vendors have begun offering SQL-based Cloud storage systems:
- Google Cloud SQL is now available in a "Limited Preview" release.
- Microsoft's Azure has Azure SQL available.
- Amazon now has Amazon RDS (Relational Database Service) available, providing use of Oracle or MySQL databases.
- Even Salesforce has its SOQL (Salesforce Object Query Language) capability, enabling SQL-like queries to be executed within Apex on its Force.com "Cloud" offering.
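To make the contrast between the two storage models concrete, here is a minimal sketch using a plain dictionary for the key-value side and Python's built-in sqlite3 for the SQL side (the customer data is invented for illustration):

```python
import sqlite3

# Key-value style: opaque blobs keyed by ID. Easy to partition and
# scale, but the store itself can't answer questions about the
# *contents* of the values.
kv_store = {"cust:1": '{"name": "Acme", "state": "NC"}'}

# SQL style: the same data in a relational table, queryable by any column.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, name TEXT, state TEXT)")
db.execute("INSERT INTO customers VALUES (1, 'Acme', 'NC')")
db.execute("INSERT INTO customers VALUES (2, 'Globex', 'CA')")

rows = db.execute(
    "SELECT name FROM customers WHERE state = 'NC'"
).fetchall()
print(rows)  # [('Acme',)]
```

The "find every customer in North Carolina" question is a one-line query against the table, while the key-value store would force the application to fetch and parse every blob itself - which is exactly why enterprises migrating SQL-built applications are demanding relational storage in the Cloud.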
So what does this tell us?
It means that enterprises are moving existing applications to the Cloud at a significant rate. These applications were built initially with SQL requirements, and as a result the demand is now there to mirror the database access requirements out in the Cloud.
As these SaaS applications running on Cloud infrastructure proliferate, they will become more and more "data hungry". The major Cloud vendors are simply responding to this need by putting down a foundation for a data-centric Cloud.
This is why we think the "Data Era" of the Cloud will soon be upon us. A lot of the traditional data-oriented applications, such as real-time data quality, will be just as relevant in the Cloud, if not even more so, as they were with on-premise enterprise applications.
In addition, smartphones, tablets, and other mobile devices are ideal "Cloud clients", increasing demand for sending data to and retrieving data from the Cloud. This furthers the "data-driven premise" and its associated infrastructure requirements.
Even vertical applications (e.g. healthcare) are now moving to the Cloud, creating demand for specialized data moving through the Cloud's data highways.
Each of these factors will contribute heavily to the forthcoming "Data Era of the Cloud". The question is, will you be ready? Or better yet, will your data be ready?
StrikeIron is excited to announce its inclusion in SAP's PartnerEdge Program and the integration of our Cloud-based Web service product offerings into SAP Business ByDesign, SAP's fully integrated business management solution. This integration and partnership enables us to broaden access to our Contact Record Verification Suite within the SAP Business ByDesign ecosystem. SAP Business ByDesign is delivered on-demand and dedicated to companies in the small and midsize enterprise (SME) market, as well as to subsidiaries of large enterprises running SAP solutions.
Here is a YouTube video demonstration showing StrikeIron's Cloud-based Reverse Phone and Address Lookup Web Service integrated into SAP Business ByDesign, with the entire workflow demonstrated: http://www.youtube.com/watch?v=PB5OJArHWBQ
To read the full release: http://bit.ly/zD6jHV
Outbound call centers have several knobs that managers can turn to improve the efficiency of their operation.
Data quality, specifically validating phone numbers prior to dialing, is one of the most critical (and easiest) knobs you can turn to improve the number of connections a team can make.
Validating phone numbers is the process of determining whether a phone number is valid and should be dialed. When call center agents dial only validated numbers, they stop wasting time on dead lines, and that saved time translates directly into tangible value.
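As a rough illustration of one layer of such a check, here is a simplified North American (NANP) format validation. This is only a sketch under our own assumptions; a real validation service goes much further, verifying that the number is actually assigned, in service, and whether it is a landline or a mobile line:

```python
import re

# Simplified NANP structure check (an assumption for illustration):
# optional +1 country code, then a 10-digit number whose area code and
# exchange each begin with a digit 2-9.
NANP = re.compile(r"^\+?1?[2-9]\d{2}[2-9]\d{2}\d{4}$")

def looks_valid(raw: str) -> bool:
    """Cheap local format pre-check before any paid/remote lookup."""
    digits = re.sub(r"[\s().\-]", "", raw)   # drop common formatting
    return bool(NANP.match(digits))

print(looks_valid("(919) 555-0142"))   # True
print(looks_valid("+1 919-555-0142"))  # True
print(looks_valid("123-456-7890"))     # False (invalid NANP structure)
```

A local pre-check like this catches obviously malformed entries for free; the numbers that pass still need a real validation service to confirm they are live and dialable.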
For example, you may have a list of 1,000,000 prospects that you will call through your call center. Assume that it takes 30 seconds to dial a number, the average talk time is 3 minutes per connect, 15% of your numbers are invalid, your connect rate is 20% for the good data, and you pay a fully loaded cost of $10 per hour to your call center reps.
It will take your team 16,833 hours to call the list with invalid phone numbers but only 15,583 hours to call the list after validating the phone numbers.
This is equivalent to a savings of over $12,500 in labor costs alone, much more than it will cost to run the numbers through a real-time phone validation solution.
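Those figures are easy to verify. Here is the back-of-the-envelope calculation using exactly the assumptions above:

```python
# Reproducing the savings estimate from the example above.
prospects   = 1_000_000
dial_secs   = 30      # time to dial one number
talk_mins   = 3       # average talk time per connect
invalid_pct = 0.15    # share of numbers that are invalid
connect_pct = 0.20    # connect rate on valid numbers
wage        = 10.0    # fully loaded cost per rep-hour

good = prospects * (1 - invalid_pct)              # ~850,000 valid numbers
talk_hours = good * connect_pct * talk_mins / 60  # ~8,500 hours of talk

# Without validation we dial every number; with it, only the good ones.
hours_unvalidated = prospects * dial_secs / 3600 + talk_hours
hours_validated   = good * dial_secs / 3600 + talk_hours

print(round(hours_unvalidated))  # 16833
print(round(hours_validated))    # 15583
print(round((hours_unvalidated - hours_validated) * wage))  # 12500
```

The talk time is identical in both scenarios; all of the savings come from never dialing the 150,000 invalid numbers in the first place.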
Using a service tailored to validating phone numbers prior to dialing eliminates the money and time wasted calling invalid numbers. Additionally, many call center platforms enable you to integrate Web Service APIs with a few clicks (or a few lines of code).
StrikeIron’s Phone Validation service provides a real-time Web Service API for validating phone numbers, and it can be easily integrated into any CRM or call center system. You can sign up for a free trial, email us, or call us at 1.866.562.3920.
Have you recently integrated a phone validation solution in your call center? What kind of returns did you see? Tell us below.
Regardless of your industry, bulk SMS text messaging is on the rise due to the worldwide proliferation of smartphones. In 2011, 6.9 trillion text messages were sent, and that number is predicted to reach 8 trillion in 2012.
All signs are clear that the future of marketing is going mobile. In fact, the mobile marketing spend is predicted to be $6.6 billion this year.
These indicators show the importance of businesses utilizing mobile technologies, particularly bulk SMS text messaging.
2012 brings with it a number of opportunities to create SMS campaigns that engage customers and drive sales. Bulk SMS text messaging is by far the most popular mobile means of communication in business. Text messages are a simple and effective way to reach a wide audience with a targeted message. Bulk SMS text messaging can be used in many ways, including text short-codes, text-to-win promotions, and mobile coupons.
Learn more about how bulk SMS text messaging can transform your marketing and sales efforts: http://offers.strikeiron.com/sms-whitepaper/
Master Data Management, also known as MDM, often comes up in conversation as a key information technology initiative for an enterprise, including considerations for leveraging the Cloud as an ideal MDM environment. We'll save the Cloud considerations for a later post; for now, a basic MDM primer might be a good idea for those who scratch their heads at the mere mention of the term. It's actually a much simpler concept than descriptions of it often entail.
MDM is simply about keeping non-transactional data, such as customer data, in a single place (logically or physically) to be shared across many different systems. When each system that uses this customer data, whether it is a CRM system, an accounting system, a support system, or a business intelligence system, updates or adds to data about a customer, all connected systems gain the benefit of that change.
This "master data" approach also eliminates inconsistencies of things like contact information and customer notes across different systems where customer data may be captured, and does a far better job of keeping customer data current. To use a few timeworn adages, with MDM we are keeping all of our customer data eggs in one basket, with the idea that the whole is greater than the sum of the parts.
The other key benefit to maintaining and managing customer data in one place for use across multiple line-of-business systems is that customer data-specific activities can be performed with that data, benefitting each connected system that makes use of the data. This includes customer data quality initiatives, not only validating and correcting customer data to keep it as current as possible, but also using external third party data sources to create a more vivid, detailed view of an organization's customers. Other activities include customer data consolidation (matching and eliminating redundancy), data governance, easing the distribution of customer data (the Cloud), better customer communication, and a foundation for richer analytics that utilize customer data.
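The core idea is small enough to sketch in a few lines. The class and field names below are hypothetical, but they show the key property: one system updates the master record, and every connected system immediately sees the change:

```python
class MasterDataStore:
    """One logical home for non-transactional (master) data."""

    def __init__(self):
        self._customers = {}   # customer id -> record

    def upsert(self, cust_id, **fields):
        self._customers.setdefault(cust_id, {}).update(fields)

    def get(self, cust_id):
        return self._customers[cust_id]

class System:
    """Any connected system (CRM, accounting, support...) reads the
    shared master record instead of keeping its own copy."""

    def __init__(self, name, store):
        self.name, self.store = name, store

    def view(self, cust_id):
        return self.store.get(cust_id)

mdm = MasterDataStore()
crm = System("CRM", mdm)
accounting = System("Accounting", mdm)

mdm.upsert("c42", name="Acme Corp", phone="919-555-0142")
# The CRM corrects the phone number; Accounting sees the fix at once,
# because both systems point at the same master record.
mdm.upsert("c42", phone="919-555-0199")
print(accounting.view("c42")["phone"])  # 919-555-0199
```

Contrast this with each system holding its own copy of the customer record: the phone correction would reach some systems and not others, which is exactly the inconsistency MDM is designed to eliminate.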
At the end of the day, customer data is one of a company's most important assets. The MDM approach helps maximize the value of that data for use.