Making sure packages reach your customers is the critical final step in any eCommerce transaction. Address verification Web services are a cost-effective, easy-to-integrate way to avoid shipping to invalid addresses in the US and internationally.
Packages shipped to invalid mailing addresses result in both unhappy customers and address correction fees, often upwards of $11 per shipment, from shipping companies. As shown in the Undelivered As Addressed infographic, over $600 billion is wasted on shipping to invalid addresses every year.
Shipping, both charges and experience, is one of the most significant barriers to consumers choosing eCommerce vendors instead of brick-and-mortar stores. You can streamline fulfillment and virtually eliminate shipments to invalid addresses with address verification web services like StrikeIron’s North American Address Verification and Global Address Verification solutions.
Here are 5 tips for using address validation APIs to improve your customers' experience and save money on returned shipments:
1) Be mindful of where you integrate your address checker into your checkout experience. As with your entire eCommerce flow, the goal is to minimize shopping cart abandonment, and the point where the user enters their shipping and billing addresses is a key piece of this overall experience.
2) Pick a solution that corrects addresses instead of just verifying them. Correcting a mistyped address will streamline the flow and minimize abandonment.
3) Give the customer choices. A great tip from one of StrikeIron's longtime eCommerce customers is to present the user with a corrected address and allow them to override the selection if they still believe their original address is correct. These overrides should be flagged and manually followed up on by your customer service team. This gives the customer a feeling of control and can help resolve any discrepancies through good, old-fashioned customer service.
4) Verify both shipping and billing addresses. Credit card processors offer AVS, but AVS only verifies the link between a card and an address; it does not check whether the address itself is valid. If you ever need to contact the customer at the billing address, make sure it is valid.
5) If you ship internationally, make sure your forms are not North America-focused. For example, don't require a state from a drop-down box, as this will frustrate international buyers and signal that you do not have extensive experience shipping internationally.
In addition to the tips above, it is important to find the right address verification partner. Make sure you pick a partner with a dependable, high-performance cloud delivery platform to ensure address verification, just like the rest of your eCommerce site, is available 24/7/365.
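The override flow from tip 3 can be sketched in a few lines. This is a minimal illustration, not StrikeIron's actual API; `resolve_address` and its parameters are hypothetical names, and the corrected address is assumed to come back from whatever verification service you integrate.

```python
def resolve_address(entered, corrected, user_accepts_correction):
    """Decide which address to ship to after a verification check.

    `entered` is the address the customer typed; `corrected` is the
    standardized address returned by the verification service (or None
    if no correction was available). Returns the chosen address and
    whether customer service should follow up manually.
    """
    if corrected is None or corrected == entered:
        # Nothing to correct: ship to the entered address as-is.
        return entered, False
    if user_accepts_correction:
        # Customer accepted the suggested correction: no follow-up needed.
        return corrected, False
    # Customer overrode the correction: keep their address, but flag it
    # so customer service can verify it by hand before shipment.
    return entered, True
```

The key design point is the final branch: the customer's choice always wins, but an override produces a flag that feeds your manual follow-up queue.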
Late last week, Amazon released an update to its DynamoDB service, a fully managed NoSQL offering for efficiently handling extremely large amounts of data in Web-scale (generally meaning very high user volume) application environments. DynamoDB was originally launched in beta back in January, so this is its first update since then.
The update is a "batch write/update" capability, enabling multiple data items to be written or updated in a single API call. The idea is to reduce Internet latency by minimizing trips back and forth to Amazon's various physical data storage entities from the calling application. According to Amazon, this was in response to developer forum feedback requests.
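The latency saving comes from amortizing round trips. A minimal sketch of the batching idea follows; `batch_write` and `client_call` are hypothetical stand-ins for the real BatchWriteItem request, while the cap of 25 items per call matches DynamoDB's documented BatchWriteItem limit.

```python
def chunk_batches(items, batch_size=25):
    """Group items into batches of at most `batch_size` each.
    DynamoDB's BatchWriteItem accepts up to 25 put/delete requests
    per call."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

def batch_write(client_call, items):
    """Write all items using one API round trip per batch instead of
    one per item. `client_call` stands in for the actual service
    request; with full batches this cuts round trips by up to 25x."""
    for batch in chunk_batches(items):
        client_call(batch)
```

For 60 items, this issues 3 calls (25 + 25 + 10) instead of 60, which is exactly the kind of round-trip reduction the update targets.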
This update, addressing what was already a key initial selling point of DynamoDB, tells us that latency is still a significant challenge for cloud-based storage. After all, one of the key attributes of DynamoDB when it first launched was speed and performance consistency, something its NoSQL precursor, SimpleDB, was unable to deliver, at least according to some developers and users who claimed data retrieval response times ran unacceptably into the minutes. This may also have been a primary reason for SimpleDB's lower adoption rates. Amazon is well aware of these performance challenges, hence the significance of its first DynamoDB update.
Another key tenet of DynamoDB is that it is a managed offering, meaning the details of data management requirements, such as moving data from one distributed data store to another, are completely abstracted away from the developer. This is great news, as the complexity of cloud environments was proving too challenging for many developers trying to leverage cloud storage capabilities. The masses were scratching their heads over how to overcome storage performance bottlenecks, attain replication, achieve consistent response latency, and handle other operations-related data management challenges when it was in their purview to do so. Management complexity will likely remain a major challenge for other NoSQL vendors (and there are many "big data" startups offering products in this category) that do not offer the same level of abstraction DynamoDB does. It will be interesting to see whether DynamoDB becomes a significant threat to many of these startups.
We learned this reduction-of-complexity lesson at StrikeIron within our own niche offerings as well. We gained a much bigger uptake of our simpler, more granular Web services APIs, such as email verification, address verification, and other products such as reverse address and telephone lookups, offered as single, individual services, rather than complex services with many different methods and capabilities. This proved true even when the more complex services provided more advanced power within a single API. In other words, simplified remote controls for television sets are probably still the best idea for maximum television adoption, as initial confusion and frustration tend to be inversely proportional to the adoption of any technology.
Another interesting point is that this is the fifth class of database product offering in Amazon's portfolio. Along with DynamoDB, there is still the aforementioned SimpleDB, a schemaless NoSQL offering for "smaller" datasets. There is also the original S3 offering, with a simple Web-service-based interface for storing, retrieving, and deleting data objects in a straightforward key/value format. Next, there is Amazon RDS for managed, relational database capabilities that use traditional SQL for manipulating data and are more applicable for traditional applications. Finally, there are the various Amazon Machine Image (AMI) offerings on EC2 (Oracle, MySQL, etc.) for those who don't want a managed relational database and would rather have complete control over their instances (without having to use their own hardware) and the RDBMSs that run on them.
This tells us that the world is far from one-size-fits-all cloud database management systems, and we can all expect to be operating in hybrid storage environments that will vary from application to application for quite some time to come. I suppose that's good news for those who make a living on the operations teams of information technology.
And along with each new database offering from Amazon comes a different business model. In the case of DynamoDB, for example, Amazon has introduced the concept of "read and write capacity units," where charges are based on the combination of frequency of usage and physical data size. This demonstrates that cloud business models are still somewhat far from optimal and will likely change again in the future; such adjustments are clearly not limited to Amazon, as the major vendors are all still trying to figure it out.
In summary, following the Amazon database release timeline over the years yields some interesting information, namely that speed/latency, reduction of complexity, the likelihood of hybrid compute and storage environments for some time to come, and ever-changing cloud business models are the primary focus of cloud vendors responding to the needs of their users. And as any innovator knows, the challenges are where the opportunities are.
There are a number of email verification software applications for sale that claim to verify your email lists by performing email pings from your PC. These applications are less accurate and scalable than Web-based email validation services.
While these email validation applications may sound like a great way to quickly verify your email lists, they can be fraught with issues. Some problems you may encounter include the following:
1) Desktop software performs email server pings, which do not provide accurate results, especially for consumer-oriented domains (e.g. Yahoo! Mail, Gmail, Hotmail, etc.). Before spam became rampant, a mail client could use the simple VRFY command to quickly check whether the recipient's email address was valid. As you can imagine, this was abused by spammers, and most email servers (consumer and corporate) have since disabled the command.
2) Desktop software uses your IP address. This is an issue because email servers will flag an IP address for throttling and blacklisting if they see a lot of inbound calls from addresses that normally do not generate high quantities of email. This could affect future emails sent from your IP address and may even raise red flags with your Internet Service Provider.
3) Desktop software doesn't scale. Tied closely to #2, you can only verify a certain number of email addresses before your IP addresses get throttled or shut off.
A better solution is to use an email verification web service API solution, like StrikeIron’s Email Verification, for the following reasons:
1) Cloud based services support the most up-to-date methods and algorithms to provide the most accurate results. At StrikeIron, we have a team focused on constantly evolving and improving our proprietary email verification technology. Additionally, since the web service API stays the same, customers do not need to install or integrate new software to take advantage of these constant improvements.
2) Cloud-based solutions do not use your IP addresses. You call the web service via a SOAP or REST call, and we do all the work. Your IP address will never be flagged as a potential spammer.
3) Cloud-based solutions scale. We handle scalability across all of our customers, so we can handle high volumes of throughput for either real-time or batch processing.
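One practical pattern when calling such a service is to screen out obviously malformed addresses locally, so you only spend API calls on addresses that could plausibly be valid. The sketch below assumes a hypothetical `service_call` standing in for the SOAP or REST request to the verification API; the regex is a deliberately minimal syntax check, not a full RFC-compliant validator.

```python
import re

# Minimal syntax screen: one "@", no whitespace, and a dot in the
# domain. Anything that passes still goes to the remote service for
# the real verification.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def verify_emails(emails, service_call):
    """Split a list into locally rejected addresses and results from
    the remote verification service. `service_call` stands in for the
    SOAP/REST request to the email verification API and receives only
    the syntactically plausible addresses."""
    malformed, to_check = [], []
    for email in emails:
        (to_check if EMAIL_RE.match(email) else malformed).append(email)
    return malformed, service_call(to_check)
```

This keeps request volume (and cost) down without attempting the inaccurate server-ping approach described above.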
Email verification is a critical part of ensuring your email marketing campaigns are successfully delivered to your customers' and prospects' inboxes. Cloud-based solutions are a much more accurate, reliable, and scalable way to validate your email lists than desktop email validation software applications.
Outbound calling campaigns are still alive and well. The first ingredient to your successful outbound campaign is accurate phone numbers. Phone Append solutions can both add a phone number to prospect records that are missing numbers, as well as verify that the number is still linked to that person or business.
Furthermore, running your prospect database through a phone append solution will add other valuable, actionable data for businesses and consumers. For example, StrikeIron's Reverse Phone and Address Lookup service appends phone, name, and address data. For company data, it can add titles, SIC codes, and revenue; for consumer data, it can add home ownership, household income, age, and more.
There are three main ways to add phone append solutions to your workflow:
1) Upon acquisition - When prospects either enter their contact information (e.g. on a landing page) or when lead lists are imported. There are several benefits to performing a phone append at acquisition:
- Prevents bad leads from getting into your systems. By performing the append at acquisition, you can avoid polluting your CRM system with bad data.
- Requires less data. This means you can have fewer fields on your landing page or get the full value from incomplete leads.
2) Batch – In this scenario, the lead or CRM database is run through a solution to append phone, address, and other data. Typically this is done periodically to make sure the data is fresh and up-to-date.
3) Interactively – In this scenario, the sales person presses a button in the CRM system, which will invoke the phone append solution in real-time. The benefit of this approach is that the data is not appended until the sales person is ready to call them. Thus, it will be as fresh as possible.
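Whichever of the three modes you use, the core merge step is the same: fill in fields the record is missing without clobbering data you already trust. A minimal sketch, with `lookup` as a hypothetical stand-in for the reverse phone/address lookup call and all field names illustrative:

```python
def append_phone_data(record, lookup):
    """Fill in missing fields on a prospect record from an append
    service. `lookup` stands in for the reverse phone/address lookup
    request; values already present on the record are never
    overwritten."""
    appended = lookup(record)
    merged = dict(record)  # don't mutate the caller's record
    for field, value in appended.items():
        if not merged.get(field):  # only fill empty/missing fields
            merged[field] = value
    return merged
```

In a batch run you would map this over the whole lead database; in the interactive mode, the CRM button would invoke it for a single record just before the call.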
Data quality is critical for sales organizations. Integrating phone append solutions like StrikeIron’s Reverse Phone and Address Lookup to your sales workflow will improve the effectiveness and efficiency of your sales team.
Get started today with a free trial!
As the "Cloud" has evolved and matured over the past few years, the alternatives for deploying a cloud-based solution have been almost entirely proprietary and commercial. They have typically required at least a credit card just to get started "renting" servers and storage that might be needed only for short periods of time or to achieve more flexible scalability. With the success and momentum of OpenStack, an open source cloud operating system for deploying, hosting, and managing public and private clouds within a data center, this appears to be changing.
The OpenStack project, launched initially with code contributions from Rackspace and NASA, provides the software components for making cloud management functionality available from within any data center, including one's own, similar to what Amazon, VMware, Microsoft, and other cloud vendors now offer commercially. Deploying OpenStack enables cloud-based applications and systems utilizing virtual capacity to be launched without the associated run-time fees the current slate of vendors require, as all of the software is freely distributable and accessible.
At first glance, this seems to be an ideal way for larger enterprise IT organizations to offer traditional cloud functionality, such as virtual servers and storage, to constituents within the organization without the fear of vendor lock-in and ever-increasing vendor costs. This approach also provides access to implementation details and the ability to customize based on specialized needs, which is important in many scenarios and not typically or easily offered by the larger commercial vendors. So the benefits of the private cloud to those who find it appropriate to build and manage their own cloud environments are clear.
However, Rackspace itself just announced making public cloud services available using OpenStack, and others are likely to follow in the not-too-distant future, leveraging community-developed innovation in the areas of scalability, performance, and high availability that might ultimately be difficult for any single proprietary vendor to match. This should enable public service providers, especially in niche markets, to proliferate as well.
Major high tech vendors are also backing and aligning with OpenStack. In addition to Rackspace and NASA, Deutsche Telekom, AT&T, IBM, Dell, Cisco, and RedHat all have much to gain from the success of OpenStack and have announced themselves as partners, code contributors, and sources of funding. Commercial distributions such as StackOps have already emerged, funding for OpenStack-oriented companies has begun flowing from the venture community, and events such as this week's OpenStack Design Summit and Conference in San Francisco are getting larger and selling out quickly.
All of the foundational pieces are in place for OpenStack to have quite a run toward its goal of becoming the universal cloud platform of the future and the leader of the "open era" of the Cloud. This is an exciting development for companies like StrikeIron and our cloud-based data-as-a-service and real-time customer data validation offerings, as the data layer of the Cloud will become even more promising and fertile as OpenStack continues to accelerate organizations toward easier adoption of cloud computing models and all of their benefits.
Email validation, if done effectively, not only keeps organizations off of spam lists when communicating with customers and prospective customers, but can also surface new opportunities. Running a batch of email validation checks against an entire CRM contact database at regular intervals is likely to yield a subset of invalid email addresses within the contact data. Usually this is because people have left their organizations for one reason or another, though it could also mean they simply changed their email address.
Since these invalid email addresses usually represent contacts who are no longer at customer or target companies (their email addresses having been disabled), those companies likely have a replacement, or someone else, with whom it is important to build a new relationship. Being alerted to this via an email validation process helps savvy customer-facing employees get in front of replacement contacts sooner rather than later in order to maintain key existing relationships.
This process only works of course if real-time email validation solutions are utilized that do not rely upon static database lookups using data that can age quickly. Only solutions that go out to the Internet with each email check to recognize invalid email addresses in real-time will be useful in these scenarios.
The identification of these invalid email addresses in real-time typically triggers an action for a customer service representative or a sales person to reach out to the company to begin building the new relationship. This contact action can be a key trigger for new opportunities as well as strengthening existing customer relationships as there is a basis for communication with the customer, partner, or target company since the point of contact is no longer a part of the organization.
Fortunately, it is very easy to run these kinds of batch email validations using a Cloud-based email validation product such as StrikeIron's. A mechanism calls out to the StrikeIron platform via an API function call as each contact record is checked, simply reporting which email addresses are no longer valid. This creates a contact action list for customer service or sales teams to follow up on and potentially turn into new business opportunities, in addition to maintaining existing ones.
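The batch process described above reduces to a simple filter. In this sketch, `validate` is a hypothetical stand-in for the real-time email validation API call, and the contact fields are illustrative:

```python
def build_follow_up_list(contacts, validate):
    """Run each contact's email through a validation call and collect
    the contacts whose addresses are no longer valid, producing the
    action list for sales or customer service to follow up on.
    `validate` stands in for the real-time email validation API call
    and returns True when an address is still deliverable."""
    return [c for c in contacts if not validate(c["email"])]
```

Each entry on the returned list is a trigger to contact the account and ask who has taken over the departed contact's role.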
The appropriate time interval to run these types of mass email validation processes against a contact database could be every 30 days, every 90 days, or whatever seems ideal for a given business. Keeping contact data fresh and current can be a tremendous competitive advantage, especially when it's so easy.
I had an opportunity to moderate a panel at the Data 2.0 Summit this week in San Francisco entitled "Why You Should Join the API Economy". There was a considerable amount of thought leadership on the panel, including Chris Moody, President of Gnip; Gaurav Dhillon, CEO of SnapLogic; Chris Lippi, VP Products of Mashery; Peter Kirwan, Entrepreneur-in-Residence of Neustar; and Tim Milliron, Director of Engineering at Twilio.
We explored several topics including where success is occurring now within various API ecosystems (what is working), where money is actually being made with APIs, what some of the adoption challenges are moving forward, and how people can begin moving down an API path (both publishing APIs and finding relevant and valuable ones to consume) - all of these topics I plan to cover in future blog entries.
However, one area we explored that I thought was especially interesting is the adoption of API-centric business models within larger enterprises. Sure, high tech companies like Cisco and Salesforce have been utilizing APIs as significant parts of their business models for years. But where it is becoming especially interesting and demonstrates APIs moving into the mainstream is the traction of APIs and DAAS (data-as-a-service) in traditional vertical industries.
On the publishing side, for example, many government entities are now opening up data channels, such as San Francisco's open data portal, to enable citizens to create innovative applications. Opening up this data to the masses can drive all sorts of innovation that benefits entire communities.
On the consumption side, we discussed Mohawk Paper's (a company founded in the late 1800s) inspirational data integration case study, published by Gartner, as evidence of an enterprise pulling data together from multiple third parties to create a custom solution in the Cloud. One of those services is StrikeIron's real-time foreign exchange rate service API. And among our 1,800 customers are several Fortune 500 companies leveraging our various APIs and DaaS products at increasing rates, all evidence of expanding adoption in the enterprise.
As we see API-centric and DaaS-centric business models emerge that find traction in the enterprise in addition to all of the smaller entrepreneurial innovators and startups, we know we are getting closer and closer to mainstream adoption, which is where some of the biggest opportunities are yet to be realized.