You have access to the IP address of every visitor to your site. IP address web services can turn this simple set of numbers into valuable insight about your visitors: what company (or ISP) they are coming from, their latitude and longitude, and even their country, state, and city.
IP addresses are how Internet traffic is routed between a web browser and a website. When your visitors request a webpage, their IP address is sent as part of that request, and a simple web service call can reveal many interesting things about the visitor.
Our customers use this information for a variety of use cases including:
- Localized content. Based on the location of the IP address, you can present the user with localized offers. For example, daily deal sites use IP address geocoding APIs to display the appropriate deal for the nearest supported metropolitan area.
- Language localization. Based on the country, you can easily localize the language and dialect for a particular country or region. For example, content outlets can automatically provide the correct language for the visitor as opposed to adding an extra step for the user to choose their language.
- Picking the best time for outreach. If you are reaching out to customers by phone or email, knowing their timezone lets you call at an appropriate hour or send email messages at a time that yields better open and click rates. For example, our own sales team does not follow up with West Coast leads until late morning East Coast time.
- Fraud protection. Comparing an IP address's location with a billing address provides an indicator of potentially fraudulent activity, and eCommerce merchants use this linkage to score the likelihood of a fraudulent transaction. For example, if the visitor is coming from a different country than the billing address on the credit card, that is a possible indication of fraud.
There are many other use cases where IP address APIs can provide useful, actionable insight about your visitors.
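To make the idea concrete, here is a minimal sketch of calling an IP geolocation web service and reading back the fields discussed above. The endpoint URL, parameter names, and response fields are hypothetical placeholders, not StrikeIron's actual API; consult your provider's documentation for the real ones. No network call is made here; the parsing is shown against a canned response.

```python
import json
from urllib.parse import urlencode

# Hypothetical endpoint -- replace with your provider's documented URL.
BASE_URL = "https://api.example.com/v1/geoip"

def build_lookup_url(ip_address, api_key):
    """Construct the lookup request URL for a visitor's IP address."""
    return BASE_URL + "?" + urlencode({"ip": ip_address, "key": api_key})

def parse_geo_response(body):
    """Extract the fields discussed above from a JSON response body."""
    data = json.loads(body)
    return {
        "country": data.get("country"),
        "region": data.get("region"),
        "city": data.get("city"),
        "latitude": data.get("latitude"),
        "longitude": data.get("longitude"),
        "isp": data.get("isp"),
    }

# Canned response standing in for what the service might return:
sample = ('{"country": "US", "region": "NC", "city": "Durham", '
          '"latitude": 35.99, "longitude": -78.9, "isp": "Example ISP"}')
geo = parse_geo_response(sample)
print(geo["city"])  # Durham
```

In practice the two functions above are all the integration work required: build the URL, fetch it, and parse the JSON.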
Interested in learning more? Sign up for a free IP address web service trial or contact sales at email@example.com
We will be demonstrating our real-time lead enhancement, data quality, and SMS mobile marketing solutions this week at LeadsCon East in New York City.
The focus of the LeadsCon event is lead generation, online direct response and customer acquisition. Each of these marketing categories represents a prime application of StrikeIron's solutions that greatly increase the value of a lead through real-time data validation and data enhancement, making StrikeIron an ideal partner for companies that are taking on these kinds of initiatives.
For example, StrikeIron's real-time email verification solution, delivered via easy-to-integrate APIs in the Cloud, ensures that an email address is valid, live, and can accept messages. This helps to maintain clean, current email addresses in customer and prospect databases, as sending email to disabled email addresses can put companies on spam lists and severely hamper marketing communications efforts.
Other solutions that will be demonstrated include address verification (North American and Global), phone validation, reverse phone and address append, email append, and our SMS text messaging solution, enabling organizations to communicate and engage with customers on mobile devices anytime and anywhere in the world.
All of these solutions are delivered through IronCloud, our award-winning Cloud-based data delivery platform, making it very easy to integrate any or all of them into Web sites, applications, business processes, or anything else that has the ability to consume a SOAP or REST-based Web service.
If you would like to meet with a StrikeIron representative at LeadsCon East to explore these solutions, please visit http://offers.strikeiron.com/leadscon
As organizations move applications to the Cloud where it makes sense to do so, they should recognize that this is an ideal opportunity to improve the value of the underlying data assets that feed these applications. After all, any system, Cloud or otherwise, is only as good as the data within it.
A "move to the Cloud" provides a unique opportunity both to ensure existing data is of the highest possible quality and to install mechanisms that govern all future data entering the system so it is accurate, current, and complete. This is especially true if data is also being moved from an existing internal database to a Database-as-a-Service (DBaaS) product like SQL Azure or Amazon RDS, or to a database such as MySQL or SQL Server running on top of a Cloud service like Amazon, Microsoft Azure, Rackspace, or any other Cloud platform.
As data moves from its source database into its target Cloud database, you can take advantage of this ideal time to:
- Ensure all physical addresses are valid, accurate, current and complete
- Ensure all email addresses are live, working email addresses that have not been disabled or changed (otherwise, you could find yourself on spam lists simply by trying to contact your customers)
- Ensure all telephone numbers are valid, accurate, and current
- Ensure all data fields are consistent in content and individual data elements are unambiguous, making data analysis and the emerging field of data science much more effective
- Fill in all missing data where possible
- Eliminate duplicate contact and customer records
- Incorporate any other data-specific business rules and requirements that make sense for your organization
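A couple of the checklist items above, phone normalization and duplicate elimination, can be sketched as a simple in-transit cleansing pass. The record shape, the 10-digit North American phone rule, and the email-based dedupe key are all illustrative assumptions, not a prescribed migration design.

```python
import re

def normalize_phone(raw):
    """Keep digits only; accept 10-digit North American numbers (illustrative rule).
    Returns None when the number cannot be made valid."""
    digits = re.sub(r"\D", "", raw or "")
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]  # drop a leading country code
    return digits if len(digits) == 10 else None

def cleanse(records):
    """Apply simple rules while records are in transit from source to target,
    dropping duplicate contacts keyed on a lowercased email address."""
    seen = set()
    cleaned = []
    for rec in records:
        email = (rec.get("email") or "").strip().lower()
        if not email or email in seen:
            continue  # eliminate duplicate or empty contact records
        seen.add(email)
        out = dict(rec)
        out["email"] = email
        out["phone"] = normalize_phone(rec.get("phone"))
        cleaned.append(out)
    return cleaned
```

A real migration would chain many more rules (address standardization, field consistency checks, fill-in of missing data), but the pattern of validating each record as it passes through stays the same.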
Also, the wise organization puts real-time data quality and data enhancement mechanisms in place at the points of data collection, such as a data entry form or within a Web-to-lead process, to ensure that all new data coming into a system is of the highest possible quality. This also prevents degradation of data over time, so the same set of issues does not recur a short time later; otherwise, more cleansing effort and cost will be required downstream.
A significant part of the success of any Cloud initiative revolves around cleansing existing data during migration, getting real-time data quality mechanisms in place, and establishing an ongoing data management plan with metrics and goals for going forward. Don't let rare application migration opportunities such as this go to waste.
We have all been through it before: you acquire a new user, send them an email verification link … and they never click it. It is not just you; email verification links kill conversion rates. According to this conversation on Quora, companies see a drop-off of 15-25% in users clicking the verification link.
There are a variety of reasons for such large abandonment numbers. If your email gets filtered to spam, the user will likely never see the link and will either reach back out with a complaint or forget about you altogether. Some email servers use "greylisting" to deter spammers, which can delay your message by several minutes or even hours before it shows up in the user's inbox. And by asking the user to leave your website, wait for an email, click a link, and resume, you are breaking the flow and introducing opportunities for the user to become distracted or disinterested.
Some companies use email verification activation links to either prevent fraudulent repeat registrations by requiring a “real person” to click a link or to make sure the user enters a valid email.
Unfortunately, verification link messages don’t achieve either of these goals very well. It is easy to write an automated script that checks an email inbox and automatically follows the activation link. And if you are using links to verify that a user entered the right address and they enter a wrong one, you are depending on them to find you again and re-register (and your verification message will have bounced, which can cause deliverability issues for future messages).
Implementing a real-time email verification solution is an easy way to ensure users are entering valid emails so that they can complete the registration process without leaving your site. This will ensure the email address is valid while the user is still engaged with your site. Any invalid addresses can be flagged to the user for correction. Additionally, preventing invalid email addresses reduces the number of hard bounces from welcome messages, thus decreasing the chance of getting filtered to spam or bulk folders for all of your communications.
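As a rough illustration of flagging an invalid address while the user is still on the form, here is a minimal syntax screen. This is only a sketch: a real-time verification service also confirms that the domain exists and the mailbox is live and accepting messages, which a client-side syntax check alone cannot do. The regex and the message text are illustrative choices.

```python
import re

# Minimal structural check: something@something.tld, no spaces or extra @'s.
# A provider-side check would go much further (domain, mailbox, deliverability).
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def flag_invalid_email(address):
    """Return a correction message to show the user while they are still
    engaged with the form, or None if the address passes the screen."""
    if not EMAIL_RE.match(address.strip()):
        return "That email address looks invalid -- please check it."
    return None

print(flag_invalid_email("user@example.com"))  # None (passes)
print(flag_invalid_email("user@@example"))     # flagged for correction
```

The point is where the check runs: inline, before the welcome message is ever sent, so hard bounces never happen and the user never has to leave the page.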
What is your email verification abandonment? Have you integrated real-time email verification to improve this drop-off? Let us know at firstname.lastname@example.org.
Many of StrikeIron's direct customers integrate our various API-delivered data services into applications, Web sites, and business processes entirely on their own, usually with a line or two of code - a testament to how easy this is to do. These Cloud-based product offerings can be integrated into anything that can consume a SOAP or REST-based Web service (which is just about anything).
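That "line or two of code" looks roughly like the following for a REST-style service. The endpoint URL and parameter names here are hypothetical stand-ins, not StrikeIron's actual API, and the network call itself is left commented out; the point is simply how little integration code a Web service requires.

```python
from urllib.parse import urlencode
from urllib.request import urlopen  # the fetch itself is not executed here

# Hypothetical REST endpoint -- the real URL and parameters come from the
# service's documentation.
url = ("https://ws.example.com/emailverify/rest?"
       + urlencode({"email": "user@example.com", "key": "YOUR_KEY"}))

# result = urlopen(url).read()   # the "line or two" of integration code
print(url)
```

A SOAP consumer is only slightly longer, since most languages generate the client from the service's WSDL.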
However, StrikeIron has also developed technology integration partnerships with many of today’s top software and Internet solutions platforms, all of which are enhanced by integrating Data-as-a-Service capabilities from StrikeIron.
Having these capabilities, such as real-time address verification, email verification, sales tax rates, foreign currency rates, SMS text messaging, and phone verification, pre-integrated into various other platforms that are already in use by large customers every day can be a very compelling solution. It is a win-win-win scenario for our customers, partners, and our technology.
One such partner is Informatica. Informatica has integrated several StrikeIron services for contact data validation within its Informatica Cloud platform, as data validation is a very important step when integrating data between platforms. These services can be used via the Informatica Cloud StrikeIron plug-in, or directly within the Informatica Cloud platform per our most recent partnership. In the latter case, some of our services are available for use simply by checking a box directly within Informatica's Cloud application. This makes it very easy to have high-quality, validated data arriving at a target destination, cleansed as an intermediate step while in transit from its source. You can view a recorded Webinar here.
Customer contact information (e.g. phone numbers, addresses, emails, etc.) is notoriously volatile and difficult to maintain at high accuracy levels. Experts estimate that 2% of records in a CRM database become obsolete each month due to customers dying, divorcing, marrying or moving.
Data hygiene techniques must be implemented to ensure the highest level of data quality. The best way to tackle this is to find a data quality provider whose core competency is in data hygiene. For example, StrikeIron can handle and cleanse customer data obtained from point-of-sale, website, call center, social media, survey, email, etc.
Look for a data validation partner like StrikeIron that takes into account the following:
• Existence: whether the organization has the data
• Validity: whether the data values fall within an acceptable range or domain
• Consistency: whether the same piece of data stored in multiple locations contains the same values
• Integrity: completeness of relationships between data elements and across data sets
• Accuracy: whether the data describes the properties of the object it is meant to model
• Relevance: whether the data is the appropriate data to support the business objectives
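Four of the six dimensions above can be sketched as mechanical checks on a single record. The field names and rules here are illustrative assumptions, and `reference` stands in for an authoritative source such as a validation service response; integrity and relevance are omitted because they require cross-record relationships and business context rather than per-record rules.

```python
def assess_record(record, reference):
    """Score one contact record against four validation dimensions.
    `reference` is a stand-in for an authoritative source (illustrative)."""
    phone = str(record.get("phone") or "")
    return {
        # Existence: does the organization have the data at all?
        "existence": bool(record.get("phone")),
        # Validity: does the value fall within an acceptable domain?
        "validity": phone.isdigit() and len(phone) == 10,
        # Consistency: does the same data agree across locations?
        "consistency": record.get("billing_country") == record.get("shipping_country"),
        # Accuracy: does the data match the real-world object it models?
        "accuracy": record.get("phone") == reference.get("phone"),
    }

rec = {"phone": "9195550100", "billing_country": "US", "shipping_country": "US"}
print(assess_record(rec, {"phone": "9195550100"}))
```

A data quality provider effectively runs checks like these, backed by authoritative reference data, across every record and every dimension.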
Data quality solutions and processes should attempt to improve the accuracy and completeness of the information your organization receives. This involves cleansing and transforming data by removing inaccuracies and standardizing on consistent formats and values.
While you know and understand your data better than anyone else, a third party can add considerable value. StrikeIron can validate data before or after it enters your database; the former is better at preventing errors at the source. The end result is high-quality customer data in your database, driving increased revenue and reduced costs.
"Big Data" is all the rage these days, and the Big Data marketing umbrella seems to be rapidly expanding as a result. The term is getting slapped onto all kinds of product marketing narratives, including many kinds of data-oriented analysis products, or products where data exists in any kind of volume (much of which is an evolution of data warehousing concepts in need of some newness). So, as with any hot industry term, the usual market confusion is present.
As for me, I like to think of Big Data as referring to datasets so large that they tend to fall outside the scalability and performance afforded by traditional table-driven, SQL-based data management approaches, and instead need a different way of thinking about and handling the tremendous amount of potential information that exists within these data entities.
The term Big Data emerged as many Web-scale companies such as Facebook, Twitter, Google, Amazon, and others started stretching the limits of traditional databases with their sheer data volumes and performance requirements, and began to realize they needed a data management approach more finely-tuned to their massive data requirements.
As a result, technologies such as Hadoop, Cassandra, BigTable, Dynamo, and others began to appear to address these requirements. Analytics solutions focused on these massive data volumes have also begun to appear, as well as storage and performance alternatives positioned as ideal for Big Data. There is also a new class of operational metrics solutions that help to generate these volumes of data, including both software and hardware instrumentation.
However, one concept seems to be often missing from these excited conversations: data quality. While it is true that much of big data goes well beyond structured data, much of it is still data, and data always has the potential to be unusable or flat out wrong. This omission of course creates opportunity for the astute and innovative. Many of the traditional data stewardship approaches are still applicable and necessary and need to be implemented with Big Data characteristics in mind. Customer data quality, profiling, data standardization, consistency prior to analysis and integration, rules-based testing, and even non-technology oriented quality initiatives (data completeness incentives for example) need to be part of any Big Data strategy for anyone hoping to have any sort of success.
So as you embark on the path of massive data volumes, be sure that a data quality strategy exists as part of the larger Big Data strategy, and keep your eye out for what happens in this space as it's still in its formative period. After all, the last thing any organization wants is Big Bad Data.