The concept of API Management has gained momentum lately with a couple of large acquisitions and increasing investments occurring in the space. Also, many SOA (Service Oriented Architecture) and other infrastructure vendors have re-invented themselves as API Management solution providers to capitalize on the expanding opportunity. Suddenly, API Management has become one of the gold rush technologies of 2013, and of critical importance in the information technology landscape.
This has happened for several reasons. First, the proliferation of SaaS applications and integration requirements has contributed significantly to the interest in API Management. Second, legacy applications moving to the Cloud, and the need to pass messages and data between these applications, have also driven the increase in requirements for Internet-based APIs. Finally, the explosion in mobile devices, tablets, other smart devices, and apps has been a major factor. The applications that run on these devices act as Cloud clients and communicate with APIs to send and receive data from servers around the globe in order to operate. These mobile solutions have created a fertile landscape for API creation and deployment as well.
As organizations become determined to publish APIs as part of their Internet infrastructure and application strategies, they learn that simply putting an API up for use by others is not overly complex. However, they also discover that publishing a reliable, secure, enterprise-grade API to a controlled audience, with high performance and the ability to scale with adoption, can be quite a serious undertaking. These kinds of requirements are why the API Management space has moved to the forefront.
Typically, there are two approaches to API Management. The first is a proxy model. In this case, the API Management vendor requires you to provide them with an API endpoint address exposed on the Internet. In other words, you must already have an existing, functional API in place and in a production setting that the API Management vendor will then "wrap" with their own version of the API. Wrapping your API provides the vendor the ability to provide user management and authentication, report usage statistics, and provide other features such as support for multiple protocols like SOAP, REST, and JSON.
However, the proxy model still requires you to manage the infrastructure and the software and data behind your API. This includes the unpredictable cost and investment required to scale upon increases in adoption. You must also build in the ability to fail-over when necessary (such as server or database failures), ensure performance and reliability, and essentially mirror the infrastructure requirements of the API Management vendor.
This approach also increases the potential points of failure, as both you and your API Management vendor must maintain consistent, high-performance, reliable uptime numbers simultaneously. This can also be an additional challenge, because if either one of you experiences a hiccup or an unsupportable spike in traffic, you are essentially "down". That can be quite a frustrating experience for those who have integrated with your API. It also means you are still bearing the brunt of all of the API hosting costs, and the API Management vendor is merely managing your endpoint for you. All of the other costs are your own, and can add up dramatically in the long-term.
Certainly, there may be cases where the proxy model is attractive, and there might be strategic business reasons as to why you would want to maintain the core API infrastructure yourself. Or, perhaps there is no way to extricate the underlying IT asset from your own servers and the managed endpoint scenario is your only option. In these cases, a proxy model might still be workable.
However, API Management models also exist that are entirely "turnkey", meaning 100%, full-stack, hosted API scenarios that don't require you to have a commercial-grade API of your own readily available in advance to participate. In this case, you provide the data, software, or other asset that will be expressed as an API and made available to your constituents, and the API Management vendor deploys all of it within their own infrastructure, allowing you to remotely manage and administer access and availability to that API. This includes all of the user management, reporting, and other features that you would receive in the proxy model. The striking difference is that in this hosted case, scalability, multi-tenancy, performance, reliability, ongoing monitoring, and security are all abstracted away from you and in the hands of the domain experts who are immersed in these activities hourly. This also means you don't need a team of API infrastructure experts on your staff to monitor and manage the ongoing usage of your APIs. This is a dramatically simpler approach.
While it can take some time to deploy initially because of the extra API design step, the fully-hosted API model eliminates the infrastructure and ongoing management costs of creating, hosting, deploying, and maintaining the API and its underlying assets yourself long-term. This is generally why you would consider outsourcing API Management in the first place. No longer do you have to "put out integration fires" or otherwise manage and invest in the infrastructure of the underlying information asset and its production environment. Nor must you pre-invest in unnecessary capacity to handle spikes in scale, including all of the details of publishing data and functionality out to the open Web. Instead, the hosted API scenario is truly more Cloud-like in its permanent benefits and ease of deployment and maintenance. You gain all of the power of a commercial-grade API, but without all of the focus, expertise, cost, and other requirements to get it up and running and manage over time.
Before exploring relationships with API Management vendors, it is important to determine which of these models, proxy or hosted, makes the most sense for you now and for the long-term success of your API strategy. While most API Management vendors handle one scenario or the other, StrikeIron handles both, and we can explore these options with you based on our experience deploying hundreds of multi-tenant APIs in critical production environments, as well as on your own needs.
The basic premise of APIs (Application Programming Interfaces) is to make integration and customized usage with an application or between applications possible. These interfaces generally provide a pre-determined set of application communication "methods" for the purposes of sending and receiving messages to an application and invoking various commands. Application APIs have been around for some time. However, Web-based APIs, allowing the integration of applications to occur over the Internet, are more recent.
Traditionally, application to application integration was difficult. Custom code usually had to be written to make applications work together. It was typically a requirement that the applications were available on the same network, limiting the scope of integration. Not to mention that every time a new application was to be tied in, more custom code had to be written. Overall, it was a fairly tedious and expensive proposition to integrate applications, and the task was rarely a repeatable exercise.
Web APIs solve most of these challenges. For example, using text-based formats like XML or JSON enables platform independence: the concept that two applications can communicate and work in conjunction with each other even though they are running on entirely different hardware and software platforms.
This is a powerful concept, especially now that the Internet has eliminated geographical requirements. For example, platform independence is what enables a Windows-based PC to request a piece of information, such as a customer number, from a mainframe - even if that mainframe is running in a data center on the other side of the world. More recently, Web APIs enable mobile devices such as an iPhone to retrieve information from servers in real-time, which might be currency rates that are being maintained and updated on a Linux server thousands of miles away.
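The platform independence described above can be shown in a small sketch: any client that can parse JSON can consume the same payload, regardless of the hardware or operating system on either end. The endpoint shape and field names below are hypothetical, not those of any real currency service.

```python
import json

# A hypothetical JSON payload as it might arrive over HTTP from a
# currency-rate API; the field names are illustrative only.
payload = '{"base": "USD", "rates": {"EUR": 0.92, "JPY": 149.5}}'

def eur_rate(raw: str) -> float:
    """Parse the JSON text and pull out the EUR rate.

    Because JSON is plain text, this works identically whether the
    server is a mainframe, a Linux box, or anything else.
    """
    data = json.loads(raw)
    return data["rates"]["EUR"]

print(eur_rate(payload))  # the client never needs to know the server's platform
```

The same payload could just as easily be produced by a mainframe and consumed by an iPhone app; only the text format is shared.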
So while many of the old challenges are being solved, we want to be very careful to ensure that an API-based integration approach does not introduce several new complexities, throwing a wrench into an otherwise well-thought-out integration strategy that an organization might currently be considering.
For instance, it is fairly common for an organization or a data provider to publish multiple APIs to its constituency, each one representing a certain function or type of data. While consistency generally makes common sense, the challenge in practice is that the data sources that will be published might be coming from different places with varying data content standards and data structures, making normalization harder.
If an organization has five different APIs to publish, there might be cases where constituents want to integrate several, if not all of these APIs. If each one requires a different integration approach due to inconsistency of implementation across these APIs, the likelihood of adoption decreases, and all of the additional complexity introduced could make the ongoing maintenance of the integration quite difficult - the exact opposite of what we want to achieve with API-based integration.
Years of experience both creating APIs and serving thousands of customers who have integrated these APIs into production have given StrikeIron a foundation of knowledge around the deployment and ongoing use of APIs that address these kinds of issues. When we help customers publish APIs through our IronCloud API Management platform, these are the typical areas of normalization we focus on as part of our best practices for APIs:
Data content normalization. This one might be the most difficult as it could require some manipulation of large content datasets. However, as with any dataset and database, data content consistency makes analysis and reporting much easier. A simple example is to ensure gender data is always "m" and "f" rather than multiple variations of "male", "female", etc. Content normalization requirements also could be more complex, like consistent product naming standards.
Data structure normalization. It's important that APIs delivering data follow the same structural formats to ensure that working with the resultant data within client applications is as usable as possible. For a basic example, if one API uses "Full Name" as a data parameter, and another API uses "Given Name" and "Last Name" as two separate parameters, the usefulness of the data can degrade as comparing data can be challenging.
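The two normalization points above can be sketched in a few lines. This is only an illustration of the idea, not StrikeIron's implementation: free-form gender values collapse to "m"/"f", and name fields are reshaped so every API returns the same structure.

```python
# Illustrative normalization helpers; the field names are assumptions.

GENDER_MAP = {"m": "m", "male": "m", "f": "f", "female": "f"}

def normalize_gender(value: str) -> str:
    """Collapse variants like "Male"/"F" to a consistent "m"/"f"."""
    return GENDER_MAP.get(value.strip().lower(), "unknown")

def normalize_name(record: dict) -> dict:
    """Return the same given/last structure whether the source API
    supplied a single "full_name" or two separate fields."""
    if "full_name" in record:
        given, _, last = record["full_name"].partition(" ")
        return {"given_name": given, "last_name": last}
    return {"given_name": record["given_name"], "last_name": record["last_name"]}

print(normalize_gender("Male"))                     # "m"
print(normalize_name({"full_name": "Ada Lovelace"}))
```

With every published API funneled through helpers like these, client code can compare records from different APIs without per-API special cases.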
Authentication. This is where utilizing third-party platforms, such as IronCloud, to publish APIs can be exceptionally helpful, as they typically provide a de-coupled authentication layer allowing many different types of authentication (and protocols) to be leveraged. Examples include SOAP header-based authentication, SOAP parameter-based authentication, certificate-based/HTTPS REST, WS-Security, and several others. Here, not only is consistency important, but flexibility as well. It is hard to predict which IDEs or other development tools will be used when a client application integrates with an API, and some of those tools might only support certain authentication approaches. The important thing is that once an authentication method is decided upon, that same method can be repeated across all APIs that have been published by an organization.
Response code consistency. Most APIs have a set of response codes associated with them to relay important information back to the calling application after an API method has been invoked. For example, an API might need to respond with a "password not valid" or "data not found" response. In these cases, utilizing a response code such as "404" might be appropriate for data not found. The key thing is to ensure that these response codes are consistent from one API to the next. Otherwise, a developer will have to understand and write additional code to handle the various responses from each different API. This creates more complexity with each new API that is integrated.
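One simple way to realize the consistency just described is a single response-code table that every published API draws from, so "data not found" looks the same to the developer no matter which API raised it. The codes and envelope below are a hedged sketch, not a prescribed scheme.

```python
# One shared code table for all published APIs (codes are illustrative).
RESPONSE_CODES = {
    200: "OK",
    401: "Password not valid",
    404: "Data not found",
}

def build_response(code: int, body=None) -> dict:
    """Wrap any API result in the same envelope, whichever API produced it."""
    return {"code": code, "message": RESPONSE_CODES[code], "body": body}

print(build_response(404))  # same shape and wording from every API
```

Because every API shares the table, a developer writes the error-handling branch once instead of once per API.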
API behavior consistency. If mechanisms such as timeouts or API usage reporting capabilities are present as parameters in an API, it's a good idea to ensure that they are available in multiple APIs in the same way. This can prevent unnecessary coding and unexpected client application behavior when developers try to leverage multiple APIs from the same organization.
Business model consistency. It is rare that there is not a usage control mechanism in place governing the use of an externally published API. Whether credits, hits, daily maximums, monthly usage, or some other accounting mechanism governs usage, be sure that it is consistent across all of the APIs you publish, to minimize the usage-contingency code the developer needs to create. Inconsistency here can cause considerable challenges, as usage governance issues tend to surface during production use, and that is always undesirable. Foresight here can have long-term benefits in terms of adoption and overall stickiness of API usage.
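To make the usage-governance point concrete, here is a minimal sketch of one accounting mechanism (monthly credits) applied uniformly to every API. The class name and credit scheme are hypothetical; the point is that exhaustion is signaled the same way everywhere, so the developer's quota-handling code is written once.

```python
class UsageMeter:
    """Illustrative shared usage accounting for all published APIs."""

    def __init__(self, monthly_credits: int):
        self.remaining = monthly_credits

    def charge(self, credits: int = 1) -> bool:
        """Debit credits; return False when the quota is exhausted.

        Every API returns the same signal, so client code needs only
        one quota-handling branch.
        """
        if self.remaining < credits:
            return False
        self.remaining -= credits
        return True

meter = UsageMeter(monthly_credits=2)
print(meter.charge())  # True
print(meter.charge())  # True
print(meter.charge())  # False: quota exhausted, same signal from every API
```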
This is a basic model for API design best practices that StrikeIron has developed over the years helping customers and partners design and deploy APIs that are in production in thousands of organizations around the globe. While our IronCloud API Hosting and Management Platform handles a lot of these details, foresight and good API design can make a big long-term impact towards the success of an API Strategy.
If you are considering publishing data or other business functionality in the form of a Web API for integration by others, our experience and track-record of successful API deployment and integration could be helpful and a great choice to minimize complexity, accelerate speed to market, achieve scalability, control usage and ultimately achieve the benefits a well thought-out API strategy can provide. Let us know of your plans and we will gladly provide some initial consultation to see if IronCloud, our API management platform, is right for your needs.
StrikeIron has released Email Verification 6, the next generation of its award-winning technology and the most accurate email verification solution in the market.
Building upon the past decade of experience, StrikeIron has developed an innovative solution that provides immediate value by boosting email performance through improved accuracy and increased email deliverability. This means enhanced marketing campaigns and more effective customer relationship management for your organization.
Like other StrikeIron solutions, Email Verification 6 is cloud-based technology. It works in real-time and does not rely upon database lookups as a reference, as these reference files can quickly become outdated. It instead goes through a four-step real-time process using web resources to validate email addresses for the most accurate, current results. This includes validation at the recipient level, as well as several quality checks to ensure correct status codes are returned.
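StrikeIron's actual four steps are proprietary, so the following is only a hedged sketch of what a multi-step, real-time pipeline of this general shape could look like. The later stages are stubbed where a live service would consult web resources (DNS, mail servers) rather than a static reference file.

```python
import re

def check_syntax(email: str) -> bool:
    # Coarse shape check only; full RFC-compliant validation is far
    # more involved than this illustrative pattern.
    return re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email) is not None

def check_domain(email: str) -> bool:
    # Stub: a live system would resolve the domain's mail records here.
    return True

def check_recipient(email: str) -> bool:
    # Stub: a live system would probe the receiving server for the mailbox.
    return True

def quality_checks(email: str) -> bool:
    # Stub: e.g. disposable-address and role-account detection.
    return True

def verify(email: str) -> bool:
    """Run the steps in order; any failure short-circuits the pipeline."""
    steps = (check_syntax, check_domain, check_recipient, quality_checks)
    return all(step(email) for step in steps)

print(verify("user@example.com"))  # True
print(verify("not-an-email"))      # False: fails the syntax step
```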
This next version of email verification contains a set of proprietary algorithms that further enhance the accuracy and scalability of email verification across both consumer- and business-oriented domain addresses. In addition to enhanced accuracy, new capabilities include email address correction, detection of disposable and role accounts, and identification of malicious elements like traps.
The end result is an even more powerful solution that transforms any email list into a valuable marketing asset, while also helping to avoid blacklists and spam folders, inclusion in which would be harmful to future marketing communications.
From enterprises to SMBs, organizations of all sizes can choose a real-time, batch, or combination approach, utilizing email verification to improve email campaign performance with better deliverability.
Interested in learning more? Sign up for a free Email Verification trial or contact sales at firstname.lastname@example.org.
Marketo over the past couple of years has emerged as one of the top "lead management" platforms in the industry. Most recently, Gartner has them in the top right of their Magic Quadrant for CRM Lead Management (along with Eloqua, now owned by Oracle). Marketo's Revenue Performance and Marketing Automation platform helps marketers manage multiple digital channels, including email, social, online events and mobile. It is a key component of strategies that develop leads and opportunities for sales teams.
As with most data-centric solutions, Marketo's platform is significantly enhanced by ensuring all of the contact data for customers and prospects at its core is correct, current, and comprehensive. At a high level, this significantly enhances the ROI that Marketo customers attain from its platform, and reduces missed opportunities due to inaccurate lead data.
At a more granular level, Address Verification, Phone Number Validation, and especially Real-Time Email Validation can critically improve Marketing Automation efforts. Email validation ensures email addresses, especially when the opt-ins may have occurred a while ago, are active and receiving email. This is a critical part of professional lead management efforts. If invalid email addresses in a customer or prospect database are not removed, a large number of bounced emails will cause an organization to fall out of favor with ISPs and very likely end up on various spam lists, causing all marketing communications to arrive in spam folders.
Since Marketo was a natural for StrikeIron's capabilities, customers of ours suggested integrating the two platforms to enhance the value and experience of using Marketo. We jumped at the opportunity!
Late in 2012, Marketo introduced their innovative "Webhooks" capability for integrating third party applications via XML-based interfaces. Using their "trigger" functionality (such as "new lead created" or "Web form filled out"), a call-out to a third-party API can be made as part of any Marketo process. The resultant response data can then be mapped into the Marketo system and become a part of individual lead record data. In the case of StrikeIron, this includes validated mailing addresses, phone numbers, and email addresses, as well as additional enhancement data that are also returned by the StrikeIron APIs.
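The mapping step described above, taking an XML response from a call-out and flattening it into fields a lead record can absorb, can be sketched as follows. The element names here are invented for illustration; they are not StrikeIron's actual response schema or Marketo's field names.

```python
import xml.etree.ElementTree as ET

# A hypothetical XML response from a Webhook call-out (illustrative schema).
response = """
<VerifiedContact>
  <Email>user@example.com</Email>
  <EmailStatus>valid</EmailStatus>
  <Phone>+1-919-555-0100</Phone>
</VerifiedContact>
"""

def to_lead_fields(xml_text: str) -> dict:
    """Flatten the response into one key/value pair per child element,
    the shape a field-mapping step would typically consume."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

print(to_lead_fields(response))
```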
Since all of StrikeIron's Cloud APIs have the same data structure, behavior, and open XML protocol formats (in this case, REST was used), it was easy to integrate all of our various API products into the Marketo platform. In other words, since both support standard, open platforms designed with integration in mind, the integration was both easier and deeper than with some other third-party platforms where we have integrated our customer data validation solutions. In fact, the affinity between the two platforms was significant enough that we went ahead and integrated our SMS Text Messaging API into the Marketo platform as well, supporting Marketo's mobile efforts.
After successfully testing our various Webhook integrations, including with production customers, Marketo then provided StrikeIron with the Marketo-Integrated designation that you can see here: http://launchpoint.marketo.com/strikeiron-inc/749-strikeiron-email-verification
We currently have several solutions now available pre-integrated within Marketo as part of Marketo's Launchpoint partner channel. If you search for "StrikeIron" here, you can see the various integrated solutions currently available.
The last thing anyone wants is a marketing automation platform that is relying on bad or otherwise incomplete data. This powerful, yet easy integration of two platforms ensures that won't be the case.
Point-of-Sale (POS) systems are continuing to evolve. Once only a mechanism for processing and recording sales (such as a cash register), the POS system is now a considerable competitive advantage in a retailer's strategy. As a result, retailers are always looking to get more long-term ROI out of their POS systems. They are determining how this interaction opportunity with a customer can support other parts of a company's multichannel sales strategy as well.
The operation of these systems is rapidly moving from a counter-based cash register to the retailer's floor, as credit card processing hardware increasingly supports this model. Mobile handheld devices, tablets, and other consumer devices performing the actual sales transaction are now commonplace within a store, no longer tying personnel to the cash register. As a result, customers are now often asked to enter their own information via the on-the-floor transaction device, and an email address is now more than ever part of the collected customer information. Primarily, this is because customers increasingly prefer more efficient and environmentally friendly e-receipts. Better yet, collecting an email address can then provide an opportunity for future communications via email for the retailer, and can result in increased ongoing customer engagement.
The value of collecting an email address can be considerable. Email-oriented and Web-based marketing are outpacing traditional advertising and direct mail methods, and multi-channel sales strategies are as critical to success as ever. Even with retailers where traditional methods of POS still make the most sense via a cash register, collecting email addresses is equally important for customer retention success.
However, there are a couple of challenges to collecting this kind of customer data on the floor. First, data entry can sometimes be a little tougher with these mobile devices. If the customer enters their own data, which is now often the case, the chances of collecting incorrect or invalid email addresses either by accident or otherwise go up considerably. Even if the email address is collected properly, 30-40% of people change their email address every year. Simple typos or not, these issues not only prevent e-receipts from being delivered; a large incidence of invalid email addresses can also result in future marketing communications going into spam folders if Internet Service Providers (ISPs) detect enough email bounces and failed deliveries coming from the same sending source.
Real-time email validation, which utilizes an instantaneous out-to-the-Cloud check to ensure email deliverability, can substantially reduce the number of typos and otherwise bad email addresses getting into the system in the first place, right there on the floor at the point of data capture. Utilizing this kind of email address validation technology can reduce email bounces and failures by 90% or more. It can be an effective tool for ensuring the highest possible levels of data integrity when capturing customer details.
StrikeIron's Email Verification Solution is cloud-based, allowing it to determine in real-time whether an email address is valid and deliverable before a message is sent. It is used in many production POS systems today to ensure as often as possible that correct email addresses are obtained when collecting customer information.
The cloud-based, real-time approach is an important one, as StrikeIron is constantly evolving its algorithms in the background without requiring customers to update their POS integration in any way. Our team of email verification experts is constantly tweaking, enhancing, and otherwise modifying the algorithms that make these real-time checks as accurate as possible on an ongoing basis without any effort from the customer leveraging the technology.
Email Verification is only one easy-to-integrate API product available from StrikeIron. Others include Phone Number Validation, Address Verification, Do Not Call List checking, SMS Text Messaging and several more. For more information, please contact email@example.com.
To effectively reach your customers and prospects, you depend upon accurate and reliable phone numbers. StrikeIron has developed a new API that improves your telemarketing and mobile campaigns. The Mobile ID solution enables you to quickly and easily identify phone number type, allowing you to integrate this intelligence into your mobile strategies.
The Mobile ID solution helps marketers distinguish between landline, VoIP, and cellular phone numbers. This not only supports regulatory requirements surrounding mobile telemarketing, but it also enables marketers to better segment and get the most out of their mobile marketing efforts. Now your organization has certainty about what type of number it is dialing before placing a call.
This solution is complementary to StrikeIron's SMS text messaging service. If you are capturing critical phone number data, you certainly have considered how to leverage that data to support a mobile strategy. Consumers increasingly prefer text over email and other forms of communication.
If you have always wanted to test a mobile campaign, we recommend first utilizing Mobile ID to determine which phone numbers in your CRM system are cellular and capable of receiving a text message. Once you have parsed out all landline and VoIP numbers, you will be able to start implementing mobile text alerts and notifications. With our SMS API, we recommend sending a welcome message initially to make sure users want to opt-in to mobile messages.
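The recommended flow above can be sketched in a few lines: classify each CRM phone number, keep only the cellular numbers for the text campaign, and set landline and VoIP numbers aside. The `line_type` function here is a hypothetical stand-in for a real Mobile ID lookup, with canned sample data.

```python
def line_type(number: str) -> str:
    # Stand-in for a Mobile ID API call; a real lookup would classify
    # the number as "cellular", "landline", or "voip". Sample data only.
    sample = {"919-555-0100": "cellular", "919-555-0200": "landline"}
    return sample.get(number, "voip")

def sms_candidates(numbers: list[str]) -> list[str]:
    """Parse out everything that cannot receive a text message."""
    return [n for n in numbers if line_type(n) == "cellular"]

crm = ["919-555-0100", "919-555-0200", "919-555-0300"]
print(sms_candidates(crm))  # only the cellular number survives
```

The surviving list is what you would feed to the SMS API, starting with the opt-in welcome message described above.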
If you are interested in learning more, sign up for a free trial of Mobile ID today!
The general premise of data warehousing hasn't changed much over the years. The idea is still to aggregate as much relevant data as possible from multiple sources, centralize it in a repository of some kind, catalog it, and then utilize it for reporting and analytics to make better business decisions. An effective data warehousing strategy seamlessly enables trend analysis, predictive analytics, forecasting, decision support, and just about anything else we now categorize under the umbrella of "data science."
The premise is not different these days, but rather, it is more the shifting nature of the data sources that the warehouse must draw from to capture as much useful information as possible. It's the data that's changed, not the goal.
First, there is the rapid proliferation of social-generated data in all of its unstructured forms, making the data extraction and transformation components of loading data to the warehouse more difficult than in the past. But this isn't really groundbreaking for 2013, as social data, and the various Big Data technologies its growth has spawned (such as Hadoop), have been emerging for several years now.
Instead, what will likely be significantly different in 2013 is the acceleration of the deployment of a multitude of SaaS applications within the enterprise, especially in the larger, often slower-to-adopt, companies that populate the Fortune 2000. As deal sizes grow, the SaaS footprint is clearly becoming significantly bigger.
This is where it becomes interesting. It's not just that an organization has several different SaaS applications such as Salesforce, Workday, and Success Factors in place and in use across the enterprise, with a single instance of each in use by all. Instead, due to the nature of the easier adoption of these SaaS applications, many of them have come in through the back door departmentally and at different times rather than through a centralized IT-controlled proliferation. This means that multiple instances of the same application are popping up everywhere.
For example, there are large enterprises that now have 10, 20, or even 50+ instances of Salesforce running across the entire organization. Each instance has its own customizations of data collection and storage, separate add-on applications installed, different data feeding these applications, and unique implementation approaches. This could become the old story of solving old problems while creating new ones.
Some questions worth asking: What kind of data collection and ETL challenges will this cause for those wishing to leverage a data warehousing strategy? Does the fact that the operational data from these various SaaS applications is stored and maintained by different vendors, each of whom is incentivized to keep it that way, make things easier or more difficult for data warehousing and the analysis it enables? Will data fragmentation and the resultant data integration strategies scale across all of these instances of SaaS applications? It will be interesting to see how organizations meet the "SaaS sprawl" challenge, especially as it relates to cross-enterprise data collection strategy.
Furthermore, SaaS applications have taken an ever-increasing hold of the enterprise as of late with larger and larger deals. With the Cloud and SaaS applications a major part of their 2013 strategies, Oracle, SAP, IBM, and the more traditional software vendors have taken notice. SAP's Business ByDesign, Oracle's Fusion Applications, and recent SaaS acquisitions will surely add to what could become a hodgepodge of SaaS applications across the enterprise.
To meet these challenges currently, cloud data warehousing offerings from companies like BitYota and Amazon's Redshift are beginning to emerge with a core theme of the cloud as the centralized data storage repository. ETL and data integration solutions such as Informatica's Cloud and Dell's Boomi are racing to meet these traditional data warehousing requirements in the cloud paradigm. Also, the traditional data cleansing requirements of data warehousing are being met with their cloud-based counterparts for better, more usable data in these new age warehouses. One thing that will never change is that bad data will always equal bad analysis, and the need for making investments in data quality strategies will continue to exist.
As the landscape of SaaS continues its rapid expansion, and the data within these applications continues to burgeon, 2013 will definitely be a pivotal year in the dawn of a new class of data warehousing technologies.
StrikeIron’s API Address Services reduce time, minimize stress and remove fulfillment frustrations for its user community by validating domestic and international addresses and completing ZIP codes
Fred Benenson, a Kickstarter employee, believes in practicing what he preaches. He decided to become a “client” (or creator) of Kickstarter to try the user experience first-hand.
Kickstarter is a unique online business model, which helps bring imaginative projects to life through the direct support of others. Since its launch April 28, 2009, the company has provided an online platform for more than 3.1M people, who pledged $440M and funded over 33,000 creative projects, including films, games, music, art, design, technology and other creative projects.
Benenson shares his story in a blog, “Kickstarter Fulfillment and Product Development: A Story of Dogfood and Data Validation,” highlighted here.
His idea was simple: “I fought SOPA (Stop Online Piracy Act) and all I got was this stupid t-shirt.” Fulfilling orders, however, was more complicated than expected. Keeping an eye on expenses, he found a free 30-day batch mailing software. He imported addresses from a CSV file, only to learn the requirements were strict. Field entries required a specific order, and data was limited to three lines, which meant he had to include t-shirt sizes next to address names (something not everyone probably wants disclosed). After a “time-consuming task and stressful process,” the software rejected most of the addresses over a ZIP code problem.
Benenson – who was sure other creators had similar frustrations with order fulfillment and shipments – took his story back to the internal Kickstarter team, which sparked discussions about how to improve the data processing creators face when delivering rewards to backers. They determined one solution to the “dirty-address problem was to use an external service to validate the addresses supplied by backers.”
Colleague and Product Manager Jed Meade researched verification services. StrikeIron emerged as the best solution, and in 2012 Kickstarter integrated StrikeIron services to verify North American and global shipping addresses in real time as a value-add for its creators. They upgraded their usage several months later.
“We are thrilled to call another innovative, online company a StrikeIron client,” says Justin Helmig, VP of Marketing for StrikeIron. As another point of interest, Helmig notes that, adhering to marketing best practices, StrikeIron relies on sophisticated analytics to determine which marketing efforts yield the greatest ROI.
“Kickstarter was diligent in their research,” he says. “They found us through several channels. From a marketing perspective, this demonstrates the value of investing in integrated marketing across channels to offer prospects multiple touch points.”
Learn more about how to better connect with customers through multichannel marketing.
SMS is the most widely used data application in the world; Forrester Research calls it the “workhorse of mobile,” and reports say people sent as many as 10 trillion text messages in 2012. Mobile experts agree SMS will remain a significant part of the mobile landscape for the foreseeable future.
Companies across the globe rely on bulk text messaging to solve a host of business needs and, as a result, positively affect their bottom line by:
Improving customer satisfaction by sending timely notifications about packages, or service technician arrivals
Increasing revenue by texting reminders to refill prescriptions or to service vehicles
Generating sales through special text promotions and offers
Decreasing labor and operating expenses by replacing manual calls and mailings with messaging
Expanding event attendance with save-the-date, early registration, or time-sensitive messages
Keeping internal teams informed with quick links to a new product, meeting changes, or other informational messages
Preventing physical harm by distributing emergency and safety warnings
SMS – which has never been easier, faster, more cost-effective, or more reliable than it is today – is a business best practice, one most big brands and enterprise companies use. It makes sense: text messaging enables businesses to reach people on their mobile devices and smartphones anytime, anywhere, quickly and easily.
The First Text Message
In December 1992, the first SMS message – sent from a computer to a cell phone in the UK on the Vodafone network – was successful. The message, "Merry Christmas," sparked a text messaging and mobile communications revolution.
Today, adults are more than twice as likely to use SMS as any other form of mobile messaging, including email, MMS, and instant messaging. Additionally, more than 80% of the U.S. population owns a mobile phone, and 73% send or receive text messages daily.
A 20-Year SMS Evolution
Finnish engineer Matti Makkonen gets credit for the innovative idea of delivering a simple message from one device to another, even if the receiving device is off or outside a coverage area, an idea that came to him while he enjoyed a bite to eat at a pizzeria in Denmark.
Following this epiphany, he wrote down a few specifications, and in December 1992 the first SMS message, sent from a computer to a cell phone in the UK on the Vodafone network, successfully wished someone “Merry Christmas.”
Initially, carriers were not expecting much activity around SMS. The growth stemmed from teenagers taking advantage of a communications method far less expensive than voice calls, which were charged for each minute of usage. A host of acronym-based expressions emerged (LOL, OMG, and CUL8R to name a few of the firsts), accelerating the mobile messaging revolution even further. Carriers standardized SMS messaging formats in 1999.
Additionally, people appreciated the private nature of SMS messaging, and mobile carriers capitalized on the technology, going to great lengths to prevent spam and other potentially unwelcome messages that could cause usage – and therefore their revenue – to deteriorate.
An SMS Mobile Marketing Pioneer
StrikeIron, a pioneer in data quality communications, began providing SMS capabilities to customers at the beginning of the 21st century, sending texts through email. In approximately 2006, StrikeIron developed more sophisticated ways for businesses to integrate SMS technology.
Today, StrikeIron enables the programmatic sending of text messages to mobile devices in real time using a Cloud-based API, continuing to enhance its SMS applications based on customer business needs and market demands.
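As a rough sketch of what "programmatic sending via a Cloud-based API" looks like in practice, the snippet below validates a recipient number and builds the JSON body an HTTP client would POST. The endpoint URL, field names, and limits here are illustrative assumptions, not StrikeIron's documented interface:

```python
import json
import re

API_URL = "https://api.example.com/v1/sms/send"  # placeholder endpoint, not a real StrikeIron URL

def build_sms_request(to_number: str, message: str) -> str:
    """Validate inputs and return the JSON body for an SMS send request."""
    # E.164 format: '+' then country code and subscriber number, at most 15 digits
    if not re.fullmatch(r"\+[1-9]\d{1,14}", to_number):
        raise ValueError("phone number must be E.164, e.g. +14155550123")
    # A single GSM-7 SMS segment carries at most 160 characters
    if len(message) > 160:
        raise ValueError("message exceeds one 160-character SMS segment")
    return json.dumps({"to": to_number, "body": message})
```

An application would POST this body to the provider's endpoint with its API credentials; carrier routing, delivery receipts, retries, and multi-segment messages are the parts a managed SMS service handles behind that single call.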
Learn more about StrikeIron SMS Mobile Marketing solutions:
On behalf of StrikeIron, we wish you a happy and safe holiday, and look forward to your comments below!