Just ahead of its annual media day and a major leadership event in Orlando next week, SAS on Friday disclosed its latest move – and a big one it appears to be – into deeper yet faster and more economical use of “big data” analytics.

The deal involves open-source software called Hadoop, which is widely used to store and process huge volumes of data across clusters of servers in data centers. SAS will be working with Hortonworks, a privately held firm in Palo Alto, Calif.

Just as the big analytics market is booming, so is Hadoop. From sales of $1.5 billion in 2012, Hadoop revenues are expected to soar to nearly $21 billion by 2019, according to Transparency Market Research. The analytics market could hit $60 billion in 2017, up from under $10 billion in 2012, says research firm MarketsandMarkets.

Says SAS executive Mike Ames: “With SAS and Hadoop our customers are able to answer questions that previously were impossible to answer because of the size and scale of the data and processing required.”

SAS is a global leader in analytics, making sense of so-called big data, which includes information gathered from disparate locations. However, SAS says some data remained out of reach without software like Hadoop. To gather and crunch information that SAS says “could not be captured or analyzed,” it struck the alliance with Hortonworks.

To Hortonworks, SAS represents an opportunity to take its technology to a broad market. SAS, with annual revenues of more than $2 billion, has thousands of customers worldwide.

“It’s not the first deal [in analytics]. However it is the most significant given SAS’s position as a global leader in business analytics,” John Kreisa, vice president of strategic marketing for Hortonworks, told WRALTechWire.

“Hadoop has been used for analysis of data before. However we believe for Hadoop to achieve broad adoption it needs to enable organizations to leverage their existing skills. With a massive installed base of SAS users this means that more organizations are able to exploit the benefits of Hadoop for analyzing even more data.”

From Grid to Cloud to Hadoop

The deal builds on the vision of Jim Goodnight, co-founder and chief executive officer of SAS, who created a special division reporting to him that explored what Ames calls “distributed in-memory technology.” Like other firms, SAS has for several years been working on software that turns stand-alone servers and PCs into grids and, more recently, into the virtualized cloud computing environments of today. Hadoop is the next step for SAS.

Hortonworks is a fast-growing firm launched by former Yahoo engineers that, in the two years since its launch, has raised more than $100 million in venture capital, is expecting $100 million in revenue by next year and could go public. The SAS-Hortonworks agreement calls for both firms to invest in joint activities such as sales, marketing and training, plus further research and development.

The firms are blending Hortonworks’ ability to manage data center operations with SAS’ recently developed “LASR in-memory server,” which SAS touts as a much faster means of analyzing data. As SAS executives have previously explained, huge projects that used to take hours can now be done with LASR in a fraction of the time. Ames notes that the “commodity computing” enabled by data centers through such tools as virtualization and run on Hadoop can be utilized “in a different way” by using LASR.

The time is right to deepen its arsenal of big data offerings, SAS says, because the company has become deeply familiar with the technology, Hadoop has matured into a “viable platform for analytics,” and customers want what a joint SAS-Hortonworks solution can offer.

“Data scientists and business analysts need to easily and quickly explore, visualize and analyze big data stored in Hadoop in an interactive and collaborative environment,” said Randy Guard, SAS vice president of Product Management, in announcing the alliance. “SAS provides domain-specific analytics and data management capabilities natively on Hadoop to reduce data movement and take advantage of Hadoop’s distributed computational power. Use of SAS analytics software with Hortonworks Data Platform will help businesses quickly discover and capitalize on new business insights from their Hadoop-based data.”

Why Hadoop and Hortonworks?

In a Q&A with WRALTechWire, Mike Ames, SAS Director of Product Management for Data Integration, talked about the significance of the Hortonworks deal. (He also noted that SAS works with Cloudera, another Hadoop provider, but Friday’s announcement focused on Hortonworks.)

  • Why has SAS made the decision to enter the world of Hadoop?

First and foremost, our customers have embraced Hadoop and have asked SAS to provide Advanced Analytics on Hadoop. Hadoop provides the ability to store and process vast amounts of data – regardless of type – at a cost and scale that were previously unfeasible.

What good is storing and processing if you can’t make sense of the data?

That is where SAS comes in, applying advanced analytics to vast stores of data. With SAS and Hadoop, our customers are able to answer questions that previously were impossible to answer because of the size and scale of the data and processing required.
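For readers unfamiliar with the mechanics, a minimal sketch of the kind of file-based processing Hadoop is known for is a “streaming” MapReduce job: a mapper script emits key-value pairs from raw input files, and a reducer script aggregates them across the cluster. The Python below is a generic word-count illustration, not part of the SAS-Hortonworks offering; the file names and job setup are assumptions about a typical cluster.

    # mapper.py -- Hadoop Streaming pipes each line of the input split to stdin;
    # emit one tab-separated (word, 1) pair per word.
    import sys

    for line in sys.stdin:
        for word in line.split():
            print(word + "\t1")

    # reducer.py -- Hadoop sorts mapper output by key, so all counts for a word
    # arrive together; sum them and emit one total per word.
    import sys

    current_word, current_count = None, 0
    for line in sys.stdin:
        word, count = line.rsplit("\t", 1)
        if word != current_word and current_word is not None:
            print(current_word + "\t" + str(current_count))
            current_count = 0
        current_word = word
        current_count += int(count)
    if current_word is not None:
        print(current_word + "\t" + str(current_count))

On a cluster, scripts like these are typically launched with Hadoop’s bundled streaming jar, along the lines of: hadoop jar hadoop-streaming.jar -file mapper.py -file reducer.py -mapper mapper.py -reducer reducer.py -input /data/in -output /data/out (paths shown for illustration only).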

  • Did you consider other alternatives to Hortonworks for this move, and if so why select Hortonworks?

At SAS, our two main partners, Hortonworks and Cloudera, offer more Hadoop experience than any other organizations in the community.

Hortonworks is the most active participant in the community on projects like MapReduce, YARN, Pig, Stinger, Ambari and others that are natural complements to what SAS has to offer in the Advanced Analytics market.

SAS support for Hadoop spans the entire data-to-decision process and will expand across partnerships with other leading Hadoop providers. Stay tuned for more information related to this soon.

  • Why had SAS not embraced Hadoop earlier?

SAS has been involved in the Hadoop community since 2010; it wasn’t until the true commercial uptake of Hadoop in the last couple of years that Hadoop was really a viable platform for advanced analytics.

  • This appears to open new dimensions of data mining/analytics for SAS and clients. How significant a decision is this for SAS?

It is a significant decision for SAS, because SAS takes advantage of the commodity computing of Hadoop in a different way.

Instead of the file-based processing that Hadoop is known for, SAS lifts data into memory across the cluster using the SAS LASR in-memory server. Once the data is in memory, customers are able to interactively analyze, explore, mine and process it at a scale, and with algorithms and methods, that were previously unfeasible with any other architecture.
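As a rough conceptual sketch of that contrast (plain single-machine Python, not the actual LASR interface, with illustrative file and column names), the difference is between re-scanning files on disk for every question and loading the data into memory once so that follow-up questions run against RAM:

    # Conceptual illustration only -- not SAS LASR. File-based style: each new
    # question forces another full pass over the data on disk.
    def mean_from_disk(path, column):
        total, n = 0.0, 0
        with open(path) as f:
            header = f.readline().rstrip("\n").split(",")
            idx = header.index(column)
            for line in f:
                total += float(line.rstrip("\n").split(",")[idx])
                n += 1
        return total / n if n else 0.0

    # In-memory style: pay the load cost once, then explore interactively.
    # (LASR does the analogous thing distributed across a whole cluster.)
    def load_table(path):
        with open(path) as f:
            header = f.readline().rstrip("\n").split(",")
            return [dict(zip(header, line.rstrip("\n").split(","))) for line in f]

    # table = load_table("transactions.csv")          # one pass over disk
    # high = max(float(r["amount"]) for r in table)   # later questions hit memory
    # big  = [r for r in table if float(r["amount"]) > 1000]

The point of the in-memory approach Ames describes is the second half: once the load cost is paid, each new exploratory question avoids another trip to disk.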

  • Has SAS had to invest time and programming to make this move? How long has the process taken?

SAS has been developing our distributed in-memory technology over the last six or seven years. It started with Grid computing, then moved to in-database processing and now onto Hadoop. SAS created an ‘In-Memory Analytics’ division, reporting directly to Dr. Goodnight, to further the distributed computing and advanced analytic development for SAS.

More information about the SAS-Hortonworks alliance is available online.

[SAS ARCHIVE: Check out more than a decade of SAS stories as reported in WRALTechWire.]