As analytics aimed at helping companies make use of all their data proliferate, companies worldwide are embracing these new technologies and finding ever-more-innovative ways to use them. Those that fail to do so risk being left on the sidelines.

Author: Tiziana Barghini

“Know your customers.” For years it has been a business mantra that is easier to say than to follow. Now companies can put that maxim into practice with the help of big-data analytics. But as many firms embrace the endless possibilities that big data provides in improving their internal processes and their relations with customers, others will have to step up their game or risk getting left on the sidelines.

Andrews, IBM: Banks aim to predict customers’ life-changing events and offer them a service before they realize they need it.

A client complaining about her bank with a group of friends over coffee will probably go unnoticed, but one who posts those complaints on social media is likely to attract a better offer from a competitor. An electricity provider can show customers in real time how much power they are using and what it actually costs to keep the lights on. Banks, utilities, grocery chains and asset managers can understand their markets better than ever, thanks to a technology invented about 10 years ago that is slowly transforming the way business is done.

“Corporations are now able to understand who the customer is and what forces are impacting his or her behavior in doing or not doing business with them,” said Phani Nagarjuna, founder and chief executive officer of Neuvora, a start-up offering analytical services to Fortune 500 companies. “We tell these companies what they have to do to minimize the risks of their customers leaving them.”

Big-data analytics make all this possible. ‘Big data’ refers to all the digital information (facts and numbers, as unstructured as they may appear in real life) that is continuously updated, ever-changing and too vast to be contained on a single server or analyzed by a single computer. Big-data solutions, essentially, are software solutions aimed at analyzing those large data sets. From a corporate perspective, this could include all the data produced via internal workflows, exchanged externally between companies and their counterparties (including customers, banks, regulators and so on) and beyond. Research firm Gartner defines big data as “high-volume, high-velocity, and/or high-variety information assets that require new forms of processing to enable enhanced decision-making, insight discovery and process optimization.” Analytics solutions turn that data into innumerable outputs that can then be used to increase efficiency, drive sales, reduce risk or meet any of the myriad other goals still being defined by companies, analysts, consultants and innovative start-ups worldwide.
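To make the idea concrete, here is a minimal sketch (not drawn from any company in this article) of how one widely used engine in the Hadoop ecosystem, Apache Spark, can summarize transaction records spread across a cluster. The file path and column names are purely hypothetical.

# Minimal PySpark sketch; the path and columns (customer_id, amount, txn_date) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("customer-insight-sketch").getOrCreate()

# Load transaction records too large for one machine; Spark spreads the work
# across the nodes of the cluster.
transactions = spark.read.csv("hdfs:///data/transactions/*.csv",
                              header=True, inferSchema=True)

# One summary row per customer: total spend, number of transactions, latest activity.
summary = (transactions
           .groupBy("customer_id")
           .agg(F.sum("amount").alias("total_spend"),
                F.count("*").alias("txn_count"),
                F.max("txn_date").alias("last_seen")))

summary.show(10)

The same pattern scales from a laptop-sized sample to billions of rows, because the engine, not the analyst, decides how to split the work across machines.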

Such technology is becoming an embedded feature at many organizations, and soon companies across different industries and continents will find they cannot do without it if they want to stay competitive. At stake is not only the ability to get customer insights as granular as each individual but also the opportunity to save money with better forecasting, cheaper data management and smarter organizations. Similar to what happened when the World Wide Web became commonly used in the mid-1990s, big-data solutions are about to become an everyday tool for all.

“Big data is now seen as the new frontier of competition,” says Mark Torr, senior director of the EMEA and AP Analytical Platform Centre of Excellence at SAS. “But ultimately it will become a commodity, and everybody will have to deal with it.”

The technology to analyze big data originated with Google’s studies of how to handle the growing volume of information on the Web. The key engine that now handles that processing and storage is an open-source solution set named Hadoop, after a toy elephant that belonged to the son of its co-creator, Doug Cutting, who was working at Yahoo! at the time. Apache Hadoop distributions are now offered by IT firms Cloudera, Hortonworks and MapR, and countless software vendors build on the Hadoop framework to provide analytics solutions to their clients.

Large corporations (manufacturers such as Daimler, utilities such as British Gas, and banks such as the Netherlands’ ING, Austrian retail bank BAWAG and Spain’s Santander) are all using Hadoop to crunch their vast stores of data. According to presentations by their executives at the 2015 Hadoop summits in London and Brussels, some were looking to cut costs, and most were seeking new ways to connect with customers and smarter ways to run their businesses.

“Santander has pooled aggregate information on clients, unifying all available information, including structured and unstructured data and transaction activity. It plans to add to this public information on clients from outside their relationship with the bank. This information will be used to perform fraud analysis,” said an official source at Santander. “Our key vision for the Group is to obtain a 360-degree view of each individual customer.”
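Purely as an illustration of what such a 360-degree view might look like in code (the table and column names below are invented, not Santander’s), structured transaction data can be joined with a simple signal extracted from unstructured complaint texts:

# Hypothetical sketch of a unified per-customer view; all names are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("customer-360-sketch").getOrCreate()

# Structured data: account transactions.
txns = spark.read.parquet("hdfs:///data/transactions")
txn_summary = (txns.groupBy("customer_id")
                   .agg(F.sum("amount").alias("total_spend"),
                        F.count("*").alias("txn_count")))

# Unstructured data: free-text complaints, reduced here to a count per customer.
complaints = spark.read.json("hdfs:///data/complaints")
complaint_counts = (complaints.groupBy("customer_id").count()
                              .withColumnRenamed("count", "complaint_count"))

# One row per customer combining both sources; customers with no complaints get zero.
customer_360 = (txn_summary
                .join(complaint_counts, "customer_id", "left")
                .fillna({"complaint_count": 0}))

customer_360.show(10)

A fraud or churn model would then typically read from a unified table like this one rather than from the raw feeds.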

The financial industry, with its multitude of data, was the first to embrace big-data analysis and take up Hadoop after Cloudera launched the solution (or, more accurately, group of solutions) in 2008, says Steven Totman, a self-described big-data evangelist at Cloudera, the first Hadoop vendor. “Big data is not something new. In all these organizations it tends to be the data they already had, but that they never took value from before.”
