“Trickle, Trickle, Trickle” (“Simple”, “Fast”, “Cheap”) Part 1


Data analytics has trickled down from large corporations and is now readily available in the mainstream. No longer the sole preserve of big business, it is more accessible, more immediate and more affordable. When you think about it, computers were invented for analysing data, and "big data" has really been around for ages; it has just been rebranded. Organisations are swamped in data: a steady trickle from accounts and ERP packages swelled with the advent of email, e-commerce and the increasing use of CRM. Now, with new technologies able to scrape unstructured as well as structured data, and "big data" entering the lexicon of business language, we are waking up to a deluge. The challenge is where to start turning it all to business advantage. Firms will find different reasons to surface data for analytics; it's sector-specific and depends on where the company is coming from.


Take Accenture as an example: it is seeing a lot of focus on getting a better understanding of customers and customer behaviour. Companies are looking to leverage broad sets of data to get a holistic view that allows them to engage in a more personalised and targeted way. In manufacturing, Accenture sees a focus on operations, leveraging sensor-driven data to send out alerts on the imminent failure of a device. Financial institutions, meanwhile, focus on risk, identifying deviations in data that expose fraud sooner rather than later.

The key challenge for companies is trying to corral their data and get it in order: identifying the sources of the data and whether they can trust it. Despite all the hype around big data, organisations in Ireland are still relatively immature and struggling with the fundamentals. Accenture estimates that anywhere between 20 and 70 percent of a project is the data-cleansing piece, making the data ready for analytics. Long-established processes such as ETL (extract, transform, load) are still used, but what's changed is the expectation of faster results.

Making this possible are new technologies such as Hadoop that can crunch vast amounts of data quickly. If a bank wants to measure the impact of closing down a branch, for example, where you are not as concerned about the quality of the data but want a quick answer to a quick question, then Hadoop will do the job. It will be able to tell you the systemic impact of closing it down.
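To make the data-cleansing piece concrete, here is a minimal ETL sketch in Python. The source, field names and records are all hypothetical stand-ins, not anything from Accenture's toolkit; the point is simply that a large share of the work is the "transform" step, where records are normalised and de-duplicated before any analytics happens.

```python
# Minimal ETL sketch: raw records are "extracted", cleansed in the
# "transform" step (trimmed, normalised, de-duplicated), then "loaded".
# All field names and data are hypothetical.

def extract():
    # Stand-in for reading from an accounts, ERP or CRM source.
    return [
        {"name": "  Alice ", "email": "ALICE@EXAMPLE.COM"},
        {"name": "Bob", "email": "bob@example.com"},
        {"name": "alice", "email": "alice@example.com"},  # duplicate entry
    ]

def transform(records):
    # Cleansing: strip whitespace, normalise case on the matching key,
    # then drop duplicates by email address.
    seen, clean = set(), []
    for r in records:
        email = r["email"].strip().lower()
        if email in seen:
            continue
        seen.add(email)
        clean.append({"name": r["name"].strip().lower(), "email": email})
    return clean

def load(records):
    # Stand-in for writing to an analytics store.
    return records

cleansed = load(transform(extract()))
print(len(cleansed))  # 2 records survive cleansing
```

Even in this toy version, two of the three functions are plumbing and the transform step carries all the logic, which is consistent with cleansing dominating project effort.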

