[Note: I originally published this blog at the L&T Infotech blog site: www.lntinfotechblogs.com/Lists/Posts/Post.aspx?ID=38.]
Big data is often defined by three Vs: Volume, Variety, and Velocity. While this definition captures the essence of Big data, it becomes limiting when used to define the technologies that support it. These technologies can do much more than just handle 'Big data.' In fact, most enterprises can derive more value by using them for 'small data' analytics.
Besides handling a large variety of data, these technologies provide new analytical capabilities, including natural language processing, pattern recognition, machine learning, and much more. You can apply these capabilities to small (or 'not so large') data in non-traditional ways and get more value out of that data.
Here is how –
1. Create ‘Data labs’ rather than just a data warehouse
Big data technologies provide an advanced analytical environment. The focus is on analyzing data rather than structuring and storing it. Such an environment is a perfect sandbox for experts to experiment with data and derive intelligence from it. For example, insurance actuaries can derive specific patterns from claims history by linking external factors with loss events, and then define rules for pricing and loss prediction.
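As a hedged illustration of that kind of 'data lab' experiment, here is a minimal Python (pandas) sketch that links claims history with an external weather feed and looks for loss patterns. The file names and columns (claims.csv, weather.csv, storm_severity, loss_amount) are hypothetical stand-ins, not a prescribed schema.

```python
import pandas as pd

# Hypothetical inputs: claims history plus an external weather feed.
claims = pd.read_csv("claims.csv", parse_dates=["loss_date"])   # claim_id, region, loss_date, loss_amount
weather = pd.read_csv("weather.csv", parse_dates=["date"])      # region, date, storm_severity

# Link loss events with an external factor: storm severity by region and day.
linked = claims.merge(
    weather, left_on=["region", "loss_date"], right_on=["region", "date"], how="left"
)

# Look for a pattern: average loss per storm-severity band, per region.
linked["severity_band"] = pd.cut(linked["storm_severity"], bins=[0, 2, 4, 6])
pattern = (
    linked.groupby(["region", "severity_band"], observed=True)["loss_amount"]
    .agg(["mean", "count"])
)
print(pattern)
```

The point is the workflow, not the specific joins: in a data lab, an actuary can try a linkage like this in minutes, discard it, and try another, without waiting for a warehouse schema change.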
2. Don’t just predict, but adapt continuously to changing realities
Big data technologies provide machine learning capabilities that allow predictive models to be recalibrated continuously by comparing actual outcomes with predictions.
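A minimal sketch of this idea using scikit-learn's online-learning interface (partial_fit): the model is updated batch by batch as actual outcomes arrive, and prediction error is monitored against reality. The synthetic data stream and coefficients here are hypothetical stand-ins for a real feed.

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(0)
model = SGDRegressor(random_state=0)

# Initial fit on a first batch of (hypothetical) historical data.
X0 = rng.normal(size=(64, 3))
model.partial_fit(X0, X0 @ np.array([1.0, -2.0, 0.5]))

# As actual outcomes arrive, compare them with predictions and recalibrate.
for step in range(1, 101):
    X = rng.normal(size=(8, 3))                          # new observations
    actual = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=8)

    drift = np.mean(np.abs(model.predict(X) - actual))   # prediction vs. reality
    model.partial_fit(X, actual)                         # fold outcomes back into the model
    if step % 25 == 0:
        print(f"step {step}: mean absolute error {drift:.3f}")
```

The same pattern scales up on distributed platforms; the essential loop (predict, compare, recalibrate) is what keeps the model adapted to changing realities.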
3. Change ‘Forecasting’ to ‘Now-casting’
Big data technologies can analyze large streams of data in real time without hampering performance. This capability can be used to deliver 'real-time' analytics. For example, insurers can define new products that price premiums from real-time risk data emitted by sensors or telematics devices, rather than taking the traditional approach of calculating premiums from forecast risk.
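To make 'now-casting' concrete, here is a small Python sketch that prices each interval of driving from a live telematics reading instead of a yearly forecast. The readings are simulated, and the base rate and risk rules are hypothetical, not an actual pricing model.

```python
import itertools
import random

BASE_RATE = 0.05  # hypothetical base premium per km

def telematics_stream():
    """Simulated telematics readings; in production these arrive from a device feed."""
    while True:
        yield {"speed_kmh": random.uniform(30, 130), "hard_brakes": random.randint(0, 2)}

def risk_multiplier(reading):
    # Hypothetical rule: speeding and hard braking raise the instantaneous risk score.
    m = 1.0
    if reading["speed_kmh"] > 110:
        m += 0.3
    m += 0.2 * reading["hard_brakes"]
    return m

# Price each moment of driving from the live reading, not from a forecast.
for reading in itertools.islice(telematics_stream(), 10):
    premium_per_km = BASE_RATE * risk_multiplier(reading)
    print(f"speed={reading['speed_kmh']:.0f} km/h, brakes={reading['hard_brakes']} "
          f"-> premium {premium_per_km:.4f}/km")
```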
4. Don’t get constrained by a Data model
Have you ever endured the pain of living with a data model that no longer supports business requirements? Well, worry no more. Most Big data technologies support open formats and dynamic changes to data records to suit analytical needs.
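A small sketch of what an open format buys you: schemaless records can carry different fields, and new attributes can appear without a migration. The field names below are hypothetical; the pattern is the same whether the store is JSON files, a document database, or a data lake.

```python
import json

# Heterogeneous records: the later claim carries a field the original 'schema' never had.
records = [
    {"claim_id": 1, "loss_amount": 1200.0},
    {"claim_id": 2, "loss_amount": 800.0, "telematics_score": 0.7},  # new attribute, no ALTER TABLE
]

with open("claims.jsonl", "w") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")

# Analytics reads whatever fields are present, tolerating the ones that are missing.
with open("claims.jsonl") as f:
    for line in f:
        rec = json.loads(line)
        print(rec["claim_id"], rec["loss_amount"], rec.get("telematics_score", "n/a"))
```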
5. Forget Massive data movements
In Big data platforms, data is co-located with the analytical processing, so data movement is minimal. Forget about those large, multi-year ETL programs.
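As a hedged sketch of 'bring the processing to the data', here is what an in-place aggregation looks like with PySpark: the computation is shipped to the nodes holding the data, and only the small result travels back. The HDFS path and column names are hypothetical, and the snippet assumes a running Spark cluster.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("in-place-aggregation").getOrCreate()

# Read the data where it already lives; no extract/load into a separate warehouse.
claims = spark.read.parquet("hdfs:///data/claims")  # hypothetical path and schema

# The aggregation executes on the nodes that hold the data blocks;
# only the summarized result is collected back to the driver.
summary = claims.groupBy("region").agg(F.sum("loss_amount").alias("total_loss"))
summary.show()
```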
6. Save cost with low-cost commodity hardware
Large data warehousing and MDM programs often require expensive enterprise hardware and licensing to achieve the desired level of performance; this expense can be as much as 50% of your total cost of ownership (TCO). Big data platforms are designed to run on low-cost commodity hardware (including bursting to the cloud), and most are open source. This can slash your hardware and licensing costs significantly.
So the moral of the story is – Big data technologies provide many capabilities that make them an attractive choice for ‘small data’ analytics as well. Be innovative in leveraging these capabilities to complement your current analytics world.