Friday, November 8, 2013

Insurance: Regulations likely to bring back more focus on ‘Risk Management’ practices and global visibility

At recent G20 summits, the G20 leaders endorsed an integrated set of policy measures to address the risks that global systemically important financial institutions (G-SIFIs) pose to the global financial system. Accordingly, the Financial Stability Board (FSB), in conjunction with the International Association of Insurance Supervisors (IAIS), identified an initial list of nine Global Systemically Important Insurers (G-SIIs):
1) Allianz, 2) AIG, 3) Generali, 4) Aviva, 5) Axa, 6) MetLife, 7) Ping An, 8) Prudential Financial and 9) Prudential plc.

Basically, the FSB is trying to solve the 'too big to fail' problem by hand-picking large global institutions and subjecting them to a set of policy measures.


Although the initial focus is on these designated global SIFIs, the measures may eventually extend to all large, internationally active carriers. Many domestic regulators are likely to designate domestically important insurers and apply policy measures along similar lines. In June 2013, the U.S. Department of the Treasury's Financial Stability Oversight Council (FSOC) designated AIG and Prudential Financial as SIFIs; both will be subject to stricter regulatory standards and supervisory oversight under the 2010 Dodd-Frank Act.

The set of policy measures comprises:
• recovery and resolution planning requirements;
• enhanced group-wide supervision; and
• higher loss absorbency requirements (including for non-traditional, non-insurance (NTNI) subsidiaries).

The impact of these regulations is still being worked out; at a minimum, however, they require the following from an IT perspective:
• Integration of data sources for recovery/resolution planning and risk management across the group
• Identification of changes to risk management metrics, with visibility at both group and legal-entity level
• Identification of intra-group exposures (a rough sketch follows this list)
• An approach for ring-fencing NTNIs and development of a risk management plan and systems
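To make the third item more concrete, here is a minimal Python sketch of how intra-group exposures might be aggregated at group and legal-entity level. The entity names, exposure types and amounts are purely illustrative assumptions, not drawn from any regulation or real group structure.

# Minimal sketch: aggregating intra-group exposures across legal entities.
# The column names and sample data are hypothetical.
import pandas as pd

exposures = pd.DataFrame([
    {"lender_entity": "GroupRe",   "borrower_entity": "LifeCo US",    "type": "loan",       "amount": 120.0},
    {"lender_entity": "GroupRe",   "borrower_entity": "P&C Europe",   "type": "guarantee",  "amount": 75.0},
    {"lender_entity": "LifeCo US", "borrower_entity": "NTNI Capital", "type": "derivative", "amount": 40.0},
])

# Group-level view: total intra-group exposure by originating legal entity
by_entity = exposures.groupby("lender_entity")["amount"].sum()

# Flag exposures to the (hypothetical) non-traditional, non-insurance subsidiary
ntni_exposure = exposures.loc[exposures["borrower_entity"] == "NTNI Capital", "amount"].sum()

print(by_entity)
print(f"Total exposure to NTNI subsidiaries: {ntni_exposure}")

In practice this kind of roll-up would sit on top of the integrated data sources mentioned in the first item, with the same cut available per legal entity for supervisory reporting.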

In July 2014, systemically important reinsurers will be identified, and the designation is likely to have a similar impact. I will update this entry as we do more research on this.

Wednesday, September 18, 2013

New article: Big Data Trends 2014

The recent article in the Alsbridge Outsourcing Center, Big Data Trends 2014, includes some of my thoughts on how insurance companies can effectively leverage the superabundance of data - http://www.outsourcing-center.com/2013-09-big-data-trends-2014-new-uses-new-challenges-new-solutions-58181.html
Here are some excerpts:
“In this era of low interest rates, insurance companies need strong real-time analytics capabilities to achieve the elusive underwriting profit and sustained growth,” explained Amit Unde, chief architect and director of insurance solutions for L&T Infotech. “Going forward, the competitive battles will be played on the data turf. It’s the companies that leverage both external and internal Big Data, predictive analytics and adaptive underwriting models that will come out on top.”
“With Google Maps and location intelligence services, the underwriter can view a property from all angles and assess distance from a coastline, flood plain or other potential hazards. Online access to hundreds of different data sources, from videos to photos to loss trends and other documents, is now just a few clicks away,” Unde said. “But, without the right tools, mining this data is still a highly manual process.”
“I wouldn’t be surprised if, in the next five years, the next big player in the commercial insurance industry was a new company with a Big Data-driven automated policy issuance and claims payout model,” Unde said. “Automated decision-making has the potential to transform the industry, enabling small players to compete with large insurers based on their technology.”
“In the insurance industry, companies should validate against a set of rules or cross-verify against multiple sources,” Unde said. “However, in most cases, it doesn’t make sense for insurers to boil the ocean to get 100 percent data accuracy. It makes better sense to apply the 80/20 rule: achieve the desired accuracy for 80 percent of the dataset without intensive effort, then ask ‘did you mean’ questions in the remaining 20 percent of cases.”
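As a rough illustration of that 80/20 idea, here is a small Python sketch: records that pass cheap rule checks go straight through, and the rest trigger a 'did you mean' prompt instead of an exhaustive cleansing effort. The field name, reference list and similarity threshold are hypothetical assumptions.

# Minimal sketch of the 80/20 validation approach described above.
import difflib

KNOWN_CARRIERS = ["Allianz", "AIG", "Aviva", "AXA", "MetLife"]

def validate(record):
    """Return the record if it passes basic rules, else a 'did you mean' suggestion."""
    name = record.get("carrier", "").strip()
    if name in KNOWN_CARRIERS:
        return record, None                      # the ~80% that pass cheap rule checks
    # the remaining ~20%: ask a 'did you mean' question rather than chase 100% accuracy
    suggestion = difflib.get_close_matches(name, KNOWN_CARRIERS, n=1, cutoff=0.6)
    return None, (f"Did you mean '{suggestion[0]}'?" if suggestion else "Unknown carrier")

print(validate({"carrier": "MetLife"}))
print(validate({"carrier": "Alianz"}))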
Let me know what you think about the article.

Friday, August 9, 2013

Webinar : Aggregate, Visualize and Manage: The Fundamentals for Gaining A Single View of Risk

The ability to aggregate, visualize, understand and manage risk is fundamental to the profitability of insurance carriers and reinsurance companies. In some cases, it’s fundamental to legal compliance, overall solvency and long-term viability.
Yet it has been difficult to date for most insurers to gain a single, operational view of risk across their organizations. Until now…
Carriers are beginning to adopt comprehensive risk management solutions that help them overcome siloed operational structures, IT systems and data sources; integrate information across the enterprise; ensure its quality; enrich it with third-party data; and then present a single, map-based view of operational risk in near real time.
View this one-hour webinar (conducted on Wednesday, July 31, at 2:00 PM) as Rich Ward, Business Solution Architect with Pitney Bowes Software, and Amit Unde, Chief Architect, Insurance Solutions with L&T Infotech, discuss new trends in location intelligence technologies and how near-real-time geospatial analytics are drastically changing catastrophe modeling, underwriting and risk management practices.
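For readers who want a feel for the underlying idea, here is a toy Python sketch of a location-based view of exposure: policies are scored by distance from a reference coastline point and rolled up into a single at-risk figure. The coordinates, insured values and the 10 km threshold are made-up assumptions, not part of the webinar content.

# Toy sketch of a single, location-based view of exposure; all data is illustrative.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

COASTLINE_POINT = (25.76, -80.19)  # hypothetical shoreline reference point

policies = [
    {"policy_id": "P1", "lat": 25.79, "lon": -80.13, "tiv": 2_000_000},
    {"policy_id": "P2", "lat": 26.12, "lon": -80.40, "tiv": 1_200_000},
]

# Total insured value within 10 km of the coastline reference point
at_risk = sum(p["tiv"] for p in policies
              if haversine_km(p["lat"], p["lon"], *COASTLINE_POINT) <= 10)
print(f"TIV within 10 km of coastline: {at_risk}")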

Wednesday, September 12, 2012

Get more value from Big Data technologies – Use them for Small Data Analytics

[Note: I originally published this blog at the L&T Infotech blog site - www.lntinfotechblogs.com/Lists/Posts/Post.aspx?ID=38]

Big data is often defined by the three Vs: Volume, Variety and Velocity. While this definition captures the essence of Big data, it is limiting when used to define the technologies that support Big data. These technologies can do much more than handle ‘Big data.’ In fact, most enterprises can derive more value by using them for ‘small data’ analytics.
Besides handling a large variety of data, these technologies provide new analytical capabilities, including natural language processing, pattern recognition, machine learning and much more. You can use these capabilities effectively for small (or ‘not so large’) data in non-traditional ways and get more value out of it.
Here is how:
1. Create ‘Data labs’ rather than just a data warehouse
Big data technologies provide an advanced analytical environment. The focus is on analyzing the data rather than structuring and storing it. Such an environment gives experts a perfect sandbox to ‘experiment’ with data and derive intelligence from it. For example, insurance actuaries can derive specific patterns from claims history by linking external factors with loss events, and then define rules for pricing and loss prediction.
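A minimal sketch of that kind of data-lab experiment, using hypothetical claims and weather data, might look like this:

# Illustrative 'data lab' exploration: linking an external factor (storm days)
# with claims history. The data and column names are made up.
import pandas as pd

claims = pd.DataFrame({
    "month":      ["2013-06", "2013-07", "2013-08", "2013-09"],
    "loss_count": [14, 22, 35, 18],
})
weather = pd.DataFrame({
    "month":      ["2013-06", "2013-07", "2013-08", "2013-09"],
    "storm_days": [2, 4, 7, 3],
})

merged = claims.merge(weather, on="month")
# A quick experiment: how strongly do storm days track loss counts?
print(merged["loss_count"].corr(merged["storm_days"]))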
2. Don’t just predict, but adapt continuously to changing realities
Big data technologies provide machine learning capabilities that allow predictive models to be calibrated continuously by comparing actual outcomes with predictions.
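As a rough sketch of this feedback loop, here is an example using scikit-learn's incremental SGDRegressor as a stand-in for whatever modeling stack you actually run; the feature layout and the simulated monthly batches are assumptions.

# Minimal sketch of continuous calibration: each new batch of actual outcomes
# is scored against predictions and then fed back into the model.
import numpy as np
from sklearn.linear_model import SGDRegressor

model = SGDRegressor(random_state=0)

def recalibrate(model, features, actual_losses):
    """Compare predictions with actuals, then update the model incrementally."""
    if hasattr(model, "coef_"):                       # skip scoring on the very first batch
        error = np.abs(model.predict(features) - actual_losses).mean()
        print(f"mean absolute error before update: {error:.2f}")
    model.partial_fit(features, actual_losses)

# Simulated monthly batches: [exposure, risk_score] -> observed loss
rng = np.random.default_rng(0)
for _ in range(3):
    X = rng.random((100, 2))
    y = 3.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 0.1, 100)
    recalibrate(model, X, y)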
3. Change ‘Forecasting’ to ‘Now-casting’
Big data technologies can help analyze large streams of data in real time without hampering performance. This capability can be used effectively to provide ‘real-time’ analytics. For example, insurers can define new products that charge premiums based on real-time risk data emitted by sensors or telematics devices, rather than the traditional approach of calculating premiums based on forecasts of risk.
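A toy sketch of such a usage-based premium, with made-up thresholds and per-event loadings, might look like this:

# 'Now-casting': adjust a premium from streamed telematics readings instead of
# a long-horizon forecast. Rates and thresholds are illustrative assumptions.
BASE_MONTHLY_PREMIUM = 100.0

def usage_based_premium(events):
    """events: iterable of dicts emitted by a telematics device."""
    surcharge = 0.0
    for e in events:
        if e["speed_kmh"] > 120:
            surcharge += 0.50          # hypothetical per-event loading
        if e["hard_brakes"] > 2:
            surcharge += 0.25
    return BASE_MONTHLY_PREMIUM + surcharge

stream = [
    {"speed_kmh": 95,  "hard_brakes": 0},
    {"speed_kmh": 132, "hard_brakes": 3},
]
print(usage_based_premium(stream))   # 100 + 0.50 + 0.25 = 100.75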
4. Don’t get constrained by a Data model
Have you ever endured the pain of living with a data model that no longer supports business requirements? Well, don’t worry anymore. Most Big data technologies support ‘open’ formats and dynamic changes to data records to suit analytical needs.
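Here is a small schema-on-read sketch: records of different shapes sit side by side, and new fields are picked up at analysis time without a schema migration. The records are illustrative.

# Sketch of schema-on-read: structure is interpreted only when you analyze the data.
import json

raw_records = [
    '{"policy": "A1", "premium": 500}',
    '{"policy": "B2", "premium": 750, "telematics": {"miles": 820}}',   # new field, no migration
]

records = [json.loads(r) for r in raw_records]

# Analyse what is there; absent fields simply default instead of breaking a fixed model
total_miles = sum(r.get("telematics", {}).get("miles", 0) for r in records)
print(total_miles)   # 820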
5. Forget Massive data movements
In big data platforms, the data is co-located with the analytical processing, so data movement is minimal. Forget about those large, multi-year ETL programs.
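As one possible illustration, here is a minimal PySpark sketch in which the computation is shipped to wherever the claims data already lives (for example, on HDFS) rather than extracted into a separate warehouse. The path and column names are hypothetical.

# Sketch: run the aggregation where the data resides instead of moving the data.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims-summary").getOrCreate()

# Read claims in place; Spark pushes the work out to the nodes holding each partition.
claims = spark.read.json("hdfs:///data/claims/")        # hypothetical path

summary = (claims
           .groupBy("line_of_business")
           .agg(F.sum("paid_amount").alias("total_paid"),
                F.count("*").alias("claim_count")))

summary.show()
spark.stop()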
6. Save cost with low-cost commodity hardware
Large data warehousing and MDM programs often require expensive enterprise hardware and licensing to support the desired level of performance. This expense can be as much as 50% of your total cost of ownership (TCO). Big data platforms are designed to work with low-cost commodity hardware (including bursting to the cloud), and most are open source. This can help you slash hardware and licensing costs significantly.
So the moral of the story is – Big data technologies provide many capabilities that make them an attractive choice for ‘small data’ analytics as well. Be innovative in leveraging these capabilities to complement your current analytics world.