Solvency II Data Quality - have you solved the puzzle?

21.1.2016, Jani Antikainen

The EU delegated regulation for achieving uniform policyholder protection (among many other things) across the EU insurance business is finally here, after many delays. What makes this regulation interesting is that it is one of the very first regulations to promote the quality of information as a fundamental factor in producing even remotely believable calculation and reporting figures. Now, this really is something new and gives great hope that there will be more along these lines soon! Whoever set the regulation realized something rather simple: the quality of the data used for (any) calculation is rather crucial to the results. The realization is stunning – but only because it is only now, well past the year 2000, that requirements like the Solvency II (SII) data quality rules for technical provisions (Articles 19, 20 and 21) are seeing daylight. It has been realized that if the numbers you start with are right, then the results you get have a greater chance of reflecting reality – a staggering revelation!

Not an easy job, really

Cynicism aside, but not totally embracing humankind’s bright future either – it is a rather sad story that perhaps only a few of the (re-)insurance companies really saw the opportunity exposed here; the rest perceived it merely as a regulatory issue. It was and still is MUCH more! It is a solid game-changing opportunity for any (re-)insurance company to break out of its stagnation and develop the business and the operations delivering it. Smart companies realized that they had lots of issues with data quality and information management, many of which could easily be solved – and many more at least started on – under the banner of the ”must-do” SII regulation. The really smart ones realized what the data quality articles actually required: established data quality management practices across divisions and departments, from top to bottom and bottom to top – governance, metrics, processes, roles, and shared definitions of key terms like ”customer” and ”insurance product”. It required a whole framework; otherwise it was impossible to make effective.
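To make the ”metrics” part less abstract, here is a minimal sketch – purely illustrative, with invented field names and rules that come from me, not from the regulation – of what an explicit, shared data quality rule set and one crude metric over it might look like:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical, heavily simplified policy record; real SII data models are far richer.
@dataclass
class PolicyRecord:
    policy_id: str
    customer_id: str    # only meaningful if "customer" is defined the same way everywhere
    product_code: str   # ditto for "insurance product"
    premium: float
    inception_date: date

def check_record(rec: PolicyRecord) -> list[str]:
    """Return a list of rule violations for one record (empty list = clean)."""
    issues = []
    if not rec.customer_id:
        issues.append("missing customer_id")
    if rec.premium < 0:
        issues.append("negative premium")
    if rec.inception_date > date.today():
        issues.append("inception date in the future")
    return issues

def quality_score(records: list[PolicyRecord]) -> float:
    """Share of records passing all rules - one crude, reportable metric."""
    clean = sum(1 for r in records if not check_record(r))
    return clean / len(records) if records else 1.0

if __name__ == "__main__":
    sample = [
        PolicyRecord("P-1", "C-42", "MOTOR", 320.0, date(2015, 3, 1)),
        PolicyRecord("P-2", "", "MOTOR", -15.0, date(2015, 6, 1)),  # two violations
    ]
    print(f"quality score: {quality_score(sample):.2f}")  # prints 0.50
```

The point is not the code itself but that the rules and the metric are written down, versioned and shared across departments, instead of living in individual heads.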

Not many (re-)insurers were familiar with ”advanced” data quality management, let alone had anything functional established. Even today, only a few do. Perhaps not even you, dear reader; in fact, it is more likely that you do not than that you do – I have done field study on this. I consider myself a rather seasoned data management professional – at least by Finland’s general level of competence – and I had to spend rather a lot of time interpreting the requirements set by the regulation. It was not easy – they are very vague and abstract and offer looooots of room for interpretation. I was lucky – I had ready concepts in my head that I could utilize to turn the abstract into the concrete. Not everyone was that lucky, and many ended up with some PPTs on the network drive, hardly anything changed and no opportunity realized.

Not the final frontier, either

There are many data quality frameworks available that can be put to use. The concept of master data management (MDM), for example, can be adapted for the purpose – rather easily, in fact. Taking such an approach rather quickly makes sense of previously unintelligible sentences like ”the insufficiency of data is not due to inadequate internal processes and procedures of collecting, storing or validating data used for the valuation of technical provisions”. It means that you had to have figured out the whole lifecycle of the data, from its very first appearance all the way across functions and systems (operations) to the Solvency calculation (reporting). Really not that easy, especially if you did not understand what might solve the puzzle. It also meant that you had to have control over the data’s whole lifecycle and the ability to spot the places where the integrity of the data might get compromised (be it ICT systems or people), and it meant that you were required to make the processes of collecting and processing the data totally transparent and repeatable 1:1.
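As a sketch of what ”control over the data’s whole lifecycle” could mean in technical terms – again purely illustrative, with made-up system names rather than anyone’s actual architecture – imagine every hand-off stamping the data with a lineage entry, so that by the time a figure reaches the solvency calculation you can replay exactly where it came from:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageEvent:
    system: str        # e.g. "policy-admin", "reinsurance" (invented names)
    action: str        # what was done to the value at this hand-off
    timestamp: datetime

@dataclass
class TrackedValue:
    """A value that carries its own audit trail across system boundaries."""
    value: float
    lineage: list = field(default_factory=list)

    def transform(self, system: str, action: str, new_value: float) -> "TrackedValue":
        # Record who touched the value and why before changing it.
        self.lineage.append(LineageEvent(system, action, datetime.now(timezone.utc)))
        self.value = new_value
        return self

if __name__ == "__main__":
    premium = TrackedValue(100.0)
    premium.transform("policy-admin", "captured from proposal", 100.0)
    premium.transform("reinsurance", "net of 20% cession", 80.0)
    premium.transform("solvency-calc", "input to technical provisions", 80.0)
    for ev in premium.lineage:   # the transparent, repeatable trail the article demands
        print(ev.system, "-", ev.action)
```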

The example presented … is not easy to solve, but if you were able to solve it, it gave you a solid understanding of how your business information travels through operations: who changes it and why, why it is created separately many times over, why the data from one system conflicts with another, and why there are so many errors in the data – resulting in unneeded reserves, erroneous claim calculations and inconsistent reporting figures (you always had to question the figures first, right?), and so on. A huge opportunity to get rid of these nuisances and gain that competitive edge everybody is so much after. Or did you just settle for the PPTs, the nominal ”fulfilment” of the regulation, and crossing your fingers that the regulatory inspection bodies would not see through your smoke and mirrors? Hopefully not – perhaps they will not see through the mist yet (and they are much occupied, with too few resources), but one day they will, rest assured. And you will have lost the opportunity to build momentum for solving the ever-present-and-prevailing data quality issues.
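To illustrate the ”data from one system conflicts with another” point – once more a toy example with invented extracts, not a description of any real system – a reconciliation step simply keys the same entities from two systems and reports where they disagree:

```python
# Toy reconciliation: compare the same policies as seen by two systems.
# Dictionaries stand in for real extracts; all names and figures are invented.
policy_admin = {"P-1": 320.0, "P-2": 150.0, "P-3": 90.0}
claims_system = {"P-1": 320.0, "P-2": 175.0}            # P-2 disagrees, P-3 missing

def reconcile(a: dict, b: dict, tol: float = 0.01) -> list[str]:
    """Report mismatched and missing premiums between two system extracts."""
    findings = []
    for key in sorted(a.keys() | b.keys()):
        if key not in a or key not in b:
            findings.append(f"{key}: present in only one system")
        elif abs(a[key] - b[key]) > tol:
            findings.append(f"{key}: conflict ({a[key]} vs {b[key]})")
    return findings

for line in reconcile(policy_admin, claims_system):
    print(line)
# P-2: conflict (150.0 vs 175.0)
# P-3: present in only one system
```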

Time is running out, but there is still light, if you seize it!

Lo, the opportunity is not all lost: presenting the regulatory bodies with a solid plan for how you will improve your data management to meet the SII requirements will still save much of the day. But not for long – carpe diem and all that; it is needed now!

An easy start? I think Sparta and I have this pretty well covered, and ”been there, done that” makes things fast and repeatable – exactly what you need. Get in touch; help is available and offered. You can still be in the winners’ league rather than among those whose names appear on the blacklist of SII incriminations (not all publicity is good, after all).

Welcome, Solvency II – I hope to see other regulations recognize the utmost importance of data quality soon!

Jani Antikainen

Chairman of the Board