
Big Data for financial markets: regulatory bodies

The banking sector is one of the most heavily regulated in finance, yet the 2008 crisis left many doubts about the ability of regulatory bodies to prevent this type of disaster.

After the fall in the value of many large banks in 2008 (Santander, BBVA, Popular, CaixaBank and Bankinter lost 50% of their value), and even the bankruptcy of others (e.g. Lehman Brothers), regulatory agencies on both sides of the Atlantic had to negotiate common standards for both regulatory and accounting criteria.

It is thanks to these new criteria that the system began to see the light at the end of the tunnel: the concept of "expected loss" was introduced, obliging financial institutions to set aside certain provisions for each loan they grant.

These provisions are set aside to cover possible losses from unpaid loans. We say possible losses because a percentage of each loan granted is retained without knowing with certainty whether the loan will default. This seemingly logical concept was barely taken into account before, since financial institutions were not concerned with potential losses, only with losses that had already occurred.
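As a rough illustration of the "expected loss" concept: under the standard Basel-style definition it is computed as probability of default times loss given default times exposure at default. The formula is standard, but the specific figures below are our own illustrative assumption, not taken from the article:

```python
def expected_loss(pd_: float, lgd: float, ead: float) -> float:
    """Expected loss = probability of default x loss given default x exposure at default."""
    return pd_ * lgd * ead

# Illustrative loan (assumed figures): 120,000 EUR exposure,
# 4% probability of default, 45% loss given default.
provision = expected_loss(pd_=0.04, lgd=0.45, ead=120_000)
print(f"Provision to set aside: {provision:,.2f} EUR")
```

The bank would hold this amount against the loan from the day it is granted, rather than waiting for an actual default to be recognised.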

For the larger banks, measures were included that allow them to calculate their own provisions instead of relying on the calculations of external supervisors. These calculations are known as internal models, and they make a big difference for the institutions. For example, if supervisors, applying the general regulations, oblige a bank to provision 20% for mortgages with a high degree of default, internal models allow these provisions to be refined much further, so that each entity can determine its provision based on its own historical lending data.

As you can imagine, large banks store immense volumes of historical data. Big Data technologies are used to analyse and extract value from these data, so that by analysing their lending histories, these banks can establish their own provisioning models based on past experience.

For example: after applying this analysis, a major bank determines that mortgage loans of 100,000-150,000 €, granted to men under 30 who have worked for at least 5 years and have at least 20,000 € in savings, need a provision of 10% of the total credit. By contrast, the general models applied by supervisors establish a 15% provision for these loans.

If the bank presents these results to the supervisory bodies, it will be able to apply its internal provisions. In this way, through the use of Big Data techniques and technologies, the bank in question can set aside less money for each loan, leaving more liquidity available for its investments.
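The arithmetic behind that liquidity gain can be sketched with the article's own rates (10% internal vs 15% standard); the 120,000 € loan amount is an illustrative assumption within the stated 100,000-150,000 € range:

```python
def provision(loan_amount: float, rate: float) -> float:
    """Amount set aside for a single loan at the given provisioning rate."""
    return loan_amount * rate

loan = 120_000  # assumed mortgage within the article's 100,000-150,000 EUR segment

standard = provision(loan, 0.15)  # general supervisor model
internal = provision(loan, 0.10)  # bank's internal model
freed = standard - internal       # liquidity freed per loan

print(f"Standard model provision: {standard:,.0f} EUR")
print(f"Internal model provision: {internal:,.0f} EUR")
print(f"Liquidity freed per loan: {freed:,.0f} EUR")
```

Multiplied across a portfolio of thousands of such mortgages, that per-loan difference is what makes approved internal models so valuable to large banks.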

This need to process large amounts of data makes the financial sector a clear example of the use of Big Data technologies. Until now, the most advanced banks had incorporated these technologies above all for fraud detection and security; now all banks, large and small, are adopting them in order to be able to report properly to European regulators on their activities (it is estimated that over the last 7 years, 10 large banks worldwide paid about 43 billion dollars in fines for failing to comply with, or to report correctly on, CRR requirements). This trend now seems to be reversing: already in 2016, European banks were estimated to have invested some 16 billion dollars in software solutions alone.

Author: Iván Gómez Arnedo. LinkedIn.