Research

Research is part of our corporate culture. Below is a selection of our publications in journals, books and other outlets.

2014 – Maximum Likelihood Estimation of the correlation parameters for elliptical copulas

Working paper, 19 December 2014

Authors: Lorenzo Hernández, Jorge Tejero, Jaime Vinuesa

We present an algorithm to obtain the maximum likelihood estimates of the correlation parameters of elliptical copulas. Previously existing methods for this task were either fast but only approximate or exact but very time-consuming, especially for high-dimensional problems. Our proposal combines the advantages of both, since it obtains the exact estimates and its performance makes it suitable for most practical applications. The algorithm is given with explicit expressions for the Gaussian and Student’s t copulas.
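
To make the trade-off above concrete, here is a small sketch (ours, not the paper's algorithm) for the Gaussian case: it evaluates the exact Gaussian-copula log-likelihood and compares the fast approximate estimate, the correlation of the normal scores, with a direct numerical maximization in the bivariate case. All function names and parameter values are illustrative.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

def gaussian_copula_loglik(R, u):
    """Exact log-likelihood of a Gaussian copula with correlation
    matrix R on pseudo-observations u (n x d array in (0, 1))."""
    z = norm.ppf(u)                              # normal scores
    _, logdet = np.linalg.slogdet(R)
    M = np.linalg.inv(R) - np.eye(R.shape[0])
    quad = np.einsum('ij,jk,ik->i', z, M, z)     # z_i' (R^-1 - I) z_i
    return -0.5 * (u.shape[0] * logdet + quad.sum())

# --- bivariate example with synthetic data ---
rng = np.random.default_rng(0)
true_rho = 0.6
chol = np.linalg.cholesky(np.array([[1.0, true_rho], [true_rho, 1.0]]))
z = rng.standard_normal((5000, 2)) @ chol.T
u = norm.cdf(z)                                  # pseudo-observations

# Fast approximate estimate: correlation of the normal scores.
rho_fast = np.corrcoef(norm.ppf(u), rowvar=False)[0, 1]

# Exact MLE: maximize the likelihood over the single parameter rho.
res = minimize_scalar(
    lambda r: -gaussian_copula_loglik(np.array([[1.0, r], [r, 1.0]]), u),
    bounds=(-0.99, 0.99), method='bounded')
print(rho_fast, res.x)   # both should be close to 0.6
```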

Maximum Likelihood Estimation of the correlation parameters for elliptical copulas (arxiv.org)

2013 – Closed-form approximations for operational value-at-risk

Journal of Operational Risk, 19 December 2013.

Authors: Lorenzo Hernández, Jorge Tejero, Alberto Suárez and Santiago Carrillo-Menéndez.

In the loss distribution approach, operational risk is modeled in terms of the distribution of sums of independent random losses. The frequency count in the period of aggregation and the severities of the individual loss events are assumed to be independent of each other. Operational value-at-risk is then computed as a high percentile of the aggregate loss distribution. In this work we present a sequence of closed-form approximations to this measure of operational risk. These approximations are obtained by the truncation of a perturbative expansion of the percentile of the aggregate loss distribution at different orders. This expansion is valid when the aggregate loss is dominated by the maximum individual loss. This is the case in practice, because the loss severities are typically very heavy-tailed and can be modeled with subexponential distributions, such as the lognormal or the generalized Pareto distribution. The two lowest-order terms in the perturbative series are similar to the single-loss approximation and to the correction by the mean, respectively. Including higher-order terms leads to significant improvements in the quality of the approximation. Besides their accuracy and low computational cost, these closed-form expressions do not require that the moments of the severity distribution, including the mean, be finite.
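
For reference, the two lowest-order benchmarks the abstract mentions are easy to state in closed form. The sketch below (an illustration under an assumed Poisson-lognormal model, not the paper's higher-order expansion) compares the single-loss approximation and its mean correction with a Monte Carlo estimate of the aggregate-loss percentile.

```python
import numpy as np
from scipy.stats import lognorm

# Assumed model for illustration: Poisson(lam) frequency and
# lognormal(mu, sigma) severities; alpha = 99.9% as in op-risk practice.
lam, mu, sigma, alpha = 25.0, 10.0, 2.0, 0.999
sev = lognorm(s=sigma, scale=np.exp(mu))

# Single-loss approximation: VaR_alpha ~ F^{-1}(1 - (1 - alpha) / lam).
var_sla = sev.ppf(1.0 - (1.0 - alpha) / lam)

# Correction by the mean: add the expected sum of the other lam - 1
# losses (needs a finite severity mean, unlike the paper's expansion).
var_mean = var_sla + (lam - 1.0) * sev.mean()

# Monte Carlo reference for the aggregate-loss percentile.
rng = np.random.default_rng(1)
counts = rng.poisson(lam, size=100_000)
agg = np.array([rng.lognormal(mu, sigma, size=k).sum() for k in counts])
var_mc = np.quantile(agg, alpha)

print(f"SLA: {var_sla:.3e}  +mean: {var_mean:.3e}  MC: {var_mc:.3e}")
```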

http://www.risk.net/journal-of-operational-risk/technical-paper/2317749/closed-form-approximations-for-operational-value-at-risk

2013 – Percentiles of sums of heavy-tailed random variables: beyond the single-loss approximation

Springer US, Statistics and Computing, February 2013.

Authors: Lorenzo Hernández, Jorge Tejero, Alberto Suárez, Santiago Carrillo-Menéndez.

A perturbative approach is used to derive approximations of arbitrary order to estimate high percentiles of sums of positive independent random variables that exhibit heavy tails. Closed-form expressions for the successive approximations are obtained both when the number of terms in the sum is deterministic and when it is random. The zeroth order approximation is the percentile of the maximum term in the sum. Higher orders in the perturbative series involve the right-truncated moments of the individual random variables that appear in the sum. These censored moments are always finite. As a result, and in contrast to previous approximations proposed in the literature, the perturbative series has the same form regardless of whether these random variables have a finite mean or not. For high percentiles, and especially for heavier tails, the quality of the estimate improves as more terms are included in the series, up to a certain order. Beyond that order the convergence of the series deteriorates. Nevertheless, the approximations obtained by truncating the perturbative series at intermediate orders are remarkably accurate for a variety of distributions in a wide range of parameters.
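
The zeroth-order term is simple enough to state directly: for a deterministic number n of iid summands, the alpha-percentile of the maximum is F^{-1}(alpha^{1/n}). The sketch below checks that term against a Monte Carlo percentile of the sum, for an illustrative heavy-tailed (generalized Pareto) choice of F; the higher-order corrections involving right-truncated moments are not reproduced here.

```python
import numpy as np
from scipy.stats import genpareto

# Zeroth-order term for a deterministic number of summands: the
# alpha-percentile of the maximum of n iid variables, F^{-1}(alpha**(1/n)).
# All parameter values are illustrative only.
xi, n, alpha = 0.8, 10, 0.999          # heavy tail: infinite variance
dist = genpareto(c=xi)

q_max = dist.ppf(alpha ** (1.0 / n))   # zeroth-order approximation

# Monte Carlo percentile of the actual sum, for comparison.
rng = np.random.default_rng(2)
sums = dist.rvs(size=(200_000, n), random_state=rng).sum(axis=1)
q_mc = np.quantile(sums, alpha)

print(f"max-term approx: {q_max:.1f}   Monte Carlo: {q_mc:.1f}")
```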

http://dx.doi.org/10.1007/s11222-013-9376-6

2012 – Reconstructing heavy tailed distributions by splicing with maximum entropy in the mean

Journal of Operational Risk, 30 June 2012.

Authors: Santiago Carrillo Menéndez, Henryk Gzyl and Aldo Tagliani.

Sometimes it is not possible to obtain a single parametric density with the desired tail behavior to fit a given data set. Splicing two different parametric densities is a useful process in such cases. Since the two parts depend on local data, a question arises over how best to assemble the two parts so that the properties of the whole data set are taken into account. We propose an application of the method of maximum entropy in the mean to splice the two parts together in such a way that the resulting global density has the first two moments of the full data set.
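
A minimal sketch of the underlying splicing construction, assuming a lognormal body and a generalized Pareto tail of our own choosing; the maximum-entropy-in-the-mean step that the paper uses to assemble the splice so the global density matches the first two moments of the full data set is not reproduced here.

```python
import numpy as np
from scipy.stats import lognorm, genpareto

# A plain spliced density: lognormal body up to a threshold u, GPD tail
# above it.  The weight w fixes the probability mass below u; all
# parameter values are illustrative.
u, w = 5.0, 0.9
body = lognorm(s=1.0)                  # density for x <= u
tail = genpareto(c=0.5, loc=u)         # density for x > u

def spliced_pdf(x):
    x = np.asarray(x, dtype=float)
    below = w * body.pdf(x) / body.cdf(u)      # renormalized body
    above = (1.0 - w) * tail.pdf(x)            # tail carries mass 1 - w
    return np.where(x <= u, below, above)

# Sanity check: the spliced density integrates to (approximately) one.
grid, dx = np.linspace(1e-6, 500.0, 2_000_000, retstep=True)
print((spliced_pdf(grid) * dx).sum())          # ~ 1.0
```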

http://www.risk.net/journal-of-operational-risk/technical-paper/2186743/reconstructing-heavy-tailed-distributions-splicing-maximum-entropy-mean

2012 – Robust quantification of the exposure to operational risk: Bringing economic sense to economic capital

Computers & Operations Research, Volume 39, Issue 4, April 2012.

Authors: Santiago Carrillo Menéndez and Alberto Suárez.

Operational risk is commonly analyzed in terms of the distribution of aggregate yearly losses. Risk measures can then be computed as statistics of this distribution that focus on the region of extreme losses. Assuming independence among the operational risk events and between the likelihood that they occur and their magnitude, separate models are made for the frequency and for the severity of the losses. These are then combined to estimate the distribution of aggregate losses. While the detailed form of the frequency distribution does not significantly affect the risk analysis, the choice of model for the severity often has a significant impact on operational risk measures. For heavy-tailed distributions these measures are dominated by extreme losses, whose probability cannot be reliably extrapolated from the available data. With limited empirical evidence, it is difficult to distinguish among alternative models that produce very different values of the risk measures. Furthermore, the estimates obtained can be unstable and overly sensitive to the presence or absence of single extreme events. Setting a bound on the maximum amount that can be lost in a single event reduces the dependence on the distributional assumptions and improves the robustness and stability of the risk measures, while preserving their sensitivity to changes in the risk profile. This bound should be determined by expert assessment on the basis of economic arguments and validated by the regulator, so that it can be used as a control parameter in the risk analysis.
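
A small Monte Carlo illustration of the central point, under an assumed Poisson-lognormal model of our own choosing: capping the loss in any single event at a bound M markedly reduces the 99.9% percentile's dependence on the extreme tail. How the bound should enter the model is, as the abstract notes, a matter for expert assessment; here we simply cap the simulated severities.

```python
import numpy as np

# Capping simulated severities at an expert-set bound M is one way to
# impose a maximum single-event loss; parameter values are illustrative.
lam, mu, sigma, alpha, M = 25.0, 10.0, 2.5, 0.999, 5e7
rng = np.random.default_rng(3)

def var_aggregate(cap=None, n_sims=100_000):
    counts = rng.poisson(lam, size=n_sims)
    agg = np.empty(n_sims)
    for i, k in enumerate(counts):
        x = rng.lognormal(mu, sigma, size=k)
        if cap is not None:
            x = np.minimum(x, cap)       # bound on any single event
        agg[i] = x.sum()
    return np.quantile(agg, alpha)

print(f"uncapped VaR: {var_aggregate():.3e}")
print(f"capped VaR:   {var_aggregate(cap=M):.3e}")
```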

http://dx.doi.org/10.1016/j.cor.2010.10.001

2010 – Optimization Problems with Cardinality Constraints

Computational Intelligence in Optimization. Applications and Implementations, Volume 7, 2010.

Authors: Sergio García Moratilla, Rubén Ruiz-Torrubiano and Alberto Suárez.

In this article we review several hybrid techniques that can be used to accurately and efficiently solve large optimization problems with cardinality constraints. Exact methods, such as branch-and-bound, require lengthy computations and are, for this reason, infeasible in practice. As an alternative, this study focuses on approximate techniques that can identify near-optimal solutions at a reduced computational cost. Most of the methods considered encode the candidate solutions as sets. This representation, when used in conjunction with specially devised search operators, is especially suited to problems whose solution involves the selection of optimal subsets of specified cardinality. The performance of these techniques is illustrated in optimization problems of practical interest that arise in the fields of machine learning (pruning of ensembles of classifiers), quantitative finance (portfolio selection), time-series modeling (index tracking) and statistical data analysis (sparse principal component analysis).
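
As a minimal illustration of the set encoding with swap-type search operators, the sketch below runs a greedy local search over k-element subsets on a toy cardinality-constrained quadratic objective (loosely, an equally weighted k-asset minimum-variance portfolio). The objective and all parameters are our own; the chapter's hybrid techniques are considerably more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(4)
d, k = 50, 5
A = rng.standard_normal((d, d))
Q = A @ A.T                              # random covariance-like matrix

def cost(S):
    idx = sorted(S)
    w = np.full(k, 1.0 / k)              # equal weights on the subset
    return w @ Q[np.ix_(idx, idx)] @ w

def local_search(n_iter=2000):
    # Candidate solutions are k-element index sets; the search operator
    # swaps one element out and one element in.
    S = set(rng.choice(d, size=k, replace=False).tolist())
    best = cost(S)
    for _ in range(n_iter):
        out = rng.choice(sorted(S))                   # element leaving
        cand = rng.choice(sorted(set(range(d)) - S))  # element entering
        T = (S - {out}) | {cand}
        c = cost(T)
        if c < best:                     # greedily accept improving swaps
            S, best = T, c
    return S, best

S, best = local_search()
print(sorted(S), best)
```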

http://dx.doi.org/10.1007/978-3-642-12775-5_5

2008 – A theoretical comparison between moments and L-moments.

Authors: Nicolás Hernández Pérez, Santiago Carrillo Menéndez and Luis Seco.

Despite their popularity in applied statistics, standard measures of shape have long been recognized to be unsatisfactory, owing to their extreme sensitivity to outliers and poor sample efficiency. These difficulties seem to be largely overcome by a new system: the L-moments. During the last decade several authors have established the superior performance of L-moments over classical moments in heuristic studies, but until now no formal explanation has been provided. We address these issues from a theoretical viewpoint. Our comparative programme focuses on two aspects that highlight the statistical performance of a descriptive measure: qualitative robustness and global efficiency. L-moments are treated as members of a general class of descriptive measures that are shown to outperform conventional moments on these criteria. Consequently, L-moments may be considered appealing substitutes for the standard measures. Since the results obtained hold for rather large nonparametric sets of distribution functions, they unify previous heuristic studies.
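
L-moments are inexpensive to compute from the order statistics. The sketch below uses Hosking's standard probability-weighted-moment estimators (classical results, not specific to this paper) and illustrates the robustness claim: the L-moment ratios barely move when a gross outlier is injected, unlike classical skewness and kurtosis.

```python
import numpy as np

def sample_l_moments(x):
    """First four sample L-moments via probability-weighted moments
    (Hosking's unbiased estimators)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)                    # ranks of order statistics
    b0 = x.mean()
    b1 = np.sum((j - 1) * x) / (n * (n - 1))
    b2 = np.sum((j - 1) * (j - 2) * x) / (n * (n - 1) * (n - 2))
    b3 = np.sum((j - 1) * (j - 2) * (j - 3) * x) \
         / (n * (n - 1) * (n - 2) * (n - 3))
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2            # location, scale, L-skew, L-kurt

rng = np.random.default_rng(5)
x = rng.standard_normal(1000)
print(sample_l_moments(x))
print(sample_l_moments(np.append(x, 100.0)))   # with a gross outlier
```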

A theoretical comparison between moments and L-moments.pdf

2007 – Matemática financiera

Chapter of the book Matemáticas en la frontera.

Author: Santiago Carrillo Menéndez.

2007 – El entorno AMA: los datos y su tratamiento.

Chapter of the book La gestión del riesgo operacional. De la teoría a su aplicación.

Authors: Santiago Carrillo Menéndez, Mercedes Marhuenda Collado and Alberto Suárez González.

2006 – Medición efectiva del riesgo operacional

Estabilidad financiera, Volume 11, 2006.

Authors: Santiago Carrillo Menéndez and Alberto Suárez.

This paper studies some of the problems faced today by operational risk managers in the design and implementation of advanced measurement models. Using synthetic data, we analyze the impact of model risk, the difficulties in applying extreme value theory to fit the severity distribution and, finally, the validity of the numerical approximations in common use. The simulations show that, with operational loss databases of limited size, which is the situation many institutions currently face, it is difficult to distinguish between alternative models that yield very different estimates of regulatory capital. A direct application of extreme value theory, based on fitting Pareto distributions to loss events above a high threshold, tends to produce estimates that are very unstable and excessively large, so there are serious doubts about their economic relevance. Finally, the numerical approximations proposed in the literature may not be adequate in the ranges of values observed for actual operational losses. These observations highlight the need for caution when choosing models for the quantification of operational risk, and the importance of testing alternative models against empirical data.
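
The instability of threshold-based extreme value fits is easy to reproduce. The sketch below, on synthetic lognormal losses of our own choosing, fits a generalized Pareto distribution to the exceedances over several high thresholds and reports the implied 99.9% quantile, which can vary substantially with the threshold.

```python
import numpy as np
from scipy.stats import genpareto, lognorm

rng = np.random.default_rng(6)
losses = lognorm(s=2.0, scale=np.exp(10.0)).rvs(2000, random_state=rng)

for q in (0.80, 0.90, 0.95):
    u = np.quantile(losses, q)
    exc = losses[losses > u] - u                # peaks over threshold
    xi, _, beta = genpareto.fit(exc, floc=0)    # shape and scale
    p_exc = exc.size / losses.size              # P(X > u)
    # 99.9% quantile implied by the fitted POT model:
    var = u + beta / xi * ((p_exc / 0.001) ** xi - 1.0)
    print(f"u at {q:.0%}: xi = {xi:.2f}, implied 99.9% VaR = {var:.3e}")
```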

http://www.bde.es/f/webbde/Secciones/Publicaciones/InformesBoletinesRevistas/RevistaEstabilidadFinanciera/06/Fic/IEFnov06.pdf

2006 – Relaciones entre matemáticas y finanzas.

Encuentros multidisciplinares, Volume 8, Nº 23, 2006.

Authors: Santiago Carrillo Menéndez and Antonio Sánchez Calle.

http://www.encuentros-multidisciplinares.org/Revistan%C2%BA23/Indice_n%C2%BA_23_2006.htm

2006 – Modelos Multifactoriales en Riesgo de Crédito

Revista de economía financiera, Volume 10, 2006.

Authors: Pablo Blanco, Santiago Carrillo Menéndez, Antonio Sánchez Calle, César de Sánchez Lucas and Juan Ignacio Valdés Alcocer.

In the first part of this paper we study the impact, in terms of regulatory capital, of using multifactor models for credit risk. The second part is devoted to the study of a portfolio composed of companies from Moody's All Corporate universe, where we find some evidence of the existence of at least two factors.
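
As a rough illustration of why the number of systematic factors matters for capital, the sketch below compares the 99.9% loss quantile of a homogeneous portfolio under one-factor and two-factor Gaussian asset-value models. The model and all parameter values (default probability, correlation, sector split) are hypothetical, not those of the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
n_obligors, p_def, rho, n_sims = 500, 0.01, 0.2, 20_000
threshold = norm.ppf(p_def)       # obligor i defaults when X_i < threshold

def loss_fractions(n_factors):
    sector = np.arange(n_obligors) % n_factors       # even sector split
    F = rng.standard_normal((n_sims, n_factors))     # systematic factors
    eps = rng.standard_normal((n_sims, n_obligors))  # idiosyncratic noise
    X = np.sqrt(rho) * F[:, sector] + np.sqrt(1.0 - rho) * eps
    return (X < threshold).mean(axis=1)              # portfolio loss rate

for m in (1, 2):
    q = np.quantile(loss_fractions(m), 0.999)
    print(f"{m}-factor 99.9% loss quantile: {q:.3%}")
```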

http://www.aefin.es/aefin_data/ref_mun.asp?n=10

2005 – Basilea II: Una Mirada Crítica. Los retos de la industria bancaria.

Mediterráneo Económico, Volume 8, 2005.

Author: Santiago Carrillo Menéndez.

By the time this book appears, more than a year will have passed since the publication of the Basel Committee's final document. It would be pretentious, and of little use, to attempt to summarize here the contents of a text that has been so widely presented, debated and interpreted in recent times. Accordingly, the aim of these lines is to highlight some of its most novel aspects (not always emphasized in the literature) and to analyze some gaps that stand in the way of a fuller convergence between regulatory capital and economic capital.

http://www.publicacionescajamar.es/pdf/publicaciones-periodicas/mediterraneo-economico/8/8-124.pdf

2003 – Computational Tools for the Analysis of Market Risk

Computational Economics, Volume 21, 2003.

Authors: Santiago Carrillo Menéndez and Alberto Suárez.

The estimation and management of risk is an important and complex task faced by market regulators and financial institutions. Accurate and reliable quantitative measures of risk are needed to minimize undesirable effects on a given portfolio from large fluctuations in market conditions. To accomplish this, a series of computational tools has been designed, implemented, and incorporated into MatRisk, an integrated environment for risk assessment developed in MATLAB. Besides standard measures, such as Value at Risk (VaR), the application includes other more sophisticated risk measures that address the inability of VaR to properly characterize the structure of risk. Conditional risk measures can also be estimated for autoregressive models with heteroskedasticity, including some novel mixture models. These tools are illustrated with a comprehensive risk analysis of the Spanish IBEX35 stock index.
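
We cannot reproduce MatRisk itself, but the contrast the abstract draws is easy to illustrate with one classical example of a measure that goes beyond VaR: expected shortfall looks past the percentile into the tail. The sketch below computes both by historical simulation on synthetic fat-tailed returns standing in for IBEX35 data.

```python
import numpy as np

rng = np.random.default_rng(8)
returns = rng.standard_t(df=4, size=2500) * 0.01   # fat-tailed returns

alpha = 0.99
losses = -returns
var = np.quantile(losses, alpha)                   # historical VaR
es = losses[losses > var].mean()                   # expected shortfall

print(f"99% VaR: {var:.4f}   99% ES: {es:.4f}")
```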

http://dx.doi.org/10.1023/A%3A1022267720606

2003 – New Families of Distributions fitting L-moments for Modelling Financial Data

Authors: Nicolás Hernández Pérez, Santiago Carrillo Menéndez and Luis Seco.

The classical problem in statistics of estimating an unknown distribution from a given series of observations is approached from the point of view of interpolating primary features of the shape of the distribution. Unlike traditional approaches that aim at matching descriptive measures based on algebraic moments, we choose to match more robust statistics: the L-moments. Our main contribution is to present a new system of parametric families that are capable of interpolating an arbitrary finite set of L-moments. Our methodology is based on the representation of certain subsets of quantile functions by means of positive measures and on the concept of entropy for making up for missing information. This approach also makes it possible to incorporate additional constraints of a more qualitative nature, such as unimodality.
The calibration of these families is accomplished by simply matching sample L-moments. We justify the feasibility of this method almost surely, and discuss a numerically tractable algorithm for its implementation.
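
The paper's families interpolate arbitrary finite sets of L-moments; as a much simpler stand-in that conveys the calibration step, the sketch below fits a two-parameter Gumbel distribution by matching its first two L-moments, whose closed forms are classical (lambda_1 = loc + gamma * scale and lambda_2 = scale * ln 2, with gamma the Euler-Mascheroni constant).

```python
import numpy as np
from scipy.stats import gumbel_r

EULER = 0.5772156649015329        # Euler-Mascheroni constant

def sample_l12(x):
    """First two sample L-moments via probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    b0 = x.mean()
    b1 = np.sum(np.arange(n) * x) / (n * (n - 1))
    return b0, 2.0 * b1 - b0

rng = np.random.default_rng(9)
data = gumbel_r(loc=3.0, scale=2.0).rvs(5000, random_state=rng)

l1, l2 = sample_l12(data)
scale = l2 / np.log(2.0)          # from lambda_2 = scale * ln 2
loc = l1 - EULER * scale          # from lambda_1 = loc + gamma * scale
print(loc, scale)                 # should be close to 3.0 and 2.0
```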

New Families of Distributions fitting L-moments for Modelling Financial Data.pdf

2002 – Operational Risk: identify, measure, manage.

Chapter of the book Operational Resilience: the art of risk management.

Authors: Pierre Pourquery, Matthew Hilbert and Santiago Carrillo Menéndez.

2001 – Nuevos retos en la medición del riesgo de mercado

Perspectivas del sistema financiero, Nº 72, 2001.

Authors: Santiago Carrillo Menéndez and Prosper Lamothe.

http://www.funcas.es/Publicaciones/Detalle.aspx?IdArt=9736

1998 – Valoración de opciones asiáticas y doble barrera