GUEGAN Dominique

Affiliations
  • 2016 - 2019
    IPAG Business School
  • 2015 - 2020
    Ca Foscari University of Venice
  • 2012 - 2020
    Centre d'économie de la Sorbonne
  • 2018 - 2019
    University of Economics Ho Chi Minh City
  • 2012 - 2013
    Ecole d'économie de Paris
  • Multivariate radial symmetry of copula functions: finite sample comparison in the i.i.d case.

    Monica BILLIO, Lorenzo FRATTAROLO, Dominique GUEGAN
    Dependence Modeling | 2021
    No summary available.
  • A Note on the Interpretability of Machine Learning Algorithms.

    Dominique GUEGAN
    2020
    We are interested in the analysis of the concept of interpretability associated with an ML algorithm. We distinguish between the "How", i.e., how a black box or a very complex algorithm works, and the "Why", i.e., why an algorithm produces such a result. These questions appeal to many actors: users, professions and regulators, among others. Using a formal standardized framework, we indicate the solutions that exist by specifying which elements of the supply chain are impacted when we provide answers to the previous questions. This presentation, by standardizing the notations, allows us to compare the different approaches and to highlight the specificities of each of them: both their objective and their process. The study is not exhaustive and the subject is far from being closed.
  • Hedging in alternative markets

    Rostislav HALIPLII, Dominique GUEGAN, Marius Christian FRUNZA, Catherine BRUNEAU, Julien CHEVALLIER, Stephane GOUTTE
    2020
    The research in this thesis focuses on two alternative markets: cryptocurrencies and oil products. Most alternative markets are far from efficient, and this generates many modeling challenges. Models based on Gaussian distributions are still the most popular choice for quantitative financial analysts and are implemented even in markets that are far from efficient. A robust modeling framework for alternative assets must start with a non-Gaussian distribution. Therefore, throughout this thesis, the general theme of all simulations and estimations is the use of generalized hyperbolic distributions. This approach has a dual justification. On the one hand, it is essential to develop a sharp quantitative framework beyond the Gaussian universe, testing the performance of the new model in real situations. On the other hand, the markets that are the subject of this research (oil products and cryptocurrencies) have neither the fundamentals nor the empirical behavior that could justify traditional modeling.
  • Blockchain seminar: Risk and Blockchain.

    Dominique GUEGAN
    Blockchain seminar: Risk and Blockchain | 2019
    No summary available.
  • Assessing tail risk for nonlinear dependence of MSCI sector indices: A copula three-stage approach.

    Giovanni de LUCA, Dominique GUEGAN, Giorgia RIVIECCIO
    Finance Research Letters | 2019
    We propose a copula-based three-stage estimation technique to describe the serial and cross-sectional nonlinear dependence among multiple financial time series, exploring the existence of tail risk. On MSCI World Sector Indices, we find that the approach outperforms classical Vector AutoRegressive models; we discuss the implications of misspecified assumptions for the margins and/or the joint distribution, and provide tail dependence measures for the financial variables involved in the analysis.
  • Crypto assets: the role of ICO tokens within a well-diversified portfolio.

    Saman ADHAMI, Dominique GUEGAN
    2019
    This paper reexamines the discussion on blockchain technology, crypto assets and ICOs, providing also evidence that in crypto markets there are currently two classes of assets, namely standalone cryptocurrencies (or 'coins') and tokens, which result from an ICO and are intrinsically linked to the performance of the issuing company or venture. While the former have been the subject of various empirical studies regarding their price dynamics and their effect on the variance of a well-diversified portfolio, no such study has been done to analyze listed tokens, which in our sample number over 700 with a backing of about $17.3Bn from their respective ICOs. Therefore, investors interested in optimizing their portfolios should first assess the diversifier, hedge or safe haven role of tokens vis-à-vis traditional assets, on top of 'coins', in order to sensibly use this new asset class. After constructing various indices to represent both the token asset class as a whole and its sub-classes, we model dynamic conditional correlations among all the assets in our sample to obtain time-varying correlations for each token-asset pair. We find that tokens are effective diversifiers but not a hedge or a safe haven asset. We show that tokens retain important systematic differences with the two other asset classes to which they are most often compared, namely 'coins' and equities.
  • A probative value for authentication use case blockchain.

    Dominique GUEGAN, Christophe HENOT
    Digital Finance | 2019
    The Fintech industry has facilitated the development of companies using blockchain technology. The use of this technology inside the banking system and industry opens the route to several questions regarding business activity, the legal environment, and insurance devices. In this paper, considering the creation of small companies seeking to develop their business with a public blockchain, we analyse from different aspects why a company (in the banking or insurance system, or industry) decides that one blockchain protocol is more legitimate than another for the business it wants to develop, from a legal (in case of dispute) point of view. We associate with each blockchain a probative value which makes it possible to establish, in case of dispute, that a transaction has actually been carried out. We illustrate our proposal using 13 blockchains, providing a ranking between these blockchains for their use in a business environment. We complement this probative value with some main characteristics of any blockchain, such as market capitalization and log-return volatility, which investors also need to take into account alongside the new probative value for their managerial strategy.
  • Variable selection and forecasting via automated methods for linear models: LASSO/adaLASSO and Autometrics.

    Camila EPPRECHT, Dominique GUEGAN, Alvaro VEIGA, Joel CORREA DA ROSA
    Communications in Statistics - Simulation and Computation | 2019
    No summary available.
  • Risks and Blockchain.

    Dominique GUEGAN
    1st International Symposium on Entrepreneurship, Blockchain and Crypto-Finance | 2019
    No summary available.
  • Credit Risk Analysis using Machine and Deep Learning Models.

    Dominique GUEGAN
    Credit Risk Analysis Using Machine and Deep Learning Models | 2019
    Due to the advanced technology associated with Big Data, data availability and computing power, most banks or lending financial institutions are renewing their business models. Credit risk predictions, monitoring, model reliability and effective loan processing are key to decision making and transparency. In this work, we build binary classifiers based on machine and deep learning models on real data to predict loan default probability. The top 10 most important features from these models are selected and then used in the modelling process to test the stability of the binary classifiers by comparing their performance on separate data. We observe that tree-based models are more stable than models based on multilayer artificial neural networks. This opens several questions relative to the intensive use of deep learning systems in enterprises.
  • Operational risk in blockchain payments.

    Dominique GUEGAN
    Fin-Tech HO2020 European Project: FINTECH Risk Management | 2019
    No summary available.
  • Fintech and Blockchain.

    Dominique GUEGAN
    Reading seminars 2018-2019 | 2019
    The recent development of crypto-assets has made interest in blockchain technology grow. To understand the interest of this innovation for the economy and finance, it is important to study the foundations of the concepts behind blockchain technology. First, we introduce the notion of peer-to-peer lending, which is one component of the question, and then we explain the concept of Bitcoin, the first crypto-asset, which was developed based on a PoW (Proof-of-Work) protocol. We analyse at the same time the risks associated with this crypto-asset and the regulation that exists. In a second step, we present and analyse different classes of blockchains and the mechanisms associated with them. We introduce the distinction between closed and open, public and private blockchains, which permits us to introduce the different kinds of protocols that govern these blockchains. Finally, we introduce the concept of the ICO, a new way to raise funds using cryptocurrencies. An analysis of the governance of this new kind of start-up is provided, the risks associated with ICOs are analysed, and the recent regulation on them is presented.
  • Big Data, Artificial Intelligence and Blockchain.

    Dominique GUEGAN
    Big Data, Artificial Intelligence and Blockchain | 2019
    1 - Big Data and regulatory learning: How to supervise deep learning models? This seminar presents (i) the notion of Big Data, (ii) the architecture indispensable for the use of Big Data, (iii) the concept of artificial intelligence and the models associated with it, (iv) the question of the existence of a regulatory framework, and (v) the study of a use case developed inside the banking system concerning credit scoring. 2 - Crypto-currencies and the challenge for financial regulation: the example of Bitcoin. After defining the notions of peer-to-peer lending and distributed ledger, the Bitcoin concept is introduced. Regulation around crypto-assets is analysed: remote and immediate risks as well as risks for users. 3 - The digital world: blockchain and ICOs. In this talk we analyse the concept of blockchain and the different classes of blockchain with their properties. The regulatory framework is presented. Initial Coin Offerings are also discussed, with their interest and limits.
  • Risk Measurement.

    Dominique GUEGAN, Bertrand k. HASSANI
    2019
    This book combines theory and practice to analyze risk measurement from different points of view. The limitations of a model depend on the framework on which it has been built as well as specific assumptions, and risk managers need to be aware of these when assessing risks. The authors investigate the impact of these limitations, propose an alternative way of thinking that challenges traditional assumptions, and also provide novel solutions. Starting with the traditional Value at Risk (VaR) model and its limitations, the book discusses concepts like the expected shortfall, the spectral measure, the use of the spectrum, and the distortion risk measures from both a univariate and a multivariate perspective.
  • The other side of the Coin: Risks of the Libra Blockchain.

    Louis ABRAHAM, Dominique GUEGAN
    2019
    Libra was presented as a cryptocurrency on June 18, 2019 by Facebook. On the same day, Facebook announced plans for Calibra, a subsidiary in charge of the development of an electronic wallet and financial services. In view of the primary risk to sovereignty posed by the creation of Libra, the central banks quickly took very clear positions against the project and addressed many questions to the project's leadership, focusing on regulatory aspects and national sovereignty. The purpose of this paper is to provide a holistic analysis of the project that encompasses several aspects of its implementation and the issues it raises. We address a set of questions concerning the cryptocurrency environment and the blockchain technology that supports the Libra project. We identify the main risks, considering at the same time political risk, financial risks, economic risks, technological risks and ethics, focusing on the governance of the project at two levels: one for the Association and the other for the Libra Blockchain. We emphasize the difficulty of regulating such a project, since it will depend on several countries whose legislations are very different. The future of this kind of project is discussed through the emergence of Central Bank Digital Currencies.
  • Artificial Intelligence, Data, Ethics: An Holistic Approach for Risks and Regulation.

    Alexis BOGROFF, Dominique GUEGAN
    2019
    An extensive list of risks relative to big data frameworks and their use through models of artificial intelligence is provided, along with measurements and implementable solutions. Bias, interpretability and ethics are studied in depth, with several interpretations from the point of view of developers, companies and regulators. Our reflections suggest that fragmented frameworks increase the risks of model misspecification, opacity and bias in the results. Domain experts and statisticians need to be involved in the whole process, as the business objective must drive each decision from the data extraction step to the final actionable prediction. We propose a holistic and original approach to take into account the risks encountered all along the implementation of systems using artificial intelligence, from the choice of the data and the selection of the algorithm to the decision making.
  • Initial Crypto-asset Offerings (ICOs), tokenization and corporate governance.

    Stephane BLEMUS, Dominique GUEGAN
    2019
    This paper discusses the potential impacts of the so-called “initial coin offerings”, and of several developments based on distributed ledger technology (“DLT”), on corporate governance. While many academic papers focus mainly on the legal qualification of DLT and crypto-assets, and most notably in relation to the potential definition of the latter as securities/financial instruments, the authors analyze some of the use cases based on DLT technology and their potential for significant changes of the corporate governance analyses. This article studies the consequences due to the emergence of new kinds of firm stakeholders, i.e. the crypto-assets holders, on the governance of small and medium-sized enterprises (“SMEs”) as well as of publicly traded companies. Since early 2016, a new way of raising funds has rapidly emerged as a major issue for FinTech founders and financial regulators. Frequently referred to as initial coin offerings, Initial Token Offerings (“ITO”), Token Generation Events (“TGE”) or simply “token sales”, we use in our paper the terminology Initial Crypto-asset Offerings (“ICO”), as it describes more effectively than “initial coin offerings” the vast diversity of assets that could be created and which goes far beyond the payment instrument issue.
  • More accurate measurement for enhanced controls: VaR vs ES?

    Dominique GUEGAN, Bertrand k. HASSANI
    Journal of International Financial Markets, Institutions and Money | 2018
    This paper (this work was achieved through the Laboratory of Excellence on Financial Regulation (Labex ReFi) supported by PRESheSam under the reference ANR-10-LABEX-0095) analyses how risks are measured in financial institutions, for instance Market, Credit, Operational, among others with respect to the choice of risk measures, the choice of distributions used to model them and the level of confidence selected. We discuss and illustrate the characteristics, the paradoxes and the issues observed, comparing the Value-at-Risk and the Expected Shortfall in practice. This paper is built as a differential diagnosis and aims at discussing the reliability of the risk measures and making some recommendations. (This paper has been written in a very particular period of time as most regulatory papers written in the past 20 years are currently being questioned by both practitioners and regulators themselves. Some disarray has been observed among risk managers as most models required by the regulation have not been consistent with their own objective of risk management. The enlightenment brought by this paper is based on an academic analysis of the issues engendered by some pieces of regulation and its purpose is not to create any sort of controversy).
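    The VaR/ES comparison discussed in this abstract can be illustrated numerically. Below is a minimal sketch (not taken from the paper) that estimates both measures historically from simulated fat-tailed returns; the Student-t distribution and the 99% confidence level are illustrative assumptions:

```python
import numpy as np

def var_es(returns, alpha=0.99):
    """Historical Value-at-Risk and Expected Shortfall (losses counted as positive)."""
    losses = np.sort(-np.asarray(returns))      # losses in ascending order
    k = int(np.ceil(alpha * len(losses))) - 1   # index of the alpha-quantile
    var = losses[k]                             # VaR: the alpha-quantile of losses
    es = losses[k:].mean()                      # ES: average loss beyond the VaR
    return var, es

rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=100_000) * 0.01  # fat-tailed daily returns
var99, es99 = var_es(returns, alpha=0.99)
```

    At the same confidence level the Expected Shortfall is always at least as large as the VaR, since it averages the losses beyond the VaR quantile; this gap is one ingredient of the paradoxes the paper discusses.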
  • Regulatory learning: How to supervise machine learning models? An application to credit scoring.

    Dominique GUEGAN, Bertrand HASSANI
    The Journal of Finance and Data Science | 2018
    The arrival of Big Data strategies is threatening the latest trends in financial regulation related to the simplification of models and the enhancement of the comparability of approaches chosen by financial institutions. Indeed, the intrinsic dynamic philosophy of Big Data strategies is almost incompatible with the current legal and regulatory framework as illustrated in this paper. Besides, as presented in our application to credit scoring, the model selection may also evolve dynamically forcing both practitioners and regulators to develop libraries of models, strategies allowing to switch from one to the other as well as supervising approaches allowing financial institutions to innovate in a risk mitigated environment. The purpose of this paper is therefore to analyse the issues related to the Big Data environment and in particular to machine learning models highlighting the issues present in the current framework confronting the data flows, the model selection process and the necessity to generate appropriate outcomes.
  • Credit Risk Analysis Using Machine and Deep Learning Models.

    Dominique GUEGAN, Peter ADDO, Bertrand HASSANI
    Risks | 2018
    Due to the advanced technology associated with Big Data, data availability and computing power, most banks or lending institutions are renewing their business models. Credit risk predictions, monitoring, model reliability and effective loan processing are key to decision-making and transparency. In this work, we build binary classifiers based on machine and deep learning models on real data in predicting loan default probability. The top 10 important features from these models are selected and then used in the modeling process to test the stability of binary classifiers by comparing their performance on separate data. We observe that the tree-based models are more stable than the models based on multilayer artificial neural networks. This opens several questions relative to the intensive use of deep learning systems in enterprises.
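    The pipeline described in the abstract (fit a classifier, select the top 10 features, refit, and compare performance on held-out data) can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the authors' code or dataset:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a loan-default dataset (the paper uses real loan data)
X, y = make_classification(n_samples=5000, n_features=30, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# 1. Fit a tree-based binary classifier on all features
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
auc_full = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])

# 2. Keep the 10 most important features and refit
top10 = np.argsort(clf.feature_importances_)[-10:]
clf10 = RandomForestClassifier(n_estimators=200, random_state=0)
clf10.fit(X_tr[:, top10], y_tr)
auc_top10 = roc_auc_score(y_te, clf10.predict_proba(X_te[:, top10])[:, 1])

# 3. A stable classifier keeps similar held-out AUC after the feature cut
```

    The paper's stability comparison repeats this exercise for several model families (tree-based versus neural) on separate data splits.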
  • Credit Risk Analysis using Machine and Deep Learning models.

    Peter ADDO, Dominique GUEGAN, Bertrand HASSANI
    2018
    Due to the advanced technology associated with Big Data, data availability and computing power, most banks or lending financial institutions are renewing their business models. Credit risk predictions, monitoring, model reliability and effective loan processing are key to decision making and transparency. In this work, we build binary classifiers based on machine and deep learning models on real data to predict loan default probability. The top 10 most important features from these models are selected and then used in the modelling process to test the stability of the binary classifiers by comparing their performance on separate data. We observe that tree-based models are more stable than models based on multilayer artificial neural networks. This opens several questions relative to the intensive use of deep learning systems in enterprises.
  • The Digital World: II - Alternatives to the Bitcoin Blockchain.

    Dominique GUEGAN
    Bankers Markets & Investors : an academic & professional review | 2018
    No summary available.
  • Measuring risk in an explosive environment.

    Dominique GUEGAN, Robin KRUSE-BECHER, Hans-Jorg VON METTENHEIM, Christoph WEGENER
    Vietnam Symposium in Banking and Finance (VSBF) | 2018
    Financial asset bubbles can be characterized by periods of expansion and collapse. Expansions are often modeled as explosive processes for the asset price. Ignoring such explosiveness leads to misspecified Value-at-Risk (VaR) and related measures, e.g. Expected Shortfall. Considering an explosive autoregressive model, we find that the unadjusted downside VaR is overestimated in explosive periods and also misspecified during the collapse. The form of the misspecification strongly depends on several factors: (i) the horizon of the VaR forecast, (ii) the duration and strength of the explosive regime (as measured by the length of the explosive subsample and the explosive root), and (iii) the nature of the collapse. The size of the effects (in terms of capital requirements) is quantified by means of an extensive Monte Carlo simulation study. We propose a correction term to be added to the VaR which accounts for the unexpected loss due to a burst. In our empirical applications, we demonstrate the merits and limits of the suggested VaR adjustments, which have to be taken into account for management purposes.
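    The mechanism can be illustrated with a stylized one-step-ahead Gaussian VaR under an explosive AR(1); the parameter values below are illustrative and this is not the paper's Monte Carlo design:

```python
import numpy as np

rng = np.random.default_rng(1)
phi, sigma, n = 1.02, 1.0, 200           # explosive root phi > 1 (assumption)

# Simulate an explosive AR(1) "bubble" price path: P_t = phi * P_{t-1} + eps_t
p = np.empty(n)
p[0] = 100.0
for t in range(1, n):
    p[t] = phi * p[t - 1] + rng.normal(0.0, sigma)

z = 2.3263478740408408                   # 99% standard-normal quantile
# One-step-ahead downside VaR of the price change P_{t+1} - P_t:
var_rw = z * sigma                        # random-walk model: zero drift assumed
var_explosive = z * sigma - (phi - 1) * p[-1]  # true model: positive drift (phi-1)*P_t
# Ignoring the explosive drift (var_rw) overstates downside risk during expansion
```

    During the expansion the conditional drift (phi - 1) * P_t is positive and growing, so the drift-free VaR is systematically too conservative, in line with the overestimation the abstract reports.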
  • The Digital World: II – Alternatives to the Bitcoin Blockchain?

    Dominique GUEGAN
    2018
    In a previous paper (The Digital World: I - Bitcoin: from history to real life, Guégan, 2018), we explained some limits and interests of the Bitcoin system and why central bankers and regulators need to take decisions on its existence. In this article, we develop some alternatives to the Bitcoin blockchain which are being considered by the banking system and industry.
  • ICO: the new way to raise funds without constraints?

    Dominique GUEGAN
    Revue Banque | 2018
    ICOs give people with cryptocurrencies the opportunity to invest these amounts. These operations have multiplied over the last two years, but they present significant risks for both subscribers and issuers. There is an urgent need to establish an appropriate regulatory framework for them.
  • A new token: the CommodCoin. What could be its interest for financial market? A macro-economic modelling.

    Dominique GUEGAN
    Digital, Innovation, Entrepreneurship and Financing | 2018
    In this paper, we discuss the interest for financial markets of a token built on commodities. First, we introduce the notion of tokens and propose a classification of tokens that have been built based on different crypto-currencies and on different classes and generations of blockchains; some are created by states in order to stimulate the local or national economy. Finally, we introduce a token indexed on a commodity which could be proposed by a banking system with a view to developing a monopolistic market and controlling the sector of this commodity. We propose a macro-economic model in which a part of the economy uses the CommodCoin, and then a counter-factual analysis to assess the impact of the introduction of the CommodCoin on the economy. After the introduction of a baseline model corresponding to a small open economy with two non-competing sectors using the fiat currency, we present a counter-factual model in which the two sectors can use the CommodCoin for intra-sector transactions as well as fiat currencies. We question the interest of this kind of proposal for the financial industry.
  • Measuring risk in an explosive environment.

    Dominique GUEGAN, Robin KRUSE-BECHER, Hans-Jorg VON METTENHEIM, Christoph WEGENER
    Forecasting Financial Markets (FFM) | 2018
    Financial asset bubbles can be characterized by periods of expansion and collapse. Expansions are often modeled as explosive processes for the asset price. Ignoring such explosiveness leads to misspecified Value-at-Risk (VaR) and related measures, e.g. Expected Shortfall. Considering an explosive autoregressive model, we find that the unadjusted downside VaR is overestimated in explosive periods and also misspecified during the collapse. The form of the misspecification strongly depends on several factors: (i) the horizon of the VaR forecast, (ii) the duration and strength of the explosive regime (as measured by the length of the explosive subsample and the explosive root), and (iii) the nature of the collapse. The size of the effects (in terms of capital requirements) is quantified by means of an extensive Monte Carlo simulation study. We propose a correction term to be added to the VaR which accounts for the unexpected loss due to a burst. In our empirical applications, we demonstrate the merits and limits of the suggested VaR adjustments, which have to be taken into account for management purposes.
  • Initial Token Offerings (ITOs) and corporate governance.

    Dominique GUEGAN, Stephane BLEMUS
    Forecasting Financial Markets (FFM) | 2018
    In this talk, we analyse this new way of raising funds through ITOs, the regulatory questions and the governance.
  • Assessment of proxy-hedging in jet-fuel markets.

    Dominique GUEGAN, Marius FRUNZA, Rostislav HALIPLII
    IRMBAM 2018 | 2018
    The aim of this research is to explore the risk associated with hedging in jet fuel markets. It focuses on finding the most effective proxy hedge instrument for the Singapore spot market. Due to its particularities, this market does not exhibit the same features as traditional financial markets do. In appearance, it seems very related to the oil market, but in reality it exhibits insufficient liquidity and shows unusual volatility clustering effects. This behavior has a direct impact on the hedging strategies of refineries, airline companies and jet fuel traders. The paper explores the econometric features of the jet fuel price and underlines the need of fat tail distributions and volatility clustering models. Also, it examines the density forecasting capacity of various proxy hedge instruments including kerosene, crude and gasoil futures. The results show that Singapore Gasoil Futures contract is the best candidate for hedging the Singapore Jet Fuel spot price.
  • The Digital World: I - Bitcoin: from history to real life.

    Dominique GUEGAN
    Bankers Markets & Investors : an academic & professional review | 2018
    No summary available.
  • ICOs the new way to raise funds without constraints?

    Dominique GUEGAN
    2018
    ICOs give people with crypto-currencies the opportunity to invest these amounts. These operations have multiplied over the last two years, but they present significant risks for both subscribers and issuers. An appropriate regulatory framework is urgently needed.
  • The Digital World: I - Bitcoin: from history to real life.

    Dominique GUEGAN
    2018
    Bitcoin can be considered a medium of exchange restricted to online markets, but it is not a unit of account or a store of value, and thus cannot be considered money. Bitcoin's value is very volatile and it trades at different prices on different exchange platforms, and thus can be used for arbitrage purposes. Its behavior can be associated with a highly volatile stock, and most transactions in Bitcoin are speculative. The high volatility of Bitcoin and the occurrence of speculative bubbles depend on positive sentiment and confidence in the Bitcoin market: several variables may be considered as indicators (volume of transactions, number of transactions, number of Google searches, Wikipedia requests). The star of the crypto-currencies reached 19,716 dollars in December 2017 and decreased to 6,707 dollars on March 29, 2018. By capitalization, it was at that time the 30th largest world currency. We explain some limits and interests of the Bitcoin system, why central bankers and regulators need to take decisions on its existence, and what the possible evolution of the Bitcoin Blockchain could be.
  • Is the Bitcoin Rush Over?

    Dominique GUEGAN, Marius FRUNZA
    2018
    The aim of this research is to explore the econometric features of Bitcoin-USD rates. Various non-Gaussian models are fitted to daily returns in order to underline the unique characteristics of Bitcoin when compared to other, more traditional currencies. The market efficiency hypothesis is then tested, and the main reasons for breaches in efficiency are discussed. The main goal of the paper is to assess the presence of bubble effects in this market with customized tests able to detect the timing of the various bubbles. The results show that Bitcoin prices had two episodes of rapid inflation, in 2014 and 2017.
  • On the parameters estimation of the Seasonal FISSAR Model.

    Papa CISSE, Dominique GUEGAN, Abdou ka DIONGUE
    2018
    In this paper, we discuss methods for estimating the parameters of the Seasonal FISSAR (Fractionally Integrated Separable Spatial Autoregressive with seasonality) model. First, we implement the regression method based on the log-periodogram and the classical Whittle method for estimating the memory parameters. To estimate the model's parameters simultaneously - innovation parameters and memory parameters - the maximum likelihood method and the Whittle method based on MCMC simulation are considered. We investigate the consistency and the asymptotic normality of the estimators by simulation.
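    The log-periodogram regression mentioned above is, in its simplest univariate form, the Geweke-Porter-Hudak (GPH) estimator. The sketch below is an illustrative one-dimensional analogue, not the Seasonal FISSAR implementation: it simulates an ARFIMA(0,d,0) series by truncated MA(infinity) expansion and recovers the memory parameter d:

```python
import numpy as np

def simulate_arfima0d0(d, n, rng, burn=2000):
    """Truncated MA(inf) simulation of ARFIMA(0,d,0): x_t = sum_j psi_j eps_{t-j}."""
    j = np.arange(1, n + burn)
    psi = np.concatenate(([1.0], np.cumprod((j - 1 + d) / j)))  # psi_j recursion
    eps = rng.normal(size=n + burn + len(psi))
    x = np.convolve(eps, psi, mode="valid")
    return x[-n:]

def gph_estimate(x, m=None):
    """GPH log-periodogram regression for the long-memory parameter d."""
    n = len(x)
    m = m or int(n ** 0.5)                      # common bandwidth choice
    lam = 2 * np.pi * np.arange(1, m + 1) / n   # first m Fourier frequencies
    I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)  # periodogram
    X = np.log(4 * np.sin(lam / 2) ** 2)
    beta = np.polyfit(X, np.log(I), 1)[0]       # OLS slope: log I ~ c - d * X
    return -beta

rng = np.random.default_rng(2)
x = simulate_arfima0d0(d=0.3, n=4096, rng=rng)
d_hat = gph_estimate(x)                         # point estimate of d (true value 0.3)
```

    The Seasonal FISSAR case extends this idea to spatial and seasonal frequencies, with the Whittle and maximum-likelihood methods handling all parameters jointly.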
  • A novel multivariate risk measure: the Kendall VaR.

    Matthieu GARCIN, Dominique GUEGAN, Bertrand HASSANI
    2018
    The definition of multivariate Value at Risk is a challenging problem, whose most common solutions are given by the lower- and upper-orthant VaRs, which are based on copulas: the lower-orthant VaR is the quantile of the multivariate distribution function, whereas the upper-orthant VaR is the quantile of the multivariate survival function. In this paper we introduce a new, total-order multivariate Value at Risk, referred to as the Kendall Value at Risk, which links the copula approach to an alternative definition of multivariate quantiles, known as the quantile surface, which to our knowledge is not used in finance. More precisely, we transform the notion of orthant VaR through the Kendall function so as to obtain a multivariate VaR with some advantageous properties compared to the standard orthant VaR: it is based on a total order and, for a non-atomic, Rd-supported density function, there is no longer any distinction between the d-dimensional VaRs based on the distribution function or on the survival function. We quantify the differences between this new Kendall VaR and the orthant VaRs. In particular, we show that the Kendall VaR is less (respectively more) conservative than the lower-orthant (resp. upper-orthant) VaR. The definition and the properties of the Kendall VaR are illustrated using Gumbel and Clayton copulas with lognormal marginal distributions and several levels of risk.
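    The Kendall-function step has a closed form for Archimedean copulas, which makes the construction easy to illustrate. The sketch below (illustrative parameter values, not the paper's code) computes the Kendall VaR level t* for a bivariate Clayton copula:

```python
import numpy as np
from scipy.optimize import brentq

def clayton_kendall(t, theta):
    """Kendall function K(t) = P(C(U,V) <= t) for the bivariate Clayton copula.
    For an Archimedean copula K(t) = t - phi(t)/phi'(t); with the Clayton
    generator phi(t) = (t**-theta - 1)/theta this gives K(t) = t + t*(1 - t**theta)/theta."""
    return t + t * (1.0 - t ** theta) / theta

theta, alpha = 2.0, 0.95                 # illustrative dependence and risk levels
# Kendall VaR level: solve K(t*) = alpha; the VaR set is then the copula level
# curve {C(u, v) = t*} instead of {C(u, v) = alpha} (the lower-orthant VaR)
t_star = brentq(lambda t: clayton_kendall(t, theta) - alpha, 1e-12, 1 - 1e-12)
# Since K(t) >= t, the Kendall level t* lies below the orthant level alpha
```

    The gap between t* and alpha is exactly what makes the Kendall VaR less conservative than the lower-orthant VaR in the paper's comparison.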
  • A Probative Value for Authentication Use Case Blockchain.

    Dominique GUEGAN, Christophe HENOT
    2018
    The Fintech industry has facilitated the development of companies using blockchain technology. The use of this technology inside the banking system and industry opens the route to several questions regarding business activity, the legal environment and insurance devices. In this paper, considering the creation of small companies seeking to develop their business with a public blockchain, we analyse from different aspects why a company (in the banking or insurance system, or industry) decides that one blockchain protocol is more legitimate than another for the business it wants to develop, from a legal (in case of dispute) point of view. We associate with each blockchain a probative value which makes it possible to establish, in case of dispute, that a transaction has actually been carried out. We illustrate our proposal using thirteen blockchains, providing a ranking between these blockchains for their use in a business environment. We complement this probative value with some main characteristics of any blockchain, such as market capitalization and log-return volatility, which investors also need to take into account alongside the new probative value for their managerial strategy.
  • Credit Risk Analysis Using Machine and Deep Learning Models.

    Dominique GUEGAN
    Small Business Risk, Financial Regulation and Big Data Analytics | 2018
    Due to the advanced technology associated with Big Data, data availability and computing power, most banks and lending institutions are renewing their business models. Credit risk prediction, monitoring, model reliability and effective loan processing are key to decision-making and transparency. In this work, we build binary classifiers based on machine and deep learning models on real data to predict loan default probability. The top 10 important features from these models are selected and then used in the modeling process to test the stability of the binary classifiers by comparing their performance on separate data. We observe that the tree-based models are more stable than the models based on multilayer artificial neural networks. This opens several questions relative to the intensive use of deep learning systems in enterprises.
  • Credit Risk Analysis Using Machine and Deep Learning Models.

    Peter martey ADDO, Dominique GUEGAN, Bertrand HASSANI
    SSRN Electronic Journal | 2018
    No summary available.
  • Testing for leverage effects in the returns of US equities.

    Christophe CHORRO, Dominique GUEGAN, Florian IELPO, Hanjarivo LALAHARISON
    Journal of Empirical Finance | 2018
    This article questions the empirical usefulness of leverage effects to forecast the dynamics of equity returns. In sample, we consistently find a significant but limited contribution of leverage effects over the past 25 years of S&P 500 returns. From an out-of-sample forecasting perspective and using a variety of different models, we find no statistical or economic value in using leverage effects, provided that an asymmetric and fat-tailed conditional distribution is used. This conclusion holds both at the index level and for 70% of the individual stock constituents of the equity index.
  • Nonparametric forecasting of multivariate probability density functions.

    Dominique GUEGAN, Matteo IACOPINI
    2018
    The study of dependence between random variables is at the core of theoretical and applied statistics. Static and dynamic copula models are useful for describing the dependence structure, which is fully encoded in the copula probability density function. However, these models are not always able to describe the temporal change of the dependence patterns, which is a key characteristic of financial data. We propose a novel nonparametric framework for modelling a time series of copula probability density functions, which makes it possible to forecast the entire function without the need for post-processing procedures to guarantee positivity and unit integral. We exploit a suitable isometry that transfers the analysis into a subset of the space of square-integrable functions, where we build on nonparametric functional data analysis techniques. The framework does not assume the densities to belong to any parametric family and can also be successfully applied to general multivariate probability density functions with bounded or unbounded support. Finally, a noteworthy field of application pertains to the study of time-varying networks represented through vine copula models. We apply the proposed methodology to estimate and forecast the time-varying dependence structure between the S&P500 and NASDAQ indices.
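The abstract does not specify which isometry is used; one common choice in the functional data literature (assumed here purely for illustration, not necessarily the authors' construction) is the square-root transform, whose back-transform squares and renormalizes, so positivity and unit integral hold by construction:

```python
import numpy as np

# Illustrative sketch: map each density to its square root, "forecast" in L2
# by a naive average of past transforms, then square and renormalize so that
# the forecast is again a genuine density (positive, integrating to one).
x = np.linspace(-5, 5, 1001)
dx = x[1] - x[0]

def gaussian(mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

history = [gaussian(mu, 1.0) for mu in (-1.0, -0.5, 0.0)]  # toy "time series" of densities
roots = [np.sqrt(f) for f in history]                      # isometry into L2
forecast_root = np.mean(roots, axis=0)                     # naive L2 forecast
forecast = forecast_root ** 2                              # back-transform: square...
forecast /= forecast.sum() * dx                            # ...and renormalize

print(bool(np.all(forecast >= 0)))   # positivity holds by construction
```

The point of the transform is exactly the one the abstract makes: no post-processing is needed to obtain a valid density, because the back-transform cannot produce negative values.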
  • Measuring risks in the tail: The extreme VaR and its confidence interval.

    Dominique GUEGAN, Bertrand HASSANI, Kehan LI
    Risk and Decision Analysis | 2017
    Contrary to the current regulatory trend regarding extreme risks, the purpose of this paper is to emphasize the necessity of considering the Value-at-Risk (VaR) with extreme confidence levels like 99.9% as an alternative way of measuring risks in the "extreme tail". Although the mathematical definition of the extreme VaR is trivial, its computation is challenging in practice, because the uncertainty of the extreme VaR may not be negligible for a finite amount of data. We build confidence intervals around the unknown VaR using two different approaches, the first based on an asymptotic Gaussian result and the second on a saddlepoint approach; the latter proves to be more robust for finite samples. We compare our approach with other methodologies based on bootstrapping techniques, focusing on the estimation of the extreme quantiles of a distribution. Finally, we apply these confidence intervals to perform a stress testing exercise with historical stock returns during the financial crisis, in order to identify potential violations of the VaR during periods of turmoil on financial markets.
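The asymptotic Gaussian approach mentioned above rests on the classical result that sqrt(n)(q_hat_p - q_p) converges to N(0, p(1-p)/f(q_p)^2). A hedged sketch on simulated Gaussian losses (the plug-in normal density estimate below is an illustrative assumption, not the paper's estimator):

```python
import numpy as np

def gaussian_quantile_ci(sample, p, z=1.96):
    """Asymptotic Gaussian confidence interval for the p-quantile:
    sqrt(n) * (q_hat - q_p) -> N(0, p(1-p) / f(q_p)^2).
    The density f at the quantile is approximated by a plug-in normal fit
    (an illustrative choice; in practice f must be estimated carefully)."""
    n = len(sample)
    q_hat = np.quantile(sample, p)
    mu, sig = sample.mean(), sample.std(ddof=1)
    f_q = np.exp(-0.5 * ((q_hat - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
    half = z * np.sqrt(p * (1 - p) / n) / f_q
    return q_hat - half, q_hat + half

rng = np.random.default_rng(0)
losses = rng.normal(size=5000)
lo, hi = gaussian_quantile_ci(losses, 0.999)
# The interval widens sharply at extreme levels: with 5000 points the
# 99.9% quantile is pinned down by only a handful of observations,
# which is precisely the uncertainty the paper emphasizes.
```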
  • Bitcoin and the challenges for financial regulation.

    Dominique GUEGAN, Anastasia SOTIROPOULOU
    Capital Markets Law Journal | 2017
    Bitcoin is the most popular virtual currency and has attracted extraordinary attention as a financial innovation. This attention results less from Bitcoin's role as a digital medium of payment, exchange and store of value than from the decentralized nature of Bitcoin transactions. Bitcoins pose various risks, some of them remote and others more immediate. If remote risks do not, presently, require any regulatory intervention, immediate risks should not remain beyond the reach of financial law. Regulators need to put in place frameworks that protect against these risks, but in a way that does not restrain innovation. Theoretically, there are three aspects of the Bitcoin ecosystem that may be subject to regulation: the Bitcoin system itself (the Bitcoin protocol), the uses of Bitcoin and the members of the Bitcoin system. The regulation of the Bitcoin system itself proves extremely difficult, as there is no central authority administering and controlling the system that could be subject to regulation. On the contrary, regulation could apply to illegal uses that can be made of Bitcoins and to some of the members of the Bitcoin system, especially the exchange and wallet service providers.
  • Public Blockchain versus Private Blockchain: Issues and Limits.

    Dominique GUEGAN
    2017
    Blockchain is a very popular topic in the banking and insurance industry, but what is it about? The notion of blockchain comes from cryptography: it is a protocol for transmitting information in a secure way. We distinguish two approaches, the decentralized public approach and the centralized private approach. The concept of blockchain appeared thanks to the emergence of crypto-currencies, and in particular Bitcoin. If blockchain is to become an important tool within banks, then it is necessary to have a fair knowledge of the underlying tools and the issues associated with this new technology. Indeed, it appears necessary to identify the risks associated with it and to propose strategies to control them.
  • Public blockchain vs. private blockchain: issues and limits.

    Dominique GUEGAN
    Revue Banque | 2017
    Blockchain is a popular topic in the banking and insurance industry, but what exactly is it? The concept of blockchain comes from cryptography: it is a protocol for transmitting information in a secure way. We distinguish between the decentralized public approach and the centralized private approach.
  • Public blockchain and smart contracts. Ethereum: possibilities and limits.

    Dominique GUEGAN
    Revue Banque | 2017
    Ethereum is a decentralized exchange protocol that not only produces a crypto-currency but also allows users to create smart contracts. Although the platform leaves a lot of freedom to actors in terms of application development, questions of security and robustness still arise concerning the protocol, the platforms and bugs in the contract code.
  • Bitcoin and the challenge for regulation.

    Dominique GUEGAN
    Vietnam Symposium in Banking and Finance | 2017
    In a relatively short period of time, virtual currencies ("VC") have gained significant traction and become an economic reality, with Bitcoin being the most dominant among over 500 virtual currencies. Their advent, beginning with Bitcoin in 2008, has quickly exploded into an emerging financial ecosystem that offers new possibilities for peer-to-peer payment systems, money transmission and investment opportunities not only for purchasers and sellers of virtual currencies, but also for investors in virtual currency business activity, and perhaps more significantly, for consumers.
  • Regulatory Learning: Credit Scoring Application of Machine Learning.

    Dominique GUEGAN, Bertrand HASSANI
    DMBD 2017 | 2017
    The arrival of Big Data strategies is threatening the latest trends in financial regulation related to the simplification of models and the enhancement of the comparability of the approaches chosen by financial institutions. Indeed, the intrinsically dynamic philosophy of Big Data strategies is almost incompatible with the current legal and regulatory framework, as illustrated in this paper. Besides, as presented in our application to credit scoring, the model selection may also evolve dynamically, forcing both practitioners and regulators to develop libraries of models, strategies allowing them to switch from one to the other, as well as supervising approaches allowing financial institutions to innovate in a risk-mitigated environment. The purpose of this paper is therefore to analyse the issues related to the Big Data environment, and in particular to machine learning models, highlighting the problems present in the current framework confronting the data flows, the model selection process and the necessity to generate appropriate outcomes.
  • Public Blockchain versus Private Blockchain.

    Dominique GUEGAN
    2017
    In this document, we introduce some reflections on the concept of blockchain, how it works and what the issues are for the banking system. First, we recall what cryptography is; then we introduce the concept of blockchain as a protocol for transmitting information in a secure way, distinguishing two possible approaches: the decentralized public approach and the centralized private approach. The notion of cryptocurrency is introduced, and two examples of applications of public blockchains, Bitcoin and Ethereum, are provided.
  • Impact of multimodality of distributions on VaR and ES calculations.

    Dominique GUEGAN, Bertrand HASSANI, Kehan LI
    2017
    Unimodal probability distributions have been widely used for Value-at-Risk (VaR) computation by investors, risk managers and regulators. However, financial data may be characterized by distributions having more than one mode, and using a unimodal distribution may then bias risk measure computation. In this paper, we discuss the influence of using multimodal distributions on VaR and Expected Shortfall (ES) calculation. Two multimodal distribution families are considered: Cobb's family and the distortion family. We provide two ways to compute the VaR and the ES for them: an adapted rejection sampling technique for Cobb's family and an inversion approach for the distortion family. For the empirical study, two data sets are considered: a daily data set concerning operational risk and a three-month scenario of market portfolio returns built from five-minute intraday data. With a complete spectrum of confidence levels from 0.001 to 0.999, we analyze the VaR and the ES to assess the interest of using a multimodal distribution instead of a unimodal one.
  • Variable selection and forecasting via automated methods for linear models: LASSO/adaLASSO and Autometrics.

    Camila EPPRECHT, Dominique GUEGAN, Alvaro VEIGA, Joel CORREA DA ROSA
    2017
    In this paper we compare two approaches to model selection for linear regression models: a classical approach, Autometrics (automatic general-to-specific selection), and statistical learning, LASSO (ℓ1-norm regularization) and adaLASSO (adaptive LASSO). In a simulation experiment, considering a simple setup with orthogonal candidate variables and independent data, we compare the performance of the methods in terms of predictive power (out-of-sample forecasting), selection of the correct model (variable selection) and parameter estimation. The case where the number of candidate variables exceeds the number of observations is considered as well. Finally, in an application using genomic data from a high-throughput experiment, we compare the predictive power of the methods to predict epidermal thickness in psoriatic patients.
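In the orthogonal setup used in the simulation experiment above, the LASSO estimator has a well-known closed form: soft-thresholding of the OLS coefficients. A small sketch under the assumption X'X = I (the data, seed and penalty level are illustrative):

```python
import numpy as np

def soft(b, lam):
    """Soft-thresholding operator: the coordinate-wise LASSO solution
    when the design matrix is orthonormal (X'X = I)."""
    return np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)

rng = np.random.default_rng(1)
n, k = 200, 5
# Orthonormal design via QR, sparse true coefficients, small Gaussian noise.
X, _ = np.linalg.qr(rng.normal(size=(n, k)))
beta = np.array([3.0, -2.0, 0.0, 0.0, 0.0])
y = X @ beta + 0.1 * rng.normal(size=n)

b_ols = X.T @ y                 # OLS reduces to X'y for orthonormal X
b_lasso = soft(b_ols, lam=0.5)  # shrink and select in one step
print(b_lasso)                  # irrelevant coordinates are set exactly to 0
```

This is why LASSO performs variable selection and shrinkage simultaneously: coefficients below the penalty are zeroed out, the rest are shrunk toward zero by exactly the penalty amount.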
  • Testing for Leverage Effects in the Returns of US Equities.

    Christophe CHORRO, Dominique GUEGAN, Florian IELPO, Hanjarivo LALAHARISON
    2017
    This article questions the empirical usefulness of leverage effects to describe the dynamics of equity returns. Relying on both in- and out-of-sample tests, we consistently find a weak contribution of leverage effects over the past 25 years of S&P 500 returns. The skewness in the conditional distribution of the returns' time series models is found to explain most of the asymmetry of the returns' distribution. This conclusion holds both at the index level and for 70% of the individual stock constituents of the equity index.
  • Regulatory Learning: how to supervise machine learning models? An application to credit scoring.

    Dominique GUEGAN, Bertrand HASSANI
    2017
    The arrival of big data strategies is threatening the latest trends in financial regulation related to the simplification of models and the enhancement of the comparability of the approaches chosen by financial institutions. Indeed, the intrinsically dynamic philosophy of Big Data strategies is almost incompatible with the current legal and regulatory framework, as illustrated in this paper. Besides, as presented in our application to credit scoring, the model selection may also evolve dynamically, forcing both practitioners and regulators to develop libraries of models, strategies allowing them to switch from one to the other, as well as supervising approaches allowing financial institutions to innovate in a risk-mitigated environment. The purpose of this paper is therefore to analyse the issues related to the Big Data environment, and in particular to machine learning models, highlighting the problems present in the current framework confronting the data flows, the model selection process and the necessity to generate appropriate outcomes.
  • Three-stage estimation method for non-linear multiple time-series.

    Dominique GUEGAN, Giovanni DE LUCA, Giorgia RIVIECCIO
    2017
    We present a three-stage pseudo maximum likelihood estimation designed to reduce the computational burden when a copula-based model is applied to multiple time series in high dimensions. The method is applied to general stationary Markov time series, under assumptions which include a time-invariant copula as well as time-invariant marginal distributions, extending the results of Yi and Liao [2010]. We explore, via simulated and real data, the performance of the model compared to the classical vector autoregressive model, giving the implications of misspecified assumptions for the margins and/or the joint distribution and providing tail dependence measures of the economic variables involved in the analysis.
  • Public Blockchain and Smart Contracts. The possibilities opened by Ethereum and its limits.

    Dominique GUEGAN
    2017
    Ethereum is a decentralized exchange protocol that not only produces a crypto-currency but also allows users to create smart contracts. Although the platform leaves a lot of freedom to actors in terms of application development, questions of security and robustness still arise concerning the protocol, the platforms and bugs in the contract code.
  • An alternative class of distortion operators.

    Dominique GUEGAN, Bertrand HASSANI, Kehan LI
    2017
    Distortion operators such as the one proposed by Wang (2000) have been developed in the actuarial literature and are now part of the risk measurement tools available to practitioners in finance and insurance. In this article, we propose an alternative class of distortion operators with an explicit analytical inverse mapping. These distortion operators are based on the tangent function and make it possible to transform a symmetric unimodal distribution into an asymmetric multimodal one.
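The abstract's key property is a tangent-based distortion with an explicit analytical inverse. A minimal sketch of one such construction (the specific operator below is an illustrative choice, not necessarily the paper's exact form): on (0, 1), g_k(u) = 1/2 + arctan(k tan(pi(u - 1/2)))/pi is strictly increasing, maps (0, 1) onto itself, and its inverse is simply g with parameter 1/k.

```python
import numpy as np

def distort(u, k):
    """Illustrative tangent-based distortion operator on (0, 1):
    strictly increasing, maps (0, 1) onto (0, 1), fixes 1/2."""
    return 0.5 + np.arctan(k * np.tan(np.pi * (u - 0.5))) / np.pi

def distort_inv(u, k):
    """Explicit analytical inverse: distorting with 1/k undoes k."""
    return distort(u, 1.0 / k)

u = np.linspace(0.01, 0.99, 99)
round_trip = distort_inv(distort(u, k=3.0), k=3.0)
print(np.max(np.abs(round_trip - u)))   # numerically ~ machine precision
```

Applying such a g to a distribution function F yields a distorted distribution g(F(x)) whose quantiles remain analytically computable, precisely because g has a closed-form inverse.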
  • Measuring risks in the extreme tail: The extreme VaR and its confidence interval.

    Dominique GUEGAN, Bertrand HASSANI, Kehan LI
    2017
    Contrary to the current regulatory trend concerning extreme risks, the purpose of this paper is to emphasize the necessity of considering the Value-at-Risk (VaR) with extreme confidence levels like 99.9% as an alternative way to measure risks in the “extreme tail”. Although the mathematical definition of the extreme VaR is trivial, its computation is challenging in practice, because the uncertainty of the extreme VaR may not be negligible for a finite amount of data. We build confidence intervals around the unknown VaR using two different approaches, the first based on Smirnov's result (Smirnov, 1949 [24]) and the second on Zhu and Zhou's result (Zhu and Zhou, 2009 [25]), showing that the latter is robust for finite samples. We compare our approach with other methodologies based on bootstrapping techniques, Christoffersen et al. (2005) [7], focusing on the estimation of the extreme quantiles of a distribution. Finally, we apply these confidence intervals to perform a stress testing exercise with historical stock returns during the financial crisis, in order to identify potential violations of the VaR during periods of turmoil on financial markets.
  • Public Blockchain versus Private Blockchain: Issues and Limits.

    Dominique GUEGAN
    Revue Banque | 2017
    Blockchain is a very popular topic in the banking and insurance industry, but what is it about? The notion of blockchain comes from cryptography: it is a protocol for transmitting information in a secure way. We distinguish two approaches, the decentralized public approach and the centralized private approach. The concept of blockchain appeared thanks to the emergence of crypto-currencies, and in particular Bitcoin. If blockchain is to become an important tool within banks, then it is necessary to have a fair knowledge of the underlying tools and the issues associated with this new technology. Indeed, it appears necessary to identify the risks associated with it and to propose strategies to control them.
  • Multivariate Reflection Symmetry of Copula Functions.

    Monica BILLIO, Lorenzo FRATTAROLO, Dominique GUEGAN
    2017
    We propose a multivariate nonparametric copula test of reflection symmetry. The test is valid in any number of dimensions, extending previous results that cover the bivariate case. Furthermore, the asymptotic theory for the test relies on recent results on the dependent multiplier bootstrap, valid for sub-exponentially strongly mixing data. Thanks to these two features, the procedure is suitable for financial time series, whose asymmetric dependence in distressed periods has already been documented elsewhere. We conduct an extensive simulation study of empirical size and power and provide several examples of applications. In particular, we investigate the use of the statistic as a financial stress indicator by comparing it with the CISS, the leading ECB indicator.
  • Impact of multimodality of distributions on VaR and ES calculation.

    Dominique GUEGAN, Bertrand HASSANI, Kehan LI
    10th International conference of the ERCIM WG on Computational and Methodological Statistics (CMStatistics 2017) | 2017
    A unimodal probability distribution - one whose probability density has only one peak, like the Gaussian or Student-t distribution - has been widely used for Value-at-Risk (VaR) computation by both regulators and risk managers. However, a histogram of financial data may have more than one mode. In this case, first, a unimodal distribution may not provide a good fit to the data, which may bias risk measure computation; second, the modes in the tails contain important information for trading and risk management purposes. To capture the information in the modes, we discuss in this paper how to use a multimodal distribution - one whose density has more than one peak - from Cobb's distribution family (Cobb et al., 1983) to compute the VaR and the Expected Shortfall (ES) for data with a multimodal histogram. In general, the cumulative distribution function of a multimodal distribution belonging to Cobb's family does not have a closed-form expression, which means that we cannot compute the VaR and the ES, or generate random numbers, using classical inversion approaches. We therefore use Monte-Carlo techniques: first, based on a rejection sampling algorithm, we develop a way to simulate random numbers directly from a multimodal probability density belonging to Cobb's family; second, we compute the VaR and the ES using their empirical estimators on these simulated random numbers. Finally, with these techniques, we compute the VaR and the ES for one data set associated with market risk and another associated with operational risk.
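The Monte-Carlo recipe described above - rejection sampling from a multimodal density, then the empirical VaR and ES estimators - can be sketched as follows. Since Cobb's exact density is not reproduced here, a bimodal Gaussian mixture stands in for it (an assumption for illustration only; a Cobb density would slot into `target` unchanged):

```python
import numpy as np

rng = np.random.default_rng(42)

def target(x):
    """Stand-in bimodal loss density (Gaussian mixture)."""
    return 0.6 * np.exp(-0.5 * (x + 1.5) ** 2) / np.sqrt(2 * np.pi) \
         + 0.4 * np.exp(-0.5 * ((x - 2.0) / 0.7) ** 2) / (0.7 * np.sqrt(2 * np.pi))

def rejection_sample(n, lo=-6.0, hi=6.0, m=0.45):
    """Draw x ~ Uniform(lo, hi), accept when u ~ Uniform(0, m) falls
    under target(x); m must dominate the density on [lo, hi]."""
    out = []
    while len(out) < n:
        x = rng.uniform(lo, hi, size=n)
        u = rng.uniform(0.0, m, size=n)
        out.extend(x[u < target(x)])
    return np.array(out[:n])

losses = rejection_sample(50_000)
alpha = 0.99
var = np.quantile(losses, alpha)      # empirical VaR estimator
es = losses[losses >= var].mean()     # empirical ES: mean of the tail beyond the VaR
print(var < es)                       # the ES always sits beyond the VaR
```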
  • Statistical properties of the Seasonal Fractionally Integrated Separable Spatial Autoregressive Model.

    Papa ousmane CISSE, Abdou ka DIONGUE, Dominique GUEGAN
    Afrika Statistika | 2016
    In this paper we introduce a new model, called the Fractionally Integrated Separable Spatial Autoregressive process with Seasonality and denoted Seasonal FISSAR. We focus on the class of separable spatial models whose correlation structure can be expressed as a product of correlations. This new modelling allows taking into account the seasonality patterns observed in spatial data. We investigate the properties of this new model, providing stationarity conditions and explicit forms of the autocovariance function and the spectral density. We also establish the asymptotic behaviour of the spectral density function near the seasonal frequencies.
  • Combining risk measures to overcome their limitations - spectrum representation of the sub-additivity issue, distortion requirement and added-value of the Spatial VaR solution: An application to Regulatory Requirement for Financial Institutions.

    Dominique GUEGAN, Bertrand HASSANI
    2016
    To measure the major risks experienced by financial institutions - for instance Market, Credit and Operational - the regulation either offers a limited choice of risk measures, distributions and confidence levels or demands the implementation of a particular approach. In this paper, we highlight and illustrate the paradoxes and issues observed when implementing one approach over another, the inconsistencies between the suggested methodologies and the problems related to their interpretation. Starting from the notion of coherence, we discuss their properties, propose alternative solutions and new risk measures, such as spectrum and spatial approaches, and provide practitioners and supervisors with some recommendations to assess, manage and control risks in a financial institution.
  • Risk Measures At Risk- Are we missing the point? Discussions around sub-additivity and distortion.

    Dominique GUEGAN, Bertrand k. HASSANI
    2016
    This paper discusses the regulatory requirements (Basel Committee, ECB-SSM and EBA) for measuring the major risks of financial institutions, for instance Market, Credit and Operational, regarding the choice of the risk measures, the choice of the distributions used to model them and the level of confidence. We highlight and illustrate paradoxes and issues observed when implementing one approach over another, and the inconsistencies between the suggested methodologies and the goals they are required to achieve. We focus on the notion of sub-additivity and alternative risk measures, providing the supervisor with some recommendations and risk managers with some tools to assess and manage the risks in a financial institution.
  • Wavelet shrinkage of a noisy dynamical system with non-linear noise impact.

    Matthieu GARCIN, Dominique GUEGAN
    Physica D: Nonlinear Phenomena | 2016
    By filtering wavelet coefficients, it is possible to construct a good estimate of a pure signal from noisy data. In particular, for a simple linear noise influence, Donoho and Johnstone (1994) have already defined an optimal filter design in the sense of minimizing the error made when estimating the pure signal. We set here a different framework, where the influence of the noise is non-linear. In particular, we propose a method to filter the wavelet coefficients of a discrete dynamical system disrupted by a weak noise, in order to construct good estimates of the pure signal, including the Bayes estimate, minimax estimate, oracular estimate and thresholding estimate. We present the examples of the logistic and the Lorenz chaotic dynamical systems, as well as an adaptation of our technique, to show empirically the robustness of the thresholding method in the presence of leptokurtic noise. Moreover, we test both hard and soft thresholding, as well as another, smoother kind of thresholding which seems to have almost the same reconstruction power as hard thresholding. Finally, besides the tests on an estimated dataset, the method is tested on financial data: oil prices and the NOK/USD exchange rate.
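The hard and soft thresholding rules compared in the paper act coefficient by coefficient. A minimal sketch on a toy coefficient vector (the wavelet transform itself is omitted; the Donoho-Johnstone universal threshold sigma*sqrt(2 log n) is used as an illustrative cut-off):

```python
import numpy as np

def hard_threshold(w, t):
    """Keep coefficients whose magnitude exceeds t, kill the rest."""
    return np.where(np.abs(w) > t, w, 0.0)

def soft_threshold(w, t):
    """Kill small coefficients and shrink the survivors toward zero by t."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

# Toy wavelet coefficients: a few large "signal" entries plus Gaussian noise.
rng = np.random.default_rng(7)
w = np.concatenate([np.array([5.0, -4.0, 3.0]), 0.3 * rng.normal(size=97)])
t = 0.3 * np.sqrt(2 * np.log(len(w)))   # universal threshold with sigma = 0.3

w_hard = hard_threshold(w, t)
w_soft = soft_threshold(w, t)
# Hard thresholding preserves large coefficients exactly; soft thresholding
# shrinks them by t, trading some bias for a smoother estimate.
```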
  • Financial Regulation: More Accurate Measurements for Control Enhancements and the Capture of the Intrinsic Uncertainty of the VaR.

    Dominique GUEGAN
    vsbf: 2016 Vietnam Symposium in Banking and Finance | 2016
    During the crisis, the failure of models and the failure to capture extreme exposures led the regulator to change the way risks were analysed: by requiring financial institutions to use sub-exponential probability distributions, by changing the way correlations were captured, or by suggesting a switch from the VaR to sub-additive risk measures. Indeed, models played a major role during this crisis, either as catalysts or as triggers. Before capturing dependencies, the technical point is the choice of the probability distribution and its use to provide a risk measure, which is the key point for bankers and regulators. It is now well known and admitted by practitioners that the important information for risk management lies in the tails of the distributions characterizing the risk factors. The purpose of this paper is to discuss the issue of considering the distributions used to characterise the risks and the associated risk measures independently, as we will argue that distributions and risk measures are indivisible.
  • Risk Measures at Risk- Are we missing the point? Discussions around sub-additivity and distortion.

    Dominique GUEGAN
    Conference on Banking and Finance | 2016
    To measure the major risks experienced by financial institutions, for instance Market, Credit and Operational, regarding the risk measures, the distributions used to model them and the level of confidence, the regulation either offers a limited choice or demands the implementation of a particular approach. In this paper, we review, highlight and illustrate the paradoxes and issues observed when implementing one approach over another, the inconsistencies between the suggested methodologies and the problems related to their interpretation. Starting with a discussion of the intrinsic properties of two classical risk measures, the Value-at-Risk and the Expected Shortfall, at each step we illustrate our proposal through an example based on real data using estimates of the risk measures themselves. We discuss the theoretical foundations of the risk measures as well as their estimates, applying them to real data for risk management purposes; our objective is not a statistical discussion of the properties of the estimators used, for which we refer to a companion paper (Guegan, 2016). This exercise thus provides practitioners and supervisors with some recommendations to assess, manage and control risks in a financial institution relying on alternative approaches, for instance spectral, spectrum, distortion and spatial risk measures.
  • More Accurate Measurement for Enhanced Controls: VaR vs ES?

    Dominique GUEGAN, Bertrand HASSANI
    2016
    This paper analyses how risks are measured in financial institutions, for instance Market, Credit, Operational, etc., with respect to the choice of the risk measures, the choice of the distributions used to model them and the level of confidence selected. We discuss and illustrate the characteristics, paradoxes and issues observed when comparing the Value-at-Risk and the Expected Shortfall in practice. The paper is built as a differential diagnosis and aims at discussing the reliability of the risk measures as well as making some recommendations.
  • Note on a new Seasonal Fractionally Integrated Separable Spatial Autoregressive Model.

    Papa ousmane CISSE, Abdou ka DIONGUE, Dominique GUEGAN
    2016
    In this paper, we introduce a new model, called the Fractionally Integrated Separable Spatial Autoregressive process with Seasonality and denoted Seasonal FISSAR, for two-dimensional spatial data. We focus on the class of separable spatial models whose correlation structure can be expressed as a product of correlations. This new modelling allows taking into account the seasonality patterns observed in spatial data. We investigate the properties of this new model, providing stationarity conditions and explicit expressions for the autocovariance function and the spectral density function. We establish the asymptotic behaviour of the spectral density function near the seasonal frequencies and perform some simulations to illustrate the behaviour of the model.
  • Uncertainty in historical Value-at-Risk: an alternative quantile-based risk measure.

    Dominique GUEGAN, Bertrand k. HASSANI, Kehan LI
    2016
    The financial industry has extensively used quantile-based risk measures relying on the Value-at-Risk (VaR). These need to be estimated from relevant historical data sets and consequently contain uncertainty. We propose an alternative quantile-based risk measure, the Spectral Stress VaR, to capture the uncertainty in the historical VaR approach. It provides the risk manager with the flexibility to implement a prudential regulatory framework, and it can serve as a VaR-based stressed risk measure. Finally, we propose a stress testing application for it.
  • Future Perspectives in Risk Models and Finance.

    Raphael DOUADY, Alain BENSOUSSAN, Dominique GUEGAN, Charles TAPIERO
    International Series in Operations Research & Management Science | 2015
    This book provides a perspective on a number of approaches to financial modelling and risk management. It examines both theoretical and practical issues. Theoretically, financial risk models are models of real and financial “uncertainty”, based on both common and private information and on economic theories defining the rules that financial markets comply with. Financial models are thus challenged by their definitions and by a changing financial system fueled by globalization, technology growth, complexity, regulation and the many factors that cause financial processes to be continuously questioned and re-assessed. The underlying mathematical foundations of financial risk models provide future guidelines for risk modeling. The book’s chapters provide selective insights and developments that can contribute to a better understanding of the complexity of financial modelling and its ability to bridge financial theories and their practice. Future Perspectives in Risk Models and Finance begins with an extensive outline by Alain Bensoussan et al. of GLM estimation techniques combined with proofs of fundamental results. Applications to static and dynamic models provide a unified approach to the estimation of nonlinear risk models. A second section is concerned with the definition of risks and their management. In particular, Guegan and Hassani review a number of risk model definitions, emphasizing the importance of bi-modal distributions for financial regulation. An additional chapter provides a review of stress testing and its implications. Nassim Taleb and Sandis provide an anti-fragility approach based on “skin in the game”. To conclude, Raphael Douady discusses the non-cyclical CAR (Capital Adequacy Rule) and its effects on the aversion of systemic risks. A third section emphasizes analytic financial modelling approaches and techniques. Tapiero and Vallois provide an overview of mathematical systems and their use in financial modeling. These systems span the fundamental Arrow-Debreu framework underlying financial models of complete markets and, subsequently, mathematical systems departing from this framework yet generalizing their approach to dynamic financial models: explicitly, models based on fractional calculus, on persistence (short memory) and on entropy-based non-extensiveness. Applications of these models are used to define a modeling approach to incomplete financial models and their potential use as a “measure of incompleteness”. Subsequently, Bianchi and Pianese provide an extensive overview of multi-fractional models and their important applications to asset price modeling. Finally, Tapiero and Jinquyi consider the binomial pricing model, discussing the effects of memory on asset prices.
  • Optimal wavelet shrinkage of a noisy dynamical system with non-linear noise impact.

    Matthieu GARCIN, Dominique GUEGAN
    2015
    By filtering wavelet coefficients, it is possible to construct a good estimate of a pure signal from noisy data. In particular, for a simple linear noise influence, Donoho and Johnstone (1994) defined an optimal filter design in the sense of a good reconstruction of the pure signal. We set here a different framework in which the influence of the noise is non-linear. In particular, we propose an optimal method to filter the wavelet coefficients of a discrete dynamical system disrupted by a weak noise, in order to construct good estimates of the pure signal, including the Bayes estimate, the minimax estimate, the oracle estimate and the thresholding estimate. We present the example of a simple chaotic dynamical system, as well as an adaptation of our technique, in order to show empirically the robustness of the thresholding method in the presence of leptokurtic noise. Moreover, we test both hard and soft thresholding, and also another, smoother kind of thresholding which seems to have almost the same reconstruction power as hard thresholding.
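The hard and soft thresholding rules compared in the abstract above are the standard Donoho-Johnstone filters; a minimal numpy sketch of the two rules applied to a vector of wavelet coefficients (the threshold value is arbitrary here, not the paper's optimal choice):

```python
import numpy as np

def hard_threshold(coeffs, t):
    # keep coefficients whose magnitude exceeds t, zero out the rest
    return np.where(np.abs(coeffs) > t, coeffs, 0.0)

def soft_threshold(coeffs, t):
    # shrink every magnitude toward zero by t (Donoho-Johnstone soft rule)
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

coeffs = np.array([-3.0, -0.5, 0.2, 1.5, 4.0])
kept = hard_threshold(coeffs, 1.0)      # large coefficients survive unchanged
shrunk = soft_threshold(coeffs, 1.0)    # survivors are also pulled toward zero
```

The soft rule trades a small bias on large coefficients for continuity at the threshold, which is why the smoother variants mentioned in the abstract can approach hard thresholding's reconstruction power.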
  • A time series approach to option pricing: Models, Methods and Empirical Performances.

    Christophe CHORRO, Dominique GUEGAN, Florian IELPO
    2015
    No summary available.
  • Dynamic factor analysis of carbon allowances prices: From classic Arbitrage pricing Theory to Switching Regimes.

    Dominique GUEGAN, Marius cristian FRUNZA, Antonin LASSOUDIERE
    International Journal of Financial Markets and Derivatives | 2015
    The aim of this paper is to identify the fundamental factors that drive the allowances market and to build an APT-like model in order to provide accurate forecasts for CO2 prices. We show that historic dependency patterns point to energy, natural gas, oil, coal and equity indexes as major factors driving carbon allowance prices. There is strong evidence that the model residuals are heavy-tailed and asymmetric, so the generalized hyperbolic distribution provides the best fit. Introducing dynamics into the parameters of the APT model via a hidden Markov chain model outperforms the results obtained with a static approach. Empirical results clearly indicate that this model could be used for price forecasting and that it is effective in and out of sample, producing consistent results in allowance futures price prediction.
  • Risk or Regulatory Capital? Bringing distributions back in the foreground.

    Dominique GUEGAN, Bertrand HASSANI
    2015
    This paper discusses the regulatory requirements (Basel Committee, ECB-SSM and EBA) to measure financial institutions' major risks, for instance market, credit and operational risk, regarding the choice of the risk measures, the choice of the distributions used to model them and the level of confidence. We highlight and illustrate the paradoxes and issues observed when implementing one approach over another, and the inconsistencies between the suggested methodologies and the goal to achieve. The paper makes some recommendations to the supervisor and proposes alternative procedures to measure the risks.
  • The Spectral Stress VaR (SSVaR).

    Dominique GUEGAN, Bertrand k. HASSANI, Kehan LI
    2015
    One of the key lessons of the crisis which began in 2007 has been the need to strengthen the risk coverage of the capital framework. In response, the Basel Committee in July 2009 completed a number of critical reforms to the Basel II framework which will raise capital requirements for the trading book and complex securitisation exposures, a major source of losses for many internationally active banks. One of the reforms is to introduce a stressed value-at-risk (VaR) capital requirement based on a continuous 12-month period of significant financial stress (Basel III, 2011 [1]). However, the Basel framework does not specify a model to calculate the stressed VaR and leaves it up to the banks to develop an appropriate internal model to capture the material risks they face. Consequently, we propose a forward stress risk measure, the "spectral stress VaR" (SSVaR), as an implementation model of stressed VaR, by exploiting the asymptotic normality of the distribution of the estimator of VaR_p. In particular, to allow the SSVaR to incorporate tail structure information, we perform a spectral analysis to build it. Using a data set composed of operational risk factors, we fit a panel of distributions to construct the SSVaR in order to stress it. Additionally, we show how the SSVaR can serve as an indicator of the robustness of the bank's internal model.
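The asymptotic normality the SSVaR construction exploits is the textbook result for empirical quantiles: sqrt(n)(q_hat - q_p) converges to N(0, p(1-p)/f(q_p)^2). A hedged sketch of a confidence band built from that result, using a Gaussian fit as an illustrative stand-in for the unknown density f (the paper's construction is richer than this):

```python
import numpy as np
from statistics import NormalDist

def var_confidence_band(sample, p, alpha=0.05):
    """Asymptotic normal confidence band around the empirical VaR at level p.

    Relies on sqrt(n)(q_hat - q_p) -> N(0, p(1-p)/f(q_p)^2); the density f
    at the quantile is estimated by a Gaussian fit, an illustrative shortcut.
    """
    n = len(sample)
    q_hat = float(np.quantile(sample, p))
    fit = NormalDist(float(np.mean(sample)), float(np.std(sample)))
    f_q = fit.pdf(q_hat)                    # density estimate at the quantile
    se = (p * (1 - p) / n) ** 0.5 / f_q     # asymptotic standard error
    z = NormalDist().inv_cdf(1 - alpha / 2)
    return q_hat - z * se, q_hat + z * se

rng = np.random.default_rng(0)
lo, hi = var_confidence_band(rng.normal(size=10_000), p=0.99)
```

Stressing the upper edge of such a band, rather than the point estimate, is the spirit in which a stressed VaR exceeds the plain empirical VaR.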
  • A Rank-based Approach to Cross-Sectional Analysis.

    Dominique GUEGAN, Monica BILLIO, Ludovic CALES
    European Journal of Operational Research | 2015
    Sharpe-like ratios have traditionally been used to measure the performances of portfolio managers. However, they are known to suffer major drawbacks. Among them, two are intricate: (1) they are relative to a peer's performance and (2) the best score is generally assumed to correspond to a "good" portfolio allocation, with no guarantee on the goodness of this allocation. Last but not least, (3) these measures suffer significant estimation errors, leading to the inability to distinguish two managers' performances. In this paper, we propose a cross-sectional measure of portfolio performance dealing with these three issues. First, we define the score of a portfolio over a single period as the percentage of investable portfolios outperformed by this portfolio. This score quantifies the goodness of the allocation, remedying drawbacks (1) and (2). The new information brought by the cross-sectionality of this score is then discussed through applications. Secondly, we build a performance index, as the average cross-sectional score over successive periods, whose estimation partially answers drawback (3). In order to assess its informativeness, and using empirical data, we compare its forecasts with those of the Sharpe and Sortino ratios. The results show that our measure is the most robust and informative, validating the utility of such a cross-sectional performance measure.
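The single-period score defined above, the percentage of investable portfolios outperformed, can be sketched by brute force; the coarse long-only weight grid below is an illustrative stand-in for the paper's precisely defined investable set:

```python
import itertools
import numpy as np

def cross_sectional_score(weights, asset_returns, grid_step=0.1):
    """Fraction of investable portfolios outperformed by `weights` in one period.

    The investable set is approximated by all long-only portfolios on a coarse
    weight grid (an illustrative assumption, not the paper's construction).
    """
    n = len(asset_returns)
    steps = int(round(1 / grid_step))
    # every non-negative integer composition of `steps` gives a grid portfolio
    grid = [np.array(w) * grid_step
            for w in itertools.product(range(steps + 1), repeat=n)
            if sum(w) == steps]
    my_return = float(np.dot(weights, asset_returns))
    beaten = sum(1 for w in grid if float(np.dot(w, asset_returns)) < my_return)
    return beaten / len(grid)

r = np.array([0.04, 0.01, -0.02])
score = cross_sectional_score([0.5, 0.3, 0.2], r)
```

Unlike a Sharpe ratio, this score is an absolute statement about the allocation itself: putting everything in the best-performing asset scores near 1 regardless of how peers did.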
  • A Time Series Approach to Option Pricing.

    Christophe CHORRO, Dominique GUEGAN, Florian IELPO
    2015
    No summary available.
  • Which is the best model for the US inflation rate: a structural changes model or a long memory process.

    Dominique GUEGAN, Lanouar CHARFEDDINE
    Journal of Applied Econometrics | 2015
    This paper analyzes the dynamics of the US inflation series using two classes of models: structural change models and long memory processes. For the first class, we use the Markov switching (MS-AR) model of Hamilton (1989) and the structural change (SCH-AR) model with the sequential method proposed by Bai and Perron (1998, 2003). For the second class, we use the ARFIMA process developed by Granger and Joyeux (1980). Moreover, we investigate whether the observed long memory behavior is genuine or a spurious behavior created by the presence of breaks in the time series. Our empirical results provide evidence of changes in mean; the break dates coincide exactly with some economic and financial events, such as the Vietnam War and the two oil price shocks. Moreover, we show that the observed long memory behavior is spurious and is due to the presence of breaks in the data set.
  • Testing for Leverage Effect in Financial Returns.

    Christophe CHORRO, Dominique GUEGAN, Florian IELPO, Hanjarivo LALAHARISON
    SSRN Electronic Journal | 2014
    No summary available.
  • Emerging Countries Sovereign Rating Adjustment using Market Information: Impact on Financial Institutions’ Investment Decisions.

    Dominique GUEGAN, Bertrand k. HASSANI, Xin ZHAO
    Emerging Markets and the Global Economy | 2014
    No summary available.
  • The Time Series Toolbox for Financial Returns.

    Christophe CHORRO, Dominique GUEGAN, Florian IELPO
    A Time Series Approach to Option Pricing | 2014
    No summary available.
  • Probability density of the empirical wavelet coefficients of a noisy chaos.

    Matthieu GARCIN, Dominique GUEGAN
    Physica D: Nonlinear Phenomena | 2014
    We are interested in the random empirical wavelet coefficients of a noisy signal when this signal is a unidimensional or multidimensional chaos. More precisely we provide an expression of the conditional probability density of such coefficients, given a discrete observation grid. The noise is assumed to be described by a symmetric alpha-stable random variable. If the noise is a dynamic noise, then we present the exact expression of the probability density of each wavelet coefficient of the noisy signal. If we face a measurement noise, then the noise has a non-linear influence and we propose two approximations. The first one relies on a Taylor expansion whereas the second one, relying on an Edgeworth expansion, improves the first general Taylor approximation if the cumulants of the noise are defined. We give some illustrations of these theoretical results for the logistic map, the tent map and a multidimensional chaos, the Hénon map, disrupted by a Gaussian or a Cauchy noise.
  • The univariate MT-STAR model and a new linearity and unit root test procedure.

    Peter martey ADDO, Monica BILLIO, Dominique GUEGAN
    Computational Statistics & Data Analysis | 2014
    A novel procedure to test for linearity and unit root in a nonlinear framework is proposed by introducing a new model, the MT-STAR model, which has properties similar to those of the ESTAR model but reduces the effects of the identification problem and can also account for asymmetry in the adjustment mechanism towards equilibrium. The asymptotic distribution of the proposed unit root test is non-standard and is derived. The power of the test is evaluated through a simulation study, and some empirical illustrations on real exchange rates show its accuracy.
  • From Time Series of Returns to Option Prices: The Stochastic Discount Factor Approach.

    Christophe CHORRO, Dominique GUEGAN, Florian IELPO
    A Time Series Approach to Option Pricing | 2014
    No summary available.
  • Distortion Risk Measure or the Transformation of Unimodal Distributions into Multimodal Functions.

    Dominique GUEGAN, Bertrand HASSANI
    International Series in Operations Research & Management Science | 2014
    No summary available.
  • Stress Testing Engineering: The Real Risk Measurement?

    Dominique GUEGAN, Bertrand k. HASSANI
    International Series in Operations Research & Management Science | 2014
    No summary available.
  • Nonlinear Dynamics and Wavelets for Business Cycle Analysis.

    Peter martey ADDO, Monica BILLIO, Dominique GUEGAN
    Wavelet Applications in Economics and Finance | 2014
    We provide a signal modality analysis to characterize and detect nonlinearity schemes in the US Industrial Production Index time series. The analysis is achieved by using the recently proposed “delay vector variance” (DVV) method, which examines local predictability of a signal in the phase space to detect the presence of determinism and nonlinearity in a time series. Optimal embedding parameters used in the DVV analysis are obtained via a differential entropy based method using Fourier and wavelet-based surrogates. A complex Morlet wavelet is employed to detect and characterize the US business cycle. A comprehensive analysis of the feasibility of this approach is provided. Our results coincide with the business cycles peaks and troughs dates published by the National Bureau of Economic Research (NBER).
  • Distortion Risk Measures or the Transformation of Unimodal Distributions into Multimodal Functions.

    Dominique GUEGAN, Bertrand HASSANI
    2014
    The particular subject of this paper is to construct a general framework that can consider and analyse upside and downside risks at the same time. The paper offers a comparative analysis of risk measure concepts, focusing on quantile-based risk measures (ES and VaR), spectral risk measures and distortion risk measures. After introducing each measure, we investigate their interests and limits. Knowing that quantile-based risk measures cannot correctly capture the risk aversion of a risk manager, and that spectral risk measures can be inconsistent with risk aversion, we propose and develop a new distortion risk measure extending the work of Wang (2000) [38] and Sereda et al. (2010) [34]. Finally, we provide a comprehensive analysis of the feasibility of this approach using the S&P 500 data set from 01/01/1999 to 31/12/2011.
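The Wang (2000) transform that the paper extends distorts tail probabilities with g(u) = Phi(Phi^-1(u) + lambda); a small sketch of the resulting distortion risk measure on an empirical loss sample (the Choquet-sum discretisation below is an illustrative approximation, not the paper's new measure):

```python
import numpy as np
from statistics import NormalDist

_nd = NormalDist()

def wang_distortion(u, lam):
    # Wang (2000) transform g(u) = Phi(Phi^{-1}(u) + lambda), applied pointwise
    out = np.empty_like(u)
    for i, ui in enumerate(u):
        if ui <= 0.0:
            out[i] = 0.0
        elif ui >= 1.0:
            out[i] = 1.0
        else:
            out[i] = _nd.cdf(_nd.inv_cdf(ui) + lam)
    return out

def distorted_risk(losses, lam):
    """Distortion risk measure of an empirical loss sample (Choquet sum).

    Sorts losses in decreasing order and weights them by increments of the
    distorted probabilities; lam = 0 recovers the plain mean loss.
    """
    x = np.sort(np.asarray(losses, dtype=float))[::-1]   # largest loss first
    n = x.size
    w = np.diff(wang_distortion(np.arange(n + 1) / n, lam))
    return float(np.dot(x, w))
```

With lambda > 0 the transform is steepest near u = 0, so the largest losses receive inflated weights: exactly the risk-averse loading a plain quantile cannot express.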
  • Turning point chronology for the euro area: A distance plot approach.

    Peter martey ADDO, Monica BILLIO, Dominique GUEGAN
    OECD Journal: Journal of Business Cycle Measurement and Analysis | 2014
    We propose a transparent way of establishing a turning point chronology for the euro area business cycle. Our analysis is achieved by exploiting the concept of recurrence plots, in particular distance plots, to characterise and detect turning points of the business cycle. Firstly, we apply the concept of recurrence plots to the US Industrial Production Index (IPI) series. This serves as a benchmark for our analysis since it already contains a reference chronology for the US business cycle, as provided by the Dating Committee of the National Bureau of Economic Research (NBER). We then use this concept to construct a turning point chronology for the euro area business cycle. In particular, we show that this approach detects turning points and helps with the study of the business cycle without a priori assumptions on the statistical properties of the underlying economic indicator.
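A distance plot of the kind used above is just the matrix of distances between delay-embedded points of the series; a minimal sketch (the embedding parameters are illustrative defaults, not the paper's choices):

```python
import numpy as np

def distance_plot(series, m=2, tau=1):
    """Unthresholded recurrence (distance) plot of a scalar series.

    Embeds the series in dimension m with delay tau and returns the matrix
    of Euclidean distances between delay vectors; structural changes in the
    dynamics show up as visible bands or blocks in this matrix.
    """
    x = np.asarray(series, dtype=float)
    n = x.size - (m - 1) * tau
    emb = np.column_stack([x[i * tau: i * tau + n] for i in range(m)])
    diff = emb[:, None, :] - emb[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

D = distance_plot(np.sin(np.linspace(0.0, 10.0, 50)))
```

Thresholding D at some radius would give a classical recurrence plot; working with the raw distances keeps the method free of that tuning parameter, which fits the paper's transparency argument.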
  • Modern approaches for nonlinear data analysis of economic and financial time series.

    Peter martey ADDO, Dominique GUEGAN, Monica BILLIO, Philippe de PERETTI, Dominique GUEGAN, Monica BILLIO, Michael ROCKINGER, Massimiliano CAPORIN
    2014
    The main focus of the thesis is on modern nonlinear approaches to the analysis of economic and financial data, with a particular emphasis on business cycles and financial crises. A consensus in the statistical and financial literature has developed around the fact that economic variables behave non-linearly during different phases of the business cycle. As such, nonlinear approaches/models are required to capture the characteristics of the inherently asymmetric data generation mechanism, which linear models are unable to reproduce. In this regard, the thesis proposes a new interdisciplinary and open approach to the analysis of economic and financial systems. The thesis presents approaches robust to extreme values and non-stationarity, applicable to both small and large samples, for both economic and financial time series. The thesis provides step-by-step procedures for the analysis of economic and financial indicators by integrating concepts based on the surrogate data method, wavelets, phase space embedding, the delay vector variance (DVV) method and recurrence plots. The thesis also highlights transparent methods for identifying and dating turning points and assessing the impacts of economic and financial crises. In particular, it provides a procedure for anticipating future crises and their consequences. The study shows that integrating these techniques when learning the structure and interactions within and between economic and financial variables will be very useful in the development of crisis policies, as it facilitates the choice of appropriate treatment methods suggested by the data. In addition, a new procedure for testing linearity and unit root in a nonlinear framework is proposed by introducing a new model, the MT-STAR model, which has properties similar to those of the ESTAR model but reduces the effects of identification problems and can also represent asymmetry in the adjustment mechanism towards equilibrium. The asymptotic distributions of the proposed unit root test are non-standard and are derived. The power of the test is evaluated by simulation, and some empirical illustrations on real exchange rates show its effectiveness. Finally, the thesis develops multivariate self-exciting threshold autoregressive models with exogenous variables (MSETARX) and presents a parametric estimation method. The modeling of MSETARX models and the problems generated by their estimation are briefly discussed.
  • Stress Testing Engineering: the real risk measurement?

    Dominique GUEGAN, Bertrand HASSANI
    2014
    Stress testing is used to determine the stability or the resilience of a given financial institution by deliberately subjecting it to intense, adverse scenarios. In this paper, we focus on what may lead a bank to fail and how its resilience can be measured. Two families of triggers are analysed: the first lies in the impact of external (and/or extreme) events; the second lies in the impact of choosing inadequate models for prediction or risk measurement, more precisely models that become inadequate over time because they are not flexible enough to adapt to dynamical changes.
  • Empirical Performances of Discrete Time Series Models.

    Christophe CHORRO, Dominique GUEGAN, Florian IELPO
    A Time Series Approach to Option Pricing | 2014
    No summary available.
  • Introduction.

    Christophe CHORRO, Dominique GUEGAN, Florian IELPO
    A Time Series Approach to Option Pricing | 2014
    No summary available.
  • Turning point chronology for the euro area.

    Peter martey ADDO, Monica BILLIO, Dominique GUEGAN
    OECD Journal: Journal of Business Cycle Measurement and Analysis | 2014
    We propose a transparent way of establishing a turning point chronology for the euro area business cycle. Our analysis is achieved by exploiting the concept of recurrence plots, in particular distance plots, to characterise and detect turning points of the business cycle. Firstly, we apply the concept of recurrence plots to the US Industrial Production Index (IPI) series. This serves as a benchmark for our analysis since it already contains a reference chronology for the US business cycle, as provided by the Dating Committee of the National Bureau of Economic Research (NBER). We then use this concept to construct a turning point chronology for the euro area business cycle. In particular, we show that this approach detects turning points and helps with the study of the business cycle without a priori assumptions on the statistical properties of the underlying economic indicator.
  • Nonlinear dynamics and recurrence plots for detecting financial crisis.

    Peter martey ADDO, Monica BILLIO, Dominique GUEGAN
    2013
    No summary available.
  • An omnibus test to detect time-heterogeneity in time series.

    Dominique GUEGAN, Philippe DE PERETTI
    Computational Statistics | 2013
    This paper focuses on a procedure to test for structural changes in the first two moments of a time series, when no information about the process driving the breaks is available. We model the series as a finite-order auto-regressive process plus an orthogonal Bernstein polynomial to capture heterogeneity. Testing for the null of time-invariance is then achieved by testing the order of the polynomial, using either an information criterion, or a restriction test. The procedure is an omnibus test in the sense that it covers both the pure discrete structural changes and some continuous changes models. To some extent, our paper can be seen as an extension of Heracleous et al. (Econom Rev 27:363-384, 2008).
  • Nonlinear dynamics and recurrence plots for detecting financial crisis.

    Peter martey ADDO, Monica BILLIO, Dominique GUEGAN
    The North American Journal of Economics and Finance | 2013
    Identification of financial bubbles and crises is a topic of major concern since it is important to prevent collapses that can severely impact nations and economies. Our analysis deals with the use of the recently proposed 'delay vector variance' (DVV) method, which examines the local predictability of a signal in the phase space to detect the presence of determinism and nonlinearity in a time series. Optimal embedding parameters used in the DVV analysis are obtained via a differential entropy-based method using wavelet-based surrogates. We exploit the concept of recurrence plots to study the stock market to locate hidden patterns and non-stationarity, and to examine the nature of these plots in events of financial crisis. In particular, the recurrence plots are employed to detect and characterize financial cycles. A comprehensive analysis of the feasibility of this approach is provided. We show that our methodology is useful in the diagnosis and detection of financial bubbles, which have significantly impacted economic upheavals in the past few decades.
  • Evaluation of Regime Switching Models for Real-Time Business Cycle Analysis of the Euro Area.

    Monica BILLIO, Laurent FERRARA, Dominique GUEGAN, Gian luigi MAZZI
    Journal of Forecasting | 2013
    In this paper, we aim at assessing Markov switching and threshold models in their ability to identify turning points of economic cycles. By using vintage data updated on a monthly basis, we compare their ability to date ex post the occurrence of turning points, evaluate the stability over time of the signal emitted by the models and assess their ability to detect in real-time recession signals. We show that the competitive use of these models provides a more robust analysis and detection of turning points. To perform the complete analysis, we have built a historical vintage database for the euro area going back to 1970 for two monthly macroeconomic variables of major importance for short-term economic outlook, namely the industrial production index and the unemployment rate. Copyright © 2013 John Wiley & Sons, Ltd.
  • Multivariate VaRs for operational risk capital computation: a vine structure approach.

    Dominique GUEGAN, Bertrand k. HASSANI
    International Journal of Risk Assessment and Management | 2013
    The Basel Advanced Measurement Approach requires financial institutions to compute capital requirements on internal data sets. In this paper we introduce a new methodology permitting capital requirements to be linked with operational risks. The data are arranged in a matrix of 56 cells. Constructing a vine architecture, which is a bivariate decomposition of a n-dimensional structure (n > 2), we present a novel approach to compute multivariate operational risk VaRs. We discuss multivariate results regarding the impact of the dependence structure on the one hand, and of LDF modeling on the other. Our method is simple to carry out, easy to interpret and complies with the new Basel Committee requirements.
  • Probability density of the wavelet coefficients of a noisy chaos.

    Matthieu GARCIN, Dominique GUEGAN
    2013
    We are interested in the random wavelet coefficients of a noisy signal when this signal is the unidimensional or multidimensional attractor of a chaos. More precisely we give an expression for the probability density of such coefficients. If the noise is a dynamic noise, then our expression is exact. If we face a measurement noise, then we propose two approximations using Taylor expansion or Edgeworth expansion. We give some illustrations of these theoretical results for the logistic map, the tent map and the Hénon map, perturbed by a Gaussian or a Cauchy noise.
  • Non-stationary sample and meta-distribution.

    Dominique GUEGAN
    ISI Platinum Jubilee volume: statistical science and interdisciplinary research (International Conference of Statistical Paradigms - Recent Advances and Reconciliations) | 2013
    No summary available.
  • Empirical projected copula process and conditional independence : an extended version.

    Lorenzo FRATTAROLO, Dominique GUEGAN
    2013
    No summary available.
  • Understanding Exchange Rates Dynamics.

    Peter martey ADDO, Monica BILLIO, Dominique GUEGAN
    2013
    With the emergence of chaos theory and the surrogate data method, nonlinear approaches employed in analysing time series typically suffer from high computational complexity and a lack of straightforward interpretation. Methods capable of characterizing time series in terms of their linear, nonlinear, deterministic and stochastic nature are therefore preferable. In this paper, we provide a signal modality analysis on a variety of exchange rates. The analysis is achieved by using the recently proposed "delay vector variance" (DVV) method, which examines the local predictability of a signal in the phase space to detect the presence of determinism and nonlinearity in a time series. Optimal embedding parameters used in the DVV analysis are obtained via a differential entropy-based method using wavelet-based surrogates. A comprehensive analysis of the feasibility of this approach is provided. The empirical results show that the DVV method can be adopted as an alternative way of understanding exchange rate dynamics.
  • Probability density of the wavelet coefficients of a noisy chaos.

    Matthieu GARCIN, Dominique GUEGAN
    2013
    No summary available.
  • Option pricing with discrete time jump processes.

    Dominique GUEGAN, Florian IELPO, Hanjarivo LALAHARISON
    Journal of Economic Dynamics and Control | 2013
    In this paper we propose new option pricing models based on a class of jump models belonging to the Lévy-type family (NIG-Lévy, Schoutens, 2003; Merton-jump, Merton, 1976; and the Duan-based model, Duan et al., 2007). By combining these different classes of models with several volatility dynamics of the GARCH type, we aim at taking into account the dynamics of financial returns in a realistic way. The associated risk-neutral dynamics of the time series models are obtained through two different specifications for the pricing kernel: we provide a characterization of the change in the probability measure using the Esscher transform and the minimal entropy martingale measure. We finally assess empirically the performance of this modelling approach, using a dataset of European options based on the S&P 500 and on the CAC 40 indices. Our results show that models involving jumps and a time-varying volatility provide realistic pricing and hedging results for options with different times to maturity and moneyness. These results are supportive of the idea that a realistic time series model can provide realistic option prices, making the approach developed here interesting for pricing options when option markets are illiquid or when such markets simply do not exist.
  • Turning point chronology for the Euro-Zone : a distance plot approach.

    Peter martey ADDO, Monica BILLIO, Dominique GUEGAN
    2013
    No summary available.
  • Comparing variable selection techniques for linear regression : LASSO and Autometrics.

    Camila EPPRECHT, Dominique GUEGAN, Alvaro VEIGA
    2013
    No summary available.
  • Emerging countries sovereign rating adjustment using market information : impact on financial institutions investment decisions.

    Dominique GUEGAN, Bertrand kian HASSANI, Xin ZHAO
    2013
    No summary available.
  • Alternative modeling for long term risk.

    Dominique GUEGAN, Xin ZHAO
    Quantitative Finance | 2013
    In this paper, we propose an alternative approach to estimate long-term risk. Instead of using the static square root of time method, we use a dynamic approach based on volatility forecasting by non-linear models. We explore the possibility of improving the estimations using different models and distributions. By comparing the estimations of two risk measures, value at risk and expected shortfall, with different models and innovations at short-, medium- and long-term horizons, we find that the best model varies with the forecasting horizon and that the generalized Pareto distribution gives the most conservative estimations with all the models at all the horizons. The empirical results show that the square root method underestimates risk at long horizons and that our approach is more competitive for risk estimation over a long term.
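The static square-root-of-time rule that the paper takes as its benchmark can be written in a few lines; this sketch uses a plain historical one-day VaR (an illustrative choice of estimator):

```python
import numpy as np

def sqrt_of_time_var(returns, p=0.99, horizon=10):
    """Static square-root-of-time scaling of a one-day historical VaR.

    This is the baseline rule the paper criticises: the one-day empirical
    VaR at level p is simply multiplied by sqrt(horizon), which implicitly
    assumes i.i.d. returns with no volatility dynamics.
    """
    var_1d = -np.quantile(returns, 1 - p)   # one-day VaR as a positive loss
    return float(var_1d * np.sqrt(horizon))

rng = np.random.default_rng(1)
var_10d = sqrt_of_time_var(rng.normal(size=10_000))
```

The paper's point is that replacing this constant scaling with a volatility forecast from a non-linear model changes long-horizon estimates materially, typically upward.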
  • Understanding exchange rates dynamics.

    Peter martey ADDO, Monica BILLIO, Dominique GUEGAN
    2013
    No summary available.
  • Nonlinear Dynamics and Recurrence Plots for Detecting Financial Crisis.

    Peter martey ADDO, Monica BILLIO, Dominique GUEGAN
    2013
    Identification of financial bubbles and crises is a topic of major concern since it is important to prevent collapses that can severely impact nations and economies. Our analysis deals with the use of the recently proposed "delay vector variance" (DVV) method, which examines the local predictability of a signal in the phase space to detect the presence of determinism and nonlinearity in a time series. Optimal embedding parameters used in the DVV analysis are obtained via a differential entropy-based method using wavelet-based surrogates. We exploit the concept of recurrence plots to study the stock market to locate hidden patterns and non-stationarity, and to examine the nature of these plots in events of financial crisis. In particular, the recurrence plots are employed to detect and characterize financial cycles. A comprehensive analysis of the feasibility of this approach is provided. We show that our methodology is useful in the diagnosis and detection of financial bubbles, which have significantly impacted economic upheavals in the past few decades.
  • Lévy processes and their applications in finance: analysis, methodology and estimation.

    Hanjarivo LALAHARISON, Dominique GUEGAN, Christophe CHORRO, Dominique GUEGAN, Florian IELPO, Juan pablo ORTEGA, Giovanni BARONE ADESI
    2013
    Lévy processes and their applications in finance.
  • Turning point chronology for the Euro-Zone: A Distance Plot Approach.

    Peter martey ADDO, Monica BILLIO, Dominique GUEGAN
    2013
    We propose a transparent way of establishing a turning point chronology for the Euro-zone business cycle. Our analysis is achieved by exploiting the concept of recurrence plots, in this case distance plots, to characterize and detect turning points in the business cycle of any economic system. Firstly, we apply the concept of recurrence plots to the US Industrial Production Index (IPI) series, which serves as a benchmark for our analysis since a reference chronology for the US business cycle already exists, provided by the Dating Committee of the National Bureau of Economic Research (NBER). We then use this concept to construct a turning point chronology for the Euro-zone business cycle. In particular, we show that this approach permits the detection of turning points and the study of the business cycle without a priori assumptions on the statistical properties of the underlying economic indicator.
  • Using a time series approach to correct serial correlation in operational risk capital calculation.

    Dominique GUEGAN, Bertrand HASSANI
    2013
    The advanced measurement approach requires financial institutions to develop internal models to evaluate regulatory capital. Traditionally, the loss distribution approach (LDA) is used, mixing frequencies and severities to build a loss distribution function (LDF). This distribution represents annual losses; consequently, the 99.9th percentile of the distribution providing the capital charge denotes the worst year in a thousand. The traditional approach approved by the regulator and implemented by financial institutions assumes the losses are independent. This paper proposes a solution to address the issues arising when autocorrelations are detected between the losses, by using time series. Thus, the losses are aggregated periodically and several models are adjusted using autoregressive models, autoregressive fractionally integrated processes and Gegenbauer processes, considering various distributions fitted on the residuals. Monte Carlo simulation enables the construction of the LDF and the computation of the relevant risk measures. These dynamic approaches are compared with static traditional methodologies in order to show their impact on the capital charges, using several data sets. The construction of the related LDFs and the computation of the capital charges permit compliance with the regulation. Besides, capturing autocorrelation phenomena and large losses simultaneously, by fitting adequate distributions on the residuals, provides an alternative to the arbitrary selection of the LDA.
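The static LDA baseline described above, frequencies mixed with severities and read off at the 99.9th percentile, can be sketched with a standard Monte Carlo compound-Poisson simulation (the Poisson/lognormal choice and all parameter values are illustrative, not calibrated to any data set):

```python
import numpy as np

def lda_capital(lam, mu, sigma, n_years=50_000, q=0.999, seed=0):
    """Monte Carlo LDA capital charge: Poisson frequency, lognormal severity.

    Each simulated year draws N ~ Poisson(lam) losses ~ LogNormal(mu, sigma);
    the capital charge is the q-th percentile of annual aggregate losses,
    i.e. the 'worst year in a thousand' at q = 0.999.
    """
    rng = np.random.default_rng(seed)
    annual = np.array([rng.lognormal(mu, sigma, size=n).sum()
                       for n in rng.poisson(lam, size=n_years)])
    return float(np.quantile(annual, q))

capital = lda_capital(5, 0.0, 1.0, n_years=20_000)
```

The paper's dynamic variants replace the independent-annual-losses assumption with time series models on periodically aggregated losses before the same percentile is computed.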
  • Emerging Countries Sovereign Rating Adjustment using Market Information: Impact on Financial Institutions Investment Decisions.

    Dominique GUEGAN, Bertrand HASSANI, Xin ZHAO
    2013
    Assets, debts and other financial products issued by emerging countries are usually considered more speculative than those issued by developed economies. Relying on traditional rating agencies to invest in these countries is therefore problematic, as the information used to assess economic and market conditions in these economies is quickly outdated. Consequently, both investment opportunities and the necessity to clear particular positions may be missed, resulting respectively in potentially significant opportunity costs or losses. An approach taking into account the latest information displayed by financial markets may therefore enable us to bypass these traditional limits. As a result, this chapter proposes a creditworthiness evaluation methodology based on adjusting ratings obtained from macroeconomic fundamentals (GDP growth rate, inflation, external debt, etc.) with financial market movements (bonds, equity volume, volatility, etc.). In a first step, a general panel model is applied to country-specific information to generate fundamental ratings. In a second step, market-implied ratings are obtained by applying a multi-factor model to market indicators and breaking down long-term sovereign bond yields into different risk premia. Finally, the rating to be considered for investment purposes (denoted "δ-Rating") is a combination of fundamental ratings and market-implied ratings obtained by carrying out an adapted Bühlmann-Straub method (credibility theory). The "δ-Ratings" of emerging countries are then compared to those obtained for developed countries and discussed against financial institutions' risk appetite.
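    The final blending step can be sketched as a credibility-weighted average. This is a simplified sketch of the Bühlmann flavour of the idea (the actual Bühlmann-Straub estimator additionally weights by exposure); the numeric rating scale and the names below are assumed for illustration:

```python
def credibility_weight(n, k):
    """Classical Buhlmann credibility weight z = n / (n + k): n observations
    against k, the ratio of expected process variance to the variance of
    hypothetical means. z -> 1 as market information accumulates."""
    return n / (n + k)

def delta_rating(fundamental, market, z):
    """Credibility-style blend: z * market-implied rating plus
    (1 - z) * fundamental rating, with z in [0, 1]."""
    if not 0.0 <= z <= 1.0:
        raise ValueError("credibility weight z must lie in [0, 1]")
    return z * market + (1.0 - z) * fundamental
```

With z = 0 the blend falls back on fundamentals alone; with z = 1 it trusts the market-implied rating entirely.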
  • Empirical Projected Copula Process and Conditional Independence: An Extended Version.

    Lorenzo FRATTAROLO, Dominique GUEGAN
    2013
    Conditional dependence is expressed as a projection map in the trivariate copula space. The projected copula, its sample counterpart and the related process are defined. The weak convergence of the projected copula process to a tight centered Gaussian Process is obtained under weak assumptions on copula derivatives.
  • Using a time series approach to correct serial correlation in operational risk capital calculation.

    Dominique GUEGAN, Bertrand HASSANI
    Journal of Operational Risk | 2013
    No summary available.
  • Evaluation of regime switching models for real-time business cycle analysis of the euro area.

    Laurent FERRARA, Monica BILLIO, Dominique GUEGAN, Gian luigi MAZZI
    Journal of Forecasting | 2013
    No summary available.
  • Quantification of operational risks: efficient capital calculation methods based on internal data.

    Bertrand kian HASSANI, Dominique GUEGAN
    2011
    Operational risks are risks of loss resulting from the inadequacy or failure of the institution's procedures (missing or incomplete analysis or control, unsecured procedures), its personnel (error, malice and fraud), its internal systems (computer breakdown, etc.) or from external events (flood, fire, etc.). They have attracted the attention of the authorities following the losses and bankruptcies they have generated. To model them, the regulator requires institutions to use internal data, external data, scenarios and certain qualitative criteria. Using internal data, within the framework of the Loss Distribution Approach, we propose several innovative methods to estimate the provisions inherent to these risks. The first innovation, dealing with convolution methods, suggests mixing Monte Carlo simulations, kernel density estimation and Panjer's algorithm to construct the loss distribution. The second solution focuses on modeling the right tail of severity distributions using several results from extreme value theory and parameter estimation methods for truncated distributions. The third method focuses on multivariate VaR calculations: implementing clusters of copulas to capture particular behaviors such as tail dependence, we provide a new type of architecture to compute a global VaR.
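    Panjer's algorithm, one of the convolution tools mentioned above, can be sketched for the textbook compound-Poisson case on a discrete severity grid (the function name and the grid are illustrative):

```python
import math

def panjer_poisson(lam, f, s_max):
    """Panjer recursion for compound Poisson aggregate losses on a discrete
    grid: f[j] = P(severity = j) for j = 0..len(f)-1. Returns g, with
    g[s] = P(aggregate loss = s), via
    g[0] = exp(-lam * (1 - f[0])),
    g[s] = (lam / s) * sum_{j=1}^{min(s, m)} j * f[j] * g[s - j]."""
    g = [0.0] * (s_max + 1)
    g[0] = math.exp(-lam * (1.0 - f[0]))
    for s in range(1, s_max + 1):
        g[s] = (lam / s) * sum(j * f[j] * g[s - j]
                               for j in range(1, min(s, len(f) - 1) + 1))
    return g
```

With a degenerate severity of 1, the recursion reproduces the Poisson probability mass function, which makes a convenient sanity check.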
  • Nonparametric methods: estimation, analysis and applications to business cycles.

    Patrick RAKOTOMAROLAHY, Dominique GUEGAN
    2011
    This thesis focuses on the study of the properties of the regression function estimated by non-parametric methods for dependent processes, and on the application of these methods to business cycle analysis. We summarize below the theoretical and empirical results obtained in this framework. The first theoretical result concerns the bias, variance, quadratic error and asymptotic normality of two non-parametric estimators: nearest neighbors and radial basis functions. The other theoretical result is the extension of encompassing tests to the case of dependent processes, allowing the comparison of different parametric and non-parametric methods; the asymptotic normality of the statistics associated with these tests is established. The empirical work consisted in applying these non-parametric methods to the forecasting of real economic activity from economic indicators and financial variables, in order to relax some assumptions considered very strong in the parametric approach. The value of non-parametric methods was demonstrated in forecasting gross domestic product (GDP) in the Eurozone. The role of financial variables in the choice of models and in the selection of variables is also reviewed.
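    The nearest-neighbour estimator, one of the two non-parametric methods studied, can be sketched as follows (names and data are illustrative; the radial basis function estimator would replace the plain average with kernel-weighted averaging):

```python
import numpy as np

def knn_forecast(X, y, x_new, k=3):
    """Nearest-neighbour regression: predict at x_new as the average of the
    targets of the k training points closest in Euclidean distance."""
    X = np.asarray(X, dtype=float)
    d = np.linalg.norm(X - np.asarray(x_new, dtype=float), axis=1)
    idx = np.argsort(d)[:k]
    return float(np.mean(np.asarray(y, dtype=float)[idx]))
```

In a nowcasting setting, each row of X would hold lagged indicator values and y the corresponding GDP growth figures; no functional form is imposed on the regression.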
  • Valuation of risk in non-life insurance.

    Mathieu GATUMEL, Dominique GUEGAN
    2009
    This thesis focuses on the valuation of risk in non-life insurance and is built on five main contributions. First, we provide an example of non-life insurance risk through the risk of default of reinsurers: a picture of the reinsurance sector, the ways to measure the exposure of insurers to this risk and the stakes in terms of reinsurance policy are presented. Then, three chapters are devoted to insurance-linked securities. Chapter 2 presents the data that we have been able to collect over the past three years, relating to both the primary and secondary markets. Chapter 3 highlights the market price of risk in this market. Chapter 4 deals with a dynamic analysis of the insurance-linked securities spread index. These three chapters allow us to identify the main factors that govern returns in this market: investor uncertainty, diversification effects and loss experience. These elements finally contribute, in Chapter 5, to presenting our philosophy on a relevant cost of capital for insurance companies. We analyze the traditional approaches used in the insurance industry and propose a new way to deal with them. It appears that the price of risk must depend on the risk basket to which it belongs, the level of risk considered and the time period.
  • Analysis of stationary and non-stationary long memory processes: estimates, applications and predictions.

    Zhiping LU, Dominique GUEGAN
    2009
    In this thesis, we consider two types of long memory processes: stationary and non-stationary. We focus on the study of their statistical properties, estimation methods, prediction methods and statistical tests. Stationary long memory processes have been widely studied in recent decades. It has been shown that long memory processes have self-similarity properties, which are important for parameter estimation. We review the self-similarity properties of long memory processes in continuous and discrete time. We propose two propositions showing that long memory processes are asymptotically second-order self-similar, while short memory processes are not. We then study the self-similarity of specific long memory processes such as k-factor GARMA and k-factor GIGARCH processes, as well as the self-similarity properties of heteroscedastic models and processes with jumps. We review methods for estimating the parameters of long memory processes, both parametric (e.g., maximum likelihood and pseudo-maximum likelihood estimation) and semiparametric (e.g., the GPH, Whittle and Robinson methods). The consistency and asymptotic normality of these estimators are also studied. Testing the fractional integration order at seasonal and non-seasonal unit-root frequencies of stationary long memory processes is very important for the modeling of economic and financial series. The Robinson (1994) test is widely used and applied to various well-known long memory models. Using Monte Carlo methods, we study and compare the performance of this test for several sample sizes. This work is important for practitioners who want to use the Robinson test. In practice, when dealing with financial and economic data, seasonality and time dependence can often be observed; thus a kind of non-stationarity exists in financial data.
In order to take such phenomena into account, we review non-stationary processes and propose a new class of stochastic processes: locally stationary k-factor Gegenbauer processes. We propose a procedure for estimating the parameter function using the discrete wavelet packet transform (DWPT). The robustness of the algorithm is studied by simulations. We also propose prediction methods for this new class of non-stationary long memory processes. We give applications to the error-correction term of the fractional cointegration analysis of the Nikkei Stock Average 225 index, and we study world crude oil prices.
  • Financial crises and contagion effects in emerging economies: characterization and measurement.

    Octavie MBIAKOUP KONGUEP, Dominique GUEGAN
    2008
    The 1990s were marked by numerous financial and monetary crises in emerging countries. These crises tended to be grouped chronologically and sometimes geographically, i.e. to spread between countries regardless of the macroeconomic fundamentals of the countries concerned. The South-East Asian crisis of 1997-1998 is the most striking illustration of what is known in the economic literature as "contagion", a notion that has given rise to much discussion and research. We are therefore interested in modeling contagion in six Asian financial markets. We want to test whether the crisis that started in Thailand in July 1997 was transmitted to other countries by way of contagion, through composite indices and trading volumes. To do so, we first construct a crisis indicator using the volumes traded on the market and the composite indices. Our results specify a threshold beyond which a crisis is imminent in these countries. We then use the transfer function model to show the existence of a dependence structure between the markets before and during the crisis. Using a statistical test, we examine whether the dependence structure between two markets changes during the crisis relative to the stable period.
  • The integration of information in the price of financial assets.

    Florian IELPO, Dominique GUEGAN
    2008
    The main topic of this thesis is the integration of macroeconomic and financial information by financial markets. Five contributions are presented here. The first three use recent advances in asset pricing econometrics, with the objective of measuring expectations and risk aversion, or simply of forecasting the price of derivatives. (1) First, we introduce a new econometric method to estimate the evolution of the subjective distribution from interest rate futures. (2) Then, using option and futures quotes on the European carbon market, we highlight the impact of the publication of emission allowances allocated by the European Commission on risk aversion in this new market. (3) Next, we present a new model of derivatives valuation based on returns following a generalized hyperbolic law under the historical measure; the model leads to low pricing errors compared to the existing literature. Finally, two topics related to the impact of macroeconomic news on the yield curve are presented: (4) first, we show that the perception of the impact of a surprise on the European yield market is greatly modified when the US influence is taken into account; (5) second, we quantify, in the US case, the widely held intuition that the term structure of the impact of news on the yield curve depends on economic and monetary conditions.
  • Valuation and management of credit derivatives: synthetic CDOs or the exponential growth of correlation products.

    Julien HOUDAIN, Dominique GUEGAN
    2006
    This thesis is based on the use of quantitative methods for the valuation and management of synthetic CDO structures. We illustrate the limitations of standard approaches and develop an innovative valuation method based on the use of the normal inverse Gaussian (NIG) distribution and historical correlation levels. We compare these different approaches and study their impact on tranche management. We then extend our research to CDO^2 tranches and develop two original methods for the valuation of these products. Finally, we study the opportunities offered by the management and hedging of standardized tranches of CDS indices, and illustrate more precisely two sources of risk in addition to credit risk: model risk and correlation risk.
  • Multivariate long-memory modeling: applications to EDF's producer issues in the context of the liberalization of the European electricity market.

    Abdou ka DIONGUE, Dominique GUEGAN
    2005
    Several financial market data sets, such as spot prices of interconnected European electricity markets, exhibit long memory, in the sense of a hyperbolic decay of autocorrelations, combined with heteroskedasticity and periodic or non-periodic cycles. To model such behaviors, we introduce k-factor GIGARCH processes and propose two parameter estimation methods, developing the asymptotic properties of the estimators of each method. Monte Carlo simulations are performed in order to compare the behavior of the estimators. In addition, we propose a multivariate generalized long-memory model (k-factor MVGARMA) to jointly model two interconnected European electricity markets, and give a practical procedure for parameter estimation. For forecasting, we provide analytical expressions for the least squares predictors of the proposed models and confidence intervals for the forecast errors. Finally, we apply these two models to the spot electricity prices of the French and German markets and compare their predictive capabilities.
  • Market risk: measurement and backtesting: dynamic copula approach.

    Cyril r. CAILLAULT, Dominique GUEGAN
    2005
    This thesis deals with the use of copula functions to measure market risk. In Chapter 1, we recall the main results related to copulas: definitions, Sklar's theorem, constructions, concordance measures, tail dependence and simulation algorithms. In Chapter 2, we propose a non-parametric estimation method based on the notion of tail dependence and compare it to the "omnibus" method. We show that the choice of the best copula can differ according to the method; the results allow us to show the existence of co-movements between the Asian Tigers. In Chapter 3, we develop dynamic methods to compute the Value at Risk and the Expected Shortfall. The choice of the risk measure is discussed in relation to the amendment of the Basel Accord. In Chapter 4, we introduce the notion of dynamic copula for the calculation of Value at Risk. Three statistical tests are proposed in order to validate this calculation method.
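    On simulated or historical losses, the risk-measure step discussed in Chapters 3 and 4 reduces to an empirical quantile and a tail mean; the sketch below shows only that step (names illustrative), not the dynamic copula machinery that generates the loss scenarios:

```python
import numpy as np

def var_es(losses, alpha=0.99):
    """Empirical Value at Risk and Expected Shortfall at level alpha:
    VaR is the alpha-quantile of the loss distribution, ES the mean loss
    beyond VaR. ES >= VaR always, and ES is a coherent risk measure
    whereas VaR need not be."""
    losses = np.sort(np.asarray(losses, dtype=float))
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

var95, es95 = var_es(range(1, 101), alpha=0.95)
```

The VaR-versus-ES discussion in Chapter 3 turns precisely on the tail-averaging step: two loss distributions with the same VaR can have very different ES.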
  • Markovian regime-switching models: regime detection, short memory or long memory and prediction.

    Stephanie RIOUBLANC, Dominique GUEGAN
    2005
    In this work, we are interested in the study of Markovian regime-switching models. We study the possible detection of several regimes in data simulated from Markovian regime-switching models using different empirical estimators. We also study the long or short memory behavior of these models and compare their prediction performances with those of long memory models.
  • International stock market indices and the crisis of new technologies: switching and DCC-MVGARCH approaches.

    Ryan SULEIMANN, Dominique GUEGAN
    2003
    Since the New Technologies stock market crisis in 2000 and the very large increase in the volatility of stock market assets compared to preceding years, the modeling of this volatility and of its contagion effect across the world's stock markets has generated much discussion and research. We are therefore interested in modeling the volatility of three technology indices (NASDAQ-100, IT.CAC and NEMAX) and five global indices (Dow Jones Industrial Average, Standard & Poor's 500, NASDAQ Composite, DAX and CAC40), in order to verify whether the investment risk, measured by Value at Risk (VaR), has changed as a result of the technology crisis, and to show that the technology crisis is, among all the stock market crises experienced, the one that has most affected stock markets worldwide. Our VaR calculation requires an accurate modeling of the volatility of the studied series and the identification of the presence of dynamic or non-dynamic conditional correlations. We use different models for the volatility of the indices under study, including several regime-switching models (SWARCH, SWGARCH and MSVECM) and the dynamic conditional correlation multivariate GARCH model (DCC-MVGARCH). We use the regime-switching and VAR models to show the existence of co-movement and contagion effects between the studied indices, and the DCC-MVGARCH model to show the effect of the technology crisis on the increase in stock market volatility and the presence of dynamic correlations linking the indices, as well as to calculate the VaR. Finally, we compare the VaR calculated with the DCC-MVGARCH model to the VaR calculated by the non-parametric copula method.
  • Long Memory Processes: Forecasts, Estimates and Extreme Values.

    Jerome COLLET, Dominique GUEGAN
    2003
    This work concerns the study of long memory (LM) processes. After recalling the concept of LM, we present, in Chapter 1, Gegenbauer-type LM processes. Their properties are recalled, as well as methods for the estimation and forecasting of series using these processes. In Chapter 2, we focus on non-Gaussian LM processes: a method to construct these processes is developed and used for the estimation and forecasting of time series. In Chapter 3, we study different problems occurring when modeling series with Gegenbauer processes, such as under-modeling. In Chapter 4, we study the extreme behavior of processes by applying the results of Chapter 2. After stating some properties concerning the extreme behavior of some independent and stationary processes, we give properties of the extremal index for these processes.
  • How to measure portfolio risk through the statistical study of non-linear processes.

    Sophie LADOUCETTE, Dominique GUEGAN
    2002
    No summary available.
  • Generalized long memory processes: estimation, prediction and applications.

    Laurent FERRARA, Dominique GUEGAN
    2000
    We are interested in a certain class of parametric time series models: generalized long memory processes, introduced in the statistical literature in the early 1990s. These generalized long memory processes take into account simultaneously in the modeling of the series, a long term dependence and a persistent periodic cyclic component. This type of phenomenon is frequent in many fields of application of statistics, such as economics, finance, environment or public transport. We have proposed a simultaneous pseudo-maximum likelihood estimation method for all the parameters of the model, and a semiparametric method for estimating the long memory parameters. We have shown the consistency of each of these estimators and we have given their limiting distribution, the results being validated by Monte Carlo simulations. We also compared the performance of each of these estimators on real data. For the prediction, we have provided the analytical expressions of the least squares predictor and its confidence interval. We have compared on real data the forecasting performance of generalized long memory processes and other short and long memory processes.
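    The moving-average weights of a one-factor Gegenbauer filter (1 - 2uB + B^2)^(-d) are the Gegenbauer polynomial coefficients, computable by the classical three-term recursion; a minimal sketch (the function name is assumed):

```python
def gegenbauer_weights(d, u, n):
    """Coefficients C_j^(d)(u) in the expansion
    (1 - 2 u z + z^2)^(-d) = sum_j C_j^(d)(u) z^j,
    via the classical three-term recursion:
    C_0 = 1, C_1 = 2 d u,
    C_j = 2u (j + d - 1)/j * C_{j-1} - (j + 2d - 2)/j * C_{j-2}."""
    c = [1.0, 2.0 * d * u]
    for j in range(2, n + 1):
        c.append(2.0 * u * (j + d - 1.0) / j * c[j - 1]
                 - (j + 2.0 * d - 2.0) / j * c[j - 2])
    return c[: n + 1]
```

With u = 1 the filter degenerates to the usual fractional difference (1 - B)^(-2d), a convenient sanity check; with |u| < 1 the weights oscillate, producing the persistent cyclical component described above.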
  • Chaotic time series applied to finance statistical and algorithmic problems.

    Ludovic MERCIER, Dominique GUEGAN
    1998
    Our contribution to the study of chaotic time series focuses on the following points. We propose a framework of study that takes into account the contributions of the various scientific fields dealing with this type of data: signal processing, dynamical systems, ergodic theory, finance and statistics. We clarify the notion of the global Lyapunov exponent with the help of several definitions, from the most formal to the most commonly used. We show why global Lyapunov exponents are useful for characterizing chaos but practically useless for forecasting. We then focus on local Lyapunov exponents. We show how each definition is related to the global Lyapunov exponents and we specify in which application framework each definition is relevant. We give a new result concerning the distribution of local Lyapunov exponents. We consider the non-parametric prediction methods that can be used in this context, detailing two that seem particularly well adapted to chaos: nearest neighbors and radial basis functions. The latter estimator is studied in more detail: we specify its properties and give an algorithm to implement it. We study the predictability of chaotic time series and show how the prediction horizon is related to the local Lyapunov exponents of the system. We propose a new theoretical approach to deal with the case of noisy chaos. We address the problem of choosing a sampling step for chaotic series from a continuous-time system, and give a new result allowing the choice of an optimal sampling step in the sense of the prediction horizon. We support these presentations with a set of simulations of known chaotic systems, specifying their algorithmic costs. We discuss the problems posed by the simulation of chaotic time series. Finally, we give two applications of the tools developed in the framework of intraday financial series. The first application is a direct illustration of these tools in the case of exchange rates. 
The second application makes prior use of time warping methods which are presented here in a new unified framework.
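    The link between Lyapunov exponents and the prediction horizon can be illustrated on the logistic map, whose global exponent at r = 4 is exactly ln 2; this is a toy illustration with assumed names, not the thesis's local-exponent estimator:

```python
import math

def lyapunov_logistic(r=4.0, x0=0.3, n=100_000, burn=1_000):
    """Global Lyapunov exponent of the logistic map x -> r x (1 - x),
    estimated as the orbit average of log|f'(x)| = log|r (1 - 2x)|.
    For r = 4 the exact value is ln 2."""
    x = x0
    for _ in range(burn):            # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        acc += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return acc / n

def prediction_horizon(lam, eps0, tol):
    """Rough horizon after which an initial error eps0, growing like
    eps0 * exp(lam * t), exceeds tol: t ~ ln(tol / eps0) / lam."""
    return math.log(tol / eps0) / lam
```

A positive exponent means a measurement error of 1e-6 reaches order one after only about ln(1e6)/ln 2, i.e. roughly twenty iterations, which is why global exponents characterize chaos but bound forecasting so severely.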
  • Study of asymptotically powerful parametric and non-parametric tests for bilinear autoregressive models.

    Joseph NGATCHOU WANDJI, Dominique GUEGAN
    1995
    The Lagrange multiplier test appears to be a good tool for testing diagonal bilinear models of order one. We use it to discriminate between linear autoregressive models of order one and some subdiagonal bilinear models of order two, for which we give a necessary and sufficient condition of invertibility. We prove the contiguity of the null hypothesis and of a sequence of local alternatives, which allows us, thanks to Le Cam's third lemma, to obtain an explicit expression of the local theoretical power of the test. Numerical Monte Carlo simulations show that this power is well estimated by the experimental power. We also find that this test performs well for the types of hypotheses considered. Parametric tests such as the Lagrange multiplier test, because they are built for specific parametric models, may lack robustness, and non-parametric tests of the linearity of autoregressive models are few. In order to prepare extensions to more general autoregressive models, we construct, on a compact subset of the reals, two non-parametric tests for diagonal bilinear models of order one that are stationary, geometrically alpha-mixing, and whose noise has a fixed, unknown and bounded density. The asymptotic distribution of the test statistics under the null hypothesis is studied using weak invariance principles. For each of these tests, using maximal inequalities, we exhibit a lower bound on the power that converges to 1. We show that under local alternatives, the probability of a Type II error can be very close to one. When the noise is Gaussian, simulations confirm these results, and show at the same time that, on the example of the diagonal bilinear model of order one, the Lagrange multiplier test is better than the two non-parametric tests.
Affiliations are detected from the signatures of publications identified in scanR. An author can therefore appear to be affiliated with several structures or supervisors according to these signatures. The dates displayed correspond only to the dates of the publications found. For more information, see https://scanr.enseignementsup-recherche.gouv.fr