Statistics and Operations Research Seminars 2011--2012,
Department of Economics and Business, Pompeu Fabra University
Thursday, September 22, 17:00, room 20.137.
Frank van der Meulen (Delft University of Technology)
Bayesian estimation of the drift of a diffusion process.
In this talk I will consider estimating the drift function of a continuously observed diffusion process. The Bayesian approach entails the specification of a prior distribution on the drift function. I will first discuss a result that gives sufficient conditions for obtaining convergence rates for the posterior distribution. These conditions depend both on the prior and on the size of the model. From a more practical perspective, I will introduce a reversible jump MCMC algorithm to obtain draws from the posterior distribution in the case of a series prior, using an expansion of the drift in a hierarchical basis. If time permits, I will discuss consistency of the prior and computational issues in the case of discrete-time observations.
This concerns joint work with Moritz Schauer (TU Delft), Aad van der Vaart (VU Amsterdam) and Harry van Zanten (TU Eindhoven).
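The conjugacy that makes the series-prior approach tractable can be illustrated with a toy version: under an Euler discretisation, the increments of a diffusion are conditionally Gaussian in the basis evaluations of the drift, so a fixed-dimension Gaussian prior on the series coefficients yields a closed-form posterior. This is only a sketch; the reversible jump sampler in the talk handles the varying-dimension case, and all tuning values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate dX = b(X) dt + dW by Euler-Maruyama, with true drift b(x) = -x.
T, n = 500.0, 50_000
dt = T / n
x = np.empty(n + 1)
x[0] = 0.0
for i in range(n):
    x[i + 1] = x[i] - x[i] * dt + np.sqrt(dt) * rng.standard_normal()

# Series prior: expand the drift in a polynomial basis, b(x) = sum_j theta_j x^j.
# Under the Euler discretisation the increments are Gaussian given the basis
# evaluations, so a Gaussian prior theta ~ N(0, s2 I) gives a Gaussian
# posterior (conjugate Bayesian linear regression); no MCMC is needed here.
J = 3
Phi = np.vander(x[:-1], J, increasing=True)   # columns: 1, x, x^2
y = np.diff(x)
s2 = 10.0                                     # prior variance (illustrative)
A = dt * Phi.T @ Phi + np.eye(J) / s2
post_mean = np.linalg.solve(A, Phi.T @ y)
print(post_mean)                              # slope entry should be near -1
```

With a hierarchical (e.g. Faber-Schauder) basis of varying dimension, as in the talk, the posterior over the dimension is no longer available in closed form, which is what motivates reversible jump MCMC.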
Tuesday, December 13, 12:00, room 20.287.
Javier Vicente (UPF)
High-Frequency Finance: A Complexity Science Approach.
High-Frequency Finance is a newly coined term related to the rise of modern electronic markets, where trades and quotes happen in very short time intervals: seconds, or even shorter. All the information about trades and quotes (broker, time stamp, price, size, etc.) is recorded in daily datasets, which offer a great opportunity to study a complex system with a huge amount of accurate data.
I will present two problems. The first is a classical problem in finance: the shape and evolution of the pdf of returns. Several attempts have been made to explain the empirical return distribution, but in general these explanations fail to reproduce its tails.
Here, based on Superstatistics, a theory developed for understanding critical phenomena in statistical mechanics, I have been able to explain the pdf and its temporal evolution.
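The superstatistics mechanism can be sketched in a few lines: returns that are locally Gaussian, but whose precision ("inverse temperature") fluctuates across time windows, have a heavy-tailed marginal distribution. A hypothetical illustration with a Gamma-distributed precision, which produces Student-t tails:

```python
import numpy as np

rng = np.random.default_rng(1)

n, dof = 500_000, 4
# Fluctuating precision beta with mean 1. Each return is locally Gaussian
# with precision beta; marginally the mixture is Student-t with `dof`
# degrees of freedom, whose power-law tails are far heavier than those
# of any single Gaussian.
beta = rng.gamma(shape=dof / 2, scale=2 / dof, size=n)
returns = rng.standard_normal(n) / np.sqrt(beta)
gauss = rng.standard_normal(n)

# Fraction of observations with magnitude above 4, mixture vs. Gaussian
print((np.abs(returns) > 4).mean(), (np.abs(gauss) > 4).mean())
```

The Gamma mixing distribution is one illustrative choice; other mixing laws give other heavy-tailed marginals.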
The second problem concerns the optimal execution of large orders, which must be split into smaller ones to avoid the huge market price impact they would cause if executed in a single transaction. In this case, I am interested in the trades of every single market participant rather than in their aggregate. I study two different markets, the LSE and the SSE. In both cases the empirical results are compatible with a simple economic law related to optimal execution, independently of the specific features of each market.
Thursday, February 16, 16:00, room 23.S03.
Kasia Wolny (Warwick)
Local times of Brownian motion: theory and algorithms
Thursday, March 15, 16:00, room 40047A.
Nektarios Aslanidis (Universitat Rovira i Virgili, CREIP)
Modelling asset correlations: a time-varying nonparametric approach
Joint work with Isabel Casas (Syddansk Universitet, CREATES).
This article proposes time-varying nonparametric and semiparametric estimators
of the conditional cross-correlation matrix in the context of
Simulation results show that the nonparametric and semiparametric models are
best in DGPs with gradual changes or structural breaks in correlations. However,
when correlations are constant or change quickly, the parametric DCC model delivers
the best outcome. The methodologies are illustrated by evaluating two
portfolios. The first portfolio consists of the equity sector SPDRs
and the S&P 500,
while the second one contains major currencies. Results show that the nonparametric
model generally dominates the others in-sample. However, the
semiparametric model is best for out-of-sample analysis.
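A minimal sketch of the time-varying nonparametric idea: smooth the cross-moments with a kernel in rescaled time, and form the correlation from the smoothed moments. This is an illustration (a Nadaraya-Watson smoother with an arbitrary bandwidth), not the authors' exact estimator.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two series whose correlation varies slowly over rescaled time t in [0, 1]
n = 2000
t = np.arange(n) / n
rho = 0.8 * np.sin(np.pi * t)                  # true time-varying correlation
z1 = rng.standard_normal(n)
z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)

def kernel_corr(t0, h=0.1):
    """Kernel-weighted correlation estimate at rescaled time t0."""
    w = np.exp(-0.5 * ((t - t0) / h) ** 2)     # Gaussian kernel weights
    w /= w.sum()
    m1, m2 = w @ z1, w @ z2
    c12 = w @ (z1 * z2) - m1 * m2
    v1 = w @ z1**2 - m1**2
    v2 = w @ z2**2 - m2**2
    return c12 / np.sqrt(v1 * v2)

print(kernel_corr(0.5))    # true correlation at t = 0.5 is 0.8
```

A smaller bandwidth h tracks fast changes better at the cost of more variance, which mirrors the simulation finding that the parametric DCC model wins when correlations change quickly.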
Thursday, April 19, 16:00, room 40.S16.
Dennis Kristensen (UCL)
Implementation and Estimation of Discrete Markov Decision Models by
Sieve Approximations (with P. Jia Barwick and B. Scherning)
We combine simulations and sieve approximations to obtain a fast,
iterative least-squares method for computing the value function in a
general class of Markov decision models. The class of models allows
for continuous state variables without having to use discretization.
Furthermore, the method can handle models that are non-separable in
unobserved state variables that themselves have dynamics. The proposed method
yields a computationally efficient, yet numerically precise
implementation of the model and associated estimators (GMM or MLE). We
analyze the theoretical properties of the approximate value function
and MLE, and investigate their performance in practice through
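The sieve-plus-least-squares idea can be sketched on a toy problem with a known solution: the cake-eating model, whose value function lies exactly in the span of {1, log x}. The basis and model below are illustrative choices, not the paper's method.

```python
import numpy as np

# Sieve value-function iteration for the deterministic cake-eating problem
# V(x) = max_c log(c) + beta * V(x - c), whose exact solution has the form
# A + (1/(1-beta)) * log(x). Each iteration computes Bellman targets at
# sample states and projects them back onto the basis by least squares.
beta = 0.9
xs = np.linspace(0.1, 1.0, 50)                  # sample of continuous states
basis = np.column_stack([np.ones_like(xs), np.log(xs)])
theta = np.zeros(2)                             # sieve coefficients

cgrid = np.linspace(0.01, 0.99, 99)             # consume a share of the cake
for _ in range(500):
    c = np.outer(xs, cgrid)                     # candidate consumption levels
    xnext = xs[:, None] - c
    vnext = theta[0] + theta[1] * np.log(xnext)
    targets = (np.log(c) + beta * vnext).max(axis=1)
    theta, *_ = np.linalg.lstsq(basis, targets, rcond=None)

print(theta[1])    # approaches 1/(1-beta) = 10
```

Because the true value function lies in the span of the basis, the projection step is exact here; in general the sieve dimension must grow to control the approximation error, which is where the paper's theory comes in.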
Wednesday, May 2, 16:00, room 20.233.
Matteo Barigozzi (LSE)
Which model to match?
The asymptotic efficiency of the indirect estimation methods, such as
the efficient method of moments and indirect inference, depends on the
choice of the auxiliary model. To date, this choice has been somewhat ad
hoc, based on an educated guess by the researcher. In this article
we develop three information criteria that help the user to optimize
the choice among nested and non–nested auxiliary models. They are the
indirect analogues of the widely used Akaike, Bayesian and
Hannan–Quinn criteria. A thorough Monte Carlo study based on two
simple and illustrative models shows the usefulness of the criteria.
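The indirect-inference setup itself can be illustrated with a textbook example: estimating an MA(1) coefficient by matching an AR(1) auxiliary coefficient computed on the data and on model simulations. The criteria proposed in the talk are not reproduced here; this only shows the role played by the auxiliary model.

```python
import numpy as np

def ma1(theta, n, seed):
    """Simulate an MA(1) path x_t = e_t + theta * e_{t-1}."""
    e = np.random.default_rng(seed).standard_normal(n + 1)
    return e[1:] + theta * e[:-1]

def ar1_coef(x):
    """Auxiliary statistic: least-squares AR(1) coefficient."""
    return (x[1:] @ x[:-1]) / (x[:-1] @ x[:-1])

n, true_theta = 50_000, 0.5
target = ar1_coef(ma1(true_theta, n, seed=0))   # auxiliary fit on "data"

# Grid-search indirect inference: pick theta whose simulated auxiliary
# coefficient is closest to the one estimated on the data.
grid = np.linspace(0.0, 0.9, 91)
sims = [ar1_coef(ma1(th, n, seed=1)) for th in grid]
est = grid[np.argmin([(s - target) ** 2 for s in sims])]
print(est)    # close to 0.5
```

The AR(1) auxiliary model identifies theta here because the binding function theta -> theta/(1+theta^2) is monotone on [0, 1); a poorly chosen auxiliary model would blur or lose this mapping, which is exactly the choice the proposed criteria are meant to guide.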
Thursday, May 3, 12:00, room 20.137.
Patrick Groenen (Econometric Institute, Erasmus University Rotterdam, The Netherlands)
Identifying Response Styles: A Latent-Class Bilinear Multinomial Logit Model
Respondents can vary strongly in the way they use rating scales. Specifically, respondents can exhibit a variety of response styles, which threatens the validity of their responses. The purpose of this paper is to investigate how response style and the content of the items affect rating scale responses. A novel model is developed that accounts for different types of response styles, the content of items, and background characteristics of respondents. By imposing a bilinear parameter structure on a multinomial logit model, we graphically distinguish the effects on response behavior of the characteristics of a respondent and the content of an item. We combine this approach with finite mixture modeling, yielding two segmentations of the respondents: one for response style and one for item content. We apply this latent-class bilinear multinomial logit model to the well-known List of Values in a cross-national context. The results show large differences in the opinions and the response styles of respondents and reveal previously unknown response styles. Some response styles appear to be valid communication styles, whereas other response styles often concur with inconsistent opinions on the items and seem to reflect response bias.
Van Rosmalen, J.M., Van Herk, H. & Groenen, P.J.F. (2010). Identifying response styles: a latent-class bilinear multinomial logit model. Journal of Marketing Research, 47, 157-172.
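The bilinear structure can be sketched as follows, with hypothetical dimensions and parameters (the actual model also includes the latent classes and response-style parameters, which are omitted here): the utility of a rating category for a given respondent and item is a low-rank bilinear form in respondent and category coordinates, pushed through a multinomial logit.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical sizes: 4 respondents, 3 items, 5 rating categories,
# coordinates of dimension 2 (all illustrative assumptions).
n_resp, n_items, n_cats, d = 4, 3, 5, 2
resp = rng.standard_normal((n_resp, d))     # respondent coordinates
cat = rng.standard_normal((n_cats, d))      # rating-category coordinates
item = rng.standard_normal(n_items)         # item-content intercepts

# utility[i, j, c] = resp_i . cat_c + item_j  (bilinear term + content effect)
util = (resp @ cat.T)[:, None, :] + item[None, :, None]
prob = np.exp(util)
prob /= prob.sum(axis=2, keepdims=True)     # multinomial logit probabilities
print(prob.shape)                           # (4, 3, 5); each slice sums to 1
```

The low-rank structure is what allows respondents and categories to be plotted in a common low-dimensional map, as described in the abstract.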
Wednesday, May 16, 16:00, room 20.237.
Dobrislav Dobrev (Fed Board)
Robust Forecasting with Many Predictors
(joint with Ernst Schaumburg)
The prediction of multivariate outcomes in a linear regression
setting with a large number of potential regressors is a common challenge
in macroeconomic and financial forecasting models. We exploit the fact that the
frequently encountered problem of nearly collinear regressors can be
addressed using standard shrinkage-type estimation. Moreover, independently
of near collinearity issues, when the outcomes are correlated random
variables, univariate forecasting is often sub-optimal and can be improved
upon by shrinkage based on a canonical correlation analysis. In this paper,
we consider a family of models for multivariate prediction that employ both
types of shrinkage. The approach is designed to jointly forecast a vector
of variables of interest based on a common (potentially very large) set of
predictors. We illustrate its performance in applications to several
standard forecasting problems in macroeconomics and finance in relation to
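The benefit of shrinkage under near-collinearity can be sketched with plain ridge regression (the paper's estimators additionally involve canonical-correlation shrinkage across correlated outcomes, which is not reproduced here; the penalty value is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(3)

# Two nearly collinear regressors (x2 is x1 plus tiny noise) and a
# response that loads equally on both.
n, p = 200, 2
x1 = rng.standard_normal(n)
x2 = x1 + 0.01 * rng.standard_normal(n)
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.standard_normal(n)

ols = np.linalg.solve(X.T @ X, X.T @ y)
lam = 5.0                                   # ridge penalty (illustrative)
ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# OLS can split the common signal wildly between the two columns, since the
# difference direction is barely identified; ridge heavily shrinks that
# ill-determined direction and keeps both coefficients close to 1.
print(ols, ridge)
```

The well-identified direction (the sum of the two coefficients) is almost untouched by the penalty, which is why shrinkage improves forecasts without distorting the signal.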
Thursday, May 24, 12:00, room 23.S03.
David Veredas (ECARES and Bank of Spain)
Ranking Systemically Important Institutions (with Mardi Dungey,
Based on the definition of systemic risk given by Jean-Claude Trichet
at Clare College in Cambridge (Dec. 2009), we propose a simple
methodology for ranking systemically important institutions. We view
firms' risks as a network whose vertices are the volatility shocks
and whose edges are their correlations. We use dynamic centrality measures to
rank the firms in terms of risk connectedness and firm
characteristics. An application to all firms in the S&P 500 reveals that i)
the Fed's unconventional monetary policies did have a significant effect
in reducing the systemic risk of most financial firms after the
collapse of Lehman Brothers, ii) the connections between the real
economy and the financial sector are fundamental, iii) firms from the
real economy can be more systemic than financials, suggesting that the
next systemic crisis may come from the non-financial sector, iv)
medium and small-size firms can be as systemic as the largest
corporations, indicating that the too-big-to-fail dogma may
actually be unrealistic.
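A static sketch of the network-centrality idea: build a correlation network of volatility proxies and rank firms by eigenvector centrality. The talk uses dynamic centrality measures; the one-factor data-generating process below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(4)

# Five firms whose volatility shocks load on one common factor with
# decreasing strength, so firm 0 is the most connected.
n_firms, T = 5, 5000
loadings = np.array([0.9, 0.6, 0.4, 0.2, 0.1])
common = rng.standard_normal(T)
shocks = (loadings * common[:, None]
          + np.sqrt(1 - loadings**2) * rng.standard_normal((T, n_firms)))

C = np.abs(np.corrcoef(shocks, rowvar=False))   # edges = |correlations|
np.fill_diagonal(C, 0.0)                        # no self-loops
vals, vecs = np.linalg.eigh(C)                  # eigh: ascending eigenvalues
centrality = np.abs(vecs[:, -1])                # leading eigenvector
print(np.argsort(-centrality))                  # firm 0 should rank at the top
```

Making the correlations, and hence the centralities, time-varying is what allows the methodology to track how systemic importance shifted around the collapse of Lehman Brothers.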