Research

The following is a list of my preprints and works in progress.

Cademartori, C., Strength in Numbers: A Joint Posterior p-Value for Increasing the Frequentist Power of Bayesian Model Diagnostics.

(In progress) This project is motivated by the old observation that the posterior predictive p-value is conservative in the sense that the probability of observing a p-value less than or equal to α under frequentist replications of the data generating process is usually less than α. We argue that this problem increases in severity as the model dimension grows, putting pressure on previous arguments that the conservativity property is unproblematic in practice. We propose a joint p-value computed for multiple test statistics simultaneously, for which we develop a frequency bound, and argue that it can overcome the conservativity problem in many cases.
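To illustrate the conservativity phenomenon described above, here is a small simulation (an illustrative sketch, not taken from the paper): a conjugate normal model is checked with the sample-mean test statistic, and across frequentist replications the posterior predictive p-values pile up near 0.5, so the probability of seeing a p-value below 0.05 is far below the nominal 0.05. The model, statistic, and all parameter choices here are my own for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, S, R = 20, 500, 200  # sample size, posterior draws, frequentist replications

ppps = np.empty(R)
for r in range(R):
    y = rng.standard_normal(n)  # data drawn from the true model: N(0, 1)
    ybar = y.mean()
    # Conjugate posterior for theta under y_i ~ N(theta, 1), theta ~ N(0, 1)
    post_mean, post_sd = n * ybar / (n + 1), np.sqrt(1 / (n + 1))
    theta = rng.normal(post_mean, post_sd, size=S)
    # Replicated sample means drawn from the posterior predictive
    ybar_rep = rng.normal(theta, 1 / np.sqrt(n))
    # Posterior predictive p-value for the statistic T(y) = sample mean
    ppps[r] = np.mean(ybar_rep >= ybar)

frac_small = np.mean(ppps <= 0.05)
print(frac_small)  # far below the nominal level 0.05
```

Because the posterior predictive distribution recenters on the observed data, the p-values concentrate tightly around 0.5 instead of being uniform, which is exactly the conservativity that motivates the joint p-value above.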

Cademartori, C., Identifiability and Falsifiability: Two Challenges for Bayesian Model Expansion.

In this work, we use information-theoretic tools to investigate the properties of Bayesian models under a process of model expansion, whereby a simpler base model is extended to a larger, higher-dimensional model which embeds the base model as a special case. We find that this process tends to lead to weakening identification of model parameters and degrading power of model checks. We argue that these problems may be characteristic of model expansion in general, establishing bounds that indicate a tradeoff between them: the more an expansion avoids one problem, the more it is likely to increase the severity of the other. Finally, we consider the methodological consequences of these conclusions. In particular, we demonstrate in examples that methods capable of leveraging the dependence structure of the posterior distribution can partly overcome the challenges that model expansion poses for many classic inferential tools.

Cademartori, C., Rush, C., A Non-asymptotic Analysis of Generalized Approximate Message Passing Algorithms with Right Rotationally Invariant Designs.

Approximate message passing procedures are a class of algorithms derivable as Gaussian approximations to certain belief and expectation propagation algorithms for high-dimensional regression problems. Many of these algorithms have the remarkable property that their error at any iteration can be predicted to high accuracy by a computable recursion called the state evolution. This work studies a state evolution for the Generalized Vector Approximate Message Passing (GVAMP) algorithm, which extends the classic AMP algorithm to apply to generalized linear models and to design matrices that are potentially severely ill-conditioned. We show under general conditions that the average error of GVAMP at any iteration converges at exponentially fast rates to these state evolution predictions.