Dr Dalia Chakrabarty

Lecturer in Statistics

My D.Phil (from St. Cross College, Oxford) was in Theoretical Astrophysics; I was examined by Prof. James Binney. My doctoral thesis was dedicated to the development of a novel Bayesian learning method, which I used to learn the gravitational mass of the black hole at the centre of the Milky Way (along with the Galactic phase-space density), and to the computational modelling of non-linear dynamical phenomena in galaxies. Thereafter I continued to develop probabilistic learning methods, and undertake Bayesian inference, in astronomical contexts until 2009, when I moved to Warwick Statistics and began developing Bayesian methodologies for application to diverse areas. After Warwick, I was a Lecturer in Statistics in Leicester Maths for about four years, and moved to Loughborough on 1 March 2017.

My current interest is strongly focused on the development of learning methodologies for challenging data situations, such as data shaped as a hyper-cuboid; data with diversely correlated components; absent training data; and data that is discontinuously distributed and/or changing with time. I am equally keen on learning graphical models and networks of multivariate datasets, as random geometric graphs, with the ultimate aim of computing the distance between a pair of learnt graphs. I am also interested in the development of Bayesian tests of hypotheses that are useful when computation within the alternative model is difficult or impossible, and have recently initiated a method of optimising the mis-specified parameters of a parametric model while learning the desired model parameters. My current applications include areas such as healthcare, vino-chemistry, astronomy, test theory, materials science, etc.

  • Bayesian supervised learning of high-dimensional functions, given high-dimensional, discontinuously distributed data, using a dual-layered, kernel-parametrisation-based learning strategy -- compounding a tensor-variate GP with scalar-variate GPs.
  • Random geometric graphical models and networks of multivariate datasets, and the distance between a learnt pair of graphical models.
  • Novel Bayesian prediction of variable values, given test data on the dependent variable, when training data is absent and the distribution of neither variable is known.
  • Tests of hypotheses given intractable alternatives.
  • Novel optimisation of mis-specification parameters in parametric models (while learning the sought model parameters).
  • Supervised learning using hierarchical regression & supervised classification. 
  • Applications to Astrophysics, Materials Science, Chemistry, Healthcare, Petrophysics, Testing Theory, etc.
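As a loose, generic point of reference for the Gaussian-Process-based items above: below is a minimal sketch of ordinary scalar-variate GP regression (posterior mean under a squared-exponential kernel) in numpy. It is a toy illustration only -- not the compound tensor-variate/scalar-variate construction referred to here -- and all function names and parameter values are illustrative.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=0.2, variance=1.0):
    """Squared-exponential (RBF) kernel between 1-D input arrays a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / length_scale ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-6):
    """Posterior mean of a zero-mean GP at x_test, given noisy observations."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = rbf_kernel(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

x = np.linspace(0, 1, 10)
y = np.sin(2 * np.pi * x)          # toy training responses
mean = gp_posterior_mean(x, y, x)  # predictions at the training inputs
```

The small `noise` term on the diagonal is the usual jitter that keeps the Gram matrix invertible; at the training inputs the posterior mean then reproduces the observations almost exactly.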

My research focuses on the development of methodologies within Computational & Mathematical Statistics, including Bayesian learning methods, given different data situations: high-dimensional, temporally evolving, and/or discontinuously distributed data; absent training data; data that is large in size, or under-abundant. This has resulted in

--supervised learning methodologies given hypercuboidally-shaped, discontinuous and/or non-stationary data, using compounding of Gaussian Processes;
--pursuit of graphical models & networks of multivariate data, as random graphs, followed by computing the distance between learnt graphical models, to inform on the inter-data correlation;
--a novel method that allows for variable prediction at test data (when learning the functional relation between the variables is not possible), by embedding the sought variable vector within the support of the state-space density;
--new tests of hypotheses for the cases in which such prediction without learning encounters intractability: we seek the probability of a simplifying model, conditioned on the data, where the simplification is undertaken to counter the intractability;
--a recently developed 5-step method that optimises the mis-specification parameter vector in a parametric model, while learning the sought model parameters in a Bayesian framework.

In addition, I have worked on developing a novel classification technique in lieu of training data, and, on another occasion, trained the model for the causal relationship between the observable and covariates using hierarchical regression.
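The inter-graph-distance idea mentioned above can be caricatured with a simple stand-in: learn a binary graph from each dataset by thresholding its sample correlation matrix, then count the edges on which the two graphs disagree (a Hamming distance). This is emphatically not the random-geometric-graph methodology with learnt uncertainties described here -- just a naive sketch, with all names, thresholds and synthetic data being illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def learn_graph(data, threshold=0.5):
    """Threshold the sample correlation matrix into a binary adjacency matrix."""
    corr = np.corrcoef(data, rowvar=False)
    adj = (np.abs(corr) > threshold).astype(int)
    np.fill_diagonal(adj, 0)  # no self-loops
    return adj

def hamming_distance(adj_a, adj_b):
    """Number of edges present in one graph but not the other."""
    return int(np.sum(np.triu(adj_a != adj_b, k=1)))

# two synthetic multivariate datasets over the same 4 variables
x = rng.normal(size=(200, 4))
x[:, 1] = x[:, 0] + 0.1 * rng.normal(size=200)  # strong edge 0-1 in dataset A
y = rng.normal(size=(200, 4))                   # no strong edges in dataset B

d = hamming_distance(learn_graph(x), learn_graph(y))  # graphs differ in 1 edge
```

A larger distance between the two learnt graphs indicates a larger disparity between the correlation structures of the two datasets, which is the quantity of interest above.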

Applications of these methods are in Astronomy, Materials Science, Chemistry, Petrophysics, Testing Theory, etc.

If you are a prospective Ph.D student or postdoctoral fellow with an interest in mathematical/computational statistics, please feel free to contact me; I have multiple projects in

  • Bayesian methodology development: high-dimensional supervised learning; learning in the absence of training data; Bayesian tests; random graphical models and networks;
  • applications: Bayesian learning approaches in Materials Science; unsupervised learning in Astronomy; graphical models of data from assorted disciplines; prediction without learning in Petrophysics; and computationally intensive projects.


My ongoing activities include:

- publication of developed methods and undertaken applications;

- attending conferences: my most recent talks were the keynote address at ICSTA 2019, and a colloquium at SAMSI (November 2018);

- applying for grants: recent success with an application for the UK IC Fellowship for a prospective applicant; Royal Society Dorothy Hodgkin Fellowship holder;

- Ph.D student supervision--currently working with two Ph.D students, Georgios Stagakis and Cedric Spire (and co-supervising another student); my first Ph.D student, Kangrui Wang, finished in 2018.

--Kangrui worked on high-dimensional Bayesian supervised learning methodologies and graphical models of large/small multivariate data using tensor-variate Gaussian Processes; he is interested in exploring kernel parametrisation. Kangrui applied his methods to astronomical data, to the learning of vino-chemical networks of wine samples, to learning the disease-symptom network in humans, and to computing the reliability of large tests. He was examined in the summer of 2018, and took up a postdoctoral position at the Alan Turing Institute thereafter.

-- Cedric is pursuing a novel Bayesian unsupervised learning methodology that works by embedding the sought model parameter into the support of the likelihood. He is applying it to learn the probability density of the phase spaces of distant galaxies, and the total gravitational mass density in these systems. He will undertake his Ph.D viva in September 2019, and has attained the UK Intelligence Community Fellowship (funded by the RAEng for 2 years), which he will bring to Loughborough, where he will be supervised by me.

-- Georgios Stagakis joined me as a Ph.D student in 2018.

In addition, I have supervised multiple Masters dissertations in the past; I regularly work with final-year project students in the Department, and offer Small Group Tutorials to my 1st Year personal tutees enrolled on the Mathematics BSc degree.

Teaching -

--MAB270: Statistical Modelling.
--MAB280: Introduction to Stochastic Analysis (second half of module); 
--assessment of presentations of various project modules in Statistics and Mathematics.

Earlier I taught Applied Statistics at the UG and PG levels, and contributed to the support teaching of various modules.

Pastoral -

--Induction Coordinator in Mathematics.
--tutor to a small group of 1st and 2nd year tutees, and 2 placement students.

Earlier, in Leicester Maths, I was the first Director of the MSc in Data Analysis for Business Intelligence, and was the overall head of the MSc courses in the Department.


Recently published & communicated:


  • Cedric Spire & Dalia Chakrabarty, "Learning in the Absence of Training Data -- a Galactic Application", in Bayesian Statistics and New Generations, R. Argiento et al. (eds.), Springer Proceedings in Mathematics & Statistics 296, Springer Nature Switzerland AG, 2019.
  • Dalia Chakrabarty & Kangrui Wang, "2 Layers Suffice: Bayesian Supervised Learning given Hypercuboidally-shaped, Discontinuous Data, using Compound Tensor-Variate & Scalar-Variate Gaussian Processes", preliminary version in arXiv:1803.04582.
  • Kangrui Wang & Dalia Chakrabarty, "Correlation between Multivariate Datasets, from Inter-Graph Distance computed using Graphical Models Learnt With Uncertainties", preliminary version in arXiv:1710.11292.
  • Satyendra Chakrabartty, Kangrui Wang & Dalia Chakrabarty, "New Combinatorial & Bayesian Uncertainty Estimation of Real Tests and Surveys".
  • Dalia Chakrabarty, "A New Bayesian Test to test for the Intractability-Countering Hypothesis", 2017, Journal of the American Statistical Association, 112, pg. 561--577.
  • Dalia Chakrabarty et al., "Bayesian Density Estimation via Multiple Sequential Inversions of 2-D Images with Application in Electron Microscopy", 2015, Technometrics, 57, 2, pg. 217--233.
  • Dalia Chakrabarty, M. Biswas & Sourabh Bhattacharya, "Bayesian Nonparametric Estimation of Milky Way Model Parameters Using a New Matrix-Variate Gaussian Process Based Method", 2015, Electronic Journal of Statistics, 9, 1, pg. 1378--1403.
  • S. Banerjee, A. Basu, S. Bhattacharya, S. Bose, Dalia Chakrabarty & S.S. Mukherjee, "Minimum Distance Estimation of Milky Way Model Parameters and Related Inference", 2015, SIAM/ASA Journal of Uncertainty Quantification, 3, 1, pg. 91--115.
  • Dalia Chakrabarty & S. Paul, "Bayesian Learning of Material Density Function by Multiple Sequential Inversions of 2-D Images in Electron Microscopy", 2015, Springer Proceedings in Mathematics and Statistics, 118, pg. 35--48.

Upcoming publications:

  • Cedric Spire & Dalia Chakrabarty, ``New Method for Prediction, Notwithstanding Impossibility of Learning, given Absent Training Data -- an Astrophysical Application''.
  • Dalia Chakrabarty, "A New Method for Optimising the Mis-specified Vector of Parameters in a Parametric Model, while Learning Model Parameters".

Awarded the "Hind Rattan" (Jewel of India) award for 2020; these awards, for the Indian diasporic community, are given by the NRI Welfare Society of India.