Assoc. Professor in Data Science and AI,
School of Mathematics,
University of Bristol.
Office GA.18
Fry Building,
Woodland Road, BS8 1UG.
Hobby Project:
Juzhen, a C++ library for fast numerical computation and neural network applications.
I am a statistical machine learning researcher, working mostly on Density Ratio Estimation. My vision is that, by comparing two density functions, many machine learning problems can be solved more elegantly and efficiently. You can read more about this vision in Machine Learning via Statistical Discrepancies.
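As an illustration of the density-ratio idea, a classical approach (the "classifier trick", widely used in the density ratio estimation literature, not a method from any specific paper listed here) estimates the ratio p(x)/q(x) by training a probabilistic classifier to distinguish samples from p and q: with balanced samples, the classifier's log-odds approximate log p(x)/q(x). The sketch below uses plain NumPy logistic regression on two Gaussians whose true log-ratio is 2x; all names and hyperparameters are my own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples from the two densities we want to compare:
# p = N(1, 1) (numerator), q = N(-1, 1) (denominator).
xp = rng.normal(1.0, 1.0, size=2000)
xq = rng.normal(-1.0, 1.0, size=2000)

# Classifier trick: label p-samples 1 and q-samples 0; with balanced
# classes, the fitted log-odds estimate log p(x)/q(x).
x = np.concatenate([xp, xq])
y = np.concatenate([np.ones_like(xp), np.zeros_like(xq)])
X = np.stack([np.ones_like(x), x], axis=1)  # bias + linear feature

w = np.zeros(2)
for _ in range(2000):                       # plain gradient ascent on the log-likelihood
    p1 = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.1 * X.T @ (y - p1) / len(y)

def log_ratio(t):
    """Estimated log p(t)/q(t), read off the classifier's log-odds."""
    return w[0] + w[1] * t

# For these two Gaussians the true log-ratio is 2x, so the estimate
# should be positive at x = 1 and negative at x = -1.
print(log_ratio(1.0), log_ratio(-1.0))
```

The same recipe extends to multivariate data by swapping in a richer classifier; the log-odds-to-log-ratio identity is what makes comparing two densities tractable without modelling either density on its own.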
I am happy to accept new PhD students. If you are interested in working on machine learning at Bristol, you may find several funding opportunities, such as Prob_AI and the CSC scholarship.
(BDU workshop at NeurIPS2024) Wang, Y.*, Khoo, S.*, Liu, S., Lightspeed Black-box Bayesian Optimization via Local Score Matching, NeurIPS 2024 Workshop on Bayesian Decision-making and Uncertainty, openreview.
(NeurIPS2024) Givens, J.*, Reeve, H., Liu, S., Reluga, K., Conditional Outcome Equivalence: A Quantile Alternative to CATE, NeurIPS 2024, arxiv.
(ICML2024) Liu, S., Yu, J., Simons, J.*, Yi, M.*, Beaumont, M., Minimizing $f$-Divergences by Interpolating Velocity Fields, ICML 2024, arxiv.
(OE2024) Zhang, Y.*, Pan, Z., Macdonald, J.H.G., Liu, S., Harper, P., Structural damage detection based on multivariate probability density functions of vibration data of offshore wind foundations with comparison studies, Ocean Engineering, link.
(AABI2023) Sharrock, L., Simons, J.*, Liu, S., Beaumont, M., Sequential Neural Score Estimation: Likelihood-Free Inference with Conditional Score Based Diffusion Models, AABI 2023, arxiv.
(ICML2023) Williams, D.J.*, Liu, S., Approximate Stein Classes for Truncated Density Estimation, ICML 2023, arxiv.
(ICML2023) Yi, M.*, Zhu, Z., Liu, S., MonoFlow: Rethinking Divergence GANs via the Perspective of Differential Equations, ICML 2023, arxiv.
(AISTATS2023) Givens, J.*, Reeve, H., Liu, S., Density Ratio Estimation and Neyman Pearson Classification with Missing Data, AISTATS 2023, arxiv.
(ACML2022) Yi, M.*, Liu, S., Sliced Wasserstein Variational Inference (Best Student Paper), ACML 2022, arxiv.
(NeurIPS2022) Liu, S., Estimating the Arc Length of the Optimal ROC Curve and Lower Bounding the Maximal AUC, NeurIPS 2022, arxiv.
(JMLR2022) Liu, S., Kanamori, T., Williams, D.J.*, Estimating Density Models with Truncation Boundaries using Score Matching, Journal of Machine Learning Research, 23(186):1-38, 2022, arxiv.
(AABI2022) Simons, J.*, Liu, S. Beaumont, M., Variational Likelihood-Free Gradient Descent, 4th Symposium on Advances in Approximate Bayesian Inference, 2022.
(AABI2022, Contributed Talk) Yi, M.*, Liu, S., Sliced Wasserstein Variational Inference, 4th Symposium on Advances in Approximate Bayesian Inference, 2022.
(CACAIE 2021) Zhang, Y.*, Macdonald, J., Liu, S., Harper, P., Damage Detection of Nonlinear Structures Using Probability Density Ratio Estimation, Computer-Aided Civil and Infrastructure Engineering, online version, 2021.
(TKDE 2021) Wu, X.Z., Xu, W., Liu, S., Zhou, Z.H., Model Reuse with Reduced Kernel Mean Embedding Specification, IEEE Transactions on Knowledge and Data Engineering, doi: 10.1109/TKDE.2021.3086619, arxiv, 2021.
(JRSSB 2021) Kim, B., Liu, S., Kolar, M., Two-sample inference for high-dimensional Markov networks, Journal of the Royal Statistical Society Series B (JRSSB), Volume 83, Issue 5, pages 939-962, arxiv, 2021.
(AAAI 2021) Minami, S., Liu, S., Wu, S., Fukumizu, K., Yoshida, R., A General Class of Transfer Learning Regression without Implementation Cost, AAAI Conference on Artificial Intelligence, 2021.
(NeurIPS2019) Liu, S., Kanamori, T., Jitkrittum, W., Chen, Y., Fisher Efficient Inference of Intractable Models, Advances in Neural Information Processing Systems 32, 2019.
(ICML2019) Wu, X. Z., Liu, S., Zhou, Z.H. Heterogeneous Model Reuse via Optimizing Multiparty Multiclass Margin, Proceedings of the 36th International Conference on Machine Learning, PMLR 97, 2019.
(NC2018) Noh, Y.-K., Sugiyama, M., Liu, S., du Plessis, M.C., Park, F.C., Lee, D.D., Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence, Neural Computation, Vol. 30(7), 2018.
(NeurIPS2017) Liu, S., Takeda, A., Suzuki, T., Fukumizu, K., Trimmed Density Ratio Estimation, Advances in Neural Information Processing Systems 30, 2017.
(AOS2017) Liu, S., Suzuki, T., Relator, R., Sese, J., Sugiyama, M., Fukumizu, K., Support consistency of direct sparse-change learning in Markov networks, Annals of Statistics, Volume 45, Number 3, 2017.
(ICML2016) Liu, S., Suzuki, T., Sugiyama, M., Fukumizu, K., Structure Learning of Partitioned Markov Networks, Proceedings of the 33rd International Conference on Machine Learning, 2016.
03/2014, Doctor of Engineering, Tokyo Institute of Technology, Japan.
Thesis: Statistical Machine Learning Approaches on Change Detection.
Supervisor: Prof. Masashi Sugiyama
10/2010, Master of Science with Distinction, University of Bristol, UK.
06/2009, Bachelor of Engineering, Soochow University, China.