I am a statistical machine learning researcher working mainly on Density Ratio Estimation. My vision is that, by comparing two density functions, many machine learning problems can be solved more elegantly and efficiently. Many important works follow this line, such as Generative Adversarial Networks.
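To make the density-comparison idea concrete, here is a minimal sketch of the classic probabilistic-classifier trick for density ratio estimation: a logistic classifier trained to separate samples from p and q yields the log ratio log p(x)/q(x) as its logit (at equal sample sizes). The Gaussian toy data, features, and training settings below are illustrative assumptions, not taken from any particular paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: samples from p = N(0, 1) and q = N(1, 1).
xp = rng.normal(0.0, 1.0, size=5000)
xq = rng.normal(1.0, 1.0, size=5000)

# Classifier trick: with equal sample sizes, p(x)/q(x) = P(label=p | x) / P(label=q | x),
# so the logit of a logistic classifier estimates log p(x)/q(x).
# Features [1, x] suffice here, because the true log ratio of two
# equal-variance Gaussians is linear in x.
X = np.concatenate([xp, xq])
y = np.concatenate([np.ones_like(xp), np.zeros_like(xq)])  # 1 = drawn from p
Phi = np.stack([np.ones_like(X), X], axis=1)

# Plain gradient descent on the logistic loss.
w = np.zeros(2)
for _ in range(2000):
    s = 1.0 / (1.0 + np.exp(-Phi @ w))
    w -= 0.1 * Phi.T @ (s - y) / len(y)

def log_ratio(x):
    """Estimated log p(x)/q(x): the classifier's logit at x."""
    return w[0] + w[1] * x

# For these Gaussians the true log ratio is 0.5 - x,
# so log_ratio(0.0) should be near 0.5 and log_ratio(1.0) near -0.5.
print(log_ratio(0.0), log_ratio(1.0))
```

The same recipe scales to richer feature maps or neural classifiers; the key point is that the ratio is estimated directly, without modeling either density on its own.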
If you are interested in doing a machine learning PhD at Bristol, please consider Compass, the EPSRC Centre for Doctoral Training in Computational Statistics and Data Science. You can also apply directly for a PhD position at the university. Some scholarships (such as CSC) are available.
(JMLR2022) Liu, S., Kanamori, T., Williams, D.J., Estimating Density Models with Truncation Boundaries, to appear, Journal of Machine Learning Research, 2022.
(AABI2022) Simons, J., Liu, S., Beaumont, M., Variational Likelihood-Free Gradient Descent, 4th Symposium on Advances in Approximate Bayesian Inference, 2022.
(AABI2022, Contributed Talk) Yi, M., Liu, S., Sliced Wasserstein Variational Inference, 4th Symposium on Advances in Approximate Bayesian Inference, 2022.
(CACAIE 2021) Zhang, Y., Macdonald, J., Liu, S., Harper, P., Damage Detection of Nonlinear Structures Using Probability Density Ratio Estimation, Computer-Aided Civil and Infrastructure Engineering, 2021.
(TKDE 2021) Wu, XZ., Xu, W., Liu, S., Zhou, ZH., Model Reuse with Reduced Kernel Mean Embedding Specification, IEEE Transactions on Knowledge and Data Engineering, 2021.
(JRSSB 2021) Kim, B., Liu, S., Kolar, M., Two-sample inference for high-dimensional Markov networks, to appear, Journal of the Royal Statistical Society Series B (JRSSB), 2021.
(AAAI 2021) Minami, S., Liu, S., Wu, S., Fukumizu, K., Yoshida, R., A General Class of Transfer Learning Regression without Implementation Cost, AAAI Conference on Artificial Intelligence, to appear, 2021.
(NEURIPS2019) Liu, S., Kanamori, T., Jitkrittum, W., Chen, Y., Fisher Efficient Inference of Intractable Models, Advances in Neural Information Processing Systems 32, 2019.
(ICML2019) Wu, X. Z., Liu, S., Zhou, Z.H. Heterogeneous Model Reuse via Optimizing Multiparty Multiclass Margin, Proceedings of the 36th International Conference on Machine Learning, PMLR 97, 2019.
(NC2018) Noh, Y-K., Sugiyama, M., Liu, S., du Plessis, M.C., Park, F.C., Lee, D.D., Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence, Neural Computation, Vol. 30(7), 2018.
(NEURIPS2017) Liu, S., Takeda, A., Suzuki, T., Fukumizu K., Trimmed Density Ratio Estimation, Advances in Neural Information Processing Systems 30, 2017
(AOS2017) Liu, S., Suzuki, T., Relator R., Sese J., Sugiyama, M., Fukumizu, K., Support consistency of direct sparse-change learning in Markov networks. Annals of Statistics, Volume 45, Number 3, 2017
(ICML2016) Liu, S., Suzuki, T., Sugiyama, M., Fukumizu, K., Structure Learning of Partitioned Markov Networks, Proceedings of the 33rd International Conference on Machine Learning, 2016.
(SDM 2016) Liu, S., Fukumizu K., Estimating Posterior Ratio for Classification: Transfer Learning from Probabilistic Perspective, Proceedings of 2016 SIAM International Conference on Data Mining, 2016.
03/2014, Doctor of Engineering, Tokyo Institute of Technology, Japan.
Thesis: Statistical Machine Learning Approaches on Change Detection.
Supervisor: Prof. Masashi Sugiyama
10/2010, Master of Science with Distinction, University of Bristol, UK.
06/2009, Bachelor of Engineering, Soochow University, China.