Chen Yarui, Yang Jucheng, Shi Yancui, Wang Yuan, Zhao Tingting. Survey of Variational Inferences in Probabilistic Generative Models[J]. Journal of Computer Research and Development, 2022, 59(3): 617-632. DOI: 10.7544/issn1000-1239.20200637
(College of Artificial Intelligence, Tianjin University of Science & Technology, Tianjin 300457)
Funds: This work was supported by the National Natural Science Foundation of China (61976156, 61702367), the Tianjin Natural Science Foundation (18JCQNJC69800), and the Youth Scholars Foundation of Tianjin University of Science and Technology (2017LG10).
Probabilistic generative models are important methods for knowledge representation. Exact probabilistic inference is intractable in these models, so various approximate inference methods are required. Variational inference is an important class of deterministic approximate inference methods with rapid convergence and solid theoretical foundations, and it has become a research hotspot in probabilistic generative models, especially with the development of big data. In this paper, we first present a general variational inference framework for probabilistic generative models and analyze the parameter learning process of these models based on variational inference. Then, we give the framework of analytic optimization of variational inference for the conditionally conjugate exponential family, and introduce stochastic variational inference based on this framework, which scales to big data through stochastic optimization. Furthermore, we provide the framework of black-box variational inference for general probabilistic generative models, which trains the parameters of the variational distributions with stochastic gradients, and we analyze how different variational inference algorithms are realized under this framework. Finally, we summarize structured variational inference methods, which improve inference accuracy by enriching the variational distributions with different strategies. In addition, we discuss further development trends of variational inference for probabilistic generative models.
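As a rough illustration of the black-box approach the abstract describes (optimizing the variational parameters with stochastic, Monte Carlo gradients of the evidence lower bound), the sketch below fits a Gaussian variational distribution to a toy one-dimensional unnormalized target using the score-function gradient estimator. The target density, variational family, learning rate, and sample size are all illustrative assumptions, not details from the surveyed paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unnormalized target: log p(z) of a N(2, 0.5^2) density
# (the normalizing constant is deliberately dropped, as in black-box VI).
def log_p(z):
    return -0.5 * ((z - 2.0) / 0.5) ** 2

# Variational family q(z; mu, log_sigma) = N(mu, sigma^2)
def log_q(z, mu, log_sigma):
    sigma = np.exp(log_sigma)
    return -0.5 * ((z - mu) / sigma) ** 2 - log_sigma - 0.5 * np.log(2 * np.pi)

def score(z, mu, log_sigma):
    # Score function: gradient of log q(z; mu, log_sigma) w.r.t. (mu, log_sigma)
    sigma = np.exp(log_sigma)
    d_mu = (z - mu) / sigma ** 2
    d_ls = ((z - mu) / sigma) ** 2 - 1.0
    return d_mu, d_ls

mu, log_sigma = 0.0, 0.0   # initial variational parameters
lr, n_samples = 0.05, 64   # illustrative step size and Monte Carlo sample size
for step in range(2000):
    z = mu + np.exp(log_sigma) * rng.standard_normal(n_samples)  # z ~ q
    w = log_p(z) - log_q(z, mu, log_sigma)   # ELBO integrand at each sample
    d_mu, d_ls = score(z, mu, log_sigma)
    # Score-function (REINFORCE) Monte Carlo estimate of the ELBO gradient:
    # grad ELBO = E_q[ grad log q(z) * (log p(z) - log q(z)) ]
    mu += lr * np.mean(d_mu * w)
    log_sigma += lr * np.mean(d_ls * w)

# The fitted q should approach the target's mean (2.0) and std (0.5).
print(mu, np.exp(log_sigma))
```

The estimator only needs samples from q and pointwise evaluations of log p and log q, which is what makes the method "black box": no model-specific conjugacy analysis is required, at the cost of higher gradient variance than the analytic coordinate updates available in the conditionally conjugate case.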