    Chen Yarui, Yang Jucheng, Shi Yancui, Wang Yuan, Zhao Tingting. Survey of Variational Inferences in Probabilistic Generative Models[J]. Journal of Computer Research and Development, 2022, 59(3): 617-632. DOI: 10.7544/issn1000-1239.20200637


    Survey of Variational Inferences in Probabilistic Generative Models


      Abstract: Probabilistic generative models are important methods for knowledge representation. Exact probabilistic inference is generally intractable in these models, so approximate inference methods are required. Variational inference is an important family of deterministic approximate inference methods with rapid convergence and a solid theoretical foundation, and it has become a research hotspot for probabilistic generative models, especially with the arrival of the big data era. In this paper, we first present a general variational inference framework for probabilistic generative models and analyze the parameter learning process of these models based on variational inference. Then, for the conditionally conjugate exponential family, we give the variational inference framework with analytic (closed-form) optimization updates, and introduce stochastic variational inference, which scales this framework to big data through a stochastic optimization strategy. Furthermore, for general probabilistic generative models, we present the black-box variational inference framework, which trains the parameters of the variational distribution with stochastic gradient estimates, and describe how various variational inference algorithms are implemented under this framework. Finally, we summarize structured variational inference methods, which improve inference accuracy by enriching the variational distribution in different ways. In addition, we discuss future development trends of variational inference for probabilistic generative models.
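
      As a brief, informal sketch of the objective shared by the frameworks surveyed above (notation chosen here for illustration, not necessarily that of the paper): with observations x, latent variables z, and a variational distribution q_λ(z) with parameters λ, variational inference maximizes the evidence lower bound (ELBO), and the black-box variants estimate its gradient with the score-function identity:

      \log p(x) = \mathcal{L}(\lambda) + \mathrm{KL}\big(q_\lambda(z) \,\|\, p(z \mid x)\big), \qquad \mathcal{L}(\lambda) = \mathbb{E}_{q_\lambda(z)}\big[\log p(x, z) - \log q_\lambda(z)\big],

      \nabla_\lambda \mathcal{L}(\lambda) = \mathbb{E}_{q_\lambda(z)}\big[\nabla_\lambda \log q_\lambda(z)\,\big(\log p(x, z) - \log q_\lambda(z)\big)\big].

      Since \log p(x) does not depend on λ, maximizing the ELBO minimizes the KL divergence from q_λ(z) to the true posterior; replacing the expectations with Monte Carlo samples from q_λ (and subsampling the data) gives the kind of stochastic-gradient schemes on which the scalable and black-box frameworks are built.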

       
