Abstract:
As the scale of AI systems grows rapidly, errors in deep learning applications are becoming increasingly common. Most popular deep learning frameworks are built on Python, a dynamically typed language without static type checking, so many errors cannot be eliminated by type checking at compile time. This paper proposes a strongly typed, functional-style deep learning framework built on the theorem prover Coq. The framework provides typed tensor structures and powerful static type checking. Experimental results show that the framework automatically, quickly, and effectively detects shape mismatch errors in deep learning models, and that it outperforms existing checking tools in both speed and detection capability. In addition, this paper designs and implements a set of rewriting rules that translate functional neural network operator expressions into multi-core parallel OpenMP C code. Across multiple sets of experiments, the generated operator C code performs on par with manually written code, and the multi-core parallel version runs 4-10 times faster than the sequentially executed version. Moreover, the generated C operators are highly safe and effectively avoid common defects of manually written code, such as out-of-bounds indexing and memory allocation errors.
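For intuition, the following is a minimal Coq sketch of the kind of shape-indexed typing the abstract refers to; the names and definitions (tensor, dot, dense) are illustrative assumptions, not the framework's actual library. Because the shape is part of the type, a layer applied to an input of the wrong shape is rejected by Coq's type checker at compile time rather than failing at run time.

```coq
Require Import Coq.Vectors.Vector.
Require Import Coq.Lists.List.
Import ListNotations.

(* A shape is a list of dimension sizes; a tensor is nested
   length-indexed vectors whose lengths are fixed by the shape. *)
Fixpoint tensor (A : Type) (shape : list nat) : Type :=
  match shape with
  | [] => A
  | n :: rest => Vector.t (tensor A rest) n
  end.

(* Dot product of two vectors of the same statically known length. *)
Definition dot {n} (u v : Vector.t nat n) : nat :=
  Vector.fold_left Nat.add 0 (Vector.map2 Nat.mul u v).

(* A dense layer from 784 features to 10 outputs: the weight and input
   shapes are encoded in the types. *)
Definition dense (w : tensor nat [10; 784]) (x : tensor nat [784])
  : tensor nat [10] :=
  Vector.map (fun row => dot row x) w.

(* A shape mismatch does not type-check: the 256-element input cannot
   be unified with the expected 784-element input. *)
Fail Definition bad (w : tensor nat [10; 784]) (x : tensor nat [256])
  : tensor nat [10] := dense w x.
```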