Abstract: Static analysis is fundamental to program debugging and security auditing. Traditional techniques, such as data-flow analysis and symbolic execution, automatically detect software bugs and thereby greatly enhance software reliability. However, their reliance on compilation and their limited customizability often impede adoption in real-world applications. In this talk, I will introduce a new paradigm of static analysis, termed the neuro-symbolic approach, and present our recent works, LLMSAN, LLMDFA, and RepoAudit, which enable customizable, compilation-free analysis. Unlike conventional analyzers, they allow users to customize analyses via prompts and leverage large language models (LLMs) to interpret program semantics without requiring compilation. To mitigate LLM hallucinations, they incorporate a series of parsing-based validators and an SMT solver to validate data-flow paths, ensuring high-quality bug reports. Our techniques have identified over 100 previously unknown memory corruption vulnerabilities in real-world software systems, including projects in Uber's production line.
Bio: Chengpeng Wang is a Postdoctoral Research Associate in the Computer Science Department at Purdue University, working with Professor Xiangyu Zhang. He obtained his Ph.D. from the Hong Kong University of Science and Technology in 2023 under the supervision of Professor Charles Zhang. His research focuses on program analysis, especially static analysis, for improving software reliability and performance. He is also interested in the intersection of machine learning and symbolic analysis techniques. His contributions to the field have been recognized through publications in premier conferences and journals on programming languages, software engineering, machine learning, and systems. He has received a SIGPLAN Distinguished Paper Award (2022) and an ASPLOS Best Paper Award (2024). He received his BEng and MPhil degrees from Tsinghua University in 2016 and 2019, respectively.