Computer Science Speakers Calendar


ConstraintFlow: A DSL for Specification and Verification of Neural Network Analyses & Preempting Flaky Tests via Non-Idempotent-Outcome Tests

Event Type
Seminar/Symposium
Sponsor
The Department of Computer Science, University of Illinois and FM/SE Seminar
Location
0216 Siebel Center and Zoom
Date
Mar 8, 2024, 2:00 pm
Speaker
Avaljot Singh, UIUC & Kaiyao Ke, UIUC
Contact
Isha Chaudhary
E-Mail
isha4@illinois.edu

Talk 1: ConstraintFlow: A DSL for Specification and Verification of Neural Network Analyses

Abstract: The uninterpretability of Deep Neural Networks (DNNs) hinders their deployment in safety-critical applications. Recent work has shown that Abstract-Interpretation-based formal certification techniques provide promising avenues for building some measure of trust in DNNs. The intricate mathematical background of Abstract Interpretation poses two main challenges for users of DNNs in developing these techniques: (i) designing algorithms that capture intricate DNN behavior while balancing the cost-precision tradeoff, and (ii) maintaining the over-approximation-based soundness of these certifiers. Addressing these challenges is essential to push the boundaries of DNN certification.
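To make the terminology above concrete, the following is a minimal Python sketch (an illustration for this announcement, not code from the talk) of the simplest abstract domain, intervals: an affine layer and a ReLU each get an abstract transformer that soundly over-approximates the set of concrete outputs.

# Interval abstract interpretation for one DNN layer: propagate
# lower/upper bounds through an affine transform and a ReLU so that
# every concrete output is contained in the abstract output box.
import numpy as np

def affine_interval(lo, hi, W, b):
    """Sound interval bounds for y = W @ x + b when lo <= x <= hi."""
    W_pos = np.maximum(W, 0.0)   # positive weights push bounds the same way
    W_neg = np.minimum(W, 0.0)   # negative weights swap lower/upper roles
    new_lo = W_pos @ lo + W_neg @ hi + b
    new_hi = W_pos @ hi + W_neg @ lo + b
    return new_lo, new_hi

def relu_interval(lo, hi):
    """Abstract transformer for ReLU: clamp both bounds at zero."""
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

# Certified output bounds for every input in the box [lo, hi].
W = np.array([[1.0, -2.0], [0.5, 1.0]])
b = np.array([0.1, -0.3])
lo, hi = np.array([-0.1, -0.1]), np.array([0.1, 0.1])
lo, hi = relu_interval(*affine_interval(lo, hi, W, b))
print("output bounds:", lo, hi)

Richer domains (e.g., DeepPoly-style symbolic bounds) trade more cost for more precision; the point of a DSL is to let the user write transformers like these declaratively and have their soundness checked automatically.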

General-purpose programming languages like C++ provide extensive functionality; however, verifying the soundness of algorithms written in them can be impractical. The most commonly used DNN certification libraries, such as auto_LiRPA and ERAN, prove the correctness of their analyses, but they provide only a few hard-coded abstract domains and abstract transformers (or transfer functions) and do not allow the user to define new analyses. Further, these libraries can handle only specific DNN architectures. To address these issues, we develop a declarative DSL, ConstraintFlow, that can be used to specify Abstract-Interpretation-based DNN certifiers. In ConstraintFlow, programmers can easily define various existing and new abstract domains and transformers, all within a few tens of lines of code, as opposed to the thousands of lines required by existing libraries. We also provide lightweight automatic verification that can be used to ensure the over-approximation-based soundness of certifier code written in ConstraintFlow for arbitrary (but bounded) DNN architectures. Using this automated verification procedure, for the first time, we can verify the soundness of state-of-the-art DNN certifiers for arbitrary DNN architectures, all within a few minutes. We prove the soundness of our verification procedure and the completeness of a subset of ConstraintFlow.
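The abstract does not show ConstraintFlow's syntax or its verifier, but the flavor of an automated over-approximation soundness check can be sketched with an off-the-shelf SMT solver (z3 below; the encoding is my own illustration, not the tool's): assert the negation of the soundness property for one transformer and ask for a counterexample.

# Soundness check for the ReLU interval transformer via SMT.
# `unsat` means no concrete input can escape the abstract output box.
# Requires: pip install z3-solver
from z3 import Real, Solver, If, Or, unsat

x, l, u = Real("x"), Real("l"), Real("u")

def relu(e):
    return If(e >= 0, e, 0)

s = Solver()
s.add(l <= x, x <= u)   # x is any concrete point in the input interval [l, u]
# Negated soundness: the concrete output escapes [relu(l), relu(u)].
s.add(Or(relu(x) < relu(l), relu(x) > relu(u)))

if s.check() == unsat:
    print("ReLU interval transformer is sound")
else:
    print("counterexample:", s.model())

Here the query is unsatisfiable because ReLU is monotone; an unsound transformer (say, one that forgot to clamp the lower bound) would instead yield a concrete counterexample.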


Talk 2: Preempting Flaky Tests via Non-Idempotent-Outcome Tests

Abstract: Regression testing can greatly help in software development, but it can be seriously undermined by flaky tests, which can both pass and fail, seemingly nondeterministically, on the same code commit. Flaky tests are an emerging topic in both research and industry. Prior work has identified multiple categories of flaky tests, developed techniques for detecting these flaky tests, and analyzed some detected flaky tests. To proactively detect, i.e., preempt, flaky tests, we propose to detect non-idempotent-outcome (NIO) tests, a novel category related to flaky tests. In particular, we run each test twice in the same test execution environment, e.g., run each Java test twice in the same Java Virtual Machine. A test is NIO if it passes in the first run but fails in the second. Each NIO test has side effects and "self-pollutes" the state shared among test runs. We perform experiments on both Java and Python open-source projects, detecting 223 NIO Java tests and 138 NIO Python tests. We have inspected all 361 detected tests and opened pull requests that fix 268 tests, with 192 already accepted, only 6 rejected, and the remaining 70 pending.
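The detection recipe is simple to illustrate. Below is a minimal Python sketch of the two-runs-in-one-environment idea (my illustration, not the authors' tool): a test is flagged NIO if it passes on the first run but fails on the second run within the same process.

import traceback

def outcome(test):
    """Run one test; return True on pass, False on any exception."""
    try:
        test()
        return True
    except Exception:
        traceback.print_exc()
        return False

def find_nio(tests):
    nio = []
    for test in tests:
        first = outcome(test)
        second = outcome(test)   # same process, so shared state persists
        if first and not second:
            nio.append(test.__name__)
    return nio

# Example: a test that self-pollutes state shared across runs.
cache = []

def test_cache_starts_empty():
    assert cache == []        # passes once; the append below persists,
    cache.append("polluted")  # so the second run fails

print(find_nio([test_cache_starts_empty]))  # ['test_cache_starts_empty']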
