Industrial & Enterprise Systems Engineering Calendar

ISE Graduate Seminar Series

Event Type
Seminar/Symposium
Sponsor
ISE Graduate Programs Office
Location
2240 Digital Computer Lab
Date
Apr 5, 2024   10:00 - 10:50 am  
Originating Calendar
ISE Seminar Calendar

Cross-modal Synthesis for Generative Design

Zhenghui Sha
Assistant Professor in the Walker Department of Mechanical Engineering
The University of Texas at Austin (UT)

Abstract: Design is an interactive activity between designers and design tools in which design information is represented in different modalities, such as text, sketches, images, and computer-aided design (CAD) models. Traditionally, the transformation of design information from one modality to another, e.g., from textual descriptions to 2D sketches, has been performed manually by human designers. Recent developments in machine learning and artificial intelligence (AI) have made it possible to automate this cross-modal transformation. In particular, with the advent of generative AI, techniques such as text-to-image generation and recent text-to-video models have significantly advanced cross-modal synthesis in various applications. In engineering design, however, cross-modal synthesis remains challenging because the target design modality is typically a CAD representation (e.g., meshes and boundary representations). The challenge stems from the semantic gap between other modalities and CAD models, which can substantially influence the inference of design intent and the integrity of the resulting design during cross-modal transformation. In this talk, I will present two models we recently developed for cross-modal synthesis in design. With the first model, I demonstrate how we achieve sketch-to-3D shape generation; with the second, we solve a more challenging problem: automatically generating CAD sequences from a single image of a target part. These models can significantly reduce the time and cost of early-stage design by speeding up design conceptualization and iteration and shortening the product development cycle. Because the generated CAD models can be seamlessly integrated into many downstream engineering tasks, such as computational fluid dynamics (CFD) analysis and additive manufacturing, they enable a direct path from design ideation to prototyping.

Bio: Dr. Zhenghui Sha is an Assistant Professor in the Walker Department of Mechanical Engineering at The University of Texas at Austin (UT). Dr. Sha received his Ph.D. from Purdue University and was a Postdoctoral Fellow in the McCormick School of Engineering at Northwestern University. Dr. Sha's research focuses on system science and design science, as well as the intersection of these two areas. In particular, his research encompasses design theory, human-machine interaction, swarm manufacturing, and complex sociotechnical systems. He is a faculty member of the Center for Additive Manufacturing and Design Innovation (CAMDI) and an Affiliated Faculty member of the Oden Institute for Computational Engineering and Sciences at UT Austin.

Dr. Sha is the recipient of the 2022 Young Engineer Award (YEA) from the Computers and Information in Engineering (CIE) Division of the American Society of Mechanical Engineers (ASME) and received the Best Dissertation of the Year Award from the ASME CIE Division in 2017. He received the ASME Robert E. Fulton Best Paper Award twice, in 2013 and 2017. In 2023, Dr. Sha received an Honorable Mention for the Best Faculty Advisor Award from the Mechanical Engineering Graduate Students Board at UT Austin. He received a Reviewer with Distinction Award from the Journal of Mechanical Design and the Technical Committee Leadership Award from the ASME CIE Division in 2020. Dr. Sha is the inaugural Chair of the ASME Hackathon and has been a member of the ASME Hackathon Committee since 2019. He served as Chair of the ASME SEIKM Technical Committee from 2018 to 2019 and has been Program Chair of the ASME AI/ML Technical Committee since 2023.