Geometry Aware Operator Transformer as an Efficient and Accurate Neural Surrogate for PDEs on Arbitrary Domains


Math+ML+X Seminar Series

Organizer:

Angelica Aviles-Rivero

Speaker:

Shizheng Wen (ETH Zürich)

Time:

Fri., 16:00, Mar. 20, 2026

Online:

Voov (Tencent): 201-467-303

Title:

Geometry Aware Operator Transformer as an Efficient and Accurate Neural Surrogate for PDEs on Arbitrary Domains

Abstract:

Neural operators have emerged as promising surrogates for PDE solvers, yet applying them to domains with complex geometries — as encountered in most engineering applications — remains challenging. Among existing approaches, we observe a fundamental accuracy-efficiency tradeoff: accurate models tend to be computationally expensive and poorly scalable, while efficient ones sacrifice accuracy. In this talk, I will present GAOT (Geometry Aware Operator Transformer), which overcomes this tradeoff by combining a novel multiscale attentional graph neural operator encoder/decoder with geometry embeddings and a vision transformer processor. This design enables GAOT to handle arbitrary point cloud inputs, produce outputs at any query point, and scale to very large meshes efficiently. Experiments on 28 benchmarks across diverse PDEs show that GAOT achieves top accuracy and robustness while being the most efficient model among all baselines. I will further demonstrate GAOT's scalability on three large-scale 3D industrial CFD datasets — including automobile and aerospace aerodynamics with meshes up to 9 million points — where it achieves state-of-the-art performance.
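The abstract describes an encode–process–decode pipeline: an attentional graph encoder aggregates an arbitrary point cloud onto latent tokens, a transformer processes those tokens, and a decoder produces outputs at any query point. As a rough illustration only (not GAOT's actual implementation, which uses multiscale attentional graph neural operator layers and a vision transformer), a minimal distance-weighted attention sketch of that three-stage structure might look like this; all function and variable names here are hypothetical:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def encode(points, values, latent_coords, scale=0.1):
    """Aggregate an arbitrary point cloud (N points, C channels) onto a
    fixed set of M latent tokens via distance-based attention weights.
    Schematic stand-in for GAOT's attentional graph encoder."""
    d2 = ((latent_coords[:, None, :] - points[None, :, :]) ** 2).sum(-1)  # (M, N)
    w = softmax(-d2 / scale, axis=1)   # each token attends over all input points
    return w @ values                  # (M, C) latent token features

def process(tokens):
    """One parameter-free self-attention pass over the latent tokens.
    Schematic stand-in for the (vision-)transformer processor."""
    scores = softmax(tokens @ tokens.T / np.sqrt(tokens.shape[1]), axis=1)
    return scores @ tokens

def decode(latent_coords, tokens, queries, scale=0.1):
    """Interpolate latent tokens to arbitrary query locations, so the
    surrogate can be evaluated at any point of the domain."""
    d2 = ((queries[:, None, :] - latent_coords[None, :, :]) ** 2).sum(-1)  # (Q, M)
    w = softmax(-d2 / scale, axis=1)
    return w @ tokens                  # (Q, C) predicted field at queries

# Toy usage: 2D point cloud input, outputs at unrelated query points.
rng = np.random.default_rng(0)
points = rng.uniform(0, 1, (100, 2))           # arbitrary input mesh/cloud
values = np.sin(points[:, :1] * np.pi)         # scalar field samples
latent_coords = rng.uniform(0, 1, (16, 2))     # fixed latent token positions
queries = rng.uniform(0, 1, (50, 2))           # any evaluation points

out = decode(latent_coords, process(encode(points, values, latent_coords)),
             queries)
print(out.shape)  # (50, 1)
```

Note that this sketch omits everything that makes GAOT accurate and scalable (learned weights, geometry embeddings, multiscale neighborhoods); it only mirrors the dataflow that lets the model accept arbitrary point clouds and answer at arbitrary query points.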
