Pretrained Graph Neural Network for Embedding Semantic, Spatial, and Topological Data in Building Information Models

Computer-Aided Civil and Infrastructure Engineering, 2025

Recommended citation: Han, J., Lu, X.Z., Lin, J.R.* (2025). Pretrained Graph Neural Network for Embedding Semantic, Spatial, and Topological Data in Building Information Models. Computer-Aided Civil and Infrastructure Engineering, 40(26), 4607-4631. doi: 10.1111/mice.70073

Abstract

Large foundation models have demonstrated significant advantages in civil engineering, but they primarily focus on textual and visual data, overlooking the rich semantic, spatial, and topological features in building information modeling (BIM) models. Therefore, this study develops the first large-scale graph neural network, BIGNet, to learn and reuse multidimensional design features embedded in BIM models. First, a scalable graph representation is introduced to encode the “semantic-spatial-topological” features of BIM components, and a dataset with nearly 1 million nodes and 3.5 million edges is created. Subsequently, BIGNet is proposed by introducing a new message-passing mechanism into GraphMAE2 and further pretrained with a node masking strategy. Finally, BIGNet is evaluated on various transfer learning tasks for BIM-based design checking. Results show that: (1) a homogeneous graph representation outperforms a heterogeneous one in learning design features, (2) considering local spatial relationships within a 30 cm radius enhances performance, and (3) BIGNet with graph attention network-based feature extraction achieves the best transfer learning results. This innovation leads to a 72.7% improvement in average F1-score over non-pretrained models, demonstrating its effectiveness in learning and transferring BIM design features and facilitating their automated application in future design and lifecycle management.
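To illustrate the graph representation described above, the sketch below builds a homogeneous graph over BIM components, linking any two whose centroids lie within the 30 cm radius the abstract identifies as most informative. This is a hypothetical minimal example, not the paper's implementation: the component dictionaries, field names, and centroid-distance criterion are illustrative assumptions (the actual method encodes richer semantic, spatial, and topological features).

```python
import math

RADIUS_M = 0.3  # the 30 cm local-neighborhood radius from the abstract

def build_graph(components):
    """Build a homogeneous graph from BIM components.

    components: list of dicts with 'id', 'category', and 'centroid'
    (x, y, z, in meters). Returns (nodes, edges), where nodes carry the
    semantic category and edges connect spatially close component pairs.
    """
    nodes = [{"id": c["id"], "category": c["category"]} for c in components]
    edges = []
    for i, a in enumerate(components):
        for b in components[i + 1:]:
            # Undirected edge when centroids fall within the local radius
            if math.dist(a["centroid"], b["centroid"]) <= RADIUS_M:
                edges.append((a["id"], b["id"]))
    return nodes, edges

# Toy example: a wall, a door hosted in it, and a distant column
components = [
    {"id": "wall-1",   "category": "Wall",   "centroid": (0.0, 0.0, 1.5)},
    {"id": "door-1",   "category": "Door",   "centroid": (0.1, 0.2, 1.5)},
    {"id": "column-1", "category": "Column", "centroid": (5.0, 0.0, 1.5)},
]
nodes, edges = build_graph(components)
print(edges)  # only the wall-door pair lies within 0.3 m
```

In the paper's pipeline, a graph like this would then feed a masked-autoencoder pretraining stage (GraphMAE2 with the proposed message passing), with node features masked and reconstructed.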

Graphical abstract

Download paper here

Download preprint here

This project is funded by the National Natural Science Foundation of China under Grant No. 52378306 and No. 52238011.
