Research

Existing buildings and historic structures play key roles in the communities they are a part of. Not only are they an important part of our cultural legacy, but they are also crucial to addressing engineering challenges such as designing sustainable infrastructure, easing space shortages, containing rising construction costs, and lowering life-cycle costs. By reusing or recycling existing and historic structures, communities can reduce their carbon footprint, decrease the amount of solid waste sent to landfills, and save on new construction materials. Before existing structures can be preserved or renovated, however, an assessment of the existing building must be completed. This assessment involves documenting where damage is located, understanding how that damage originated, and evaluating the current capacity of the structure.

My group leverages AI to reconcile heterogeneous, transdisciplinary data about existing buildings into actionable, explainable information. This synthesized knowledge is then used to transform how we interact with structures from the past to make them intelligent, resilient, and sustainable cornerstones for cities of the future.

This research lies at the intersection of civil engineering, computer science, and historic preservation, combining physics-based and data-driven modeling with the goal of making real-time prediction and monitoring within a Digital Twin a reality. This approach pairs the interpretability and clear-box nature of physics-based modeling with the accuracy and pattern-recognition capabilities of data-driven machine learning algorithms. More specifically, our research focuses on the following aspects of the preservation and adaptive reuse of existing and historic structures as a sustainable infrastructure solution:

Documentation

Photogrammetry, laser scanning, UAV

Image credit: https://www.3deling.com/

The first step in preserving and reusing existing infrastructure is documentation of the as-built geometry and as-weathered conditions. A computer vision technique called photogrammetry enables a practitioner to create a digital, scalable 3D model from 2D images taken either with a terrestrial camera or with a drone. A problematic facet of this process is the limited accuracy of existing methods. We have developed a robust, statistics-based method for minimizing errors during documentation and published a paper on the topic. Since its creation, this method has been successfully applied to several historic structures in the United States, Italy, and Nepal to aid in preservation and diagnostic efforts.
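
As a simplified illustration of the kind of statistics-based error screening described above, the Python sketch below flags tie points whose reprojection error is a robust outlier; the threshold rule and the sample error values are assumptions for illustration, not the published method.

import numpy as np

def flag_unreliable_points(reprojection_errors_px, k=3.0):
    """Flag tie points whose reprojection error is a robust outlier.

    reprojection_errors_px : 1-D array of per-point reprojection errors (pixels)
    k : multiplier on the median absolute deviation (MAD)
    """
    errors = np.asarray(reprojection_errors_px, dtype=float)
    median = np.median(errors)
    mad = np.median(np.abs(errors - median))
    # 1.4826 scales the MAD so it is comparable to a standard deviation
    threshold = median + k * 1.4826 * mad
    return errors > threshold, threshold

# Example: per-point errors exported from a photogrammetry package (values are made up)
errors = [0.4, 0.5, 0.6, 0.45, 3.2, 0.55, 0.5, 2.8]
outliers, thr = flag_unreliable_points(errors)
print(f"threshold = {thr:.2f} px, outlier indices = {np.flatnonzero(outliers)}")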

Data extraction

Eye tracking, point cloud segmentation, semantic segmentation

Image credit: https://medium.com/bbm406f18/week-7-facade-parsing-using-deep-learning-7f68d30c8fd6

Once the geometry and conditions of a structure have been captured, further information about the types of damage present and their severity needs to be extracted. In previous research, we developed a new convolutional neural network (CNN) architecture that is able to identify cracks in masonry structures. While crack identification has been well studied for concrete and asphalt, the joints in masonry make it a much more complex problem.
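
For readers unfamiliar with the approach, the following minimal PyTorch sketch shows what a patch-level crack classifier might look like; the architecture, layer sizes, and class labels are illustrative stand-ins, not the CNN developed in our published work.

import torch
import torch.nn as nn

class CrackPatchCNN(nn.Module):
    """Toy CNN that classifies image patches as 'crack' vs 'no crack'."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, 2)  # logits for {no crack, crack}

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)

model = CrackPatchCNN()
patches = torch.randn(8, 3, 64, 64)   # batch of 64x64 RGB patches
logits = model(patches)
print(logits.shape)                   # torch.Size([8, 2])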

Non-destructive evaluation (NDE) is a crucial tool for evaluating the health of building structures. Electromagnetic scattering-based techniques such as Ground Penetrating Radar (GPR) enable users to observe the state of underlying structures without damaging the building. However, interpreting GPR scans is a convoluted process that often requires human experts to sift through tedious volumes of data. Our work involves developing both data-driven and hybrid physics-based models for the automated generation of underlying structural maps from GPR scans. Experimental and simulated scans are used to discover intrinsic connections between building structure attributes and how they impact GPR scans.
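
As a minimal physics-based illustration of why buried features are hard to read directly from GPR data, the Python sketch below computes the hyperbolic two-way travel-time signature that a single point reflector produces in a B-scan; the velocity and geometry values are assumed purely for illustration.

import numpy as np

# Two-way travel time to a point reflector at depth d and lateral offset x0,
# for an antenna at position x: t(x) = 2 * sqrt(d**2 + (x - x0)**2) / v
v = 0.1            # assumed wave velocity in masonry, m/ns (illustrative)
d, x0 = 0.3, 1.0   # reflector depth (m) and lateral position (m), illustrative

antenna_positions = np.linspace(0.0, 2.0, 101)                         # scan line (m)
two_way_time = 2.0 * np.sqrt(d**2 + (antenna_positions - x0)**2) / v   # ns

# The apex of the hyperbola sits directly above the reflector
apex = antenna_positions[np.argmin(two_way_time)]
print(f"hyperbola apex at x = {apex:.2f} m, t = {two_way_time.min():.1f} ns")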

Data synthesis

Multi-modal data fusion, factor analysis

Image credit: https://www.bdcnetwork.com/augmented-reality-goes-mainstream-12-applications-design-and-construction-firms

Once large, heterogeneous data sets have been collected, they are often layered over one another to ascertain any correlations. While this is possible to do qualitatively for a few layers, it quickly becomes a more complex problem since the number of pairwise comparisons grows quadratically with the number of layers. Factor analysis is a statistics- and machine-learning-based method for describing the variability among observed variables in terms of a potentially lower number of unobserved, or latent, variables called factors. For example, it is possible that variations in ten observed variables (soil moisture, results of thermal imaging, dew point, crack width, movement, etc.) mainly reflect the variations in only a few underlying variables. Factor analysis can provide actionable information about which factors should be instrumented with sensors, both in the short term for diagnostics and in the long term as part of life-cycle management. By understanding how different systems (deterioration, moisture, movement) are related through data synthesis, efficient monitoring strategies optimized for the conditions of a specific site can be developed.
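
A minimal Python sketch of this idea, using scikit-learn's FactorAnalysis on synthetic multi-sensor data, is shown below; the latent structure, variable counts, and loadings are synthetic, purely to illustrate the workflow, and are not drawn from a real monitoring campaign.

import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Synthetic example: two latent drivers (e.g., moisture and movement)
# generating ten observed monitoring variables plus noise.
n_samples, n_observed, n_latent = 500, 10, 2
latent = rng.normal(size=(n_samples, n_latent))
loadings = rng.normal(size=(n_latent, n_observed))
observed = latent @ loadings + 0.3 * rng.normal(size=(n_samples, n_observed))

fa = FactorAnalysis(n_components=n_latent, rotation="varimax")
fa.fit(observed)

# The loadings show which observed variables each latent factor drives most strongly,
# suggesting where sensors would be most informative.
print(np.round(fa.components_, 2))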

Physics-based modeling

Discrete element modeling, finite element modeling, information theory, manifold learning

Image credit: https://www.bdcnetwork.com/augmented-reality-goes-mainstream-12-applications-design-and-construction-firms

Pure data analytics cannot provide answers about what caused crack patterns or movement within a structure; a hybrid approach combining synthesized data sets with physics-based modeling is required. We have worked in collaboration with colleagues at the University of California, San Diego to develop a method for integrating data from documentation (laser scanning and photogrammetry), non-destructive evaluation (thermal imaging), and physics-based modeling (finite-distinct element modeling and cluster analysis) to diagnose the most probable causes of cracks in existing structures. This method was validated against experimental testing and has been applied to many international and domestic case studies. While physics-based models can provide detailed insights, they can be computationally expensive and time intensive. For this reason, we have begun exploratory work at the intersection of manifold learning and numerical simulation that would remove the need for repeated diagnostic simulations.
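
The following Python sketch illustrates the general manifold-learning idea with scikit-learn's Isomap applied to synthetic simulation snapshots; the snapshot data and parameter choices are assumptions for illustration and do not represent our exploratory method.

import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(1)

# Pretend each row is a flattened response field from one physics-based
# simulation run; runs vary along a few hidden parameters (e.g., load, stiffness).
params = rng.uniform(size=(200, 2))            # hidden simulation inputs
basis = rng.normal(size=(2, 500))
snapshots = np.sin(params @ basis) + 0.01 * rng.normal(size=(200, 500))

# Embed the high-dimensional snapshots on a low-dimensional manifold,
# which a surrogate could later interpolate instead of re-running simulations.
embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(snapshots)
print(embedding.shape)   # (200, 2): each run mapped to two coordinates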

Analysis of historic construction techniques

Traditional building techniques, archaeological reconstructions, history of construction

Image credit: https://www.bdcnetwork.com/augmented-reality-goes-mainstream-12-applications-design-and-construction-firms

Understanding how damage occurred on an existing structure is only part of the battle. Understanding how unique, historic construction techniques function and how they affect stability analysis is equally crucial to their preservation. We are working to understand how best to assess the functionality of historic construction techniques such as Roman bonding courses and zipper vaults.

Masonry buildings make up a significant portion of the built heritage in most countries and are susceptible to natural hazards. Over the years, various methodologies have been developed to safeguard these structures, particularly against earthquakes. In contrast, masonry failures under extreme wind loading remain largely uncharacterized despite the annual increase in the number of tornadoes in the United States. Although considerable research addresses the intersection of extreme wind loading and structural strengthening, historic and aging infrastructure, specifically unreinforced masonry buildings, is often neglected in this work, resulting in its omission from the building code. The overarching goal of this research is therefore to identify the different modes of failure of unreinforced, historic masonry structures under extreme wind loading. Building information data are explored using dimensionality reduction techniques to quantify the contributions of specific building features to damage states. Additionally, point clouds are converted to finite element models to quantitatively understand how variation in the relevant building features influences historic masonry construction under tornadic loading. This research will enable targeted retrofitting by identifying and characterizing the failure modes of unreinforced, historic masonry structures under tornado loading, better equipping them to handle future disaster scenarios.
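
As a simple illustration of dimensionality reduction on building-inventory data, the Python sketch below applies principal component analysis to a hypothetical feature table; the features and values are placeholders, and PCA stands in generically for the techniques under investigation.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)

# Hypothetical building-inventory table: each row is one masonry building,
# columns are features such as wall thickness, roof span, opening ratio, etc.
features = rng.normal(size=(150, 6))

pca = PCA(n_components=2)
scores = pca.fit_transform(features)

# Component loadings indicate which building features dominate each direction
# of variation, which can then be compared against observed damage states.
print(np.round(pca.components_, 2))
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))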

Additionally, our group works on methods for generating adaptive reuse layout solutions that meet structural integrity standards and minimize Construction & Demolition (C&D) costs. Shape Grammar (SG) is a rule-based approach well suited to encoding human knowledge and translating design patterns into spatial transformations. Evolutionary computational approaches, such as Genetic Algorithms, can incorporate physics-based simulation to ensure structural stability and efficient load-path solutions. Since spatial and structural solutions may not evolve in the same direction, a third method is proposed to synthesize and optimize layout solutions for adaptive reuse. Machine learning (ML) techniques such as Generative Adversarial Networks (GANs) have gained prominence in the past decade for their ability to learn highly complex underlying distributions of data samples. This research therefore explores different generative design (GD) techniques for automating and optimizing the adaptive reuse of historic buildings into multi-family apartments.
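
To make the evolutionary step concrete, the toy Python sketch below runs a selection, crossover, and mutation loop on a two-variable layout encoding; the encoding and the stand-in fitness function are illustrative assumptions and do not include the physics-based structural checks used in the actual workflow.

import numpy as np

rng = np.random.default_rng(3)

def fitness(layout):
    """Stand-in objective: reward keeping existing walls (low demolition)
    while penalizing layouts with fewer than four units (illustrative)."""
    walls_kept_pct, units = layout
    return walls_kept_pct - 5.0 * max(0, 4 - units)

# Each individual encodes (percentage of existing walls kept, number of units)
population = np.column_stack([rng.uniform(0, 100, 30), rng.integers(1, 8, 30)])

for generation in range(50):
    scores = np.array([fitness(ind) for ind in population])
    parents = population[np.argsort(scores)[-10:]]                  # selection
    idx_a = rng.integers(0, 10, 30)
    idx_b = rng.integers(0, 10, 30)
    children = 0.5 * (parents[idx_a] + parents[idx_b])              # crossover (blend)
    children[:, 0] = np.clip(children[:, 0] + rng.normal(0, 5, 30), 0, 100)   # mutation
    children[:, 1] = np.rint(np.clip(children[:, 1] + rng.integers(-1, 2, 30), 1, 8))
    population = children

best = population[np.argmax([fitness(ind) for ind in population])]
print("best layout (walls kept %, units):", np.round(best, 1))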


Materials science for repairs, strengthening, and retrofitting

Image credit: https://www.bdcnetwork.com/augmented-reality-goes-mainstream-12-applications-design-and-construction-firms

Lastly, our research group also focuses on the intersection of materials science, civil engineering, and architectural engineering for repairs, strengthening, and retrofitting. To do this, we harness the power of machine learning to improve the durability of the built environment while also reducing the embodied carbon footprint of cement materials. We use computational thermodynamics methods to speed up the discovery of new cement materials and to predict their performance over long time scales, including the geological time scales relevant to nuclear waste disposal applications. By pairing experimental and computational methods, results can be verified and used to provide valuable information for the additive manufacturing of cementitious materials. These technologies help predict material behavior, particularly fresh-state behavior, the long-term stability of cements, and their mechanical durability in the hardened state. Overall, this information will help enhance material behavior and performance.
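
As a minimal illustration of the data-driven side of this work, the Python sketch below fits a random forest regressor to a synthetic mix-design dataset to predict compressive strength; the variables, values, and model choice are assumptions for illustration only.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

# Hypothetical dataset: mix-design variables (e.g., water/cement ratio,
# supplementary cementitious material fraction, curing age) versus measured
# compressive strength. All values are synthetic.
X = rng.uniform(size=(300, 3))
strength = 60 - 40 * X[:, 0] + 10 * X[:, 1] + 15 * X[:, 2] + rng.normal(0, 2, 300)

X_train, X_test, y_train, y_test = train_test_split(X, strength, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("R^2 on held-out mixes:", round(model.score(X_test, y_test), 2))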