Abstract

Generative design in architecture has long been studied, yet most algorithms are parameter-based, require explicitly stated rules, and yield design solutions that are heavily experience-dependent. Without a real understanding of the generative process behind architectural design, and without consensus evaluation metrics, empirical knowledge is difficult to transfer to similar projects or pass on to the next generation of designers. We propose an early-design-phase workflow that synthesizes and generates building morphology with artificial neural networks. Using 3D building models from the financial district of New York City as a case study, this research shows that neural networks can capture the implicit features and styles of the input dataset and create a population of design solutions coherent with those styles. We constructed our database in two data representation formats, a voxel matrix and a signed distance function, to investigate the effect of shape representation on the performance of building-shape generation. A generative adversarial network and an auto-decoder were used to generate the volumes. Our study establishes the use of implicit learning to inform design solutions. Results show that both networks can grasp the implicit building forms and generate shapes in a style similar to the input data, with the auto-decoder using the signed-distance-function representation providing the highest-resolution results.
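The two shape representations named above can be illustrated with a minimal sketch. Assuming a voxelized building mass on a regular grid (here a hypothetical 32³ occupancy grid with a solid box standing in for a building), the voxel-matrix format stores binary occupancy, while the signed-distance-function format stores, for each cell, the distance to the building surface, negative inside and positive outside. The grid size and the distance-transform construction are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

# Hypothetical 32^3 occupancy grid: a solid box stands in for a
# voxelized building mass (illustrative only).
res = 32
vox = np.zeros((res, res, res), dtype=bool)
vox[8:24, 8:24, 8:24] = True

# Voxel-matrix representation: binary occupancy (inside = 1, outside = 0).
occupancy = vox.astype(np.float32)

# Signed-distance representation: negative inside the shape, positive
# outside (distances in voxel units, via Euclidean distance transforms).
dist_outside = distance_transform_edt(~vox)  # distance to shape, measured outside
dist_inside = distance_transform_edt(vox)    # distance to boundary, measured inside
sdf = dist_outside - dist_inside

print(occupancy.shape)          # (32, 32, 32)
print(sdf[16, 16, 16] < 0)      # True: grid centre lies inside the box
print(sdf[0, 0, 0] > 0)         # True: grid corner lies outside the box
```

Because the SDF varies smoothly across the surface rather than jumping between 0 and 1, a network regressing it can recover geometry at sub-voxel precision, which is consistent with the abstract's finding that the SDF-based auto-decoder yields the highest-resolution results.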
