Sam Shahsavani

B+H / AI Design Workflows

Local AI visualization for architectural concept work

At B+H, I supported concept and competition workflows where teams needed fast AI-assisted visual iteration, but confidential project material could not be casually pushed through public cloud tools.

Core Move

The useful move was connecting image generation to architectural inputs (sketch, massing, linework, depth, and site context) and passing the results through post-production judgment.

Local First

Confidential project material stayed out of public cloud image tools.

Controlled Inputs

Sketches, Rhino massing, line drawings, depth maps, masks, and site context anchored the output.

2024 Toolchain

Automatic1111, ComfyUI, Stable Diffusion, ControlNet, local models, LoRAs, and Photoshop.

Design Review

The goal was early visual exploration, not final rendering or replacing architectural judgment.

Why It Mattered Then

The difficulty was not the image. It was making the image useful.

In 2024, the workflow still had to be assembled by hand: local setup, model testing, control images, inpainting, and compositing, each made usable inside real design constraints.

01

The tools were not turnkey

This was before polished architecture AI workflows were common. Getting useful images meant wiring together local models, control images, prompts, masks, and manual cleanup.

02

Privacy changed the method

Because the work lived inside a company context, public cloud tools were not the right default. The workflow had to run locally and respect project confidentiality.

03

Architecture needed control

Pure prompting was too loose for design work. The useful move was to make AI follow architectural evidence: massing, perspective, linework, depth, and site context.

Workflow

A design-controlled image pipeline

The sequence matters. Each step adds constraint before the AI gets to improvise, then human judgment brings the output back into architectural communication.

Intent 01

Start from a sketch or design direction

The process began with a human design idea: a sketch, a program direction, a facade ambition, or a massing concept that needed fast visual exploration.

Annotated hand sketch used as an architectural input
An annotated sketch captured the architectural intent before model setup or prompting.
Geometry 02

Translate the idea into massing

Rhino gave the workflow a simple but crucial architectural base: proportion, viewpoint, site relationship, and the broad geometry the image should respect.

Rhino massing model used as an AI control input
The massing model established the spatial logic before image generation began.
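Repeatability depended on exporting every control image from the same camera. A minimal sketch of scripting that capture from Rhino's Python editor, assuming rhinoscriptsyntax; the path is a placeholder, and the scripted size option names vary by Rhino version:

```python
# Run inside Rhino's Python editor (rhinoscriptsyntax ships with Rhino).
# Captures the active viewport to disk so the linework and depth exports
# share one camera. Option names may differ between Rhino versions.
import rhinoscriptsyntax as rs

rs.Command(
    '-_ViewCaptureToFile "C:/projects/concept/massing_view.png" '
    '_Width=1536 _Height=1024 _Enter'
)
```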
Controls 03

Manufacture the constraints the model needs

Line drawings and depth maps were exported as ControlNet inputs. This was the technical hinge: the AI could explore atmosphere without losing the architecture.

Line drawing exported from a massing model
Linework helped preserve edges, silhouette, and the intended geometry.
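On this project the linework came out of the massing model itself. When only a raster view exists, one common stand-in is an edge-detected control image; a minimal sketch using OpenCV's Canny detector, with placeholder filenames and thresholds that need tuning per view:

```python
# A minimal stand-in for exported linework: derive a ControlNet-style
# edge image from a raster view. Thresholds are per-view assumptions.
import cv2

view = cv2.imread("massing_view.png", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(view, threshold1=100, threshold2=200)
cv2.imwrite("control_lines.png", edges)  # white lines on black, as ControlNet expects
```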
Depth 04

Add spatial hierarchy

Depth maps gave the model another reading of foreground, background, height, and mass. The more the generation was constrained, the less the model behaved like a random generator.

Depth map used as a ControlNet input
Depth mapping helped keep the image aligned with architectural space.
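A depth pass exported from the model often arrives as raw values rather than the 8-bit grayscale image a ControlNet depth model reads (near bright, far dark). A minimal normalization sketch, with placeholder filenames:

```python
# Normalize a raw depth export (e.g. a 16-bit render pass) into the
# 8-bit near-bright / far-dark image a ControlNet depth model expects.
import numpy as np
from PIL import Image

depth = np.asarray(Image.open("depth_raw.png"), dtype=np.float32)
depth = (depth - depth.min()) / (depth.max() - depth.min() + 1e-8)
depth = 1.0 - depth  # flip if the export stores near as small values
Image.fromarray((depth * 255).astype(np.uint8)).save("control_depth.png")
```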
Local AI 05

Iterate locally with model and prompt tests

Automatic1111 and ComfyUI made it possible to test models, LoRAs, ControlNet settings, prompts, and inpainting without sending project material to external services.

ComfyUI node graph for a local architectural image workflow
ComfyUI made the workflow explicit: models, conditioning, image inputs, prompts, and outputs wired together locally.
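A minimal sketch of one local iteration against Automatic1111's HTTP API, assuming the webui was launched with --api and the ControlNet extension installed. The prompt is illustrative, and the alwayson_scripts payload shape is an assumption that shifted across 2024 extension versions:

```python
# Queue one controlled generation on a local Automatic1111 instance.
# /sdapi/v1/txt2img is the webui API endpoint; the ControlNet arg dicts
# below follow the 2024 extension convention and may need adjusting.
import base64
import requests

def b64(path):
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode()

payload = {
    "prompt": "concept tower, overcast light, glass and terracotta facade",
    "negative_prompt": "people, text, watermark",
    "steps": 28,
    "width": 1024,
    "height": 768,
    "alwayson_scripts": {
        "controlnet": {
            "args": [
                {"input_image": b64("control_lines.png"), "module": "none",
                 "model": "control_v11p_sd15_canny", "weight": 1.0},
                {"input_image": b64("control_depth.png"), "module": "none",
                 "model": "control_v11f1p_sd15_depth", "weight": 0.6},
            ]
        }
    },
}
r = requests.post("http://127.0.0.1:7860/sdapi/v1/txt2img", json=payload)
r.raise_for_status()
for i, image in enumerate(r.json()["images"]):
    with open(f"out_{i:02d}.png", "wb") as f:
        f.write(base64.b64decode(image))
```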
Judgment 06

Composite and clean the result

The final step was not automatic. Outputs were selected, repaired, layered, and placed into context through Photoshop so the image could support a design conversation.

Photoshop compositing workflow
Human judgment turned model outputs into usable design-review imagery.
Tooling

The complexity makes the workflow visible.

The screenshots show the historical limitation: the workflow was not packaged yet. It had to be constructed from local interfaces, node graphs, settings, prompts, control inputs, and manual finishing.

ComfyUI

Node-based local workflow

This was not a single prompt box. It was a local graph of models, conditioning, image inputs, and output handling.

ComfyUI node graph for local architectural visualization
ComfyUI workflow graph used to structure local image generation experiments.
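Once a graph works, ComfyUI can also be driven headlessly: the local server accepts a workflow exported in its API format. A minimal sketch, with the filename and node id as placeholders:

```python
# Queue a saved ComfyUI graph on the default local server (port 8188).
# The graph must be exported via "Save (API Format)"; node id "3" is a
# placeholder for whichever KSampler node holds the seed.
import json
import random
import requests

with open("workflow_api.json") as f:
    graph = json.load(f)

graph["3"]["inputs"]["seed"] = random.randint(0, 2**31 - 1)
requests.post("http://127.0.0.1:8188/prompt", json={"prompt": graph}).raise_for_status()
```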
Automatic1111

Prompt, ControlNet, and generation tuning

Automatic1111 was useful for fast tests, img2img, ControlNet settings, inpainting, model selection, and keeping the workflow on a local machine.

Automatic1111 interface showing local image generation settings
Local interface with prompt, negative prompt, generation settings, image inputs, and outputs visible together.
Photoshop

Selection, repair, and compositing

The final artifact still needed design judgment: selecting useful outputs, repairing artifacts, balancing context, and composing the image for review.

Photoshop interface for compositing an AI-assisted architecture visualization
Post-production kept the output legible as architecture, not just as generated imagery.
Iteration Range

The value was comparison, not one perfect image.

The workflow let the team hold massing logic relatively steady while testing material, atmosphere, facade expression, lighting, and proportion across multiple outputs.

AI-generated tower iteration 1
Iteration 01
AI-generated tower iteration 2
Iteration 02
AI-generated tower iteration 3
Iteration 03
AI-generated tower iteration 4
Iteration 04
AI-generated tower iteration 5
Iteration 05
AI-generated tower iteration 6
Iteration 06
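A minimal sketch of how a grid like this can be scripted: reuse the payload from the step 05 sketch, hold the prompt and control images fixed, and vary only the seed, so the differences across outputs come from material and atmosphere rather than geometry. Seeds are arbitrary:

```python
# Seed sweep over the txt2img payload defined in the step 05 sketch.
# Geometry stays pinned by the control images; only the seed changes.
import base64
import requests

for i, seed in enumerate([101, 202, 303, 404, 505, 606], start=1):
    payload["seed"] = seed
    r = requests.post("http://127.0.0.1:7860/sdapi/v1/txt2img", json=payload)
    r.raise_for_status()
    with open(f"iteration_{i:02d}.png", "wb") as f:
        f.write(base64.b64decode(r.json()["images"][0]))
```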
Project Range

Two design situations, one repeatable method

The value was not a single lucky image. The same pattern could move across different project types because the workflow was based on controls, not only prompts.

Project 01

Entertainment and retail concept exploration

A concept image used to test mood, scale, lighting, and public-facing atmosphere during early design iteration.

AI-assisted visualization for an entertainment and retail concept
Early visual exploration, not final documentation.
Project 02

High-rise hotel context composite

A tower concept moved from sketch and massing through controlled generation and Photoshop placement into an urban context.

High-rise hotel concept composited into site context
Contextual composite after sketch, massing, ControlNet iteration, and cleanup.
Scope

Real constraints made the workflow useful.

The workflow mattered because it answered real constraints: local execution, confidentiality, controlled inputs, and design judgment.

Scope

Not AI designing architecture by itself

The work is not autonomous design or final client rendering. It is a controlled visualization workflow for early design review.

Workflow

A controlled visualization layer for early design

The useful work was translating immature AI tools into a local workflow for architects: private, constrained, iterative, and tied to design intent.

Operating Pattern

Emerging tools become valuable when the workflow respects the work.

The specific 2024 methods will keep aging, but the pattern still matters: understand the design problem, respect confidentiality, build the missing workflow, test it against real project constraints, and communicate the result clearly enough for a team to use.