buildingSMART Forums

Federated Decentralised CDE

I reject modelling as a single whole what is not a whole according to empirical realism.
I reject modelling practices that diverge from reality and make life easier for software developers at the cost of models that do not conform to empirical realism.

Modelling of windows and doors with automatic component wrappings and splays is the duty of the microservices that operate on the realistic model. The model itself shall be based on a precise model conception that is consistent with empirical realism.

I welcome operations for automatic component wrappings and splays.
I refuse model conceptions that impose compounds in contrast to empirical realism (e.g. optional insulation that is mounted to a concrete wall in a different time period).

I refuse software models that are inconsistent with empirical realism (e.g. in contradiction to the way the physical insulation is actually set up)

Model conceptions need to abstract from reality. This can be done by omitting details or by simplified representation of details. But I refuse model conceptions that prescribe wrong details (e.g. a 45-degree insulation cut).

I welcome model conceptions that
a) omit unknown details (e.g. the insulation cut angle)
b) request user specification of a detail (e.g. the insulation cut angle)
c) use a common detail as default (e.g. a common insulation cut angle) and allow the user to deviate from the default
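The three options above can be sketched in a few lines. This is a hypothetical illustration, not any existing API; the class name, the 45-degree default, and the 30-degree user value are invented for the example:

```python
from dataclasses import dataclass
from typing import Optional

# Assumed "common" default cut angle, for illustration only.
COMMON_CUT_ANGLE_DEG = 45.0

@dataclass
class InsulationDetail:
    # (a) the detail is simply omitted while it is unknown
    cut_angle_deg: Optional[float] = None

    def effective_cut_angle(self, default: float = COMMON_CUT_ANGLE_DEG) -> float:
        # (b)/(c): a user-specified value always wins; the common default is
        # only substituted when a consumer explicitly asks for a usable value.
        return self.cut_angle_deg if self.cut_angle_deg is not None else default

omitted = InsulationDetail()                       # (a) unknown detail
specified = InsulationDetail(cut_angle_deg=30.0)   # (b) user specification

print(omitted.cut_angle_deg)           # None: the model prescribes nothing
print(omitted.effective_cut_angle())   # 45.0: default applied on demand
print(specified.effective_cut_angle()) # 30.0: user intervention wins
```

The key point of the sketch is that the stored model never fabricates a wrong detail; the default only appears at the point of use.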

I think omitting details is not a good idea

Indeed, building “patterns” for walls, floors, roofs, etc. that are this close to their reality is the answer

How many different walls do we have in the world? We just need to build their “patterns”; then, in the software, you just select the pattern you want and draw your wall

Also, these patterns allow you to add or remove their “layers” and “details” too

I have known its name for a long time: “ATOMIC DESIGN”

One obvious example of omitting details in structural engineering is the inner structure of concrete. We typically omit the details of the aggregate and model the concrete as a continuum. Sometimes we omit details of the reinforcement and model the combination of concrete and reinforcement as a continuum with specific mechanical properties.
This is abstraction, but still somehow consistent with reality.
A 45-degree insulation cut typically is not consistent with reality, is it?

It’s not
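The homogenisation mentioned above (treating concrete plus reinforcement as one continuum with effective properties) can be sketched with a simple rule of mixtures. The moduli and the 2% reinforcement ratio are typical orders of magnitude used purely for illustration, not design values:

```python
# Illustrative homogenisation: replace concrete + reinforcement by a single
# continuum with an effective Young's modulus (Voigt / parallel rule of
# mixtures along the reinforcement direction). Values are assumptions.
E_CONCRETE = 30e9  # Pa, typical order of magnitude for concrete
E_STEEL = 200e9    # Pa, typical order of magnitude for reinforcing steel

def effective_modulus(steel_volume_fraction: float) -> float:
    """Effective stiffness of the homogenised continuum."""
    vf = steel_volume_fraction
    return vf * E_STEEL + (1.0 - vf) * E_CONCRETE

# A 2% reinforcement ratio: the abstraction keeps the stiffening effect of
# the rebar while omitting its geometry entirely.
print(round(effective_modulus(0.02) / 1e9, 2))  # 33.4 (GPa)
```

This is the sense in which the abstraction stays consistent with reality: it omits a detail (rebar geometry) but does not prescribe a wrong one.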

I think what we need is just two things (some software today has both, for instance Dassault Systèmes software):

  1. STEP AP 242
  2. Modelica

Then we will be very close to reality, if we also add Atomic Design to the (digital) built-environment industry (this part is new, and I am working on it)

let’s be realistic (and i’m an advocate of the designers, mostly architects here - i’m one myself): architects won’t buy a software product that doesn’t provide that kind of time-saving automation

just forget it.

we need solutions that provide both.

i’m with you, but it’s not the only flaw of proposed solutions. if we go deeper, the whole modelling principle might appear wrong in those applications. that’s why i’m here for new insights.

I fully agree that time-saving automation is mandatory.
But time-saving automation is the duty of the microservices (or authoring software products) that are operating on the models based on realistic model conception.
The model conception shall not impose modelling paradigms, like compound walls, that complicate the LOD evolution, e.g. from LOD-200 to LOD-300 and further to LOD-350.

(figures illustrating the LOD-200, LOD-300 and LOD-350 stages omitted)

I am really happy to see some like-minded friends:

A year ago I proposed this to bSI friends, believing (then and now) that in this way the bSI rooms can work better on IFD and IFC and will develop classes continuously

However, I think that today IfcXtreme will play a role as something like FaaS for the “semantics layer” (for which I have something better, called FBM, because it's a surprise), plus Passports for the material and geometry layer

I shared this before and many were against this idea, but today some have started to think about it.

“You don’t need different LoDs; you just need the highest LoD, developed once, not phase by phase, which was good some years ago”

Some years ago, especially in big projects, contractors and sub-contractors did not trust the models that the owner’s consultants and designers had developed, and wanted to develop them again from scratch. So in those days (and even still today) LoDs were meaningful, but today things are changing “so fast”

P.S.: The same applies to LoI and LoIN too

If LOD-200 is no longer beneficial, then wall layers are even less beneficial than ever, aren’t they?

bS defines “IfcWallStandardCase”
“The IfcWallStandardCase handles all cases of walls, that are extruded vertically.”
“… and have a single thickness along the path for each wall layer …”
Does this make sense, given that we know it will not reflect reality at all?
“… the quantity could be absent or in excess because the BIM model has a low level of development (LOD). The building elements that have layered structures such as architectural walls always face such problems. This is because during the construction phase, each layer may have a different dimension.”
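The quantity problem quoted above can be made concrete with a small sketch. All numbers are invented: a layer modelled with a single constant thickness (as in an IfcWallStandardCase-style material layer set) is compared against an as-built layer whose thickness varies along the wall:

```python
# Hypothetical quantity take-off for one wall layer. The model assumes a
# single thickness along the whole path; on site, the same layer was built
# with different thicknesses on four segments. All values are invented.
WALL_LENGTH_M = 10.0
WALL_HEIGHT_M = 3.0

modelled_thickness_m = 0.10  # single thickness for the whole layer path

# As-built: the layer measured on four equal segments of the wall.
as_built_thickness_m = [0.10, 0.12, 0.08, 0.14]
segment_length_m = WALL_LENGTH_M / len(as_built_thickness_m)

modelled_volume = WALL_LENGTH_M * WALL_HEIGHT_M * modelled_thickness_m
actual_volume = sum(t * segment_length_m * WALL_HEIGHT_M
                    for t in as_built_thickness_m)

print(round(modelled_volume, 3))  # 3.0 (m^3, modelled)
print(round(actual_volume, 3))    # 3.3 (m^3, as built: quantity in deficit)
```

The single-thickness assumption is exactly what produces the “absent or in excess” quantities the quote describes.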

Besides this, I still do not understand why a standard-case wall is modelled

  • by bS as a vertical extrusion (IfcWallStandardCase)
  • by software vendor A as a horizontal extrusion
  • by software vendor B as a boundary representation (B-rep)
  • by software vendor XY as ???

I feel like I am consulting medical doctors: ask three of them and get back at least five diverging answers.

It is the purpose of authoring software to ease life for building-domain experts, not for software engineers. Oversimplified model conceptions, made as concessions to software engineers, limit artistic freedom and limit interoperability.
Model conceptions must follow building-domain requirements and must respect building-technology facts closely.

I re-establish the relationship to the original topic “Federated Decentralised CDE”:

If we find a standardized way to describe building properties and building-element properties more consistently, then statement-based requirements engineering becomes feasible.
The statements originating from the various stakeholders are collected in a “federated decentralised CDE”. The statements can be checked for consistency, and the resolution of discovered discrepancies can be guided by workflow-management tools. Proposals for topologies and geometries can be inferred from the requirement statements by algorithms. The proposals can then be refined by additional statements.
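A minimal sketch of the statement-collection and consistency-check idea, assuming a deliberately simple data model: requirement statements as (stakeholder, subject, property, value) tuples in a shared store. The stakeholders, element IDs, and property names are invented for illustration:

```python
from collections import defaultdict

# Invented requirement statements from several stakeholders, as they might
# be collected in a federated decentralised CDE.
statements = [
    ("architect",  "wall-W1", "fire_rating_min", "EI60"),
    ("mep",        "wall-W1", "fire_rating_min", "EI90"),  # conflicts
    ("structural", "wall-W1", "thickness_m", 0.25),
]

def find_conflicts(stmts):
    """Flag subject/property pairs for which stakeholders stated
    different values; such discrepancies would then be routed to a
    workflow-management tool for resolution."""
    values = defaultdict(set)
    authors = defaultdict(list)
    for who, subject, prop, value in stmts:
        values[(subject, prop)].add(value)
        authors[(subject, prop)].append(who)
    return {key: (sorted(map(str, vals)), authors[key])
            for key, vals in values.items() if len(vals) > 1}

conflicts = find_conflicts(statements)
print(conflicts)  # flags the fire-rating discrepancy between architect and mep
```

A real CDE would of course need identity, provenance, and versioning on every statement; the sketch only shows the core check that makes discrepancies discoverable at all.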

I call this again “Computer Aided Evolutionary Design (CAED)” and it is related to “Federated Decentralised CDE” by the “statements”.

The wall-layer method is for solving current issues, especially with IfcWallStandardCase.
It is based on walls that have a high “granularity”, something like what we see in facade design.
It supports patterns and templates, which means you have better control over layers and wall parts.

Standard Case is an old idea which IFC still repeats

Back to the Federated Decentralized Connected Data Environment (CDE), which you prefer to call CAED: a known method with its own pros and cons

I think I have shared everything needed to build an efficient CDE or CAED:

  1. Modeling and Simulation-based Systems Engineering (M&SBSE) --> SysML + Modelica + STEP AP 242
  2. A Data Lakehouse that supports Big Data (of which Linked Data is a part) and AI (which is more than BI)

This is why, today, some well-known companies have started to use software like CATIA, etc. “more than before”

the granularity offered in this example, as in many from the bimforum, is simply wrong. the bimforum work is a proposal, not a standard, and not anything that has to be followed.

in the lod 200 phase of the design you don’t see any walls with components; you only see objects at the highest aggregation level. sometimes the schematic model has only spaces as volumes; in the next project phase you see more detailed elements.

i think the infamous depiction of a chair in 5 lod levels has done massive damage to the understanding of lod data saturation. such a chair is only present in the lod 400+ model, if at all.