buildingSMART Forums

Presenting Blender as a new IFC authoring tool

I think there are some slight misunderstandings here - IFC has, since the very beginning, always decoupled geometry from BIM data. It has always been fundamentally possible to model geometry with a whole variety of techniques, reuse it multiple times, or even have no shape at all, for any arbitrary BIM object. The fact that most authoring tools choose to limit your modeling based on the type of object has been purely an artificial vendor limitation.

That said, I thought I’d just update the thread with some of the new features since I originally posted:

  • Improved support for project libraries, so you can have multiple project libraries and easily relate/unrelate them to projects
  • Property sets now detect data types from property set templates, so you can define your own property set templates and it will read them to allow you to enter data quickly. I am not aware of other vendors using property set templates, so this is exciting news!
  • Implemented IFC georeferencing through map conversions and target CRS definitions (see the sketch after this list). This makes three vendors that I am aware of that do this properly: ArchiCAD, FreeCAD, and now Blender.
  • You can now export 2D / 3D wireframe geometry, which allows for significant file size savings where a full solid representation is not required.
  • Export basic material colours and rendering styles, including transparency.
  • Allow export of an externally defined material, such as a .blend file, .vrmat file, or even a Radiance .mat file for lighting simulation!
  • Units are no longer hardcoded, and now support different types of metric units which can be set through the Blender UI, so you can choose to work in metres, millimetres, etc.
  • Allow property sets to be assigned to types, and have object property sets override type property sets.
  • Some optimisations and bugfixes
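
As a rough sketch of what the georeferencing data looks like in an IFC4 file (illustrative only - the EPSG code, offsets and rotation below are placeholder values, not output from the add-on), something along these lines can be written with IfcOpenShell:

```python
# Minimal sketch: adding IFC4 georeferencing entities with IfcOpenShell.
# The EPSG code, offsets and rotation are placeholder values; assumes an IFC4 file.
import ifcopenshell

f = ifcopenshell.open("model.ifc")
context = f.by_type("IfcGeometricRepresentationContext")[0]

# Target coordinate reference system, identified here by a placeholder EPSG code
crs = f.create_entity(
    "IfcProjectedCRS",
    Name="EPSG:28356",
    Description="MGA Zone 56 (placeholder)",
    GeodeticDatum="GDA94",
    MapProjection="Transverse Mercator",
    MapZone="56",
)

# Map conversion from the local engineering coordinates to the target CRS
f.create_entity(
    "IfcMapConversion",
    SourceCRS=context,
    TargetCRS=crs,
    Eastings=333000.0,
    Northings=6250000.0,
    OrthogonalHeight=10.0,
    XAxisAbscissa=1.0,   # cos of the rotation to grid north
    XAxisOrdinate=0.0,   # sin of the rotation to grid north
    Scale=1.0,
)

f.write("model-georeferenced.ifc")
```

The IfcMapConversion ties the local engineering coordinates of the geometric representation context to the projected CRS, which is what downstream tools need in order to place the model correctly on a map.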

Here is a cool little graphic showing the 2D representation / wireframe view support for a chair furniture element:


Thanks Dion, as always you’re on the right path.

Let me explain today’s issues related to “ISO 19650” and “Digital Twin(s)”, which will give a better view of today and tomorrow

Once firms start to implement ISO 19650 in their projects, file sizes will increase because of the Level of Information Need (LoIN)

Also, Digital Twin(s), and especially their Time Series (which we also see in IFC4 ADD2 TC1), will increase file sizes even more

So, some have started to think about “data/information exchange” and to look for “efficient” ways:
We hear some whispers about “binary solutions”
We see some have started to focus on a “decoupling strategy”

Earlier I saw that Autodesk decreased Navisworks file sizes by about half.
I see Bentley Systems at the SQLite Symposium these days, which I think is logical: Bentley Systems, as a leader in infrastructure, mainly uses CityGML I think, and CityGML to SQL is a normal workflow
Also, SQLite plays, or will play, a vital role in the Internet of Things (IoT), and some have focused on that

I think today we need “dynamic solutions”, and some have focused on this area

jeeez, what we’re striving for now is the right amount of information ‘coupled’ with ifc. of decoupled and incoherent data we have more than enough :wink:

digital twin is an interesting concept, though. i’ve just spoken with one startup about utilising this here in poland…

We are working on our international Smart-FM startup in residential buildings (and have had to solve big, big issues/obstacles, which we have done)

i can’t report that startup’s data (they asked me not to yet), but otherwise i understand a digital twin to couple all possible data to the geometry representation of the physical object, especially a self-updating one.

A few new features have been implemented:

  • Objects and types can now be associated with external documents, such as plans, brochures, specifications, warranties, etc.
  • Objects and types can now be classified using classifications such as Uniclass / Omniclass / anything you choose.
  • Objects and types can now record qualitative constraints in the form of objectives. This can help record design intentions and health and safety strategies, among other things like code compliance requirements.

An example of objectives is shown below:

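For a rough idea of the data behind such an objective (illustrative only - the wording, grades and choice of element are placeholders, assuming an IFC4 file and IfcOpenShell, not the add-on's actual code):

```python
# Minimal sketch: recording a design objective and attaching it to an element
# with IfcOpenShell. All names, grades and the choice of wall are placeholders.
import ifcopenshell
import ifcopenshell.guid

f = ifcopenshell.open("model.ifc")
wall = f.by_type("IfcWall")[0]  # placeholder: the element the objective applies to

objective = f.create_entity(
    "IfcObjective",
    Name="Maintain clear egress width",
    Description="Corridor must keep 1200mm clear width for egress",
    ConstraintGrade="HARD",
    ObjectiveQualifier="HEALTHANDSAFETY",
)

f.create_entity(
    "IfcRelAssociatesConstraint",
    GlobalId=ifcopenshell.guid.new(),
    RelatedObjects=[wall],
    RelatingConstraint=objective,
)

f.write("model-with-objectives.ifc")
```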


Some more features have been implemented:

  • Documents and classifications are related to the project itself.
  • Support for multiple representations has been added. Previously, only a ‘Body’ representation was created. Now, you can add or remove multiple representations, some of which may be automatically generated from the ‘Body’ representation, and some of which you have full custom control over. The supported representations are:
    • Body
    • Axis
    • Clearance
    • Footprint
    • Reference
    • Box
    • CoG
  • Although Blender has first-class support for manipulating breps, you can now choose to export simple extrusions of any angle (i.e. not necessarily normal to the profile curve) as a SweptSolid (sketched below), taking the first steps towards supporting IFC roundtripping with programs like Revit, which support these types of geometry but not others.
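
A minimal IfcOpenShell sketch of such an angled extrusion (illustrative only - the profile size, tilt and depth are placeholder values, not the add-on's actual export code):

```python
# Minimal sketch: an extrusion whose direction is deliberately not normal to the
# profile plane, expressed as a SweptSolid. Dimensions are placeholder values.
import ifcopenshell

f = ifcopenshell.open("model.ifc")
context = f.by_type("IfcGeometricRepresentationContext")[0]

profile = f.create_entity(
    "IfcRectangleProfileDef",
    ProfileType="AREA",
    Position=f.create_entity(
        "IfcAxis2Placement2D",
        Location=f.create_entity("IfcCartesianPoint", Coordinates=[0.0, 0.0]),
    ),
    XDim=0.2,
    YDim=3.0,
)

solid = f.create_entity(
    "IfcExtrudedAreaSolid",
    SweptArea=profile,
    Position=f.create_entity(
        "IfcAxis2Placement3D",
        Location=f.create_entity("IfcCartesianPoint", Coordinates=[0.0, 0.0, 0.0]),
    ),
    # Not (0, 0, 1): the extrusion is tilted away from the profile normal
    ExtrudedDirection=f.create_entity("IfcDirection", DirectionRatios=[0.0, 0.3, 1.0]),
    Depth=2.7,
)

body = f.create_entity(
    "IfcShapeRepresentation",
    ContextOfItems=context,
    RepresentationIdentifier="Body",
    RepresentationType="SweptSolid",
    Items=[solid],
)
```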

An example of Blender and Revit IFC roundtripping is shown below. What is also shown (do you see the extra shapes in the Revit screenshot?) is that Revit imports alternative representations that were authored in Blender. Revit doesn’t seem to have controls to manipulate them further, but at least the data is not lost.


but that means, referring to your previous comment, that “all possible data” begins, or may begin, as decoupled, and then can be coupled with a representation. Moult gives you control to assemble the data you want, the way you want (because it’s decoupled), and then to couple it as you wish with any representation.

These days Dion is practically showing what structure IFC and the IFC schema can have:

IFC can have a small and structured core, and a lot of decoupled Entities, Templates/Psets, etc.

I explained the whole picture here:

yes, rob, data assemblage control is ok, but think of all the users worldwide who are not acquainted with the bim method yet.
i’d rather welcome ehsan’s notion of some automatic “generator” of the proper level of the log and/or (if decoupled) information in the authoring software.

hmmm. What does @Moult say? I see your point, gester, so I’d like to see that too. Importantly though, I think Dion’s motivated by the experience of users who ARE familiar with the bim process. Tightly controlled coupling often breaks down for several reasons. I think he’s trying to address that. Maybe Dion’s got something for reviewing models, so the right property sets can be applied to the right element sets, to fill in the gaps created by normal bim usage, where the coupled tools don’t suit every needed case, and where added required properties end up not being consistently applied.

and this is the main point here:

  • who’s to decide which information is the proper one for the particular granularity level, and

  • how far can this be applied automatically?

Here I explained what Live BIM is.
And also, here and here I explained why we need SQLite/SQL-based schemas and approaches [1] [2] [3]

It’s been a little over a week, and I’m very excited to see Blender actively used on some commercial projects! I find that the modeling speed increase is by far the greatest benefit - it allows for rapid experimentation of designs and ideas, similar to sketching on paper, but with the ability to retain semantic BIM data.

As for the discussion about decoupling and coupling, I’m afraid I’m not entirely certain exactly what the problem is. Although I have some questions about the IFC spec, I generally agree with the approach and their level of (de)coupling in the schema. Perhaps you are discussing more about how vendors have chosen to implement IFC authoring and what UI they provide - but this is a separate matter. I have not broken or changed any rules of the IFC schema in my implementation. From a UI perspective, I have adopted a lot of authoring concepts from the CG world - things like mesh reuse and property assignment to geometry are old concepts that have been used for decades.
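
For instance, mesh reuse and arbitrary property assignment look like this in Blender's Python API (the "FireRating" key below is just a placeholder for illustration, not how the add-on stores its BIM data):

```python
# Small illustration of mesh reuse in Blender's Python console: several objects
# share one mesh datablock, and arbitrary data is attached to an object.
# "FireRating" is a placeholder key, not how the BlenderBIM add-on stores data.
import bpy

source = bpy.context.active_object  # e.g. a modelled chair or wall

# Linked duplicates: new objects reusing the same mesh data
for i in range(3):
    duplicate = bpy.data.objects.new(f"{source.name}.copy{i}", source.data)
    duplicate.location = source.location.copy()
    duplicate.location.x += (i + 1) * 2.0
    bpy.context.scene.collection.objects.link(duplicate)

# Editing the shared mesh updates every linked duplicate at once
print(len(source.data.vertices), "vertices shared by", source.data.users, "objects")

# Arbitrary property assigned to the object
source["FireRating"] = "60/60/60"
```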

Some new features include:

  • You can now create nested element relationships: for hosted objects and nested components.
  • Material layer sets can be defined (a rough data sketch appears at the end of this post)
  • Material constituent sets can be defined
  • Predefined door attributes can now be defined and assigned to door types
  • Predefined window attributes can now be defined and assigned to window types
  • It’s now packaged for more end-user testing! See below:

Blender BIM is also now available in a packaged format for Windows, Mac, and Linux! It now joins FreeCAD as another cross-platform BIM authoring application. It is now an installable package available here and has updated installation instructions.

Also, the OpeningDesign architectural studio has been very kind to provide an excellent repository of BIM projects to test on! They work on completely open-source projects where the entire workflow is available to the public under CC-BY-SA 4.0. They are also providing test data to test out data roundtripping between different BIM authoring tools. Big kudos to @Theoryshaw and @yorik!

Here is one of their projects imported into Blender, showing the beautiful real-time rendering capabilities of Blender and clean mesh output which allows for easy geometric manipulation thanks to IfcOpenShell.
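
To illustrate the material layer sets mentioned in the feature list above, a minimal IfcOpenShell sketch might look like this (material names, thicknesses and the choice of wall type are placeholders, assuming an IFC4 file - this is not the add-on's actual code):

```python
# Minimal sketch: a material layer set associated with a wall type.
# Material names and thicknesses are placeholder values; assumes an IFC4 file.
import ifcopenshell
import ifcopenshell.guid

f = ifcopenshell.open("model.ifc")
wall_type = f.by_type("IfcWallType")[0]  # placeholder: the type receiving the layers

layers = []
for name, thickness in [("Plasterboard", 0.013), ("Timber stud", 0.090), ("Brick", 0.110)]:
    layers.append(
        f.create_entity(
            "IfcMaterialLayer",
            Material=f.create_entity("IfcMaterial", Name=name),
            LayerThickness=thickness,
            Name=name,
        )
    )

layer_set = f.create_entity(
    "IfcMaterialLayerSet",
    MaterialLayers=layers,
    LayerSetName="Typical external wall",
)

f.create_entity(
    "IfcRelAssociatesMaterial",
    GlobalId=ifcopenshell.guid.new(),
    RelatedObjects=[wall_type],
    RelatingMaterial=layer_set,
)
```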

Dion, you did what I was afraid of

You have two choices:

  1. Blender as a main platform for BIM
  2. FreeCAD as a main platform for BIM

And you have chosen the second scenario, which personally I think is wrong
I appreciate the efforts of the FreeCAD community, but as I said before, FreeCAD is not user-friendly software (it’s like SolidWorks) and it’s really hard to see it as software that many will be comfortable using

Also, you have to look at things through “national digital (delivery)”, “smart city”, and “digital twin(s)” lenses to understand the current obstacles related to OpenBIM and BIM (see this project)

@Moult
i’m having a hard time understanding the advantages of blender for architectural activity, and, besides the obvious advantages (free modelling of everything, free assignment of data, either relational or not), i’m missing significant issues, which might be based on my lacking deeper knowledge of blender, though.

what i miss are, first of all, intelligent systems that understand one another: e.g. windows in walls (including component wrapping), flat roof drainage that adjusts particular roof components for the slopes, an automatic roof/wall framer, curtain walls with automatic editing of elements, and many, many others.

at first glance the architectural/structural modelling in blender is a cumbersome act.
am i missing something?

rob

At the moment Blender is raw software for BIM,
but it has some vitally important foundations that have the potential to make it one of the most promising BIM tools in the Digital Built Environment industry in the near future

These days everyone talks about AI and DL/ML, but few really know what they want from them.
If you want the software to automatically recognize building objects/components and their relations, some are working on this in “national digital (delivery)” projects, and in the near future we will see more and more solutions related to this

Blender can be an nD software and can handle all your considerations, because it has a Python console and you can write appropriate code for anything you want

It’s something like Grasshopper; it opens a door for many to do what they want
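
For example, a trivial batch operation from Blender's Python console might look like this (the "Status" key is just a placeholder, not a convention of any particular add-on):

```python
# Trivial example of a batch operation from the Python console: tag every
# selected object and print its bounding dimensions. "Status" is a placeholder key.
import bpy

for obj in bpy.context.selected_objects:
    obj["Status"] = "For review"
    print(obj.name, [round(d, 3) for d in obj.dimensions])
```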

@ReD_CoDE - Blender BIM only contains code for Blender. It has nothing to do with FreeCAD. That said, FreeCAD is a very powerful and capable program developed by talented people. I have nothing but respect for FreeCAD.

I do not believe in a “main platform for BIM”. The industry is too diverse to be served by any one piece of software. I believe in using the best tool for the task, and using open data standards like IFC to integrate them together. Objectively, given Revit’s 5 modeling tools, we can conclude that Blender, a dedicated modeling program, is better suited to the task of rapid and complex architectural modeling. Equally objectively, given Blender’s lack of structural analysis capabilities, we may use a superior tool such as Tekla for producing steel work. I would encourage diversity in the industry where possible.

@gester - thank you for raising this point. Although it may differ in other disciplines, I find that in the architectural discipline the use of parametric behaviour allows for speedy modeling with one caveat: that your building is very simple. The ability to place a wall and a window within that wall in 5 clicks is alluring, but quickly reaches its limits on the “design complexity vs modeling speed” graph.

That said, once the design progresses further, or if the designer is more ambitious early on, I find that this parametric modeling often hinders rather than helps. Evidence for this is seen when architects use other software to prototype their ideas, and then painfully redraw them later in their documentation tool. Often, they still use renders from other software to communicate the designs. Another example would be stairs - a parametric stair can be set up with a single click, but is often not quite correct, and often generates a 3D form which is simply insufficient for a certain level of detail. A third example would be the difficulty in using a parametric modeller to produce LOD400 BIM files - it would be “mostly” correct, but “mostly” is not good enough when we are using it for fabrication. The simple issue of wall joins, despite the intelligence created by the parametric priorities of IfcWallStandardCase, simply creates more frustrations than it solves, leading many architects to simply model layer by layer. You may also see evidence in designers using Rhino + Grasshopper to do custom parametric design (these features are also available in Blender, though to a lesser extent).

That said, if you do need these functions, they exist in the Blender ArchiPack add-on. They are also rather trivial to build and extend - creating and splitting walls is practically instant, and automatic voids for inserted elements are not far off. If you are missing drag-and-drop content libraries, there is a huge ecosystem out there from the Blender Cloud, and they are trivial to create. Reusable content libraries and parametric geometry are not unique to programs like Revit and ArchiCAD. They are plentiful and easy to create.

You would also be surprised at the speed of modeling if you have not yet seen a fully trained CG modeler employed to model a building - a building for them is a very simple exercise compared to the complex Hollywood-style scenes that they aspire to. A project I am on has demonstrated that we were capable of modeling a 5 storey building down to the curtain panels, screw flutes, baffles in the precast panels, backing rods, sealant and packers, in a meagre few weeks, whilst simultaneously designing the building (and doing things like changing the grid - a very parametric modification). This model can then be brought back into a documentation tool like Revit to have sections / plans cut out and placed on sheets. The model has been animated, rendered (in real-time), experimented with for lighting and texture, and has had prototype sample designs produced in minutes while they were being discussed in meetings. If you have not seen speed modelers at work, you may be very surprised by what you discover :slight_smile:

Modeling features aside, I am building as close to native IFC support as you would find anywhere - leading to a large amount of control over the BIM data. At the moment, I believe I have implemented more OpenBIM features than Revit, although I would love to be proven wrong.

Blender has a huge ecosystem: animation, building physics, amazing lighting simulation capabilities, and, of course, a very customisable setup with an interactive Python shell.

Features aside, the industry needs a bit of open-source :slight_smile: We’ve had monopolies for a while now, and I’m not sure how much benefit they have brought.


I shared the answer to all your considerations, but only a few people can recognize what the solution is:

BIM HAS BASE ISSUES

BIM has some base issues, which are even repeated in ISO 19650, and if we don’t solve them we won’t see real Digital Twin(s)

“BIM IS A STATIC APPROACH, BUT WE NEED A DYNAMIC ONE”
“BIM WAS DEVELOPED FOR DESKTOP AND SIMPLE SYSTEMS; IT IS NOT EFFICIENT FOR COMPLEX SYSTEMS LIKE DIGITAL TWIN(S)”

I included the answer/solution inside the content:
Keys: Dynamic + Complex
So, who can recognize the answer? Those who are in the Building Performance and Building Energy area :wink:

BIM+Simulation is the answer. “The industry needs a new schema/approach”

I am talking about a live model, a real representation of reality inside the software

And to achieve this level we need to combine IFC with at least two other things (both of which are from simulation territory)

So, I shared the solution with buildingSMART friends, especially with @jwouellette, @berlotti, and a little bit with @jonm,
plus @Moult, who knows clearly what I am talking about

However, what I am thinking of needs time, and I am sure that we will see it soon, from buildingSMART or maybe from other organizations

The industry is competitive, and those who plan better and act better in the long run are the winners

@Moult
‘leading many architects to simply model layer by layer’

no, neither archicad nor vectorworks has this necessity for walls; it may only concern revit, which is actually software for engineers, not designers.