Hi there, I am witnessing what seems like a standstill in certain DWG-based software when it comes to speed of processing IFC. The conversion process to make the data available takes forever. What specific demands are being formulated for the certification of the new IFC4? @jwouellette, can you guide me with this? I only searched for the word "speed" in a PDF of b-Cert, but that did not return a result…
@Hans_Lammerts I’m sorry about your experience, but speed is NOT a certification criterion for IFC2x3 CV2.0 or IFC4 RV1.2.
I believe the idea was brought up at the time the criteria were formulated and was debated, but the consensus was that we needed to focus primarily on fidelity (no trivial feat in itself), which can be judged fairly objectively, versus performance, which can be very subjective (how fast is fast enough?).
Ouch! That is a serious weakness in the certification. How fast is fast enough? Well, that is a question that needs to be addressed by buildingSMART and the user base. At the very least, NOT 2 hours for just a very simple building. You know what they say about time, right? It IS money. Even with a new IFC4 certification.
I have to agree with you, @Hans_Lammerts, that the processing of IFC data in different software is a crucial task. I think we have all had this experience… time is money.
But isn’t that what makes competition in the market grow? Some software processes data faster and some slower. That is why I don’t know which criterion (as @jwouellette says: how fast is fast enough?) buildingSMART could set for software certification. Project complexity varies, which means different data types. In the end, as an end user, if you open a 1 GB file you will still have to wait, even though the software might be certified. Maybe I am wrong, but buildingSMART delivers the structure, and the software developer writes the code to read it as fast as possible.
Hi @agron. Firms BUY software (for quite some money…), and the certification is supposed to say something about the ability to use it in a reasonable manner. A minimum is needed. The argument that the laws of the free market will regulate or somehow correct this flaw is, in my opinion, not valid. It needs to be addressed in an orderly manner if you aim at governmental organisations enforcing IFC as open data BEFORE it can be used in the market. As I see it, the only other alternative to get around this is to skip all software vendors and go open source. Build your own car.
> As I see it, the only other alternative to get around this is to skip all software vendors and go open source. Build your own car.
Why not? This is also an option that the market does not prohibit.
In my experience, there are already some good open source solutions that handle the IFC structure better than the paid versions. I think this will make all those software producers rethink their strategies to justify their high prices. On the other hand, the end users should be the ones pushing the software developers to improve their products… critical feedback.
There are a number of issues with certifying performance.
- Certification tests are generally small. Performance on them wouldn’t say much about larger projects.
- Performance may vary wildly with the type of data rather than the size of the file. For example, a small file with 10,000 BReps may perform much worse than a large file with shared extrusion geometry. So a small number of benchmark files may overestimate or underestimate performance.
- What’s the benchmark machine? How is the time measurement normalized?
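To make the second point concrete: a crude way to see what a file is made of, before importing it, is to count entity types in the STEP text. A minimal stdlib sketch (the inline sample string and the function name are mine, not from any IFC toolkit, but real SPF files use the same line-oriented `#id=IFCTYPE(...);` format):

```python
import re
from collections import Counter

# Hypothetical four-line snippet of an IFC SPF (STEP) file body.
SAMPLE = """#10=IFCEXTRUDEDAREASOLID($,$,$,$);
#11=IFCEXTRUDEDAREASOLID($,$,$,$);
#12=IFCFACETEDBREP($);
#13=IFCEXTRUDEDAREASOLID($,$,$,$);
"""

def count_entity_types(step_text):
    """Tally IFC entity type names, e.g. BRep-heavy vs extrusion-heavy files."""
    return Counter(re.findall(r"#\d+\s*=\s*(IFC\w+)", step_text))

counts = count_entity_types(SAMPLE)
print(counts["IFCEXTRUDEDAREASOLID"], counts["IFCFACETEDBREP"])  # 3 1
```

Two files of identical size can produce wildly different tallies here, which is exactly why a benchmark suite would need to cover several geometry profiles.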
All of that said, certified or not, people are going to complain, and vendors can only ignore complaints for so long. I would say that certain companies are definitely hearing this and are looking at ways to work on performance.
May I ask what official goals certification has? And I mean that seriously, without any undertone intended.
Fidelity… based on a design coordination workflow for IFC2x3 (thus the “Coordination View” MVD), as well as design coordination and other workflows that might benefit from a “Reference View” for IFC4.
Speed means nothing if the results aren’t reliable across the board. Fidelity was, and still is, a primary concern. It is also the goal/criteria that is most feasible for buildingSMART to manage at this time.
Acceptance = fidelity × speed. The value of all this metadata means nothing if it takes 10 minutes to get the simplest geometric shape across. This formula also applies to 3D CAD versus ‘BIM’ in general.
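As a toy illustration of that fidelity × speed idea (the normalisation and the time budget are my own assumptions, not any official buildingSMART metric):

```python
def acceptance(fidelity, seconds, budget_seconds):
    # Normalise speed to 0..1: full marks when the import fits the budget,
    # proportionally less as it overruns.
    speed = min(1.0, budget_seconds / seconds)
    return fidelity * speed

# Perfectly faithful but slow: 2 h for a file the user budgets 10 min for.
slow = acceptance(fidelity=1.0, seconds=7200, budget_seconds=600)
# Slightly lossy but fast: 5 min against the same 10 min budget.
fast = acceptance(fidelity=0.9, seconds=300, budget_seconds=600)
print(round(slow, 3), round(fast, 3))  # 0.083 0.9
```

The multiplication is the point: perfect fidelity cannot compensate for unusable import times, and vice versa.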
One file from the GEOBIM 2019 benchmark is very interesting for benchmarking import speed:

Uptown.ifc, file size: 240 MB
More information about the background of this file: https://3d.bk.tudelft.nl/projects/geobim-benchmark/uptown.html

"Note: The use of semantic entities is not perfect. For example, many IfcBuildingElementProxy entities describe elements having well-defined semantics in IFC (e.g. stairs), and many attributes are missing, but we will evaluate the consistency of the model with itself through the conversions."

IFC viewers (user timed, roughly):
- FZKViewer, Karlsruhe Institute of Technology (logfile, minutes rounded): time for preparing data: 7 min, time reading and parsing IFC: 9 min
- Autodesk Navisworks Simulate 2021: aborted after 2.5 hours at 12.5%

BIM authoring software:
- Blender: no results yet, gave up after 40+ minutes
- Allplan: two hours
- BricsCAD BIM (DWG): 6 minutes
- AutoCAD IFC importers: ruling that possibility out, no chance this file will ever get in
- Revit release 2019
Those findings are consistent with what I have experienced in the industry too. There is a large amount of upskilling and education to be done. The biggest influencers here are the vendors: if a vendor makes IFC authoring difficult, or opaque, then it is no big surprise that users are unaware of, or uninterested in, pursuing proper BIM data.
Perhaps buildingSMART should address these issues in their certification process. It’s no use supporting an OpenBIM specification, if it is prohibitively difficult for users.
As a case in point, refer to the OSArch Wiki article on how to do IFC georeferencing in Revit. It took a thousand words to describe the process, and requires external patching tools and files to be downloaded.
For speed, can you retry the BlenderBIM Add-on with the multiprocessing option enabled? That may help you get a result, though I still expect it to take a long time. Keep in mind that there are still optimisations to be done to the BlenderBIM Add-on, and improved partial / incremental loading, which can solve these issues, is a work in progress. Speed is incredibly important.
At a brief glance at the file, I also can see why the speed is so poor - I will elaborate on this in a future post on this thread, and also perhaps propose some solutions.
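For anyone curious why a multiprocessing option helps at all: geometry generation is largely independent per product, so it parallelises well. A stdlib-only sketch of that idea (the `tessellate` stand-in is mine for illustration, not the BlenderBIM or IfcOpenShell API, which does the real work in native code):

```python
from concurrent.futures import ThreadPoolExecutor

def tessellate(product_id):
    # Stand-in for expensive per-product geometry work (e.g. converting one
    # element's solids to triangles); each product is independent of the rest.
    return product_id, product_id * product_id

product_ids = range(8)
with ThreadPoolExecutor() as pool:
    # Farm the independent jobs out to a pool of workers.
    shapes = dict(pool.map(tessellate, product_ids))
print(len(shapes))  # 8
```

Because no product depends on another's geometry, throughput scales with worker count until I/O or memory becomes the bottleneck.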
Thank you, Dion. Absolutely agree. I will continue the benchmark later. To me, this conclusion from GEOBIM and your writing further illustrate the need for clear, good sample models. Not just snippets.
Please correct the name of the “FZKViewer”. Could you grant me access to the file on the Google Drive? I have already tried to send a request. It’s interesting to see that our viewer is not the slowest.
Have you checked with Solibri?
Time to beat!
BricsCAD in 350 sec (about 6 min).
File size: 106 MB.
I’d say: the first workable solution found.
@andreas.geiger I have not checked Solibri, but you can give it a try.
Of course, machines vary, but let’s just forget about that for now.
In the real world this is also very different.
Navisworks: stopped the import process after 2.5 hours when it was at 12.5%. The waste of time and energy on processor power doesn’t feel right.
@andreas.geiger Any results of Solibri?
I am in the process of importing the Revit IFC export into a newer release. Does anyone know what the influence of (lots of) shared parameters is? My hunch is that these are holding things back from running fluently. @Moult
See the link to the logfile in the earlier post of summaries.
BIMcollab ZOOM is crunching the file really fast. Less than 2 minutes… Am I doing something wrong?
If I’m not mistaken, it uses the IfcEngine.dll libraries, which are known for being very fast.