There is a lot of development going on in analyzing the embodied carbon of 3D models, especially in the building structures world. Companies are building their own tools, or using proprietary LCA tools with Revit integrations, to assess the eCO2 of 3D models at various stages of design development.
There are too many out there for me to evaluate them all manually, but there is a large enough community of people who have tested them or already use them, which is why I'm posting:
How do these tools actually deal with element data? And how do different consultants go about associating the right data with their models?
At the heart of every eCO2 analysis process, I believe we are determining a mass for every element, then multiplying by an eCO2 rate. I'm building something from first principles, to maintain full control over the analysis. I'm exporting to Excel a specific set of parameters for different types of elements, e.g. Concrete Beams, Concrete Columns, Concrete Floors, Steel Beams, Steel …, etc.; the list goes on, because (as anyone dealing with Revit knows) different element types and families have different parameter complexities.
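To be clear about the kind of calculation I mean, here is a rough Python sketch of the mass-times-rate step, assuming element volumes have already been exported. The material names, densities and carbon factors are illustrative placeholders, not values from any particular database:

```python
# Minimal sketch of the mass x eCO2-rate idea, assuming a per-element
# volume has already been exported from Revit. Material names, densities
# and carbon factors are illustrative placeholders, not real data.

# Illustrative material properties: density (kg/m3) and carbon factor (kgCO2e/kg)
MATERIALS = {
    "concrete_c32_40": {"density": 2400.0, "eco2_per_kg": 0.12},
    "steel_section":   {"density": 7850.0, "eco2_per_kg": 1.55},
}

def element_eco2(volume_m3: float, material_key: str) -> float:
    """Mass = volume x density; embodied carbon = mass x eCO2 rate."""
    props = MATERIALS[material_key]
    mass_kg = volume_m3 * props["density"]
    return mass_kg * props["eco2_per_kg"]

# e.g. a 0.35 m x 0.60 m x 8 m concrete beam
print(round(element_eco2(0.35 * 0.60 * 8.0, "concrete_c32_40"), 1), "kgCO2e")
```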
It would be a big imposition on the Revit modelling process to make sure technicians get this data correct when initially modelling our buildings. Things like concrete on metal deck and reinforcement rates in concrete are significant enough to warrant a degree of data refinement (e.g. kg of concrete per m2 of deck, and kg of reinforcement per m3 of concrete), but it is impractical to get that data into the model at the initial modelling stage. So my current plan is to export a snapshot of the model element data at a particular point in time, manually assign the necessary data to individual elements (in custom parameters, using Excel), then use BIM Link (a Revit add-in) to push it back to Revit.
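To illustrate how those refinement rates would feed into the numbers, here is another rough sketch, again with placeholder rates and carbon factors rather than real data, assuming the snapshot export gives a deck area (m2) and a concrete volume (m3) per element:

```python
# Sketch of applying the refinement rates mentioned above.
# All rates and factors are illustrative placeholders only.

DECK_CONCRETE_KG_PER_M2 = 230.0   # kg of concrete per m2 of metal deck (placeholder)
REBAR_KG_PER_M3 = 110.0           # kg of reinforcement per m3 of concrete (placeholder)

ECO2_CONCRETE = 0.12              # kgCO2e per kg of concrete (placeholder)
ECO2_REBAR = 1.20                 # kgCO2e per kg of reinforcement (placeholder)

def deck_slab_eco2(deck_area_m2: float, concrete_volume_m3: float) -> float:
    """Embodied carbon of a concrete-on-metal-deck floor element."""
    concrete_kg = deck_area_m2 * DECK_CONCRETE_KG_PER_M2
    rebar_kg = concrete_volume_m3 * REBAR_KG_PER_M3
    return concrete_kg * ECO2_CONCRETE + rebar_kg * ECO2_REBAR

# e.g. a 50 m2 bay with 6.5 m3 of topping concrete
print(round(deck_slab_eco2(50.0, 6.5), 1), "kgCO2e")
```

The point being that none of these rates live in the model by default, which is exactly why the manual refinement step exists.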
This is not as smart a process as I believe it can be; it requires a good deal of manual post-processing of what is only a snapshot of the Revit model data at a point in time. How are others achieving something similar? Or are they? Do existing proprietary tools do the same? Or do they do even less (i.e. work with less refined data)?
All thoughts and input welcome.