What makes a project MEGA?
|I know what makes a man mega, though. Hint: arm cannon.|
The answer is not so simple. Is it square footage, file size, campus size, density, or something else? I homed in on the file size debate because it's something I have railed against for some time: file size is not the best indicator of how models should be broken up. In fact, the fear of big files drives many to inflict wounds upon themselves that aren't easily rectified. Let me explain.
Years ago, in my Revit noobness, I was taught the concept of 'Lazy Parsing' (thanks Phil). It sounded like a bunch of database mumbo jumbo, and I paid it no heed. That was easy then. Revit pilot projects tended to be small, easy, and predictable, so we could focus on the tool itself rather than the unique challenges of a project. As my Revit prowess grew, the concept of splitting and linking to maintain smaller files cropped up. Lazy parsing, I remembered...
|Pictured: Mr. Parsing|
Back to lazy parsing. The concept is simple: Revit loads what it needs to show you what you want. A 400MB file with four 100MB worksets is better than four 100MB files any day. The problem with that concept is that it requires the team to take care to place things on the right workset (novel, I know) so worksets can be loaded and unloaded at will. Problem solved, right?
|That sounds harsh; let's just say incorrect from now on.|
Only through careful model management can true large-file zen be attained. This coming from a guy whose last several projects were stable single files, all north of the 500MB mark (just architecture, mind you). Sure, it's work, and your team has to change how they work in the file (specify worksets, anyone?), but for the greater good of the process, it can be done. I'm not for keeping all the data in one place just to say we did, however. Break files up based on the needs of the project: separate buildings, DIs, discipline, or team location.
Just promise to not break your file up by floors.