Grasshopper

algorithmic modeling for Rhino

Hi,

I am working a lot with the new cluster functionality, and I noticed a huge increase in file size. I thought working with multiple instances of one cluster wouldn't make the file that much bigger.

For example, I created a definition containing a cluster of approx. 250 KB. When I copy the cluster 10 times within that definition, the total file size rises to roughly 10 × 250 KB ≈ 2.5 MB. But when I explode the cluster and copy the "cluster-nodes" 10 times instead, the file size is much smaller.

Basically I don't mind the file size as such, but since GH 0.9 we have had numerous errors which prevent us from saving the file,

like "We´re terribly sorry, but..... unable to save"

Next time it happens I'll post a picture.

So what I actually need to know is: is there a dependency between the GH file size and this error message?

Thanks


Replies to This Discussion

Each cluster stores the data inside of it at the moment. I know it's far from ideal. I'm surprised it's that much bigger than exploded clusters, though; I thought the overhead per document wouldn't be that big.

I'll start experimenting with storing clusters more efficiently when I have a couple of days of quiet time. 

Incidentally, are your clusters local or do they reference external files? If it's the latter, I can add a hidden switch somewhere that does not include the document in the gh file, but only the reference. As long as you make sure the referenced files are available it will work.
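If you want to measure this per-cluster overhead yourself, a rough GhPython sketch along these lines (untested) serializes every cluster in the current document via GH_IO and reports how many bytes each one would take up in the gh file:

    # Rough sketch: measure the serialized size of every cluster in this document.
    # Run inside a GhPython component; ghenv is provided by that component.
    import clr
    clr.AddReference("Grasshopper")
    clr.AddReference("GH_IO")
    from Grasshopper.Kernel.Special import GH_Cluster
    from GH_IO.Serialization import GH_LooseChunk

    doc = ghenv.Component.OnPingDocument()  # the document this component lives in

    report = []
    for obj in doc.Objects:
        if isinstance(obj, GH_Cluster):
            chunk = GH_LooseChunk("cluster")
            obj.Write(chunk)  # serializes the cluster, internal document included
            size = len(chunk.Serialize_Binary())  # bytes as written into the gh file
            report.append("%s: %d bytes" % (obj.NickName, size))

    a = report  # wire the 'a' output to a panel to read the results

If ten copies of one cluster each report roughly the same size, that would confirm that every instance carries its own full copy of the internal document.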

--

David Rutten

david@mcneel.com

Poprad, Slovakia

Yes, that's the point. Clusters that were created with connections to existing inputs are much bigger. So we started to replace them with "clean" clusters built without input connections. This helps a lot to reduce the file size.

At the moment we have only local clusters. It is much more comfortable to edit the cluster that way, and we also had many software crashes when referencing external cluster files.

Maybe you can add a switch that deletes the external data connected to a created cluster.

Any ideas concerning the error message?

Unfortunately I have no idea what the error message is about. It may be a file write buffer overrun or something, or it may have nothing at all to do with memory.

I'm afraid there's little I can do at the moment.

--

David Rutten

david@mcneel.com

Poprad, Slovakia
