
Dear all,

I have noticed that as I use Honeybee (HB) and Ladybug (LB), they gradually eat my RAM and never free the memory, even after an operation has completed. This eventually causes my computer to crash.

Is there a way to free the memory, or are there other tricks to prevent crashing? (Other than adding extra RAM, since it is a company computer.)

Regards,

Daniel


Replies to This Discussion

Hi Daniel,

I have experienced similar situations where I eat up all 16 GB of RAM on my machine.  I think the unfortunate reality is that, compared to other non-visual means of scripting, Grasshopper is not necessarily the best at dealing with huge amounts of data relative to the memory of the machine.
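On your question about freeing the memory directly: one thing you can try between heavy runs is forcing a garbage-collection pass from a small GhPython component. I would not count on it returning everything that LB/HB has allocated (component caches often keep their references alive), so treat this as a partial workaround rather than a fix. This is only a sketch; the `run` input is a Boolean you would add to the component yourself, and `a` is just the default GhPython output.

```python
# GhPython sketch: request a garbage-collection pass after a heavy run.
# This only frees objects that are no longer referenced; data still held
# by component caches will not be released.

import gc
import System

if run:  # wire a Boolean toggle into a `run` input on the component
    gc.collect()                          # IronPython-level collection
    System.GC.Collect()                   # full .NET collection
    System.GC.WaitForPendingFinalizers()  # let finalizers run
    System.GC.Collect()                   # collect anything they freed
    a = "GC pass requested"
else:
    a = "Set run to True to request a GC pass"
```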

Usually my way around the problem is to alter something about the specific simulation that I am trying to run (usually coarsening the analysis grid by increasing the grid size, or something of that nature).  Which simulations are giving you the most trouble with memory?  Also, how much RAM does your machine have?  You really need at least 4 GB to use Ladybug + Honeybee well (I think that this isn't too far from the system requirements of Rhino).
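To give a sense of why the grid size matters so much: the number of test points, and with it the size of the result matrices LB/HB keeps around, grows with the square of the grid resolution. A rough back-of-the-envelope sketch (the floor area and grid sizes below are made-up numbers, not anything from your model):

```python
# Rough illustration of how the analysis-grid spacing drives the number
# of test points (and, with it, the memory the results occupy).
# All numbers are illustrative assumptions, not LB/HB internals.

def test_point_count(floor_area_m2, grid_size_m):
    """Approximate number of analysis points for a given grid spacing."""
    return int(floor_area_m2 / grid_size_m ** 2)

floor_area = 2000.0  # m2, a hypothetical floor plate

for grid in (0.25, 0.5, 1.0, 2.0):
    print("grid %.2f m -> ~%d test points" % (grid, test_point_count(floor_area, grid)))

# Doubling the grid size cuts the point count (and the result matrices
# built on top of it) by roughly a factor of four.
```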

-Chris

Dear Chris,

Thank you for replying. Your videos on YouTube are amazing, by the way; thanks for those as well.

I have been trying to carry out a full-size energy simulation for a commercial building pursuing LEED certification.

There are 7 storeys and over 200 thermal zones (no typical floors).

The machine I am using is an i7 with 32 GB of RAM.

I have also tried to isolate operations and internalise the data as Breps after each long computation. I found that HB is comfortable with around 30 thermal zones, and the practical maximum is around 60.

Regards,

Daniel

Hi Daniel,

What exactly are you trying to do with Ladybug and Honeybee? Is it happening for every type of analysis? We need more details to be able to help.

Mostapha

Dear Mostapha,

Thank you for replying. The problem mainly happens while it tries to solve adjacencies for many zones. I haven't been able to sort out the adjacencies, so I am not sure whether the issue would also carry over to later operations. I think I have been pushing the limits of HB a bit by having 200+ thermal zones across 7 floors. Even so, I still think HB has great potential, especially because Rhino is the primary means of 3D communication between architects and sustainability consultants here in Japan.

Regards,

Daniel 

Daniel,

200 zones definitely sounds like you are pushing the limits of HB (as well as E+ if you were to get to the stage of running the model).  My recommendation, if you don't want to go the route of a more powerful computer (and it sounds like you've already got a good one), is to break up the building into a few separate energy models (maybe three) and put adiabatic walls in between them.

Being strategic about where you choose to break up the model (across boundaries of similar microclimates) will help ensure that your results are still accurate while you narrow down the system boundary of the building.  Also, if there are any cases where you expect two or more zones to have similar thermal behavior, joining them into one zone will help cut down both the solve-adjacencies step and the model run time.
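If it helps with the mechanics of splitting, a small GhPython component can partition the flat list of zone Breps into a few groups before anything reaches the solve-adjacency step, so each group can be run as its own model. This is only a minimal sketch: the `zones` and `groups` inputs, the centroid-height grouping, and the `zone_groups` output are assumptions for illustration, not a Ladybug/Honeybee API.

```python
# GhPython sketch: split a flat list of zone Breps into a few groups so
# that each group can be solved and simulated as a separate energy model.
# Assumed inputs:  zones  - list of closed Breps, one per thermal zone
#                  groups - number of sub-models to create
# Assumed output:  zone_groups - DataTree with one branch per sub-model

import Rhino.Geometry as rg
from Grasshopper import DataTree
from Grasshopper.Kernel.Data import GH_Path

def centroid_height(brep):
    # Volume centroid height as a cheap proxy for which storey a zone is on.
    return rg.VolumeMassProperties.Compute(brep).Centroid.Z

groups = max(1, int(groups))  # guard in case the input arrives as a float

# Sort by height so each group corresponds to a band of storeys and the
# adiabatic cuts between groups stay roughly horizontal.
sorted_zones = sorted(zones, key=centroid_height)

zone_groups = DataTree[rg.Brep]()
chunk = len(sorted_zones) // groups + 1
for i, zone in enumerate(sorted_zones):
    zone_groups.Add(zone, GH_Path(i // chunk))
```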

It's always difficult to hit the right balance between highly-resolved zoning that ensures an accurate result and the grouping of microclimates to ensure a fast/manageable run of the model.  Zoning is an art.

-Chris

Daniel,

I know that I am 2 years too late with this comment but, over the past few months, we finally identified the issue that was causing your memory (and a lot of other people's memory) to blow up.  You can see in the release notes here that the memory issue has finally been fixed:

http://www.grasshopper3d.com/group/ladybug/forum/topics/release-not...

-Chris
