Grasshopper

algorithmic modeling for Rhino

Hi there,

Have you ever tested what the maximum number of points is for the multi-phase calculations? I tried a huge one with 50K points, and the simulation ran smoothly, but after it was done, GH and Rhino froze for 12 hours before I finally killed the process.

I have 32 GB of RAM, so it's not really normal for it to freeze.

Best,

Peter 


Replies to This Discussion

Peter,
I imagine that this is happening because you might be trying to load at least 50,000 x 8760 = 438,000,000 data points, which should take time but not 12 hours (it seems like there might be a bug at work here as well).
Does the simulation folder seem to contain all of the results for the 50k points?
Also Mostapha and Sarith will know much more about this issue than I do.
-Chris
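
A quick back-of-the-envelope sketch of why 50k points gets heavy (the numbers below are illustrative assumptions, not measurements of Honeybee[+] internals):

    # Back-of-the-envelope size of the raw result set (illustrative only;
    # this does not inspect the actual Honeybee[+] data structures).
    points = 50000
    hours = 8760                        # hours in the annual simulation
    values = points * hours             # 438,000,000 values, as noted above

    raw_gb = values * 8 / 1024.0 ** 3   # assuming plain 8-byte floats
    print("raw floats: ~%.1f GB" % raw_gb)           # ~3.3 GB

    # If every value ends up as a separate Python float object in a list
    # (~24 bytes per object plus an 8-byte list slot on 64-bit CPython),
    # the same data grows to roughly four times that:
    obj_gb = values * 32 / 1024.0 ** 3
    print("as Python objects: ~%.1f GB" % obj_gb)    # ~13 GB

Either way, holding everything in memory at once is a large slice of 32 GB of RAM before any further duplication of the results.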

Hi Chris,

Yes, it has everything. I also ran it on Linux (CentOS) and tried to load the results back in, but got the same waiting time. I can't upload the files anywhere because the folder is 45 GB.

Sorry to be pushing the boundaries of the tool already :)

Best,

Peter

Hi Peter,

Which analysis recipe are you using? There are differences in how Honeybee[+] loads the results depending on the recipe.

For most of the recipes Honeybee[+] loads 2 * number of hours * number of blind states for each test point. The 2 is there because it loads the results for the total and direct contributions separately. If you don't have dynamic blinds, use the Annual recipe, which is a flavor of the daylight coefficient recipe that doesn't load the results into memory and just calculates the annual metrics.
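
As a rough sketch of that count (the function and variable names below are illustrative, not the Honeybee[+] API):

    # Values loaded per test point, per the description above:
    # 2 copies (total + direct) * hours * blind states.
    def values_loaded_per_point(hours=8760, blind_states=1):
        return 2 * hours * blind_states

    points = 50000
    total_values = points * values_loaded_per_point()
    print("%d values" % total_values)   # 876,000,000 for one unshaded state

    # The Annual recipe avoids holding these in memory and computes the
    # annual metrics directly from the result files instead.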

Let me know what your goal is and we can take it from there.

Mostapha

Hi Mostapha,

Since I'm not using any BSDFs for this calculation and have just one state (unshaded) because of the huge overhangs, I'm using daylight coefficients. BUT, since I want to show the contribution of each facade plus the skylight, I divided them into groups, which is the reason for the size, and I also want to compare the results with Daysim.

It's not crucial; I'm just interested in where the limits are.

Best,

Peter


We can and we will get to the point of handling large files like this. There are a couple of items that need to be addressed before we get there.

For comparison studies against Daysim you should use the Annual recipe. It uses the daylight coefficient recipe to run the analysis and, similar to the Daysim workflow, it doesn't load the results for each test point when calculating the annual metrics.

Here is the good news: if you copy your current files under the annual folder, change the recipe to annual, and leave reuseMtx set to True, the component will read the results without re-running the analysis. That should give you the option to compare the values. No window groups are supported, though.
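
A minimal sketch of that folder copy (the paths and folder names below are placeholders for your own project layout; the recipe change and the reuseMtx input are still set on the Grasshopper canvas):

    import os
    import shutil

    project = r"C:\ladybug\my_project"                       # placeholder project folder
    src = os.path.join(project, "gridbased_daylightcoeff")   # placeholder: existing results
    dst = os.path.join(project, "gridbased_annual")          # placeholder: annual recipe folder

    if os.path.isdir(dst):
        print("%s already exists; skipping copy" % dst)
    else:
        shutil.copytree(src, dst)                            # copy the existing results over
        print("copied results from %s to %s" % (src, dst))

The explicit existence check is there so the copy is skipped rather than overwriting results from a previous run.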

As I said, there are solutions to be developed for specific cases. Keep us posted on your findings, needs, and suggestions.

Mostapha

Thank you Mostapha,

I've got a wish/question: how is it possible to do an annual simulation with a different shading state each hour (light redirection)? Do I need to write 8,760 states and then pick them out in post-processing?

Peter

Hi Mostapha,

So I followed your advice and checked it with the Annual recipe, but I got an error at the post-processing stage (see below). It looks like only the first grid is read in, and the rest have issues. Any suggestions?

Peter
