Grasshopper

algorithmic modeling for Rhino

Radiation analysis taking very long and using 30 GB of RAM: too much detail or a memory leak?

I am analyzing a geometry like the one shown in the picture.

Except I added a round base (not shown) to see how the mass affects the ground in terms of solar radiation.

I am running a Ladybug radiation analysis with grid size 2, and it has been using nearly 30 GB of RAM, causing the other programs I'm using to slow down quite a lot.

It has been running for many hours now, and I am thinking maybe there is too much detail, or maybe I simply can't run this many analyses.

Since it is written in Python, I wonder if there is possibly a memory leak issue here?

It seems like there was a discussion by Mr. Heumann about GhPython performance issues here, especially when it's running in parallel.

And if it just has to do with the geometry or how I have prepared it, I am wondering if there is a possibility of setting a maximum RAM usage, like in Photoshop.

This way at least I will be able to do other things while the radiation analysis finishes.

It's probably hard to achieve, but I wanted to ask if it is possible.

In this image you can see that the grid size is not super fine. I don't know how much I can lower the resolution from here, because I want to get information for detailing the facade system.


Replies to This Discussion

Hi Youngjae,

1. What are the units of your document?

2. Can you share the file?

Mostapha

Hi Mostapha, I am glad you mentioned the units. The file was in meters, although I was modelling as if the units were feet.

1_ When it was in meters, I tried:

grid size 10 = 30 seconds

grid size 5 = 1.5 mins

grid size 3 = 2.7 mins

grid size 2 = ??? memory peaks and Rhino freezes.

However, now that I have switched the units of the Rhino file to feet,

grid size 3 = 18 minutes,

which makes sense, I suppose, since the analysis has to work with a smaller tolerance.

The image below is what I got after 18 minutes. I think the fact that I have joined the individual units with a solid union may also make it take longer? You can see the mesh triangulation not only around the corners of the masses but also in between the different units (look at the top level and you will see it).

Oh, and I also have very little disk space left.

I would like to share the file, but right now it's a big mess and has a lot of stuff that is unrelated to this particular memory issue, like Revit interoperability and urban modelling. The definition is also set up so that it needs an Excel file that feeds what you see in the lower left corner, the wing mass scales. In order to compare design studies, I am animating the Index of List component that feeds the different scales of the wings and the widths of the floor plates you see; you can see it in my video here. I will try to clean it up a bit when I get a chance, but it seems like grid size 3 might work as a starting point.

When I get around to extracting values from the mesh vertices and actually applying different facade designs driven by the parameters, I will know better what grid size might be necessary.

Looks like you are on the right track. Just looking at the model, I think it should run much faster than 18 minutes; it doesn't look like you have that many test points. One solution that might help is to mesh your geometry beforehand and then use the mesh as the geometry input. I assume that will save you some time in this case.
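Something like this rough GhPython sketch is what I have in mind; the input names ("breps", "max_edge") are just illustrative, not Ladybug's:

```python
# Pre-mesh the breps once, join them, and feed the joined mesh to the
# radiation component instead of the raw breps.
import math
import Rhino.Geometry as rg

mp = rg.MeshingParameters.Default
mp.MaximumEdgeLength = max_edge   # cap the face size so point density stays predictable

joined = rg.Mesh()
for brep in breps:
    for m in rg.Mesh.CreateFromBrep(brep, mp):
        joined.Append(m)

joined.Weld(math.pi)              # weld coincident vertices across face borders
a = joined                        # the component output; feed this to Ladybug
```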

- Mostapha

I see, thank you for the tip. I am reminding myself about the Mesh Brep settings and how to mesh faces by dividing UV distances. I could probably use the PanelingTools components by Rajaa; I remember they work really well.

When I run a batch simulation, right now I just get viewport exports, but I actually need to write the list of values into an Excel spreadsheet each time it simulates.
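What I have in mind is roughly this (a sketch only; the path and the "values" input are made up):

```python
# Append one row of radiation results per simulation run to a CSV file
# that Excel can open; runs accumulate as rows.
import csv, os

path = r"C:\temp\radiation_runs.csv"     # hypothetical output location
write_header = not os.path.exists(path)

with open(path, "ab") as f:              # binary mode is the IronPython 2 csv idiom
    writer = csv.writer(f)
    if write_header:
        writer.writerow(["pt_%d" % i for i in range(len(values))])
    writer.writerow([round(v, 2) for v in values])
```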

I see that from the radiation analysis component I get a list of values with no tree structure, when I would like to have the lists of values separated by which face they are on. But I understand this may not be possible because the geometry input is meshed and joined.

Ideally, extracting colors from the vertex points will give me a list of values identical to the values out of the radiation analysis component. I could probably use the Find Similar component or the Set components to check.

However, RAM usage is an issue for me. These days I am often detailing and doing things in Revit and Dynamo, or I have another Rhino/Grasshopper instance open, isolating and solving a part of a bigger definition while the main Rhino and Grasshopper run definitions in batches, or I have 3ds Max open running a GPU renderer. It would be really useful to be able to set a limit on RAM usage. The thing is, even after the simulation is finished, Grasshopper will ALWAYS maintain that level of RAM usage, whether I am doing something with Kangaroo or with something like Ladybug that can be super RAM-heavy.

Maybe this can only be considered as a feature of Grasshopper as a whole rather than of a single component?

Thank you again, Mostapha, for this great set of components, and for so many of them! I have been working on sustainable design issues since college, and it is very special that you decided to keep the code truly open source, too.

P.S. Do you know or use the PVWatts website? I haven't looked into all the Honeybee components yet, but I remember using PVWatts a lot when I was calculating on-site solar energy potential; that, plus 40-year rain data for sizing the roof and water cisterns. I guess what I am saying is that I don't know how much information, and how many different kinds of information, is contained in EPW files.

I see that from the radiation analysis component I get a list of values with no tree structure, when I would like to have the lists of values separated by which face they are on. But I understand this may not be possible because the geometry input is meshed and joined.

True. If you put breps in as input then the data will be branched, but once the input is a joined mesh it will be flattened. You can use the Unflatten component to branch the values out based on your meshing pattern.
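Conceptually it is just a partition by your meshing pattern; a minimal sketch, where "face_counts" stands for however many mesh faces each brep face produced:

```python
# Split the flat result list back into per-face branches.
def partition(values, face_counts):
    branches, i = [], 0
    for n in face_counts:
        branches.append(values[i:i + n])
        i += n
    return branches

# e.g. partition(results, [120, 120, 96]) -> three per-face lists
```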

Back to the RAM issue: there are a couple of things that could be considered and implemented. One of the solutions is exactly what you mentioned. Another is to deactivate undo for the component; I assume that Grasshopper/Rhino keeps the undo data in RAM (I haven't tested this and I may be wrong, but that is my understanding). Finally, I might be able to delete some of the data after assigning it to the outputs. The RadiationAnalysis component is literally one of the first pieces of code I ever wrote in Python, and it is a mess, so if I wanted to edit it I would probably rewrite it from scratch.
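What I mean by that last point, as a sketch (both function names are made up, not the component's real internals):

```python
# Keep only the final numbers in the output, then drop the big
# intermediate structures and ask the runtime to reclaim the memory.
import gc

rays, hits = build_intersection_data(analysis_mesh, sky_matrix)  # hypothetical heavy step
radiationResult = integrate(hits)    # only the final values go to the output

del rays, hits                       # drop the internal references...
gc.collect()                         # ...and trigger a collection
```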

P.S. Do you know or use the PVWatts website? I haven't looked into all the Honeybee components yet, but I remember using PVWatts a lot when I was calculating on-site solar energy potential; that, plus 40-year rain data for sizing the roof and water cisterns.

I like what PVWatts provides, but for development purposes I'm looking forward to integrating NREL's System Advisor Model (SAM) into Ladybug at some point. It provides similar data and, in my opinion, is even more advanced. It also has an API, which makes it much more desirable.

Mostapha

I am using exploded breps as inputs. I still get one radiation mesh, but the radiation results are branched. So at first I tried to map the values with colors and represent each value with lines/circles, just to start out. I tried to branch the values by which face they are on, using the data structure of the radiation results, or by finding similar values between the radiation results and the color values on the mesh vertices. However, handling 100k points with Find Similar poses a challenge; I don't think it runs fast enough.
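One thing I might try instead of Find Similar is indexing one list by rounded value, so each lookup is constant time rather than a scan over 100k items (a sketch; the input names are mine):

```python
# Build a lookup from rounded value -> indices, then match against it.
from collections import defaultdict

index = defaultdict(list)
for i, v in enumerate(vertex_values):      # values decoded from vertex colors
    index[round(v, 1)].append(i)

matches = [index.get(round(v, 1), []) for v in radiation_results]
```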

But now that I think about it, since the test points are branched and so are the radiation results, I thought I am set there if I just use exploded breps as inputs, to test an idea: interpreting the radiation results to outline the embedded functional aspects for detailing semi-unitized facade components.

I am particularly interested in designing facade components that use solar radiation to create a spectrum of thermal distribution between the south/west and northern facades, and between the vertical facades and the overhangs, in order to draw natural ventilation across the floor plates during the summer, and in the winter have the facade components work as efficient thermal breaks / micro winter gardens / heat collectors.

At any rate, I would need face normals for most cases, so I thought I could get the normals from the mesh vertices.

But as it turns out, the test points are not the same as the mesh vertices; the test points are the center points of the mesh faces. No problem, though: I can just use Evaluate Mesh and get the normals in branches that way. Easy.
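In GhPython that would be something like this (a sketch; "mesh" is the joined analysis mesh):

```python
# Recover the face-center test points and the matching face normals
# directly from the mesh.
mesh.FaceNormals.ComputeFaceNormals()   # make sure face normals exist

centers, normals = [], []
for i in range(mesh.Faces.Count):
    centers.append(mesh.Faces.GetFaceCenter(i))   # the analysis test point
    normals.append(mesh.FaceNormals[i])           # the matching face normal
```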

Now the issue, however, is that the points are in branches, but they are not necessarily in a useful order. This is because triangular and quad faces are mixed: for one brep face, the algorithm created two triangular wedges, and when these come in as a list of points, they are listed either at the beginning or at the end of the list. So if you read a polyline through the points, you see the polyline first skips the middle row. In any case, you can see there is a problem with the way I'm using exploded breps as inputs, although it would be the easiest way to organize and branch the test points, the radiation result values, and the face normals.

So, alternatively, I could mesh the breps first, then use the meshing pattern to unflatten the... radiation mesh? I don't know; I will have to try it myself. If I can control the way the brep is meshed, then it might be possible for me to also have a consistent logic for how the test points are organized, which I think is really crucial for working with the radiation results. For example, if I want to somehow get this information over to Revit (I think Mr. Sobon recently released Mantis Shrimp), I would need to have the points organized neatly.

I will post if I make some progress on the meshing approach you suggested. I think the way triangular and quad faces are mixed is a bit random right now... or maybe I can find a clever way to list the points... I think Mr. Stasiuk had some interesting approaches with mesh indices.

// 

As for the RAM issue, I don't understand how Grasshopper handles RAM well enough to comment on what you said. But if there were a "purge" function where the user could decide to empty the undo history after saving the definition, and if that lowered RAM use, it would make a lot of sense.

// As for SAM, I'm glad to know it exists, and I will have to look into it!

So I decided to shrink the analysis area and raise the sample rate.

This took about 22 minutes to analyze. I have an i7-4700MQ (4 cores, hyperthreaded to 8 threads), and I am beginning to think maybe this is not good enough. For a while I have been thinking about the possibility of GPU computing to leverage more computation, but I am still mostly scratching the surface. I know it's a hot topic in scientific computing, and I know there is a community around parallel programming with Python... but that's further down the line of my development...

Anyway, I ended up just sorting all the points by their Z value.
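In other words (a one-off sketch; "points"/"values" are my input names):

```python
# Sort the test points by Z and keep the radiation values paired with them.
pairs = sorted(zip(points, values), key=lambda pv: pv[0].Z)
sorted_points = [p for p, _ in pairs]
sorted_values = [v for _, v in pairs]
```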

So I have been able to use the vertices to extract the color gradient and draw lines aligned to the vertex normals. However, at the edges of the slabs I get diagonal normals, and they seem somewhat inconsistent. I would need to either round them to a certain angle or cull them out... I'm not sure how to do this right now.
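If I do end up culling, I imagine something along these lines (a sketch; the tolerance is a guess):

```python
# Keep only near-horizontal normals (facade-like), dropping the
# diagonal normals that show up along the slab edges.
import math
import Rhino.Geometry as rg

tol = 10.0                              # degrees of slack around horizontal
kept_pts, kept_norms = [], []
for pt, n in zip(points, normals):
    angle = math.degrees(rg.Vector3d.VectorAngle(rg.Vector3d(n), rg.Vector3d.ZAxis))
    if abs(angle - 90.0) <= tol:        # ~90 deg to the Z axis = horizontal normal
        kept_pts.append(pt)
        kept_norms.append(n)
```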

I also feel there should be a convenient way to organize and branch points by which faces, meshes, or closed curves contain them, and I found a thread where points can be tested for surface inclusion. I have never used the "D" output of Evaluate Surface, but this might work.

Well, anyhow, the fun begins: facade design!

I want to detail rainscreen panels and offset them where there is more radiation falling over the course of the year.

I also want the southern facade to work like a solar chimney (in the lower half, like a climbing solar chimney) that channels the hot air the sun builds up along the facade all the way to the roof, pulling air across the interior and out of the vents. So I need to further divide the facade surface breps into fixed rainscreens and operable shades.

My goal is to move the information over to Revit for detailing and specification, and this would make a good occasion to test out the brand-new Mantis Shrimp. I hope I can run Revit + Dynamo and this analysis together :)

So I think I might be at the end of this particular thread. The answer to the RAM bottleneck is: DOWNSIZE YOUR SAMPLES!

I think there is an easy way to do what you are looking for. If you just send me the results for two surfaces and let me know what you want to do, I can help you get the results separately for each face. I feel you are making it complicated by taking the vectors from the vertices; based on my understanding, you don't really need them.

Mostapha

Thank you, Mostapha, for the reply. I had totally overlooked the testVec output; it makes things much easier and lighter on their feet.

The only thing I like about using mesh vertices is that the points line up almost perfectly with the panelization grid I made with the PanelingTools components. I am using the Karamba Mesh Brep too; the grid patterns from these tools are in sync on the surfaces, and the points form more of a rectangular grid.

I did try to rebuild the geometry as separate meshes and join the mesh before feeding it to Ladybug, in order to avoid triangles, but for some reason the component gave no output. It said: Solution exception: 'int' object has no attribute 'GetBoundingBox'.

Karamba's Mesh Brep gave a clean mesh, so I ended up using it for the analysis. I think you are right about the mesh-vertices approach being more complicated, and it is really slow. I guess the ideal solution would be to rebuild the vertical faces of the breps as rectangular meshes and join that mesh with the floor slab meshes, which have some triangles. If the radiation analysis likes that mesh, then everything will be fine.
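For the rectangular rebuild, I imagine sampling a UV grid, something like this sketch (the counts are placeholders):

```python
# Rebuild an (untrimmed) surface as a clean quad mesh by sampling a UV
# grid, avoiding the mixed triangles that Mesh Brep produces.
import Rhino.Geometry as rg

def quad_mesh(srf, nu, nv):
    dom_u, dom_v = srf.Domain(0), srf.Domain(1)
    m = rg.Mesh()
    for j in range(nv + 1):
        for i in range(nu + 1):
            u = dom_u.ParameterAt(i / float(nu))
            v = dom_v.ParameterAt(j / float(nv))
            m.Vertices.Add(srf.PointAt(u, v))
    for j in range(nv):
        for i in range(nu):
            a = j * (nu + 1) + i
            m.Faces.AddFace(a, a + 1, a + nu + 2, a + nu + 1)
    m.Normals.ComputeNormals()
    return m
```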

The download link for the Rhino file is here: about 10 MB.

Attachments:

Hi Youngjae, when I open your definition there are many missing plugins, as I don't have them installed on my system. To keep this conversation going, I made an example where I created two panels, meshed them inside the component, then found the average value for each panel and moved the center point outward based on that number. I hope this shows the workflow.
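In rough GhPython terms, the workflow is this (a sketch of the idea, not the attached file; "panels", "branch_values", and "scale" are illustrative inputs, and the panels are assumed to be untrimmed surfaces):

```python
# Per panel: average the radiation values of its test points, then move
# the panel's centroid outward along the surface normal by a scaled amount.
import Rhino.Geometry as rg

moved = []
for srf, vals in zip(panels, branch_values):
    avg = sum(vals) / float(len(vals))                # mean radiation on this panel
    centroid = rg.AreaMassProperties.Compute(srf).Centroid
    ok, u, v = srf.ClosestPoint(centroid)
    normal = srf.NormalAt(u, v)
    moved.append(centroid + normal * (avg * scale))   # offset outward by the value
```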

Mostapha

Attachments:

That solution works wonderfully. Here is my version of it. Thank you, Mostapha.

Attachments:

The approach you showed me was what I was looking for. I am more versatile modelling with NURBS surfaces, so I was going to have the mesh as the information layer and the NURBS surfaces as the fabrication layer. But since branching test points and test vectors across a large number of surfaces, or by curve inclusion, is such a bottleneck, I now get the feeling that gaining more sophisticated control over the meshing process will be important for working with the Ladybug components.

By the way, I have noticed that the radiation result is very different depending on whether I input one mesh or a lot of surfaces. And the earlier version of the component (0.0.54) now throws an error saying "too many values to unpack", which means the component is out of date, I suppose.

In the left image you see the radiation mesh (0.0.58) with one mesh as input, and on the right you see the radiation mesh with exploded breps as input. To see what is happening inside the box with exploded breps as inputs, I tried to read how the meshing is done inside the component (0.0.58), but I haven't made a lot of progress so far.

I am attaching a more concise definition and the Rhino file, if you find the time to take a closer look.

The Rhino file is here.

Attachments:
