Grasshopper

algorithmic modeling for Rhino

Hi there,

I have a definition that calculates a lot of data and is starting to use a lot of memory.

As you can see from the screenshot, just opening the definition takes 10 seconds, even with a blank Rhino file and no data.

I can handle hundreds of lines (input data), but beyond 3 or 4 thousand it gets stuck.

I realised that RAM use reaches its maximum (8 GB in my case) and the CPU does almost nothing (1% for the Rhino process).

Is it possible to cap RAM use at around 75%, so that the CPU keeps working?

Needless to say, I have to optimise my definition!

Thank you

Replies to This Discussion

Memory use is what it is. When you start to need more memory than your RAM has, the computer starts using the hard drive as a memory stand-in (this is called "paging") and that absolutely tanks performance.
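
A rough way to see how close you are to that point is to compare the free physical RAM with the Rhino process footprint. A minimal sketch for a C# script component (assumptions: Windows, Rhino's .NET runtime, the standard "Available MBytes" performance counter, and a single output named A for illustration):

    // Sketch: check how close Rhino is to exhausting physical RAM before paging kicks in.
    // Assumptions: Windows, run inside a Grasshopper C# script component with one output "A".
    using System.Diagnostics;

    var available = new PerformanceCounter("Memory", "Available MBytes");
    float freeMb = available.NextValue();                                     // free physical RAM
    long rhinoMb = Process.GetCurrentProcess().WorkingSet64 / (1024 * 1024);  // Rhino footprint

    A = string.Format("Free RAM: {0:0} MB, Rhino working set: {1} MB", freeMb, rhinoMb);
    // Once free RAM approaches zero, the OS starts paging and solve times collapse.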

In some cases it's possible to reduce the memory footprint, but it really needs a case-by-case analysis. Do you know where most of the data is stored and what sort of types? Breps, numbers, meshes, bitmaps?
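
If you're not sure, one quick and dirty way to find out is to ask each object on the canvas how many data items it is holding. This is only a sketch for a C# script component; item counts are a proxy rather than actual bytes (a single heavy brep can outweigh thousands of numbers), and the output name A is just for illustration:

    // Sketch: list the objects on the canvas by how many data items they carry.
    // Assumption: run from a Grasshopper C# script component; item count is only a rough
    // proxy for memory use, not a measurement in bytes.
    using System.Collections.Generic;
    using System.Linq;
    using Grasshopper.Kernel;

    var doc = Grasshopper.Instances.ActiveCanvas.Document;
    var rows = new List<KeyValuePair<string, int>>();

    foreach (IGH_DocumentObject obj in doc.Objects)
    {
      int items = 0;

      var param = obj as IGH_Param;          // floating params: panels, geometry containers, ...
      if (param != null)
        items = param.VolatileDataCount;

      var comp = obj as IGH_Component;       // regular components: sum the data in their outputs
      if (comp != null)
        items = comp.Params.Output.Sum(p => p.VolatileDataCount);

      if (items > 0)
        rows.Add(new KeyValuePair<string, int>(obj.Name + " (" + obj.NickName + ")", items));
    }

    A = rows.OrderByDescending(r => r.Value)
            .Select(r => r.Key + ": " + r.Value + " items")
            .ToList();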

Thanks David. That's what I'm doing now (checking the heavier components). Is there a list somewhere showing which components use more memory than others?

Unfortunately no. As easy as it is to profile time, it's that difficult to profile memory. A lot of memory is shared between different parts of the program, so you also have to be careful not to count some data twice. Then it's even harder because some memory is maintained by the .NET garbage collector, while other memory is managed by Rhino itself.
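
If you only want a rough split between the two, something like this in a C# script component shows how much of the Rhino process is .NET-managed memory and how much is everything else. It's a sketch: the unmanaged side includes Rhino's own C++ geometry, so the difference is only an indication, and the output name A is illustrative:

    // Sketch: compare the .NET managed heap with the whole Rhino process footprint.
    // Assumption: run from a Grasshopper C# script component with one output "A".
    using System;
    using System.Diagnostics;

    long managed = GC.GetTotalMemory(false);                     // bytes held by the .NET garbage collector
    long workingSet = Process.GetCurrentProcess().WorkingSet64;  // bytes of the whole Rhino process

    A = string.Format("Managed: {0:0.0} MB of {1:0.0} MB total working set",
                      managed / 1048576.0, workingSet / 1048576.0);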

How about the global memory footprint? E.g. I found "memory" enumerated in Grasshopper.Kernel.GH_ProfilerMode. Its description is "Indicates memory footprint is being measured". How is that data accessed?

It isn't, that part of the profiler was never implemented.
