Grasshopper

algorithmic modeling for Rhino

Hi all, and David especially,

There's a cool piece of software called vvvv; I'm sure you know it. One of the amazing things it has implemented is a command to check the memory consumption of each node. It would be super cool if we could see (maybe as a colour, or a value) the memory consumption of all components... to optimize definitions and let GH run smoothly...

I have no idea how difficult this is, but I think it would help a lot in creating better definitions!

Cheers


Replies to This Discussion

Hi Pep,

this is very tricky, since a lot of memory is shared among different components. Meshes, BReps, Curves etc. are only duplicated when they need to be changed. In short, if you were to add up all the memory of all the objects inside each component, you'd end up with a lot more than is actually being used.

ps. Why do you care about memory consumption? Are you running out of memory? Would it perhaps be of more use to label objects based on the time required to solve them?

--
David Rutten
david@mcneel.com
Poprad, Slovakia
You are right, I meant a component "thermometer".

I'm not running out of memory if I do things properly, and that's something you learn with time. Having the possibility to see which components consume more time (what I referred to before as memory) would help a lot to improve definition performance, I think.

Being able to compare combinations of components that together do the same thing (for example Divide Curve, Divide Length and Divide Distance; I remember a post where you explained this), and more complex combinations with the same goal...

At least I think it would be interesting as an "educational" add-on or widget to show and explain the different behaviours and performance of nodes and definitions.
First iteration:


--
David Rutten
david@mcneel.com
Poprad, Slovakia
Very cool... The information is perfect. Maybe having some colour would help, but I think it's difficult to set a scale...

The metaphor I used before, speaking about a component thermometer, might not need to stay a metaphor: maybe there could be a button or widget that lets you see the "temperature" of your components. Like a CFD thermal-analysis picture of your definition...

I imagine that this complement may slow down performance, so it should be something the user can enable and disable. I can picture every component having its own colour (depending on the "thermometer"), with the wires drawn in a colour gradient from the first component's colour to the second's.

Hehehe, it's Christmas, time to ask for things. But as there's not much interest on the forum, maybe this isn't something people want... Anyway, thanks for your time, fast reply and even faster implementation. As always, amazing, David.
Yeah, it still looks pretty ghastly. This is merely to see whether the time spent in certain well-defined functions is a good indication of processor time.

We'll see where this ends up. As it turns out, it's also quite tricky to measure relative performance, as a lot of information is cached between runs. The first time a VB or C# component runs, for example, it has to compile the code you wrote, so it takes a long time; after that it runs quite quickly. I hope I won't have to resort to average runtimes in order to get a good indication...

--
David Rutten
david@mcneel.com
Poprad, Slovakia
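The compile-then-cache effect David describes can be simulated in a few lines. This is a hedged sketch in Python, not Grasshopper's actual mechanism; the `compile_script` and `evaluate` names are invented for illustration:

```python
# Sketch: why a single timing of a scriptable component is misleading.
# The first run pays a one-time "compilation" cost; later runs hit a cache.
import time
from functools import lru_cache

@lru_cache(maxsize=None)          # one-time "compilation", cached afterwards
def compile_script(source):
    time.sleep(0.05)              # pretend compiling takes 50 ms
    return source.upper()         # stand-in for the compiled code

def evaluate(source):
    compiled = compile_script(source)   # hits the cache after the first call
    return sum(ord(c) for c in compiled)

def time_once(fn, *args):
    start = time.perf_counter()
    fn(*args)
    return time.perf_counter() - start

first = time_once(evaluate, "a = 1 + 2")          # includes compile cost
warm = [time_once(evaluate, "a = 1 + 2") for _ in range(10)]

print(f"first run: {first*1000:.1f} ms, warm median: {sorted(warm)[5]*1000:.3f} ms")
```

The first measurement is dominated by the one-time cost, so neither the first run alone nor a naive average describes steady-state behaviour well.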
Second iteration:


I'm measuring the worst-case scenario in all cases, so all subsequent iterations that tap directly into the cache are ignored. It doesn't give you a very accurate representation, but it does highlight problematic areas much more clearly.

--
David Rutten
david@mcneel.com
Poprad, Slovakia
Super cool! This is information I've only been able to guess at in my head before. I agree with what Damien has said below:

Profiling is one of those things that can open your eyes as to what might be a potentially good way of doing something and a potentially bad way. I think this will be quite helpful in finding out some "best practices".


This is another way of looking at the def, and personally I think it would be handy in teaching situations where there are a few clear strategies for solving the task at hand... it would be great to compare them...

Whether this means that a strategy is better than another based on faster solving time, I am not so sure, but at least it will provide some insight into this question.
Good to see the interest from you guys. I've felt so many times the doubt of following one way or another... this could be a tool that makes the decision easier (or at least better informed).

Looking forward to using it!!
Whether this means that a strategy is better than another based on faster solving time, I am not so sure, but at least it will provide some insight into this question.

It might turn out that the age-old GH axiom, "the fewer components the better," is not a universal truth...
Very very nice!

Maybe this doesn't have to show up for every component... (Imagine a situation with 200-250 components; I'm guessing considerable processing power would be used up calculating these times themselves.) If it worked on the selection alone, it would probably run faster.

Theoretically, does this mean the total solving time of the definition is determined by the 'chain of components' that takes the longest? In the picture above, that would be the chain consisting of Point, Curve and Divide Distance?

Because that still adds up to only 97%, I'm assuming the Point and Slider components start solving in parallel, and the two Divide components do too?
Hi Suryansh,

the overhead of measuring the times is negligible. I'm using System.Diagnostics classes, which are designed to be used in profiling environments.

It is only possible to measure the time interval once, and testing whether or not the measurement will be of use would take more cycles than simply measuring it.

There is of course extra overhead in drawing the feedback on the screen, but the Profiler is implemented as a widget and can thus be enabled/disabled from the View->Widgets menu.

I haven't given any thought to chain-times yet. I'm not sure it would be useful, or whether it is even possible to identify well-defined chains in a complicated network. But I do think it might be useful to see the total time-span for an arbitrary group of objects without having to add the numbers up on a piece of paper.

--
David Rutten
david@mcneel.com
Poprad, Slovakia
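As a rough illustration of per-component timing plus group totals (a hedged Python sketch; the real widget uses .NET's System.Diagnostics classes, and every name below is invented):

```python
# Toy profiler: record each component's solve time, then report the total
# for an arbitrary selection, as David suggests for grouped objects.
import time

class ToyProfiler:
    def __init__(self):
        self.times = {}                      # component name -> seconds

    def measure(self, name, fn, *args):
        start = time.perf_counter()
        result = fn(*args)
        self.times[name] = time.perf_counter() - start
        return result

    def group_total(self, names):
        """Total solve time for an arbitrary selection of components."""
        return sum(self.times[n] for n in names)

profiler = ToyProfiler()
profiler.measure("Point", lambda: time.sleep(0.01))            # fake workloads
profiler.measure("Curve", lambda: time.sleep(0.02))
profiler.measure("Divide Distance", lambda: time.sleep(0.03))

selected = profiler.group_total(["Curve", "Divide Distance"])
print(f"selected group: {selected*1000:.1f} ms")
```

Summing the recorded times for a selection is exactly the "no piece of paper needed" aggregation described above.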
CPM: Critical Path Method.

But there is also a variation called CCPM: Critical Chain Project Management.

Would it be hard to implement a procedure from one of the above methods?
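For illustration, a minimal sketch of the CPM idea applied to a component graph: treat the definition as a dependency DAG and find the duration-weighted longest path, which is the minimum solve time if independent components could run in parallel. The graph and timings below are invented, loosely echoing the Point/Curve/Divide example discussed above:

```python
# Critical Path Method sketch over a made-up component dependency graph.
from functools import lru_cache

# component -> (solve time in ms, upstream components it depends on)
graph = {
    "Slider":          (1,  []),
    "Point":           (2,  []),
    "Curve":           (5,  ["Point"]),
    "Divide Length":   (8,  ["Curve", "Slider"]),
    "Divide Distance": (12, ["Curve", "Slider"]),
}

@lru_cache(maxsize=None)
def finish_time(component):
    """Earliest finish: own duration plus the slowest upstream chain."""
    duration, upstream = graph[component]
    return duration + max((finish_time(u) for u in upstream), default=0)

critical = max(graph, key=finish_time)
print(f"critical path ends at {critical!r}: {finish_time(critical)} ms")
```

Here the chain Point → Curve → Divide Distance dominates (2 + 5 + 12 = 19 ms), so that is the definition's critical path under this toy model.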

