Hello,

Has anyone noticed that Grasshopper will significantly slow down once a definition reaches a certain size?

I have been working on a few quite big definitions recently, and I know which components are my heaviest (usually offsetting curves on a curved surface). However, on more than one occasion I have noticed a general slowdown that occurs when simply placing components and plugging them in, where the window freezes for between 1-3 seconds.

This even happened after placing some relatively light components, when the definition had been working perfectly for a few days; the whole definition then seemed to operate at about a quarter of the speed it should. It also happens when I disable parts of the definition.

Even just putting down a single component now seems to take an excessive amount of time.

I'm working on Rhino 5, 64-bit, with an i7, a 2 GB Nvidia card and 8 GB of RAM, so I can't understand what's causing it, or whether there is some relationship to the number of components that causes this apparent stalling.

Any advice or experiences appreciated.

Chris

Replies to This Discussion

I think every time you hook up a new connection the whole solution will recompute, regardless of how much work the new component itself adds.

There are a few approaches you can take to minimise the issue:

1) Scale down the inputs. E.g. if you are basing your project on a grid of 50 by 50, use 10 by 10 until you have a better idea about the direction you're taking.

2) Disable the solver. If you know you have to connect up 5 new components with 10 different wire connections, disable the solver until the task is done and then enable it again (see the sketch after this list).

3) Cache geometry or internalise. If the first half of your definition is working how you want it to, you can use the Internalise feature of a Param component to disconnect the wire and set the data in memory, and then disable all the prior components so they no longer need to compute. Alternatively, use the original Geometry Cache component to bake it to Rhino and call it back with a referenced name.
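
For point 2, here is a rough GhPython sketch of the scripted equivalent. The Boolean input name "lock" is my own placeholder, and treat the exact property as an SDK assumption; it just shows the idea:

    # Rough GhPython sketch: toggle the solver from a component so that wiring
    # edits don't trigger recomputes. Assumes a Boolean input named "lock".
    doc = ghenv.Component.OnPingDocument()    # the GH_Document hosting this component
    if doc is not None:
        # GH_Document.Enabled is the scripted counterpart of Solution > Disable Solver;
        # while it is False, no new solutions run until it is set back to True.
        doc.Enabled = not lock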

ALSO BEWARE of any components like BakeAttributes that have an Activate Boolean. Unless this is set to False, each time the solution rebuilds it will bake its geometry to Rhino. I have been in the situation many times where I end up with a huge number of objects in Rhino, to the point that I have to force Rhino to close in order to get back to using my computer. Make sure you set it to True and then back to False straight away.
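
If you script your own bake in GhPython, the same gating applies: only write to the Rhino document while the Boolean is True. A minimal sketch, where the input names "bake" and "curves" are placeholders rather than anything from BakeAttributes:

    # Minimal GhPython sketch of a gated bake. Assumed inputs: "bake" (Boolean)
    # and "curves" (list of Curve). Nothing is written while bake is False.
    import Rhino

    if bake and curves:
        # GhPython targets the Grasshopper document by default, so point
        # explicitly at the Rhino document when baking.
        rhino_doc = Rhino.RhinoDoc.ActiveDoc
        for crv in curves:
            rhino_doc.Objects.AddCurve(crv)    # returns the Guid of the baked curve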

Hi Chris,

Recently I have also experienced what you are talking about. However, I run a 64-bit i3 with 2 GB of RAM, so it reaches its limit a bit faster.

What works for me is to select 'Selected only preview' under 'Solution'. The disadvantage is that it disables the preview of everything else.

Disabling the solver might also work, but I always feel the need to have direct feedback on my actions.

 

Interesting topic, ciao,

bArt 

 

The same for me: recomputing the whole definition for a simple component addition. At least that's how I explain the huge delay for a basic computation.

There should be a way to control which components recompute every time you add a new component, so that you can save tons of computation time while keeping instant visual feedback.

That's assuming you are not affecting the existing part of the definition - a sort of partial solver disabling.
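
There is no built-in switch for this, but here is a rough GhPython sketch of the idea using the Grasshopper SDK: expire only the components selected on the canvas and run a single new solution, so everything else reuses its cached results (the exact pattern is an assumption, not a documented workflow):

    # Rough GhPython sketch: recompute only the components selected on the canvas.
    import Grasshopper

    doc = ghenv.Component.OnPingDocument()

    for obj in doc.Objects:                        # every object on the canvas
        if isinstance(obj, Grasshopper.Kernel.IGH_ActiveObject) and obj.Attributes.Selected:
            obj.ExpireSolution(False)              # mark as expired, don't solve yet

    # One solution pass; objects that were not expired keep their cached data.
    # (From inside a running component you would normally schedule this with
    # doc.ScheduleSolution instead of starting a new solution directly.)
    doc.NewSolution(False)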

Hi Chris, Danny, Bart, Vangel,

I too suffered the same symptoms today on a huge GH definition.

The GH file is over 1 MB, and the components within are working with large lists of data.

Despite the scope of the definition, the solver was quick and the interface unaffected. That is, until this afternoon, when even the smallest gestures on the canvas would result in 5-20 second freezes.

I solved the problem by half-systematically tracing the largest sets of data and spotting the source of the problem: an erroneously grafted list of data which, through other components, exponentially duplicated irrelevant geometry.
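
To give a sense of how fast a stray graft multiplies the work, a toy calculation (plain Python, no Rhino involved; the 200-item lists are made up):

    # Toy numbers: why one accidental graft blows up downstream data.
    n_a = 200    # items in list A
    n_b = 200    # items in list B

    # Both lists flat: longest-list matching pairs them item by item.
    flat_results = max(n_a, n_b)       # 200 results to compute and preview

    # List A grafted: 200 branches of one item each, and the single branch of B
    # is reused against every one of them.
    grafted_results = n_a * n_b        # 40,000 results to compute and preview

    print(flat_results, grafted_results)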

All back to normal.

Sean

I've noticed the UI slowdown. Even adding a component to the canvas is slow on my current def. The complete compute takes minutes, which is normal for this definition, but rewiring, adding components etc. seems really slow in the new version, 0.9.0006.

One of the things I've noticed is that the speed of my definitions seems to be a sign of whether I'm asking GH to do something incorrectly. This has been so consistent that I treat the speed of my definition as a sign of whether it's working as I intend. When things start to noticeably slow down (an obvious change in speed), I just try to remake a section. Often I find that there are little logic errors, long chains of components that could be done with fewer components, or just something that was never working as I intended in the first place.

Last night I just redid a bunch of stuff in one of my big definitions, and it instantly started solving 10 times faster. Most of the stuff I fixed started as organizational issues, then I found a few comparative path-level logic errors, and I even found a way to filter out some stuff I didn't want instead of ignoring it in the result. These little tweaks made a massive difference regarding speed, and the definition is still pretty much the same size.

If you're suffering from significant slowdowns, I'd start by reworking parts of the definition that don't seem to be functioning exactly as intended, or areas where maybe you are a little hazy on what's going on. It might be the definition itself, not its size, that's the culprit for the slowdown.

I think the OP is not referring to the time the definition takes to compute, but rather that the UI slows down when you have too many components on the canvas.

I've noticed this too. I don't think it has to do with the speed of drawing components over the canvas. It might have to do with GH having to update component, input and output arrays every time a new component is added, or something along those lines.

I haven't tried it, but maybe clustering some parts of the definition might help. That is, if clusters work as "blocks". If they work more like synchronized groups (which I think is more likely), it might not improve things at all.

I also just switched to the latest 0.9.0006, and with the very same definition (which indeed is huge, ghx ~1.4 MB, and I AM working heavily with paths) the UI is terribly slow now. It takes about 3 seconds to add a new component!

At least the definition works, as far as I can see now.

I'll give clustering a shot,

but I would like to know the cause of this issue in the new release... David???

Best

Robert

Okay, clustering helped - not gradually but, interestingly, suddenly: when I got below a certain limit (that I don't know) of components on a single canvas, it worked fast again. So a few components more or less cause the UI to switch from super fast to delayed by 3 seconds...

I found the problem causing the slowdown. It wasn't so much large GH files as large amounts of data (all the time was spent computing bounding boxes of preview geometry over and over again).

I added some smart bounding-box caching, and component insertion/removal delays are down from >9 seconds to <1 millisecond on the test file I was using.

The reason clustering worked is that fewer objects are previewed and therefore fewer bounding boxes need to be computed.
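
The caching idea, as a rough Python sketch (illustrative only, not the actual Grasshopper implementation): keep the union box of a component's preview geometry and only throw it away when the geometry changes.

    # Rough sketch of preview bounding-box caching (not GH source code).
    import Rhino

    class PreviewBoxCache(object):
        def __init__(self):
            self._box = None                       # None means the cache is stale

        def invalidate(self):
            # Call this whenever the component's output geometry changes.
            self._box = None

        def bounding_box(self, geometry):
            # Union box of all preview geometry, recomputed only when stale.
            if self._box is None:
                box = Rhino.Geometry.BoundingBox.Empty
                for g in geometry:
                    box = Rhino.Geometry.BoundingBox.Union(box, g.GetBoundingBox(True))
                self._box = box
            return self._box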

--

David Rutten

david@mcneel.com

Poprad, Slovakia

You are talking about the OpenGL preview, right?

This also happened when preview geometry was disabled on all components (though maybe not by as much as 9 seconds). Are bounding boxes computed even if preview geometry is disabled?

No, they aren't. If you're still experiencing slowdown with preview switched off, please send me the file and I'll have a look.

--

David Rutten

david@mcneel.com

Poprad, Slovakia
