Grasshopper

algorithmic modeling for Rhino

(Edit. David sez: Please append your (well explained) ideas/suggestions/wishes. I don't care much for what you want, I care why you want it.)

Perhaps this topic has been covered recently, but I don't see any active threads.  We're looking for a plugin project and I'd like to get some feedback from the power users before choosing something.

So...

What is missing from grasshopper?  

What would you like to connect to that you can't already connect to?

What kind of bottlenecks do you run into?

What secret wish do you have for Grasshopper that doesn't even seem possible?

What project have you been meaning to undertake but haven't had/won't have the time?

Just trying to brainstorm a few ideas here.  There are so many great and useful plugins out there that it's hard to spot the gaps anymore.

Looking forward to your thoughts!

Cheers,

Marc


Replies to This Discussion

Hey Santiago,

Thanks for asking.  Flux's product focus changed pretty radically and we had to put our second GH plug-in on pause.  I'm sure that there will be some exciting developments in the future, but for the time being we will have to wait.  

It has been really great seeing all of this discussion, though, so I think there are good things coming out of this regardless.

Cheers,

Marc

I agree.
I hope this discussion continues.

Hello David,

One thing that is kind of troubling is the memory issue with large DataTrees (e.g. 12,000 branches with an average of 200 values each). When passing such structures around a few components, my memory gets filled (it's only 4 GB, but still).

I am not sure where this comes from, but as far as I understand, when developing components, each component creates a deep copy of the input, then processes the data and outputs it. I do understand that this might be the simplest way to avoid conflicts, but somehow it is a real pain point.

Richard

Each parameter creates a shallow copy, but deep copies are made before data is modified. The datatrees themselves are copied, but that shouldn't result in too much overhead. On the other hand, 12,000 branches is quite a lot. Can you post a file that shows this memory consumption?

I've added a simple example to show the problem.

In this case it makes no difference, since it is a 130 MB problem (at least on my computer), but I have some files with more components where the memory consumption goes over my 4 GB of RAM.

It's from doing some spatial analysis.

Attachments:

Let's do some math on 2.4 million values:
12,000 x 200 = 2,400,000
(double: 64 bit / 8 bytes)
2,400,000 x 8 bytes ≈ 18.3 MB

If we consider GH_Number (used inside datatrees), it has some additional fields:
Type description  = 41 bytes
Type name         =  6 bytes
Is valid          =  1 byte
Is valid, why not =  0 bytes (let's assume everything works)
Value             =  8 bytes
Total for GH_Number: 46 bytes, 8 of which are used for the number itself and 38 for saying it's a number.
2,400,000 x 46 bytes ≈ 105 MB for one set of data (a quick runtime check seems to confirm this is the right ballpark).

My conclusion: big data in Grasshopper needs some additional love at the moment. The datatree structure becomes less useful once you go beyond the order of 100,000 items; past that I usually consider writing a C# solution. The main reason is usually runtime, not memory: executing the multiply component 2.4 million times, for example, takes 20 seconds here (I'm not that patient).
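As a sanity check, the arithmetic above can be reproduced in a few lines (a sketch in Python; the 46 bytes per GH_Number is the estimate from the post, not a measured figure):

```python
# Back-of-envelope check of the memory estimate above.
branches = 12_000
per_branch = 200
values = branches * per_branch     # 2,400,000 numbers

raw = values * 8                   # plain 64-bit doubles
wrapped = values * 46              # estimated bytes per GH_Number wrapper

print(values)                      # → 2400000
print(raw / 2**20)                 # ~18.3 MiB of raw doubles
print(wrapped / 2**20)             # ~105 MiB with the estimated wrapper
```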

Thanks for the math!

But what I am wondering about is the memory behavior when passing this data through a set of components. What is also strange: when the definition gets recalculated with fewer elements (e.g. 1,000), the memory consumption doesn't drop.


The type description and type name are constants and do not take up any memory on every instance of a GH_Number. It is true that there is overhead compared to an array of doubles.

I stand corrected, my estimations seem to be quite a bit off.

Using this code:

    // Measure the heap growth caused by allocating a single GH_Number.
    long start = System.GC.GetTotalMemory(true);
    var number = new Grasshopper.Kernel.Types.GH_Number(42);
    long end = System.GC.GetTotalMemory(true);
    System.GC.KeepAlive(number); // keep the object alive through the second measurement
    A = (int)(end - start);

a GH_Number apparently consumes 24 bytes.

I think you also have to add the reference pointer, which is another 4 or 8 bytes depending on whether you are on a 32-bit or 64-bit platform.
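On a 64-bit CLR those figures add up neatly (a sketch; the 16-byte object header is the usual .NET layout assumption, not something measured here):

```python
# Per-item cost of one boxed GH_Number in a datatree, assuming the usual
# 64-bit CLR object layout (16-byte header) plus the 24 bytes measured above.
header = 16        # sync block index + method table pointer
payload = 8        # the double itself
obj = header + payload       # 24 bytes, matching the GC measurement above
reference = 8                # the pointer to the object stored in the tree
per_item = obj + reference   # bytes per number, all in

values = 12_000 * 200
print(per_item)                    # → 32
print(per_item / payload)          # → 4.0 (four times the cost of a raw double)
print(values * per_item / 2**20)   # ~73 MiB for the whole tree
```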

Is it too late to revive the request for a "replace component" functionality? I.e. a "swap all wires from this parameter to this other one" command?

For those of us who aren't sensible enough to extract parameters often enough.

Or, perhaps as a parallel alternative, the ability to extract a parameter in the middle of a wire?
