So I have come to a point where I need a specific component that doesn't seem to exist.
Basically I have a set of components that is being recalculated all the time and that creates an output (just a certain sized box in this case). I want to hit a button to "set" that output so it stays the same, even when the components before it get recalculated. Then when I press reset, it looks at the connected input again and uses that data.
In vvvv this is basically what a hold node will do.
The Data Dam is basically what I need, but I don't get why it has the button built into it rather than just a boolean input. It's one of those UI quirks that don't really make sense - maybe it did at the time it was conceived. I need the data dam deep inside a cluster, and it would seem silly to route the input and output up through the clusters just for the data dam.
Does anybody know if such a thing exists? Is there a workaround or plugin that has such a component?
Thanks for any help.
Only persistent data and the structure of the definition are preserved from the cluster edit mode to the executing instance of that cluster - it will not remember that you clicked the button while "inside". It's as though the cluster were a standalone document and you closed and reopened it.
And nothing you're doing addresses the document object either - while it's possible to talk to the parent document from inside a cluster, I don't think this does you any good here (with your current script).
The other problem with this solution is that even though the VALUE is not updating, it is still expiring the components downstream every time the slider changes - creating unnecessary recomputes, which I think goes counter to the original poster's purpose, if I understood it correctly. I think MetaHopper + Data Dam can solve this - I'll post an example when I get to a computer.
As I mentioned above, it is possible to use Metahopper to trigger a data dam inside a cluster. This can be used to prevent slow code from triggering (the c# script that shows a message box is a stand-in for some slow operation) although it does NOT prevent updates to your cluster inputs from triggering downstream updates (see the attached "recorder" component). This is unavoidable when working with clusters - unlike loose components, an *entire* cluster always expires (executes) when any one of its inputs updates.
For the attached script you'll need MetaHopper, available for download from Food4Rhino.
The MetaHopper approach uses the "Set Object Value" component to pass a "refresh" to an instance of a cluster - the boolean you supply will switch the dam between "Always" and "Never" mode.
None of this does exactly what the original poster seems to want - the ability to go inside to edit a cluster, hit a button, and then have the "updated" data propagated back to the original definition. This is, as far as I know, simply not possible - it's just not how clusters work.
Thanks Andrew for your detailed explanation of the issue. Your last sentence describes exactly what I need. I have one document which contains a number of modules. To set things up you go inside each module and set a lot of things there. One of those is to set a box to a certain size, which should not be updated on each recalculation, because it is based on things that change with each calculation (I am using Hoopsnake/Anemone to calculate an animation, basically). That's why I need the data dam. Now as soon as I go back out to the main patch this gets reset, which is kind of strange, because everything else, like sliders and so on, doesn't reset just because I leave the cluster.
So I guess I need to find a more permanent storage solution for this - like being able to write true global variables (as in, available to all open GH patches), since a cluster is treated like a separate document.
Do you know of a plugin for this? I guess I can always just write it to a text file.
Thanks once again for the clarifications. No, I am aware of the implications of recalculation and so on - it's ok. The document I am working on is really huge. Each module contains the same inputs and connections to be made. The modules themselves go into a rather large cluster, which spits out things. Then on the very top level you can start the animation, which will recalculate everything, write the whole geometry as an .obj file and then go to the next frame, which changes the geometry, writes a file again, and so on.
In this case I am using the clusters sort of like a user interface. You go into module 1, set everything up, then come back out and go into module 2. If you want you can duplicate a module and do the same. Then in the end you let it "render" out the animation. This whole process is far from quick, so having it recalculate once (which takes around a second at most) is not a big deal. For the sake of a better user experience I am sacrificing some performance.
So I started using the Elefront plugin recently, and it has some interesting features, one of which I am exploiting right now to get what I need (i.e. global data). I am writing a data layer into Rhino as an invisible object with custom attributes. Then anywhere in the document, no matter which hierarchy level, I can retrieve and update that data. Pretty neat.
Surely Python would once again be a better way, but this works and my Python skills are close to zero.
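For reference, a minimal GHPython sketch of what that Python route could look like: it stores the value as document user text on the open Rhino file instead of on a hidden Elefront object. The input names write, key and value are assumptions for the example, not anything from the actual definition.

import Rhino

# the open Rhino document, shared by every GH definition and every cluster level
doc = Rhino.RhinoDoc.ActiveDoc

if write:
    # persist the value in the 3dm file as document user text
    doc.Strings.SetString(key, str(value))

# read it back from anywhere, at any hierarchy level
stored = doc.Strings.GetValue(key)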
Indeed, it might be useful to have something like an ExpireDownstream property on GH_Component for cases like this. Anywho, it's a quick/simple script that seems to have done the job for Armin and the others till now. Sounds like adding a Toggle input to the existing Data Dam (or the one in GH2) would be useful all around.
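For what it's worth, here is a rough GHPython sketch of what such a "Data Dam with a toggle" script could look like, using the sticky dictionary keyed per component instance. The inputs x (data to hold) and freeze (boolean) are assumed names, and note the limitation discussed above still applies: only the VALUE is frozen, the component itself still expires downstream whenever x changes.

import scriptcontext as sc

# one storage slot per component instance
key = "hold_" + str(ghenv.Component.InstanceGuid)

if not freeze or key not in sc.sticky:
    sc.sticky[key] = x  # while not frozen, keep recording the live input

a = sc.sticky[key]  # output the last recorded value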
Armin - I don't think this is a smart workflow. Setting values "inside" the cluster means you run all calculations twice - once in the open cluster document, and once again in the instance of that cluster in your main document. Sliders and panels stay because they are "Persistent" data - meaning they are set internal to the component and persist on save / reopen / copy / paste etc. (This is what happens when you internalize data into a parameter as well - data becomes persistent.)
Data that is generated by a component or workflow is called "volatile" data - and this will NOT transfer from your active cluster edit session to your document - it will simply get recalculated.
As a best practice, all the stuff you need to "Set" - the primary inputs to your algorithm - should be set up as inputs to the cluster. This may be a bit different from the way VVVV works, but it is how Grasshopper is designed.
If you post your actual files and make it a bit clearer how you are setting up this animation, we may be able to recommend further alternative workflows that are more successful in a "grasshoppery" paradigm.
Also, another thing that I noticed is that your data dam isn't static, in the sense that when a change occurs on one side, the other side does indeed receive an update, even if it outputs an old set of data. In fact the Data Recorder picks it up. This means that it doesn't really work if you want to block downstream computation.
We covered that right above, see Andrew's solution ;)
Regarding: "Like being able to write true global variables (as in available to all open GH patches)" The Python sticky dictionary is really useful for this kind of stuff. It can be accessed from all RhinoPython editor instances (i.e. both EditPythonScript and GHPython) in the same Rhino session. I haven't tried reading/writing to it from within a cluster, but it don't see any reason why that wouldn't work. There's plenty of threads dealing with sticky if you do a search. Hope that helps.
Thanks Anders for your suggestions. I might look into it, but I have found a workaround for now using the Elefront plugin for GH. I am using it anyway to bake things to Rhino, so I am writing the custom attributes it supports and reading those back. Works well enough for what I need right now.