.NET isn't really the problem; the Xamarin guys have already ported pretty much all the .NET functionality we need to Mono. The problem is WinForms.
WinForms (System.Windows.Forms, to give it its full name) is a .NET namespace which provides windows and controls for Windows operating systems. Buttons, toolbars, menus, status bars, custom-drawn controls, mouse and key event handling, tooltips, etc. etc. etc. are all WinForms classes in GH1.
On the Mac there are no Windows controls; instead you have to use Cocoa. So what we need is either a way to map all WinForms code to Cocoa code (no good 64-bit solutions exist), or a toolkit which can run on top of both interface platforms.
We're going for the latter option. Rhino 6 for Windows (and Rhino 5 for Mac) will ship with a special build of Eto, whose aim is to provide most of the functionality in the intersection between Windows and macOS. With Eto we hope to be able to develop .NET plugins with rich UI that run on both operating systems. It's too late to rip GH1 apart and translate all the WinForms code to Eto code, but since GH2 is a ground-up rewrite anyway, there's no additional effort involved in switching platforms.
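To give a rough feel for what this looks like in practice, here is a minimal sketch of an Eto dialog in Rhino's Python editor. It is an illustration only: the class and method names follow the Eto.Forms API as documented in the Rhino developer guides, but the dialog itself is invented for this example and is not code from this thread.

```python
# Minimal Eto dialog sketch, runnable from Rhino's Python editor where the
# Eto assemblies are already loaded. Illustrates the idea of one UI codebase
# for both Windows and Mac; not official sample code.
import Eto.Drawing as drawing
import Eto.Forms as forms

class HelloDialog(forms.Dialog[bool]):
    def __init__(self):
        self.Title = "Cross-platform dialog"
        self.Padding = drawing.Padding(10)

        ok = forms.Button(Text="OK")
        ok.Click += self.on_ok            # same event wiring on both platforms

        layout = forms.DynamicLayout()
        layout.Spacing = drawing.Size(5, 5)
        layout.AddRow(forms.Label(Text="The same code runs on Windows and Mac."))
        layout.AddRow(ok)
        self.Content = layout

    def on_ok(self, sender, e):
        self.Close(True)                  # return True as the dialog result

dialog = HelloDialog()
dialog.ShowModal()
```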
Sorry for the late bump, but I just stumbled upon this thread.
@David: Sounds cool. Does that mean that Rhino + GH could theoretically be ported to Linux as well?
Theoretically, sure. However, porting is always a huge investment, so we need to be reasonably sure we can earn that investment back by selling Rhino for Linux licenses. Linux is traditionally a very difficult market to make money in, so I doubt we're going to seriously look into what it takes any time soon.
Most of the multiprocessing and parallel-processing supercomputing units at universities, whether in the U.K. or abroad, are Linux based. Since there is an increasing need for multiprocessing in modelling and simulation tools in Rhino and Grasshopper, I think it would be a good opportunity to apply for an international fund for such an idea. Why don't you consider this as an opportunity to develop Rhino for Linux?
On the Rhino side, I really don't think you need supercomputers to process NURBS geometry.
On the GH side - once you REALLY need supercomputers to compute anything, you want to remove any bottlenecks... and GH with its architecture (at least GH1) could easily be considered one. There are a lot of features which make GH super user-friendly, but on the other hand the same features are really computationally expensive (especially casting). I will say that if you are really at the point where you need supercomputers, you should probably hire a good programmer.
EDIT:
Let's imagine you have a definition which computes the distance from a plane to a point.
Take a look at equation 10 here:
http://mathworld.wolfram.com/Point-PlaneDistance.html
If I am correct, GH would simply compute this equation 100 times for 100 points. If you really wanted to optimize the code, you would compute the denominator only once for the plane and speed up the whole process tremendously (square roots are expensive). True, GH may already do that, but that's just a good case study.
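As a rough sketch of the optimisation described above, in plain Python with the plane written as ax + by + cz + d = 0 (the function names are invented for this example):

```python
from math import sqrt

def distances_naive(plane, points):
    # Recompute the denominator for every point, the way a generic
    # per-item evaluation would.
    a, b, c, d = plane
    return [abs(a * x + b * y + c * z + d) / sqrt(a * a + b * b + c * c)
            for (x, y, z) in points]

def distances_hoisted(plane, points):
    # Compute the square root once per plane and reuse it for every point.
    a, b, c, d = plane
    inv_norm = 1.0 / sqrt(a * a + b * b + c * c)
    return [abs(a * x + b * y + c * z + d) * inv_norm
            for (x, y, z) in points]
```

For 100 points the saving hardly matters, but once the point count runs into the millions, hoisting invariant work like this out of the inner loop is exactly the kind of optimisation hand-written code can do and a generic node-by-node evaluation cannot.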
Thanks Mateusz, very eloquently put. Grasshopper 2 will also be a bottleneck in terms of pure computation, because it too has to continually check the types it's working with.
When it comes to really hard-core computation, the only sensible approach I can think of is to make it easy for Grasshopper to communicate with some other program which may or may not distribute computational processes to the interwebs, or a server-farm, or a supercomputer.
Do you have any recommendations for such programs?
What I meant was that it makes the most sense for me (the Grasshopper developer) to make sure that data can be converted/shared with other apps, whatever they may be. At the moment the only built-in feature for writing files is to populate a panel with text and stream it, but that is both limited to text and hard to control. Not to mention it's very easy to overwrite previous exports.
Having more ways to output data, be it plain text, XML-formatted or binary, whether to files, to the clipboard or over specific ports, sounds like the most sensible first step.
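In the meantime, a script component can already do something like this by hand. Here is a hedged sketch in plain Python; the file path, host and port are placeholders, and nothing here is a built-in Grasshopper feature. It serialises a list of numbers as JSON, writes it to a file, and can optionally push the same payload to another program over a TCP socket.

```python
import json
import socket

def export_values(values, path, host=None, port=None):
    # Serialise a list of numbers as JSON, write it to a file and,
    # if a host/port is given, send the same payload over TCP.
    # Path, host and port are placeholders for illustration only.
    payload = json.dumps({"values": list(values)}).encode("utf-8")

    with open(path, "wb") as f:
        f.write(payload)

    if host is not None and port is not None:
        conn = socket.create_connection((host, port), timeout=5.0)
        try:
            conn.sendall(payload)
        finally:
            conn.close()

# Example call, e.g. from a scripting component:
# export_values([0.0, 1.5, 2.25], r"C:\temp\gh_export.json", "127.0.0.1", 5005)
```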
Thank you, and I totally understand your point of view. The issue is that I'm in the first stage of my PhD, and I expect to depend heavily on Grasshopper, while the supercomputing unit at my university runs Linux. That's why I thought it would be great if there were a way to link all of this together.
What sort of things are you looking to compute? Those supercomputers/processor farms typically only run well if you have a problem which can be heavily parallelised.
Urban thermal simulation, and I'm not sure yet whether I'll be using LB+HB or DIVA or another plug-in, or even other software entirely. As I said, it's still my first year.
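For what it's worth, "heavily parallelised" usually just means the individual simulation cases are independent of each other, so they can be farmed out across cores or cluster nodes. A minimal sketch in plain Python, where run_case is a stand-in for whatever solver the study would actually call:

```python
from multiprocessing import Pool

def run_case(params):
    # Stand-in for one independent simulation run,
    # e.g. one building orientation combined with one glazing ratio.
    orientation, glazing_ratio = params
    return orientation * glazing_ratio  # dummy result in place of a real solver

if __name__ == "__main__":
    # Every case is independent, so the whole sweep parallelises trivially.
    cases = [(o, g) for o in range(0, 360, 10) for g in (0.2, 0.4, 0.6)]
    with Pool() as pool:
        results = pool.map(run_case, cases)
    print(len(results), "cases computed")
```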