Grasshopper

algorithmic modeling for Rhino

Hi All!

A quick question for Chris and Mostapha, and anyone else who knows: I am trying to run a parametric iteration study, but it is going to take a very long time for all of the iterations to run if only one energy and daylight simulation can run at a time.

Is there a way to run the IDFs that are created as a group run, the way the EnergyPlus launcher can do with native EnergyPlus?


Replies to This Discussion

Not with the current interface of Honeybee. It waits for each simulation to finish and then runs the next one. If you don't wait for a simulation to finish, the resultReader component will fail.

There are, of course, ways to run the files in parallel and then post-process the results. Is this what you're looking for?

Mostapha
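
As a rough illustration of the parallel-run idea above (this is not Honeybee's own code): once the IDF files have been written to a folder, a short standalone script can launch several EnergyPlus processes at once, and the result files are post-processed afterwards. The paths, the worker count, and the EnergyPlus 8.3+ command-line flags (-w for the weather file, -d for the output directory) are assumptions to adjust to your own setup, and the script is meant to be run with regular CPython outside Grasshopper.

# Minimal sketch: run every IDF in a folder in parallel, a few at a time.
# All paths below are placeholders for your own installation and study folder.
import glob
import os
import subprocess
from concurrent.futures import ThreadPoolExecutor

ENERGYPLUS_EXE = r"C:\EnergyPlusV8-3-0\energyplus.exe"  # assumed install path
WEATHER_FILE = r"C:\myStudy\myWeather.epw"              # assumed EPW path
IDF_FOLDER = r"C:\myStudy\idfs"                         # folder of generated IDFs
MAX_PARALLEL = 4                                        # roughly one per CPU core

def run_idf(idf_path):
    """Run one IDF in its own output folder and return the EnergyPlus exit code."""
    out_dir = os.path.splitext(idf_path)[0] + "_results"
    if not os.path.isdir(out_dir):
        os.makedirs(out_dir)
    cmd = [ENERGYPLUS_EXE, "-w", WEATHER_FILE, "-d", out_dir, idf_path]
    return subprocess.call(cmd)

if __name__ == "__main__":
    idfs = glob.glob(os.path.join(IDF_FOLDER, "*.idf"))
    # Each call to run_idf blocks on its own EnergyPlus process, so a thread
    # pool is enough to keep MAX_PARALLEL simulations going at the same time.
    with ThreadPoolExecutor(max_workers=MAX_PARALLEL) as pool:
        exit_codes = list(pool.map(run_idf, idfs))
    print("Finished %d runs, exit codes: %s" % (len(idfs), exit_codes))

After all runs finish, each *_results folder holds the usual EnergyPlus output files for post-processing.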

Hi Mostapha, thanks for the reply!

Yeah, I'd like to create a bunch of the files and then process the results afterwards. Is there a short tutorial or example file of this process working that I could work through?

A good example to start from is how Honeybee does it for daylighting (https://github.com/mostaphaRoudsari/Honeybee/blob/master/src/Honeyb...).

Vinu has the same issue apparently. Maybe you should team up for this one.

I think a good approach would be to add this option to the "Honeybee_Re-run IDF" component. It already creates a batch file for a single file; you would only need to make it work for multiple files.

Mostapha
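
To make that concrete, here is a minimal sketch (not the actual component code) of how the single-file batch approach could be extended to a group run: write one .bat that uses the Windows "start" command to launch each IDF in its own EnergyPlus process, similar to an EP-Launch group run. The function name, paths, and command-line flags are assumptions for illustration only.

# Hypothetical helper: write a group-run batch file for a list of IDFs.
# "start" returns immediately, so every simulation is launched side by side;
# note that this sketch does not limit how many run at once.
import os

def write_group_batch(idf_paths, epw_path, energyplus_exe, batch_path):
    lines = []
    for idf in idf_paths:
        out_dir = os.path.splitext(idf)[0] + "_results"
        lines.append('if not exist "{0}" mkdir "{0}"'.format(out_dir))
        # start "" <exe> ... : the empty quotes are the window title
        lines.append('start "" "{0}" -w "{1}" -d "{2}" "{3}"'.format(
            energyplus_exe, epw_path, out_dir, idf))
    with open(batch_path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return batch_path

# Example usage with made-up paths:
# bat = write_group_batch([r"C:\myStudy\idfs\run1.idf", r"C:\myStudy\idfs\run2.idf"],
#                         r"C:\myStudy\myWeather.epw",
#                         r"C:\EnergyPlusV8-3-0\energyplus.exe",
#                         r"C:\myStudy\groupRun.bat")
# os.startfile(bat)

The trade-off versus a sequential batch file is that "start" gives true parallelism but no built-in cap on the number of simultaneous runs, so for very large studies a throttled worker pool like the one sketched earlier in the thread is easier to control.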

Yes, I will share when I figure it out. If you would like to collaborate, Elzine, shoot me an email at vinu.arch@gmail.com.

Hi Vinu and Elzine,

I can create this functionality if you like. Let me know if you would like me to code it for you both.

Regards

Anton

Hi Anton,

That would be great if you could code it! Do we need to give any other details?

Vinu

Hi Anton!

I'm literally just learning Python coding, so if you are already able to code this then that would be great! My university has a processing lab of computers that they are giving me access to so I can run the parametric study and fully use Honeybee/Ladybug. I do feel like I need to get my head around coding to understand it better, as my supervisor is having me do demos of the program for other students and external consultants to introduce them to the capabilities of parametric design! Exciting times, but having this ability would greatly help the practicality of it within research for sure.

Much appreciated if you are able to do this!

Hi Elzine and Vinu,

Sorry for not getting back sooner. I will have a crack at this tonight by adding the functionality that Mostapha suggested to the Honeybee_Re-run IDF component. Could either you or Vinu make an example file for me to test this component on?

Some things might be easier over Skype; my Skype ID is anton.szilasi. I'm in Portugal at the moment but moving to US Central time in the next month to six weeks.

Hi Anton,

Sorry for the late reply; I was in a workshop. I could Skype next week; will you be in the US by then? I live in Pacific time. We could arrange a time to Skype!

Hi Vinu,

I've already created the component. Post a group-run example file here and I'll test the component with your example file before I release it.

Hi Vinu

Whereabouts in the Pacific time zone are you?

I am currently putting together a quick little parametric example to post for Anton. I'm just testing that the parametric setup works, but I will have it up soon.

Hi Anton,

Apologies for the late reply on this; I have been trying to sort out some parametric runs. Please find attached a quick example file with some parameters set up to run parametrically. I have only connected two of them to the mysliders component to keep it quick.

I simply want to see if you can make it create all the IDFs and then run them all in parallel, instead of having to wait for one to finish before the next can begin.

Cheers,

Elzine

