Grasshopper

algorithmic modeling for Rhino

Hello everyone,

Is it possible to open the code written in the C# script editor in GH as a file in Visual Studio?

It took me some effort to write a program, and now I'm writing a report about it. I would like to analyze my code's complexity. I saw that Visual Studio has an 'Analyze' tool which includes, for example, cyclomatic complexity.

If I can't open the code in Visual Studio, I will have to do the analysis manually, and to do that I will first have to read up on code complexity metrics, which will take some more time.

Is this possible? I read that there is a link between GH and Visual Studio, but can I view code that has already been written?

Thank you in advance 

Tülin 


Replies to This Discussion

You can certainly copy-paste the code into VS, but you'll run into issues with referenced assemblies: unless you also reference the assemblies the script component relies on (such as RhinoCommon and Grasshopper), VS won't consider the code valid.

If you have Visual Studio though, I would recommend using it to write GHA components instead.

[...] however can I view a code which is already written? 

I don't understand. The only code you could possibly hope to view must have been written already.

It would seem backwards to convert everything to a GHA project just to perform some diagnostics. In my experience, the only time you should start worrying about code analysis is when the code is either too slow or using too much memory. In those cases you just start inserting strategic System.Diagnostics.Stopwatch measuring blocks until you find the bottleneck.
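A minimal sketch of such a Stopwatch measuring block; the Work() method here is only a hypothetical stand-in for whatever part of your own script you suspect is slow:

    // Minimal sketch of a Stopwatch measuring block. Work() is a hypothetical
    // stand-in for a suspected hotspot in your own algorithm.
    using System;
    using System.Diagnostics;

    class StopwatchSketch
    {
      static void Main()
      {
        var sw = Stopwatch.StartNew();
        Work(200000);                  // replace with the code you suspect is slow
        sw.Stop();
        Console.WriteLine("Elapsed: " + sw.ElapsedMilliseconds + " ms");
      }

      // Dummy workload so the sketch runs on its own.
      static void Work(int n)
      {
        double sum = 0;
        for (int i = 0; i < n; i++) sum += Math.Sqrt(i);
      }
    }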

Big-O notation and complexity metrics are all well and good when you're publishing generic algorithms for others to use, but analysing your own code for the sake of analysing your own code makes baby Jesus cry.

Proper analysis of an algorithm is difficult, and to do it well requires a fair amount of intelligence. An automatic metric isn't going to be too informative.

I recommend performing both a practical and a theoretical diagnostic. Figure out the Big-O performance of your algorithm from first principles, working out both the worst-case and the best-case scenario if they differ.

Then generate a large set of example inputs and measure the run-time to see how it grows as the input size increases. The results should fall within the theoretical worst-case/best-case bounds.
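A rough sketch of such a measurement loop, using a deliberately O(n^2) brute-force closest-pair routine as a hypothetical stand-in for your own algorithm; if the theoretical analysis holds, doubling n should roughly quadruple the measured time:

    // Sketch: run a stand-in O(n^2) routine on inputs of increasing size and
    // print the timings, to be compared against the theoretical bounds.
    using System;
    using System.Diagnostics;

    class ScalingBenchmark
    {
      static void Main()
      {
        var rnd = new Random(42);
        foreach (int n in new[] { 1000, 2000, 4000, 8000 })
        {
          var pts = new double[n];
          for (int i = 0; i < n; i++) pts[i] = rnd.NextDouble();

          var sw = Stopwatch.StartNew();
          ClosestPairBruteForce(pts);  // swap in your own algorithm here
          sw.Stop();
          Console.WriteLine("n = " + n + ": " + sw.ElapsedMilliseconds + " ms");
        }
      }

      // Hypothetical stand-in: brute-force closest pair on 1D points, O(n^2).
      static double ClosestPairBruteForce(double[] pts)
      {
        double best = double.MaxValue;
        for (int i = 0; i < pts.Length; i++)
          for (int j = i + 1; j < pts.Length; j++)
            best = Math.Min(best, Math.Abs(pts[i] - pts[j]));
        return best;
      }
    }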

It also makes a lot of sense to measure different sections of the algorithm. If it contains N distinct steps, what is the Big-O runtime of each step? Which one is the bottleneck? Could it be improved? Could it be pre-computed?
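One way to get that per-step breakdown is to restart a single Stopwatch between stages; the step names and workloads below are hypothetical placeholders for the stages of your own algorithm:

    // Sketch: time each distinct step of a pipeline separately by restarting
    // one Stopwatch, to see which step dominates.
    using System;
    using System.Diagnostics;

    class StepBreakdown
    {
      static void Main()
      {
        var steps = new (string Name, Action Run)[]
        {
          ("build input",  () => Work(50000)),
          ("main solve",   () => Work(400000)),
          ("post-process", () => Work(20000))
        };

        var sw = new Stopwatch();
        foreach (var (name, run) in steps)
        {
          sw.Restart();
          run();
          sw.Stop();
          Console.WriteLine(name + ": " + sw.ElapsedMilliseconds + " ms");
        }
      }

      // Dummy workload so the sketch runs on its own.
      static void Work(int n)
      {
        double s = 0;
        for (int i = 0; i < n; i++) s += Math.Sqrt(i);
      }
    }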
