Works a treat! Thanks for the update Timothy.
Question for you: any plans to support grabbing the data directly from the API sources? A tiny script could get the data and save it to a file (or stream it directly to the Elk components). I am messing around with:
// Download the raw OSM data to a local file (url and path assumed to be defined elsewhere)
System.Net.WebClient wc = new System.Net.WebClient();
wc.DownloadFile(url, path);
Which does download data, but I am not sure if all of the ways and relations come along for the ride. There are other APIs such as Overpass. I will look over them and see if it is possible to do something similar.
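One quick way to check whether the ways and relations come along for the ride is to count the top-level elements in the downloaded file. A minimal sketch for a GH C# scripting component, assuming path is the file saved above:
// Count the element types in the downloaded .osm file to see what actually arrived.
System.Xml.Linq.XDocument doc = System.Xml.Linq.XDocument.Load(path);
int nodeCount = System.Linq.Enumerable.Count(doc.Root.Elements("node"));
int wayCount = System.Linq.Enumerable.Count(doc.Root.Elements("way"));
int relationCount = System.Linq.Enumerable.Count(doc.Root.Elements("relation"));
Print("nodes: " + nodeCount + ", ways: " + wayCount + ", relations: " + relationCount);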
I hadn't given it much thought, but it looks like using something like Overpass can get around the standard API limitation of 50,000 nodes. This would make it much easier to pull in entire cities.
I did a quick run through the Overpass API since it can pull a large amount of OSM data, and it looks like it works. I just wrote the resulting data to a text file and got 1.27 million nodes, as opposed to the 50,000-node maximum from the standard OSM XML call. The formatting is a little different, but I think I could add a streaming component and then rewrite the OSMData component to accept either the stream or the file path. It would definitely take a lot longer to solve a data set as large as Overpass is able to pull, though it may be easier to wait on it than to stitch together multiple tiles.
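For reference, a rough sketch of that download step, assuming the public Overpass map endpoint (bbox order left, bottom, right, top in decimal degrees) and an example output path:
// Download the Overpass result for a bounding box straight to a local file.
// The endpoint and bbox order are assumptions based on the public Overpass API;
// left/bottom/right/top are the bbox coordinates as strings.
string overpassUrl = "http://overpass-api.de/api/map?bbox=" + left + "," + bottom + "," + right + "," + top;
string overpassPath = @"C:\Data\overpass.osm"; // example path
using (System.Net.WebClient client = new System.Net.WebClient())
{
client.DownloadFile(overpassUrl, overpassPath);
}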
Does the normal API call for the bbox and the four coordinate parameters give you back all of the data, or did you have to use a different API call?
I used the Overpass API because, unless I'm reading the documentation wrong, the normal OSM API still limits you to getting 50,000 nodes.
Yes, that is what I meant. In the Overpass API, which call were you making? Was it just the standard bbox GET, or did you have to specify the nodes and relations?
Ah. I just tried using the standard bbox call and it looks like it pulled all of the data. I haven't tested it with Elk yet to make sure it's all there, but from a quick look through the resulting data it should work with a few changes.
Also, after getting an initial workflow set up, I discovered that the downloaded data appears to be incomplete. It looks like all of the points and ways are in the file, but somewhere in the relations section it just stops abruptly. I'll have to see if I can find a way to reliably close all of the open elements in the XML file.
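One rough way to catch that kind of truncation before handing the file to Elk: a file that stops mid-element will fail to parse, so a try/catch around the load tells you whether every element was closed. A sketch, with the file name just an example:
// A truncated download will usually fail to parse, since the last element was never closed.
string osmPath = @"C:\Data\map.osm"; // example path
try
{
System.Xml.Linq.XDocument.Load(osmPath);
Print("File parsed cleanly; all elements appear to be closed.");
}
catch (System.Xml.XmlException ex)
{
Print("File looks truncated: " + ex.Message);
}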
Another API to try, which might be more reliable, is Open Mapquest, though I am not certain what its limits are.
From that same page, it seems the limit of the bbox argument is 10 square degrees. Some quick Googling tells me that 1 square degree is approximately 12,365 km² on Earth, so 10 square degrees is about 123,650 km² of area... that is quite a lot, depending on the application.
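For anyone who wants to check that arithmetic, the 12,365 km² figure is just the Earth's surface area divided by the total number of square degrees on a sphere (it is an average; the real area of a square degree shrinks toward the poles). A quick sketch:
// Average area of one square degree: Earth's surface area / total square degrees on a sphere.
double earthSurfaceKm2 = 510072000.0;                             // approximate Earth surface area in km^2
double totalSquareDegrees = 4.0 * 180.0 * 180.0 / System.Math.PI; // about 41,253 square degrees
double km2PerSquareDegree = earthSurfaceKm2 / totalSquareDegrees; // about 12,365 km^2
double maxBboxAreaKm2 = 10.0 * km2PerSquareDegree;                // about 123,650 km^2 for the 10 sq. deg. limit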
I was able to write a little parser to format the data correctly for Elk. It is not so elegant, but I include it as an example. I was able to retrieve more than 50,000 nodes. Of course, it takes time to download, so this blocks GH for a few seconds depending on how much you are retrieving.
One thing to note: the method I am using (XDocument.Load(url)) does not seem to include a timeout property, meaning that large queries might time out... this happened to me. An alternative would be to just do a WebRequest, which does have a Timeout property, or use an XmlResolver, which I suppose could also have this property.
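A rough sketch of the WebRequest alternative, using the same url and path as in the script below; the ten-minute timeout value is just an example:
// Issue the request yourself so you can set a timeout, then load the response into an XDocument.
System.Net.HttpWebRequest request = (System.Net.HttpWebRequest)System.Net.WebRequest.Create(url);
request.Timeout = 600000; // milliseconds; tune to the size of your queries
using (System.Net.WebResponse response = request.GetResponse())
using (System.IO.Stream stream = response.GetResponseStream())
{
System.Xml.Linq.XDocument xDoc = System.Xml.Linq.XDocument.Load(stream);
xDoc.Save(path);
}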
Here is the script. It has the following inputs:
urlPre (string): the URL prefix for the MapQuest XAPI. Should be:
"http://open.mapquestapi.com/xapi/api/0.6/*[bbox="
path (string): this is where you want to save the incoming data. I suppose this could be done without a local file, all in memory, but since Elk wants a file name, it is here. I save data to a folder on my C: drive, so a possible value could be:
"C:\Data\map.osm"
left (string): min longitude in decimal degrees
bottom (string): min latitude in decimal degrees
right (string): max longitude in decimal degrees
top (string): max latitude in decimal degrees
The output is just the filename you put into the path input. You can then feed this to Elk.
Here is the script:
private void RunScript(string urlPre, string path, string left, string bottom, string right, string top, ref object A)
{
  // Build the XAPI query URL from the prefix and the bounding box coordinates
  string url = urlPre + left + "," + bottom + "," + right + "," + top + "]";
  System.Xml.Linq.XDocument xDoc = System.Xml.Linq.XDocument.Load(url);
  // MapQuest does not seem to return a proper <bounds> element, which throws things off,
  // so add one built from the same coordinates
  System.Xml.Linq.XElement xElem = System.Xml.Linq.XElement.Parse("<bounds minlat=\"" + bottom + "\" minlon=\"" + left + "\" maxlat=\"" + top + "\" maxlon=\"" + right + "\" />");
  xDoc.Root.Add(xElem);
  xDoc.Save(path);
  A = path;
}
This is 76,995 nodes from what I can tell. It took about 129 seconds to compute everything, according to GH.
Hey Luis, did you extract all tags separately?
Can you upload an example file of how you did this with the API? The regular OSM file isn't giving me as much detail as this does. I would greatly appreciate it. I already pull up the urlPre of the map I want.