The free Google API is limited to 2,500 requests per day.
I use it to retrieve the full address of the GPS coordinates provided by the tracker.
Last night with George and Mike driving around and me working on my car, we hit that limit.
I've changed over to a paid account now, but that's not sustainable when providing a free service.
I'm going to have to test some of the free reverse geocoding providers and try to incorporate them into the system.
This only affects the address shown in the information bubble on the website and in the app.
What I might have to do in the end as more of us jump on is have the free tier use the free geocoders, and a "premium" service use the google service.
The problem with the free ones that I have looked at in the past, is they generally don't provide a full address down to the house number - just street/suburb.
I haven't normally had to deal with street addresses - I've always tracked vehicles, items/assets, or even event locations via direct lat/long data.
I've used the Mapstraction API as an easy way to switch between the various mapping APIs, though you could go straight to OpenLayers or another API that provides geocode().
I use mapstraction in the website itself to display the location of the lat/lon on the map.
It's actually the reverse geocoding within the tracking server (that actually talks to the trackers) that has hit the limit, not so much the mapping itself.
So the tracker sends its lat/lon, I send that to a reverse-geocoding provider, and they send back an address.
I've just developed code for locationiq.org/ and switched everyone over to that.
They do support house numbers, but that information hasn't been inserted into the openstreetmap database for Australia yet.
If you query a location in the UK for example, the house numbers are included.
So any paying customers can use the Google API, but freebies will use locationiq. Hopefully one day the Australian data for house numbers will be imported to openstreetmap and there will be no difference between the two providers from our point of view.
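The reverse-geocoding step above could be sketched roughly as follows. The endpoint URL, parameter names, and the display_name field are assumptions based on LocationIQ's public reverse-geocoding API at the time; check their current documentation before relying on any of it.

```python
import json
from urllib.parse import urlencode

# Assumed LocationIQ reverse-geocoding endpoint (verify against current docs).
BASE_URL = "https://us1.locationiq.com/v1/reverse.php"

def reverse_geocode_url(api_key, lat, lon):
    # Build the request URL for a reverse lookup of one lat/lon point.
    query = urlencode({"key": api_key, "lat": lat, "lon": lon, "format": "json"})
    return BASE_URL + "?" + query

def extract_address(response_text):
    # LocationIQ returns an OSM-style JSON payload; display_name holds
    # the full human-readable address string.
    return json.loads(response_text).get("display_name")
```

Fetching the URL with any HTTP client and passing the response body to extract_address gives the address string to show in the info bubble.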
Just had a crazy thought.
I do so many lookups when people are driving around - why not store every lookup in my own database and query that database first.
Maybe that way I can switch back to using google for everyone, but will have to query google a lot less. Most people would go to pretty similar places often.
I wonder what the chances of hitting the same place would be though. Even if you're 5 meters away from last time, the coordinates would be different.
Maybe I can just say if it's within 10 meters, use it. Otherwise, do a new lookup. I think it's worth looking at anyway.
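The "within 10 metres, reuse it" check could be sketched like this, using the haversine formula to measure the distance between the incoming point and each cached point. The cache layout (a list of lat/lon/address tuples) is just a hypothetical stand-in:

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    # Haversine distance between two points, in metres.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def cached_address(lat, lon, cache, radius_m=10.0):
    # cache: iterable of (lat, lon, address) tuples (hypothetical layout).
    # Return the first cached address within radius_m of the point, else None.
    for clat, clon, addr in cache:
        if distance_m(lat, lon, clat, clon) <= radius_m:
            return addr
    return None
```

If cached_address returns None, do a fresh external lookup and add the result to the cache.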
Sun, 15 Jan 2017 00:38:03 +1100 22.214.171.124:64213 Found xxxxxxx, xxxxxxx NSW 2153, Australia in cache Database. Bypassing lookup.
Will be interesting to see how it goes.
I would still prefer to check if the location falls within a 20m radius of an existing location in the cache database, but I don't think that's too easy without querying every record each time.
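One common way to avoid checking every record is a bounding-box prefilter: convert the radius into lat/lon degree deltas, let the database narrow the candidates with a simple range query (which an index on lat/lon can serve), and only run the exact distance check on the few rows that come back. A minimal sketch, assuming a hypothetical SQLite table geocache(lat, lon, address):

```python
import math
import sqlite3

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    a = (math.sin(math.radians(lat2 - lat1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearby_address(db, lat, lon, radius_m=20.0):
    # One degree of latitude is roughly 111 km, so convert the radius
    # into degree deltas for a cheap bounding-box prefilter.
    dlat = radius_m / 111_000.0
    dlon = radius_m / (111_000.0 * max(math.cos(math.radians(lat)), 0.01))
    rows = db.execute(
        "SELECT lat, lon, address FROM geocache"
        " WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?",
        (lat - dlat, lat + dlat, lon - dlon, lon + dlon)).fetchall()
    # Exact distance check on the (small) candidate set only.
    for clat, clon, addr in rows:
        if haversine_m(lat, lon, clat, clon) <= radius_m:
            return addr
    return None
```

With an index on (lat, lon) the range query stays fast even as the cache table grows.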
So for every GPS location the hardware collects and sends to the hybrid server, the first thing the server does is translate it to a street address using Google's services - is that what you're saying?
You can do a local 20 m comparison against existing cached entries using the lat/long data alone, and if one or more are found, take the first and skip the lookup. You could also keep the last address looked up for each device, reuse it for the next 10 minutes, and just append it to each incoming GPS location. The address will only be "close to" the true one, but with far fewer lookups - combining both will reduce street lookups considerably.

I'm not sure what you use the looked-up street data for - I assume just display purposes. I would also age the cache entries, say 30 minutes, so fresh ones appear from time to time. The GUI could have the option to select an entry and manually do an address lookup if the user wants an accurate address for a location; that result would go into the cache. You will need to limit the cache size so the cache access performance cost stays low.
Just to clarify, I was thinking of a separate cache in memory, not looking up the path records in the database - my suggestions will result in the path records becoming less accurate in street-address terms. It would be costly to scan all records to do the street address lookup via an area match on GPS coordinates. It needs to be local and fast, and use a far smaller data set.
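The small in-memory cache described above - aged entries and a capped size - might look something like this sketch. The class name, the ~20 m grid rounding, and the defaults are all illustrative assumptions:

```python
import time
from collections import OrderedDict

class AddressCache:
    # Small in-memory cache: coordinates are rounded to a grid cell
    # (0.0002 degrees of latitude is roughly 22 m), entries expire
    # after max_age seconds, and total size is capped.
    def __init__(self, max_size=1000, max_age=1800, grid=0.0002):
        self.data = OrderedDict()   # grid key -> (timestamp, address)
        self.max_size = max_size
        self.max_age = max_age
        self.grid = grid

    def _key(self, lat, lon):
        # Nearby points fall into the same grid cell and share an entry.
        return (round(lat / self.grid), round(lon / self.grid))

    def get(self, lat, lon, now=None):
        now = time.time() if now is None else now
        item = self.data.get(self._key(lat, lon))
        if item and now - item[0] <= self.max_age:
            return item[1]
        return None    # missing or aged out

    def put(self, lat, lon, address, now=None):
        now = time.time() if now is None else now
        self.data[self._key(lat, lon)] = (now, address)
        while len(self.data) > self.max_size:
            self.data.popitem(last=False)   # drop the oldest insertion
```

The grid rounding trades a little accuracy for constant-time lookups, which matches the "local and fast, far smaller data set" goal.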
I will just see how it goes using my current method for now.
The problem with a short-term local cache is that unless the car is sitting at a set of lights, it will have moved a fair way by then (which is why you say it will be less accurate).
The openstreetmap database is under 60 MB, so if I just store every looked-up location, the DB shouldn't get too big. Over time, the external lookups should become fewer and fewer.
It will also save a bunch of lookups while our cars are sitting at home not moving for days.
I've switched everyone back to using google maps, so everyone's trackers will be contributing to building my own location data down to the house-number level.
So a 50% reduction. Not too bad, if the performance of serving the data from your local database is reasonable. This depends on people travelling the same paths - which is mostly true. How many people were driving around for the 1600 data points?