I’ve been giving a lot of thought to the user interface for the robot, specifically the geolocation portion. I want to have a GPS unit mounted to the bot that relays its location to the user interface. This data would not only be recorded for review but would also show the user a real-time picture of where the bot is located.
The difficulty with this is that the operator and bot will likely be operating over some kind of ad-hoc network, which means I wouldn’t be able to pull data from Google in real-time. I would have to pre-load data and map tiles onto the operator’s terminal. That isn’t too hard, especially with the script I wrote a few weeks ago. The tricky part arises when the bot relays GPS data over the network. How can this data be used to select a tile (or set of tiles) for the background, and from there, how do I accurately depict where on the tile the bot is located?
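As an aside, the same tile math answers the pre-loading question: given a bounding box for the operating area, you can enumerate every tile number that needs to be fetched ahead of time. A rough sketch (the function name and bounding-box interface are my own, not part of the script below):

```python
import math

def tiles_for_bbox(lat_min, lat_max, lng_min, lng_max, zoom):
    """List the (x, y) tile numbers covering a lat/lng bounding box at one zoom level."""
    n = 2 ** zoom
    def tile_x(lng):
        return int((lng + 180.0) / 360.0 * n)
    def tile_y(lat):
        return int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    x1, x2 = tile_x(lng_min), tile_x(lng_max)
    # Tile y grows southward, so the max latitude gives the smaller y.
    y1, y2 = tile_y(lat_max), tile_y(lat_min)
    return [(x, y) for x in range(x1, x2 + 1) for y in range(y1, y2 + 1)]
```

At high zoom levels the list grows fast (each zoom step quadruples the tile count), so the pre-load area and maximum zoom have to be chosen together.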
So I did some research and put together this little Python script. I wanted to test out some mathematics and methods for determining the tile to use and pixel location on the tile where a marker of some kind would identify the position of the bot. There’s obviously a lot more work to do in this area, but it’s a good start. It’s ridiculously accurate, so I’m guessing the primary error source would be the GPS unit itself. Feel free to take a gander and use it as you see fit:
# Given: a lat/lng coordinate
# Return: the tile and pixel coordinates of given position
# By Rob Sears
import math
GPS_Lat = 37.364101 # A neat little SR-71
GPS_Lng = -120.577612 # Blackbird in California
zoom = 21
n = math.pow(2, zoom)
# Calculate the tile x:
tile_x = int(math.floor((GPS_Lng + 180)*n / 360))
# Calculate the tile y:
tile_y = int(math.floor((n/2)*(1-(math.asinh(math.tan(math.radians(GPS_Lat))))/math.pi)))
# Tile X to Lng (where x = the tile x coord, and zoom = the current map zoom level):
tile_Lng_1 = 360 * tile_x / n - 180
# Tile Y to Lat (where y = the tile y coord, and zoom = the current map zoom level):
tile_Lat_1 = math.degrees(math.atan(math.sinh(math.pi * (1 - 2 * tile_y/n))))
# Compute Lat/Lon of adjacent tiles
tile_Lng_2 = 360 * (tile_x+1) / n - 180
tile_Lat_2 = math.degrees(math.atan(math.sinh(math.pi * (1 - 2 * (tile_y+1)/n))))
# Make some transformations:
# At this zoom level the lat/lng differences across a single tile are tiny,
# and in straight floating-point arithmetic they tend to get rounded out and
# not properly projected onto the tile's plane. So we multiply everything by
# an adjustment coefficient that gives the requisite amount of precision,
# lop off the remaining decimal places, and treat the numbers as integers.
# We use a y = mx + b type interpolation to map a pixel to a corresponding
# lat/lng coordinate using these integers, then transform the values back
# into decimals by dividing by the adjustment coefficient. The result is a
# clean lat/lng projection onto the tile.
# Set the adjustment coefficient:
adjustor = 100000000000
# Transform all lat/lon points to large integers:
GPS_Lat_adj = int(round(GPS_Lat * adjustor))
GPS_Lng_adj = int(round(GPS_Lng * adjustor))
tile_Lng_1_adj = int(round(tile_Lng_1 * adjustor))
tile_Lat_1_adj = int(round(tile_Lat_1 * adjustor))
tile_Lng_2_adj = int(round(tile_Lng_2 * adjustor))
tile_Lat_2_adj = int(round(tile_Lat_2 * adjustor))
# Compute the difference in lat/lon points (results are integers)
Lng_diff = tile_Lng_2_adj - tile_Lng_1_adj
Lat_diff = tile_Lat_2_adj - tile_Lat_1_adj
# Compute pixel locations:
pixel_x = int(round((GPS_Lng_adj - tile_Lng_1_adj) * 256.0 / Lng_diff))
pixel_y = int(round((GPS_Lat_adj - tile_Lat_1_adj) * 256.0 / Lat_diff))
# Print results:
print("Tile Numbers: " + str(tile_x) + ", " + str(tile_y))
print("Pixel coordinates: " + str(pixel_x) + ", " + str(pixel_y))
print("Link: https://khms0.google.com/kh/v=125&src=app&x=" + str(tile_x) + "&y=" + str(tile_y) + "&z=" + str(zoom) + "&s=Galil")
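For reuse in the UI code, the same math can be folded into a single function. This is just a sketch (the function name is mine): instead of the integer-scaled linear interpolation above, it takes the fractional part of the tile coordinate directly in Mercator space, which should agree with the script to within a pixel at these zoom levels since a single tile spans such a tiny slice of the globe.

```python
import math

def latlng_to_tile_pixel(lat, lng, zoom, tile_size=256):
    """Return (tile_x, tile_y, pixel_x, pixel_y) for a lat/lng at a given zoom."""
    n = 2 ** zoom
    # Fractional tile coordinates in the slippy-map convention:
    fx = (lng + 180.0) / 360.0 * n
    fy = (1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n
    tile_x, tile_y = int(fx), int(fy)
    # The fractional part is the position within the tile, scaled to pixels:
    pixel_x = int((fx - tile_x) * tile_size)
    pixel_y = int((fy - tile_y) * tile_size)
    return tile_x, tile_y, pixel_x, pixel_y
```

With the marker's pixel offset in hand, the UI only needs to blit the pre-loaded tile image and draw the marker at that offset each time a fix comes in over the network.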