This week I lost a big chunk of time to two things. One of them was implementing what seemed to be a very simple feature: showing the coordinates under the mouse while looking at the map. I had it working in about 30 minutes and moved on to other things.
Later, while working on a new side-quest, I found myself using the feature while placing some clues. At this point I noticed that my implementation was buggy: when I put my cursor over a star system, it showed coordinates that were close to the correct values, but definitely wrong. For example, it might say a star system was at 14.8 x -20.1 when it was really located at 15.6 x -19.4.
First, understand that there are multiple coordinate systems involved in figuring out where the mouse is “on the map”:
- There are the displayed “Map Coordinates” where 1 unit is equal to a grid line on the map.
- There are game world coordinates, which are used for things like range in game. These have a 1:1 correspondence with Unity’s own coordinate system, and a 4000:1 correspondence with the map coordinates (one map unit spans 4,000 world units).
- There are two coordinate systems specific to Unity’s Camera system: Screenspace, measured in pixels, and Viewport, measured on a scale of 0-1 relative to the bottom-left corner of a camera’s view.
- Finally, there is the relative coordinate system of the UI.
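To make the relationships concrete, here is a minimal sketch of the conversions between a few of these spaces. The function names and example values are illustrative, and I'm assuming one map unit spans 4,000 world units:

```python
MAP_SCALE = 4000.0  # assumed: one map unit == 4,000 world units

def world_to_map(wx, wy):
    # Game world coordinates -> map coordinates.
    return (wx / MAP_SCALE, wy / MAP_SCALE)

def screen_to_viewport(px, py, screen_w, screen_h):
    # Screenspace (pixels) -> viewport space: (0, 0) is the bottom-left
    # of the camera's view, (1, 1) the top-right.
    return (px / screen_w, py / screen_h)

print(world_to_map(62_400.0, -77_600.0))        # (15.6, -19.4)
print(screen_to_viewport(960, 540, 1920, 1080)) # (0.5, 0.5)
```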
Calculating the cursor coordinate from the map is tricky: the map is not directly displayed by a camera (a “camera” is how Unity renders a scene). Instead, the various map elements are positioned using the map coordinate system and rendered by the map camera to a RenderTexture, which is then included in a UI overlay. This technique allows for the lovely map-to-gameworld transition that would not otherwise be possible just by switching cameras. The downside is that there’s no straightforward relationship between the mouse position on screen and what shows up directly beneath the mouse cursor. For example, the texture is square and the player’s display may not be.
Fortunately, as part of the aforementioned game-to-map transition, the game camera continues to track the player’s position from high above even after it has stopped rendering anything. That means I can call Camera.ScreenToWorldPoint on the game camera to figure out where in the game world the cursor is. So if I just divide that by the map scale and add the result to the player’s map coordinates, which I already know, that should give the cursor’s map coordinates.
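In code, that calculation looks roughly like this. This is a Python sketch of the arithmetic, not the actual Unity code; the names and the 4,000:1 map scale are my own assumptions:

```python
MAP_SCALE = 4000.0  # assumed: one map unit == 4,000 world units

def cursor_map_coords(cursor_world, player_world, player_map):
    """Turn the world-space point under the cursor (what
    Camera.ScreenToWorldPoint would return) into map coordinates,
    anchored at the player's known map position."""
    dx = (cursor_world[0] - player_world[0]) / MAP_SCALE
    dy = (cursor_world[1] - player_world[1]) / MAP_SCALE
    return (player_map[0] + dx, player_map[1] + dy)

# Player at the world origin with map position (15.0, -19.0); the cursor
# projects to world point (8000, -4000):
print(cursor_map_coords((8000.0, -4000.0), (0.0, 0.0), (15.0, -19.0)))
# (17.0, -20.0)
```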
Except, sometimes it would be wrong.
I looked at a number of things, none of which seemed likely culprits: the cursor icon offset, the map “pan” offset, the easing on the zoom, etc. The more I studied the problem, the more I became aware that trying to conceptualize the coordinate juggling was giving me a headache. Finally, I called it a night and went to bed.
The next day, as I tried to tackle the problem again, it became clear that a) I wasn’t seeing the problem and b) the headache seemed suspiciously confined to the area around my left eye which was also pretty bloodshot.
So I called my doctor, who fortunately had an opening that day. He thought it might be viral conjunctivitis, but wasn’t 100% sure, so he referred me to an ophthalmologist, who also fortunately had an opening that day. The catch: the office was in downtown Boston and the appointment was in 17 minutes. Anyway, after a series of adventures, it was established that my headache was not caused by trying to conceptualize multiple coordinate systems, but rather by a scratched cornea.
It seems that, having caught and treated it quickly, I should make a full recovery soon.
After some time worrying that I might lose sight of the bug and literally everything else, it occurred to me that actually seeing the bug rather than trying to imagine it might point me in the right direction. So I added a graphic to the map that showed where the game thought the cursor was pointed on the map:
Above is the result of the visualization. The large white cross is positioned where the map thinks the cursor is. As the map zooms out, the cross seems to “vibrate” between different distances from the player, before settling randomly when the zoom stops. The scale of vibration was clearly dependent on two things: how far out the map was zoomed and how far away the cursor was from the camera center.
I immediately recognized the issue now that I could see it. If you’ve read my previous post, How to Manage a Very, Very, Very Large Contiguous Universe, you might have guessed it as well:
I was calculating the mouse’s position relative to the player in the game world using a projection from the game camera. At maximum zoom, that’s two million units away, stored as a single-precision floating point number. Floating point numbers can store values from about 10^-38 to 10^38, but they do so at the expense of precision. For “small” values like 1 or 100, they’re very precise, out to many decimal places. But for large values, they start to get a bit fuzzy. For example, you might be able to store 1,000,000.1 and 1,000,000.3 but not 1,000,000.2 (not a real example, but you get the idea). The vibration I was seeing was the cursor snapping to the closest representable floating point value to the one I was asking for.
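Python's struct module can round-trip values through the same 32-bit float format Unity uses, which makes the fuzziness easy to see directly (a real example this time):

```python
import struct

def f32(x):
    # Round-trip a value through IEEE-754 single precision, the same
    # 32-bit float type Unity uses for positions.
    return struct.unpack('f', struct.pack('f', x))[0]

# Near zero, neighboring values stay distinct:
assert f32(1.1) != f32(1.15)

# Two million units out, representable floats are about 0.125-0.25
# apart, so nearby values collapse onto the same one:
print(f32(2_000_000.1))   # 2000000.125
print(f32(2_000_000.15))  # 2000000.125 again
```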
Now that I knew the problem, I was able to come up with a very quick solution: create a camera that doesn’t render anything, but tracks the game camera at a much lower elevation.
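The shape of the fix, sketched in Python rather than Unity C#. The probe-camera framing, the specific heights, and the flat-ground projection are all my own illustrative simplifications of the idea, not the actual implementation:

```python
import math

MAP_SCALE = 4000.0  # assumed: one map unit == 4,000 world units

def ground_hit(camera_height, ray_angle):
    """Horizontal offset at which a ray tilted `ray_angle` radians from
    straight down, cast from `camera_height`, hits the ground plane."""
    return camera_height * math.tan(ray_angle)

def cursor_map_offset(game_cam_height, probe_height, ray_angle):
    # Project at the non-rendering probe camera's low elevation, where
    # the engine's 32-bit float math works on small, precise values...
    small_offset = ground_hit(probe_height, ray_angle)
    # ...then scale up to the real camera's elevation and convert to
    # map units in our own code.
    return small_offset * (game_cam_height / probe_height) / MAP_SCALE

# Max zoom: the real camera is 2,000,000 units up, the probe only 200.
offset = cursor_map_offset(2_000_000.0, 200.0, 0.1)
```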
The development lesson here isn’t to be aware of floating point limitations (I obviously was aware of them a few months ago), but to remember the value of visualizing your bugs.
Also, since this post will serve as the Weekly Update, here’s what I did this week on Starcom: Nexus:
- New mission for the race added last week
- Complete revamping of the wandering encounter system
- Started laying out new sectors to place new content
- Display coordinates in map mode, correctly even