I’ve begun working on the next major update to Pedometer++ which includes a variety of hiking focused features. The first of which is the addition of a walking route planning tool. This will let you use an interactive map to help you choose your route.
One of the core aspects of this feature is using the Mapbox Directions API to find a walking path between two points. Using an API like this is incredibly productive: I’m entirely offloading the complexity of analyzing the map data to find the best route to their servers, and can just focus on the user experience. But it still carries some implications that I have found are really important to consider up front.
I’m going to pose a little axiom I’ve found true in my experience:
The earlier in the development process you make a decision the more impactful it will be on the final product.
This is an axiom I’ve observed in countless development projects, where an early decision can either come back to haunt you or bless you down the road. The challenge is that at the start of a project you have the least information about the final product, so you are in the worst possible position to make good decisions.
This tension doesn’t have a straightforward solution. Instead, what I have found most helpful is to be keenly aware of it when starting a new feature. The care and consideration early decisions warrant is much, much higher than for those which come toward the polishing end of a project.
The reason for this uneven weighting is that early decisions form the foundations onto which all the later work will build, so if you make a mistake early you’ll either have to rebuild from the ground up or be patching around the issue forever. Similarly, the final performance of the feature is often limited by the assumptions and choices you made at the start.
Example: Storing Routes
The part of this feature where I was specifically reminded of this rule was deciding how fine-grained a route to store within the route planner.
Mapbox’s API has two options for routing: a “full” resolution option and a “simplified” option.
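To make the two options concrete, here is a hedged sketch of how such a request could be built against the public Mapbox Directions API, where the `overview` query parameter selects between the two resolutions. This is my own illustration, not code from Pedometer++; the function name, token, and coordinates are placeholders.

```python
from urllib.parse import urlencode

# Mapbox Directions API endpoint for the walking profile.
MAPBOX_DIRECTIONS = "https://api.mapbox.com/directions/v5/mapbox/walking"

def walking_route_url(start, end, overview="full", token="YOUR_MAPBOX_TOKEN"):
    """Build a request URL for a walking route between two points.

    `start` and `end` are (longitude, latitude) pairs -- Mapbox expects
    lon,lat order. `overview` is "full" or "simplified", controlling the
    resolution of the returned route geometry.
    """
    coords = f"{start[0]},{start[1]};{end[0]},{end[1]}"
    query = urlencode({
        "geometries": "geojson",  # return the route as GeoJSON coordinates
        "overview": overview,     # "full" vs "simplified" resolution
        "access_token": token,
    })
    return f"{MAPBOX_DIRECTIONS}/{coords}?{query}"
```

Requesting the same pair of points once with each `overview` value is what produces the two routes compared below.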
For this discussion I’m going to compare the routes returned for a four mile section of trail just outside Patterdale in the UK Lake District.
Here is the type of route returned by the Simplified request:
It looks reasonable when zoomed out, but if you zoom in you can see how it would often stray from the actual path, and it likely doesn’t provide enough detail to differentiate between diverging paths on the ground.
So then let’s look at the Full request:
This is incredibly detailed, and would certainly provide enough specificity to differentiate paths. So the straightforward answer would have been to use this approach and then move on.
However, if I had, I would have been setting myself up for lots of challenges down the road. The simplified route involves 18 waypoints…but the full route includes 377! Roughly 21 times more coordinates to manage.
While I’m cognizant of the risks of premature optimization, my experience has taught me that the fewer points I store, the more performant countless later parts of my app will be. No amount of rendering optimization can make up for the fact that each time I have to analyze or display the route it will use 20X more resources.
So in this case I wanted to find a middle ground: something where I can store just enough coordinates to ensure the walker isn’t confused or lost, but no more, so that all the downstream systems I build from here can be as performant as possible.
Thankfully I can throw math at the problem in this case and use the Douglas–Peucker algorithm to simplify the resulting route without a meaningful loss in accuracy. This method drops any point which falls within a given tolerance of the straight line between the points around it, essentially removing the points which aren’t adding accuracy to the line.
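For the curious, the algorithm is short enough to sketch in full. This is a minimal Python illustration (the function names are mine, and a production version would likely run on projected coordinates rather than raw latitude/longitude): it recursively keeps the point farthest from the line between a segment’s endpoints whenever that point exceeds the tolerance, and drops everything in between otherwise.

```python
import math

def perpendicular_distance(pt, start, end):
    """Distance from pt to the infinite line through start and end."""
    (x, y), (x1, y1), (x2, y2) = pt, start, end
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        # Degenerate segment: fall back to point-to-point distance.
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, tolerance):
    """Simplify a polyline, dropping points within `tolerance` of the
    line joining the first and last points of each segment."""
    if len(points) < 3:
        return points
    # Find the interior point farthest from the endpoint-to-endpoint chord.
    index, max_dist = 0, 0.0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > max_dist:
            index, max_dist = i, d
    if max_dist <= tolerance:
        # Every interior point is within tolerance: keep only the endpoints.
        return [points[0], points[-1]]
    # Otherwise split at the farthest point and simplify each half.
    left = douglas_peucker(points[:index + 1], tolerance)
    right = douglas_peucker(points[index:], tolerance)
    return left + right[1:]  # drop the duplicated split point
```

Note that the tolerance is expressed in the same units as the input coordinates, so for latitude/longitude points a metric tolerance has to be converted to degrees first.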
Here is the result of running this at a variety of linear tolerances.
You can see that increasing the tolerance makes the curve less “accurate” but also dramatically reduces the number of coordinates that make up the route.
Getting this tolerance right was the real crux of this decision because I want to save as much space as possible but the utility of this feature as a navigational tool is also paramount. So I can’t simplify the route beyond the point where it would cause confusion in the field.
I experimented with a variety of options on a variety of trails and found that a tolerance of roughly 5 meters was “good enough” for me. In general this reduced the route to around a quarter of the original points, without a meaningful reduction in accuracy.
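One practical wrinkle: the route coordinates are latitude/longitude pairs, so a metric tolerance like 5 meters has to be translated into degrees before it can be compared against them. Here is a rough sketch of that conversion under an equirectangular approximation; the helper name and approach are my illustration, not necessarily what the app does.

```python
import math

def meters_to_degrees(meters, latitude):
    """Approximate a metric tolerance as latitude/longitude tolerances.

    One degree of latitude is roughly 111,320 m everywhere on Earth,
    while one degree of longitude shrinks by cos(latitude) as you move
    away from the equator.
    """
    deg_lat = meters / 111_320
    deg_lon = meters / (111_320 * math.cos(math.radians(latitude)))
    return deg_lat, deg_lon
```

At the Lake District’s latitude of roughly 54.5°, 5 meters works out to about 0.000045° of latitude and a somewhat larger slice of longitude.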
Here is another example, using a longer 23-mile route I walked a few weeks ago.
The original route I got back from Mapbox included 1,809 waypoints (78 waypoints per mile); my algorithmically reduced version included 494 (21 waypoints per mile). But as you can see, even on a very winding section of switchbacked trail the route lies almost perfectly along the trail.
This particular example is a relatively straightforward instance of applying the rule of overvaluing early decisions, but hopefully it serves to illustrate the concept. By paying extra attention early on and reducing the inputs to my route planner as much as possible, I’m setting myself up for lots of easier problems later on. When I go to make my map scroll smoothly, it will be much easier to optimize with only 27% of the points to render, analyze, and consider. Similarly, it will free me up to explore some on-device features which might be too slow or impractical if I were using the full resolution routes.
How you apply this axiom to your work will vary from case to case, but perhaps it is well summarized by asking yourself the question “How can I make my future self’s job easier by the choices I make today?”. The more you adopt this future-looking perspective, especially early on, the smoother the later parts of the process tend to go.