Design Notes Diary #4

Simulated Locations

While Sim Genie is fantastic for simulating location on the Simulator, I realized that for the next phase of testing & development I’m going to need a way to simulate locations on device. While it is fun to “have” to go out for walks to test features (and it is a bit of a joke in my house that I’ll often go for hikes in the middle of the day “for work”), I really need a way to test on device while connected to Xcode. So I’m building out a little system that can read in a GPX file and then generate events just like Core Location does.

This will slot into my app wherever I currently use a CLLocationManager and let me mock out locations whenever I want.
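The shape of it is roughly this (a simplified sketch; the real version will also derive speed, course, and timing from the GPX data):

```swift
import CoreLocation

// Simplified sketch of the idea: abstract the parts of CLLocationManager
// I actually use behind a protocol, then swap in a GPX-driven fake.
protocol LocationProviding: AnyObject {
    var onLocation: ((CLLocation) -> Void)? { get set }
    func start()
    func stop()
}

// The real provider just forwards Core Location callbacks.
final class LiveLocationProvider: NSObject, LocationProviding, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    var onLocation: ((CLLocation) -> Void)?

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }
    func stop() { manager.stopUpdatingLocation() }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        locations.forEach { onLocation?($0) }
    }
}

// The fake replays pre-parsed GPX points on a timer, shaped like real events.
final class SimulatedLocationProvider: LocationProviding {
    private let points: [CLLocation]   // parsed from a GPX file elsewhere
    private var index = 0
    private var timer: Timer?
    var onLocation: ((CLLocation) -> Void)?

    init(points: [CLLocation]) { self.points = points }

    func start() {
        timer = Timer.scheduledTimer(withTimeInterval: 1, repeats: true) { [weak self] _ in
            guard let self = self, self.index < self.points.count else { return }
            self.onLocation?(self.points[self.index])
            self.index += 1
        }
    }
    func stop() { timer?.invalidate() }
}
```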

Now I can pretend to be in Scotland whenever I want.

Live Activity Payload Limit

I’m working on the Live Activity part of this update. Something I’m running into is the 4KB limit on the data I can pass to my Live Activity. Right now I’m passing in coordinates to be displayed on a map as a path of where you’ve recently been. This periodically fails once a certain number of points gets added, but I’m not really sure how the 4KB limit is calculated.

So I wrote a script that keeps adding GPS coordinates and retrying the update, to find the limit.

I tried out a bunch of different Codable encoders, and it looks like the JSON-encoded size is what defines the limit.

```
Failed payload size (bytes):
PLIST BINARY  2334
PLIST XML     9898
JSON          4298
```

When my activity state hit the point where its JSON-encoded representation was over 4KB, updates started to fail. So I guess that’ll be my limit.
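Going forward I’ll just enforce this myself before each update. A sketch of the guard (the state type here is illustrative, not my actual one):

```swift
import Foundation

// Illustrative stand-in for my Live Activity content state.
struct RouteState: Codable {
    struct Coordinate: Codable { var lat: Double; var lon: Double }
    var coordinates: [Coordinate]   // the recent path
}

// Drop the oldest points until the JSON encoding fits the 4KB budget,
// since the JSON-encoded size seems to be what the system measures.
func trimmed(_ state: RouteState, budget: Int = 4096) -> RouteState {
    var state = state
    let encoder = JSONEncoder()
    while let data = try? encoder.encode(state),
          data.count > budget,
          !state.coordinates.isEmpty {
        state.coordinates.removeFirst()
    }
    return state
}
```

In practice I’ll probably leave a bit of headroom under 4096 rather than riding right against the edge.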

Redesign the Live Activity screens

I’m working through a number of designs for my Live Activity, trying to find a good balance between information and aesthetics.

1. Information Dense

2. Corners

3. Top Data

4. Just Distance in the Corner

5. “Part of the Map”…YIKES!

6. Just Distance, tidied up.

I think I’m leaning towards #6. While I have some interest in putting the time, distance, speed, etc. into the Live Activity, given the constraints on updating it I’m starting to think that showing something that changes very rarely is actually better. While I can do things like shorten the time display to “2m” rather than “2:01”, I’m wondering if all that does is set the wrong expectation. This display is more glanceable and simple…or at least it likely should be.

[Note: those screenshots look a bit funny on the top and bottom edges because I have to preview them pretending that they are Medium widgets; as far as I can tell there isn’t yet a way to do a SwiftUI preview of Live Activities.]

As is so often my approach when I get stuck on questions like this, my first port of call is the Human Interface Guidelines, specifically the section on Live Activities.

This section seems to settle the matter pretty well.

Present only the most essential content. People appreciate getting a summary and key bits of information about an ongoing task or event; they don’t expect to receive a lot of details or to perform actions in a Live Activity. Let people tap your Live Activity to access additional details and functionality within your app.

I’m only going to show the distance in the Live Activity/Dynamic Island. If the walker wants to see more details then they can just tap it.


Next up is the Dynamic Island treatment.

1. Something “Traditional” where I put the distance in the corners of the island.

2. Bold, map oriented layout.

Option #1 feels like the safe option, but I often subscribe to the “If you’re gonna be a bear, be a Grizzly” school of design. So I’d rather not just do the safe thing. But I’m not sure about the way the cut-out looks…maybe Notch redux?

That’s either genius or horrible…and honestly I can’t tell the difference yet. I’ll have to live with it for a while. It feels like matching the old way is actually quite natural and fits with the rationale for why the original iPhone X had a notch: it avoids the weird trapped corner above the pill.

I may eventually offer an alternative layout for other workout types. For example, this map-centric layout is very appropriate for a hiking workout, but perhaps less so for running. But that is a problem for another day.

Daily Math Expedition

The mapping system I built is based on the XYZ/”slippy” tile system. This involves a mountain of math converting lat/lon coordinates into tile coordinates and projections. I just discovered that deep in my framework I was miscalculating the extent of the visible map, and thus was requesting way more map tiles than I needed to. That would have been a rather expensive mistake if it had gotten through to production. Phew.
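For reference, the core lat/lon-to-tile conversion is the standard one from the OpenStreetMap wiki:

```swift
import Foundation

// The standard XYZ/"slippy" conversion: project a lat/lon into tile
// coordinates at a given zoom level.
func tileCoordinate(latitude: Double, longitude: Double, zoom: Int) -> (x: Int, y: Int) {
    let n = pow(2.0, Double(zoom))
    let x = Int(floor((longitude + 180.0) / 360.0 * n))
    let latRad = latitude * .pi / 180.0
    let y = Int(floor((1.0 - log(tan(latRad) + 1.0 / cos(latRad)) / .pi) / 2.0 * n))
    return (x, y)
}
```

The visible extent is then just the tile range spanned by the view’s corner coordinates; get one corner wrong and you quietly multiply the number of tiles you request.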

Corner Radius

I noticed something wrong with the top metric view. The workout view is displayed in a modal presentation, and the system rounds its corners to a 10pt radius. I was insetting my view from this by 16 points, which leads to a little issue, visually speaking.

Typically the rule for perfectly matching interior corners is to decrease the inner radius by the padding between them; the corners then match perfectly because they share a center. In this case, however, I’m padding by so much that this would lead to an inner radius of -6pt. So the best I can do is make the inner radius smaller; I went with 8pt. That visually makes it “nest” better, I think.
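In code form, the rule plus the fallback is tiny (a sketch):

```swift
import CoreGraphics

// Matching interior corners share a center: inner = outer - padding.
// When the padding exceeds the outer radius that goes negative, so
// clamp to a small minimum instead (8pt in my case).
func nestedCornerRadius(outer: CGFloat, padding: CGFloat, minimum: CGFloat = 8) -> CGFloat {
    max(outer - padding, minimum)
}

// The modal's 10pt corner with my 16pt inset:
// nestedCornerRadius(outer: 10, padding: 16) -> 8
```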

David Smith




Design Notes Diary #3

8:45a Start of Month Logistics

The first day of the month always starts with a half hour of kinda boring logistics. Lots of invoices to be paid, accounts to be reconciled and bookkeeping to clean up.

9:44a Catching up on my RSS

While reading through my RSS subscriptions in FeedBin, I hit upon a new site to subscribe to, which is always a fun event. As is often the case, I discovered it via a link from Michael Tsai’s excellent blog (no, seriously: if I were somehow forced to read only one blog, it would almost certainly be his). Anyway, he linked to a super interesting post by Alin Panaitiu about SwiftUI performance, from which I learned a ton.

10:46a Jelly Face

Many mornings I find myself struggling to get started coding. While overall I enjoy my work, that doesn’t mean that getting going with it is easy. What I’ve found recently is that doing a bit of a ‘warm-up’ is often helpful to start my day. This typically takes the form of building out a little watch face.

Watch faces are a nice quick, constrained design problem that challenge enough of the design and development parts of my brain to be useful, without getting me pulled down into a deep rabbit hole.

This morning I worked on a digital face using a “font” that the designer/developer extraordinaire Daniel Farrelly made for another watch I was working on. They are hand traced from the design of an old World War 2 watch.

The result is quite lovely.

One day I’ll integrate all my custom watch face designs into Widgetsmith (or who knows maybe one day as actual watchOS faces). But until then this is how I get started on days I’m feeling sluggish.

1:01p Rolling my own Chart System

Yeah, so the more I thought about using Swift Charts yesterday after I finished working for the day, the less I liked the idea. There are just too many limitations at this point and I think I’m going to end up chasing my tail way too much.

So I figured I’d start my work today by seeing how quickly I could recreate the chart I made yesterday. If it took only a few hours, I’d drop Swift Charts for now and use my own. That way I can easily support back to iOS 14 and customize things just how I like them.

The result was very promising. Here is the chart I made in around two hours.

The main downside with this approach is having to build my own AudioGraph system for the VoiceOver support, but that doesn’t look too bad.
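For the curious, the skeleton of a chart like this is genuinely small. A simplified sketch (the real one adds labels, selection, and the accessibility work):

```swift
import SwiftUI

// Simplified sketch of a hand-rolled bar chart: scale each value against
// the maximum and draw proportional bars, anchored to the bottom edge.
struct BarChart: View {
    let values: [Double]

    var body: some View {
        GeometryReader { proxy in
            let maxValue = max(values.max() ?? 1, 1)
            HStack(alignment: .bottom, spacing: 2) {
                ForEach(values.indices, id: \.self) { index in
                    RoundedRectangle(cornerRadius: 2)
                        .fill(Color.accentColor)
                        .frame(height: proxy.size.height * CGFloat(values[index] / maxValue))
                }
            }
            .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .bottom)
        }
    }
}
```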

Exploring performance

Something funny about having an app that displays user observation data is that it can only grow one day at a time. So to get a sense of whether my graph approach is performant enough, I need to look at how many bars could possibly be shown.

The iPhone 5S was introduced on September 20, 2013. It was the first iPhone to support step counting, so that is the farthest back the data could exist. That was 480 weeks ago. So to performance test, I loaded up my graph with 1.5x that (720 weeks) to see how it holds up. So far so good; it seems like I don’t need to build out a dynamic rendering system yet.
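Generating the stress-test data is the easy part; something like:

```swift
// 1.5x the maximum possible history: 720 weeks of fake weekly step totals.
let testWeeks: [Double] = (0..<720).map { _ in Double.random(in: 0...150_000) }
```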

For the main step graph display I have had to build my own custom lazy-loading HStack system. The native SwiftUI one wasn’t reliable enough for me; I found that it did weird things with very high data point counts. We are 3,359 days from the iPhone 5S launch, but when I tried putting that much data into the native LazyHStack everything fell over. Hence, I had to build my own system for layout and caching.
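I won’t reproduce the whole thing here, but the shape of the idea is roughly this (a hypothetical sketch, not my actual implementation): track the scroll offset yourself and only materialize the views near the visible window, with clear spacers standing in for the rest.

```swift
import SwiftUI

// Reports the horizontal scroll offset out of the scroll view.
private struct OffsetKey: PreferenceKey {
    static var defaultValue: CGFloat = 0
    static func reduce(value: inout CGFloat, nextValue: () -> CGFloat) { value = nextValue() }
}

struct WindowedRow<Content: View>: View {
    let count: Int
    let itemWidth: CGFloat
    let content: (Int) -> Content
    @State private var offset: CGFloat = 0

    init(count: Int, itemWidth: CGFloat, @ViewBuilder content: @escaping (Int) -> Content) {
        self.count = count
        self.itemWidth = itemWidth
        self.content = content
    }

    var body: some View {
        GeometryReader { proxy in
            ScrollView(.horizontal, showsIndicators: false) {
                // Only build the items near the visible region, padded by
                // a handful of extra items on each side.
                let low = max(0, Int(offset / itemWidth) - 10)
                let high = max(low, min(count, Int((offset + proxy.size.width) / itemWidth) + 10))
                HStack(spacing: 0) {
                    Color.clear.frame(width: CGFloat(low) * itemWidth)
                    ForEach(low..<high, id: \.self) { index in
                        content(index).frame(width: itemWidth)
                    }
                    Color.clear.frame(width: CGFloat(count - high) * itemWidth)
                }
                .background(GeometryReader { inner in
                    Color.clear.preference(key: OffsetKey.self,
                                           value: -inner.frame(in: .named("scroll")).minX)
                })
            }
            .coordinateSpace(name: "scroll")
            .onPreferenceChange(OffsetKey.self) { offset = $0 }
        }
    }
}
```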

4:13p Better Testing Data

Something I’ve found very important while working with SwiftUI is that I need to be very thoughtful about what testing/example data I put into my Previews. If I get them wrong I’ll end up building the wrong thing or optimizing in the wrong direction.

To that end I just did a bit of work to create an easily loadable version of my own step count database (I wrote it to a big JSON blob that my previews can read). This goes back to nearly the launch of Pedometer++ and so is a good varied example of what a high usage user would look like.

4:41p Dark Mode Stats

I also just remembered that I added Dark Mode reporting in the last Widgetsmith update. It looks like Dark Mode is much more popular than light mode.

This is session-based, so it could also mean that more folks have automatic switching turned on and use the app in the evening, rather than having their phone in Dark Mode all the time. Though it amounts to the same thing: I need to make sure things look real good in Dark Mode, as it is the dominant appearance people use my app in.

David Smith




Design Notes Diary #2

5:35a Checking in with the new stats again

Now that I have a full day of data to look at, I checked in on the Dynamic Type data. The funny thing is that, because of the Law of Large Numbers, the result isn’t particularly different from when I was looking at a fractional day yesterday, with the exception of now having enough time to catch a few of the rarer Accessibility sizes.


It is a terrible habit I have, but I also check yesterday’s App Store sales every morning as one of the first things I do after I wake up. I got into this habit when I was early in my career as an indie and things were a bit tenuous. Even though the business has been stable for several years now, it is still something I do every day.

6:23a Tidying up the workout screen

I thought about adding a button to the speed display to swap between pace and speed, but after playing around with it a bit I think it is worse that way. Instead I’ll make it so that tapping the display changes it, and put the ‘visible’ version of this control in the workout settings page. I don’t mind hidden features as long as they aren’t only hidden.
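A sketch of what I mean, with the same stored preference backing both the hidden tap and the visible switch (the key and labels are illustrative):

```swift
import SwiftUI

struct SpeedDisplay: View {
    // One preference, reachable from both the tap gesture here and a
    // Toggle in the workout settings screen.
    @AppStorage("showsPace") private var showsPace = false
    let speedText: String
    let paceText: String

    var body: some View {
        Text(showsPace ? paceText : speedText)
            .onTapGesture { showsPace.toggle() }
    }
}
```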


While trying to get this built I ran into one of the more frustrating aspects of SwiftUI work: a view’s hit box can exceed its visible bounds, making your button fail when you tap it in certain corners. In this case the map view was blocking the bottom of the speed label. The trick was adding a .contentShape(Rectangle()) to the map to constrain its hit testing.
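For anyone hitting the same thing, the fix is one modifier (MapView and SpeedLabel here are stand-ins for my real views):

```swift
import SwiftUI

struct MetricsSection: View {
    var body: some View {
        VStack(spacing: 0) {
            SpeedLabel()
            MapView()
                .contentShape(Rectangle())  // hit testing stops at the frame
        }
    }
}

// Minimal stand-ins so the sketch compiles.
struct SpeedLabel: View { var body: some View { Text("3.2 mph") } }
struct MapView: View { var body: some View { Color.green.frame(height: 280) } }
```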


I tried out mirroring the Apple Watch Workout app layout where you have a swiped page view with the controls on the other side, but I think I like that less. Visually it is better, but it feels really weird on a screen that is so large.


While trying out putting the map on top, I also realized that I need to change my SwiftUI preview to put the workout view into a modal sheet. That is how it will be displayed in the app, and if I don’t, many of the designs may look better in previews than they actually would on device.
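The preview wrapper is simple enough (WorkoutView is a stand-in name):

```swift
import SwiftUI

struct WorkoutView: View { var body: some View { Color.blue } }  // stub

// Preview the screen the way the app actually presents it: as a modal
// sheet over a host view, so the system corner rounding and insets show.
struct WorkoutView_Previews: PreviewProvider {
    static var previews: some View {
        Text("Host")
            .sheet(isPresented: .constant(true)) {
                WorkoutView()
            }
    }
}
```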


Tried making the map HUGE.

Now we’re talking. Let’s try to make the map the star and just overlay the metrics/buttons on top of that.

To start with I’m going to just throw some basic charts in here, hopefully establishing a baseline that I can build on later.

What’s really tricky here is that I want to display graphical data…which means charts. I’d love to use Swift Charts, but it only works on iOS 16+. Currently iOS 16 is around 50% of Pedometer++’s user base, so that’s not great, unless I only show the Trends tab on iOS 16.

Right now that is what I’m tempted by: show the full Trends tab on iOS 16 and build a super basic version for users running older versions. One of the main goals of this update is to modernize the app, so getting stuck on something old seems counterproductive.


Using a framework in the first year it is out is always a bit perilous. I just spent a good hour trying to work out how to pin the axis to the side of a scrolling chart view. As best I can tell this isn’t a first-class behavior. I got something working…but it might actually be simpler to just roll my own charts if I hit a bunch more limitations like that.

David Smith




Design Notes Diary #1

Tuesday, November 29, 2022

8:50a Meta Note on Design Notes

As part of my usual development approach I very often post in-process screenshots and ideas to Twitter. This has provided me with a rich sense of community & feedback. With all the recent drama around Twitter, however, it has gotten me thinking about this part of my work process and whether an alternative would be better, at least for now.

The more I thought about it the more I realized that while posting my work likely had some benefit to others in learning from my process and mistakes, there was a fundamental value to me in the capture rather than the share. In many ways it reminds me of the Field Notes slogan “I’m not writing it down to remember it later, I’m writing it down to remember it now.” Or similarly the concept of rubber duck debugging.

There is tremendous value in having to condense the vague notion of an idea down into words; doing so clarifies, crystallizes and improves it.

So while the drama and uncertainty around Twitter will likely be ongoing for some time, I didn’t want to lose this aspect of my work process. So I’ve decided to try something new: a daily Design Notes diary. While I’m working I will keep a Markdown document open and periodically capture what I’m thinking about, working through or evaluating. Then at the end of the day I’ll post these to my blog, with cross-post links to Twitter and Mastodon.

This is very experimental, but times of uncertainty are often useful for experimentation. Expect some typos, half-baked ideas and unevenness…but also, hopefully, useful honesty and a view into what goes into making apps.

And if I’m being completely honest, I hope there is an additional benefit of “being watched” helping me be more productive. I’ve been in a bit of a slump recently, and who knows, maybe this will help. 🤷🏻‍♂️

8:59a GMT Publish Widgetsmith 4.1.1

I shipped an update to Widgetsmith last week which had a really awkward mis-render on my new paywalls, of all places…thankfully Matthew on Twitter let me know so I could get it fixed.

I also added instrumentation to my privacy-oriented analytics system to collect which Dynamic Type sizes are most popular, with the hope of using this to better support them going forward.
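The instrumentation itself is a one-liner (logEvent is a stand-in for my analytics call):

```swift
import UIKit

// Stand-in for my privacy-oriented analytics system.
func logEvent(_ name: String, value: String) { /* ... */ }

// Record the user's current Dynamic Type setting, e.g. "UICTContentSizeCategoryL".
func recordDynamicTypeSize() {
    logEvent("dynamic_type", value: UIApplication.shared.preferredContentSizeCategory.rawValue)
}
```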

9:27a Explore layouts of the Pedometer++ workout screen

I’m working on a new iPhone workout mode for Pedometer++.

This is what I started with:

But I think the inset map looks a bit weird. So I tried this:

Which I think is much better but I’m not a huge fan of the buttons being next to the map. It looks a bit crowded. Like they need to be free.

Hmm…I think this is better but it still feels a bit unbalanced. I’ll run with this for a bit though.

I think it looks more balanced with the buttons as images rather than words. But still not 100% happy with it.

9:46a Get zooming working for my custom map system

I have built out a completely custom mapping system. Originally this was for the Apple Watch but now it is also for the iPhone. On the watch I could do zooming with the Digital Crown, but now that it is running on an iPhone I also need to get pinch to zoom to work…which is a bit of a pain in SwiftUI.
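The core of the gesture handling ended up looking roughly like this (a sketch; MapCanvas is a stand-in for my tile view):

```swift
import SwiftUI

struct MapCanvas: View { var body: some View { Color.green } }  // stub

// Sketch of the pinch handling: keep the committed zoom in @State and
// layer the in-flight gesture scale on top via @GestureState.
struct ZoomableMap: View {
    @State private var zoom: CGFloat = 1
    @GestureState private var pinch: CGFloat = 1

    var body: some View {
        MapCanvas()
            .scaleEffect(zoom * pinch)
            .gesture(
                MagnificationGesture()
                    .updating($pinch) { value, state, _ in state = value }
                    .onEnded { zoom *= $0 }
            )
    }
}
```

The real version can’t just scale pixels, of course: past a threshold you have to swap in a different tile zoom level, which is where things get mathy.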

This work is made much easier with the use of on-device SwiftUI previews, since gestures don’t really work well in the simulator.

Phew, that was an adventure. You know it is a tricky problem when you end up with a notebook page full of math formulae at the end…but I got there in the end:

12:20p Working on being able to flip between Apple and Custom Maps

When working on GPS features I’m usually having the iOS simulator run in “City Run” mode, which simulates a route around Infinite Loop in Cupertino. Sometimes I wonder, who did this run originally? I believe this has been in the simulator since iPhone OS 2.0, way back in 2008.

Probably a good time to mention the excellent Sim Genie tool by the legend himself, Curtis Herbert. It lets you replay GPX files to the simulator, which I’ve found exceedingly useful for getting a variety of locations to test.

3:57p Initial Dynamic Type Stats

With the Widgetsmith release I published this morning I added data collection for Dynamic Type sizes. I want to understand which ones are most common to better adapt my designs. The fun thing about having an app with a wide audience is that I can pretty quickly get this data in.

Here is the result so far:

This overall matches what I’d have guessed, with a bit of a surprise that the Extra Small size is so popular.

4:36p Wrapping up

It has been a really productive day. Way more productive than I’ve been able to achieve in the last few weeks. Though of course I often can get a brief burst of productivity by changing anything. Working in a new location, working on a new project. The real trick is finding a change that sustains that productivity.

Will this one do that? No idea. But I’ll take the boost today either way.

David Smith




Production Notes: making an iPhone Video in the Field

The response to my Apple Watch Ultra expedition video has been really encouraging. Video is very far from my area of expertise and something that I feel only a passing confidence with. However, I still think that I’ve learned a few things worth sharing in the creation of this video, in case anyone else wants to make a similar video using their iPhone.

This video was recorded entirely on an iPhone 14 Pro, using a combination of the Camera app and DoubleTake (discussed below). I edited it in Final Cut Pro. The raw source footage was around 21GB, which I AirDropped over to my MacBook Pro.

Audio

By far the hardest part of making a video outside in the elements is getting usable audio. The built-in microphone is only really viable indoors or on very still days.

I’ve tried a number of things over the years while out hiking to try and get good audio. I’ve tried using the old wired EarPods. I’ve tried using a standalone lapel mic with a dedicated recorder. I’ve tried using my AirPods Pro. I’ve tried using a second iPhone running Voice Memos. None of these really works very well, and each has a variety of drawbacks.

What I have found is that it is essential that the audio be recorded directly into the iPhone. While external recorders can be synced up later in Final Cut, this process is super cumbersome and means that you can’t easily check your footage in the field. It is also essential to have some kind of wind/weather screen on your microphone; otherwise you will constantly end up with unusable shots where all you hear is wind noise.

After this trip, I think I’ve finally settled on a solution that ended up working very well in practice. I have started to use the Rode Wireless Go 2 connected directly to my iPhone. This microphone system includes two parts: the microphone/transmitter and a receiver unit. The microphone comes with a wind screen which can be installed on top. This device is clearly intended to be used with a larger, more traditional camera setup, but with a little creative cabling I got it working well with an iPhone.

The output from the Wireless Go can be connected to the iPhone in two ways. Rode makes a USB-C to Lightning cable which can do this digitally, but I found this cable impossible to source. Strangely, a standard USB-C to Lightning cable doesn’t work; it seems there is something special about Rode’s cable that is necessary to get audio to flow.

What I found to work well instead was to get a cheap TRS-to-TRRS cable and then plug this into the iPhone headphone adapter. This makes the phone think that the audio is coming in from an external headset and worked a treat. iOS then used this audio source whenever it was plugged in without any additional setup or configuration.

I primarily use my iPhone with my right hand. So I clipped the receiver to my right wrist using my Apple Watch’s strap as the clipping point (and tucked the end of the cable under the strap for safe storage while moving). For this trip I was wearing a watch on each wrist, but in the future I’d just wear my watch on my right wrist for this purpose. That way whenever I want to record some video, I’d just pull out my iPhone, plug in the cable and start.

Probably the best example of this setup’s quality is the section in the middle of my video where I’m discussing the audio quality of the Ultra. I was around 20 meters from my iPhone/receiver, across a road, with steady drizzle and a blustery wind. The resulting audio is very clear and usable.

Camera.app and DoubleTake

For the majority of my video I used the main Apple Camera app. I find that you can’t beat its performance and ease of use. I shot the video in 4K/30fps.

I tried doing a few scenes in Cinematic mode but ultimately regretted it because the resulting video is much more cumbersome to use. I also did this video with HDR enabled, but ended up regretting that too. While I’m sure there are ways to make HDR work for a video destined for YouTube, I found in practice HDR made the workflow and upload process much more complicated. In the end I disabled HDR for my export because no matter what I tried I couldn’t get it to look right on YouTube. For a recreational video creator it just isn’t worth the marginal benefit.

The other app I used to record video was FiLMiC’s DoubleTake. This lets you record concurrently from the front and back cameras. I found this really helpful in doing my “Walk and Talk” segments. Previously, I’ve done these with just the selfie camera, but that leads to a kinda boring visual. It also makes it hard for me to ‘show’ things; for this video it was useful to be able to cut between my face and video of my wrist.

Since both resulting clips use the same plugged-in microphone, later syncing up the two videos was very straightforward in Final Cut.

The only thing I wish both of these apps included is the ability to overlay an audio peak meter while recording. FiLMiC’s main video app can do this, but neither DoubleTake nor Camera.app does. I’d have liked the confidence that I hadn’t misconnected something. Instead, I did lots of quick test shots which I played back to confirm my audio/video was solid.

Timelapses

In a couple of spots I included scenic timelapses in my video. What I have found to work really well for these is to not use the timelapse mode in Camera.app, and instead just record a very long regular video. The challenge with the timelapse mode is that it limits your later control over timing and quality. If you just speed up a long, fixed position video in Final Cut you end up with the same effect but can adjust the timing easily later on depending on what you want.

For these fixed shots I attached my iPhone to a lightweight tripod using a Studio Neat Glif. This worked well and weighs very little, which I value on my hikes.

Stabilization

If I wanted the most stable, clean video possible I would have taken along a handheld gimbal for this trip. But the reality is I really didn’t want to lug around the weight of something like that with me and have to stop to set it up any time I wanted to record a clip. So instead I just shot all my non-static shots handheld and relied on the iPhone’s stabilization to be good enough.

There were a few clips where I think the result is a bit more wobbly than I’d prefer, but overall this approach worked well. One thing I’d recommend: while recording your walking shots, make sure you don’t lock your elbow. Keeping a flexible arm while recording lends a bit of natural shock absorption to the video and acts as a pretty good ‘virtual gimbal’.

Apple Watch Camera Remote

For a few of my ‘talk to the camera’ shots I set up my iPhone on a tripod and then moved away from it. For these I didn’t want to use the selfie camera, so that I could get the full, high-quality video from the back cameras. To check my framing and start/stop the video I used the Apple Watch’s Camera Remote app. This gives a little preview of what the iPhone can see, so I could check I was in frame, and then remotely start and stop capture. I wasn’t sure if this would have any issues working away from WiFi and civilization, but it worked great for me well out in the wilderness. You can catch a glimpse of this in action right at the end of my video.

Closing Thoughts

Overall, I have to say the creation of this video was much simpler than I would have guessed. The recording process (once I’d sorted out the audio situation) was natural and not that dissimilar to what I’d be doing anyway while capturing photos and video clips for myself.

I’m sure there are better ways to edit and compose the video in Final Cut. I could have added music or transitions or all manner of other touches. But the reality is that I find that if I overcomplicate something like this I just won’t do it. I’d rather keep it simple and have it exist, rather than strive for sophistication and leave it a dream. Each time I make something like this I’ll learn and improve.

I highly recommend giving something like this a try (and you can start off just using your AirPods) on your next adventure. It is both a good skill to nurture and something that will preserve a valued memory later on. Photographs and video clips are lovely, but recording your actual thoughts and experiences is so much more rich and valuable.

David Smith