Cued
Play the best parts of your favorite songs.
Next.js • TypeScript • Website • Postgres • Drizzle • Queue • Tailwind CSS
What is it
I love working out, but I hate when the music goes dead mid-set. Especially when I'm running or lifting, I need high-energy songs to keep me moving. Unfortunately, my favorite tracks almost always have dead parts (like long intros/outros). This sucks, and it often results in the dreaded "stop and scrub," where you frantically try to "get to the good part" before the next set.
This is a pain, which is why I built Cued. It lets you mark exactly which parts of your music you want to hear and then automatically skips the rest, ensuring you never have to deal with lame music again. The source code is available on GitHub at cued.
How does it work
I wanted Cued to integrate tightly with Spotify because I actually wanted to use it myself. This led me to build on the Spotify API and implement a two-part architecture:
- A web-based UI (Next.js + Tailwind + tRPC + Postgres via Drizzle) that lets you browse your playlists, search songs on Spotify, quickly trim them to your liking, and turn Cued on/off.
- A backend worker that monitors your Spotify activity, checks whether the next song in your queue has custom start/end points, and skips appropriately.
I connected these parts through a queue system, implemented with BullMQ.
Crucially, this means you can set Cued up once and then do everything from the regular Spotify app, which satisfied my integration constraint.
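Here's a minimal sketch of that wiring with BullMQ. The queue/job names, payload shape, and Redis config are illustrative placeholders, not the project's actual schema:

```ts
// Minimal sketch of the queue that links the web UI and the worker.
import { Queue, Worker } from "bullmq";

const connection = { host: "localhost", port: 6379 };

// Web side: enqueue a job when a user turns Cued on.
const cuedQueue = new Queue("cued-player", { connection });

export async function enableCued(userId: string) {
  await cuedQueue.add("watch-user", { userId });
}

// Worker side: pick the job up and start watching that user's playback.
new Worker(
  "cued-player",
  async (job) => {
    const { userId } = job.data as { userId: string };
    // ...poll Spotify for this user and enforce their cue points...
    console.log(`watching playback for ${userId}`);
  },
  { connection },
);
```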
Key challenges
Worker polling
Spotify doesn't offer an event-based API (like webhooks or WebSockets) to tell me when the user's player state changes. Event-based APIs are nice because they make the worker's computation very efficient: there are no wasted requests, since the worker only acts when Spotify notifies it that a relevant change has occurred.
Instead, we're forced to make requests repeatedly and detect state changes ourselves, which is called polling. The tradeoff here is compute/rate limits versus latency. If we want to approximate the responsiveness of an event-based API (low latency), we risk getting rate-limited by Spotify, because we'd have to send rapid requests with little cooldown. But if we wait a long time between requests, we're slower to respond to state changes.
As with many problems in software engineering, this was a balancing act; I ended up with a 5-second poll interval, which gave satisfactory performance.
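As a rough illustration (not the production worker), a single poll cycle against the Web API might look like the sketch below; `getCuePoints` and `accessToken` are hypothetical placeholders for the Drizzle/Postgres lookup and the user's OAuth token:

```ts
// Rough sketch of one poll cycle against Spotify's Web API.
type CuePoints = { startMs: number; endMs: number };
declare function getCuePoints(trackUri: string): Promise<CuePoints | null>;
declare const accessToken: string;

const POLL_INTERVAL_MS = 5_000;

async function pollOnce() {
  const res = await fetch("https://api.spotify.com/v1/me/player/currently-playing", {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (res.status === 204) return; // nothing is playing

  const state = (await res.json()) as {
    is_playing: boolean;
    progress_ms: number;
    item: { uri: string } | null;
  };
  if (!state.is_playing || !state.item) return;

  const cue = await getCuePoints(state.item.uri);
  if (!cue) return; // no custom start/end saved for this track

  if (state.progress_ms < cue.startMs) {
    // Before the custom start point: seek forward to it.
    await fetch(`https://api.spotify.com/v1/me/player/seek?position_ms=${cue.startMs}`, {
      method: "PUT",
      headers: { Authorization: `Bearer ${accessToken}` },
    });
  } else if (state.progress_ms >= cue.endMs) {
    // Past the custom end point: move on to the next track.
    await fetch("https://api.spotify.com/v1/me/player/next", {
      method: "POST",
      headers: { Authorization: `Bearer ${accessToken}` },
    });
  }
}

setInterval(() => void pollOnce().catch(console.error), POLL_INTERVAL_MS);
```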
Web trimming UI
This was a welcome challenge. It turns out that building an accessible, performant UI for trimming audio that's streamed remotely (i.e. there's no actual `<audio>` element in the DOM) is very difficult. I got it working and had a fun time implementing:
- "Thumbs" for adjusting the desired start/end times of the song
- Auto "play on new start time" when the start thumb was adjusted
- Auto "play 3 seconds before the new end time" when the end thumb was adjusted, so the user hears exactly where their song will cut off
- Pause/play, with auto pausing after moving past the end point
- A virtualized progress indicator that shows exactly where in the song you are, even after seeking around or replaying repeatedly (a sketch of the idea follows this list)
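To give a flavor of the virtualized progress indicator, here's a minimal hook-style sketch of the idea: with no `<audio>` element to read `currentTime` from, you keep a local anchor (last known position + timestamp) and derive the displayed position from it. The hook name and its parameters are illustrative, not the actual component API:

```ts
// Sketch of a locally derived ("virtualized") playback position. The caller
// updates `anchorMs` whenever it seeks, pauses, or gets fresh state from Spotify.
import { useEffect, useRef, useState } from "react";

export function useVirtualProgress(isPlaying: boolean, anchorMs: number): number {
  const anchoredAt = useRef(Date.now());
  const [positionMs, setPositionMs] = useState(anchorMs);

  // Re-anchor whenever playback state changes or the caller seeks.
  useEffect(() => {
    anchoredAt.current = Date.now();
    setPositionMs(anchorMs);
  }, [anchorMs, isPlaying]);

  // While playing, advance the position locally instead of polling Spotify.
  useEffect(() => {
    if (!isPlaying) return;
    const id = setInterval(() => {
      setPositionMs(anchorMs + (Date.now() - anchoredAt.current));
    }, 250);
    return () => clearInterval(id);
  }, [isPlaying, anchorMs]);

  return positionMs;
}
```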
Local caching
I used tRPC, which integrates with TanStack Query (formerly React Query). This is awesome because TanStack Query comes with local caching out of the box, which means general navigation around the UI feels snappy when revisiting pages and doesn't eat into my rate limit.
However, I discovered that after saving changes to a song's cue points and closing the modal, reopening it would bring up the stale state from when the data was originally queried, without the latest mutation factored in.
This is a pretty common problem when trying to sync client/server state. In fact, apps like Instagram handle cases like this by optimistically reflecting your like on a post in the UI (i.e. filling it in on the client) while the server call happens in the background. In my case, though, the queries aren't particularly expensive, so I used TanStack Query's `onSuccess` callback on my `insertTrack` mutation to invalidate the local cache for all queries. This forces them to refetch, and when they do, they get the latest state from the database with my mutation's data included.
If my queries were more expensive and the mutation's effect was predictable, it would make sense to define a local update function for the cached data instead of refetching. In the Instagram example, that just means setting the cached value with `(likes) => likes + 1`.
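For reference, the invalidate-on-success wiring looks roughly like this with tRPC + TanStack Query; the import path and the `track.insert` router/procedure are assumptions standing in for the real `insertTrack` mutation:

```ts
// Hedged sketch of invalidating the cache when the mutation succeeds.
import { api } from "~/trpc/react";

export function useSaveCuePoints() {
  const utils = api.useUtils();

  return api.track.insert.useMutation({
    onSuccess: async () => {
      // Drop every cached query so the next read refetches fresh cue points.
      await utils.invalidate();
    },
  });
}
```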
Poor SDK quality
Spotify's APIs and SDKs are seriously lacking. First, their general API SDK has several type errors. For example, the endpoint for getting a user's currently playing track, as I discovered (but also as one would logically assume...), can return `null`, and the SDK's types do not express this. This made for a pretty annoying developer experience, where I had to disable several ESLint rules in order to add validation that shouldn't have been needed.
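To illustrate the kind of guard this forced, here's a hedged sketch. The declared client below is a placeholder that mirrors the SDK's (incorrect) non-nullable return type, and the specific lint rule is just one example of what ends up suppressed:

```ts
// Placeholder client whose types claim a track is always present, even though
// at runtime the endpoint can come back empty when nothing is playing.
declare const spotify: {
  player: { getCurrentlyPlayingTrack(): Promise<{ item: { uri: string } }> };
};

export async function safeCurrentTrack() {
  const playing = await spotify.player.getCurrentlyPlayingTrack();

  // A runtime guard the types consider impossible, hence the suppression.
  // eslint-disable-next-line @typescript-eslint/no-unnecessary-condition
  if (!playing || !playing.item) return null;

  return playing.item;
}
```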
Second, their Web Playback SDK doesn't play well with React. Even after I got it working locally, some internal logic appeared to prevent it from working in production, which was quite frustrating since this wasn't documented on their site. Fortunately, I realized I could just control playback on the user's currently active device through the API, which saved the project and the web "trimmer" UX.
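For reference, here's a minimal sketch of that fallback: starting a track at a cue point on the user's currently active device via the Web API's "Start/Resume Playback" endpoint. The helper name is mine, and error handling is omitted:

```ts
// Start playback of a track at a given offset on the user's active device.
export async function playFrom(accessToken: string, trackUri: string, positionMs: number) {
  await fetch("https://api.spotify.com/v1/me/player/play", {
    method: "PUT",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ uris: [trackUri], position_ms: positionMs }),
  });
}
```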