Yet another year has passed, and yet another Advent of Code has been bestowed upon us by Eric Wastl. This time Santa has accidentally dropped the keys to his sleigh into the ocean somewhere, and with your track record you are just the person to call in such circumstances. If nothing else, at this point you should have been conditioned not to question the logic of keys to a sleigh. Maybe he chains it to a pole¹ for security.

Advent of Code is a series of little or not so little puzzles, with one revealed each day from December 1st until the 25th. The puzzles start with a part 1 question, and if you manage to answer that correctly you get to look at part 2 which usually includes some kind of twist. In this post, I’ll continue my vaguely consistent tradition of reviewing particularly interesting days and trying to provide some background on them.

This year I once again reached for Rust as my programming language of choice, with the lofty goal of getting a total program runtime of under 0.5 seconds, after I managed 1 second last year. The methodology for measuring this has not changed. I met this goal, albeit barely. The results are below. With that said, let’s look at the puzzles.

Interesting problems

One thing I can say in general is that the puzzles were a bit harder than in other years, but this was combined with excellent pacing. Several puzzles required some form of memoization, for example, but the slow escalation of difficulty made it possible to learn along the way.

Day 6: Fishonacci

Directly from the problem statement you could see that the number of fish would grow exponentially. As such you don’t want to consider each individual fish, and instead group them by “next day they would be fertile.” Then you just do the computation iteratively as normal.
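In code, that grouping boils down to nine buckets and one rotation per day. Here’s a minimal sketch with made-up names, not the exact code linked below:

```rust
/// Count lanternfish after `days` days, grouping fish by their spawn timer.
fn simulate(initial_timers: &[usize], days: usize) -> u64 {
    // buckets[t] = number of fish whose timer currently reads t (0..=8)
    let mut buckets = [0u64; 9];
    for &t in initial_timers {
        buckets[t] += 1;
    }
    for _ in 0..days {
        let spawning = buckets[0];
        // every timer ticks down by one; the old 0-bucket wraps around to 8,
        // which conveniently is exactly where the newborns go
        buckets.rotate_left(1);
        // the parents themselves reset their timers to 6
        buckets[6] += spawning;
    }
    buckets.iter().sum()
}
```

For the example school, `simulate(&[3, 4, 3, 1, 2], 18)` gives 26, matching the puzzle’s worked example.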

Setting aside the direct problem, I think it’s interesting to look at the growth function $G$, to see if we can analyse it a bit more. From the problem definition, lanternfish give birth after their ninth day in life, and then every seventh day after. This is not representative of the real world,² but we can model it as follows: the growth at time $t$, $G(t)$, is equal to the growth 9 days ago (the fish that have just been hatched) plus the growth 7 days ago (the fish that are fertile once more).

$$G_t = G_{t - 9} + G_{t - 7}$$

This could remind you of another recursive sequence that looks fairly similar, the Fibonacci sequence $F_t = F_{t - 1} + F_{t - 2}$. For the Fibonacci sequence a direct formula is known,³ but I’m not good enough at maths to find one for ours. It also wouldn’t help us, because we’re looking for the total, not the growth. We can rewrite this equation to get to the number of lanternfish, $L$, by substituting $L_{t} - L_{t - 1}$ for $G_t$:

$$\begin{aligned} L_{t} - L_{t - 1} &= L_{t - 9} - L_{t - 10} + L_{t - 7} - L_{t - 8} \\ L_{t} &= L_{t - 9} - L_{t - 10} + L_{t - 7} - L_{t - 8} + L_{t - 1} \end{aligned}$$

The growth function turns out to be easier to compute iteratively, so we can iterate that and maintain a running total to arrive at the final result in $O(n)$ time, which works fairly well.
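As a sketch of that idea (the seeding of the first births from the initial timers is my own reading of the puzzle mechanics, and the names are illustrative):

```rust
/// Count lanternfish after `days` days via the growth recurrence
/// G_t = G_{t-9} + G_{t-7}, keeping a running total of all fish.
fn total_fish(initial_timers: &[usize], days: usize) -> u64 {
    // births[t] = number of fish born on day t (1-indexed by day)
    let mut births = vec![0u64; days + 1];
    let mut total = initial_timers.len() as u64;
    for t in 1..=days {
        // first births of the original fish: a timer of k means a birth on day k + 1
        let seed = initial_timers.iter().filter(|&&k| k + 1 == t).count() as u64;
        // fish born 9 days ago give their first birth today
        let from_newborns = if t >= 9 { births[t - 9] } else { 0 };
        // fish that gave birth 7 days ago give birth again today
        let repeat_births = if t >= 7 { births[t - 7] } else { 0 };
        births[t] = seed + from_newborns + repeat_births;
        total += births[t];
    }
    total
}
```

For the example school `[3, 4, 3, 1, 2]`, this yields 26 fish after 18 days, in line with the puzzle’s example.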

The final code can be found on Github.

After doing all this, Reddit informed me that you can also model the state transition diagram of the fish as a matrix multiplication and use fast matrix exponentiation to arrive at the end result in $O(\log n)$ time. It should be noted that the base cost for this algorithm is larger, as squaring a 9-by-9 matrix is more expensive than adding a few numbers, especially since $n$ is only 256.
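A sketch of what that variant might look like, assuming a state vector that counts fish per timer value 0 through 8 (illustrative, not the code linked above):

```rust
type Mat = [[u64; 9]; 9];

fn mat_mul(a: &Mat, b: &Mat) -> Mat {
    let mut c = [[0u64; 9]; 9];
    for i in 0..9 {
        for k in 0..9 {
            for j in 0..9 {
                c[i][j] += a[i][k] * b[k][j];
            }
        }
    }
    c
}

/// Raise a matrix to the n-th power by repeated squaring: O(log n) multiplications.
fn mat_pow(mut m: Mat, mut n: u64) -> Mat {
    let mut result = [[0u64; 9]; 9];
    for i in 0..9 {
        result[i][i] = 1; // start from the identity matrix
    }
    while n > 0 {
        if n & 1 == 1 {
            result = mat_mul(&result, &m);
        }
        m = mat_mul(&m, &m);
        n >>= 1;
    }
    result
}

fn total_fish_matrix(counts_per_timer: [u64; 9], days: u64) -> u64 {
    // step[i][j] = fish with timer i produced after one day by a fish with timer j
    let mut step = [[0u64; 9]; 9];
    for t in 1..9 {
        step[t - 1][t] = 1; // timers simply tick down
    }
    step[6][0] = 1; // a fish at 0 resets to 6...
    step[8][0] = 1; // ...and spawns a newborn at 8
    let m = mat_pow(step, days);
    // apply the matrix to the initial state vector and sum the resulting counts
    let mut total = 0;
    for i in 0..9 {
        for j in 0..9 {
            total += m[i][j] * counts_per_timer[j];
        }
    }
    total
}
```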

Day 7: Crab rave

Part one just required computing the median of the positions for reasons others have proven more concisely than I could, but part two is more interesting.

The cost for a crab to move a certain distance is equal to the arithmetic series $\sum_{i = 1}^{n} i$, where $n$ is the absolute difference in positions. Thanks to Gauss we know this is equal to $f(n) = \frac{n}{2}(1 + n) = \frac{n^2 + n}{2}$. The total cost $F$ is therefore a sum of unimodal functions, which happens to be a unimodal function itself. We can use this unimodality to find the minimum using ternary search.

Ternary search works in a way similar to binary search, but doesn’t require you to know the minimum a priori. We first find an upper and lower bound to search in, then we probe this range at two points, $a$ and $b$, at ⅓ and ⅔ of the range respectively. If $F(a) < F(b)$, we know that the minimum cannot lie after $b$, so we throw away that third of the range. Otherwise, we know that the minimum doesn’t lie before $a$, so we throw away that bit.

In the demo above, you can see this process in action. The yellow dots are our probes, and the reddish area is the area we still consider. You can see just how fast this area shrinks with each step. In practice, we repeat this process until we have a sufficiently small range left over, then try all of the remaining positions to find the minimum. An important thing to note is that while in part one the minimum position was guaranteed to coincide with at least one of the initial positions, the same is no longer true for part two.
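For reference, here is a minimal sketch of the part-two search in Rust, assuming `positions` holds the crab positions (names are illustrative, not the exact code linked below):

```rust
/// Total fuel cost for all crabs to move to `target`, using n * (n + 1) / 2 per crab.
fn move_cost(positions: &[i64], target: i64) -> i64 {
    positions
        .iter()
        .map(|&p| {
            let n = (p - target).abs();
            n * (n + 1) / 2
        })
        .sum()
}

/// Ternary search over the unimodal total cost, finishing with a small linear scan.
fn min_total_cost(positions: &[i64]) -> i64 {
    let mut lo = *positions.iter().min().unwrap();
    let mut hi = *positions.iter().max().unwrap();
    while hi - lo > 2 {
        let a = lo + (hi - lo) / 3;
        let b = hi - (hi - lo) / 3;
        if move_cost(positions, a) < move_cost(positions, b) {
            hi = b; // the minimum cannot lie after b
        } else {
            lo = a; // the minimum cannot lie before a
        }
    }
    (lo..=hi).map(|x| move_cost(positions, x)).min().unwrap()
}
```

The final linear scan over the last few candidates also papers over any off-by-one worries from the integer arithmetic.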

The final code can be found on Github.

Addendum January 24th

An earlier version of this article stated that the cost function for each individual crab is a quadratic function and that therefore the total sum is also a quadratic function. This is almost correct, but not helpful, as the function has a kink (its derivative is discontinuous) at the initial position of any particular crab. So here’s a better explanation of why the method shown above works.

The cost function at position $x$ for any particular starting position $p$ is what I would call a weirdly cropped parabola, $f(x, p) = \frac{(|x - p| + 1)\,|x - p|}{2} = \frac{(x - p)^2 + |x - p|}{2}$. The absolute value bars introduce a kink which makes it not quite a nice parabola. That kink means we cannot trivially sum all the functions together; the coefficients don’t quite mesh. Taking the derivative directly is also unhelpful.

I don’t have a formal proof that the resulting sum is even a unimodal function, but I do have a hand-wavy one. We can take the derivative with respect to $x$ of the cost function, which gives us $f'(x, p) = x - p + \frac{1}{2}\text{sgn}(x - p)$. While we still cannot sum these derivatives directly due to their discontinuity, we do know that each individual derivative is a monotonic non-decreasing function when $p$ is a constant, as it is a linear function of $x$ minus some constant, plus a term of $\pm\frac{1}{2}$. That last term is slightly tricky, but it only flips once, and only from negative to positive, so the whole thing is still a monotonic non-decreasing function.

We can therefore conclude that the sum of all the derivatives, $\sum_{p \in \text{crabs}} f'(x, p)$, must also be a monotonic non-decreasing function, and therefore $\sum_{p \in \text{crabs}} f(x, p)$ must be a unimodal function. Well, not quite: technically I would also have to prove that the derivative starts out negative, but I hope you trust me that a line with slope $1$ crosses the $x$-axis somewhere.
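To make the hand-waving slightly more concrete, we can write out the sum of the $n$ individual derivatives (a small added step, using the same $f'$ as above):

$$\sum_{p \in \text{crabs}} f'(x, p) = \sum_{p} \left( x - p + \tfrac{1}{2}\text{sgn}(x - p) \right) = n x - \sum_{p} p + \tfrac{1}{2} \sum_{p} \text{sgn}(x - p),$$

which is a line of slope $n$ plus a bounded, non-decreasing step term. For $x$ below the smallest crab position every term is negative, and above the largest every term is positive, so the sum crosses zero exactly where the total cost bottoms out.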

Day 17: Slam dunk

Day 17 introduces another set of quadratic equations to solve. We want to hit a box that’s somewhere to our right. The $y$ coordinate follows a nice parabolic arc after launch due to gravity. This means we can compute the “hit” for any initial vertical speed by solving a simple quadratic equation. The $x$ coordinate is similar but has a catch: it stops at the crest of its arc, because its deceleration is due to drag and not gravity. This makes the $y$ coordinate nicer to work with, so we can iterate over that.
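Concretely, under the step rule as I read it from the puzzle (the position changes by the current velocity, then gravity decreases the vertical velocity by one), the height after $t$ whole steps with initial vertical velocity $v$ is

$$y(t) = \sum_{i = 0}^{t - 1} (v - i) = v t - \frac{t (t - 1)}{2},$$

so asking when $y(t)$ falls inside the target’s vertical range really is a quadratic (in)equality in $t$.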

The following description makes two assumptions which happen to hold for the problem inputs I have seen. First, we assume that the target is somewhere at positive $x$ coordinates. Second, we assume the target is below the starting position. The second assumption is more integral to the actual solution.

For part 1 the task is to find the highest initial vertical velocity for which the ball still ends up in the destination square. Many people realized that the square is at one of the “nice” positions for which there exists some initial horizontal velocity where the ball can come to a complete stop after being launched. In that case, the question is reduced to “how fast can you shoot the ball upwards so it doesn’t phase through the target?”

Computing this maximum is fairly simple. The ball’s $y$ coordinate follows a perfect parabola, which means that if you throw it upwards at some speed, it will come back down to the same height at the same speed. Our time step is discrete, so the first vertical velocity at which it travels below the starting point is equal to the negative of the initial speed, minus one. If that single step down still lands on or above the lower edge of the square, the ball will hit. From this maximum speed, we can derive the highest point $y_{max} = \sum_{i = 1}^{v_{max}} i$.
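In code, under the assumptions above, part one collapses to almost nothing. This is a sketch, with `y_min` standing in for the lower edge of the target (a name I’m making up here):

```rust
/// Highest reachable point while still hitting a target whose lower edge is `y_min`
/// (assumed negative, i.e. below the starting height).
fn highest_point(y_min: i64) -> i64 {
    // The fastest upward shot that still hits steps down by v + 1 right after
    // passing the start, so the largest workable v is -y_min - 1.
    let v_max = -y_min - 1;
    v_max * (v_max + 1) / 2 // y_max = 1 + 2 + ... + v_max
}
```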

For the second part we need to enumerate all combinations that can be used to hit the target. For this we need to try all vertical velocities. It’s possible to compute the vertical time of hit by solving a simple quadratic equation. You can then use the time of hit to determine valid horizontal speeds. For this you need to compute whether you can come to a full stop or if you need to hit the target at speed. One important thing to note is that any vertical speed may hit more than once, and you need to try all of these times.
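My solution follows the quadratic route described above, but as a simpler illustration of the same enumeration, a bounded brute-force simulation also works. The velocity bounds below are assumptions derived from the part-one reasoning, not anything stated in the puzzle:

```rust
/// Count initial (vx, vy) pairs that hit the target box, by direct simulation.
/// Assumes the target lies at positive x and below the start.
fn count_hits(x_min: i64, x_max: i64, y_min: i64, y_max: i64) -> usize {
    let mut hits = 0;
    for vx0 in 0..=x_max {
        // anything slower than y_min undershoots; anything faster than -y_min overshoots
        for vy0 in y_min..=(-y_min) {
            let (mut x, mut y, mut vx, mut vy) = (0, 0, vx0, vy0);
            while x <= x_max && y >= y_min {
                if x >= x_min && y <= y_max {
                    hits += 1;
                    break;
                }
                x += vx;
                y += vy;
                vx -= (vx > 0) as i64; // drag: x velocity decays toward zero
                vy -= 1;               // gravity
            }
        }
    }
    hits
}
```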

The final code can be found on Github.

Other days

Some other days deserve an honourable mention but did not have as much depth.

  • Day 9 was a beginner’s course in graph search algorithms. Both BFS and DFS should work, although I implemented DFS for slightly better cache locality in my stack.

  • Day 16 was a nice exercise in reading the instructions very well. No major problems, thanks to the bit-level parsers provided by nom, which deserve a shout-out.

  • Day 18 did not quite fit into the Rust ownership model but it worked out in the end. Apparently it also works really nicely with a linked list of pairs and heights, but as is tradition for me to mention by now, linked lists in Rust are a bit of a let down.

  • Day 19 was pain. No further comments.

  • Day 20 had a nice twist. At first glance it was just a simple cellular automaton but it managed to violate a core assumption I’d made and did not bother to check for an hour. And also someone used it to simulate the game of life, which was neat.

  • Day 23 was a rather straightforward game graph search. It would have been even more straightforward if I’d read the problem statement correctly. I had missed that the amphipods are very restrictive in how they will move. They will not move within the hallway after moving into the hallway, and they will not enter rooms that are not their own, nor will they enter rooms that still contain “other” amphipods. If you ignore this, your state space becomes much larger and you need a much better A* heuristic to get it to finish. You will also find a route that’s shorter than the real one. At least I have a good heuristic now.

  • Day 25 did not stand out to me directly but apparently is actually an example of the Biham–Middleton–Levine traffic model where you need to compute how long it takes for the state to evolve into a globally jammed phase. I just like learning about such models.

Benchmarking results

As mentioned in the introduction, my total runtime this year stands at 0.3 seconds. Most parts are sub-millisecond, a few are sub-microsecond, but the outliers basically contribute all of the runtime. You can see that in the plots below.

Plot: cumulative runtime by day
Plot: runtime by part

To make the goal, I did a bit more optimization than I have previously, but nothing shocking. The main takeaway was to avoid using hash maps if at all possible; using (dynamic) arrays as bitsets instead tends to work very well. Nom is also great for parsing and is much faster than my regex abuse from previous years.
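As a small illustration of the bitset idea (a sketch with made-up names, not code from my solutions), a `Vec<u64>` of bits can stand in for a `HashSet<usize>` whenever the keys are small integers:

```rust
/// A fixed-capacity bitset backed by a dynamic array of 64-bit words.
struct BitSet {
    words: Vec<u64>,
}

impl BitSet {
    fn new(capacity: usize) -> Self {
        BitSet { words: vec![0; (capacity + 63) / 64] }
    }

    /// Set the bit, returning true if it was not set before (like HashSet::insert).
    fn insert(&mut self, index: usize) -> bool {
        let (word, bit) = (index / 64, index % 64);
        let was_set = self.words[word] & (1 << bit) != 0;
        self.words[word] |= 1 << bit;
        !was_set
    }

    fn contains(&self, index: usize) -> bool {
        self.words[index / 64] & (1 << (index % 64)) != 0
    }
}
```

No hashing, no heap allocation per element, and lookups are a shift and a mask.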

In conclusion

I appreciated the upped ante of this year’s edition and the chance to use some algorithmic knowledge that my day job doesn’t tend to let me use as often. This has been one of the very few times I’ve ever used ternary search for anything, which was neat. Other than that, I don’t have much to say, except that I’m looking forward to December.


  1. Presumably the North Pole. ↩︎

  2. I was curious, so I looked this up. One probable species for these lanternfish, Benthosema glaciale, lives to be about 8 years old, matures after 2–3 years, and spawns offspring in the summer. ↩︎

  3. The $n$th Fibonacci number is equal to $\left\lfloor \frac{\varphi^n}{\sqrt{5}} + \frac{1}{2} \right\rfloor$, where $\varphi$ is the golden ratio, $\frac{1 + \sqrt{5}}{2}$. ↩︎