By Eric Burden | December 7, 2020
In the spirit of the holidays (and programming), I’ll be posting my solutions to the Advent of Code 2020 puzzles here, at least one day after they’re posted (no spoilers!). I’ll be implementing the solutions in R because, well, that’s what I like! What I won’t be doing is posting any of the actual answers, just the reasoning behind them.
Also, as a general convention, whenever the puzzle has downloadable input, I'm saving it in a file named `input.txt`.
Day 6 - Custom Customs
Find the problem description HERE.
Part One - Group Project
Ah, customs. What fun! Interestingly, it doesn’t seem to matter how we answer the questions on the 26-question form, just that we answer “Yes” to some of them. As a group. Cool. That’s totally do-able. Actually, this problem is a great opportunity to use a new (for me) package and a data structure I don’t often employ: a stack!
library(dequer) # Provides the `stack()` data structure
library(purrr)  # For `reduce()` and `map()` (`union()` is base R)
test_input <- c('abc', '', 'a', 'b', 'c', '', 'ab', 'ac', '', 'a', 'a', 'a', 'a', '', 'b')
real_input <- trimws(readLines('input.txt'))
# Helper function, divides the input text into the answers for each group
parse_input <- function(input) {
  groups <- length(input[input == '']) + 1        # The number of groups
  stacks <- lapply(1:groups, function(i) stack()) # Pre-allocate a list of stacks
  current_stack <- 1                              # Start with the first group

  # For each item (line) in the input, if it's blank, move to the next stack,
  # otherwise push the characters from that line to the current stack
  for (i in input) {
    if (i == '') {
      current_stack <- current_stack + 1
    } else {
      push(stacks[[current_stack]], unlist(strsplit(i, '')))
    }
  }
  lapply(stacks, as.list) # Return the list of stacks as a list of lists
}
parsed_input <- parse_input(real_input) # Parse the input file
# Produces a list where each item is a character vector of all the answers
# (letters) that appeared for *any* member of the group
answers_by_group <- map(parsed_input, reduce, union)
# The sum of the lengths of all vectors in `answers_by_group`
answer <- sum(lengths(answers_by_group))
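To make the map/reduce step concrete, here's what `reduce()` with `union()` does to a single group (illustrated with the third group from `test_input`, whose vectors come off the stack in reversed order):

# reduce() folds union() across the group's answer vectors, pair by pair
reduce(list(c('a', 'c'), c('a', 'b')), union) # "a" "c" "b"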
The stack implementation in `dequer` makes it convenient (and fast!) to append items to a stack of indeterminate size, and, once all the items are added to the stack, it can be converted to a list for further manipulation. The format of `parse_input(test_input)` can be seen below. It's a list of lists, where each sublist represents a 'group' and each character vector in that sublist represents all the answers marked 'yes' by an individual person.
List of 5
$ :List of 1
..$ : chr [1:3] "a" "b" "c"
$ :List of 3
..$ : chr "c"
..$ : chr "b"
..$ : chr "a"
$ :List of 2
..$ : chr [1:2] "a" "c"
..$ : chr [1:2] "a" "b"
$ :List of 4
..$ : chr "a"
..$ : chr "a"
..$ : chr "a"
..$ : chr "a"
$ :List of 1
..$ : chr "b"
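If the reversed order within each group looks odd, that's just the stack's last-in-first-out behavior, and it's harmless here since set operations don't care about order. It's also worth noting that `dequer` stacks are mutated in place, which is why calling `push()` inside the loop works without any reassignment. A minimal sketch of both behaviors:

s <- stack()
push(s, c('a', 'b')) # Pushes the whole vector as a single item, modifying `s` in place
push(s, 'c')
as.list(s)           # list("c", c("a", "b")) -- most recently pushed item comes first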
Part Two - Consensus!
Part two is a minor variation on part one: we need to find, per group, the number of questions to which *everyone* in that group answered 'yes'. Happily, this turns out to be quite simple, since we've already parsed the input data.
library(purrr) # For `map()` and `reduce()` (`intersect()` is base R)
parsed_input <- parse_input(real_input)
common_answers_by_group <- map(parsed_input, reduce, intersect) # The change
answer <- sum(lengths(common_answers_by_group))
You read that right: just swap out `union` for `intersect` in our map/reduce call, and now we have a list with one character vector per group, holding the letters that appeared in *every* group member's answers. Yay, set operations!
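As a quick sanity check, both parts can be verified against the worked example from the puzzle description using the `test_input` defined earlier; the puzzle gives expected counts of 11 and 6:

test_groups <- parse_input(test_input)
sum(lengths(map(test_groups, reduce, union)))     # 11: anyone in the group said "yes"
sum(lengths(map(test_groups, reduce, intersect))) # 6: everyone in the group said "yes"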
Wrap-up
The toughest part of this puzzle was the data ingestion, which is actually also true in a lot of real-world scenarios. Wrangling data into a usable format is often over half the battle. I'm pleased as punch to have discovered `dequer`, and plan to make use of it again in future puzzles. `purrr` really shone in this one as well, and while it would have been possible to use loops to get to the final result (see the sketch below), there's just something so syntactically pleasing about doing it in a single, readable line of code.
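For the curious, here's roughly what a loop-based version of part one might look like (a sketch only; `answer_loop` is a name I made up, and part two would just swap `union` for `intersect`):

answer_loop <- 0
for (group in parsed_input) {
  yes <- character(0) # Accumulate the union of this group's answers
  for (person in group) yes <- union(yes, person)
  answer_loop <- answer_loop + length(yes)
}
answer_loop # Should match `answer` from part one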
If you found a different solution (or spotted a mistake in one of mine), please drop me a line!