I've been enjoying watching as SearchResearchers tackle this Challenge. It's a bit trickier than I thought, so let me walk you through my (relatively) simple solution.
And I now have a spreadsheet that looks like this. Each column is one day (from Nov 1 to Nov 22), and each cell is the amount of snowfall on that day (in cm).
I didn't do anything fancy here--I just took a screenshot of the map, then laid the charts for each location on top of it. This is a visualization technique known as "small multiples" (that is, a series of small charts, all of the same type, repeated for each location)--but one of the things to know is that they all have to show the same thing on the same axes, or all bets are off.
Note that I DID have to set the maximum Y-axis values to all be 19. If left to their own devices, the charts will each pick a different Y-axis scale. I wanted them to be comparable, so I had to manually set them all to 19. If I were producing hundreds of charts, I would have written a piece of code to do this... but this data set was small enough that you could do it by hand.
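If you did want to script it, here's a minimal sketch of the idea in Python with matplotlib. The location names and snowfall numbers are made up for illustration--the point is just that every chart is forced onto the same 0-19 Y-axis so they're directly comparable:

```python
# A sketch of "small multiples" with a shared Y-axis.
# The locations and snowfall values below are hypothetical.
import matplotlib
matplotlib.use("Agg")  # render off-screen (no display needed)
import matplotlib.pyplot as plt

# Hypothetical daily snowfall (cm) for three locations, Nov 1-22
snow = {
    "Location A": [0, 2, 5, 1, 0, 3, 8, 12, 4, 0, 1, 0, 6, 9, 2, 0, 0, 3, 7, 11, 5, 2],
    "Location B": [1, 0, 0, 4, 7, 2, 0, 5, 9, 13, 6, 1, 0, 2, 4, 8, 10, 3, 0, 1, 2, 0],
    "Location C": [0, 1, 3, 0, 2, 6, 4, 0, 0, 7, 12, 15, 8, 3, 1, 0, 2, 5, 9, 4, 1, 0],
}

# One row of identical charts -- the "small multiples" layout
fig, axes = plt.subplots(1, len(snow), figsize=(12, 3), sharey=True)
for ax, (name, values) in zip(axes, snow.items()):
    ax.bar(range(1, len(values) + 1), values)
    ax.set_title(name)
    ax.set_ylim(0, 19)  # force the SAME max on every chart
    ax.set_xlabel("Day of November")
axes[0].set_ylabel("Snowfall (cm)")
fig.savefig("small_multiples.png")
```

The `sharey=True` plus the explicit `set_ylim(0, 19)` is the whole trick: without it, matplotlib autoscales each chart independently, which is exactly the "left to their own devices" problem.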
As I worked through this data collection and plotting task, the biggest challenge for me was just keeping track of where my data came from, and how it got transformed from one data source to another. As I worked, I kept backtracking and checking my data. (This is where having a buddy to work with is a great idea--looking at you, Ann & Debbie.)
I also really like your comments about wanting to create contour maps (or heat maps, although maybe we should call them cold maps).
If I get the chance, I'll write that up tomorrow.
Search/Sensemaking Lessons: In the meantime, it's useful to see that sometimes the simplest approach is best. (See my queries above.)
The hard part is knowing which of the many data sources will work out.
My approach is always to grab a small data set (3 or 4 cells) and then work through the whole process, from initial data download to visualization. It's a mistake to try to do the entire data manipulation process on the full data set at the beginning. Use a small sample and work UP to the full dataset. Trust me, I've wasted many hours wrangling data, only to find out, when I got to the end, that the whole thing was broken. Better to find that out on a small subset of the data than on the entire thing.
Because sometimes you'll get halfway through an analysis and realize that everything you're doing is wrong. Or that the data doesn't make sense. Or it's too full of errors, or missing data points.
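In code terms, that workflow looks something like this. Everything here is a toy example (the rows, the cleaning rule, the `pipeline` function are all hypothetical)--the idea is just to run the pipeline end-to-end on a few cells before trusting it with everything:

```python
# A toy illustration of the "small sample first" workflow.
# All of the data and the cleaning rule below are made up.

raw_rows = [
    ("2023-11-01", "0"),
    ("2023-11-02", "2.5"),
    ("2023-11-03", "n/a"),  # messy cells like this surface early on a sample
    ("2023-11-04", "4"),
    # ...imagine hundreds more rows here
]

def pipeline(rows):
    """Parse, clean, and summarize snowfall values (cm)."""
    values = []
    for date, cell in rows:
        try:
            values.append(float(cell))
        except ValueError:
            values.append(0.0)  # decide how to handle bad cells NOW, not at the end
    return {"days": len(values), "total_cm": sum(values)}

# First: run the WHOLE process on a tiny sample (3-4 cells)...
sample_result = pipeline(raw_rows[:4])
# ...check it by hand, and only then run it on the full dataset.
full_result = pipeline(raw_rows)
```

Running the sample first forces you to notice the "n/a" cell and pick a policy for it while checking four rows by hand is still easy--instead of discovering it after hours of wrangling the full dataset.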
More comments tomorrow... but this was my quick answer for today.
We'll be talking more about this in the future!
In the meantime, Keep Searching On!