@khufkens
Last active March 13, 2025 08:51
Backup USDA SNOTEL
# This tiny script grabs all USDA SNOTEL data
# for safekeeping. SNOTEL monitors snow and water
# across the US, mostly in the west.
# https://www.nrcs.usda.gov/resources/data-and-reports/snow-and-water-interactive-map
#
# The downloaded data are the historical reports
# summarizing daily values of snow pack and additional
# metrics. Reports lag real-time values by a couple
# of weeks.
#
#---------------------------------------------------------#
# load required libraries
library(snotelr)
library(dplyr)

# list all sites
sites <- snotel_info()

# download loop
# could be parallelized, but this might
# trigger a rate limiter, so be careful
# with parApply() if you implement it
snotel_data <- lapply(
  sites$site_id,
  function(id) {
    tryCatch(
      expr = snotel_download(
        site_id = id,
        internal = TRUE
      ),
      error = function(e) {
        return(NULL)
      }
    )
  }
)
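# A polite sequential variant with an explicit pause between
# requests is one way to stay under a rate limiter without
# resorting to parallelism. This is a sketch: polite_lapply()
# is a hypothetical helper, and the 2-second delay is an
# arbitrary assumption, not a documented NRCS limit.

```r
# Hedged sketch: throttle requests with a fixed pause between
# iterations. The delay value is an assumption -- adjust it
# to whatever the server tolerates.
polite_lapply <- function(ids, fun, delay = 2) {
  lapply(ids, function(id) {
    result <- tryCatch(fun(id), error = function(e) NULL)
    Sys.sleep(delay)  # pause before the next request
    result
  })
}

# usage (commented out to avoid network calls):
# snotel_data <- polite_lapply(
#   sites$site_id,
#   function(id) snotel_download(site_id = id, internal = TRUE)
# )
```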
# One can flatten everything; this results
# in a table of 12M rows and 19 columns,
# ~2.5GB in size
snotel_data <- bind_rows(snotel_data)

# save as RDS with xz compression
# to reduce the file size
# (compressed ~50MB)
saveRDS(
  snotel_data,
  "snotel_backup.rds",
  compress = "xz"
)
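# After writing the backup it is worth checking that the file
# round-trips cleanly through readRDS(). The sketch below uses a
# small stand-in data frame and a temporary file; with the real
# backup you would point readRDS() at "snotel_backup.rds" instead.

```r
# Hedged sketch: verify an xz-compressed RDS backup round-trips.
# The demo data frame is a stand-in, not real SNOTEL output.
backup_file <- tempfile(fileext = ".rds")
demo <- data.frame(
  site_id = c(301, 302),
  snow_water_equivalent = c(10.2, 8.7)
)
saveRDS(demo, backup_file, compress = "xz")

restored <- readRDS(backup_file)
stopifnot(identical(demo, restored))  # data survives the round trip
file.size(backup_file)  # compressed size in bytes
```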