Automate large data scraping
I am very new to R and to coding in general, and would appreciate any help with my issue.
I am trying to download a large data set from https://www.ncei.noaa.gov/access/search/data-search/global-summary-of-the-day.
I use the GSODR package to download the data, but no matter what period I request (3 months, 1 year, 5 years), it only downloads around 5,000 observations (of around 49 variables).
:(
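To see where the download stops short, it may help to count rows per year and per station in the object returned by get_GSOD(). The sketch below uses a small stand-in data.frame with the same YEAR and STNID columns that GSODR returns (the station IDs here are made-up placeholders, not real data):

```r
# stand-in for the data.frame returned by get_GSOD(); the real one
# has the same YEAR and STNID columns among its ~49 variables
gsod <- data.frame(
  STNID = rep(c("000000-11111", "000000-22222"), each = 4),
  YEAR  = rep(c(2017, 2017, 2018, 2018), times = 2)
)

table(gsod$YEAR)    # rows per year: a short year points to missing data
table(gsod$STNID)   # rows per station: a GSOD station has at most ~365/year
```

If one year or one station accounts for almost all rows, the problem is likely with the request or the station coverage rather than with the download itself.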
Another problem is that I have to automate the download. A friend told me I would need to set up a server and scrape the website directly (rather than use the GSODR package) to do that. I would appreciate any advice, since I'm very new to all of this.
Thank you very much for your time.
# install the development version of GSODR from GitHub
if (!require("remotes")) {
  install.packages("remotes", repos = "http://cran.rstudio.com/")
  library("remotes")
}
install_github("ropensci/GSODR")

library(GSODR)
library(dplyr)
library(reshape2)

# load the station metadata bundled with GSODR
load(system.file("extdata", "isd_history.rda", package = "GSODR"))

# create a data.frame of stations in Laos only
Oz <- subset(isd_history, COUNTRY_NAME == "LAOS")
Oz

# download daily climate data for Laos for the years 2017-2021
ie_2018 <- get_GSOD(years = 2017:2021, country = "Laos")
str(ie_2018)  # data() is only for package data sets; inspect the object instead
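A server and a custom scraper should not be necessary: get_GSOD() already fetches from NOAA, so one way to automate and make large downloads more robust is to request one year at a time and save each year to disk, skipping years that are already present so an interrupted run can be resumed. This is a sketch under those assumptions; gsod_file() and fetch_gsod_years() are hypothetical helper names, not part of GSODR:

```r
# hypothetical helper: on-disk file name for one year's data
gsod_file <- function(year, dir = ".") {
  file.path(dir, sprintf("gsod_laos_%d.csv", year))
}

# download one year at a time and write each to CSV; years whose
# file already exists are skipped, so a failed run can be re-run
fetch_gsod_years <- function(years, country = "Laos", dir = ".") {
  for (yr in years) {
    out <- gsod_file(yr, dir)
    if (file.exists(out)) next  # already downloaded
    dat <- tryCatch(
      GSODR::get_GSOD(years = yr, country = country),
      error = function(e) {
        message("year ", yr, " failed: ", conditionMessage(e))
        NULL
      }
    )
    if (!is.null(dat)) write.csv(dat, out, row.names = FALSE)
  }
}

# fetch_gsod_years(2017:2021)  # uncomment to run; downloads take a while
```

The call itself could then be scheduled (e.g. with cron or Windows Task Scheduler running `Rscript`) instead of scraping the website by hand.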
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
