downloader/0000755000175000017500000000000012140123355013714 5ustar sebastiansebastiandownloader/MD50000644000175000017500000000134712140123355014231 0ustar sebastiansebastian80e4cbda3b1a0336c6ba7d0f329b5dca *DESCRIPTION 4a0dc8018d8913ec0d00a5bfc9a2caea *NAMESPACE 14176d8c318701239b9b6759e01f324d *NEWS d4dc41f688c23edc6a3a2b75ca5c88b5 *R/download.r de21e78821c43a2c90f2cfb9fec2dc6d *R/downloader-package.r 940c51690b21750f42c30cae06ff83e6 *R/sha_url.r c07bfa9c90d8a83779bede4e7757403d *R/source_url.r 6faf10fe18d73cd9b260a08156583f02 *README.md 445bb9feecafd409496db71611f9a17c *inst/tests/test-download.r 9ae6c32f1be586c0e9444c3daa6e1059 *inst/tests/test-sha.r 2c54a48219328c36dcf8cf44576d9a08 *man/download.Rd e8d6670ce5c1569ee4165d4043d056df *man/downloader.Rd 948dc04ca3c272481874b3eebbac9a62 *man/sha_url.Rd e2633d9081627558442eed6e492461a0 *man/source_url.Rd 760b5bb5a5b44be51e89cde3827cb738 *tests/test-all.R downloader/NAMESPACE0000644000175000017500000000011612111275017015132 0ustar sebastiansebastianexport(download) export(sha_url) export(source_url) importFrom(digest,digest) downloader/NEWS0000644000175000017500000000170412140111211014401 0ustar sebastiansebastianVersion 0.3 -------------------------------------------------------------------------------- * `source_url()` function now checks the SHA-1 hash of the downloaded file. * Add `sha_url()` function, for finding the SHA-1 hash of a remote file. Version 0.2.2 -------------------------------------------------------------------------------- * Disable all network tests when running on CRAN, because the connection to the remote test website may not be reliable. Version 0.2.1 -------------------------------------------------------------------------------- * Change https redirection test to not run on CRAN because their Windows build machine has more stringent security settings. 
Version 0.2 -------------------------------------------------------------------------------- * Switched to using `Sys.which` to find external programs. * Added tests. * When using curl, follow redirects with http. (It already worked with https.) * Add `source_url` function. downloader/DESCRIPTION0000644000175000017500000000162212140123355015423 0ustar sebastiansebastianPackage: downloader Maintainer: Winston Chang Author: Winston Chang Version: 0.3 License: GPL-2 Title: A package for downloading files over http and https Description: This package provides a wrapper for the download.file function, making it possible to download files over https on Windows, Mac OS X, and other Unix-like platforms. The RCurl package provides this functionality (and much more) but can be difficult to install because it must be compiled with external dependencies. This package has no external dependencies, so it is much easier to install. URL: https://github.com/wch/downloader Imports: digest Suggests: testthat Collate: 'download.r' 'downloader-package.r' 'source_url.r' 'sha_url.r' Packaged: 2013-05-01 03:58:23 UTC; winston NeedsCompilation: no Repository: CRAN Date/Publication: 2013-05-01 07:23:57 downloader/man/0000755000175000017500000000000012111275017014470 5ustar sebastiansebastiandownloader/man/download.Rd0000644000175000017500000000301412104241600016555 0ustar sebastiansebastian\name{download} \alias{download} \title{Download a file, using http, https, or ftp} \usage{ download(url, ...) } \arguments{ \item{url}{The URL to download.} \item{...}{Other arguments that are passed to \code{\link{download.file}}.} } \description{ This is a wrapper for \code{\link{download.file}} and takes all the same arguments. The only difference is that, if the protocol is https, it changes some settings to make it work. How exactly the settings are changed differs among platforms. 
} \details{ This function also should follow http redirects on all platforms, which is something that does not happen by default when \code{curl} is used, as on Mac OS X. With Windows, it calls \code{setInternet2}, which tells R to use the \code{internet2.dll}. Then it downloads the file by calling \code{\link{download.file}} using the \code{"internal"} method. On other platforms, it will try to use \code{wget}, then \code{curl}, and then \code{lynx} to download the file. Typically, Linux platforms will have \code{wget} installed, and Mac OS X will have \code{curl}. Note that for many (perhaps most) types of files, you will want to use \code{mode="wb"} so that the file is downloaded in binary mode. } \examples{ \dontrun{ # Download the downloader source, in binary mode download("https://github.com/wch/downloader/zipball/master", "downloader.zip", mode = "wb") } } \seealso{ \code{\link{download.file}} for more information on the arguments that can be used with this function. } downloader/man/source_url.Rd0000644000175000017500000000411012140111140017121 0ustar sebastiansebastian\name{source_url} \alias{source_url} \title{Download an R file from a URL and source it} \usage{ source_url(url, sha = NULL, ..., prompt = TRUE, quiet = FALSE) } \arguments{ \item{url}{The URL to download.} \item{sha}{A SHA-1 hash of the file at the URL.} \item{prompt}{Prompt the user if no value for \code{sha} is provided.} \item{quiet}{If \code{FALSE} (the default), print out status messages about checking SHA.} \item{...}{Other arguments that are passed to \code{\link{source}()}.} } \description{ This will download a file and source it. Because it uses the \code{\link{download}()} function, it can handle https URLs. } \details{ By default, \code{source_url()} checks the SHA-1 hash of the file. If it differs from the expected value, it will throw an error. 
The default expectation is that a hash is provided; if not, \code{source_url()} will prompt the user, asking if they are sure they want to continue, unless \code{prompt=FALSE} is used. In other words, if you use \code{prompt=FALSE}, it will run the remote code without checking the hash, and without asking the user. The purpose of checking the hash is to ensure that the file has not changed. If a \code{source_url} command with a hash is posted in a public forum, then others who source the URL (with the hash) are guaranteed to run the same code every time. This means that the author doesn't need to worry about the security of the server hosting the file. It also means that the users don't have to worry about the file being replaced with a damaged or maliciously-modified version. To find the hash of a local file, use \code{\link{digest}()}. For a simple way to find the hash of a remote file, use \code{\link{sha_url}()}. } \examples{ \dontrun{ # Source a sample file downloader::source_url("https://gist.github.com/wch/dae7c106ee99fe1fdfe7/raw/db0c9bfe0de85d15c60b0b9bf22403c0f5e1fb15/test.r", sha="9b8ff5213e32a871d6cb95cce0bed35c53307f61") } } \seealso{ \code{\link{source}()} for more information on the arguments that can be used with this function. } downloader/man/downloader.Rd0000644000175000017500000000111012062207601017105 0ustar sebastiansebastian\docType{package} \name{downloader} \alias{downloader} \alias{downloader-package} \title{downloader: a package for making it easier to download files over https} \description{ This package provides a wrapper for the download.file function, making it possible to download files over https on Windows, Mac OS X, and other Unix-like platforms. The RCurl package provides this functionality (and much more) but can be difficult to install because it must be compiled with external dependencies. This package has no external dependencies, so it is much easier to install. 
} downloader/man/sha_url.Rd0000644000175000017500000000210012140111140016371 0ustar sebastiansebastian\name{sha_url} \alias{sha_url} \title{Download a file from a URL and find a SHA-1 hash of it} \usage{ sha_url(url, cmd = TRUE) } \arguments{ \item{url}{The URL of the file to find a hash of.} \item{cmd}{If \code{TRUE} (the default), print out a command for sourcing the URL with \code{\link{source_url}()}, including the hash.} } \description{ This will download a file and find a SHA-1 hash of it, using \code{\link{digest}()}. The primary purpose of this function is to provide an easy way to find the value of \code{sha} which can be passed to \code{\link{source_url}()}. } \examples{ \dontrun{ # Get the SHA hash of a file. It will print the text below and return # the hash as a string sha_url("https://gist.github.com/wch/dae7c106ee99fe1fdfe7/raw/db0c9bfe0de85d15c60b0b9bf22403c0f5e1fb15/test.r") # Command for sourcing the URL: # downloader::source_url("https://gist.github.com/wch/dae7c106ee99fe1fdfe7/raw/db0c9bfe0de85d15c60b0b9bf22403c0f5e1fb15/test.r", sha="9b8ff5213e32a871d6cb95cce0bed35c53307f61") # [1] "9b8ff5213e32a871d6cb95cce0bed35c53307f61" } } downloader/README.md0000644000175000017500000000130312062207546015200 0ustar sebastiansebastiandownloader ========== This package provides a wrapper for the download.file function, making it possible to download files over https on Windows, Mac OS X, and other Unix-like platforms. The RCurl package provides this functionality (and much more) but can be difficult to install because it must be compiled with external dependencies. This package has no external dependencies, so it is much easier to install. 
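The package also exports `sha_url()` and `source_url()` for sourcing remote R scripts with a SHA-1 check. A sketch of that workflow, using the sample gist URL and hash from the package's own documentation (requires a network connection):

```R
# Sourcing a remote script with hash verification.
# The URL and hash below are the sample values from ?source_url.
library(downloader)
url <- "https://gist.github.com/wch/dae7c106ee99fe1fdfe7/raw/db0c9bfe0de85d15c60b0b9bf22403c0f5e1fb15/test.r"

# Step 1: print a ready-to-paste source_url() command, including the hash
sha_url(url)

# Step 2: source the script; errors if the file's hash no longer matches
source_url(url, sha = "9b8ff5213e32a871d6cb95cce0bed35c53307f61")
```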
Example usage ============= This will download the source code for the downloader package: ```R # First install downloader from CRAN install.packages("downloader") library(downloader) download("https://github.com/wch/downloader/zipball/master", "downloader.zip", mode = "wb") ``` downloader/tests/0000755000175000017500000000000012062207546015066 5ustar sebastiansebastiandownloader/tests/test-all.R0000644000175000017500000000010212062207546016727 0ustar sebastiansebastianlibrary(testthat) library(downloader) test_package("downloader") downloader/inst/0000755000175000017500000000000012062207546014701 5ustar sebastiansebastiandownloader/inst/tests/0000755000175000017500000000000012111503352016030 5ustar sebastiansebastiandownloader/inst/tests/test-download.r0000644000175000017500000000265112106241035021004 0ustar sebastiansebastiancontext("download") # Download from a url, and return the contents of the file as a string download_result <- function(url) { tfile <- tempfile() download(url, tfile, mode = "wb") # Read the file tfile_fd <- file(tfile, "r") dl_text <- readLines(tfile_fd, warn = FALSE) dl_text <- paste(dl_text, collapse = "\n") close(tfile_fd) unlink(tfile) dl_text } # CRAN has intermittent problems with these tests, since they rely on a # particular website being accessible. 
This makes it run with devtools::test() # but not on CRAN if (Sys.getenv('NOT_CRAN') == "true") { test_that("downloading http and https works properly", { # Download http from httpbin.org result <- download_result("http://httpbin.org/ip") # Check that it has the string "origin" in the text expect_true(grepl("origin", result)) # Download https from httpbin.org result <- download_result("https://httpbin.org/ip") # Check that it has the string "origin" in the text expect_true(grepl("origin", result)) }) test_that("follows redirects", { # Download http redirect from httpbin.org result <- download_result("http://httpbin.org/redirect/3") # Check that it has the string "origin" in the text expect_true(grepl("origin", result)) # Download https redirect from httpbin.org result <- download_result("https://httpbin.org/redirect/3") # Check that it has the string "origin" in the text expect_true(grepl("origin", result)) }) } downloader/inst/tests/test-sha.r0000644000175000017500000000414512111503352017747 0ustar sebastiansebastiancontext("sha") test_that('sha_url', { # Create a temp file with simple R code temp_file <- tempfile() str <- 'a <<- a + 1' # Write str it to a file writeLines(str, sep = '', con = temp_file) url <- paste('file://', temp_file, sep = '') # Compare result from sha_url() to result directly from digest() expect_equal(sha_url(url), digest(str, algo = 'sha1', serialize = FALSE)) }) test_that('Check SHA hash with source_url', { # Create a temp file with simple R code temp_file <- tempfile() writeLines('a <<- a + 1', con = temp_file) url <- paste('file://', temp_file, sep = '') # Calculate the correct and incorrect SHA right_sha <- sha_url(url) wrong_sha <- '0000000000000000000000000000000000000000' # Counter - should be incremented by the code in the URL, which is a <<- a + 1 .GlobalEnv$a <- 0 # There are a total of 2x3x2=12 conditions, but we don't need to test them all # prompt=TRUE, right SHA, quiet=FALSE: print message expect_message(source_url(url, sha = 
right_sha), 'matches expected') expect_equal(a, 1) # prompt=TRUE, wrong SHA, quiet=FALSE: error expect_error(source_url(url, sha = wrong_sha)) expect_equal(a, 1) # prompt=TRUE, no SHA, quiet=FALSE: should prompt and respond to y/n # (no way to automatically test this) #source_url(url) # prompt=FALSE, no SHA, quiet=FALSE: do it, with message about not checking expect_message(source_url(url, prompt = FALSE), 'Not checking') expect_equal(a, 2) # prompt=FALSE, right SHA, quiet=FALSE: should just do it, with message about match expect_message(source_url(url, sha = right_sha, prompt = FALSE), 'matches expected') expect_equal(a, 3) # prompt=FALSE, wrong SHA, quiet=FALSE: should error expect_error(source_url(url, sha = wrong_sha, prompt = FALSE)) expect_equal(a, 3) # prompt=FALSE, no SHA, quiet=TRUE: should just do it, with no message about not checking source_url(url, prompt = FALSE, quiet = TRUE) expect_equal(a, 4) # prompt=FALSE, right SHA, quiet=TRUE: should just do it, with no message source_url(url, sha = right_sha, prompt = FALSE, quiet = TRUE) expect_equal(a, 5) }) downloader/R/0000755000175000017500000000000012126347716014132 5ustar sebastiansebastiandownloader/R/sha_url.r0000644000175000017500000000253612126347716015760 0ustar sebastiansebastian#' Download a file from a URL and find a SHA-1 hash of it #' #' This will download a file and find a SHA-1 hash of it, using #' \code{\link{digest}()}. The primary purpose of this function is to provide #' an easy way to find the value of \code{sha} which can be passed to #' \code{\link{source_url}()}. #' #' @param url The URL of the file to find a hash of. #' @param cmd If \code{TRUE} (the default), print out a command for sourcing the #' URL with \code{\link{source_url}()}, including the hash. #' #' @export #' @examples #' \dontrun{ #' # Get the SHA hash of a file. 
It will print the text below and return #' # the hash as a string #' sha_url("https://gist.github.com/wch/dae7c106ee99fe1fdfe7/raw/db0c9bfe0de85d15c60b0b9bf22403c0f5e1fb15/test.r") #' # Command for sourcing the URL: #' # downloader::source_url("https://gist.github.com/wch/dae7c106ee99fe1fdfe7/raw/db0c9bfe0de85d15c60b0b9bf22403c0f5e1fb15/test.r", sha="9b8ff5213e32a871d6cb95cce0bed35c53307f61") #' # [1] "9b8ff5213e32a871d6cb95cce0bed35c53307f61" #' } #' #' #' @importFrom digest digest sha_url <- function(url, cmd = TRUE) { temp_file <- tempfile() download(url, temp_file) on.exit(unlink(temp_file)) sha <- digest(file = temp_file, algo = 'sha1') if (cmd) { message('Command for sourcing the URL:\n', ' downloader::source_url("', url, '", sha="', sha, '")') } sha } downloader/R/download.r0000644000175000017500000000576012111277055016124 0ustar sebastiansebastian#' Download a file, using http, https, or ftp #' #' This is a wrapper for \code{\link{download.file}} and takes all the same #' arguments. The only difference is that, if the protocol is https, it changes #' some settings to make it work. How exactly the settings are changed #' differs among platforms. #' #' This function also should follow http redirects on all platforms, which is #' something that does not happen by default when \code{curl} is used, as on #' Mac OS X. #' #' With Windows, it calls \code{setInternet2}, which tells R to use the #' \code{internet2.dll}. Then it downloads the file by calling #' \code{\link{download.file}} using the \code{"internal"} method. #' #' On other platforms, it will try to use \code{wget}, then \code{curl}, and #' then \code{lynx} to download the file. Typically, Linux platforms will have #' \code{wget} installed, and Mac OS X will have \code{curl}. #' #' Note that for many (perhaps most) types of files, you will want to use #' \code{mode="wb"} so that the file is downloaded in binary mode. #' #' @param url The URL to download. #' @param ... 
Other arguments that are passed to \code{\link{download.file}}. #' #' @seealso \code{\link{download.file}} for more information on the arguments #' that can be used with this function. #' #' @export #' @examples #' \dontrun{ #' # Download the downloader source, in binary mode #' download("https://github.com/wch/downloader/zipball/master", #' "downloader.zip", mode = "wb") #' } #' download <- function(url, ...) { # First, check protocol. If http or https, check platform: if (grepl('^https?://', url)) { # If Windows, call setInternet2, then use download.file with defaults. if (.Platform$OS.type == "windows") { # If we directly use setInternet2, R CMD CHECK gives a Note on Mac/Linux seti2 <- `::`(utils, 'setInternet2') # Store initial settings, and restore on exit internet2_start <- seti2(NA) on.exit(suppressWarnings(seti2(internet2_start))) # Needed for https. Will get warning if setInternet2(FALSE) already run # and internet routines are used. But the warnings don't seem to matter. suppressWarnings(seti2(TRUE)) download.file(url, ...) } else { # If non-Windows, check for curl/wget/lynx, then call download.file with # appropriate method. if (nzchar(Sys.which("wget")[1])) { method <- "wget" } else if (nzchar(Sys.which("curl")[1])) { method <- "curl" # curl needs to add a -L option to follow redirects. # Save the original options and restore when we exit. orig_extra_options <- getOption("download.file.extra") on.exit(options(download.file.extra = orig_extra_options)) options(download.file.extra = paste("-L", orig_extra_options)) } else if (nzchar(Sys.which("lynx")[1])) { method <- "lynx" } else { stop("no download method found") } download.file(url, method = method, ...) } } else { download.file(url, ...) 
} } downloader/R/downloader-package.r0000644000175000017500000000103012111505403020015 0ustar sebastiansebastian#' downloader: a package for making it easier to download files over https #' #' This package provides a wrapper for the download.file function, #' making it possible to download files over https on Windows, Mac OS X, and #' other Unix-like platforms. The RCurl package provides this functionality #' (and much more) but can be difficult to install because it must be compiled #' with external dependencies. This package has no external dependencies, so #' it is much easier to install. #' #' @name downloader #' @docType package NULL downloader/R/source_url.r0000644000175000017500000000611212140111076016457 0ustar sebastiansebastian#' Download an R file from a URL and source it #' #' This will download a file and source it. Because it uses the #' \code{\link{download}()} function, it can handle https URLs. #' #' By default, \code{source_url()} checks the SHA-1 hash of the file. If it #' differs from the expected value, it will throw an error. The default #' expectation is that a hash is provided; if not, \code{source_url()} will #' prompt the user, asking if they are sure they want to continue, unless #' \code{prompt=FALSE} is used. In other words, if you use \code{prompt=FALSE}, #' it will run the remote code without checking the hash, and without asking #' the user. #' #' The purpose of checking the hash is to ensure that the file has not changed. #' If a \code{source_url} command with a hash is posted in a public forum, then #' others who source the URL (with the hash) are guaranteed to run the same #' code every time. This means that the author doesn't need to worry about the #' security of the server hosting the file. It also means that the users don't #' have to worry about the file being replaced with a damaged or #' maliciously-modified version. #' #' To find the hash of a local file, use \code{\link{digest}()}. 
For a simple #' way to find the hash of a remote file, use \code{\link{sha_url}()}. #' #' @param url The URL to download. #' @param sha A SHA-1 hash of the file at the URL. #' @param prompt Prompt the user if no value for \code{sha} is provided. #' @param quiet If \code{FALSE} (the default), print out status messages about #'   checking SHA. #' @param ... Other arguments that are passed to \code{\link{source}()}. #' #' @seealso \code{\link{source}()} for more information on the arguments #'   that can be used with this function. #' #' @export #' @examples #' \dontrun{ #' # Source a sample file #' downloader::source_url("https://gist.github.com/wch/dae7c106ee99fe1fdfe7/raw/db0c9bfe0de85d15c60b0b9bf22403c0f5e1fb15/test.r", #'   sha="9b8ff5213e32a871d6cb95cce0bed35c53307f61") #' } #' #' #' @importFrom digest digest source_url <- function(url, sha = NULL, ..., prompt = TRUE, quiet = FALSE) { if (prompt && (is.null(sha) || sha == '')) { resp <- readline(prompt = paste(sep = '', ' No SHA-1 hash specified for the file. The hash is needed to ensure that\n', ' the file at the URL has not changed. See ?source_url for information on\n', ' why this is useful. Are you sure you want to continue? [y/n] ')) sha <- NULL # Set to NULL for simpler check later on if (tolower(resp) != "y") { message("Quitting") return(invisible()) } } temp_file <- tempfile() download(url, temp_file) on.exit(unlink(temp_file)) if (!is.null(sha)) { url_sha <- digest(file = temp_file, algo = 'sha1') if (url_sha == sha) { if (!quiet) { message('Hash ', url_sha, ' matches expected value.') } } else { stop('Hash ', url_sha, ' does not match expected value!') } } else { if (!quiet) { message('Not checking SHA-1 of downloaded file.') } } source(temp_file, ...) }
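The verify-then-source pattern that `source_url()` implements can be sketched in a few lines of base R. This is a simplified illustration, not the package's code path: it checks a local file instead of a download, and uses base R's `tools::md5sum()` in place of the SHA-1 hash from the `digest` package, so the sketch needs no extra packages.

```r
# Sketch of verify-then-source (simplified; base R only).
# The real source_url() computes a SHA-1 with digest::digest();
# tools::md5sum() stands in here to avoid a package dependency.
verify_then_source <- function(path, expected_hash) {
  actual_hash <- unname(tools::md5sum(path))
  if (!identical(actual_hash, expected_hash)) {
    stop("Hash ", actual_hash, " does not match expected value!")
  }
  message("Hash ", actual_hash, " matches expected value.")
  source(path)
}

# Usage: write a tiny script, record its hash, then source it safely
script <- tempfile(fileext = ".r")
writeLines("x <<- 42", con = script)
good_hash <- unname(tools::md5sum(script))
verify_then_source(script, good_hash)   # sources the file; x is now 42
unlink(script)
```

Passing a stale or wrong hash to `verify_then_source()` stops with an error before any code runs, which is exactly the guarantee `source_url()` provides for remote files.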