devtools/inst/templates/packagename-package.r:

#' {{{ name }}}.
#'
#' @name {{{ name }}}
#' @docType package
NULL

devtools/inst/templates/revdep.R:

library("devtools")
revdep_check()
revdep_check_save_summary()
revdep_check_print_problems()

devtools/inst/templates/NEWS.md:

# {{{ package }}} {{{version}}}

* Added a `NEWS.md` file to track changes to the package.

devtools/inst/templates/codecov.yml:

comment: false

devtools/inst/templates/omni-README:

{{#Rmd}}
---{{#github}}
output: github_document
{{/github}}{{^github}}
output:
  md_document:
    variant: markdown_github
{{/github}}
---

```{r, echo = FALSE}
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>",
  fig.path = "README-"
)
```
{{/Rmd}}

# {{{ package }}}

The goal of {{{package}}} is to ...
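The `{{{ name }}}` and `{{{ package }}}` markers in the templates above are mustache placeholders; devtools fills them in when it copies a template into a new package (it renders templates with the whisker package). A minimal sketch of that rendering step for the package-documentation template, assuming whisker is installed:

```r
library(whisker)

# The packagename-package.r template shown above, as a single string.
template <- paste(
  "#' {{{ name }}}.",
  "#'",
  "#' @name {{{ name }}}",
  "#' @docType package",
  "NULL",
  sep = "\n"
)

# Triple-mustache tags ({{{ ... }}}) substitute values without HTML escaping.
cat(whisker.render(template, list(name = "mypackage")))
```

This prints the template with every `{{{ name }}}` replaced by `mypackage`; the `{{#Rmd}}`/`{{^Rmd}}` sections in omni-README are resolved the same way, driven by logical flags in the data list.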
{{#github}}
## Installation

You can install {{{ package }}} from github with:

{{#Rmd}}
```{r gh-installation, eval = FALSE}
{{/Rmd}}
{{^Rmd}}
``` r
{{/Rmd}}
# install.packages("devtools")
devtools::install_github("{{{username}}}/{{{repo}}}")
```
{{/github}}

## Example

This is a basic example which shows you how to solve a common problem:

{{#Rmd}}
```{r example}
{{/Rmd}}
{{^Rmd}}
``` r
{{/Rmd}}
## basic example code
```

devtools/inst/templates/cran-comments.md:

## Test environments

* local OS X install, R {{{ rversion }}}
* ubuntu 12.04 (on travis-ci), R {{{ rversion }}}
* win-builder (devel and release)

## R CMD check results

0 errors | 0 warnings | 1 note

* This is a new release.

## Reverse dependencies

This is a new release, so there are no reverse dependencies.

---

* I have run R CMD check on the NUMBER downstream dependencies. (Summary at ...).
* FAILURE SUMMARY
* All revdep maintainers were notified of the release on RELEASE DATE.

devtools/inst/templates/gpl-v3.md:

GNU General Public License
==========================

_Version 3, 29 June 2007_
_Copyright © 2007 Free Software Foundation, Inc. <http://fsf.org/>_

Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed.

## Preamble

The GNU General Public License is a free, copyleft license for software and other kinds of works.

The licenses for most software and other practical works are designed to take away your freedom to share and change the works. By contrast, the GNU General Public License is intended to guarantee your freedom to share and change all versions of a program--to make sure it remains free software for all its users. We, the Free Software Foundation, use the GNU General Public License for most of our software; it applies also to any other work released this way by its authors. You can apply it to your programs, too.

When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for them if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs, and that you know you can do these things.

To protect your rights, we need to prevent others from denying you these rights or asking you to surrender the rights. Therefore, you have certain responsibilities if you distribute copies of the software, or if you modify it: responsibilities to respect the freedom of others.

For example, if you distribute copies of such a program, whether gratis or for a fee, you must pass on to the recipients the same freedoms that you received. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights.

Developers that use the GNU GPL protect your rights with two steps: **(1)** assert copyright on the software, and **(2)** offer you this License giving you legal permission to copy, distribute and/or modify it.

For the developers' and authors' protection, the GPL clearly explains that there is no warranty for this free software. For both users' and authors' sake, the GPL requires that modified versions be marked as changed, so that their problems will not be attributed erroneously to authors of previous versions.

Some devices are designed to deny users access to install or run modified versions of the software inside them, although the manufacturer can do so. This is fundamentally incompatible with the aim of protecting users' freedom to change the software. The systematic pattern of such abuse occurs in the area of products for individuals to use, which is precisely where it is most unacceptable. Therefore, we have designed this version of the GPL to prohibit the practice for those products.

If such problems arise substantially in other domains, we stand ready to extend this provision to those domains in future versions of the GPL, as needed to protect the freedom of users.

Finally, every program is threatened constantly by software patents. States should not allow patents to restrict development and use of software on general-purpose computers, but in those that do, we wish to avoid the special danger that patents applied to a free program could make it effectively proprietary. To prevent this, the GPL assures that patents cannot be used to render the program non-free.

The precise terms and conditions for copying, distribution and modification follow.

## TERMS AND CONDITIONS

### 0. Definitions

“This License” refers to version 3 of the GNU General Public License.

“Copyright” also means copyright-like laws that apply to other kinds of works, such as semiconductor masks.

“The Program” refers to any copyrightable work licensed under this License. Each licensee is addressed as “you”. “Licensees” and “recipients” may be individuals or organizations.

To “modify” a work means to copy from or adapt all or part of the work in a fashion requiring copyright permission, other than the making of an exact copy. The resulting work is called a “modified version” of the earlier work or a work “based on” the earlier work.

A “covered work” means either the unmodified Program or a work based on the Program.

To “propagate” a work means to do anything with it that, without permission, would make you directly or secondarily liable for infringement under applicable copyright law, except executing it on a computer or modifying a private copy. Propagation includes copying, distribution (with or without modification), making available to the public, and in some countries other activities as well.

To “convey” a work means any kind of propagation that enables other parties to make or receive copies.
Mere interaction with a user through a computer network, with no transfer of a copy, is not conveying.

An interactive user interface displays “Appropriate Legal Notices” to the extent that it includes a convenient and prominently visible feature that **(1)** displays an appropriate copyright notice, and **(2)** tells the user that there is no warranty for the work (except to the extent that warranties are provided), that licensees may convey the work under this License, and how to view a copy of this License. If the interface presents a list of user commands or options, such as a menu, a prominent item in the list meets this criterion.

### 1. Source Code

The “source code” for a work means the preferred form of the work for making modifications to it. “Object code” means any non-source form of a work.

A “Standard Interface” means an interface that either is an official standard defined by a recognized standards body, or, in the case of interfaces specified for a particular programming language, one that is widely used among developers working in that language.

The “System Libraries” of an executable work include anything, other than the work as a whole, that **(a)** is included in the normal form of packaging a Major Component, but which is not part of that Major Component, and **(b)** serves only to enable use of the work with that Major Component, or to implement a Standard Interface for which an implementation is available to the public in source code form. A “Major Component”, in this context, means a major essential component (kernel, window system, and so on) of the specific operating system (if any) on which the executable work runs, or a compiler used to produce the work, or an object code interpreter used to run it.

The “Corresponding Source” for a work in object code form means all the source code needed to generate, install, and (for an executable work) run the object code and to modify the work, including scripts to control those activities.
However, it does not include the work's System Libraries, or general-purpose tools or generally available free programs which are used unmodified in performing those activities but which are not part of the work. For example, Corresponding Source includes interface definition files associated with source files for the work, and the source code for shared libraries and dynamically linked subprograms that the work is specifically designed to require, such as by intimate data communication or control flow between those subprograms and other parts of the work.

The Corresponding Source need not include anything that users can regenerate automatically from other parts of the Corresponding Source.

The Corresponding Source for a work in source code form is that same work.

### 2. Basic Permissions

All rights granted under this License are granted for the term of copyright on the Program, and are irrevocable provided the stated conditions are met. This License explicitly affirms your unlimited permission to run the unmodified Program. The output from running a covered work is covered by this License only if the output, given its content, constitutes a covered work. This License acknowledges your rights of fair use or other equivalent, as provided by copyright law.

You may make, run and propagate covered works that you do not convey, without conditions so long as your license otherwise remains in force. You may convey covered works to others for the sole purpose of having them make modifications exclusively for you, or provide you with facilities for running those works, provided that you comply with the terms of this License in conveying all material for which you do not control copyright. Those thus making or running the covered works for you must do so exclusively on your behalf, under your direction and control, on terms that prohibit them from making any copies of your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under the conditions stated below. Sublicensing is not allowed; section 10 makes it unnecessary.

### 3. Protecting Users' Legal Rights From Anti-Circumvention Law

No covered work shall be deemed part of an effective technological measure under any applicable law fulfilling obligations under article 11 of the WIPO copyright treaty adopted on 20 December 1996, or similar laws prohibiting or restricting circumvention of such measures.

When you convey a covered work, you waive any legal power to forbid circumvention of technological measures to the extent such circumvention is effected by exercising rights under this License with respect to the covered work, and you disclaim any intention to limit operation or modification of the work as a means of enforcing, against the work's users, your or third parties' legal rights to forbid circumvention of technological measures.

### 4. Conveying Verbatim Copies

You may convey verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice; keep intact all notices stating that this License and any non-permissive terms added in accord with section 7 apply to the code; keep intact all notices of the absence of any warranty; and give all recipients a copy of this License along with the Program.

You may charge any price or no price for each copy that you convey, and you may offer support or warranty protection for a fee.

### 5. Conveying Modified Source Versions

You may convey a work based on the Program, or the modifications to produce it from the Program, in the form of source code under the terms of section 4, provided that you also meet all of these conditions:

* **a)** The work must carry prominent notices stating that you modified it, and giving a relevant date.
* **b)** The work must carry prominent notices stating that it is released under this License and any conditions added under section 7. This requirement modifies the requirement in section 4 to “keep intact all notices”.
* **c)** You must license the entire work, as a whole, under this License to anyone who comes into possession of a copy. This License will therefore apply, along with any applicable section 7 additional terms, to the whole of the work, and all its parts, regardless of how they are packaged. This License gives no permission to license the work in any other way, but it does not invalidate such permission if you have separately received it.
* **d)** If the work has interactive user interfaces, each must display Appropriate Legal Notices; however, if the Program has interactive interfaces that do not display Appropriate Legal Notices, your work need not make them do so.

A compilation of a covered work with other separate and independent works, which are not by their nature extensions of the covered work, and which are not combined with it such as to form a larger program, in or on a volume of a storage or distribution medium, is called an “aggregate” if the compilation and its resulting copyright are not used to limit the access or legal rights of the compilation's users beyond what the individual works permit. Inclusion of a covered work in an aggregate does not cause this License to apply to the other parts of the aggregate.

### 6. Conveying Non-Source Forms

You may convey a covered work in object code form under the terms of sections 4 and 5, provided that you also convey the machine-readable Corresponding Source under the terms of this License, in one of these ways:

* **a)** Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by the Corresponding Source fixed on a durable physical medium customarily used for software interchange.
* **b)** Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by a written offer, valid for at least three years and valid for as long as you offer spare parts or customer support for that product model, to give anyone who possesses the object code either **(1)** a copy of the Corresponding Source for all the software in the product that is covered by this License, on a durable physical medium customarily used for software interchange, for a price no more than your reasonable cost of physically performing this conveying of source, or **(2)** access to copy the Corresponding Source from a network server at no charge.
* **c)** Convey individual copies of the object code with a copy of the written offer to provide the Corresponding Source. This alternative is allowed only occasionally and noncommercially, and only if you received the object code with such an offer, in accord with subsection 6b.
* **d)** Convey the object code by offering access from a designated place (gratis or for a charge), and offer equivalent access to the Corresponding Source in the same way through the same place at no further charge. You need not require recipients to copy the Corresponding Source along with the object code. If the place to copy the object code is a network server, the Corresponding Source may be on a different server (operated by you or a third party) that supports equivalent copying facilities, provided you maintain clear directions next to the object code saying where to find the Corresponding Source. Regardless of what server hosts the Corresponding Source, you remain obligated to ensure that it is available for as long as needed to satisfy these requirements.
* **e)** Convey the object code using peer-to-peer transmission, provided you inform other peers where the object code and Corresponding Source of the work are being offered to the general public at no charge under subsection 6d.

A separable portion of the object code, whose source code is excluded from the Corresponding Source as a System Library, need not be included in conveying the object code work.

A “User Product” is either **(1)** a “consumer product”, which means any tangible personal property which is normally used for personal, family, or household purposes, or **(2)** anything designed or sold for incorporation into a dwelling. In determining whether a product is a consumer product, doubtful cases shall be resolved in favor of coverage. For a particular product received by a particular user, “normally used” refers to a typical or common use of that class of product, regardless of the status of the particular user or of the way in which the particular user actually uses, or expects or is expected to use, the product. A product is a consumer product regardless of whether the product has substantial commercial, industrial or non-consumer uses, unless such uses represent the only significant mode of use of the product.

“Installation Information” for a User Product means any methods, procedures, authorization keys, or other information required to install and execute modified versions of a covered work in that User Product from a modified version of its Corresponding Source. The information must suffice to ensure that the continued functioning of the modified object code is in no case prevented or interfered with solely because modification has been made.

If you convey an object code work under this section in, or with, or specifically for use in, a User Product, and the conveying occurs as part of a transaction in which the right of possession and use of the User Product is transferred to the recipient in perpetuity or for a fixed term (regardless of how the transaction is characterized), the Corresponding Source conveyed under this section must be accompanied by the Installation Information.
But this requirement does not apply if neither you nor any third party retains the ability to install modified object code on the User Product (for example, the work has been installed in ROM).

The requirement to provide Installation Information does not include a requirement to continue to provide support service, warranty, or updates for a work that has been modified or installed by the recipient, or for the User Product in which it has been modified or installed. Access to a network may be denied when the modification itself materially and adversely affects the operation of the network or violates the rules and protocols for communication across the network.

Corresponding Source conveyed, and Installation Information provided, in accord with this section must be in a format that is publicly documented (and with an implementation available to the public in source code form), and must require no special password or key for unpacking, reading or copying.

### 7. Additional Terms

“Additional permissions” are terms that supplement the terms of this License by making exceptions from one or more of its conditions. Additional permissions that are applicable to the entire Program shall be treated as though they were included in this License, to the extent that they are valid under applicable law. If additional permissions apply only to part of the Program, that part may be used separately under those permissions, but the entire Program remains governed by this License without regard to the additional permissions.

When you convey a copy of a covered work, you may at your option remove any additional permissions from that copy, or from any part of it. (Additional permissions may be written to require their own removal in certain cases when you modify the work.) You may place additional permissions on material, added by you to a covered work, for which you have or can give appropriate copyright permission.

Notwithstanding any other provision of this License, for material you add to a covered work, you may (if authorized by the copyright holders of that material) supplement the terms of this License with terms:

* **a)** Disclaiming warranty or limiting liability differently from the terms of sections 15 and 16 of this License; or
* **b)** Requiring preservation of specified reasonable legal notices or author attributions in that material or in the Appropriate Legal Notices displayed by works containing it; or
* **c)** Prohibiting misrepresentation of the origin of that material, or requiring that modified versions of such material be marked in reasonable ways as different from the original version; or
* **d)** Limiting the use for publicity purposes of names of licensors or authors of the material; or
* **e)** Declining to grant rights under trademark law for use of some trade names, trademarks, or service marks; or
* **f)** Requiring indemnification of licensors and authors of that material by anyone who conveys the material (or modified versions of it) with contractual assumptions of liability to the recipient, for any liability that these contractual assumptions directly impose on those licensors and authors.

All other non-permissive additional terms are considered “further restrictions” within the meaning of section 10. If the Program as you received it, or any part of it, contains a notice stating that it is governed by this License along with a term that is a further restriction, you may remove that term. If a license document contains a further restriction but permits relicensing or conveying under this License, you may add to a covered work material governed by the terms of that license document, provided that the further restriction does not survive such relicensing or conveying.

If you add terms to a covered work in accord with this section, you must place, in the relevant source files, a statement of the additional terms that apply to those files, or a notice indicating where to find the applicable terms.

Additional terms, permissive or non-permissive, may be stated in the form of a separately written license, or stated as exceptions; the above requirements apply either way.

### 8. Termination

You may not propagate or modify a covered work except as expressly provided under this License. Any attempt otherwise to propagate or modify it is void, and will automatically terminate your rights under this License (including any patent licenses granted under the third paragraph of section 11).

However, if you cease all violation of this License, then your license from a particular copyright holder is reinstated **(a)** provisionally, unless and until the copyright holder explicitly and finally terminates your license, and **(b)** permanently, if the copyright holder fails to notify you of the violation by some reasonable means prior to 60 days after the cessation.

Moreover, your license from a particular copyright holder is reinstated permanently if the copyright holder notifies you of the violation by some reasonable means, this is the first time you have received notice of violation of this License (for any work) from that copyright holder, and you cure the violation prior to 30 days after your receipt of the notice.

Termination of your rights under this section does not terminate the licenses of parties who have received copies or rights from you under this License. If your rights have been terminated and not permanently reinstated, you do not qualify to receive new licenses for the same material under section 10.

### 9. Acceptance Not Required for Having Copies

You are not required to accept this License in order to receive or run a copy of the Program.
Ancillary propagation of a covered work occurring solely as a consequence of using peer-to-peer transmission to receive a copy likewise does not require acceptance. However, nothing other than this License grants you permission to propagate or modify any covered work. These actions infringe copyright if you do not accept this License. Therefore, by modifying or propagating a covered work, you indicate your acceptance of this License to do so.

### 10. Automatic Licensing of Downstream Recipients

Each time you convey a covered work, the recipient automatically receives a license from the original licensors, to run, modify and propagate that work, subject to this License. You are not responsible for enforcing compliance by third parties with this License.

An “entity transaction” is a transaction transferring control of an organization, or substantially all assets of one, or subdividing an organization, or merging organizations. If propagation of a covered work results from an entity transaction, each party to that transaction who receives a copy of the work also receives whatever licenses to the work the party's predecessor in interest had or could give under the previous paragraph, plus a right to possession of the Corresponding Source of the work from the predecessor in interest, if the predecessor has it or can get it with reasonable efforts.

You may not impose any further restrictions on the exercise of the rights granted or affirmed under this License. For example, you may not impose a license fee, royalty, or other charge for exercise of rights granted under this License, and you may not initiate litigation (including a cross-claim or counterclaim in a lawsuit) alleging that any patent claim is infringed by making, using, selling, offering for sale, or importing the Program or any portion of it.

### 11. Patents

A “contributor” is a copyright holder who authorizes use under this License of the Program or a work on which the Program is based.
The work thus licensed is called the contributor's “contributor version”.

A contributor's “essential patent claims” are all patent claims owned or controlled by the contributor, whether already acquired or hereafter acquired, that would be infringed by some manner, permitted by this License, of making, using, or selling its contributor version, but do not include claims that would be infringed only as a consequence of further modification of the contributor version. For purposes of this definition, “control” includes the right to grant patent sublicenses in a manner consistent with the requirements of this License.

Each contributor grants you a non-exclusive, worldwide, royalty-free patent license under the contributor's essential patent claims, to make, use, sell, offer for sale, import and otherwise run, modify and propagate the contents of its contributor version.

In the following three paragraphs, a “patent license” is any express agreement or commitment, however denominated, not to enforce a patent (such as an express permission to practice a patent or covenant not to sue for patent infringement). To “grant” such a patent license to a party means to make such an agreement or commitment not to enforce a patent against the party.

If you convey a covered work, knowingly relying on a patent license, and the Corresponding Source of the work is not available for anyone to copy, free of charge and under the terms of this License, through a publicly available network server or other readily accessible means, then you must either **(1)** cause the Corresponding Source to be so available, or **(2)** arrange to deprive yourself of the benefit of the patent license for this particular work, or **(3)** arrange, in a manner consistent with the requirements of this License, to extend the patent license to downstream recipients.
“Knowingly relying” means you have actual knowledge that, but for the patent license, your conveying the covered work in a country, or your recipient's use of the covered work in a country, would infringe one or more identifiable patents in that country that you have reason to believe are valid.

If, pursuant to or in connection with a single transaction or arrangement, you convey, or propagate by procuring conveyance of, a covered work, and grant a patent license to some of the parties receiving the covered work authorizing them to use, propagate, modify or convey a specific copy of the covered work, then the patent license you grant is automatically extended to all recipients of the covered work and works based on it.

A patent license is “discriminatory” if it does not include within the scope of its coverage, prohibits the exercise of, or is conditioned on the non-exercise of one or more of the rights that are specifically granted under this License. You may not convey a covered work if you are a party to an arrangement with a third party that is in the business of distributing software, under which you make payment to the third party based on the extent of your activity of conveying the work, and under which the third party grants, to any of the parties who would receive the covered work from you, a discriminatory patent license **(a)** in connection with copies of the covered work conveyed by you (or copies made from those copies), or **(b)** primarily for and in connection with specific products or compilations that contain the covered work, unless you entered into that arrangement, or that patent license was granted, prior to 28 March 2007.

Nothing in this License shall be construed as excluding or limiting any implied license or other defenses to infringement that may otherwise be available to you under applicable patent law.

### 12. No Surrender of Others' Freedom

If conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot convey a covered work so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not convey it at all. For example, if you agree to terms that obligate you to collect a royalty for further conveying from those to whom you convey the Program, the only way you could satisfy both those terms and this License would be to refrain entirely from conveying the Program.

### 13. Use with the GNU Affero General Public License

Notwithstanding any other provision of this License, you have permission to link or combine any covered work with a work licensed under version 3 of the GNU Affero General Public License into a single combined work, and to convey the resulting work. The terms of this License will continue to apply to the part which is the covered work, but the special requirements of the GNU Affero General Public License, section 13, concerning interaction through a network will apply to the combination as such.

### 14. Revised Versions of this License

The Free Software Foundation may publish revised and/or new versions of the GNU General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns.

Each version is given a distinguishing version number. If the Program specifies that a certain numbered version of the GNU General Public License “or any later version” applies to it, you have the option of following the terms and conditions either of that numbered version or of any later version published by the Free Software Foundation.
If the Program does not specify a version number of the GNU General Public License, you may choose any version ever published by the Free Software Foundation. If the Program specifies that a proxy can decide which future versions of the GNU General Public License can be used, that proxy's public statement of acceptance of a version permanently authorizes you to choose that version for the Program. Later license versions may give you additional or different permissions. However, no additional obligations are imposed on any author or copyright holder as a result of your choosing to follow a later version. ### 15. Disclaimer of Warranty THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. ### 16. Limitation of Liability IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. ### 17. 
Interpretation of Sections 15 and 16 If the disclaimer of warranty and limitation of liability provided above cannot be given local legal effect according to their terms, reviewing courts shall apply local law that most closely approximates an absolute waiver of all civil liability in connection with the Program, unless a warranty or assumption of liability accompanies a copy of the Program in return for a fee. _END OF TERMS AND CONDITIONS_ ## How to Apply These Terms to Your New Programs If you develop a new program, and you want it to be of the greatest possible use to the public, the best way to achieve this is to make it free software which everyone can redistribute and change under these terms. To do so, attach the following notices to the program. It is safest to attach them to the start of each source file to most effectively state the exclusion of warranty; and each file should have at least the “copyright” line and a pointer to where the full notice is found. Copyright (C) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . Also add information on how to contact you by electronic and paper mail. If the program does terminal interaction, make it output a short notice like this when it starts in an interactive mode: Copyright (C) This program comes with ABSOLUTELY NO WARRANTY; for details type 'show w'. This is free software, and you are welcome to redistribute it under certain conditions; type 'show c' for details. 
The hypothetical commands `show w` and `show c` should show the appropriate parts of the General Public License. Of course, your program's commands might be different; for a GUI interface, you would use an “about box”. You should also get your employer (if you work as a programmer) or school, if any, to sign a “copyright disclaimer” for the program, if necessary. For more information on this, and how to apply and follow the GNU GPL, see <>. The GNU General Public License does not permit incorporating your program into proprietary programs. If your program is a subroutine library, you may consider it more useful to permit linking proprietary applications with the library. If this is what you want to do, use the GNU Lesser General Public License instead of this License. But first, please read <>. devtools/inst/templates/test-example.R0000644000176200001440000000013613171407310017614 0ustar liggesuserscontext("{{{ test_name }}}") test_that("multiplication works", { expect_equal(2 * 2, 4) }) devtools/inst/templates/template.Rproj0000644000176200001440000000047012416621515017721 0ustar liggesusersVersion: 1.0 RestoreWorkspace: No SaveWorkspace: No AlwaysSaveHistory: Default EnableCodeIndexing: Yes Encoding: UTF-8 AutoAppendNewline: Yes StripTrailingWhitespace: Yes BuildType: Package PackageUseDevtools: Yes PackageInstallArgs: --no-multiarch --with-keep.source PackageRoxygenize: rd,collate,namespace devtools/inst/templates/travis.yml0000644000176200001440000000017212724305435017124 0ustar liggesusers# R for travis: see documentation at https://docs.travis-ci.com/user/languages/r language: R sudo: false cache: packages devtools/inst/templates/mit-license.txt0000644000176200001440000000007212724305435020042 0ustar liggesusersYEAR: {{{year}}} COPYRIGHT HOLDER: {{{copyright_holder}}} devtools/inst/templates/testthat.R0000644000176200001440000000010412416621515017045 0ustar liggesuserslibrary(testthat) library({{{ name }}}) test_check("{{{ name }}}") 
devtools/inst/templates/CONDUCT.md0000644000176200001440000000255312656131112016510 0ustar liggesusers# Contributor Code of Conduct As contributors and maintainers of this project, we pledge to respect all people who contribute through reporting issues, posting feature requests, updating documentation, submitting pull requests or patches, and other activities. We are committed to making participation in this project a harassment-free experience for everyone, regardless of level of experience, gender, gender identity and expression, sexual orientation, disability, personal appearance, body size, race, ethnicity, age, or religion. Examples of unacceptable behavior by participants include the use of sexual language or imagery, derogatory comments or personal attacks, trolling, public or private harassment, insults, or other unprofessional conduct. Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct. Project maintainers who do not follow the Code of Conduct may be removed from the project team. Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by opening an issue or contacting one or more of the project maintainers. 
This Code of Conduct is adapted from the Contributor Covenant (http://contributor-covenant.org), version 1.0.0, available at http://contributor-covenant.org/version/1/0/0/ devtools/inst/templates/readme-rmd-pre-commit.sh0000644000176200001440000000064712656131112021514 0ustar liggesusers#!/bin/bash README=($(git diff --cached --name-only | grep -Ei '^README\.[R]?md$')) MSG="use 'git commit --no-verify' to override this check" if [[ ${#README[@]} == 0 ]]; then exit 0 fi if [[ README.Rmd -nt README.md ]]; then echo -e "README.md is out of date; please re-knit README.Rmd\n$MSG" exit 1 elif [[ ${#README[@]} -lt 2 ]]; then echo -e "README.Rmd and README.md should both be staged\n$MSG" exit 1 fi devtools/inst/templates/appveyor.yml0000644000176200001440000000153413171407310017454 0ustar liggesusers# DO NOT CHANGE the "init" and "install" sections below # Download script file from GitHub init: ps: | $ErrorActionPreference = "Stop" Invoke-WebRequest http://raw.github.com/krlmlr/r-appveyor/master/scripts/appveyor-tool.ps1 -OutFile "..\appveyor-tool.ps1" Import-Module '..\appveyor-tool.ps1' install: ps: Bootstrap cache: - C:\RLibrary # Adapt as necessary starting from here build_script: - travis-tool.sh install_deps test_script: - travis-tool.sh run_tests on_failure: - 7z a failure.zip *.Rcheck\* - appveyor PushArtifact failure.zip artifacts: - path: '*.Rcheck\**\*.log' name: Logs - path: '*.Rcheck\**\*.out' name: Logs - path: '*.Rcheck\**\*.fail' name: Logs - path: '*.Rcheck\**\*.Rout' name: Logs - path: '\*_*.tar.gz' name: Bits - path: '\*_*.zip' name: Bits devtools/inst/doc/0000755000176200001440000000000013200656425013636 5ustar liggesusersdevtools/inst/doc/dependencies.html0000644000176200001440000002770213200656425017162 0ustar liggesusers Devtools dependencies

Devtools dependencies

Jim Hester, Hadley Wickham

2017-11-08

Package remotes

Devtools version 1.9 supports package dependency installation for packages not yet in a standard package repository such as CRAN or Bioconductor.

You can mark any regular dependency defined in the Depends, Imports, Suggests or Enhances fields as being installed from a remote location by adding the remote location to Remotes in your DESCRIPTION file. This will cause devtools to download and install them prior to installing your package (so they won’t be installed from CRAN).

The remote dependencies specified in Remotes should be described in the following form.

Remotes: [type::]<Repository>, [type2::]<Repository2>

The type is an optional parameter. If the type is missing the default is to install from GitHub. Additional remote dependencies should be separated by commas, just like normal dependencies elsewhere in the DESCRIPTION file.
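To make this concrete, here is a minimal sketch of a DESCRIPTION file that combines a regular dependency with a matching Remotes entry (the package name mypackage is a placeholder; hadley/testthat is the GitHub example used elsewhere in this vignette):

```yaml
Package: mypackage
Version: 0.1.0
Imports:
    testthat
Remotes:
    hadley/testthat
```

With this in place, devtools resolves testthat from the hadley/testthat GitHub repository rather than from CRAN when installing mypackage's dependencies.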

Github

Because github is the most commonly used unofficial package distribution in R, it’s the default:

Remotes: hadley/testthat

You can also specify a specific hash, tag, or pull request (using the same syntax as install_github()) if you want a particular commit. Otherwise the latest commit on the master branch is used.
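The same reference syntax also works interactively; each of the Remotes entries below has a direct devtools::install_github() equivalent (shown for illustration only and not run here, since installing requires network access):

```r
# Installing particular refs interactively (not run):
devtools::install_github("hadley/httr@v0.4")              # a tag
devtools::install_github("klutometis/roxygen#142")        # a pull request
devtools::install_github("hadley/testthat@c67018fa4970")  # a specific commit
```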

Remotes: hadley/httr@v0.4,
  klutometis/roxygen#142,
  hadley/testthat@c67018fa4970

A type of ‘github’ can be specified, but is not required.

Remotes: github::hadley/ggplot2

Other sources

All of the currently supported install sources are available; see the ‘See Also’ section in ?install for a complete list.

# Git
Remotes: git::https://github.com/hadley/ggplot2.git

# Bitbucket
Remotes: bitbucket::sulab/mygene.r@default, dannavarro/lsr-package

# Bioconductor
Remotes: bioc::3.3/SummarizedExperiment#117513, bioc::release/Biobase

# SVN
Remotes: svn::https://github.com/hadley/stringr

# URL
Remotes: url::https://github.com/hadley/stringr/archive/master.zip

# Local
Remotes: local::/pkgs/testthat

# Gitorious
Remotes: gitorious::r-mpc-package/r-mpc-package

CRAN submission

When you submit your package to CRAN, all of its dependencies must also be available on CRAN. For this reason, release() will warn you if you try to release a package with a Remotes field.

devtools/inst/doc/dependencies.Rmd0000644000176200001440000000502213200623656016727 0ustar liggesusers--- title: "Devtools dependencies" author: "Jim Hester, Hadley Wickham" date: "`r Sys.Date()`" output: rmarkdown::html_vignette vignette: > %\VignetteIndexEntry{Devtools dependencies} %\VignetteEngine{knitr::rmarkdown} %\VignetteEncoding{UTF-8} --- # Package remotes Devtools version 1.9 supports package dependency installation for packages not yet in a standard package repository such as [CRAN](https://cran.r-project.org) or [Bioconductor](http://bioconductor.org). You can mark any regular dependency defined in the `Depends`, `Imports`, `Suggests` or `Enhances` fields as being installed from a remote location by adding the remote location to `Remotes` in your `DESCRIPTION` file. This will cause devtools to download and install them prior to installing your package (so they won't be installed from CRAN). The remote dependencies specified in `Remotes` should be described in the following form. ``` Remotes: [type::]<Repository>, [type2::]<Repository2> ``` The `type` is an optional parameter. If the type is missing the default is to install from GitHub. Additional remote dependencies should be separated by commas, just like normal dependencies elsewhere in the `DESCRIPTION` file. ### Github Because github is the most commonly used unofficial package distribution in R, it's the default: ```yaml Remotes: hadley/testthat ``` You can also specify a specific hash, tag, or pull request (using the same syntax as `install_github()`) if you want a particular commit. Otherwise the latest commit on the master branch is used. ```yaml Remotes: hadley/httr@v0.4, klutometis/roxygen#142, hadley/testthat@c67018fa4970 ``` A type of 'github' can be specified, but is not required. ```yaml Remotes: github::hadley/ggplot2 ``` ### Other sources All of the currently supported install sources are available; see the 'See Also' section in `?install` for a complete list. 
```yaml # Git Remotes: git::https://github.com/hadley/ggplot2.git # Bitbucket Remotes: bitbucket::sulab/mygene.r@default, dannavarro/lsr-package # Bioconductor Remotes: bioc::3.3/SummarizedExperiment#117513, bioc::release/Biobase # SVN Remotes: svn::https://github.com/hadley/stringr # URL Remotes: url::https://github.com/hadley/stringr/archive/master.zip # Local Remotes: local::/pkgs/testthat # Gitorious Remotes: gitorious::r-mpc-package/r-mpc-package ``` ### CRAN submission When you submit your package to CRAN, all of its dependencies must also be available on CRAN. For this reason, `release()` will warn you if you try to release a package with a `Remotes` field. devtools/tests/0000755000176200001440000000000013200623656013256 5ustar liggesusersdevtools/tests/has-devel.R0000644000176200001440000000003713200623656015251 0ustar liggesuserslibrary(devtools) has_devel() devtools/tests/test-that.R0000644000176200001440000000005112656131112015305 0ustar liggesuserslibrary(testthat) test_check("devtools") devtools/tests/testthat/0000755000176200001440000000000013200656427015120 5ustar liggesusersdevtools/tests/testthat/test-s4-sort.r0000644000176200001440000000200213200623656017563 0ustar liggesuserscontext("s4-sort") suppressMessages(load_all("testS4sort")) classes <- methods::getClasses(ns_env('testS4sort')) test_that("Example classes are not topologically sorted", { ## there are some superclasses of the first class ## later in the list superclasses <- extends(getClass(classes[1]))[-1] expect_true(any(superclasses %in% classes[-1])) }) test_that("topological sorting s4 classes", { sorted_classes <- sort_s4classes(classes, 'testS4sort') for (idx in seq_along(classes)) { ## for each class in the sorted list ## all its superclasses are before superclasses <- extends(getClass(sorted_classes[idx])) expect_true(all(superclasses %in% head(sorted_classes, idx))) } }) test_that("sorting extreme cases", { ## no classes to sort classes <- vector('character', 0) 
expect_identical(classes, sort_s4classes(classes, 'testS4sort')) ## only one class to sort classes <- "A" expect_identical(classes, sort_s4classes(classes, 'testS4sort')) }) # cleanup unload('testS4sort') devtools/tests/testthat/testMissingNsObject/0000755000176200001440000000000012634340542021057 5ustar liggesusersdevtools/tests/testthat/testMissingNsObject/NAMESPACE0000644000176200001440000000002312416621515022271 0ustar liggesusersexport(a) export(b)devtools/tests/testthat/testMissingNsObject/R/0000755000176200001440000000000012634340206021255 5ustar liggesusersdevtools/tests/testthat/testMissingNsObject/R/a.r0000644000176200001440000000000712416621515021660 0ustar liggesusersa <- 1 devtools/tests/testthat/testMissingNsObject/DESCRIPTION0000644000176200001440000000031512416621515022564 0ustar liggesusersPackage: testMissingNsObject Title: Tools to make developing R code easier. This package lists 'b' as an export in NAMESPACE, but the 'b' object doesn't exist. License: GPL-2 Description: Version: 0.1 devtools/tests/testthat/test-getrootdir.R0000644000176200001440000000040112416621515020373 0ustar liggesuserscontext("getrootdir") test_that("finds common prefix", { expect_equal(getrootdir(c("x/a", "x/b", "x/c")), "x") }) test_that("returns empty string when all paths in current directory (#537)", { expect_equal(getrootdir(c("a", "b", "c", "d/e")), "") }) devtools/tests/testthat/testCollateMissing/0000755000176200001440000000000013200623656020733 5ustar liggesusersdevtools/tests/testthat/testCollateMissing/R/0000755000176200001440000000000013200623656021134 5ustar liggesusersdevtools/tests/testthat/testCollateMissing/R/b.r0000644000176200001440000000000613200623656021534 0ustar liggesusersb <- 2devtools/tests/testthat/testCollateMissing/R/a.r0000644000176200001440000000000613200623656021533 0ustar liggesusersa <- 1devtools/tests/testthat/testCollateMissing/DESCRIPTION0000644000176200001440000000031513200623656022440 0ustar liggesusersPackage: testCollateMissing 
Title: Tools to make developing R code easier License: GPL-2 Description: Author: Hadley Maintainer: Hadley Version: 0.1 Collate: b.rdevtools/tests/testthat/test-check.r0000644000176200001440000000112113200623656017326 0ustar liggesuserscontext("Check") test_that("sucessful check doesn't trigger error", { skip_on_cran() results <- check("testTest", quiet = TRUE) expect_error(signal_check_results(results), NA) expect_equal( summarise_check_results(results), "0 errors | 0 warnings | 0 notes", fixed = TRUE ) }) test_that("check with NOTES captured", { skip_on_cran() results <- parse_check_results("check-results-note.log") expect_error(signal_check_results(results), NA) expect_error( signal_check_results(results, "note"), "0 errors | 0 warnings | 2 notes", fixed = TRUE ) }) devtools/tests/testthat/testVignettesBuilt/0000755000176200001440000000000012725304541020766 5ustar liggesusersdevtools/tests/testthat/testVignettesBuilt/NAMESPACE0000644000176200001440000000004212416621515022201 0ustar liggesusersexport(function_with_unusual_name)devtools/tests/testthat/testVignettesBuilt/R/0000755000176200001440000000000012634340400021160 5ustar liggesusersdevtools/tests/testthat/testVignettesBuilt/R/code.r0000644000176200001440000000007312454305464022270 0ustar liggesusersfunction_with_unusual_name <- function() { print("Hi!") }devtools/tests/testthat/testVignettesBuilt/vignettes/0000755000176200001440000000000013200624363022772 5ustar liggesusersdevtools/tests/testthat/testVignettesBuilt/vignettes/new.Rnw0000644000176200001440000000023612416621515024260 0ustar liggesusers%\VignetteIndexEntry{New} \documentclass[oneside]{article} \begin{document} <<>>= library(testVignettesBuilt) function_with_unusual_name() @ \end{document}devtools/tests/testthat/testVignettesBuilt/DESCRIPTION0000644000176200001440000000030112416621515022466 0ustar liggesusersPackage: testVignettesBuilt Title: Tools to make developing R code easier License: GPL-2 Description: Author: Hadley Maintainer: Hadley 
Version: 0.1 devtools/tests/testthat/test-github.r0000644000176200001440000000715412725340406017547 0ustar liggesuserscontext("GitHub") with_mock <- function(name, value, code) { env <- asNamespace("devtools") orig_value <- env[[name]] unlockBinding(name, env) env[[name]] <- value on.exit(env[[name]] <- orig_value) force(code) } test_that("GitHub repo paths are parsed correctly", { expect_equal(parse_git_repo("devtools"), list(repo="devtools")) expect_equal(parse_git_repo("krlmlr/kimisc"), list(username="krlmlr", repo="kimisc")) expect_equal(parse_git_repo("my/test/pkg"), list(username="my", repo="test", subdir="pkg")) expect_equal(parse_git_repo("devtools@devtools-1.4"), list(repo="devtools", ref="devtools-1.4")) expect_equal(parse_git_repo("yihui/tikzDevice#23"), list(username="yihui", repo="tikzDevice", ref=github_pull("23"))) expect_equal(parse_git_repo("my/test/pkg@ref"), list(username="my", repo="test", subdir="pkg", ref="ref")) expect_equal(parse_git_repo("my/test/pkg#1"), list(username="my", repo="test", subdir="pkg", ref=github_pull("1"))) expect_error(parse_git_repo("test#6@123"), "Invalid git repo") expect_error(parse_git_repo("Teradata/teradataR/"), "Invalid git repo") expect_error(parse_git_repo("test@*unsupported-release"), "Invalid git repo") }) # Mock github_resolve_ref.github_pull so that GitHub API is not queried for this test mock_github_resolve_ref.github_pull <- function(x, params) { params$username <- sprintf("user-%s", x) params$ref <- sprintf("pull-%s", x) params } # Mock github_resolve_ref.github_release so that GitHub API is not queried for this test mock_github_resolve_ref.github_release <- function(x, param) { param$ref="latest-release" param } test_that("GitHub parameters are returned correctly", { with_mock("github_resolve_ref.github_pull", mock_github_resolve_ref.github_pull, { expect_equal(github_remote("hadley/devtools")$repo, "devtools") expect_equal(github_remote("krlmlr/kimisc")$username, "krlmlr") 
expect_equal(github_remote("my/test/pkg")$subdir, "pkg") expect_equal(github_remote("hadley/devtools@devtools-1.4")$ref, "devtools-1.4") expect_equal(github_remote("yihui/tikzDevice#23")$ref, "pull-23") }) with_mock("github_resolve_ref.github_release", mock_github_resolve_ref.github_release, { expect_equal(github_remote("yihui/tikzDevice@*release")$ref, "latest-release") expect_equal(github_remote("my/test/pkg@*release")$ref, "latest-release") }) }) mock_github_GET <- function(path) { if (grepl("^repos/.*/pulls/.*$", path)) { list(head=list(user=list(login="username"), ref="some-pull-request")) } else if (grepl("^repos/.*/releases$", path)) { list(list(tag_name="some-release")) } else stop("unexpected path: ", path) } test_that("GitHub references are resolved correctly", { default_params <- as.list(stats::setNames(nm=c("repo", "username"))) with_mock("github_GET", mock_github_GET, { expect_equal(github_resolve_ref(NULL, list())$ref, "master") expect_equal(github_resolve_ref("some-ref", list())$ref, "some-ref") expect_equal(github_resolve_ref(github_pull(123), default_params)$username, "username") expect_equal(github_resolve_ref(github_pull(123), default_params)$ref, "some-pull-request") expect_equal(github_resolve_ref(github_release(), default_params)$ref, "some-release") }) }) test_that("Github repos with submodules are identified correctly", { # Appveyor has a very low GitHub rate limit which causes this to fail often, so # skip these tests skip_on_appveyor() skip_on_travis() expect_equal(github_has_remotes(github_remote("hadley/devtools")), FALSE) ## a r package repo known to use submodules expect_equal(github_has_remotes(github_remote("armstrtw/rzmq")), TRUE) }) devtools/tests/testthat/helper-github.R0000644000176200001440000000135413200623656020003 0ustar liggesusers## set-up and tear-down create_in_temp <- function(pkg) { temp_path <- tempfile(pattern="devtools-test-") dir.create(temp_path) test_pkg <- file.path(temp_path, pkg) 
capture.output(suppressMessages(create(test_pkg, description = list()))) test_pkg } erase <- function(path) unlink(path, recursive = TRUE) ## fake GitHub connectivity: set a GitHub remote and add GitHub links mock_use_github <- function(pkg) { use_git_with_config(message = "initial", pkg = pkg, add_user_config = TRUE, quiet = TRUE) r <- git2r::repository(pkg) git2r::remote_add(r, "origin", "https://github.com/hadley/devtools.git") use_github_links(pkg) git2r::add(r, "DESCRIPTION") git2r::commit(r, "Add GitHub links to DESCRIPTION") invisible(NULL) } devtools/tests/testthat/testS4union/0000755000176200001440000000000013200623656017355 5ustar liggesusersdevtools/tests/testthat/testS4union/NAMESPACE0000644000176200001440000000011013200623656020564 0ustar liggesusersexportClass(A, B, AB, mle2, mleA, mle2A) importClassesFrom(stats4, mle) devtools/tests/testthat/testS4union/R/0000755000176200001440000000000013200623656017556 5ustar liggesusersdevtools/tests/testthat/testS4union/R/classes.r0000644000176200001440000000031413200623656021374 0ustar liggesuserssetClass("A") setClass("B") setClassUnion("AB", members = c("A", "B")) setClass("mle2", contains = "mle") setClassUnion("mleA", members = c("mle", "A")) setClassUnion("mle2A", members = c("mle2", "A")) devtools/tests/testthat/testS4union/DESCRIPTION0000644000176200001440000000037313200623656021066 0ustar liggesusersPackage: testS4union Title: Test package for S4 class unions License: GPL-2 Description: Author: Winston Chang Maintainer: Winston Chang Version: 0.1 Collate: 'classes.r' Imports: stats4, methods devtools/tests/testthat/testVignetteExtras/0000755000176200001440000000000013200656427020774 5ustar liggesusersdevtools/tests/testthat/testVignetteExtras/NAMESPACE0000644000176200001440000000000012416621515022177 0ustar liggesusersdevtools/tests/testthat/testVignetteExtras/vignettes/0000755000176200001440000000000013200624360022773 5ustar 
liggesusersdevtools/tests/testthat/testVignetteExtras/vignettes/new.Rnw0000644000176200001440000000010612416621515024260 0ustar liggesusers\documentclass[oneside]{article} \begin{document} Test \end{document}devtools/tests/testthat/testVignetteExtras/vignettes/a.r0000644000176200001440000000000712416621515023402 0ustar liggesusersa <- 1 devtools/tests/testthat/testVignetteExtras/DESCRIPTION0000644000176200001440000000027412416621515022503 0ustar liggesusersPackage: testVignettes Title: Tools to make developing R code easier License: GPL-2 Description: Author: Hadley Maintainer: Hadley Version: 0.1 devtools/tests/testthat/test-s4-unload.r0000644000176200001440000000555413200623656020075 0ustar liggesuserscontext("s4-unload") # Returns a named vector of this class's superclasses. # Results are sorted so they can be compared easily to a vector. # A contains B == A is a superclass of B get_superclasses <- function(class) { superclasses <- vapply(getClass(class)@contains, slot, "superClass", FUN.VALUE = character(1)) sort(unname(superclasses)) } # Returns a named vector of this class's subclasses # Results are sorted so they can be compared easily to a vector. 
# A extends B == A is a subclass of B get_subclasses <- function(class) { subclasses <- vapply(getClass(class)@subclasses, slot, "subClass", FUN.VALUE = character(1)) sort(unname(subclasses)) } test_that("loading and reloading s4 classes", { load_all("testS4union") # Check class hierarchy expect_equal(get_superclasses("A"), c("AB", "mle2A", "mleA")) expect_equal(get_subclasses("AB"), c("A", "B")) expect_equal(get_superclasses("mle2"), c("mle", "mle2A", "mleA")) expect_equal(get_subclasses("mleA"), c("A", "mle", "mle2")) expect_equal(get_subclasses("mle2A"), c("A", "mle2")) # Check that package is registered correctly expect_equal(getClassDef("A")@package, "testS4union") expect_equal(getClassDef("AB")@package, "testS4union") expect_equal(getClassDef("mle2")@package, "testS4union") # Unloading shouldn't result in any errors or warnings if (packageVersion("testthat") >= "0.7.1.99") { expect_that(unload("testS4union"), not(gives_warning())) } else { unload("testS4union") } # Check that classes are unregistered expect_true(is.null(getClassDef("A"))) expect_true(is.null(getClassDef("B"))) expect_true(is.null(getClassDef("AB"))) # Load again and repeat tests -------------------------------------------- # Loading again shouldn't result in any errors or warnings if (packageVersion("testthat") >= "0.7.1.99") { expect_that(load_all("testS4union", reset = FALSE), not(gives_warning())) } else { load_all("testS4union", reset = FALSE) } # Check class hierarchy expect_equal(get_superclasses("A"), c("AB", "mle2A", "mleA")) expect_equal(get_subclasses("AB"), c("A", "B")) expect_equal(get_superclasses("mle2"), c("mle", "mle2A", "mleA")) expect_equal(get_subclasses("mleA"), c("A", "mle", "mle2")) expect_equal(get_subclasses("mle2A"), c("A", "mle2")) # Check that package is registered correctly expect_equal(getClassDef("A")@package, "testS4union") expect_equal(getClassDef("AB")@package, "testS4union") expect_equal(getClassDef("mle2")@package, "testS4union") unload("testS4union") 
unloadNamespace("stats4") # This was imported by testS4union # Check that classes are unregistered # This test on A fails for some bizarre reason - bug in R? But it doesn't # to cause any practical problems. # expect_true(is.null(getClassDef("A"))) expect_true(is.null(getClassDef("B"))) expect_true(is.null(getClassDef("AB"))) }) devtools/tests/testthat/testUseData/0000755000176200001440000000000013200624327017340 5ustar liggesusersdevtools/tests/testthat/testUseData/NAMESPACE0000644000176200001440000000002712656131117020562 0ustar liggesusersexport(sysdata_export) devtools/tests/testthat/testUseData/R/0000755000176200001440000000000013200624327017541 5ustar liggesusersdevtools/tests/testthat/testUseData/R/a.r0000644000176200001440000000000012656131117020136 0ustar liggesusersdevtools/tests/testthat/testUseData/DESCRIPTION0000644000176200001440000000027212656131117021053 0ustar liggesusersPackage: testUseData Title: Tools to make developing R code easier License: GPL-2 Description: Author: Hadley Maintainer: Hadley Version: 0.1 devtools/tests/testthat/test-load-hooks.r0000644000176200001440000001075313200623656020324 0ustar liggesuserscontext("Load hooks") test_that("hooks called in correct order", { record_use <- function(hook) { function(...) 
{ h <- globalenv()$hooks h$events <- c(h$events, hook) } } reset_events <- function() { assign("hooks", new.env(parent = emptyenv()), envir = globalenv()) h <- globalenv()$hooks h$events <- character() } setHook(packageEvent("testHooks", "attach"), record_use("user_attach")) setHook(packageEvent("testHooks", "detach"), record_use("user_detach")) setHook(packageEvent("testHooks", "onLoad"), record_use("user_load")) setHook(packageEvent("testHooks", "onUnload"), record_use("user_unload")) reset_events() load_all("testHooks") expect_equal(globalenv()$hooks$events, c("pkg_load", "user_load", "pkg_attach", "user_attach") ) reset_events() load_all("testHooks", reset = FALSE) expect_equal(globalenv()$hooks$events, character()) reset_events() unload("testHooks") expect_equal(globalenv()$hooks$events, c("user_detach", "pkg_detach", "user_unload", "pkg_unload") ) rm(list = "hooks", envir = globalenv()) setHook(packageEvent("testHooks", "attach"), NULL, "replace") setHook(packageEvent("testHooks", "detach"), NULL, "replace") setHook(packageEvent("testHooks", "onLoad"), NULL, "replace") setHook(packageEvent("testHooks", "onUnload"), NULL, "replace") }) test_that("onLoad and onAttach", { load_all("testLoadHooks") nsenv <- ns_env("testLoadHooks") pkgenv <- pkg_env("testLoadHooks") # normalizePath is needed so that capitalization differences on # case-insensitive platforms won't cause errors. expect_equal(normalizePath(nsenv$onload_lib), normalizePath(getwd())) expect_equal(normalizePath(nsenv$onattach_lib), normalizePath(getwd())) # a: modified by onLoad in namespace env # b: modified by onAttach in namespace env # c: modified by onAttach in package env # In a normal install+load, b can't be modified by onAttach because # the namespace is locked before onAttach. But it can be modified when # using load_all. 
expect_equal(nsenv$a, 2) expect_equal(nsenv$b, 2) # This would be 1 in normal install+load expect_equal(nsenv$c, 1) expect_equal(pkgenv$a, 2) expect_equal(pkgenv$b, 1) expect_equal(pkgenv$c, 2) # =================================================================== # Loading again without reset won't change a, b, and c in the # namespace env, and also shouldn't trigger onload or onattach. But # the existing namespace values will be copied over to the package # environment load_all("testLoadHooks", reset = FALSE) # Shouldn't form new environments expect_identical(nsenv, ns_env("testLoadHooks")) expect_identical(pkgenv, pkg_env("testLoadHooks")) # namespace and package env values should be the same expect_equal(nsenv$a, 2) expect_equal(nsenv$b, 2) expect_equal(nsenv$c, 1) expect_equal(pkgenv$a, 2) expect_equal(pkgenv$b, 2) expect_equal(pkgenv$c, 1) # =================================================================== # With reset=TRUE, there should be new package and namespace # environments, and the values should be the same as the first # load_all. 
load_all("testLoadHooks", reset = TRUE) nsenv2 <- ns_env("testLoadHooks") pkgenv2 <- pkg_env("testLoadHooks") # Should form new environments expect_false(identical(nsenv, nsenv2)) expect_false(identical(pkgenv, pkgenv2)) # Values should be same as first time expect_equal(nsenv2$a, 2) expect_equal(nsenv2$b, 2) expect_equal(nsenv2$c, 1) expect_equal(pkgenv2$a, 2) expect_equal(pkgenv2$b, 1) expect_equal(pkgenv2$c, 2) unload("testLoadHooks") # =================================================================== # Unloading and reloading should create new environments and same # values as first time load_all("testLoadHooks") nsenv3 <- ns_env("testLoadHooks") pkgenv3 <- pkg_env("testLoadHooks") # Should form new environments expect_false(identical(nsenv, nsenv3)) expect_false(identical(pkgenv, pkgenv3)) # Values should be same as first time expect_equal(nsenv3$a, 2) expect_equal(nsenv3$b, 2) expect_equal(nsenv3$c, 1) expect_equal(pkgenv3$a, 2) expect_equal(pkgenv3$b, 1) expect_equal(pkgenv3$c, 2) unload("testLoadHooks") }) test_that("onUnload", { load_all("testLoadHooks") # The onUnload function in testLoadHooks increments this variable .GlobalEnv$.__testLoadHooks__ <- 1 unload("testLoadHooks") expect_equal(.GlobalEnv$.__testLoadHooks__, 2) # Clean up rm(".__testLoadHooks__", envir = .GlobalEnv) }) devtools/tests/testthat/test-build.r0000644000176200001440000000062613200623656017361 0ustar liggesuserscontext("build") test_that("source builds return correct filenames", { path <- devtools::build("testNamespace", path=tempdir(), quiet=TRUE) expect_true(file.exists(path)) }) test_that("binary builds return correct filenames", { path <- devtools::build("testNamespace", binary=TRUE, path=tempdir(), quiet=TRUE) expect_true(file.exists(path)) }) devtools/tests/testthat/test-dll.r0000644000176200001440000000701713200623656017036 0ustar liggesuserscontext("Compiled DLLs") test_that("unload() unloads DLLs from packages loaded with library()", { # Make a temp lib directory to install 
test package into
  old_libpaths <- .libPaths()
  tmp_libpath <- file.path(tempdir(), "devtools_test")
  if (!dir.exists(tmp_libpath)) dir.create(tmp_libpath)
  .libPaths(c(tmp_libpath, .libPaths()))
  # Reset the libpath on exit
  on.exit(.libPaths(old_libpaths), add = TRUE)

  # Install package
  install("testDllLoad", quiet = TRUE, args = "--no-multiarch")
  expect_true(require(testDllLoad))

  # Check that it's loaded properly, by running a function from the package.
  # nulltest() calls a C function which returns null.
  expect_true(is.null(nulltest()))

  # DLL should be listed in .dynLibs()
  dynlibs <- vapply(.dynLibs(), `[[`, "name", FUN.VALUE = character(1))
  expect_true(any(grepl("testDllLoad", dynlibs)))

  unload("testDllLoad")

  # DLL should not be listed in .dynLibs()
  dynlibs <- vapply(.dynLibs(), `[[`, "name", FUN.VALUE = character(1))
  expect_false(any(grepl("testDllLoad", dynlibs)))

  # Clean out compiled objects
  clean_dll("testDllLoad")
})

test_that("load_all() compiles and loads DLLs", {
  clean_dll("testDllLoad")
  load_all("testDllLoad", reset = TRUE, quiet = TRUE)

  # Check that it's loaded properly, by running a function from the package.
  # nulltest() calls a C function which returns null.
expect_true(is.null(nulltest()))

  # DLL should be listed in .dynLibs()
  dynlibs <- vapply(.dynLibs(), `[[`, "name", FUN.VALUE = character(1))
  expect_true(any(grepl("testDllLoad", dynlibs)))

  unload("testDllLoad")

  # DLL should not be listed in .dynLibs()
  dynlibs <- vapply(.dynLibs(), `[[`, "name", FUN.VALUE = character(1))
  expect_false(any(grepl("testDllLoad", dynlibs)))

  # Loading again, and reloading
  # Should not re-compile (don't have a proper test for this)
  load_all("testDllLoad", quiet = TRUE)
  expect_true(is.null(nulltest()))

  # load_all when already loaded
  # Should not re-compile (don't have a proper test for this)
  load_all("testDllLoad", quiet = TRUE)
  expect_true(is.null(nulltest()))

  # Should re-compile (don't have a proper test for this)
  load_all("testDllLoad", recompile = TRUE, quiet = TRUE)
  expect_true(is.null(nulltest()))

  unload("testDllLoad")

  # Clean out compiled objects
  clean_dll("testDllLoad")
})

test_that("Specific functions from DLLs listed in NAMESPACE can be called", {
  load_all("testDllLoad", quiet = TRUE)

  # nulltest() uses the calling convention:
  # .Call("null_test", PACKAGE = "testDllLoad")
  expect_true(is.null(nulltest()))

  # nulltest2() uses a specific C function listed in NAMESPACE, null_test2
  # null_test2 is an object in the pkg_env
  # It uses this calling convention:
  # .Call(null_test2)
  expect_true(is.null(nulltest2()))

  nt2 <- ns_env("testDllLoad")$null_test2
  expect_equal(class(nt2), "NativeSymbolInfo")
  expect_equal(nt2$name, "null_test2")

  unload("testDllLoad")
  # Clean out compiled objects
  clean_dll("testDllLoad")
})

test_that("load_all() can compile and load DLLs linked to Rcpp", {
  skip_on_os("solaris")

  clean_dll("testDllRcpp")
  load_all("testDllRcpp", reset = TRUE, quiet = TRUE)

  # Check that it's loaded properly by calling the hello world function
  # which returns a list
  expect_true(is.list(rcpp_hello_world()))

  # Check whether attribute compilation occurred and that exported
  # names are available from load_all
  expect_true(rcpp_test_attributes())

  #
Unload and clean out compiled objects
  unload("testDllRcpp")
  clean_dll("testDllRcpp")
})
devtools/tests/testthat/test-imports.r0000644000176200001440000000262113200623656017754 0ustar liggesuserscontext("Imports")

test_that("Imported objects are copied to package environment", {
  load_all("testNamespace")
  # This package imports the whole 'compiler' package, bitops::bitAnd, and
  # bitops::bitOr.
  imp_env <- imports_env("testNamespace")

  # cmpfun is exported from compiler, so it should be in imp_env
  expect_identical(imp_env$cmpfun, compiler::cmpfun)

  # cmpSpecial is NOT exported from compiler, so it should not be in imp_env
  expect_true(exists("cmpSpecial", asNamespace("compiler")))
  expect_false(exists("cmpSpecial", imp_env))

  # 'bitAnd' is a single object imported specifically from bitops
  expect_true(exists("bitAnd", imp_env))
  # 'bitFlip' is not imported from bitops
  expect_false(exists("bitFlip", imp_env))

  unload("testNamespace")
  unload(inst("compiler"))
  unload(inst("bitops"))
})

test_that("Imported objects can be re-exported", {
  load_all("testNamespace")
  # bitAnd is imported and re-exported
  expect_identical(bitAnd, bitops::bitAnd)
  # bitOr is imported but not re-exported
  expect_false(exists("bitOr", .GlobalEnv))
  unload("testNamespace")
  unload(inst("compiler"))
  unload(inst("bitops"))

  # Same as previous, but with export_all = FALSE
  load_all("testNamespace", export_all = FALSE)
  expect_identical(bitAnd, bitops::bitAnd)
  expect_false(exists("bitOr", .GlobalEnv))
  unload("testNamespace")
  unload(inst("compiler"))
  unload(inst("bitops"))
})
devtools/tests/testthat/shallowRepo/0000755000176200001440000000000012724305435017417 5ustar liggesusersdevtools/tests/testthat/shallowRepo/objects/0000755000176200001440000000000012724305435021050 5ustar liggesusersdevtools/tests/testthat/shallowRepo/objects/pack/0000755000176200001440000000000012724305435021766 5ustar 
liggesusersdevtools/tests/testthat/shallowRepo/objects/pack/pack-c4e0f1d1d68408f260cbbf0a533ad5f6bfd5524e.idx0000644000176200001440000003156012724305435031260 0ustar liggesuserstOc !!#$''))*,01378;<??CCCCCFIKKMNORRSUWY[[\]___abdghklnnpqtuwxy}     !"%&*-./0024567:<>>>ACGJKLNNQSWYZ[]_`bbdilmoopqssstuuvwz{}MBr)kɂav'(#8q?=Q (srV,*h4 h^d%"QcB$V?gH=Rt\oV =!glƨ$~hPluCC%95=|#(QIf=X[qi'!1м{!Q\{Z1SӎIg&nAWzi( j}>?aٯ>AYuSeB)Xq4"8yS7ZAiTLgֵ!?2"&I "wgя/k}ݾ;CS28KRAdY+'۲D9 :*>V !q eT{ 'rӭ>Lr(\|)I!nYE9;!49kE!Z@}ӡ!@򺭣BNMi!HYZv OwWH"'0f'p#hܸq[N݊#.Ra~cShdB#g3⟑0$ <ԺNRH ˺JL%hŊYi+jf@_%5qu0tx58%ґ6 @44%x L8'}e.Tm'ffH,UyM`C'kI:/&~U'ڍf(+(2rEVJQ,UwCcދA>kN,%,aCo*,,ؓ{+e N-">ܚȢ  GL&WA[-»,AƳN-OU;rVh:qE.99r>G*).??X$lW墀ƈ0j90dJ58Gatm50h6픙yeakA1&{tA㖚 TB2b]e“ siE{3+m.])x3D.kfZUĩ]Xy3Q詭R fq5"E{7cjY56k70RgHgtʳ6ҨkXXw:Q;y/0.7Z"]P[8'M*ҩ7Ʈ XCO!e<~A8q=*/9\P8DemD\6UYY9AK2J3UYm9eŸel"]wd7G;ND\#fjCH 5NaYa?#.bCn@=*:Q=ʋ Cҕ-?E¢DZk"cE|H˾Ekyff GԻfEKA4da\VEԺƲ{NIGqWF~%4΄k(g޴fCGh6({ڊ!pPRG`loƢ88cIlSG6^^QZKid~KILbALfn J_95pqx&AK2⺃xQXݎxNpKMh%Ϙ/yK˘* `K-Xw0VL[fը RMK݂Qل̂V0݁M{`(sk7ND Đ;sW"i!O"O*"q?P1Bql+Pp4ׂ¥UאPڢ)} hAlPϊ69{쟳ThJꡝQ >MSᴚʻ̹+bQg.rzԡ>*JRs ˸*OPf,2SW4m53=X:'TJh9S<^@K3TPPFTZ4 2 )IbUuΟ; +buZ5QmV; dbAX5%2Vz뱪_73pݟ3V"с gGL . 
,W4Z֘P9aWLi> gđE2-6IWi̢?f!X P wJ%cu"$~#XOqv8ӽ9і91NX|d!πԖb毰Y("UIkGזiW&h][AW5+߻}< ]iPt:/-]zΦ%M ]MuMO2|G37rx ^'Re5~5,3^o87DI' l׉O^vMPcŏ H_MwhC^}&``wC'h``+ ~9poa`4Pӳw[f{L[?a)戉~>b:OVuF'~|İbv chzŹWOQebuGQΔ$,c|x9C&f|^c$hB#\,yd MktV5IVe@&= `}BzGIseWCNU2[ e:=W$cfYp&qw4Exg&H^x:p\IhhL- |u&^iCOY Ѷjf*aZ:y#kqֹZ~vkZzHw(kTqǎV(l /x> v mbw)c}Wn/WD˄UorMnEv4T>anVr9\ikdp|/L(.!cF7$p.]0g9-WYGJq]fE#Ei\yqʀlT;Cpflra~uCP ~s\ DLBLaBt6qpm:Kz&~6wtG0UR>) tl( 0?̐"FBmu"j l(vOJ~Ls H@7v]L .\/ ^9Ġw*9k2k `?x:8w=u0tyx)QP^d$'+%O^x_Vey:G5ExkJEbM!޸Ax\ E}x=YeFY\)%yπVW-Hm768o`Moyʳ+?_Qm{f"R9E=/OQ.{ܼ"dcoN&B |Z{nL|%R\V-h7%> o||%>v>EȆT&}mC>V ٍiuc~WI>Ejzxd ?i6<\~ {9[]JBVP%-覆QuS#"?-؀&ƺ@261?=" SIJ4J5B7w쁁MX.!%e[$*Q@j~cWrB(dHZ^Sw O*d*gPp9PY>bFtŔ7іv5F4oi1N|veZ-!WSL5f{K /Lc4*$7=ϱ9Gv[xK Q'Qoo5BׅtcNíPeex΅͙t"3a[AÙYrQ% !&ΰ.$#G^CeW ؇WJNO?ә"؉FuC:t۬Mgi@fI8Kx|J\jgݮwDL0nK2dl`4S{x; Yy箘k(p죴v. #q6."+8 < cc-+XjJۓ^tܐAQ?_5QD ǝ!X@v⑿W&֩HJk|yõIs:`B K0ˑG{zBWr+: yKˠbùZLveU=0F}- uc@RieFF+V4RΝ![+>ٵL+/u ؛l>?my햊\C^mw- so/W5Uߐ,;O;2lN5sa,$FmT}Benw5.B=Ľ$2ܓE<%?25׸.)eK~pM!k?@N ){.8NF/~0Zêڪ~1REؑg*7f8A3CZby|mk8{GqK`wpm@,,=¬|;a[觓.7rTMc{eUأtڭk[L[&BmO{މ`dI-}:CjŷRz\(6otk mk^ y*od.krQ@ǭ"yF8<%@6j;&vm,I ֝~|uwذ{ M!Z?>`ſYb0"P'ZX- ~\r'\W" s+(* 'W=G 8g1j? ֵvbrZv3ƶ t{CtAʱaSF絟!;7ɑY_Ogĩ0߉KV>Bۖ$3{!d1)۹2c Dź3$Ɏ3S~}%?C+u0|DƢ2vw_&"OQ]=/k dcRlk~;'rK(3h#e[yőۼA=:Co=/ĵ QU.ݫn!M7Ҡ's+01^O;¿!s| fȬ[RvJ#Yqq`Pǹ~UutU!5JځpKҐp)h;==,LYeLj5Mg-#(O[;PzL(n ݫ/klǷ<2ȀYfDL3ˮŮa>ҫ8?& 'Ǫם BĢ1eHHKnɹ=pC#,{/欢ʙ̑r+^ʱWݿskHU_%"11N2iϟ̙u@ԧ2[MN{̀:bjRMzfgS2],KΎ$N-O_.I]fߩ[ 1F,k|VLy$o) GH~(I+CK*֬v [ӰZS)F%x] Ɉ:leCW aR~Eu 1~]nPO5ԉƒD4^K%Fbwc'`Q*ˮe^hx0^xp+e!< C!MHW7zo1Qas[mRn?]yuwP4x@F/̼q;ڧC;G/7:ʨg%h{ ){J}x E/Fe^==ew;X9(oP /qg¹N<*H/Xr/ @xiC&HB1ƛQaɷhq6A._l-1lr('KYCǕNKD#[V̲ǐ:]db8T1Ks1Kr}\±'郠gݖ 6WC&VkH>7a_ny+)X/a⛲CK)wZSϖeVBPSpѵ<\+9ާh~.W8NZo P ]>REF|<ƘxjhsFUƊ0֡FN$Xs ^ъb Sq0m>2G#ߒ^i(R4>Ӄ,Lv- *gY{( Zg5M*#AC)bΗaBsaXds`$|jsoP yh-ibJH'wW~|'B|6sրj#Dt=*ɸAaY3 ]N_Zk&H+yܗ|0?i 3a`sC,H%}j |N6)9ov>Hj Gԕlf$Mh@[2y?0QYxT8_КFH 6$kbpu;3Bտ?AVBn)EC"޽\CA/K:PHBwPa$ 'z"Nof} 2srҖS佞,5[4A2Tдrimf volhVTFMS \p. 
>hX1g 㪕/t;} F6%L6kjEr0lqk;L'') ?7>=#%Ar.I/Rro?~HM!@حNcЦ\4_ٻ jRY&R@qx1gP_RTax:Od,ߣɼHLMN55-sR!>/C&[I$wX&qxGJezڃd&-Ti iFG /|U Q6(<%lz[qEeHxS?#@ufǎD'1t j8{<ΩWRw,߫۹;H~#+p1R ݿ3D!]p[܌UK&#o)luq=l&w?mIQ"6K&w^zt0߾i`œzoW=<8%Ղ+~g${=.߮M=sV5W"Ԡ@<L$91tz|jE#Յ|6#C"ƻ",=+.T CaWP[Nq3{n|J.Y>$RU Y P\+E8xb@:te&ۡkwEǧ'X=6Rd"-|!VqHV[`Ds`y;qO㥠^#z }>Q<R#XOB5l/е*ً #1lY43,a`c) )?S𳨁]\g/w9?h@)NᄃC,݉ WO6nSMf!AbiV3\;prmq.]sx׆ qUg[զDMCH)OE;b-4;%VV_}-꬞ 5S)cN*Rx8 kM>p1O!q+֢=]"g9ns6@4'l |?`l5˯]}F}y:b$דH *˞+ѹ[PGξ!Hӄ56N`!e^tТ%h>* B)WvV!x1[J(Z)%id¼J_X̵݄y1[U{)TtFW ܡ_hw[Z^9,W=',r줍;_w1ۭa6HIi:,<=@{ԧʯ FSoN{okaAcwda>5]/`Z!{I:Grwa$c{8odT n (%CEtknv)cE43:HD!e (_(oNUuQ蝬Tn6b aT^%2Aߒw3Ç,J ePa,&3{UZ_U$R`[ 6>Kg2Ydkh9PO'Gj׾DPYC~%2f8@73s#,:5I'OAx.}cM* C,#ZVڇ~ ~CLI8c/k7 O(ז/4v,b kA+Mt.I<$7)@e fmiw5&MIKSf)=^4:.)aA!*66{{F69A'.8N%W61$;s%$zs[4~>\4g-vl: ]ؘk;3gkncF\^/XssSv*kOR(SJa!lq)_'Hؙ;b$~ !0pckgAFUFg%1Q4mE]-(Rx;{m|Y0 =,A[Q}_n0#Lqa 1%m1p-<JRXa0`<.3PG%n@cqnf,UKH+T*iinI56S@?[~]Q]%FB#JH_D*N gz=S[ʠ"0rkB0 xK =6|;)A6Hc^KecBBv@HLJ1hI9$2Pr6QuX|a i!r`l]\Vsޕްm"IJ:čnO/&A x !=UЀxW3q~A{mINoPm,ְNLT3 u@%8.CH{G&ZlJJTcX էɉ˂ޗRGho#L$xAN!E} vr.&xQEӱ h~o",1JX,Bg靚\'m1Z1"f:"ߞv LNq MZOt6)sLe+gY fh޻^mB+gu*tIm{]~Q>zm^,+x340075U J*IL/Je.v:>XnwDA/=$(3$[b>E'mGCR 5R͹feNvX%5߻(,X272؉//ob%9D/7Ae͕]~AxN:5*` πO?'KKհ._gsW 58ٕlUF _S$ ξ}5<d˙7T}pF*W(1ڽ@Ͽn cA: v[W^(J:A%V}ogìA=68y?T]rQbnr~nnj^I1LSJuV5y`VSڔԲb,uf S7u1Lʝq~f^q Ë6f[/aF[.<Տ o'K/)syn{"[ZZGD ך9͊_3"DAqQ2Cc?4=xt59NI1Þ鯟P>w~jtߣs pRK@ ̕Ν(}О/>ЎԬxmy pEJe(6%JaC Jygx<3L)hۍЁRI$FSJIiuhakylmlۛZjrRy9, I6ImʬoeW4ʃbρ8r!E+_5OأU=ك|Jr,7 a| _bܼxT_rԽU:a&%as6bCkş1 I.TYrÏ$.q9k~8bl)c7Zt%KxL`UH:'|X`ﰹg5фO-_۲De_=GP>q|0)=ӫwc' >Ew^*Hߢ#ؽXL>Տ~VAmls\Y0Clu5J;ٴN;{]'g<,H@H(,!ԭ9A/4{UU{5S_TƩI~eWuCw_`,K t^YCs^K._dwUq_f8{)mfa4QVp0vc[q0z^?# B$O TnĤ~vJh~Lͯ; {ݵ<|QH .iz]hbk%q[̪IW9d5<ўd2[TWfN2 A'@l]桛 #z|mhD4=cƼJ DNX)>:_NW?{rڂY, q q1c)'g)yjBkfuM#L,As8gKL0M:ruW|,1P4//2 -%`_9'L-[ryXT 85ͲrNZ؊%.\0'iVr 8bTa^f]}C}M1@f8|ˈNRKfy'^[LнwT'&hZ`0bxÏn0HRqρ1HH\*qh;Ԩ/R˭lWA Q}SF U֢%>pƬvu+iw}* 誇+qṶKS *@ A#ok69u/l~"2I BbFQaߴ=9M(zUtqÚ K~;U 
S?>"Ry"ȦԶ)ēUCVa"'wCk)[[*I{_k4$fc3Ϳ"$Ġl6Qxe1]F[Н;펠J` Gq@Qf?{3ND)o>09KPS#$DCl>ߩiyM)'w߰o!`}7m?q%ӷǰhޅH_WаG+薌ӟ&6yܣ~"^3=XDa^3ne_qk4?ߐھ)n4,:-O3z^[ꅱ}`tE,^*!%HDr2C/]c7x83Gz32]@(]tK</t#1k 2[[(`ڣ5WF A5v)k5#1j. <NxmoUQ $YVELPvb . #Ro$c4Z_FI$h!IA)C^Y:69ТKO8qbْ;ʸQXIe%;mG/ * KzU%2IxAC[ c'P7o2Rot'q_,/OWm6>:4;њlW[lG?Ux31܂ĒbD_~AΓvbx340031Qps uMaZVڵM[M5<3',..0BImuˡJ\]|]8*?M.`0-ɻRGk R+*ss(ҐRX \eqnjE"P=P5Cj{6CE7 _W$#jcA'NlmZ; (,oP|WyWil'l>l&xei TkpҪJPEJY,EJc̜1Ӭf%7Kl%d-{7K xQH(ɝs͸y< ٭hgbmiGssWħ*Ϸ֔:bd[Sp+SnԢ:Ky!&Ap6.scҍU<(Tdp:o>/T=M`U|Zf9$K9  =UvtG"ϸ][kfvd2~RTD4L\XH[&J^#'x dή{!Inо,lxvKG֮씈rYT}gHXw&3`_mxە/YEEަx"LY |iw}L 9_L|DclXҰ_5sӓl. X k6;Y?dDqlçN2QN.?5'CIKɷd*Gw59\1s­*jI'K,{BHe$Of|bdaT+Vǁ O?.f%u*86r]i-=\ExAtcybj R8 =NTeLq&;sf˅$ 2!p1:07u|O H]&h&'W^\#ZˆMqK6N 12  -ST?G R1>npk U'Xdl`dti桐m?KYYpnip]Q]b}鬫g󞀘v`f:%W9BtkB:~p!ۡf/=+5P䌤Mm˕ NjLug.Iguoq;3>ǯf2[.E}I{"H+-?Q֫,]+ lpRD/fٶL'ym^zE! ‹ɥ=(:xxH +s3z"p:NVl mw" 9xܿ(ԁy)=t*!hXnY׻|gv!Ә p_ZFoNd}oipí0MoxbnX+.#:Ǯ;ìF* O͉{u3liy B渀x G6֤zOTg9{ąw┶ˡ]c^?cށI\y>pkgb*g껴 =HM%i1nfTˏ~)7-0pcO^ <^_ʶZ~= CukgpZn[~ $`}< zdՁ/q[Ⱦ'KtW 7]f݆[_O3,2[} Ϩ_/{N<@*YPJ)_Tej0DJˍm~UQl;"o4GY_1{`-PdyƧ"4<8߰!kڗiD{Yjf߳&cudVݬ(9=%Wtgڀg_x\/Yxc,`{սEg~\|]: ϳrPS:aA9%Ct* iITaW < )e; cJ: sBq?&wVx/{l C"ˈu#z^H ܍2ϥwbVWQoI1Q [}(_&,ƴyBEG ۡ=l_Vȳ9bP_9|$NtgP-eᶯiu`fe&+7XW6.r$ [Bh$]|6SuMIBܭ 4ϰQ1^(*wRl"ak[ g.42g^h~+r^U.ғhhҦn͐ ~}h;{#^D:%2-zInFB.QK`O!N] ^%noO'0҅4]7n?@LKA=9fٻ&TJ ƻ6-I.WEc\ݷZzGs&JT3\//g7nfQ\`"J6a8rG7,ثTb7bu]_1, Džbj y}XjgYQd:OO*^ Mg3d{D\~6QʁWW4NRd Ra+>%(2(#{< vbVh>Pci 87:qMxRbM~BRhhvэjJ='oI #; 3|OFe9o̅  :hN iI/ߝ]@W .5?qĀy2^+x340031QK,L/Je^ﱯ8lz*9#59[/a ?<bg=Li]P%ťEz) o.+Ϟ,ɐ ?F.ݦx340031QHI-+)Kf ׮䄏ZT Rx340031QH,MI-K bx:y{wbkQVZ\[XTVUS|4+9;{gnb`U E ;xݗY\jƟ^iw,,xmy\A\ $T© `DIBg&"ǮXV*rʩbme?RTE)I+V9N@?I̛7o~3mGB\6[o/Cۆ2۷h7S*SـCsI.8Jpq :?Ahώ'.s3}m%Fp4>X7f~ R_::Ǡnl .nMɆ<~.PQbeOAEpC bZ+ y Gr}·/StKi+,6x' a:!,.bUGn<%4`ZS-lTw=S9wM0V1@rHB+M3|tEB|p@ -6,»ykdžm5Eh+ *hNe#X< SG[~ sTBIiCu{閤8DATthnaͳ/9qſ8h)-*!X2d@M)boL]ܫJNyZ<ͥF`W8.R3I\d vܽ$zcj0 ;rmpSl²X~4ߌK T?ֺR5yTլk}z h$ZݖöYe1TCR.eYq=X.JhKk|)|d/ӿ>y'#{ud6ptJj 
0xcF\\Qx-ASe #K9!/Y ԦVӿ̎n?LgB2:MӮ{ UI>uxsqidڤ\:Zv2ɊE٩sKPR܇`f,Emh? =Wd 7¶LF0..sƱYpbi$(}:{y˙+)DIĹ#5ɣ:ẕ>Iߑa ^ r@ȶQS=lUva}…QPÛa%7Pd~"AeVU{1[KeeYQc*QX5,4$KHu3虩RǍfaw?h8arrjG~apX3J g3$76.`4;oxPh$'ݷQt|jAQ\餖#/@*wBIԯQ^w'#< ftx&5^h:>:5T~TIz6يޥ)`t {9m?bvE:>Uugjr:sKY OQ=gj\C9ɫ1vHdn, &ârcCO_Vt;ԺM+|#SHӂ^Zs E;J WEO2pUVx?YHϧޚ-nᒆR@X @"zs~nb-%UVa@Bo %u;m=7j2IH;}r_ lꎊa-LC)Fg-FnUޑls3#A) D|ZuujU;R!KЇÀ'4+ ;r<?ѰTs(93msz{Vioo@T}`y[wx^8chx#Q&3i5]jY%%W+ޚǹ/tXzU0?|Ҿo/"x340075U*)+(a|]ֶ+<3s;뢭!DUkPX6K>v/qv/qk熻F-fLZ`NJ?෎`|&鷨?x340031Qpq v cX*S۽ES K\nQ0M0fG7 gSs(1p=MKqbJؗC"!r)% կywY׶9U]m~ 32x340031Q(,NI,I+JIdْln$9#5''?<('E/APi~_܊ x340031QK,L/Jel+=^YW<dBT&f%3hhL&Ƨױ]M^yfĕَwYe?>_UW\P__3fb:<92S;_r3̏6Tu~p)P3x340031Qpq v c;yeGߜ p`䔽[v 1,x340031QH-*/+b}bYC_*45*5x340031Qpq v cXrUo[7.!Dkp+Cݕ_Ֆ~4bXaU Pb8:s+6fhaxiO8 ry 5M٦ŠӱNvw:ax340031QH""a._ow)9wu~dx340031QH"_.<L_;\in%=x340031Qpq v cv3Qqϗ5n{~` Pb8cG\z ߖ=ZUx340031QH+bXi}[=.-?\k$ x340031Qpq v cl/ޫzdm-nb A ok1ss׎z? V x340031Qpq v c|Uq>U,G&'ZUB98:2?y]rG4̷͞a&@0MT?\δC'Xm_m ]$x340031QH+b~{!' _wxߡ$n3oxYzɉNJJx340031Qpq v cp`<Ƀ/12^Y/O@&&[m.'si/ ||9|x340031QH+b|N.̥J x340031Qpq v cn< -o1g{ L @!"k憟ISh#x340031QH+b[ښ-/oz 2 nx340031Qpq v ch8)2&bN/~cb ey%% La v>8 Nݧ1x340031Q(I-. 
Ma(Ngbf-:Df x340031Qpq v c8Xܬ{k\uZ !}]]"fIQ|~kW%=[a(1J<}G_;E4p)Ox340031Qpq v cP|r#(֗e<-k{B98:22{OxL{9wB4Qp9 oܟcݷ5N&$x340031Qpq v cT,lYW㚯!Dkp+CԒ+ɋ^kĀTԊ7L @!a#ӍlϮx٘}ugL9+ɡx340031QH+biMu xm x340031Qpq v c(ٕsF=}י Jn_-u5(su ptve9Y،%oM7(^lb A Lo۾+ vj{a'yx340031QH+bhr#{B@U rx340031Qpq v c꾜em8ffQp(܉3f~ g(1pdpXup曓;Bj%x340031QHI,.N-+b`ܚoRd'Iֹ*a?_ x340031Qpq v chXٗjӏ`dvC2?G_GgWvfօw%xzYףn4pxvO̞B YГ&ӥx340031QHI,.N-+bx3eCK ة x340031Qp+(aX;ʫTO\=ѽ{2 )`)N6hpO{t)Tksg@CrW ?Ո~='Tkp+w+\S=?c˿W.51 UE xg(sO$C\f^q k~k2M$NI ;Qx340031QH+b <=nrڪq Ax340031Qp+(a8{۞,:YdQV^d"'^٩!DFx340031Qpq v c(1Zذ]gÉN]7(su ptve3֡Wވ߿{WwBIjqI1wv8jX>,x340031Q(I-.)H, bX$vĹ?\e2OcdbpE OK]viWߘX:x340031Q(I-.M)ͭ b`bfNXm.*91KǬx340031Qpq v chտ~ʓ3nt~!Dkp+Ü߿F۾z^`JRKrJ見1mSk)۩x340031Q(I-.)H, bc"3mGs?oB98:2LY9.M™+gߜob A Wv۴jnx6-x340031QH+bx6M9{wk+qIODYx340031Qpq v cXx!L zG !}]]}4{kݚʣnz<2RKJR$Z_<;y-Yl~<*~-!x340031QH+b~{!' _wxߡrrǫgW{1}἗56Rˠx340031Qpq v cXx!L zG !}]]}4{kݚʣnz<2RKJRޮzzl̍ 7w+}x340031QK- +gpzvuں{OYc}.ux340031Qpq v chnU#C>MӪ9WB98:2҈V#^6 PbȻ|\\-127C2RKJR"s{m~k?Vhc.rx!!< $睢 rϱ5QAx8R(&{R.BuA3Ñ4'\W" s+5+ix! 
Ekyff GԻf KxKI-+) *(RኃbJSIJL ?(1O79?775(+J-KI-2\BC‰eEPE%xs MQM-JO-J7 x+ᔜ.," ;#$K$\¥WZR^ϕW\xr wvp{u]d c5 dhHP x5+).ՊBN O\P]#E/ 2'̀cL,Ӓ{im#yݶ>.0n2ÙO(y O f{5՟[=eٚc̫^S2<,8JG.q.Kұ¡\Ř/͟-f)[zIJ#BQ42גSӌRQ?\%CR w a?\UKĊ έlC,=jQ;c~ 1ذ+?ZޅԿBLL3'ιqwH"$QB%L"[@x'ԡSq 4oP2[>9w( Rfd-'$N(J=agk%Qf>*ӍۥT΃.짊hatCBO CWλ󪮳;+), V\u\GnF}KSێǞkh7]܏^ٓNs7uu;X/tA0MCp=o"hΥЇv5:M]qu=\`-!xcJf%n@xkhlg1x3Ax6 f3"KMlR$H-Ë~uakHA塺AڃRh/_bh|Z7ߣs**4=t?_5OuMZ?%tEKdO,t-4-t/tM[ " ŜjZ|"LR<I70M`;ş{0O+;.Ŋu_x[ Q*+:g:0_@R::DW`'LP~JKn)R`WgQ2 /B N4ͅ82mt5p;ڶ?02'NΙ!,L uEmu@2M`նuldmwظ - !(U3$`A1ۂXp>5#i,T^WF\ffqJdCOXb4{,팇ԍ7tɦU@P6 RruK9 mAǞn ,f O*+rfCdbI-diyPx{gm(;مovE3wK*$;n{:*%uV<Ǫus"p[ESa4l+s6Ǜu 5|ɑR_=9RZ|д@Q, Z/ +ylX{)aXV$/|شZhͼe)wH)?H7xͽkw[בޟ,i !Qw,9YPre^i9tl-!$]r9p͉i5Ng^bכ3k#vfnL"i͏/Orؘڎ/3XcҜ@iӾYKS?}R^,vb:_mϛŎ j9tNXbuWSH=O(q`yc_;6Wp7H?{bu_;.ծ<ibu=_"y>~]7%i_r1jɥ-ZÊ˄/xJFK`R"i -[l gY\muLshZKc3W=&7Wg_-AZPRDrBG!.h(j?OdŇn2=qd 3:s>-}X +dA_,V i.~LA*j̬,5YXbOz(9kcJL $F^OA(5^s^o5O1Skxj{02 !Ñ?/p83Y :q'bQg;ON8+[ybB'Wj$b&T%<~t* TTfcwrW$2 ]xfXb .F%`!Q?_猂 Oyic'(ٚ%b]`g$U04^o$5ЬS&FX\J#u:eKJ~>|7~xlgb/7n]zSezK#8TO-bkxikRږvyAv䁭}x+&\*77rcD[\KDN0"5cN0-Sx'}ofƢlэe8,`M<$zrڐ6t.LO1 AZthߋf(xJa\/дGc 1 63ԟÁܭ|5:RTz?hE'v>|€Ū3qu mZUCn'G5 uvQ,4toK8ypW7v%E&~L!HyME,"# ÃPRՙRC'DC1o.j `cu[+% Փ屋Lx|?Is'Bo+;.vJ ly@AP0cEWOwvmX[Q V#f~xiAf(^PVxX􃞝 m 7 0FR =fGjQ -Fd{ TwF(S3_}j>M6[1[ 0úќʨ g,#iO7v̘ǚ<37F :ϰSՏk9M |W?7aee|?41ܗ>@`?ˇ9b@2 GC=#r"$(BAUUt-҉x69)9;&@km.J:p\ۧ3Dsq#۵M[_!D/P'kp>c;ԇҜ :T z̐ee 38vw} ˍt5lQ%\p]2"b2ac4%X3n@I߶Q 6Ò~g6E^- F-s]'p)%DF+g[OX %eX=H/<&2.L[(-?oy:E (^]HP{÷z[ z]B-8[o'֟1JuH?#RG|YFNՐlR/>`lx>,#*gcVvqw0mBߍ+:-Ofی5sfxw bQ+~0V*Zŗ9 s EXGFV]\, D\ "պ̾\N)?Ж̂x߽߾mM y"fs&>w(Nr} >jc;c_f$ORBZDkITcR'؄m\"oy8ŁpX:BǥbB8]N> GTȄlRbj%b}̑~ b의xxm,~ 7=Dw=!D}rTհ7>IJ}~'`odcd2 ;YWҦ  VhGȋ()>0t8I(?.}0(I#oնX,[)?'p5=790EW֙!ӨT\ 4.tC$ܢ5A8h=5MHcV[}J$ ?mzYbĥ3i_ ƛNVvH V*tNif;,Z@Ӆe;Xv\,c&?;=c#aAH<d&2%`Ͼ*Mm?h:?Ҧŏ%8 -t6ui~!T=X19lS_0b^cal?C,?rb,h ^HƑDe${JcIk+-%i G8̿%姓O cc}}!) 
jt0ė.^_PqcqLZAōDv|FK焎;"R0 mxv$!&;%fyE&PVUVXJ"QGj ̞0''OX4ۋ{'г#IqBYqjB'\u8z9t8'#8)b-&%D#Ȃ Z t490Bo|e5rq^AX `Hٽ"5[WD`0_qKv&ItZa=3 U,/w B*uua&gg µMXQZ2\KgR9v0rEL 'qfw;ĮșUI(&lJs= \1˨p%bx" z~aŔcIحT;D\_e fN 6=U.U^%qWO=sE.֬1XB=C ՏZ^B g!ja"*Ra4G% D{$%9[V|5J; N`9NWU٩?OhVL  d910·PŭZ*Y bLB4-p c(x6_2X3H \|[2}g&Ӌǟ( bh$)kbY<>E“al%2].gg1_s!Z!g"`Be(lY l $\^Oh| *PڧZɂ->4YQ2ie(!0c)0E6k{)qX\M9v(HȇEMduZT*N$(c_pc^آv\@Б~0#Lx2g {X9ghw<_I/QF"" d;LPHtWBI<)rQQASN8,oF %0!:RD+P5"8a?>!͂Hs":և-%??Y}}1}>@kR,B ޔ*脧R aaPU.Pq`,uhTUrL?R7u@f.+koqldzeRgkZxmW+򡧪7"c$ݞA3pGow8>̨}& :`דrxh%FPӱG)Z T+h"tVbb[r P˲B&w=Gp{?P(|94] ⋪]?'# \rTskN}jOVXe:Qa3tL2dWYPk-Eu `y(wO)a 压7$!-nfˠ˒a]v.jZ]'ӣΌ@D"H1zGj <}U ox߷;zcfPuE:#&x[o:.iF9(XEdo-{QDb \ aeZ]֌r>2l{Ng4f26dNHAxR/JuD?/^&P{?* qcpVi,X s!3>2*B &!n rI,L1<޼)"KTPaIvK9;з cg1[lW SfAl5K'"$9#%h{r nRE܏0 P7nc 0绋ݟnmt{\Q@"w)M$uA"^-Y'"Y, bYX̞N'EW6^$:k.&LmZuHoD%Je,o֛:|Na??}r<D nFQ/LJr)q Tƶ;rP{JIKo/2 nsJV܋+qǾbk#8pcXE5A6[ŷBGZ=f%'8[@qQ@h=VwԴRTYU$NPAǓXCx}ޙ,M_z< KE֞NL*jXDU?QU:o:,7ZhOq'+QY^;x]:džtQ\wGS} 9}v{2?ĎG*!ҍ5]?Zڅ^*:]R2ōRm=1p>Ȟw5 *KR`+W40Jf ~O:'-8'졛'=\ _e4xO(q:gF"od O >nm]OLc)uJ7 j;{M޹eYrd=PR}(;K~TןafFC?M 3._w`cvPb>zȇP8ר<=8Ru<5 樂OrqIݫ|<GdXOClP%ZL0T|t.ƈ%DM$煔Kd0GN_Yiq/^$:\x"1 C` kਵ:˃ )56qy̱\V`g#ԛ=a~~yн>Lxd^*JI곃5YtkbBD`t|zrT_:rcAObg]]LYO%aVJz,S[֟OoJ ,h]᳉1C Mj$z SF@J=5nH }#G򧬓H'Ph:]>J;;rxFIYҿDUmlnp3ƽmHzM;@k`k,hllF0N9_e) vTQG+lm _stvN1⍂oۓKݭP8끯)oWjDq̈E%YXa$天Qq(,59FJMfX;q[{m*4 n,kY~S;tI wcovGk!Tյd_/ΠC$#o`].gR?2E|YϏF9^CDtՇKx>֞iHO'wi*pg5ʗ|tissdOH3eQsExpNH{n G}[0ܚwqOo!gBv"¬B;y-g~$%;JXl'@4'dd¶ Jar͈%9uVA\괌f^S1 }~s  ՛gs{=zR0 rCzJkѤ&KPlUZ]#PEWGbDȉ^ӊ}48Aޣ9|fS6@HRZRR>,NFr0^s-pKWGbee z(ԓGn#ĕ/ Y;%M>o`jޮSE8&_Nre^q(:nlm'_/6l)HD^z.e˷o^vLq̈́wĚ?( ]m X;-*)}8zmحL2xbh)LQ#PUJC5TB4Wuw_=#0ۀd ŗ[8}߭Y6*fU{d7VmVK A*lTd ?Vh}%}5dآ)JS g;M߹7 Y-,hs(w'm V!}$,PJn*׵Qv`8aGJ'gd4Ư] G-K/G1UMt:WhE$dm%uVof~V,ASʸԖN;\XD@ բ(;s j{ Rc2.L.frE.~Dk$EcH*{n6;t:7wW;qńa6~utO1(taF/Wa!g [:\p.#^/ZX}m"J4LYJ'"? 
ƷffBã{L ?^,X%Tw|ǰl:ZE2BD Ζ@p_jWMj?4z/%ֺ;h6=x=e."dTQl!j`T7uh2Rca¾ݵa8K8PP&MgOd&nc3H&g"p\!unpaE?>Bc]97=lط$|ʓwS2s<-HwE|S|⏒-3jBu؃ğeUlG}HwLK3rȐfz PefX(>Vɰ&iaA֥Kͨg Php4(&n3u9EH'PB1<% jGffU6߇6 @v`*T'Vb ᰜ͛EY܅r9I"t{0E`$5[ `A^%u w?zsӦ1Z,]U?~ vWsSgL]&K m/a>>!܀?7i),#:RNx٭U^HTE/wWp!(,%H oz}A w ^:jIs}Ehv/eT"2ge"md*s3dtcskYR,(ZwΟsXQ/LMPxțV3T?)h`B/`= 3cDzS|bϰ|AE-p;Yƫml8UiMR#՟c",UJXc}wJ~7EjP%cM\'mG<UZ*70u(x^2'T\jHƬ~l.5+~UpOV#֟x`o, 6a2Jw rJԚǽEARb/ف.>Ȣ`:)pOĤ+8d4)R/o&3KqbI>:a؎VpNAkF{-MyQCӸ00q]xa.齽ZYj+*wXl!0.(+,۩JAjjXȆf@k-MAlEG #}p0$̆Mri'fBY.ӱӟA??7VUv?еwM:}!7:Y%@>F)o1 KzPᵁlvv>#uz` :FuMm*G+-&$:Ԁiub1nL{XY,5E_Wv_m6HD3J&/S7חX1D,u")RHz_L~$8j•-yFLVs9s𦊴` Hy]Vw&&X^X_*!D"N4qÐr 涮Ű iZ׭)'@hzV*[jbF0~7?/'+Hਂҫu)zg ="ii2Tj}:!yoE:!FnJ3Y|>T`5XFܟ@~nN*oeF5-K~ᙰ#1 ~]ЬF;;E!E%aZ<ǛH ڦPА.LO>T2of<Ҡd34,y7˚ʩK(RX,KP5b&2}cZ}t@9)g 1Fh޳ց$-WbLg$VKeC[p1VqJGZ,QPΗIHrD ,ܓFAjoN"kOFKe֍M"ѝx3DK M܍I5Y^ Z4Wt OfeBV7ؽb-hE&R}宆?Ύ_!Oc^4DS>yҴDu.T6VP3^QbP.@IT< L˺N9eQUek%Ģ4d&hoYF?Jb2WiyvUx4RLGQ{Εw 6wJ2Ҕ #QB&:ub]]rqwazne E ܣ:#.SY7p_oP *PZR6鬪y1oF*>2Q|d~5T0+̌h7%ҀJ@zސ8cij>"jŃ̠wLX.JsLS֥AM$"k̩6 *@,SqOKF!wSo̎׉t]Hu]QN*|DJJBvQy4x㿨΢"/EMI YU:0pX\s@) r^f)M5›`eɐof$i; H>&fAJ6})q\;IdP0Kѵ;^j _ߔf7I*$ègw t@"G xbN 5:AE]܁8n+zpy]^d:mлؒJ E`O}Gw9J֧3H8*ACo?mJgމvnB| J*HYт+3xd*l&P s?P^)gEh4[I7nNH1Vps\ γ"d:i@&/{t&SpDVܦtw%h%QM@ [vE_ < `ޗjBz.EΠ3EIj;ap 0TsĘY\AMrKXBEI "H-YQ MI90׽LIANBv R3F\ wU8z?]{a I͝D?*sGm?R$178@t=jAȌ|6bnIkw*ZV]=!J+0Ɍ+cER421 Sz tKblL#όV $3 qkOKIDOS>/NVlR86+4o-ΕƤB7TC ە١ r"Ȼ5=\7}i&VJh~J2CR e[[cµ~,VJgr\\hwq6ojZ0yow&/*ˮzhLb4|6z UV*G7V“=,='jK{9W Xiѳ8uo{J4bvC k3XKX|V_,55tJ\xzP@`ݨ%#%zxRRY,O.maEB_@_ztش}[(YR6E"O~}\mJquYG[o-wwp+-Ah-33h'y:Òku_ *+dwל&GfKa:μE\<|uM~5RXU=WoSCqWWAp 4][CUb V4c"uo9aAiyW.t͸ejBSh6b4˂YADFŽzN0^l)Vt7U;E'b(]ѵIԑtK..lߊ $_{D{{H~m^/rrij6ס j/pL=1eZ(@$aj]+.,'JԓDW_C&^'f h~w|]/.KPNZۉ˼_b8D;dRP3Z]ƣdRG tю /7 P7 X,<~){uqĎ[t *[f\Z@k 4bTX6U;b\`1n9& *% m\F*ky*2j~ 5"ftщOTՒ35`ᖥK}Ypș= ϻGC{ g#˛q#_Ɯ=+~ ﻉgMQnq~@(oIt'7-):'b`sP=QzV6^p~ k`o~ nB "yr*"]|iV,yexEV43ɧh&db7k*kpɿ'9H;7y`u4H/eqC~]T#NNT˜_v\0*[_|Je7Jcԩti[ 
}{HWf&EpѲͤ|T6f%iPCÞ(BF(J74%[~{G?BrHMx|SK0o=!J:m1`ۜ?Kv,m<*k=:֑.(NsTފn\:%b_t֯cuK_qǁb9fr!8UsE[uFzJV{e^zMb:^:S>TauZuyN0RZNiqo 3*5pNPN&M6.Mvj;=dx]IQWI4|&uB}.U|܏↑0MY4Z֘ &I.3ި7j }f 0^D)HBR# )bΟa*.cOIMj`G!1B76~HUteiZxۿ|ǷRf쯐djR5V[Πd_MoJ1[(KH_zAX *_ polܢ&%ADiiYaK@[ëXi&\r.ӆz&-CvM¯/ݚ6(FĕY;ѝcǾg@B 74EHk"x%|auXIs&R\o׻@jM&R!>//QqKX?&$ZYkԩ7s9 G{ "p(,gx)pSR CaҢSےŢJfy^CGs ȼҤ_YHp)NΎF|RǔXç;fFe*R,k/suo6Q̖"-ӢÖREnՕ8ecVŘДM@ɲlt>A, wƏGFNv$:i(2E}!9 uQD'/z83"x%7CZ ՝>ńϻ9B5)@%q? /x,s/m#3v'M6>'zl:e-L`ColsV6jb%WIـIE5NXa3$42ILJ=֦EBy._R,z-ޓBT &(hb%%+6YʌA3&rՑ By'KzS]n_,H5Z%u^kĿ%WB@JY[#x@-NFy~DnX<[+2݃$;K/aaZh8x^l-(HS]3C4 Hu35CgUN6,ě9}ɨJ$6+tp183EZ|˛wYɥggݵ|[9sdeu{T^kh }13>oCZq-|;^*#BS5ܥB>W`~_g!:Ku% q50B[/NEpr:|cv |tЅs!ҋ=~%J@eJ0t6 (֪? i\Ȁys*մV4q-,:m݄¿Li}ԙ D _%U$˱j c[kUiKyP ܧV0/A"a$j.(5ϦāW Y -K!0M:Bۘҳuˀ lLY#dgXG:?],uhkh< cm'%j_[>dMd\|`q[s Yχ z<ڪڴnlQh/$[S^) ϪC3\B'5 89enRUhwT:`]dU2CNWex65e'W-ÇiF .,km\p*އi32p ȠnidnaYUX=b >=.d{lBZ0b{C1+x Ap{M(4v1Wʝwzȩޭ\hDzo |@B@wOjYU2t>) (N} zՃn G#V(,I3)#Բq=oDsqݼ`@oZM8,CbS & [\EF˨)4$7KӅFhNJ䏊 gI?~lyQʦУ+g:~r1(zW0a|f+_.lYXJcc2S)VNQ0rϽV0M[z.{O3rlRrT]$+3dp+n7X v^8$\b2*N~ύfo{(:?o4wțxaiD "4QfBdЮW){U 9VxU9f$cC򞭝 i_V*ץ!Q-{c^_@-Y}c6"W^:AU|y/QX?$_wިƔsҏJadpŧ *% Tv*Ǜ2P>xzRȊP rq,䜦8YwXB {W`2"s> K2b䰀1 /O(zXa*pa ojb:RT d-#$P^ &u}\x>A'sYʳfD Yq«CWu W%*&){dh;{s.gr<\ deDpU!=+zv {ɶ(?YT0\r_ I?PJ(dΔheژAM]t$:PsEN> :DOGB\Ǝ9_y혁zsUeo{3'i k*g+&Sr|25B'w#퍓j3S*/(VC>sE2S"sSŒ!UpD $袬#I_:/yqˑ8~et7~&JSUGO"cMa 8HqwԲbC=C$7; /xkܼY@Y!%$??XPRp`= u(xk3BJjYI~~Ndyu8 Rx@  added to flag vignettes entitled % -gFxVo6 V`_X/)d-ѱH?P;R40,_wǻwxLYM%XgbaӺ.^:q$'4Zee*\U[j[)9ٴeedBX^J4oGD:c(N鲪e4WSߖ>N|{8}D+RtGJE0plIi][Z K )|d9!QT9rJ{jUh锪=e%?U6U]]_`},H}Y% *~ o=rʭ̻wsbBv6yC0} ^2s!CY%NBq#Du|BY(iGZalG+hhJP^f ֢Ţ7l1\4KQt2IP,hið&١n+o!{Ƕ(O3h##,T? e`)#1|8\Q0Y,KKK]o?>z#]gTou^<2ۮ(,AFo2Ed!-QVwqƽxN]ע>EL,&GۏM 2v.D]vyz.ׁ`YHOQQ'hRp2€mXW|;c8 &%7$,Jx ftzO+I0$)~]B)]"kPa<%R% <ϲ!AֵW #-Uv{=SV`{-zw-HWݘ! 
rѦA 5> 4Ai|P'd~$e|: cb8.>k^F\M?¡͛m/?.!RԖu{?鎩74UhP,N5n f[Tr>=e%>#=BS}}} 1yXp7/y[`p)ݪ) Ca%eU1i)ڇ"KpZ>{vq{(:s' J~-Ưh '4ǝZ#WIL'a:3|d9F`Y?ui!XS-lzi73Zcw^g[^s^`d 鋉z{h*x-\pyjV$??A#~ >E `Ix;vaJIG}bu}^e9~]y/EU`1]}/[300i W =+Mfk. l~eɺZ샗)M-};|MB74Ki\ju0(0SdQ;sRPFQPwfWPrq{7Ў/-{IK"z44b?Wש۵PkDotfן7DŽ!I ErXmxWQo6~9"gcV$mWg}Z6NgHqPlM,`1yx}ߑ \e#kyYgP[S`>8:ďER46KK++@%HekZܝp p{ۺ#p/0ՉP%dBcEڥqNjAvU- ãok]fCɯmVrqQ2GPbt\]J;<.-ЧX133&Ed[ʌS @t]{.ExK"W-T OzpB{ ^ZXI0y_"](L/__9pL?r*=Iu6SFkf-IO>l;aQ,IR39RX] S  ǰ-J }W+IhYEVs m)׭kZUZU0FA,(7Y,F(kٷ׍6n tK¼0f1sxr|*Q`O45hlF#Zɡ3BED$߰4y49ߘ_mHpY8+],v=pd{T]'>RCGp|x&>'w$-Abqh}{'zc՛|JsSbGGTm]As]JíVwHc+b2NoCDAg!iak k^Y2v] #."i, ;r[﵈/Vy[՗0.灉Œ,P̸Ij􄗶7p_21% $v Ӆ1XG yHk6W—+j5. 1 #lHdҹnzxmkNT |wMy?txU]o7|؞ Si)}1 Pw5òbw;Khr9;3;/mҦ)jգ2n蕍|ΒrtzMti'D.* ҘyS/@^Itb,PQ=~ Q7hE " .1k}y}<ȝb9ơej$s[-e%n:sSj8DL^Ǩlb1&BF[3X0ܤg ɫtK"2Ձ҂<)OM6FT:ưJq|ri2Ƶh\w:vi+V{Ds(s)iyv]& jt!|;^c>WGkt%b́7]jl kwKLq-oD ¶;hGy[ 1[L\0v7lj^NqɸP;xTU$E.o#̳;ȿq̞ mZ^ϬbLɰIY#/'ȸ8-)`(nLo^E @!IqׅHթ[@= Q 1rE웊藫9b@BgDzKNuXٸAT9#E`)ros\SUZ-b= ^ pzb Ajt~mp-p7Fue'[Mղg@M@Bzߨ"cІmmGV 2v"~"j:wu#< ~6#`Ej(('{V[1Vr>1W 6IGQRU6F+6aJ m̬?O?<1<T6qMU/Ntw}0Vq Wg"k[U@󇵘W_(`xW[o8~ϯ8S bl/i0hI-I9u I],aw>?yFUP';œgRmAZH "LokIIiop JZieCuۆ>ԦJ _۫7';娶Rx~:NxfMk [΋MBk/Dݘ{WHCԱ&sb@[{k9)ckxͷ҃i)ԡ1~{c$tUeHԚ;{Y.`>J}JbX}un~PX?F}6&:K15rl7ɐb :5N>uf-V:/p 3JujǤIк/I qCF?(4ѥQ?<( O/ayh\wx^{icMc&:Ksqqb8SlR(Kzi Q!/)xZy `8'w**yyXȵuAE%cgl_7bx C) 2nՁ߸歩CSt}C#@i2"e1Zy8/)2%tr1}`=()Kg}} _fo  rU)+C.Rd} a1T:$he[ d:N&c̃=lyeY >$ <W9go2~;Jc~OCm)xi 3x$[<&ړxQk45xVe#x1ZUY Ï]yyG%ahLi/s$%Ώ뙊 7ih4Fxm? SI'KNlp"0L֙GAMq$v`I=8NA),~L (Z#DGBa2v] {2vfNm&@GG?$*W. w#RɽP-Wđ&&Ee-%ۆzqK? 
hEW.y_]ǸHΐ;[Os=ZN=wc c#wcw*NZoؼD&Zb𧔞Ya-0}vF >ѢX9O \?E -5G_Ody#BORs*K52(ռ\$c8 v2Hp2)[*Gq0kE[fODx<̙֚e*A0gdYC&ri(a襯u$sr`J^?{ٛE@StB& U qrպ"UI=|l %T AAfsh'% WC점:);kցJIޔ\Fb+T:E@(O4od@彤НҥnGǦJPOwlYa'Ǟ_E`,ҕYAFt ;K\X7Q";#Ee;I 3(n""\ UѭAl;HzA5VwȠFo ~97<\S.tTֆoJ4)BS\<rcK)y_[uVmOF4t?,WV̫6 jX6!V^Q]+P|DEugʥAqq1Ta>/?\q,p"3z'*"X~m M ߹3FL,\No_m 3N<߹ry+|u ڠQZ޶Y}D}Zf 5ZMFic-.c\T.5آ(Y%WwQYgӁeq%dBw`irWTހQ]W6!m +XӥqGXO7?xt)el}{DLs q1l\9#ӌgPq_ #]k -,>?>֓$ e-+PpιvyqZJԇ '4JӋϮ9-w>fA{jqW5JÌ 1t#n\ ^"wacpˮ{_?y1x][q2LNҰL(C߰ fO8`0m,:S{h)a2VszhTFjN8=u@(KJl(pO0}s|/Z=FC+a*͌8ȓl ]QbMA,xp HI F% pF7w?&F%hAiuAL 9lT˳2k?, _-qphw6k2c#.ڽvYF[F'%F)fqbH. ϋ[g[46rIPe+ IWb4Z~/gڴb9VrTS,uF(I_~w7>[˜gKN+:rariU2IӇu=iT9*NәdzM]RRr0ҧDo)E$v U{N.Smw)1ȉڬ'6@nt.?zA^phFM<<<?O 7B1iç۩?16F%Rmcvh5my5ݎ~sѾW,8@м ~i+C@tI}!vG& XKeY_}]ա6GC[ T+;&w nezTDMBWy+<8vi:Xt#Z5HHZ(/܏4' ϩ%f!q HpGJa˜6s^IÌvHidI,~ }KF9w2Ml{yWB|ec[]  W(3x}Q?K@ҒEPEdh-.r\]hJA?B>G\r?޻T)QӉ쏽K]-.`pX-1ӷ*9& r郆x蠱guI%_aH\XJL0G1B`:k5Va;΂ڃޒw"6_6'p`iآ "(k\Ycv'_PGSq킇`EcHpE@CyP)hVbCkV'S[!fb-# a[B 艒Q9L:Q- 2BnLĂqu3tđΙ=LUE+h $kOT۠u'h[ KS]24:A8b )R v&b_ DB$J#i~sHNvm,9!{W:^eOBFfOr"+ԅމkJE&EZ.˔'eu~,;TgIדZlmEcd=&⋴,,oWXWn Yb+^NnE[>m&XĊY]/X`J׭gG_rzXYQg;HHPv_ [UqT9">R+z~]{*RӾ4>F5FQ2U;75WKguU)yzJRmUH44i$sՐb, 7z+Vs 2' ε:c Ljy0ġ(K![q _-HR?Y{H]x깭ҊVpͪCV iZ${E3RSQY ߊVUw/ mMi } E>B?4)bDMۑm)w 4MQ*3y%%A$z9;@?M6,p(Uz?9/o z^,ң]:Qn~t;vXQF08PT9&3USy2zKA]G@c:Hc >)\RlRю^wxhv=J ֜K!GHҲ"R\+qMu NJy`d73>Ss] &48&}Yh 0yUDHC4vt6 #KŎCݵh5{@8vɀOi'!"U Z(*F?BN1`|mzbIۉcs(C4tWmxMJT v%0z.9;WY믟?/׏wl_WWQN;@ x"xShیDc95&&!CXB^",cp.T#@¸q<~L_ޯcgmmR\[l;F4ol}91Ґ$0D}&џp4u kzE[v8: M}oj^8p<&Q޻yHwj_BLJ0>H3׻״0|y{C0O5UL Ei@hCHXh '8hx:X'Y:ł>H¯ΝctT)U_6屇 8A|}#<$s`u^q fjƾ3^UyR~f\4A8|Q? 
);az~f qƽIGpP@^o[xI=eH/~g;N߭@BHĠ ^Қ /ZoH@ d ՜gv1iqcᡟ cQG7d4b6F|¿],ecu"}nё"iZ cput$|vIs:f#;ΘDA'|Iw&Q'k?6rͷLVdVn",&C1żݮpf޸&DPC؍'.m>9>#'d$!ּ6Ǯy(bvh?+9QZ"?15qdoS͆}J6i}˰G-c%vo#\/ظ/ n!N"~ӠMhxTMO1WiR{.R9T*JȬgcx7hK7_o}-(W`(h-DR3yw."-kb 4P [ͷqw*I= qM݀HPJ j AEq [oQ`5 LO4>+(Jq8~|x% 3NYq.F6`Mk4T1Mʸn9tٵxWIPz0 %s>U2׆l6{ S=|.i =_L3.U>;_50Y‚D`;WfΎF".p!SàvS<C\VW6+Ǟ;WՀ,,3bzbFby,O*#9&^2Jrby"r郜,U3?t%W pUQ==g$9lp@]VwXe +SYhw_1d: Ci`ac4 /CR iqj!~ZQ*oNԲMF ~SWFM|KɤzS)#_DxS۪0|Wl\(2C } Qa[2|HZݕ-\rl3;#,Z,+ʒ[$\V2PXhbԙΖ2KkMҷ񭇤;X,`gc.L\0PG3W{Z ZyaQeQb-|qS>J17cj*Nw l$u<V%zyg;\}nGOI8/qE::hJrkiBtU?цǤW&ҡOȟ>z{$yqg Cc ϛ`( 2ji ͘xBA"aa?2'- *}4B{tyRZE[8*X]w0 lOBDi vY;9V axWQo6~*C-ckfi:X P ujmsHxHʢcg}&x>R'C879nkhXy˖| -^ &%Ti-mgR;!kQ 8 RWa*}+ {%wZmc%os h%7Pɭ8XTLՇf6Ӗ5H+ͭ;4YݚV[*,7 m#".3 \ Uvn[b|Z#{al){DIq!6Z ;*b0]lY֔\"V\]z%.P n<.Yռ˫Vɭ@Q )eXZt"5DJ@/k"wmZ[ah.jmiDcJbcwLD}Ǎ/X+]g&DeUEa҂5 Uxy bpZkxӪz ĕB#a"|Ik̦Ytݎ_9V~OI{c"~h#eZidrM쩴 Gc02xF}4{wb{dUhŜ7.ɳwL(dp8φj%^vJV^ҍ^sS`t ޾V *gxNa{gF4#p ޜv}1)Piabf^! |fqg>i!B̀j<| i.٘WGԁEoaH0pH킌Q f߯DI(IM=`@ޑ"S>WitƎ7(㨃= 3\d81D͎xtPa"K‘[o¶$WK,o[QTEWf8=s? u,1gW$YϚ4#ct>,NK;tԴo3\OV?ui|5s ZR\K:)PtשW"x#DQ X:BQ̣Lz?WBF+J?20M(,J?.XK O,L 9g }wX,Ǩ8(%tfi_q$IJ$&me%"NjeEtoکbvv}-*s1Ck7 5 |ma⽋Eޑ!Eݲ=%cAAO̳ϟ>MlVlrsqoq*8u:- Oڣ{*evvxŊٻ˳[zŕ:jdT B'+ǻVh۲,a,eUӇ~h,6)-RNn*AXtE|Oqvs,oHM"_#1 nxVn8}WLi ^%@~X$E &S4Ȥ+RΐI rxpf1`7 :c>22\u!J`UJmjLes8zv?#J8O0X*/LC&-gsJj]`f#"o ,(Kx0{2W q-LU]/51Uɏ51݂o .(-M8PwXp ㋏|t.XĉFJ[H[L+88",]M%Qh%oG!3#fu5ǩ͊(T0l9vH_.24{C1 m]8a}Igyå[*D?ꍅ[Zf7p/ &yt[yeԍXZbwXm!:2L:C YHNDbcϬK4;@cOP~c'fvv$ e ;d:땴Gp-|$FSJ9g%%?M̉D'J`g c=9^=PVl卤]b(kqyu_Ȭݝ7 5wֹ3 14R\Zf`-J2sR]k$ݢt=l=ā'؎)"ֽxZkw6_IC*j޺5MNs$qAqiXSJ~T}$v}vS 3zJ\zTLUU^ȍZ'dǃ_GU/J7[%ZWv1R^%VR')a+$\s&ӬLuu<ܨ(TepURϖgӕo.T^Β\֗]de!l\ .ܬӤʅ%0hj2XxFͫreVjTJln=5U$+m3AyQuIVzSG:Y!f=RݗM.!].:)VÎ9 dC&j]V5~Z&*:͊kXrCXha1zS,bM4[d4U:ʌ2KTf+N'iuB@%`S[!:CTQ*GftM/n,)Ӭו6 JRCT~E+~oYץN"< /kݱ~G!+]sm pvh@FtIuwrSf0SDf7! 
tǛ 7__wR6}ga`^LSƕԇqƟ't0@xTOڵ)l1l7B1vuPI=I֋U&HSB=]2OG]..[Yi& 5.DOt3(Ng  _Y觬x E<"v`q!~+Ng.(=G~sXy'g,a"9(HG:gxTۊ0}W ]HSBJo%Uq,"KFN͜sfvt6=^PD\$ZR=um鋮at2ޑqz:i-ePkp}Z[$ês A5-/E| U+z|tH8|~ũW8ۻw Eg[8ֳZ ưͥAMf@qPɱThGT3XʗDwj]5n7踜<Ƒ*7>0g8)ce TK78!*@U^; _߁cb%^/p4*XoL\x.B +GSmmm _#kII% TuΒԵg0m!8=AekC␨!s)(!*s,"37AAE\5k$I9\:XF$BWDGO</7U;5I3(0ߐ<p~% <͊"GDn%rdRpYR/-eЍM]AC@EPiJ `]$SW.gm!rku8ȇ|_L𠾸<%UbF-7RQs{o9eW>8sAVHPcB(ƬV^ZI<6+gKRX*U犼xMd~0Ċ6z[nd֣ylsԥt)2k=:~lwy);rpHnSt x}PAj0!ЂB@%nd "VlK:4܁mr3z7[5x]R]k@|_Blpc!b0!!Vӝ=B{U?/1C1`MP7^!Ӝ5Aw ֗jﱡ|QYdOp)A.fBA}[NJvi׮3v淋 c5AmfcdSMI!Т>ȴ$ ؖАqaπ΁ hD(Oƈ: ^IʣZ򆼶y[ g)Fu!qLG.;5 %RcUnACp;sDk]O,^v: QPhĠ̨I6ZmJ䡌B`BY!b$CTvVTTf"#'N#U%ja ە%3$~BVsoL|]M눯o91Bp|N؁4ֵExeSn0+ H@z!)P8@@+5M ʮ!;dMo>fgf7%P.O=9/HJ↾[Vxd'Jwdm Xbԑ2oKp.8>PXQH+"cTOߥjzϴxڐzHTq dZ7^g:ki 58$ʷݭVLL?+8 p,hͱ fHLS=ub$v)svAI "ة ʕ~'>tZzΤjzqY_[5Z(eql`nһȆb JǍZ."!JpI_HD T <\`s^-;WqM(ljdcsplYw<Xd)jl i*6Z͒%f @Cw=]O/7~xeQ]K0}ϯ8  5 Ys׆H2޴u*垜5^ȒW4xyi>JA X4ǼkZ]?[ 038 NOh&>lr#p[Mn#b=;W'S;h)ľߜzx Mؽgokϓ>;=ҧ7YxT]k0}ׯd:Ic{V(ct+{ YDl裉1RҮ{ؒ{3jc E qCr]3 wxX--jϽ2 ҙ`断5oq8| ٳ5owK^Vu&i:kqH,=]U|G/rJq`ҟ?<3pO|}`x7Mz%Z0]ΑNZuHgpYLEx9%-<.Aӻx;bCK<Bv[@/ ҡP"D[eJq,hz(ǪEjV5 b^+V7=a$- u"(Z)ܥ#$A y%5qKo dJ9ܦZVŰJ{ҠE r]{ም{]!&~T~IdG"X{FǚCGREuG. 
TnfiS⨖3A'9ATc'FZ,Tq E.L;lϗX܎.{?_ez1vt$=JO3OzVMjIQ,p2[z>&beժZ=`Ż% +x]Ak0C 4ą =J/$=ʬ4k%]{Gf<|I%p >brkc-|B{E2f8D%z9 fz_-sT^[ 9Nhcw4'nM!i.$Gb/,8n#g[e^Lo{SA.mHLouՂb, Qҩـ)h.Dj`KD:x:fJW8՜{Hd(6P&CԈl Gw.V™*NZܘDcD"W< xW~KGpt' /o!z$~'$ "!fŬ  ,,LI7(2O/zOGgg:?_>~*Voiڑ4%Ҭ9t:Vxp,I;ffvrd0 ޅ?eTT)B{1#Ĕ+E}lcT\'PXsmU@|4cT-F$[JTM]f&\|}UUME~D;kNVt;21'-<ոklf[GDgZn"-zKCxqV'~j~< `23&$GjJq%6mX#<$s.Eq CmKj8# ={䉍w;{`ڄ+ MKotS 9NxTM0W ^c)m)85eJXD$n0}#;f{OO8–q{B,tꆾy֙s(~[8&M{0NC9KtZg6}a֚Be6;u}Ǡr@f 4JTi[7Ƀڸ0"$w-dtQL'0$ 7YmLQ'𹭬OMq&a,rt.lt.D|H@&ĭmj];4XIcҭ;Q΁r$3@z 8@8h B3hJ ~3)o]QP`P?+D 6XGS] 0h~iB.՜vlHSŇlJLGZJץ rv(_Umi:k#6I@iIS`UNH48ӱ] l@ zSzt̢ի*AW&}b"%*fA9~Ȑ՚3N(J4Y~,1,PLl&%eG5z=2ƹ?56+0DxuRMo0 W ;amqZhG,yHMd|{|5|'CIhCOH  BIlHq 4cT6de (QԺvbcpFؠV"AMi-j'mFlwS'͸3\WD-J~=?}oTˏ~u}+O}hH!i}X橠E 1}؂u (SDbZIi%:¿dP3QhNc"8MFn+xQŎlfKaqb3:Tj2}RM&aqyOl c 9;p_塋-j_{s9sW:ȎpYOp4=-fbmOXtgQ #ߪ0Z5ݡODO*gE'Bm:[>Oe8(= O'v3sBo,W=H|R7ӗ|9ksY7Nx}TMk0W )]Hm=zjr(Msۋlmd1$wK=Y̛yf-{YQ3ws#"1ieU6lYҖ~'Uʑzi kV$iQ C .buMǞ]u9K<.0 )SyVZjq@o)s^šuPm tQc`K-:S Ț?= `kҶp* . dx9K tm|f%_^BEopLD-jr+=K ѽE"]_4gPG'F9ҦLq+b{9 R8!1$@?gY>f%ip(؏rT]A=j#A6E7 Ƀ\ lr0MGݦNӲE <ٟ*!:MBf-.坁.<:[iԤ,!#7T=(jXRk9jBp(̕EPF0N۰{,)oޕ/~2.>WBa |Cs1w]+9,;tк).4ϻ~#VlqVxTMk0W .,] ^E&6d$91;c;7{4i=/ED{mr_A9.*V%gǠ8Du6mBadeEm])}܊C"|:l8 AxRm΃ۜsbFu6y8yXAJM`ni> @l*<=$G y}_8JG,[ӵWBFCfr ;;VMyuseqʕF߸Fww%,Eu4CxWB қr>'!@z=9q3mzmiy/\Ka.YB+O llᳶӞ*Il['#L #4ƍM<\>|ӺӁt!CMhT5nq;ApR5h]~\Lw>AS@=캖u6Qg e7H.46^GY$$ҦLuZ[ᛔi%NDL_$R{6K z37 &oiF@oˎwu`؆#(Ts@W)L`^6c~* Q9}m?iWtHMY\$9!DfAW֎/u`hwSEr z1bsDxmSMk0W z @)eLYkl%lޑm`y3o-|%CI8ٗsOH  B hW6de (t}p QR?D^:ʢA'r;TД>K BZI豧$+i-N c^~S$hLsRNO<nXKuk/d ::TpRZ_H׎3w/s >:g#N* +Z>tNG|TPAr)~]x2DŽ^9Ceq(Ϩ# dàL_8f 쉮6$vvFG}^&GCQUFP貗@XX,kql''ۚΙ9LaQCzE6I:.F:jW%xåqˀ1*OUATj`<=H&YjH k\Q*KYMa^Gd)' -}m ~KaLHQ/fs3yMXubf\Q6ܾmݟ@|ڛ[_נc׍Ja\"@nDW)԰CdYWa>46_ MDACP˜:̬qFڶz,|e9* '++ -˄ejy01`ܲryײuv0zk1AdKJL= B{~{oT=]1ScM3XJ:nx 7_cm+? Ee;sR^:.!LJݳ&xPMk0 WBa- v\)j"l]kN=5%B'ԑ}ހv`ij! 
ծYAag-<ک@)j4 6YhRX ,^NS*c@n0*cț~ݍ}/V4$UА2<5SھW:`4OjrgہfO8azK'I"ԻdpYf >]b/#pg^o3-M-6/ר< 5?<*xAk@+@GMpZ[M`ڱh+vfE+ۇ7y-L<|Ҵ4`7q$ܺtӋ3i/f9a/xuRMk0W @ ^z S{KHs4,֬-li4"Y{%{lB{{o|~#BNv}{XB 贺0Іj%'(Ɲ=iUR9>Z Ϡ(2 n2ԭKyj70=|bTklZӯBiGGge5qG6U(k8M tQadw)&tʻ=={|Ov#‡nkY IsA|t*/mIO(C!fƼ~A۱Nƍ Cs'/ޮe5mE2ԧ>8GȯC<KVE,k_#]^dw8.xVQo8 ~ < HX˶+[4} (6c %$' o?RVҴKG20u,PBAiKPy iF儓ZTH5:f5.]D+%jZk.MD%}y䤫́JgtMJvh]vz l7'v}iD@LWtbk8tx1PSmú{ws2G۠Z^eh- XFr [km;LiE*Ŀ} _q+ʱp%*9vwO䰏Pъa^)6/`Hr4 ҹe&JNdUIM]VMpgnќYO :È͇@NC%z.D63a]Dm$xrcŒ.X*vLGTFH^3b ΨEݴGG X$,#--|=~Yϗ?SȒ9)) (wQDœC 6ʘ^ OB6H5@8ֺAAwb=y=9]>C)TRj3.[…4<,˒bLP 6 r J99ҙVgh`o|L򂀐Y|[Q,i>HE<"OqJ*I2JLRFס^EKZ;$gJsn?>=?>M>.fO)w#GBpFE׽7CexrM rݜcu1 d&rÀj+v=ɱ.͑L捻~/Q7v˵M;~S(9ѠQLѲǗ8,;F7O7{'~(*7NzT_Z>93OX}&rhhҪhWM]9xmRM0W z\Х-fN-lIf$ c'$l.gޛ7<1&2;cM#>$ c4z?a=8 1TnTQ"m8yUѝl(?~?=D)[eYM\Ĭ*eܣn% EͶ ]F;S3UZ%RrYf1 LҭQ36!c%ֳ߳'{(zTfųv.Vb Rn\r:vER%GC CZqM!RX׺(~O# ϓ )2Yسzil- LWsp }'ȡ'"nя!=oq\)wnGy:ĭ娦vZ:5{dN+ Qx~eBS^jyq|nJ-WLbNfbquiqjpbPkE^l= B KD~2[cyb-w. ɟ݉M`GZlm Y[ 22֨tr;bSJO%*]^4o]zdo0=18^|(#nKaFtc;{0Up*llC }hF3@pKUxmP=k0+@ ]:dr-RE}Ɓ ;'=U_"B,j 2 h/!1FW$S8n6%m/- Ƙ*8JLS@g3];'yhcZlJ/eEι'(|x9R7~Lacpb6sd+b.&@:vk༾ԚJ,tL⟬Fw |w ((ǜ6xJ0 TtWNыe",*`/ l0MJw7I+ ^&vs FoZ .T1 ^el*T:wo­' >5PB$uRP;pV6قP;FPJ]B$vK9zU(+5=bXGOoebpxG"II!V>~y7X6Ku>QJ=v1i~1\gJ _,Ofo_{2faarsAhJod3@G2]!C[QzMdЪ߷r6xW6h}?N9%KiPօn_(s;/!XD*ƭ|#OOia-/~L@{2x*.M,(.ix=ks8+pw%e9VZNr;55HHETj]ݟ_rHmK\M*Int7 F~1R鳠aȼ1!L'?K;UgTj@0A>[FhrhWsRlWm35ZƶQOcX8)׍=cw1qd)\>>O[`3n>Wo5Osϥ󥚶՝jQm6 K9( _=oAih-BQ`(9yIhB 5zuGݒyE>{ܗKqYQĸHm֔l} v4f-ٌFf gA"",l"A j^mAoN.v+4{Fc/͒"L=lfFkꯊtf2ZR3AS͚lZ߮6Egh"bNLm>R,if(lc5clm@8hq2>mbL\pU*bIN0@bilu0lEf&4H˭ckI򌅓dtr,"9PkJ*"'Wjȫ6O=zHkdZ.2")kC;>Áа)5.nn.>(`{v6m4r>`E5=ҝq mƄ%Rz>B!0>}ѩ`xOaqp肰XM"1!"i*:5&?Sr}c lrmр[Sq0vM C:q-նE8q5qoM~8p@w"Q?MM#}}w%Q7]ѼaF7ZW,tډ !<0ɼ!3I^m{ #HH!^/!ǚuq!9|%WwC0/6lU, y2qD`{VK(RG<jvNb 8 hLuy#+0XĝEy mtp+J)q=g!@&ZtBTIwtq~\@AÆG^"͟?㍿} $sa!@Ru NLLBe!!~m6z{&$Gicr"sô ɜM(ؔ `ch -;L׽[$C3F څdf8XEյz23F bsҀ5?`w>Bׯ~^j37V`z{;uHMj۵F_VYy6N:xh}ǭeQ;ͼɬ7#^)Ypgf7RĠ[qFdYcF`IM"+WHĒ~\8H{ J#c"WB)nUW5ʝ_\C(1_ 
C*֪MA<$^4`ZAzqEӑݣe?%H(gT`? !]DƊbAKID7ȌCxB{in$``F|L"tE1VI-@()R$LM05ǽ5;M 2}(Dl+2uØl>!19Ġ 708U\?r9TryurqN0)6UpP:`r]tGN5"$D qGT$[h0̐YC s3L9C 2\յ|Z}ĭj "32i1kxC`:stCNrPh=J~K ?Ȣ1}J"$UC~al{gC5T! |ِșĀv)GI<{;1ap^0q'?8 $P=LxNZ_"2C ~ޑ PS z}1nPC,D=U{Ós?BJj<6Sc^AS^k/ŵV8ةz#2`nN87X cd~-FQ$[F:zE\SoD1@8y3U`ŨPgfdoj:^>YxtjBhpj(GQّ`Fwy孎 70>t\!Ґ쓫-vièq+r).5ޅ;::x&Z񾖍A#zwAc/h/lL'  ;Iy(牙ۨ`~(ƍ;V&o~lX:8N./:NNN񩉝POH)Q„Fv!Y\C dZ6 j ^"'WvMaX9U/Esr yr4j/l_Uw^`:j*Wۼs_ (Y28SZ[/F.!ey2mr"膶|8fAh( =T.Q]% \a,ݢ,#Yei|Q ha1'LH_zjSӨ _N"ixoN(rs?;160C45pc`~Tל[L>5K\^玘2o*XDۧ^ZMȎZ97 ?%+v~V7\_0Z-VhQ= OD˵SqCHwҿ}rG9gret<2"V ~)u0fe4% wL; Q"O "{?t0‰[baϕ1ncG#tzgŠ̓lV,gRz鹣?x_"yԥ"7A)>8V#:Í_cV =7>9ZȓHVTݙBh&}H&VXcغSXWY6[D;8ࣽbلzV.A--R}M;>ixq8#|QX-0X ߚ8&,A-}PSɣ UWBҭ ɏ_OumRr(haH>i!mزD-Oe,p}l-#[513[SOUC8fb3r1|>(o~<;;S`=-JT+t1  ifo$߈?u>e~ 0҇D}&<]h frk:We 1a8&VmVꍡBݵrP!o!*v7XEh.(լ l&jHXHI(A,]'9@W'l("FzτLlg2Hh]NJe&Դ& AZ曨T,)RR/C$( r/$U۵ ǝr449LD 0g9 P' ً׈L<:+olVBOHj0[lڳ1O~Sl п?rPE5jrf(~t['M~MhEDfITaG]уLә*9$SR܉\Oę[~o0cc)( Z4:.Gti ]1nHQ|(y h0MKu O7kڲd-$+ګh\(/$Ё־B$KeCXb#l g`9fik`U%l/X~E_W4 )%d)I@)Qrk/0<,Pz[`~i*a)<ݑ;ّ[vH/Bʛ!3Lc`dMI ~_ :T8Z7WC.>c(jDϋȼZPP7i}GL]QefuKt x,J읖A1&RQI\R>rHʋ V-H9Ȑk1ؕd_YxWbeI eP@"Z 24@G[b8D2be iH6,7H|.8e6 `쉙x 7ّߔ,ˋ Q ^>?N9Ł磡f<ea@yUՒz9rq Fx.6r?19zOV%miӶ{PO|U+-8@rd,CSnb D+x 7cVIru׾ezOH )}<_yDAְZΐ#j-u!ԍzh|ܲ5ŗ}g*U%P7ș ML`K`Z'z:lW,m;%yxa77#omW UBwT6jٹxIK\ڬ dB2^ =tBޟ$[^7辝K wxM 3oަ6" f7xد_ DCkq+@}z(2@Nk#U3qa"ED_)Ppnl(P=PުS3/L>Srxz5S#<5`xeWH/cc$N\![?&a\zCCu9ӪQ,-]/.ݑoyuc_kEt_Z)q\gHyFxyNw 1JcX`@z{u'̰TS#x,rxj6raST|*LrWA~; vn?s~ C|Ui)rܔOƌE+ÍVhr©xx\b7W_M}w:xkf|-m-epl1ۏw"V!=.pxS\G^ZrM-zW0qzW+c gTL"tIwEЅkMߣWT{AgVyuNG*N 0n?{F 0 k#> / [`Tpב;WƂ 9ٓC20 ]?\A`l.V;5g`M/=?>:4}y3^>mL. ]}ҋ7.#&&w5:ɎP,5TTҪsSo2lCaKm;@#-Q"2ojR=ܷ9Zop-9WdK| oEwBhf=Ut C<8=r7ӥ-[>iC A4شX(߱:o>eqɠ˵\*d/AiSǨE,PmsB?%XV! 
k 3AqRtz˽@`@)T!k^3~+SVϤGdナ657׽o醷gw<(F;﬏O" cՌt)0,#ݽ'E׏]1|:隶˼ݱ{tͱDP#pApϳ-$_o5NjN WH_b#zx"*2LN,H^I+=cTE]*o,U 36e>:P,%0Bky w|7eˋzEȑ!o|DStJ3v{/@խd_8@^,/6 xSK)MIU ˰RFpSR2RA\ yE \ E%Ey A~ɩA%E\\xL*J,HI-+)H,rSs44 ѹxL*J,(I-.)H,3R5RRJs4JxWmS8\?A(8@J9 ]ۙ~QlQc[I v%I ӒK"v}>V2Y 4<`:F D까P,YQOߜ|e hA3R:ӝXH˙0d)V  N=*_i1|5Z@UԴEłPvnsz.L67*UZ#77u0"ʪXhpJUgZxsu~I\UY >~2LwW HM77kZ}//owez>8YӢ!Vq@!"a ǘ[J"^4Fт2HLF|;SR0.Zd"5 23*2Y`8 A4:MG?|@D_|<0|$Ө}5/O+r!"W nT|qgEVgr!"nWfq+q'V(ȍ]OŸ]4OOD,dXb;?ŒG/ zF(vT3uC*fxaFbs`F8'x8#ffc4۷#CJ"byLS8.ټT`BjhpNiM4>8zc1Af5` Zw` ]2'F"<k+OVTd\Ҷ6ܚ;;JWXE#nh$4 ~m ;Nڍv[G7 $nc?ol Dt%IF^ш2ɡC ߤ2 h> qc (/0LGRky90I!#(8\f CkmfH͜TVT!cÙXoܔTh;`d PHR+. -W"a+U i$MKʼnRIaMQMJ)b?0?FuA^6"t4v u,Ң2W6*sȒXMRCn󂥛}Id,/=UX}G&'oZ'%w;᝾\[4Y{, \pmx%F:*@qŴ5,dWйqHJxy[EhBc !g U$љst~V': G[P6٥^pvpPd}XmGing(e^nY!vyмx ;Z1BVojF7fFLUˍGk#W٣e ' F=CЍb9T`LAD xm1 0F#84Ew7uR#mH!񿛢{C$yw*Kq,d=kK--&‘ H/o`uW%_;JkQ!N-5]eaʀsp#ideŒ$P jiz,;[<`GQxWYo8~ >} @6[dhidH츋ΐ:919o'JZxaUqka?V+()!BIDɫẀU,U{9I4p dXVI,,3an ]ӗL-)RY0vK{a8rQJtщLD0W-+,5'a&lu{uQW>> !X%s!!;`oRƷpN%xsPtn&Jqe lt~&Y<]_^j(VCX߶d:F@i26֖rf*pX4)KP]|>2~wwO˿n5j.u: YŎ_ F377gre7fo(E°,RNd+ !ň Gedh.CLdCzݪkcBĸtD OB05z=YC*b%o`2L~U,爗\;-A q7I(ߝM&ߣu b*O&/pD" Scg*h=`ǂ< uĿ[0ښVMG Hx5VEq:ζZ+>)>=i .3rv\LJ{4* 4\9\K*Cks,VyJK eض6X.ZUɕaX>Ƣ3eDݶwc)xeUIC]Z/Aͩ_+_E(4:UQZG$&qGSƿbƒn-yA- fB,LLF a"$ [N!&1W2tDl'.cAϟ,˗)JޞүTmQĦCx#n9_.>NLyh!n-vET9tObb3aT:2&Y# :0p3")9\ܑɵ6=@EH4"vm+dz`ԫZo3||fP'=\ɔŹ(tO<,ƣvԛ@ϻ1|P=w:qBӔNVB\7(sut&$foF=ԫRqڮC=<=;1XbHRfJpXyC<5\j!̫+ӵQ]z4F3͇;UyiWX *Gep ոo=b CUkn Ej'Yg,΀uQ_fC}]ĝc5Ƈud ՝-=@;i.:|ǂ8-L>k㷴qU%c߸d*SQV&"VĢ>Wšpf+[u=a56?f):aB4ՈU,"e* Y4I[AbVTd:(,>%_@R$# z941^3יڮk>}?xUM0!X-8BP&Ƭc!;c;iݴfE3潗](ipoYewwYђypšmQ4kkЏ>^%nRB J,e3[-hÌ21'a;j@Yq WR`Bphrcq&#U>QK|f+RwK/);3+8YZɛVuI%*Z9,xp"MEQWdhna9FdCE`hlO@L^){\"jX [Kʵ#Rβe7g!{֧;7hi8=iys q}R4 aqBipӸ^Wڙ8:''B%q |TV^;+9`ש.Mx8!u<~c#DDߘc i>L$]M.`–'H[򎨊CNO`V#]$bN˃Ir *խ`?#>٨jhaTˋ܃ͳ ܜc:ҒSq3/VdȖ OYtf# YxSMO0 WXiT8BB0Sz4&U4mGCBTՎ{~s GWUm,(C 0=@XԒCdC@(jRx(#P*<ڍcdx-e`*J湩jG)dL "]|i9pk'G=Ϋzh`5:XSA/ g@4ꞁ]Y˂Pq: 逳XȎvŕR(O}6OcJG.v< ᆮ9GI7B=-I} 
=8_PrLs](Yd`sEW~3HhTWowxԏGM 󶿇Z0Ƈ2 .0ktK/Ʉ:o['[+m4K[O:cp g׫/LL''s|z/xVQ@~V6{ >>"yɴnw&W9Nrək㶖x@fv@GWyVLYU w3g+)ZZ57L% |vu?0$${\o2yEѿ B-k\w-lIbʸ]@,&=\:I,Gz6B+nK؀`"Q6Ctfbu_: W-ҮTey19YԶQQCy,w~jϕzv8wgc"&.x?iC<>bvSZHHɒwI$FƎ*  [V+ET$TTr),zb1ȢD}x;BOz8>R2;K/kn4ݑФ#GD@q#m!/8xx}1k0w`+ӵ4C( xhTb ˒*c^qI޻tp"H>k  Q;,#Ԑi ى&wх.t!$&`H{0ż,@/v6a"OU5#O9(r7q!U~xꣶbAX" 't} paǽZmyk|:)'+hSLnʘ8 J"n/qJ7a}}8Ʌnq? !WM/H"ъa˾BLxTMo@WCDHC$JRc{] Ǝ"q~7ͼuaM]H-oZcr6 CM5LHAuPPw T+S5x!C#ZhQ2ѣN07q7 };:C%+}Ji*r%yhhp>(8*ǚoe s\^/7_u<!ҚErκg"m4evPvYa}mkx AYz\Iu-j14B BP*𧣢w^=LJoki)$Ru5 9ڧכ Y2g 2K)(/mmcd^-n9Qu V&rL&i\ +f6/Yd]gTI@U#-iڠ%Rz|Լtѱ5ÄNF,&[ O_m2/h7$gK>'CI25$BxX]o6} 1@ay膬- ÚlM_[Dd%i;]{IQ-Yvȇx//9#r^+~x4r`] ?/KX0ƀpL$c_G>[ 췷lUIR: L&qW._bvƧW99Fۀ7bq%n\j3Xps-}%3Vaar-IǾ?`2( (9MUM w"><Ɨ!-$R ]XUYau*bXl#8Q`1e{2VЫAۇ2"DC#&k^bo̓ ϶yQpz+p:l\{βK<h8e*կ1EO s~r:18ME;epf >U< AGF\c޸fmvz.7쵆 n6`ِNf Gw[c3|D=tM 3u;l]8;n$Ds-{q3PѮ"W"݄cI7\}!kvxN@ VN }%Hh Ӊ6;N=Fl \*rۿĒ<;#8`ZZƮ  @_h\ J`wiC7}w5by ֦,uJOgE Ǣ:8Ttê7k,+^Po3[Dv؀['GA8Z328i5U/ QSKec52z}&p1<-Q\]{IUUړ%GuZcV -8JH[ 2Pw b^,&լ'76 bOn|oٖŰ]X:OzØo6Xχj>ejCD@Z >}`i0c i!ZZxSN0 )D*m{8pPvN:vut);L5$_4R}}7ow鋧4^fvX!}]a$u,'c2sR FcSS8]fCqG0Lf1ӏg3$‹n .TxWmo6_qpVDT iZ4IѠ$,IN_jV`=wn5Λ,LbkIT.E@ q+4Bc/˒!2:1^<ؘQ\7)!( kz.r HN#ɟR Z,vIGWNYmwgLaOuEwB;=lg dE8s^iN8z9V/ty׳6/BO# 'O*?7oRP+{2iG@z xG1eD{[EΨ$2Nrdh1w|aސ7,eZ+d2{P=*L?Ek_J>/͔i/̸$w/C-4/hIJ9&}T v9SP&'q՝TMOo՟Z4A v6x}-QQƮG!`SvĻzm5vHؐ5HW;$E;-> &o M| Q osla*vx3تŕ,٢k~L/5mu.\m2N>]cث'w& ؘ1Ol, ޑ\Ꜩ1:!7`?7eH4.ɉ?ώ/8"C92f萿xA EbJU36N Pzm|Ihk2dz 2X&# *5$0SЖ1Vfܘj ;,4qXh}t(RS+eٺ=c]5fpxUkk@_qI&ݢ_bAQт/||*%&w7fgME;dRA Kw瞹s̔^J=d+M:͒ҤkA Tnmң&,-W-<%gU\d-Hj%,?A ޞ^sMp1iY)PycN2d\[y06o5QNҭk]5x}(1~g?{9[Ӄ23cϼu$Q+5@iYhu}}HXXp(8f亳,h5ɿgn&b%|JͦYS@ϭf*7̊ʨ1oܕΰz{8&k٠|`KW!̝ul:fln 5VXն|o48jӍ6_oxz?-ʰ:_2*iƕz[q k(roOlD]X+?6mcnȳjg..0.9㐋8bϛGQS]Z[CionkA kB6aiP~j} ے4\bT3&')[{l6ڇ,ɰ)%mn ‡4#P 1HN57ZgY@,u}/=5JzfWlu{F!kkb:;l,P"S5ۡw;EHE@c1Ldlf HޠeN?M6 *&t4*9FSE`H_ܶI(wDئ+y#GKDizn#72Pgchhd}۴~bQWIMe>Jb;_ *~ 9vu%SstMHcf5Ey1 R oAo#442áwq(.=:zh0f`s8xj0 "@ebe}nPۥ#OMz߷: qگC^{)7$G ĩ,[Z} ֝ܖ NN,y%cQs&/#Tn|/*: 
HSp"]C3κđm,״<Ύ\i@c@Gו)A)钨fױHI2H+zbxMsU/Jutqq@(L6 xi:H7HPl R.25 5\8ΘF |G9^ńe]<٘Y0xaBm^p8Szne[wlNjޝO ,MЮ=%?elRoL0 6ܻ$3w.B˴R/(^pt؟hOK""GcC~[XίĿ z6Ffo2{עi1x̹?yi)Ө&>e\ xVKO1WRBjZNku&coҊޱwC)3;4AFrAkU*m2ӛ,jђ"j.e&d".2x!aˡuC7e!KS#0x=I;9l iW=NIdڃ) S>>x&KAN$L~l3X<&cQlppq CXdM0XL!r118z\e .vSk}G7 5 BQц3,--(z"%|C{=:Jz>hL $mvDܣ\i(-~6bZWˀ-Usc*lWyfl7w5T) &,*D:p~KoYD4P;Px68:59[Ghp[ܶ^Q_Gi2tex:+̂LR̡-РM'c&1U3Bd$.,wP TF-sEg%s& ']Ht_o{Zu xWOH~_1T["ѾڞT*hu'!-aﺻkMB'/ޝo_ɬ /t9NGW5B& OOA⋺e납RaI|%\6R9c?eәܣV6.O~.T[W*t(09Έ*d .OvNwh9ZNp{S d20=Õ7ه=f 0 %}i =e9;U~$5SXTQ' 5֋-D 3 q?:"-.6s  )h$Hd|2 ]>>WV0(2K$~+7 D0\<۔)[;Mӽul`)%u|!#gF^1.m6bN(A\(lsx͝*[ Lh-m1r̤KtkxH`wN_V pK=ӽzx{\pYPQ7k:8ƥdS >jt3Q|<{-\;cy\d6㠹=%vZ/=/nns.SjﵹLNۜhھ-m'+Rǘ^H8Cڤs }9YjDm6L8]l:>ɾkDZ*yB}LhlA.UOkOZǤY-ox8j~Z'n]8Ao[l ٥jb3:}Wg&s pr%[9HC\d^ZR/z-ed&O"؄Qy]R{\Iam&L7\oKOeE\=(9I"AYHS|D4Uw f ָˣiW>iIS6ȭmgITZ3[A1a;4 D$鮧elW}5P(ހ )"EeIԶ%b|feU:~ʷac IGs'&6j%lZB]/\>8Աu?p\a28[fcɸ=9޴UED,-lkPvf뿲劳xN!<Ngzһ1m<6cnbffVrKgojs XFh7ps"#2z$^#)'ՅS.+Z;ԑZ*4ݼpSjnGluWucybecc{jٳm3)̔+( Ms"['?I]k#% _r+2xRN0 )L)J H0sw&%q&Ļtv8ߟY=|m;H,# TQ~te5x4`w!/+.YmC,<A"j|I6 |x!J0U`"간w U#k@\p}(ZY{T'T,MIx E9 G1$f2%{VZs!= ;֜r7"?Uϑsy`&^CYURdWi,caj?j ="?g8!:taR3A[6Ѣ `qk΁8@x-J}e bxݓN@<ń^ Azl=5p ZإP5wwSeVLm#SZX㒲jA0%\ RƲF;c揷Iw:fR[[2,4oe#0$;s=ώE,|e]۰-VNک37 o+vCG<[ Sq:Cic[/+]v>#`xVo0~_qMri&46:*/V;v$[ҭc>_;ˢk1Sh-(N%¢?LWsjz{,J@Ls38oL!*UԿ-H$w#zdEwfv2 .S2PzKh^<<~ZL7^ٺֺOM+|BVg7Zga*$M?=Uw²ǬVo,Cf5mL/)2:s6ބJ*Tl|PZQsݤF"Vx4 fZ变 ʥdiS=הg5FhŢB [G1qmSq"KT̅MyjSl@*={v?I5 n6R;O.ay@[v?@0CI c*G O@De '2OM,ȸ>_I+VHHeM1YN ,HYGl@Yh>_v jrr85..8Q^}g}m{\JC.HqUkuZX<ʱ.PvBiH ]LaHGRRuB~aƉv`9-I &q&NI.\ H@Luh՝HCXJCDVXMXܠYyAW5nD+0GSFN>H~s+U<wXW76xxQ: 19;1=J$9#59۵(1-3'uĉ)\ @WҷsxɸQ$ 19;1=J$$8%$qĉ1 hxxQ2 19;1=J$9?''$յ(qĴD"$"{8x;xqcRqj^De0Sx;xqofqqf^DIzETsx;xQ4 19;1=J$%$'r2~ eJSxz.%x;xQ( 19;1=J$#??x&~ Kx;xQ, 19;1=J$'?1#??x& Vux;xQ: 19;1=J$$,3=/r*)( }:xK(/*H_xSVWpT+MJ-RVWpH(/*JTU0b(x ,V<ԊD̜T6xKRU0> xA 0D9ſEݵ( n*)C:%z~=n )^$xHȤB@p$'6s[v%^%6ѷ HV>rGsR^i[BSxK(/* H,)I-PыU}zxSVWpK)MIUH+JTU0Sv 7xKTU0G xA 0E9\. 
EJC2$CLIFO#=TufI569 *p!p(+g+S1[ʵqͅ-*KAJ7t>pݛ0|Bt4wm/-l@AfxøQ0 19;1=J$%$q2q 8xxQ0 19;1=J$#5`BPRH/RHI-+)V*ћxK(/*(,NI,IOs5qx`fNG]%$Eh`QffVL{gZZ ) \*bcV`n8$&x`fNG]')Ln00`PE {ҍz >@"k7xKRU0#HxMkA +rPJ,Hl ;Y2{s4bO,j23[nY$W0#AG_e橇HV&uGN4Ֆߝ˵{gkCm6Fci9$eޑ'k{rQj|.YQ土XTWW:a9xkc:(jPZ\ⓟY4Qjs~NNb P)J\p7j'xSK)MIU ˰RFpSR2RA\ y99%% \ @PZRZSjU؈jjS(oxN 07`EEC"=&!9k}o fUKj3$0awPuA8lXznFU6D)R_܇T~`7ЍHNK^]<»MyB)8,Fb$.2 ?N}0Tx+-NuL(I-.q J.(J(/*(r3RsrrRP%@:KJ2JlM.~5>x J.(p(/*) )x*J.(Hɏ//IQUH+K.ЬsNPPBW`TZ\⒓TUm۰x J.(p(/*)25;xMo0 D-%(֢C4LBd%{A>Jء>Cz>GB5gPPWր@m&/1N`Kh\:pקۛpwʕYmH:z0ȶ]w-*:ml>O€[:kd QlSH[ Ϥ53JLN{|p(@X$:\OZׇr15;;섣%{xtZɚVȼ׉|i2ńH*G:N:O|@7Q_¯d $h(]4O4O_nWv)K+USHK[T۠(%mKӠ,S(Lh_ꚷ 6CJDD :% @+J<8Ŋ%qm \>ǭJ4RiE\݄&I %?̬*5L>"g ?wj:մ̲/4 x%M @὿2EZdTJDk3̽Fa7BAݷڣbm{` 48g=OLN` kg尯2 :4Dwy妨4׎A !sYU4O?;q4x]@ aBZi&˲"e:fct!??|F[@۶ZWCZK!7w* L:b*`1Y+R6Pky c9Ia|#di<͂]/Q'LȞZvUy}{Q9gaTd*(-l~k9'x}Ok0C6ie(%韋.C{sOFwo'mHzofr YMSovvl۟{{[DV;—H 7_׏S nfhpO"㢹PBȐ ;Ќq.8B6= ]Y2%A00B5cG7ۂmlQs%G>(H_4g8JJF|',3:q *W'2Y?eZOᎨ\KC$m04izIO*ľ>r+契*𩶡[gP Æ9Wuunk{~Hg xK @s " J\D"N|tx{'xoSdGAIsʦgA#,x i9pi)J)4]ڙ3 fS.{W'gq })r2')˒M FexKSUH+K.TRP0UR04 b JSH-*/RTɓxSVpOK-J,IMQHT(ʯLO3R033дRHW/QHM,)HKJ(/*H"M.s@ xEA 0D=. {Aэ$?3!AzwS, 71JV. pLǦk0΂XRdA%ڕϢZyoSpMf婤.+2iO?3y/_`h::cxE1 0 Ew/d tұȘōĐXK봅})h!u Aum͹0< 8فQ{:gʖԦ:=9(N&-EMFNQjUFbܲY#aIaH` ]<<nNgC+FylYz rC}SNxuα 0=Oq hEptu.19i6"}wZ?~G˽kҀp-w&;O[. ϶ |5i|(U#R}`$&!4ә| G6an*ͅªMMc4Rfi:?pU/UxM-.NLOPQQHO-)OQ4|1ű/x}OK1UpuAATRn &%;I ?'M>|L5Lm?&ƶ-{m1SDJw©$zO3~ ~ U5VӋ=h K+Ԫ ^mLہ,' ^[`24F pa4 :tT 9 c&2ĩQq4w$ZlchV{gW >y9RFzYǝ;7b{)/iw&N`["_lkjLjG^u+#XȌ%eQ꺘K[dICbb_\ xO 0+ 걈(zPx_%YGIofӡRBE ճ|Ip$I>M-0eղXܵ:Q1'Zgq)7pvN،dM {uERA廵urF\N5"xQ. 19;1=J$7(;%t~&+ulb]мtgE#6Vy^ly,iا: $m:u]Hhour̕)bQ5bq>|,XAtxK(/*qI,.N-H%`aM. %x+N-qI,.POQ%E%: EEũy%%yE%ELxPJ@W@YHE+#޷it*~6o:0̼=+ R/\?zq825" 8I-|8w!A"GCs] nLi_C8g_G+x4x@c[4,_{GRRO &T R=uyQszLI0W:m܏$|g/ub+wۓGn{#3kVNLx-/*qI,.N-v+(I-. 
Maintainer: Hadley
Version: 0.1
Depends: MASS, R
Collate: a.r b.r

devtools/tests/testthat/test-infrastructure.r
context("Infrastructure")

test_that("use_* functions consistently", {
  pkg <- "infrastructure"
  unlink(pkg, recursive = TRUE)
  withr::with_output_sink(tempfile(), create(pkg))

  use_test("test1", pkg = pkg)
  use_package_doc(pkg = pkg)
  use_vignette("test2", pkg = pkg)
  use_rcpp(pkg = pkg)
  use_travis(pkg = pkg, browse = FALSE)
  use_coverage(pkg = pkg)
  use_appveyor(pkg = pkg)

  x <- 1:100
  use_data(x, pkg = pkg)
  use_data_raw(pkg = pkg)

  use_readme_rmd(pkg = pkg)
  use_readme_md(pkg = pkg)
  use_news_md(pkg = pkg)
  use_revdep(pkg = pkg)
  use_cran_comments(pkg = pkg)
  use_code_of_conduct(pkg = pkg)
  use_mit_license(pkg = pkg)

  # Suppress R CMD check note
  file.rename("infrastructure/.travis.yml", "infrastructure/travis.yml")
  file.rename("infrastructure/.Rbuildignore", "infrastructure/Rbuildignore")
})

test_that("use_data", {
  on.exit(unlink(c("testUseData/data", "testUseData/R/sysdata.rda"),
    recursive = TRUE, force = TRUE), add = TRUE)

  # Add data to package
  local({
    expect_false(exists("global_test_data_item_to_save", .GlobalEnv))
    .GlobalEnv$global_test_data_item_to_save <- 42L
    on.exit(rm(list = "global_test_data_item_to_save", pos = .GlobalEnv), add = TRUE)
    local_test_data_item_to_save <- global_test_data_item_to_save
    system_test_data_item_to_save <- global_test_data_item_to_save

    expect_message(use_data(global_test_data_item_to_save, pkg = "testUseData"), "Saving")
    expect_message(use_data(local_test_data_item_to_save, pkg = "testUseData"), "Saving")
    expect_message(use_data(system_test_data_item_to_save, pkg = "testUseData", internal = TRUE), "Saving")

    expect_error(use_data(global_test_data_item_to_save, pkg = "testUseData"), "overwrite = TRUE")
    expect_error(use_data(local_test_data_item_to_save, pkg = "testUseData"), "overwrite = TRUE")
    expect_error(use_data(system_test_data_item_to_save, pkg = "testUseData", internal = TRUE), "overwrite = TRUE")

    expect_message(use_data(global_test_data_item_to_save, pkg = "testUseData", overwrite = TRUE), "Saving")
    expect_message(use_data(local_test_data_item_to_save, pkg = "testUseData", overwrite = TRUE), "Saving")
    expect_message(use_data(system_test_data_item_to_save, pkg = "testUseData", internal = TRUE, overwrite = TRUE), "Saving")
  })

  # Test data is in package
  local({
    expect_false(exists("global_test_data_item_to_save"))
    expect_false(exists("local_test_data_item_to_save"))
    expect_false(exists("system_test_data_item_to_save"))

    expect_warning(load_all("testUseData"),
      "Objects listed as exports, but not present in namespace: sysdata_export")
    on.exit(unload("testUseData"), add = TRUE)

    expect_false(exists("global_test_data_item_to_save"))
    expect_false(exists("local_test_data_item_to_save"))
    expect_equal(system_test_data_item_to_save, 42L)

    data(global_test_data_item_to_save, envir = environment())
    data(local_test_data_item_to_save, envir = environment())
    expect_equal(global_test_data_item_to_save, 42L)
    expect_equal(local_test_data_item_to_save, 42L)
  })
})

devtools/tests/testthat/testData/NAMESPACE
export(sysdata_export)
devtools/tests/testthat/testData/data/a.rda

devtools/tests/testthat/testData/data/b.r
b <- 2

devtools/tests/testthat/testData/R/sysdata.rda

devtools/tests/testthat/testData/DESCRIPTION
Package: testData
Title: Tools to make developing R code easier
License: GPL-2
Description:
Author: Hadley
Maintainer: Hadley
Version: 0.1

devtools/tests/testthat/testS4export/NAMESPACE
exportClasses(class_to_export)

devtools/tests/testthat/testS4export/R/all.r
setClass('class_to_export', representation='character')

devtools/tests/testthat/testS4export/DESCRIPTION
Package: testS4export
Title: reproduce S4 export bug with devtools
Version: 0.1
Description: reproduce S4 export bug with devtools
Author: Karl Forner
Maintainer: Karl Forner
Depends: R (>= 2.15)
Imports: methods
Suggests: testthat (>= 0.7.1.99),
License: GPL (>= 2)
Collate: all.r

devtools/tests/testthat/test-metadata.r
context("Metadata")

test_that("devtools metadata for load hooks", {
  # testLoadHooks test package has .onLoad and .onAttach
  load_all("testLoadHooks")
  md <- dev_meta("testLoadHooks")
  expect_true(md$.onLoad)
  expect_true(md$.onAttach)
  unload("testLoadHooks")

  # testNamespace test package doesn't have .onLoad and .onAttach
  load_all("testNamespace")
  md <- dev_meta("testNamespace")
  expect_false(exists("onLoad", envir = md))
  expect_false(exists("onAttach", envir = md))
  unload("testNamespace")
})

test_that("NULL metadata for non-devtools-loaded packages", {
  expect_true(is.null(dev_meta("stats")))
})

test_that("dev_packages() lists devtools-loaded packages", {
  expect_false(any(c("testNamespace", "testLoadHooks") %in% dev_packages()))
  expect_false("testNamespace" %in% dev_packages())
  expect_false("testLoadHooks" %in% dev_packages())

  load_all("testNamespace")
  expect_true("testNamespace" %in% dev_packages())
  expect_false("testLoadHooks" %in% dev_packages())

  load_all("testLoadHooks")
  expect_true("testNamespace" %in% dev_packages())
  expect_true("testLoadHooks" %in% dev_packages())

  unload("testNamespace")
  expect_false("testNamespace" %in% dev_packages())
  expect_true("testLoadHooks" %in% dev_packages())

  unload("testLoadHooks")
  expect_false("testNamespace" %in% dev_packages())
  expect_false("testLoadHooks" %in% dev_packages())

  expect_false("stats" %in% dev_packages())
})

devtools/tests/testthat/testCollateAbsent/R/b.r
a <- 2

devtools/tests/testthat/testCollateAbsent/R/a.r
a <- 1

devtools/tests/testthat/testCollateAbsent/R/c.r
a <- 3

devtools/tests/testthat/testCollateAbsent/DESCRIPTION
Package: testCollateAbsent
Title: Tools to make developing R code easier
License: GPL-2
Description:
Author: Hadley
Maintainer: Hadley
Version: 0.1

devtools/tests/testthat/test-remotes.r
context("remote_deps")

test_that("remote_deps returns NULL if no remotes specified", {
  expect_equal(remote_deps("testTest"), NULL)
})

test_that("remote_deps returns works with implicit types", {
  with_mock(`devtools::package2remote` = function(...) NULL, {
    expect_equal(parse_one_remote("hadley/testthat"),
      github_remote("hadley/testthat"))
    expect_equal(parse_one_remote("klutometis/roxygen"),
      github_remote("klutometis/roxygen"))
  })

  expect_equal(split_remotes("hadley/testthat,klutometis/roxygen"),
    c("hadley/testthat", "klutometis/roxygen"))
  expect_equal(split_remotes("hadley/testthat,\n klutometis/roxygen"),
    c("hadley/testthat", "klutometis/roxygen"))
  expect_equal(split_remotes("hadley/testthat,\n\t klutometis/roxygen"),
    c("hadley/testthat", "klutometis/roxygen"))
})

test_that("dev_remote_type errors", {
  expect_error(parse_one_remote(""),
    "Malformed remote specification ''")
  expect_error(parse_one_remote("git::testthat::blah"),
    "Malformed remote specification 'git::testthat::blah'")
  expect_error(parse_one_remote("hadley::testthat"),
    "Unknown remote type: hadley")
  expect_error(parse_one_remote("SVN2::testthat"),
    "Unknown remote type: SVN2")
})

test_that("dev_remote_type works with explicit types", {
  with_mock(`devtools::package2remote` = function(...)
  NULL, {
    expect_equal(parse_one_remote("github::hadley/testthat"),
      github_remote("hadley/testthat"))
  })

  expect_equal(split_remotes("github::hadley/testthat,klutometis/roxygen"),
    c("github::hadley/testthat", "klutometis/roxygen"))
  expect_equal(split_remotes("hadley/testthat,github::klutometis/roxygen"),
    c("hadley/testthat", "github::klutometis/roxygen"))
  expect_equal(split_remotes("github::hadley/testthat,github::klutometis/roxygen"),
    c("github::hadley/testthat", "github::klutometis/roxygen"))
  expect_equal(split_remotes("bioc::user:password@release/Biobase#12345,github::klutometis/roxygen"),
    c("bioc::user:password@release/Biobase#12345", "github::klutometis/roxygen"))
})

test_that("different_sha returns TRUE if remote or local sha is NA not found", {
  expect_true(different_sha(remote_sha = NA, local_sha = "4a2ea2"))
  expect_true(different_sha(remote_sha = "4a2ea2", local_sha = NA))
  expect_true(different_sha(remote_sha = NA, local_sha = NA))
})

test_that("different_sha returns TRUE if remote_sha and local_sha are different", {
  expect_true(different_sha(remote_sha = "5b3fb3", local_sha = "4a2ea2"))
})

test_that("different_sha returns FALSE if remote_sha and local_sha are the same", {
  expect_false(different_sha(remote_sha = "4a2ea2", local_sha = "4a2ea2"))
})

test_that("local_sha returns NA if package is not installed", {
  expect_equal(local_sha("tsrtarst"), NA_character_)
})

test_that("remote_sha.github_remote returns NA if remote doesn't exist", {
  expect_equal(remote_sha(github_remote("arst/arst")), NA_character_)
})

test_that("remote_sha.github_remote returns expected value if remote does exist", {
  expect_equal(remote_sha(github_remote("hadley/devtools@v1.8.0")),
    "ad9aac7b9a522354e1ff363a86f389e32cec181b")
})

test_that("package2remotes looks for the DESCRIPTION in .libPaths", {
  expect_equal(package2remote("testTest")$sha, NA_character_)

  withr::with_temp_libpaths({
    expect_equal(package2remote("testTest")$sha, NA_character_)
    install("testTest", quiet = TRUE)
    expect_equal(package2remote("testTest")$sha, "0.1")

    # Load the namespace, as packageDescription looks in loaded namespaces
    # first.
    loadNamespace("testTest")
  })

  expect_equal(package2remote("testTest")$sha, NA_character_)
})

devtools/tests/testthat/test-git.R
context("git")

git_test_repo <- function() {
  d <- tempfile("")
  dir.create(d)
  r <- git2r::init(d)
  git2r::config(r, user.name = "user", user.email = "user@email.com")

  writeLines(character(), file.path(d, ".gitignore"))
  git2r::add(r, ".gitignore")
  git2r::commit(r, "initial")
  r
}

test_that("SHA for regular repository", {
  r <- git_test_repo()
  commit <- git2r::commits(r)[[1]]

  expect_false(git2r::is_commit(git2r::head(r)))
  expect_equal(git_repo_sha1(r), commit@sha)
})

test_that("SHA for detached head", {
  skip_on_cran()

  r <- git_test_repo()
  commit <- git2r::commits(r)[[1]]
  git2r::checkout(commit)

  expect_true(git2r::is_commit(git2r::head(r)))
  expect_equal(git_repo_sha1(r), commit@sha)
})

devtools/tests/testthat/testCollateOrder/NAMESPACE
exportPattern("^[^\\.]")

devtools/tests/testthat/testCollateOrder/R/b.r
a <- 2

devtools/tests/testthat/testCollateOrder/R/a.r
#' @include b.r
a <- 1

devtools/tests/testthat/testCollateOrder/DESCRIPTION
Package: testCollateOrder
Title: Tools to make developing R code easier
License: GPL-2
Description:
Author: Geoff
Maintainer: Geoff
Version: 0.1
devtools/tests/testthat/rtools-gcc493-winbuilder/Rtools.txt
Rtools Collection 3.3.0.1959

This is the Rtools.txt file, which will be installed in the main Rtools
directory. See also the README.txt file there, which describes the origin of
some of the tools.

The tools installed in the Rtools\mingw_32 and Rtools\mingw_64 directories
are from the MinGW-w64 distribution.

CYGWIN

Some of the R tools use the Cygwin DLLs, which are included. If you already
have Cygwin installed, you should not install these (but see "EXISTING CYGWIN
INSTALLATIONS" below).

REMAINING TASKS

This installer doesn't install all of the tools necessary to build R or R
packages, because of license or size limitations. The remaining tools are all
available online (at no charge) as described below.

TO BUILD R PACKAGES, you may optionally want item 1 below (LaTeX).

TO BUILD R, you do need item 1, and item 2 (Inno Setup) is optional if you
would like to build the installer. As of R 3.2.0, the manuals are optional.
To build them you will need item 3 below.

The Rtools installer will optionally edit your PATH variable as follows:

PATH=c:\Rtools\bin;c:\Rtools\gcc-4.6.3\bin;

(where you will substitute appropriate directories for the ones listed above,
but please keep the path in the same order as shown. LaTeX and R itself
should be installed among the "others".)

REMAINING ITEMS

1. You may install LaTeX, available from http://www.miktex.org
   LaTeX is used to build .pdf forms of documentation.

2. You need the Inno Setup installer, available from http://www.innosetup.com
   to build the R installer.

3. You will need Perl, e.g. from http://strawberryperl.com/
   to build the manuals.

VERSIONS

This installer includes a multilib build of gcc 4.6.3, compiled by Brian
Ripley, and separate 32- and 64-bit builds of gcc 4.9.3 and mingw-w64 v3
compiled by Jeroen Ooms and others. For use with the latter it also includes
a copy of libicu55.

The Cygwin tools and DLLs were updated on November 19, 2013. They are 32 bit
versions taken from

base-cygwin 3.3-1
coreutils 8.23-4
cygwin 1.7.33-1
diffutils 3.3-2
findutils 4.5.12-1
gawk 4.1.1-1
grep 2.21-1
gzip 1.6-1
texinfo 4.13 (used for R 3.1.x and earlier) and 5.2 (used for R 3.2.x and later).

Tcl/Tk is version 8.5.8.

tar is a locally modified version of tar version 1.21.

EXISTING CYGWIN INSTALLATIONS

If you already have a full 32 bit Cygwin installation, then you should not
install our Cygwin DLLs in the Rtools/bin directory. You should make sure
your existing cygwin/bin directory is on the path (*after* all the other
entries listed above) and use the DLLs from there.

However, this may not work if your Cygwin installation is too old. In that
case the Rtools utilities will fail to run. To fix this, you should update
the Cygwin installation, or (with great care!) replace the DLLs with the ones
from the Rtools distribution. Be very careful, because if you have
incompatible DLLs, your Cygwin tools will stop working.

devtools/tests/testthat/rtools-gcc493-winbuilder/VERSION.txt
Rtools version 3.3.0.1959

devtools/tests/testthat/rtools-gcc493-winbuilder/bin/ls.exe
devtools/tests/testthat/rtools-gcc493-winbuilder/mingw_32/bin/gcc.exe
devtools/tests/testthat/rtools-gcc493-winbuilder/mingw_64/bin/gcc.exe

devtools/tests/testthat/test-depend.r
context("Dependencies")

test_that("Warned about dependency versions", {
  # Should give a warning about grid version
  expect_warning(load_all("testImportVersion"), "Need grid >=")
  unload("testImportVersion")

  # TODO: Add check for NOT giving a warning about compiler version
  # Not possible with testthat?
}) test_that("Error on missing dependencies", { # Should give a warning about missing package expect_error(load_all("testImportMissing"), "missingpackage not available") # Loading process will be partially done; unload it unload("testImportMissing") }) test_that("Packages in depends are required", { load_all("testDependMissing") expect_true("package:MASS" %in% search()) unload("testDependMissing") detach("package:MASS", unload = TRUE) }) test_that("Parse dependencies", { deps <- parse_deps("\nhttr (< 2.1),\nRCurl (>= 3),\nutils (== 2.12.1),\ntools,\nR (>= 2.10),\nmemoise") expect_equal(nrow(deps), 5) expect_false("R" %in% deps$name) expect_equal(deps$compare, c("<", ">=", "==", NA, NA)) expect_equal(deps$version, c("2.1", "3", "2.12.1", NA, NA)) # Invalid version specifications expect_error(parse_deps("\nhttr (< 2.1),\nRCurl (3.0)")) expect_error(parse_deps("\nhttr (< 2.1),\nRCurl ( 3.0)")) expect_error(parse_deps("\nhttr (< 2.1),\nRCurl (==3.0)")) expect_error(parse_deps("\nhttr (< 2.1),\nRCurl (==3.0 )")) expect_error(parse_deps("\nhttr (< 2.1),\nRCurl ( ==3.0)")) # This should be OK (no error) deps <- parse_deps("\nhttr (< 2.1),\nRCurl (== 3.0.1)") expect_equal(deps$compare, c("<", "==")) expect_equal(deps$version, c("2.1", "3.0.1")) }) test_that("Dependencies of development package include direct dependencies", { deps <- dev_package_deps("testNamespace") expect_equal(deps$package, "bitops") }) devtools/tests/testthat/test-load-collate.r0000644000176200001440000000237213200623656020622 0ustar liggesuserscontext("Load: collate") test_that("If collate absent, load in alphabetical order", { load_all("testCollateAbsent") expect_equal(a, 3) unload("testCollateAbsent") }) test_that("Warned about files missing from collate, but they're still loaded", { expect_message(load_all("testCollateMissing"), "a.r") expect_equal(a, 1) expect_equal(b, 2) unload("testCollateMissing") }) test_that("Extra files in collate don't error, but warn", { 
expect_message(load_all("testCollateExtra"), "b.r") expect_equal(a, 1) unload("testCollateExtra") }) temp_copy_pkg <- function(pkg) { file.copy(normalizePath(pkg), tempdir(), recursive = TRUE) normalizePath(file.path(tempdir(), pkg)) } test_that("DESCRIPTION Collate field, with latest @includes, is recognised by load_all", { # Make a temporary copy of the package for this test, # since update_collate (in load_all) may have permanent side effects, # namely changing the collate field in the DESCRIPTION file test_pkg <- temp_copy_pkg('testCollateOrder') on.exit(unlink(test_pkg, recursive = TRUE)) expect_output( expect_message(load_all(test_pkg), "Loading testCollateOrder"), "Updating collate directive" ) expect_equal(a, 1) #even though b.r set it to 2 unload(test_pkg) }) devtools/tests/testthat/infrastructure/0000755000176200001440000000000013200656427020200 5ustar liggesusersdevtools/tests/testthat/infrastructure/README.Rmd0000644000176200001440000000065113200624327021575 0ustar liggesusers--- output: md_document: variant: markdown_github --- ```{r, echo = FALSE} knitr::opts_chunk$set( collapse = TRUE, comment = "#>", fig.path = "README-" ) ``` # infrastructure The goal of infrastructure is to ... 
## Example This is a basic example which shows you how to solve a common problem: ```{r example} ## basic example code ``` devtools/tests/testthat/infrastructure/tests/0000755000176200001440000000000013200624327021334 5ustar liggesusersdevtools/tests/testthat/infrastructure/tests/testthat.R0000644000176200001440000000011013200624327023307 0ustar liggesuserslibrary(testthat) library(infrastructure) test_check("infrastructure") devtools/tests/testthat/infrastructure/tests/testthat/0000755000176200001440000000000013201030625023165 5ustar liggesusersdevtools/tests/testthat/infrastructure/tests/testthat/test-test1.R0000644000176200001440000000012213200624327025327 0ustar liggesuserscontext("test1") test_that("multiplication works", { expect_equal(2 * 2, 4) }) devtools/tests/testthat/infrastructure/infrastructure.Rproj0000644000176200001440000000047013200624327024271 0ustar liggesusersVersion: 1.0 RestoreWorkspace: No SaveWorkspace: No AlwaysSaveHistory: Default EnableCodeIndexing: Yes Encoding: UTF-8 AutoAppendNewline: Yes StripTrailingWhitespace: Yes BuildType: Package PackageUseDevtools: Yes PackageInstallArgs: --no-multiarch --with-keep.source PackageRoxygenize: rd,collate,namespace devtools/tests/testthat/infrastructure/NAMESPACE0000644000176200001440000000014013200624327021404 0ustar liggesusers# Generated by roxygen2: fake comment so roxygen2 overwrites silently. exportPattern("^[^\\.]") devtools/tests/testthat/infrastructure/NEWS.md0000644000176200001440000000013213200624327021264 0ustar liggesusers# infrastructure 0.0.0.9000 * Added a `NEWS.md` file to track changes to the package. 
devtools/tests/testthat/infrastructure/data/0000755000176200001440000000000013200624446021105 5ustar liggesusersdevtools/tests/testthat/infrastructure/data/x.rda0000644000176200001440000000034113200624446022042 0ustar liggesusersBZh91AY&SYM@ 8h0  hd 4 SFL CL&M=o@.jA-˧o (cD<2̚BH ixdevtools/tests/testthat/infrastructure/R/0000755000176200001440000000000013200624327020373 5ustar liggesusersdevtools/tests/testthat/infrastructure/R/infrastructure-package.r0000644000176200001440000000010713200624327025225 0ustar liggesusers#' infrastructure. #' #' @name infrastructure #' @docType package NULL devtools/tests/testthat/infrastructure/vignettes/0000755000176200001440000000000013200624327022202 5ustar liggesusersdevtools/tests/testthat/infrastructure/vignettes/test2.Rmd0000644000176200001440000000370513200624327023714 0ustar liggesusers--- title: "Vignette Title" author: "Vignette Author" date: "`r Sys.Date()`" output: rmarkdown::html_vignette vignette: > %\VignetteIndexEntry{Vignette Title} %\VignetteEngine{knitr::rmarkdown} %\VignetteEncoding{UTF-8} --- Vignettes are long form documentation commonly included in packages. Because they are part of the distribution of the package, they need to be as compact as possible. The `html_vignette` output type provides a custom style sheet (and tweaks some options) to ensure that the resulting html is as small as possible. The `html_vignette` format: - Never uses retina figures - Has a smaller default figure size - Uses a custom CSS stylesheet instead of the default Twitter Bootstrap style ## Vignette Info Note the various macros within the `vignette` section of the metadata block above. These are required in order to instruct R how to build the vignette. Note that you should change the `title` field and the `\VignetteIndexEntry` to match the title of your vignette. ## Styles The `html_vignette` template includes a basic CSS theme. 
To override this theme you can specify your own CSS in the document metadata as follows: output: rmarkdown::html_vignette: css: mystyles.css ## Figures The figure sizes have been customised so that you can easily put two images side-by-side. ```{r, fig.show='hold'} plot(1:10) plot(10:1) ``` You can enable figure captions by `fig_caption: yes` in YAML: output: rmarkdown::html_vignette: fig_caption: yes Then you can use the chunk option `fig.cap = "Your figure caption."` in **knitr**. ## More Examples You can write math expressions, e.g. $Y = X\beta + \epsilon$, footnotes^[A footnote here.], and tables, e.g. using `knitr::kable()`. ```{r, echo=FALSE, results='asis'} knitr::kable(head(mtcars, 10)) ``` Also a quote using `>`: > "He who gives up [code] safety for [code] speed deserves neither." ([via](https://twitter.com/hadleywickham/status/504368538874703872)) devtools/tests/testthat/infrastructure/README.md0000644000176200001440000000025413200624327021452 0ustar liggesusers# infrastructure The goal of infrastructure is to ... ## Example This is a basic example which shows you how to solve a common problem: ``` r ## basic example code ``` devtools/tests/testthat/infrastructure/codecov.yml0000644000176200001440000000001713200624327022335 0ustar liggesuserscomment: false devtools/tests/testthat/infrastructure/cran-comments.md0000644000176200001440000000073513200624446023271 0ustar liggesusers## Test environments * local OS X install, R 3.3.1 * ubuntu 12.04 (on travis-ci), R 3.3.1 * win-builder (devel and release) ## R CMD check results 0 errors | 0 warnings | 1 note * This is a new release. ## Reverse dependencies This is a new release, so there are no reverse dependencies. --- * I have run R CMD check on the NUMBER downstream dependencies. (Summary at ...). * FAILURE SUMMARY * All revdep maintainers were notified of the release on RELEASE DATE. 
devtools/tests/testthat/infrastructure/DESCRIPTION0000644000176200001440000000065613200624446021711 0ustar liggesusersPackage: infrastructure Title: What the Package Does (one line, title case) Version: 0.0.0.9000 Authors@R: person("First", "Last", email = "first.last@example.com", role = c("aut", "cre")) Description: What the package does (one paragraph). Depends: R (>= 3.3.1) License: MIT + file LICENSE Encoding: UTF-8 LazyData: true Suggests: testthat, knitr, rmarkdown, covr VignetteBuilder: knitr LinkingTo: Rcpp Imports: Rcpp devtools/tests/testthat/infrastructure/revdep/0000755000176200001440000000000013200624327021457 5ustar liggesusersdevtools/tests/testthat/infrastructure/revdep/check.R0000644000176200001440000000013613200624327022657 0ustar liggesuserslibrary("devtools") revdep_check() revdep_check_save_summary() revdep_check_print_problems() devtools/tests/testthat/infrastructure/travis.yml0000644000176200001440000000017213200624327022225 0ustar liggesusers# R for travis: see documentation at https://docs.travis-ci.com/user/languages/r language: R sudo: false cache: packages devtools/tests/testthat/infrastructure/Rbuildignore0000644000176200001440000000024013200624327022536 0ustar liggesusers^.*\.Rproj$ ^\.Rproj\.user$ ^\.travis\.yml$ ^codecov\.yml$ ^appveyor\.yml$ ^data-raw$ ^README\.Rmd$ ^README-.*\.png$ ^revdep$ ^cran-comments\.md$ ^CONDUCT\.md$ devtools/tests/testthat/infrastructure/CONDUCT.md0000644000176200001440000000255313200624327021620 0ustar liggesusers# Contributor Code of Conduct As contributors and maintainers of this project, we pledge to respect all people who contribute through reporting issues, posting feature requests, updating documentation, submitting pull requests or patches, and other activities. 
We are committed to making participation in this project a harassment-free experience for everyone, regardless of level of experience, gender, gender identity and expression, sexual orientation, disability, personal appearance, body size, race, ethnicity, age, or religion. Examples of unacceptable behavior by participants include the use of sexual language or imagery, derogatory comments or personal attacks, trolling, public or private harassment, insults, or other unprofessional conduct. Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct. Project maintainers who do not follow the Code of Conduct may be removed from the project team. Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by opening an issue or contacting one or more of the project maintainers. This Code of Conduct is adapted from the Contributor Covenant (http://contributor-covenant.org), version 1.0.0, available at http://contributor-covenant.org/version/1/0/0/ devtools/tests/testthat/infrastructure/LICENSE0000644000176200001440000000006113200624446021176 0ustar liggesusersYEAR: 2016 COPYRIGHT HOLDER: Your name goes here devtools/tests/testthat/infrastructure/appveyor.yml0000644000176200001440000000153413200624327022565 0ustar liggesusers# DO NOT CHANGE the "init" and "install" sections below # Download script file from GitHub init: ps: | $ErrorActionPreference = "Stop" Invoke-WebRequest http://raw.github.com/krlmlr/r-appveyor/master/scripts/appveyor-tool.ps1 -OutFile "..\appveyor-tool.ps1" Import-Module '..\appveyor-tool.ps1' install: ps: Bootstrap cache: - C:\RLibrary # Adapt as necessary starting from here build_script: - travis-tool.sh install_deps test_script: - travis-tool.sh run_tests on_failure: - 7z a failure.zip *.Rcheck\* - appveyor PushArtifact failure.zip artifacts: - path: '*.Rcheck\**\*.log' name: Logs - path:
'*.Rcheck\**\*.out' name: Logs - path: '*.Rcheck\**\*.fail' name: Logs - path: '*.Rcheck\**\*.Rout' name: Logs - path: '\*_*.tar.gz' name: Bits - path: '\*_*.zip' name: Bits devtools/tests/testthat/test-install-version.R0000644000176200001440000000241612724305435021354 0ustar liggesuserscontext("Install specific version") local_archive <- function(x) { if (x == "http://cran.r-project.org") readRDS("archive.rds") else NULL } test_that("package_find_repo() works correctly with multiple repos", { # The archive format is not readable on older R versions # (`do not know how to convert 'value' to class "POSIXct"`) skip_if_not(getRversion() >= "3.2.0") repos <- c(CRANextras = "http://www.stats.ox.ac.uk/pub/RWin", CRAN = "http://cran.r-project.org") # ROI.plugin.glpk is the smallest package in the CRAN archive package <- "ROI.plugin.glpk" with_mock(`devtools:::read_archive` = local_archive, res <- package_find_repo(package, repos = repos) ) expect_equal(NROW(res), 1L) expect_equal(res$repo, "http://cran.r-project.org") expect_true(all(grepl("^ROI.plugin.glpk", res$path))) }) test_that("package_find_repo() works correctly with archived packages", { # Issue 1033 skip_if_not(getRversion() >= "3.2.0") repos <- c(CRAN = "http://cran.r-project.org") package <- "igraph0" with_mock(`devtools:::read_archive` = local_archive, res <- package_find_repo(package, repos = repos) ) expect_gte(NROW(res), 8L) expect_true(all(res$repo == "http://cran.r-project.org")) expect_true(all(grepl("^igraph0", res$path))) }) devtools/tests/testthat/test-shim.r0000644000176200001440000000736713200623656017233 0ustar liggesuserscontext("shim") # Utility functions ----------------------------- # Take file paths and split them into pieces expand_path <- function(path) { strsplit(path, .Platform$file.sep) } # Return the last n elements of vector x last_n <- function(x, n = 1) { len <- length(x) x[(len-n+1):len] } # Tests ----------------------------------------- test_that("system.file returns correct 
values when used with load_all", { load_all("testShim") shim_ns <- ns_env("testShim") # The devtools::system.file function should return modified values. files <- shim_system.file(c("A.txt", "B.txt", "C.txt", "D.txt"), package = "testShim") files <- expand_path(files) expect_true(all(last_n(files[[1]], 3) == c("testShim", "inst", "A.txt"))) expect_true(all(last_n(files[[2]], 3) == c("testShim", "inst", "B.txt"))) # Note that C.txt wouldn't be returned by base::system.file (see comments # in shim_system.file for explanation) expect_true(all(last_n(files[[3]], 2) == c("testShim", "C.txt"))) # D.txt should be dropped expect_equal(length(files), 3) # If all files are not present, return "" files <- shim_system.file("nonexistent", package = "testShim") expect_equal(files, "") # Test packages loaded the usual way - should just pass through to # base::system.file expect_identical(base::system.file("Meta", "Rd.rds", package = "stats"), shim_system.file("Meta", "Rd.rds", package = "stats")) expect_identical(base::system.file("INDEX", package = "stats"), shim_system.file("INDEX", package = "stats")) expect_identical(base::system.file("nonexistent", package = "stats"), shim_system.file("nonexistent", package = "stats")) unload("testShim") }) test_that("shimmed system.file respects mustWork", { load_all("testShim") find_missing <- function(mustWork) { shim_system.file("missing.txt", package = "testShim", mustWork = mustWork) } expect_equal(find_missing(FALSE), "") expect_error(find_missing(TRUE), "No file found") }) test_that("Shimmed system.file returns correct values when used with load_all", { load_all("testShim") shim_ns <- ns_env("testShim") # Make sure the version of system.file inserted into the namespace's imports # is the same as devtools::system.file expect_identical(get("system.file", envir = shim_ns), shim_system.file) # Another check expect_identical(get_system.file(), shim_system.file) unload("testShim") }) test_that("Replacement system.file returns correct 
values when installed", { # This set of tests is mostly a sanity check - it doesn't use the special # version of system.file, but it's useful to make sure we know what to look # for in the other tests. # Make a temp lib directory to install test package into old_libpaths <- .libPaths() tmp_libpath = file.path(tempdir(), "devtools_test") if (!dir.exists(tmp_libpath)) dir.create(tmp_libpath) .libPaths(c(tmp_libpath, .libPaths())) install("testShim", quiet = TRUE) expect_true(require(testShim)) # The special version of system.file shouldn't exist - this get() will fall # through to the base namespace expect_identical(get("system.file", pos = asNamespace("testShim")), base::system.file) # Test within package testShim files <- get_system.file()(c("A.txt", "B.txt", "C.txt", "D.txt"), package = "testShim") files <- expand_path(files) expect_true(all(last_n(files[[1]], 2) == c("testShim", "A.txt"))) expect_true(all(last_n(files[[2]], 2) == c("testShim", "B.txt"))) expect_equal(length(files), 2) # Third and fourth should be dropped # If all files are not present, return "" files <- get_system.file()("nonexistent", package = "testShim") expect_equal(files, "") detach("package:testShim", unload = TRUE) # Reset the libpath .libPaths(old_libpaths) }) devtools/tests/testthat/test-session-info.R0000644000176200001440000000027713200623656020640 0ustar liggesuserscontext("package_info") test_that("package_info errors if an input package is not installed", { expect_error(package_info(c("foo", "bar")), "`pkgs` 'foo', 'bar' are not installed") }) devtools/tests/testthat/testShim/0000755000176200001440000000000013200623656016716 5ustar liggesusersdevtools/tests/testthat/testShim/A.txt0000644000176200001440000000001413200623656017632 0ustar liggesusersfile /A.txt devtools/tests/testthat/testShim/inst/0000755000176200001440000000000013200623656017673 5ustar liggesusersdevtools/tests/testthat/testShim/inst/A.txt0000644000176200001440000000002013200623656020604 0ustar liggesusersfile 
inst/A.txt devtools/tests/testthat/testShim/inst/B.txt0000644000176200001440000000001713200623656020613 0ustar liggesusersfile inst/B.txtdevtools/tests/testthat/testShim/NAMESPACE0000644000176200001440000000003013200623656020126 0ustar liggesusersexport(get_system.file) devtools/tests/testthat/testShim/R/0000755000176200001440000000000013200623656017117 5ustar liggesusersdevtools/tests/testthat/testShim/R/a.r0000644000176200001440000000045513200623656017526 0ustar liggesusersa <- 1 # When this package is loaded with load_all, devtools should add a # replacement system.file function. # When the package is loaded with load_all, this returns devtools::system.file # When installed and loaded, this returns base:system.file. get_system.file <- function(...) { system.file } devtools/tests/testthat/testShim/DESCRIPTION0000644000176200001440000000037113200623656020425 0ustar liggesusersPackage: testShim Title: Tools to make developing R code easier License: GPL-2 Description: This package is for testing the devtools shim system. 
Author: Hadley Maintainer: Hadley Version: 0.1 Collate: a.rdevtools/tests/testthat/testShim/C.txt0000644000176200001440000000001413200623656017634 0ustar liggesusersfile /C.txt devtools/tests/testthat/test-data.r0000644000176200001440000000466313200623656017200 0ustar liggesuserscontext("Data") test_that("data available when lazydata not true", { load_all("testData") # a and b are in data/ and shouldn't be available yet # sysdata_export and sysdata_nonexport are in R/sysdata.rda, and should be available expect_false(exists("a")) expect_false(exists("b")) expect_equal(sysdata_export, 3) expect_equal(sysdata_nonexport, 4) # Load the data objects (into the local environment) data(a, envir = environment()) data(b, envir = environment()) expect_equal(a, 1) expect_equal(b, 2) unload("testData") # Objects loaded with data() should still be available expect_equal(a, 1) expect_equal(b, 2) # Objects loaded in sysdata.rda shouldn't be available expect_false(exists("sysdata_export")) expect_false(exists("sysdata_nonexport")) }) test_that("data available when lazydata is true", { load_all("testDataLazy") # a and b are in data/ and should be available because of lazydata # sysdata_export and sysdata_nonexport are in R/sysdata.rda, and should be available expect_equal(a, 1) expect_equal(b, 2) expect_equal(sysdata_export, 3) expect_equal(sysdata_nonexport, 4) unload("testDataLazy") }) test_that("data available when lazydata not true, and export_all is FALSE", { load_all("testData", export_all = FALSE) # a and b are in data/ and shouldn't be available yet # sysdata_export is exported; sysdata_nonexport isn't expect_false(exists("a")) expect_false(exists("b")) expect_equal(sysdata_export, 3) expect_false(exists("sysdata_nonexport")) # Load the data objects (into the local environment) data(a, envir = environment()) data(b, envir = environment()) expect_equal(a, 1) expect_equal(b, 2) # Shouldn't be able to load objects in R/sysdata.rda with data() expect_warning(data(sysdata_export, 
envir = environment())) expect_false(exists("sysdata_nonexport")) unload("testData") }) test_that("data available when lazydata is true, and export_all is FALSE", { load_all("testDataLazy", export_all = FALSE) # a and b are in data/ and should be available because of lazydata # sysdata_export is exported; sysdata_nonexport isn't expect_equal(a, 1) expect_equal(b, 2) expect_equal(sysdata_export, 3) expect_false(exists("sysdata_nonexport")) # Shouldn't be able to load objects in R/sysdata.rda with data() expect_warning(data(sysdata_export, envir = environment())) expect_false(exists("sysdata_nonexport")) unload("testDataLazy") }) devtools/tests/testthat/rtools-manual/0000755000176200001440000000000012634340020017702 5ustar liggesusersdevtools/tests/testthat/rtools-manual/bin/0000755000176200001440000000000012634340125020460 5ustar liggesusersdevtools/tests/testthat/rtools-manual/bin/ls.exe0000644000176200001440000000000012416621515021572 0ustar liggesusersdevtools/tests/testthat/rtools-manual/gcc-4.6.3/0000755000176200001440000000000012634340336021116 5ustar liggesusersdevtools/tests/testthat/rtools-manual/gcc-4.6.3/bin/0000755000176200001440000000000012634340767021676 5ustar liggesusersdevtools/tests/testthat/rtools-manual/gcc-4.6.3/bin/gcc.exe0000644000176200001440000000000012416621515023112 0ustar liggesusersdevtools/tests/testthat/test-sort.R0000644000176200001440000000036513171407310017203 0ustar liggesuserscontext("sort") test_that("case-insensitive sort order", { expect_equal(sort_ci(rev(letters)), letters) expect_equal(sort_ci(rev(LETTERS)), LETTERS) expect_equal(sort_ci(c(letters[1:3], LETTERS[1:3])), c("A", "a", "B", "b", "C", "c")) }) devtools/tests/testthat/testHooks/0000755000176200001440000000000013200623656017101 5ustar liggesusersdevtools/tests/testthat/testHooks/R/0000755000176200001440000000000013200623656017302 5ustar liggesusersdevtools/tests/testthat/testHooks/R/a.r0000644000176200001440000000041313200623656017703 0ustar liggesusersrecord_use <- 
function(hook) { function(...) { h <- globalenv()$hooks h$events <- c(h$events, hook) } } .onLoad <- record_use("pkg_load") .onUnload <- record_use("pkg_unload") .onAttach <- record_use("pkg_attach") .onDetach <- record_use("pkg_detach") devtools/tests/testthat/testHooks/DESCRIPTION0000644000176200001440000000030413200623656020604 0ustar liggesusersPackage: testHooks Title: Tools to make developing R code easier License: GPL-2 Description: Author: Hadley Maintainer: Hadley Version: 0.1 Collate: a.rdevtools/tests/testthat/test-extraction.R0000644000176200001440000000220213200623656020372 0ustar liggesuserscontext("extract_lang") f <- function(x) { a <- 1:10 for (i in seq_along(a)) { print(i) } } test_that("extract_lang issues warning if nothing found", { expect_warning(extract_lang(body(f), comp_lang, quote(j)), "Devtools is incompatible") }) test_that("extract_lang and comp_lang finds full statements", { expect_equal(extract_lang(body(f), comp_lang, quote(a <- 1:10)), quote(a <- 1:10)) }) test_that("extract_lang and comp_lang find child calls", { expect_equal(extract_lang(body(f), comp_lang, quote(seq_along(a))), quote(seq_along(a))) }) test_that("extract_lang and comp_lang finds partial statements", { expect_equal(extract_lang(body(f), comp_lang, quote(a <- NULL), 1:2), quote(a <- 1:10)) }) test_that("extract_lang and comp_lang finds partial statements from for conditionals", { expect_equal(extract_lang(body(f), comp_lang, quote(for (i in seq_along(a)) NULL), 1:3), quote(for (i in seq_along(a)) { print(i) })) }) test_that("modify_lang modifies properly", { expect_equal(modify_lang(quote(a <- 1:10), function(x) if (comp_lang(x, quote(a))) quote(b) else x), quote(b <- 1:10)) }) devtools/tests/testthat/testHelp/0000755000176200001440000000000013200623656016706 5ustar liggesusersdevtools/tests/testthat/testHelp/NAMESPACE0000644000176200001440000000010513200623656020121 0ustar liggesusers# Generated by roxygen2 (4.0.0): do not edit by hand export(foofoo) 
devtools/tests/testthat/testHelp/R/0000755000176200001440000000000013200623656017107 5ustar liggesusersdevtools/tests/testthat/testHelp/R/foofoo.r0000644000176200001440000000025713200623656020565 0ustar liggesusers#' Test function for help #' #' The purpose of this function is to test out \code{help} and \code{?} from #' devtools. #' #' @export foofoo <- function() "You called foofoo." devtools/tests/testthat/testHelp/DESCRIPTION0000644000176200001440000000032713200623656020416 0ustar liggesusersPackage: testHelp Title: Tools to make developing R code easier License: GPL-2 Description: Test package for devtools help. Author: Hadley Maintainer: Hadley Version: 0.1 devtools/tests/testthat/testHelp/man/0000755000176200001440000000000013200623656017461 5ustar liggesusersdevtools/tests/testthat/testHelp/man/foofoo.Rd0000644000176200001440000000035113200623656021236 0ustar liggesusers% Generated by roxygen2 (4.0.0): do not edit by hand \name{foofoo} \alias{foofoo} \title{Test function for help} \usage{ foofoo() } \description{ The purpose of this function is to test out \code{help} and \code{?} from devtools. 
} devtools/tests/testthat/test-github-connections.R0000644000176200001440000001045013200623656022020 0ustar liggesuserscontext("git usage and GitHub connections") test_that("git (non-)usage is detected, diagnosed, and can be added", { skip_on_cran() test_pkg <- create_in_temp("testNoGit") expect_false(uses_git(test_pkg)) expect_warning(expect_message(print(dr_github(test_pkg)), "not a git repository"), 'DR_GITHUB FOUND PROBLEMS') expect_message(use_git_with_config(message = "initial", pkg = test_pkg, add_user_config = TRUE), "Initialising repo") expect_true(uses_git(test_pkg)) erase(test_pkg) }) test_that("GitHub non-usage is handled", { skip_on_cran() test_pkg <- create_in_temp("testNoGitHub") use_git_with_config(message = "initial", pkg = test_pkg, add_user_config = TRUE, quiet = TRUE) expect_true(uses_git(test_pkg)) expect_false(uses_github(test_pkg)) expect_warning(expect_message(print(dr_github(test_pkg)), "not a GitHub repository"), "DR_GITHUB FOUND PROBLEMS") expect_identical(github_dummy, github_info(test_pkg)) expect_error(use_github_links(test_pkg), "Cannot detect .* GitHub") erase(test_pkg) }) ## If env var GITHUB_PAT exists and there's willingness to call GitHub ## use_github() could be tested right around here. ## As it stands, that function is not under automated testing. 
test_that("github info and links can be queried and manipulated", {
  skip_on_cran()
  test_pkg <- create_in_temp("testGitHub")
  mock_use_github(test_pkg)

  expect_true(uses_github(test_pkg))
  gh_info <- github_info(test_pkg)
  expect_equal(gh_info$username, "hadley")
  expect_equal(gh_info$repo, "devtools")

  desc_path <- file.path(test_pkg, "DESCRIPTION")
  desc <- read_dcf(desc_path)

  ## default GitHub links created by use_github_links() via use_github()
  expect_identical(desc[["URL"]],
                   file.path("https://github.com", gh_info$username, gh_info$repo))
  expect_identical(desc[["BugReports"]],
                   file.path("https://github.com", gh_info$username, gh_info$repo, "issues"))

  ## make sure we don't clobber existing links
  mtime_before <- file.info(desc_path)$mtime
  expect_message(use_github_links(test_pkg), "found and preserved")
  mtime_after <- file.info(desc_path)$mtime
  expect_identical(mtime_before, mtime_after)

  ## make sure we diagnose lack of GitHub links
  desc$URL <- "http://www.example.com"
  desc$BugReports <- "http://www.example.com/issues"
  write_dcf(desc_path, desc)
  expect_warning(expect_message(print(dr_github(test_pkg)), "no GitHub repo link"),
                 "DR_GITHUB FOUND PROBLEMS")
  expect_warning(expect_message(print(dr_github(test_pkg)), "no GitHub Issues"),
                 "DR_GITHUB FOUND PROBLEMS")

  erase(test_pkg)
})

test_that("github_info() prefers, but doesn't require, remote named 'origin'", {
  skip_on_cran()
  test_pkg <- create_in_temp("testGitHubInfo")
  mock_use_github(test_pkg)
  r <- git2r::repository(test_pkg, discover = TRUE)
  git2r::remote_add(r, "anomaly", "https://github.com/twitter/AnomalyDetection.git")

  ## defaults to "origin"
  expect_equal(github_info(test_pkg)$username, "hadley")
  expect_equal(github_info(test_pkg)$repo, "devtools")

  ## another remote will be used if no "origin"
  git2r::remote_rename(r, "origin", "zzz")
  gh_info <- github_info(test_pkg)
  expect_equal(gh_info$username, "twitter")
  expect_equal(gh_info$repo, "AnomalyDetection")
  git2r::remote_rename(r, "zzz", "origin")

  ## another remote can be requested by name
  gh_info <- github_info(test_pkg, remote_name = "anomaly")
  expect_equal(gh_info$username, "twitter")
  expect_equal(gh_info$repo, "AnomalyDetection")

  ## error if nonexistent remote requested by name
  expect_error(github_info(test_pkg, remote_name = "nope"))

  erase(test_pkg)
})

test_that("username and repo are extracted from github remote URL", {
  gh_info <- list(username = "hadley", repo = "devtools", fullname = "hadley/devtools")
  expect_identical(github_remote_parse("https://github.com/hadley/devtools.git"), gh_info)
  expect_identical(github_remote_parse("https://github.com/hadley/devtools"), gh_info)
  expect_identical(github_remote_parse("git@github.com:hadley/devtools.git"), gh_info)
})

devtools/tests/testthat/test-bioconductor.r

context("bioc")

test_that("bioc repo paths are parsed correctly", {
  expect_equal(parse_bioc_repo("devtools"), list(repo = "devtools"))
  expect_equal(parse_bioc_repo("devtools#12345"), list(repo = "devtools", revision = "12345"))
  expect_equal(parse_bioc_repo("user@devtools"), list(repo = "devtools", username = "user"))
  expect_equal(parse_bioc_repo("user:pass@devtools"),
               list(username = "user", password = "pass", repo = "devtools"))
  expect_equal(parse_bioc_repo("devel/devtools"), list(release = "devel", repo = "devtools"))
  expect_equal(parse_bioc_repo("3.1/devtools"), list(release = "3.1", repo = "devtools"))
  expect_equal(parse_bioc_repo("release/devtools"), list(release = "release", repo = "devtools"))
  expect_equal(parse_bioc_repo("user:pass@3.1/devtools#123"),
               list(username = "user", password = "pass", release = "3.1",
                    repo = "devtools", revision = "123"))

  expect_error(parse_bioc_repo("user:@devtools"), "Invalid bioc repo")
  expect_error(parse_bioc_repo("@devtools"), "Invalid bioc repo")
  expect_error(parse_bioc_repo("devtools/"), "Invalid bioc repo")
  expect_error(parse_bioc_repo("junk/devtools"), "Invalid bioc repo")
})

test_that("install_bioc", {
  skip_on_cran()

  lib <- tempfile()
  on.exit(unlink(lib, recursive = TRUE), add = TRUE)
  dir.create(lib)

  libpath <- .libPaths()
  on.exit(.libPaths(libpath), add = TRUE)
  .libPaths(lib)

  # unload BiocInstaller if it is already loaded, and unload it again after
  # this function finishes
  unloadNamespace("BiocInstaller")
  on.exit(unloadNamespace("BiocInstaller"), add = TRUE)

  # Install BiocInstaller to the new library
  source("https://bioconductor.org/biocLite.R")

  # This package has no dependencies or compiled code and is old
  install_bioc("MeasurementError.cor", quiet = TRUE)

  expect_silent(packageDescription("MeasurementError.cor"))
  expect_equal(packageDescription("MeasurementError.cor")$RemoteType, "bioc")
})

devtools/tests/testthat/testS4sort/NAMESPACE

exportClass(A, B, C, D, E, F, G, H)

devtools/tests/testthat/testS4sort/R/classes.r

## Define a graph of classes with a complex inheritance pattern.
## Example taken from Wikipedia:
## https://en.wikipedia.org/wiki/Topological_sorting#Examples
setClass("A")
setClass("B")
setClass("C")
setClassUnion("D", members = c("A", "B", "C"))
setClass("E")
setIs("B", "E")
setClassUnion("F", members = c("D", "E"))
setClass("G")
setIs("D", "G")
setClassUnion("H", members = c("C", "E"))

devtools/tests/testthat/testS4sort/DESCRIPTION

Package: testS4sort
Title: Test package for sorting S4 classes
License: GPL (>= 2)
Description:
Author: Facundo Muñoz
Maintainer: Facundo Muñoz
Version: 0.1
Collate: 'classes.r'
Imports: methods

devtools/tests/testthat/archive.rds
[binary RDS payload omitted]

devtools/tests/testthat/testMarkdownVignettes/vignettes/test.Rmd

This is a test.
```{r}
1 + 2
```

devtools/tests/testthat/testMarkdownVignettes/DESCRIPTION

Package: testMarkdownVignettes
Title: Tools to make developing R code easier
License: GPL-2
Description:
Author: Hadley
Maintainer: Hadley
Version: 0.1
VignetteBuilder: knitr
Suggests: knitr

devtools/tests/testthat/testLoadDir/R/a.r

message("|", getwd(), "|")

devtools/tests/testthat/testLoadDir/DESCRIPTION

Package: testLoadDir
Title: Tools to make developing R code easier
License: GPL-2
Description:
Author: Hadley
Maintainer: Hadley
Version: 0.1
Collate: a.r

devtools/tests/testthat/testCollateExtra/R/a.r

a <- 1

devtools/tests/testthat/testCollateExtra/DESCRIPTION

Package: testCollateExtra
Title: Tools to make developing R code easier
License: GPL-2
Description:
Author: Hadley
Maintainer: Hadley
Version: 0.1
Collate: a.r b.r

devtools/tests/testthat/test-remote-metadata.R

context("remote-metadata")

test_that("pkg_sha metadata from shallow clone", {
  pkg_sha <- git_sha1(path = 'shallowRepo')
  expect_equal(pkg_sha, '21d5d94011')
})

test_that("install on packages adds metadata", {
  skip_on_cran()

  # temp libPaths
  withr::with_temp_libpaths({
    test_pkg <- create_in_temp("testMetadataInstall")
    mock_use_github(test_pkg)

    # first do metadata = NULL
    install(test_pkg, quiet = TRUE, metadata = NULL)

    # cleanup code for when we are all finished
    on.exit(unload(test_pkg), add = TRUE)
    on.exit(erase(test_pkg), add = TRUE)

    # first time loading the package
    library("testMetadataInstall")
    pkg_info <- session_info()$packages
    expect_equal(pkg_info[pkg_info[, "package"] %in% "testMetadataInstall", "source"],
                 "local")

    # now use default
    r <- git2r::repository(test_pkg)

    # then use metadata
    install(test_pkg, quiet = TRUE)
    library("testMetadataInstall")
    pkg_info <- session_info()$packages
    pkg_source <- pkg_info[pkg_info[, "package"] %in% "testMetadataInstall", "source"]
    pkg_sha <- substring(git2r::commits(r)[[1]]@sha, 1, 7)
    expect_match(pkg_source, pkg_sha)

    # dirty the repo
    cat("just a test", file = file.path(test_pkg, "test.txt"))
    install(test_pkg, quiet = TRUE)
    pkg_info <- session_info()$packages
    pkg_source <- pkg_info[pkg_info[, "package"] %in% "testMetadataInstall", "source"]
    expect_match(pkg_source, "local")

    # use load_all() and reinstall
    git2r::add(r, file.path(test_pkg, "test.txt"))
    git2r::commit(r, "adding test.txt")
    load_all(test_pkg, quiet = TRUE)
    install(test_pkg, quiet = TRUE)
    pkg_info <- session_info()$packages
    pkg_source <- pkg_info[pkg_info[, "package"] %in% "testMetadataInstall", "source"]
    pkg_sha <- substring(git2r::commits(r)[[1]]@sha, 1, 7)
    expect_match(pkg_source, pkg_sha)
  })
})

devtools/tests/testthat/check-results-note.log

* using log directory ‘/private/tmp/Rtmpe1I3BZ/devtools.Rcheck’
* using R version 3.2.3 (2015-12-10)
* using platform: x86_64-apple-darwin13.4.0 (64-bit)
* using session charset: UTF-8
* using option ‘--as-cran’
* checking for file ‘devtools/DESCRIPTION’ ... OK
* this is package ‘devtools’ version ‘1.10.0.9000’
* checking package namespace information ... OK
* checking package dependencies ...
OK
* checking if this is a source package ... OK
* checking if there is a namespace ... OK
* checking for executable files ... OK
* checking for hidden files and directories ... OK
* checking for portable file names ... OK
* checking for sufficient/correct file permissions ... OK
* checking whether package ‘devtools’ can be installed ... OK
* checking installed package size ... OK
* checking package directory ... OK
* checking ‘build’ directory ... OK
* checking DESCRIPTION meta-information ... OK
* checking top-level files ... OK
* checking for left-over files ... OK
* checking index information ... OK
* checking package subdirectories ... OK
* checking R files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* checking whether the package can be loaded ... OK
* checking whether the package can be loaded with stated dependencies ... OK
* checking whether the package can be unloaded cleanly ... OK
* checking whether the namespace can be loaded with stated dependencies ... OK
* checking whether the namespace can be unloaded cleanly ... OK
* checking loading without being on the library search path ... OK
* checking dependencies in R code ... OK
* checking S3 generic/method consistency ... OK
* checking replacement functions ... OK
* checking foreign function calls ... NOTE
Registration problem:
  Evaluating ‘dll$foo’ during check gives error ‘object 'dll' not found’:
    .C(dll$foo, 0L)
See chapter ‘System and foreign language interfaces’ in the ‘Writing R Extensions’ manual.
* checking R code for possible problems ... NOTE
Found the following calls to attach():
File ‘devtools/R/package-env.r’:
  attach(NULL, name = pkg_env_name(pkg))
File ‘devtools/R/shims.r’:
  attach(e, name = "devtools_shims", warn.conflicts = FALSE)
See section ‘Good practice’ in ‘?attach’.
* checking Rd files ... OK
* checking Rd metadata ... OK
* checking Rd line widths ... OK
* checking Rd cross-references ... OK
* checking for missing documentation entries ... OK
* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... OK
* checking Rd contents ... OK
* checking for unstated dependencies in examples ... OK
* checking line endings in C/C++/Fortran sources/headers ... OK
* checking compiled code ... OK
* checking installed files from ‘inst/doc’ ... OK
* checking files in ‘vignettes’ ... OK
* checking examples ... OK
** found \donttest examples: check also with --run-donttest
* checking for unstated dependencies in ‘tests’ ... OK
* checking tests ... OK
  Running ‘has-devel.R’
  Running ‘test-that.R’ [33s/42s]
* checking for unstated dependencies in vignettes ... OK
* checking package vignettes in ‘inst/doc’ ... OK
* checking running R code from vignettes ... OK
* checking re-building of vignette outputs ... OK
* checking PDF version of manual ... OK
* DONE
Status: 2 NOTEs

devtools/tests/testthat/testTest/tests/testthat.R

library(testthat)
library(testTest)

test_check("testTest")

devtools/tests/testthat/testTest/tests/testthat/test-dummy.R

context("dummy")

## TODO: Rename context
## TODO: Add more tests

test_that("multiplication works", {
  expect_equal(2 * 2, 4)
})

devtools/tests/testthat/testTest/NAMESPACE

# Generated by roxygen2: do not edit by hand

devtools/tests/testthat/testTest/DESCRIPTION

Package: testTest
Title: Tools to make developing R code easier
License: GPL-2
Description: Package description.
Author: Hadley
Maintainer: Hadley
Version: 0.1
Suggests: testthat
RoxygenNote: 5.0.1

devtools/tests/testthat/test-vignettes.r

context("Vignettes")

test_that("Sweave vignettes copied into inst/doc", {
  if (!has_latex()) {
    skip("pdflatex not available")
  }

  clean_vignettes("testVignettes")
  expect_false("new.pdf" %in% dir("testVignettes/inst/doc"))
  expect_false("new.R" %in% dir("testVignettes/inst/doc"))
  expect_false("new.Rnw" %in% dir("testVignettes/inst/doc"))

  build_vignettes("testVignettes")
  expect_true("new.pdf" %in% dir("testVignettes/inst/doc"))
  expect_true("new.R" %in% dir("testVignettes/inst/doc"))
  expect_true("new.Rnw" %in% dir("testVignettes/inst/doc"))

  clean_vignettes("testVignettes")
  expect_false("new.pdf" %in% dir("testVignettes/inst/doc"))
  expect_false("new.R" %in% dir("testVignettes/inst/doc"))
  expect_false("new.Rnw" %in% dir("testVignettes/inst/doc"))
})

test_that("Built files are updated", {
  if (!has_latex()) {
    skip("pdflatex not available")
  }

  clean_vignettes("testVignettes")
  build_vignettes("testVignettes")
  on.exit(clean_vignettes("testVignettes"))

  output <- dir("testVignettes/inst/doc", "new", full.names = TRUE)
  first <- file.info(output)$mtime

  Sys.sleep(1)
  build_vignettes("testVignettes")
  second <- file.info(output)$mtime

  expect_true(all(second > first))
})

if (packageVersion("knitr") >= 1.2) {
  test_that("Rmarkdown vignettes copied into inst/doc", {
    pkg <- as.package("testMarkdownVignettes")
    doc_path <- file.path(pkg$path, "inst", "doc")

    clean_vignettes(pkg)
    expect_false("test.html" %in% dir(doc_path))
    expect_false("test.R" %in% dir(doc_path))
    expect_false("test.Rmd" %in% dir(doc_path))

    build_vignettes(pkg)
    expect_true("test.html" %in% dir(doc_path))
    expect_true("test.R" %in% dir(doc_path))
    expect_true("test.Rmd" %in% dir(doc_path))

    clean_vignettes(pkg)
    expect_false("test.html" %in% dir(doc_path))
    expect_false("test.R" %in% dir(doc_path))
    expect_false("test.Rmd" %in% dir(doc_path))
  })

  test_that("dependencies argument", {
    pkg <- as.package("testMarkdownVignettes")
    doc_path <- file.path(pkg$path, "inst", "doc")
    clean_vignettes(pkg)
    on.exit(clean_vignettes(pkg), add = TRUE)

    installed_deps <- NULL
    with_mock(
      install_deps = function(pkg, dependencies, ...) installed_deps <<- dependencies,
      build_vignettes(pkg, FALSE)
    )
    expect_false(installed_deps)
  })
}

test_that("Extra files copied and removed", {
  if (!has_latex()) {
    skip("pdflatex not available")
  }

  pkg <- as.package("testVignetteExtras")
  doc_path <- file.path(pkg$path, "inst", "doc")

  extras_path <- file.path("testVignetteExtras", "vignettes", ".install_extras")
  writeLines("a.r", extras_path)
  on.exit(unlink(extras_path))

  clean_vignettes(pkg)
  expect_false("a.r" %in% dir(doc_path))

  build_vignettes(pkg)
  expect_true("a.r" %in% dir(doc_path))

  clean_vignettes(pkg)
  expect_false("a.r" %in% dir(doc_path))
})

test_that("vignettes built on install", {
  if (!has_latex()) {
    skip("pdflatex not available")
  }

  # Make sure it fails if we build without installing
  expect_error(build_vignettes("testVignettesBuilt"), "there is no package called")

  install("testVignettesBuilt", reload = FALSE, quiet = TRUE, build_vignettes = TRUE)
  unlink("testVignettesBuilt/vignettes/new.tex")
  unlink("testVignettesBuilt/vignettes/.build.timestamp")

  vigs <- vignette(package = "testVignettesBuilt")$results
  expect_equal(nrow(vigs), 1)
  expect_equal(vigs[3], "new")

  suppressMessages(remove.packages("testVignettesBuilt"))
})

devtools/tests/testthat/testImportMissing/R/a.r

a <- 1

devtools/tests/testthat/testImportMissing/DESCRIPTION

Package: testImportMissing
Title: Tools to make developing R code easier
License: GPL-2
Description:
Author: Hadley
Maintainer: Hadley
Version: 0.1
Imports: missingpackage
Collate: a.r b.r

devtools/tests/testthat/testError/R/error.r

f <- function() {
  5 * 10
}

stop("This is an error!")

devtools/tests/testthat/testError/DESCRIPTION

Package: testError
Title: Test package to check error message gives file/line info
License: GPL-2
Description:
Author: Hadley
Maintainer: Hadley
Version: 0.1

devtools/tests/testthat/testDataLazy/NAMESPACE

export(sysdata_export)

devtools/tests/testthat/testDataLazy/data/a.rda
[binary rda payload omitted]

devtools/tests/testthat/testDataLazy/data/b.r

b <- 2

devtools/tests/testthat/testDataLazy/R/sysdata.rda
[binary rda payload omitted]

devtools/tests/testthat/testDataLazy/DESCRIPTION

Package: testDataLazy
Title: Tools to make developing R code easier
License: GPL-2
Description:
Author: Hadley
Maintainer: Hadley
Version: 0.1
LazyData: true
devtools/tests/testthat/testS4import/NAMESPACE

importClassesFrom(testS4export, class_to_export)

devtools/tests/testthat/testS4import/R/all.r

setClass('derived', contains = 'class_to_export')

devtools/tests/testthat/testS4import/DESCRIPTION

Package: testS4import
Title: reproduce S4 import bug with devtools
Version: 0.1
Description: reproduce S4 import bug with devtools. See testS4export
Author: Karl Forner
Maintainer: Karl Forner
Depends: R (>= 2.15)
Imports: methods, testS4export
Suggests: testthat (>= 0.7.1.99),
License: GPL (>= 2)
Collate: all.r

devtools/tests/testthat/testDllLoad/src/null-test.c

#include <R.h>
#include <Rinternals.h>

SEXP null_test() {
  return R_NilValue;
}

SEXP null_test2() {
  return R_NilValue;
}

devtools/tests/testthat/testDllLoad/NAMESPACE

useDynLib(testDllLoad)
useDynLib(testDllLoad, null_test2)
export(nulltest)
export(nulltest2)

devtools/tests/testthat/testDllLoad/R/a.r

a <- 1

nulltest <- function() {
  .Call("null_test", PACKAGE = "testDllLoad")
}

nulltest2 <- function() {
  .Call(null_test2)
}

devtools/tests/testthat/testDllLoad/DESCRIPTION

Package: testDllLoad
Title: Test package for loading and unloading DLLs
License: GPL-2
Description:
Author: Hadley
Maintainer: Hadley
Version: 0.1
Collate: a.r

devtools/tests/testthat/testLoadHooks/R/a.r

a <- 1
b <- 1
c <- 1

onload_lib <- ""
onattach_lib <- ""

.onLoad <- function(lib, pkg) {
  onload_lib <<- lib
  a <<- a + 1
}

.onAttach <- function(lib, pkg) {
  onattach_lib <<- lib

  # Attempt to modify b in namespace. This should throw an error
  # in a real install+load because namespace is locked. But with
  # load_all, it will work because the namespace doesn't get locked.
  try(b <<- b + 1, silent = TRUE)

  # Now modify c in package environment
  env <- as.environment("package:testLoadHooks")
  env$c <- env$c + 1
}

.onUnload <- function(libpath) {
  # Increment this variable if it exists in the global env
  if (exists(".__testLoadHooks__", .GlobalEnv)) {
    .GlobalEnv$.__testLoadHooks__ <- .GlobalEnv$.__testLoadHooks__ + 1
  }
}

devtools/tests/testthat/testLoadHooks/DESCRIPTION

Package: testLoadHooks
Title: Tools to make developing R code easier
License: GPL-2
Description:
Author: Hadley
Maintainer: Hadley
Version: 0.1
Collate: a.r

devtools/tests/testthat/test-update.R

context("update.package_deps")

## -2 = not installed, but available on CRAN
## -1 = installed, but out of date
##  0 = installed, most recent version
##  1 = installed, version ahead of CRAN
##  2 = package not on CRAN

test_that("update.package_deps", {
  object <- data.frame(
    stringsAsFactors = FALSE,
    package = c("dotenv", "falsy", "magrittr"),
    installed = c("1.0", "1.0", "1.0"),
    available = c("1.0", NA, "1.0"),
    diff = c(0L, 2L, 0L),
    remote = c("dotenv", "falsy", "magrittr")  # these are not actual remotes
  )
  class(object) <- c("package_deps", "data.frame")

  expect_message(
    update(object, quiet = FALSE, upgrade = FALSE),
    "Skipping 1 unavailable package: falsy"
  )

  object <- data.frame(
    stringsAsFactors = FALSE,
    package = c("dotenv", "falsy", "magrittr"),
    installed = c("1.0", "1.1", "1.0"),
    available = c("1.0", "1.0", "1.0"),
    diff = c(0L, 1L, 0L),
    remote = c("dotenv", "falsy", "magrittr")  # these are not actual remotes
  )
  class(object) <- c("package_deps", "data.frame")

  expect_message(
    update(object, quiet = FALSE, upgrade = FALSE),
    "Skipping 1 package ahead of CRAN: falsy"
  )

  object <- data.frame(
    stringsAsFactors = FALSE,
    package = c("dotenv", "falsy", "magrittr"),
    installed = c("1.0", "1.0", NA),
    available = c("1.0", "1.1", "1.0"),
    diff = c(0L, 1L, -2L),
    remote = c("dotenv", "falsy", "magrittr")  # these are not actual remotes
  )
  class(object) <- c("package_deps", "data.frame")

  with_mock(
    `devtools::install_remotes` = function(packages, ...) packages,
    expect_equal(
      update(object, upgrade = FALSE),
      "magrittr"
    )
  )
})

devtools/tests/testthat/test-namespace.r

context("Namespace")

# Is e an ancestor environment of x?
is_ancestor_env <- function(e, x) {
  if (identical(e, x))
    return(TRUE)
  else if (identical(x, emptyenv()))
    return(FALSE)
  else
    is_ancestor_env(e, parent.env(x))
}

# Get parent environment n steps deep
parent_env <- function(e, n = 1) {
  if (n == 0) e
  else parent_env(parent.env(e), n - 1)
}

test_that("Loaded namespaces have correct version", {
  load_all("testNamespace")
  expect_identical(c(version = "0.1"), getNamespaceVersion("testNamespace"))
  unload("testNamespace")
})

test_that("Exported objects are visible from global environment", {
  # a is listed as an export in NAMESPACE, b is not. But with load_all(),
  # they should both be visible in the global env.
  load_all("testNamespace")
  expect_equal(a, 1)
  expect_equal(b, 2)
  unload("testNamespace")

  # With export_all = FALSE, only the listed export should be visible
  # in the global env.
  load_all("testNamespace", export_all = FALSE)
  expect_equal(a, 1)
  expect_false(exists("b"))
  unload("testNamespace")
})

test_that("Missing exports don't result in error", {
  expect_warning(load_all("testMissingNsObject"))
  nsenv <- ns_env("testMissingNsObject")
  expect_equal(nsenv$a, 1)
  unload("testMissingNsObject")
})

test_that("All objects are loaded into namespace environment", {
  load_all("testNamespace")
  nsenv <- ns_env("testNamespace")
  expect_equal(nsenv$a, 1)
  expect_equal(nsenv$b, 2)
  unload("testNamespace")
})

test_that("All objects are copied to package environment", {
  load_all("testNamespace")
  pkgenv <- pkg_env("testNamespace")
  expect_equal(pkgenv$a, 1)
  expect_equal(pkgenv$b, 2)
  unload("testNamespace")

  # With export_all = FALSE, only the listed export should be copied
  load_all("testNamespace", export_all = FALSE)
  pkgenv <- pkg_env("testNamespace")
  expect_equal(pkgenv$a, 1)
  expect_false(exists("b", envir = pkgenv))
  unload("testNamespace")
})

test_that("Unloading and reloading a package works", {
  load_all("testNamespace")
  expect_equal(a, 1)

  # A load_all() again without unloading shouldn't change things
  load_all("testNamespace")
  expect_equal(a, 1)

  # Unloading should remove objects
  unload("testNamespace")
  expect_false(exists('a'))

  # Loading again should work
  load_all("testNamespace")
  expect_equal(a, 1)

  # Loading with reset should work
  load_all("testNamespace", reset = TRUE)
  expect_equal(a, 1)

  unload("testNamespace")
})

test_that("Namespace, imports, and package environments have correct hierarchy", {
  load_all("testNamespace")

  pkgenv <- pkg_env("testNamespace")
  nsenv <- ns_env("testNamespace")
  impenv <- imports_env("testNamespace")

  expect_identical(parent_env(nsenv, 1), impenv)
  expect_identical(parent_env(nsenv, 2), .BaseNamespaceEnv)
  expect_identical(parent_env(nsenv, 3), .GlobalEnv)

  # pkgenv should be an ancestor of the global environment
  expect_true(is_ancestor_env(pkgenv, .GlobalEnv))

  unload("testNamespace")
})

test_that("unload() removes package environments from search", {
  load_all("testNamespace")
  pkgenv <- pkg_env("testNamespace")
  nsenv <- ns_env("testNamespace")
  unload("testNamespace")
  unload(inst("compiler"))
  unload(inst("bitops"))

  # Should report not loaded for package and namespace environments
  expect_false(is_attached("testNamespace"))
  expect_false(is_loaded("testNamespace"))

  # R's asNamespace function should error
  expect_error(asNamespace("testNamespace"))

  # pkgenv should NOT be an ancestor of the global environment.
  # This is what makes the objects inaccessible from global env.
  expect_false(is_ancestor_env(pkgenv, .GlobalEnv))
  # Another check of same thing
  expect_false(pkg_env_name("testNamespace") %in% search())
})

test_that("Environments have the correct attributes", {
  load_all("testNamespace")
  pkgenv <- pkg_env("testNamespace")
  impenv <- imports_env("testNamespace")

  # as.environment finds the same package environment
  expect_identical(pkgenv, as.environment("package:testNamespace"))

  # Check name attribute of package environment
  expect_identical(attr(pkgenv, "name"), "package:testNamespace")

  # Check path attribute of package environment
  if (has_tests()) {
    wd <- normalizePath(devtest("testNamespace"))
    expect_identical(wd, attr(pkgenv, "path"))
  }

  # Check name attribute of imports environment
  expect_identical(attr(impenv, "name"), "imports:testNamespace")

  unload("testNamespace")
})

devtools/tests/testthat/test-s4-export.r

context("s4-export")

test_that("importing an S4 exported by another pkg with export_all = FALSE", {
  load_all("testS4export", export_all = FALSE)
  # this used to crash with error:
  # class "class_to_export" is not exported by 'namespace:testS4export'
  load_all("testS4import", export_all = FALSE)
  expect_true(isClassDef(getClass('derived')))

  # cleanup
  unload('testS4import')
  unload('testS4export')
})

devtools/tests/testthat/test-rtools.r

if (.Platform$OS.type == "windows") {

context("Rtools tests")

with_rtools_path <- function(path, code) {
  bin <- c(file.path(path, "bin"), file.path(path, "gcc-4.6.3", "bin"))
  old <- set_path(bin)
  on.exit(set_path(old))
  force(code)
}

test_that("rtools found on path if present", {
  with_rtools_path("rtools-2.15", {
    rt <- scan_path_for_rtools(gcc49 = FALSE)
    expect_equal(rt$version, "2.15")
  })
})

test_that("out of date rtools is not compatible", {
  with_rtools_path("rtools-2.15", {
    rt <- scan_path_for_rtools(gcc49 = FALSE)
    expect_false(is_compatible(rt))
  })
})

test_that("rtools must be complete to be located", {
  with_rtools_path("rtools-no-gcc", {
    rt <- scan_path_for_rtools(gcc49 = FALSE)
    expect_equal(rt, NULL)
  })
})

test_that("correct path doesn't need fixing", {
  with_rtools_path("rtools-manual", {
    rt <- scan_path_for_rtools(gcc49 = FALSE)
    expect_equal(rt$version, NULL)
  })
})

with_mock(
  `devtools:::RCMD` = function(...) "invalid_path",
  `base::file.info` = function(x) data.frame(exe = if (grepl("32", x)) "win32" else "win64"),
  {
    test_that("gcc-493 directory structure is found on 32 bit", {
      withr::with_path(file.path("rtools-gcc493", "bin"), {
        rt <- scan_path_for_rtools(gcc49 = TRUE, arch = "32")
        expect_equal(rt$version, "3.3")
      })
    })

    test_that("gcc-493 directory structure is found on 64 bit", {
      withr::with_path(file.path("rtools-gcc493", "bin"), {
        rt <- scan_path_for_rtools(gcc49 = TRUE, arch = "64")
        expect_equal(rt$version, "3.3")
      })
    })

    test_that("gcc-493 directory structure like winbuilder is found on 32 bit", {
      withr::with_path(file.path("rtools-gcc493-winbuilder", "bin"), {
        rt <- scan_path_for_rtools(gcc49 = TRUE, arch = "32")
        expect_equal(rt$version, "3.3")
      })
    })
  })
}

devtools/tests/testthat/testCheckExtrafile/NAMESPACE

export(a)

devtools/tests/testthat/testCheckExtrafile/R/a.r

#' A number.
#' @export
a <- 1

devtools/tests/testthat/testCheckExtrafile/DESCRIPTION

Package: testCheckExtrafile
Title: Tools to make developing R code easier
License: GPL-2
Description:
Author: Hadley
Maintainer: Hadley
Version: 0.1
Collate: 'a.r'

devtools/tests/testthat/testCheckExtrafile/an_extra_file

This is an extra file in the package.
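The test-rtools.r helper with_rtools_path() above scopes a PATH change to one block of code via devtools' internal set_path() plus on.exit(). The same scoped-mutation pattern can be written against base R alone; the helper name below is illustrative and not part of devtools:

```r
# Run `code` with `dirs` prepended to PATH, restoring the old PATH afterwards
# (even if `code` throws an error).
with_prepended_path <- function(dirs, code) {
  old <- Sys.getenv("PATH")
  on.exit(Sys.setenv(PATH = old), add = TRUE)
  Sys.setenv(PATH = paste(c(dirs, old), collapse = .Platform$path.sep))
  # `code` is an unevaluated promise; forcing it here evaluates the caller's
  # expression under the modified PATH
  force(code)
}
```

Because `code` is passed lazily, it is only evaluated inside force(), i.e. after PATH has been modified; the on.exit() handler guarantees restoration, which is why the surrounding tests can mutate PATH freely without leaking state.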
devtools/tests/testthat/testCheckExtrafile/man/a.Rd

\docType{data}
\name{a}
\alias{a}
\title{A number.}
\format{num 1}
\usage{
a
}
\description{
A number.
}
\keyword{datasets}

devtools/tests/testthat/test-install.R

context("Install")

test_that("install(dependencies = FALSE) doesn't query available_packages()", {
  withr::with_temp_libpaths(
    with_mock(
      `devtools::available_packages` = function(...) stop("available_packages() called"),
      expect_error(install("testNamespace", quiet = TRUE), "available_packages"),
      expect_error(install("testNamespace", quiet = TRUE, dependencies = FALSE), NA)
    )
  )
})

devtools/tests/testthat/test-test.r

context("Test")

test_that("Package can be tested with testthat not on search path", {
  testthat_pos <- which(search() == "package:testthat")
  if (length(testthat_pos) > 0) {
    testthat_env <- detach(pos = testthat_pos)
    on.exit(attach(testthat_env, testthat_pos), add = TRUE)
  }

  test("testTest", reporter = "stop")
  expect_true(TRUE)

  test("testTestWithDepends", reporter = "stop")
  expect_true(TRUE)
})

test_that("Filtering works with devtools::test", {
  test("testTest", filter = "dummy", reporter = "stop")
  expect_true(TRUE)
})

devtools/tests/testthat/testNamespace/NAMESPACE

export(a)
export(bitAnd)
import(compiler)
importFrom(bitops, bitAnd)
importFrom(bitops, bitOr)

devtools/tests/testthat/testNamespace/R/b.r

b <- 2

devtools/tests/testthat/testNamespace/R/a.r

a <- 1

devtools/tests/testthat/testNamespace/DESCRIPTION

Package: testNamespace
Title: Tools to make developing R code easier
License: GPL-2
Description:
Author: Hadley
Maintainer: Hadley
Version: 0.1
Imports: compiler, bitops
Collate: a.r b.r

devtools/tests/testthat/test-load.r

context("Loading")

test_that("Package root and subdirectory is working directory when loading", {
  expect_message(load_all("testLoadDir"), "[|].*/testLoadDir[|]")
  expect_message(load_all(file.path("testLoadDir", "R")), "[|].*/testLoadDir[|]")
})

test_that("user is queried if no package structure present", {
  with_mock(
    `devtools::interactive` = function() TRUE,
    `devtools::menu` = function(...) stop("menu() called"),
    `devtools::setup` = function(...) stop("setup() called"),
    `devtools::package_file` = function(..., path) file.path(path, ...),
    expect_error(load_all(file.path("testLoadDir", "R")), "menu[(][)] called")
  )
})

test_that("setup is called upon user consent if no package structure present", {
  with_mock(
    `devtools::interactive` = function() TRUE,
    `devtools::menu` = function(choices, ...) match("Yes", choices),
    `devtools::setup` = function(...) stop("setup() called"),
    `devtools::package_file` = function(..., path) file.path(path, ...),
    expect_error(load_all(file.path("testLoadDir", "R")), "setup[(][)] called")
  )
})

test_that("setup is called if no package structure present", {
  with_mock(
    `devtools::menu` = function(...) stop("menu() called"),
    `devtools::setup` = function(...) stop("setup() called"),
    `devtools::package_file` = function(..., path) file.path(path, ...),
    expect_error(load_all(file.path("testLoadDir", "R"), create = TRUE), "setup[(][)] called")
  )
})

test_that("error is thrown if no package structure present", {
  with_mock(
    `devtools::menu` = function(...) stop("menu() called"),
    `devtools::setup` = function(...) stop("setup() called"),
    `devtools::package_file` = function(..., path) file.path(path, ...),
    expect_error(load_all(file.path("testLoadDir", "R"), create = FALSE), "No description at")
  )
})

devtools/tests/testthat/test-uninstall.r

context("Uninstall")

test_that("uninstall() unloads and removes from library", {
  # Make a temp lib directory to install test package into
  old_libpaths <- .libPaths()
  tmp_libpath <- file.path(tempdir(), "devtools_test")
  if (!dir.exists(tmp_libpath)) dir.create(tmp_libpath)
  .libPaths(c(tmp_libpath, .libPaths()))

  # Reset the libpath on exit
  on.exit(.libPaths(old_libpaths), add = TRUE)

  # Install package
  install("testHelp", quiet = TRUE)
  expect_true(require(testHelp))
  expect_true("testHelp" %in% loaded_packages()$package)

  # Uninstall package
  uninstall("testHelp", quiet = TRUE)
  expect_false("testHelp" %in% loaded_packages()$package)
  expect_warning(expect_false(require(testHelp, quietly = TRUE)),
                 paste0("there is no package called ", sQuote("testHelp")))
})

devtools/tests/testthat/test-help.r

context("help")

test_that("shim_help behaves the same as utils::help for non-devtools-loaded packages", {
  # stats wasn't loaded with devtools. There are many combinations of calling
  # with quotes and without; make sure they're the same both ways. Need to index
  # in using [1] to drop attributes for which there are unimportant differences.
expect_identical(shim_help(lm)[1], utils::help(lm)[1]) expect_identical(shim_help(lm, stats)[1], utils::help(lm, stats)[1]) expect_identical(shim_help(lm, 'stats')[1], utils::help(lm, 'stats')[1]) expect_identical(shim_help('lm')[1], utils::help('lm')[1]) expect_identical(shim_help('lm', stats)[1], utils::help('lm', stats)[1]) expect_identical(shim_help('lm', 'stats')[1], utils::help('lm', 'stats')[1]) }) test_that("shim_help behaves the same as utils::help for nonexistent objects", { expect_equal(length(shim_help(foofoo)), 0) expect_equal(length(shim_help("foofoo")), 0) }) test_that("shim_question behaves the same as utils::? for non-devtools-loaded packages", { expect_identical(shim_question(lm)[1], utils::`?`(lm)[1]) expect_identical(shim_question(lm(123))[1], utils::`?`(lm(123))[1]) expect_identical(shim_question(`lm`)[1], utils::`?`(`lm`)[1]) expect_identical(shim_question('lm')[1], utils::`?`('lm')[1]) }) test_that("shim_question behaves the same as utils::? for nonexistent objects", { expect_equal(length(shim_question(foofoo)), 0) expect_equal(length(shim_question(`foofoo`)), 0) expect_equal(length(shim_question("foofoo")), 0) # If given a function call with nonexistent function, error expect_error(utils::`?`(foofoo(123))) expect_error(shim_question(foofoo(123))) }) test_that("help and ? find files for devtools-loaded packages", { load_all('testHelp') # We can't test dev_help or help directly, because instead of returning an # object, they display the Rd file directly. But dev_help uses find_topic, # and we can test that. 
expect_true(!is.null(find_topic('foofoo'))) expect_null(find_topic('bad_value')) unload('testHelp') }) devtools/tests/testthat/testTestWithDepends/0000755000176200001440000000000012656131117021074 5ustar liggesusersdevtools/tests/testthat/testTestWithDepends/tests/0000755000176200001440000000000012656131117022236 5ustar liggesusersdevtools/tests/testthat/testTestWithDepends/tests/testthat.R0000644000176200001440000000012212656131117024214 0ustar liggesuserslibrary(testthat) library(testTestWithDepends) test_check("testTestWithDepends") devtools/tests/testthat/testTestWithDepends/tests/testthat/0000755000176200001440000000000013201030625024063 5ustar liggesusersdevtools/tests/testthat/testTestWithDepends/tests/testthat/test-dummy.R0000644000176200001440000000020313200623656026324 0ustar liggesuserscontext("dummy") ## TODO: Rename context ## TODO: Add more tests test_that("multiplication works", { expect_equal(2 * 2, 4) }) devtools/tests/testthat/testTestWithDepends/NAMESPACE0000644000176200001440000000003112656131117022305 0ustar liggesusersexportPattern("^[^\\.]") devtools/tests/testthat/testTestWithDepends/DESCRIPTION0000644000176200001440000000034012656131117022577 0ustar liggesusersPackage: testTestWithDepends Title: Tools to make developing R code easier License: GPL-2 Description: Author: Hadley Maintainer: Hadley Version: 0.1 Depends: R Suggests: testthat devtools/tests/testthat/test-package.R0000644000176200001440000000045713200623656017617 0ustar liggesuserscontext('package') test_that("it can load from outside of package root", { expect_false('testHooks' %in% loadedNamespaces()) load_all(file.path("testHooks")) expect_true('testHooks' %in% loadedNamespaces()) unload(file.path("testHooks")) expect_false('testHooks' %in% loadedNamespaces()) }) devtools/tests/testthat/rtools-2.15/0000755000176200001440000000000012634340060017016 5ustar liggesusersdevtools/tests/testthat/rtools-2.15/Rtools.txt0000755000176200001440000000717512464225256021070 0ustar liggesusers 
Rtools Collection 2.15.0.1919 This is the Rtools.txt file, which will be installed in the main Rtools directory. See also the README.txt file there, which describes the origin of some of the tools. The tools installed in the Rtools\MinGW directory are from the MinGW distribution. CYGWIN Some of the R tools use the Cygwin DLLs, which are included. If you already have Cygwin installed, you should not install these (but see "EXISTING CYGWIN INSTALLATIONS" below). REMAINING TASKS This installer doesn't install all of the tools necessary to build R or R packages, because of license or size limitations. The remaining tools are all available online (at no charge) as described below. TO BUILD R PACKAGES, you may optionally want items 1 and 2 below (LaTeX and the HTML Help Workshop). TO BUILD R, you need these plus item 3 below (Inno Setup). Finally, the Rtools installer will optionally edit your PATH variable as follows: PATH=c:\Rtools\bin;c:\Rtools\MinGW\bin;c:\R\bin; (where you will substitute appropriate directories for the ones listed above, but please keep the path in the same order as shown. LaTeX and the HTML Help Workshop should be installed among the "others".) REMAINING ITEMS 1. You may install LaTeX, available from http://www.miktex.org LaTeX is used to build .pdf forms of documentation. 2. You need the Inno Setup installer, available from http://www.innosetup.com to build the R installer. 
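The instructions above insist that the Rtools entries come before R itself on `PATH`. A small illustrative check of that ordering, using the example value from the text (the path strings are the document's examples, not real installations):

```r
# Split the documented Windows PATH value and confirm the ordering the
# installer instructions require: c:\Rtools\bin before c:\R\bin.
path <- "c:\\Rtools\\bin;c:\\Rtools\\MinGW\\bin;c:\\R\\bin"
entries <- strsplit(path, ";", fixed = TRUE)[[1]]

stopifnot(entries[1] == "c:\\Rtools\\bin")
stopifnot(match("c:\\Rtools\\bin", entries) < match("c:\\R\\bin", entries))
```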
VERSIONS This installer includes the following versions of the MinGW packages, obtained from http://sourceforge.net/projects/mingw/files/: binutils-2.20.51-1-mingw32-bin.tar.lzma gcc-c++-4.5.0-1-mingw32-bin.tar.lzma gcc-core-4.5.0-1-mingw32-bin.tar.lzma gcc-fortran-4.5.0-1-mingw32-bin.tar.lzma libgcc-4.5.0-1-mingw32-dll-1.tar.lzma libgmp-5.0.1-1-mingw32-dll-10.tar.lzma libgomp-4.5.0-1-mingw32-dll-1.tar.lzma libmpc-0.8.1-1-mingw32-dll-2.tar.lzma libmpfr-2.4.1-1-mingw32-dll-1.tar.lzma libssp-4.5.0-1-mingw32-dll-0.tar.lzma mingwrt-3.18-mingw32-dev.tar.gz mingwrt-3.18-mingw32-dll.tar.gz w32api-3.15-1-mingw32-dev.tar.lzma It also includes the MinGW-64 version of gcc pre-4.6.3, compiled by Brian Ripley, obtained from his http://www.stats.ox.ac.uk/pub/Rtools/ web page as multi.zip. The Cygwin tools and DLLs were updated on March 25, 2011. They are taken from base-files 4.0-6 coreutils 8.10-1 cygwin 1.7.8-1 diffutils 2.9-1 findutils 4.5.9-1 gawk 3.1.8-1 grep 2.6.3-1 gzip 1.4-1 texinfo 4.13-3 The bitmap libraries are based on the following versions: jpegsrc.v8c libpng-1.5.8 tiff-3.9.1 Tcl/Tk is version 8.5.8. tar is a locally modified version of tar version 1.21. EXISTING CYGWIN INSTALLATIONS If you already have a full Cygwin installation, then you should not install our Cygwin DLLs in the Rtools/bin directory. You should make sure your existing cygwin/bin directory is on the path (*after* all the other entries listed above) and use the DLLs from there. However, this may not work if your Cygwin installation is too old. In that case the Rtools utilities will fail to run. To fix this, you should update the Cygwin installation, or (with great care!) replace the DLLs with the ones from the Rtools distribution. Be very careful, because if you have incompatible DLLs, your Cygwin tools will stop working. 
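Each rtools fixture pairs this README with a one-line `VERSION.txt` (e.g. `Rtools version 2.15.001`), and the detection tests compare against the leading major.minor part of that line. A hedged sketch of extracting it (the actual parsing inside devtools may differ):

```r
# Pull the leading major.minor version out of a VERSION.txt line.
version_from_line <- function(line) {
  regmatches(line, regexpr("[0-9]+\\.[0-9]+", line))
}

stopifnot(identical(version_from_line("Rtools version 2.15.001"), "2.15"))
stopifnot(identical(version_from_line("Rtools version 3.3.0.1959"), "3.3"))
```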
devtools/tests/testthat/rtools-2.15/VERSION.txt0000755000176200001440000000003112464225256020713 0ustar liggesusersRtools version 2.15.001 devtools/tests/testthat/rtools-2.15/bin/0000755000176200001440000000000012634337746017607 5ustar liggesusersdevtools/tests/testthat/rtools-2.15/bin/ls.exe0000644000176200001440000000000012416621515020702 0ustar liggesusersdevtools/tests/testthat/rtools-2.15/gcc-4.6.3/0000755000176200001440000000000012634340170020222 5ustar liggesusersdevtools/tests/testthat/rtools-2.15/gcc-4.6.3/bin/0000755000176200001440000000000012634340577021005 5ustar liggesusersdevtools/tests/testthat/rtools-2.15/gcc-4.6.3/bin/gcc.exe0000644000176200001440000000000012416621515022222 0ustar liggesusersdevtools/tests/testthat/testVignettes/0000755000176200001440000000000013200656427017770 5ustar liggesusersdevtools/tests/testthat/testVignettes/NAMESPACE0000644000176200001440000000000012416621515021173 0ustar liggesusersdevtools/tests/testthat/testVignettes/vignettes/0000755000176200001440000000000013200624356021774 5ustar liggesusersdevtools/tests/testthat/testVignettes/vignettes/new.Rnw0000644000176200001440000000010612416621515023254 0ustar liggesusers\documentclass[oneside]{article} \begin{document} Test \end{document}devtools/tests/testthat/testVignettes/DESCRIPTION0000644000176200001440000000027412416621515021477 0ustar liggesusersPackage: testVignettes Title: Tools to make developing R code easier License: GPL-2 Description: Author: Hadley Maintainer: Hadley Version: 0.1 devtools/tests/testthat/rtools-no-gcc/0000755000176200001440000000000012634340207017602 5ustar liggesusersdevtools/tests/testthat/rtools-no-gcc/Rtools.txt0000755000176200001440000000717512464225256021651 0ustar liggesusers Rtools Collection 2.15.0.1919 This is the Rtools.txt file, which will be installed in the main Rtools directory. See also the README.txt file there, which describes the origin of some of the tools. 
The tools installed in the Rtools\MinGW directory are from the MinGW distribution. CYGWIN Some of the R tools use the Cygwin DLLs, which are included. If you already have Cygwin installed, you should not install these (but see "EXISTING CYGWIN INSTALLATIONS" below). REMAINING TASKS This installer doesn't install all of the tools necessary to build R or R packages, because of license or size limitations. The remaining tools are all available online (at no charge) as described below. TO BUILD R PACKAGES, you may optionally want items 1 and 2 below (LaTeX and the HTML Help Workshop). TO BUILD R, you need these plus item 3 below (Inno Setup). Finally, the Rtools installer will optionally edit your PATH variable as follows: PATH=c:\Rtools\bin;c:\Rtools\MinGW\bin;c:\R\bin; (where you will substitute appropriate directories for the ones listed above, but please keep the path in the same order as shown. LaTeX and the HTML Help Workshop should be installed among the "others".) REMAINING ITEMS 1. You may install LaTeX, available from http://www.miktex.org LaTeX is used to build .pdf forms of documentation. 2. You need the Inno Setup installer, available from http://www.innosetup.com to build the R installer. 
VERSIONS This installer includes the following versions of the MinGW packages, obtained from http://sourceforge.net/projects/mingw/files/: binutils-2.20.51-1-mingw32-bin.tar.lzma gcc-c++-4.5.0-1-mingw32-bin.tar.lzma gcc-core-4.5.0-1-mingw32-bin.tar.lzma gcc-fortran-4.5.0-1-mingw32-bin.tar.lzma libgcc-4.5.0-1-mingw32-dll-1.tar.lzma libgmp-5.0.1-1-mingw32-dll-10.tar.lzma libgomp-4.5.0-1-mingw32-dll-1.tar.lzma libmpc-0.8.1-1-mingw32-dll-2.tar.lzma libmpfr-2.4.1-1-mingw32-dll-1.tar.lzma libssp-4.5.0-1-mingw32-dll-0.tar.lzma mingwrt-3.18-mingw32-dev.tar.gz mingwrt-3.18-mingw32-dll.tar.gz w32api-3.15-1-mingw32-dev.tar.lzma It also includes the MinGW-64 version of gcc pre-4.6.3, compiled by Brian Ripley, obtained from his http://www.stats.ox.ac.uk/pub/Rtools/ web page as multi.zip. The Cygwin tools and DLLs were updated on March 25, 2011. They are taken from base-files 4.0-6 coreutils 8.10-1 cygwin 1.7.8-1 diffutils 2.9-1 findutils 4.5.9-1 gawk 3.1.8-1 grep 2.6.3-1 gzip 1.4-1 texinfo 4.13-3 The bitmap libraries are based on the following versions: jpegsrc.v8c libpng-1.5.8 tiff-3.9.1 Tcl/Tk is version 8.5.8. tar is a locally modified version of tar version 1.21. EXISTING CYGWIN INSTALLATIONS If you already have a full Cygwin installation, then you should not install our Cygwin DLLs in the Rtools/bin directory. You should make sure your existing cygwin/bin directory is on the path (*after* all the other entries listed above) and use the DLLs from there. However, this may not work if your Cygwin installation is too old. In that case the Rtools utilities will fail to run. To fix this, you should update the Cygwin installation, or (with great care!) replace the DLLs with the ones from the Rtools distribution. Be very careful, because if you have incompatible DLLs, your Cygwin tools will stop working. 
devtools/tests/testthat/rtools-no-gcc/VERSION.txt0000755000176200001440000000003112464225256021474 0ustar liggesusersRtools version 2.15.001 devtools/tests/testthat/rtools-no-gcc/bin/0000755000176200001440000000000012634340125020351 5ustar liggesusersdevtools/tests/testthat/rtools-no-gcc/bin/ls.exe0000644000176200001440000000000012416621515021463 0ustar liggesusersdevtools/tests/testthat/testDllRcpp/0000755000176200001440000000000013200625223017346 5ustar liggesusersdevtools/tests/testthat/testDllRcpp/src/0000755000176200001440000000000013200656427020147 5ustar liggesusersdevtools/tests/testthat/testDllRcpp/src/rcpp_hello_world.cpp0000644000176200001440000000047413200625223024204 0ustar liggesusers#include // [[Rcpp::export]] SEXP rcpp_hello_world() { using namespace Rcpp; CharacterVector x = CharacterVector::create("foo", "bar"); NumericVector y = NumericVector::create(0.0, 1.0); List z = List::create(x, y); return z; } // [[Rcpp::export]] bool rcpp_test_attributes() { return true; } devtools/tests/testthat/testDllRcpp/src/Makevars0000644000176200001440000000166613200623656021652 0ustar liggesusers## Use the R_HOME indirection to support installations of multiple R version PKG_LIBS = `$(R_HOME)/bin/Rscript -e "Rcpp:::LdFlags()"` ## As an alternative, one can also add this code in a file 'configure' ## ## PKG_LIBS=`${R_HOME}/bin/Rscript -e "Rcpp:::LdFlags()"` ## ## sed -e "s|@PKG_LIBS@|${PKG_LIBS}|" \ ## src/Makevars.in > src/Makevars ## ## which together with the following file 'src/Makevars.in' ## ## PKG_LIBS = @PKG_LIBS@ ## ## can be used to create src/Makevars dynamically. This scheme is more ## powerful and can be expanded to also check for and link with other ## libraries. It should be complemented by a file 'cleanup' ## ## rm src/Makevars ## ## which removes the autogenerated file src/Makevars. ## ## Of course, autoconf can also be used to write configure files. 
This is ## done by a number of packages, but recommended only for more advanced users ## comfortable with autoconf and its related tools. devtools/tests/testthat/testDllRcpp/src/Makevars.win0000644000176200001440000000024113200623656022432 0ustar liggesusers ## Use the R_HOME indirection to support installations of multiple R version PKG_LIBS = $(shell "${R_HOME}/bin${R_ARCH_BIN}/Rscript.exe" -e "Rcpp:::LdFlags()") devtools/tests/testthat/testDllRcpp/src/RcppExports.cpp0000644000176200001440000000160613200622662023141 0ustar liggesusers// Generated by using Rcpp::compileAttributes() -> do not edit by hand // Generator token: 10BE3573-1514-4C36-9D1C-5A225CD40393 #include using namespace Rcpp; // rcpp_test_attributes bool rcpp_test_attributes(); RcppExport SEXP _testDllRcpp_rcpp_test_attributes() { BEGIN_RCPP Rcpp::RObject rcpp_result_gen; Rcpp::RNGScope rcpp_rngScope_gen; rcpp_result_gen = Rcpp::wrap(rcpp_test_attributes()); return rcpp_result_gen; END_RCPP } RcppExport SEXP rcpp_hello_world(); static const R_CallMethodDef CallEntries[] = { {"_testDllRcpp_rcpp_test_attributes", (DL_FUNC) &_testDllRcpp_rcpp_test_attributes, 0}, {"rcpp_hello_world", (DL_FUNC) &rcpp_hello_world, 0}, {NULL, NULL, 0} }; RcppExport void R_init_testDllRcpp(DllInfo *dll) { R_registerRoutines(dll, NULL, CallEntries, NULL, NULL); R_useDynamicSymbols(dll, FALSE); } devtools/tests/testthat/testDllRcpp/NAMESPACE0000644000176200001440000000014313200625223020563 0ustar liggesusersuseDynLib(testDllRcpp, .registration = TRUE) export(rcpp_hello_world) export(rcpp_test_attributes) devtools/tests/testthat/testDllRcpp/R/0000755000176200001440000000000013200656427017561 5ustar liggesusersdevtools/tests/testthat/testDllRcpp/R/RcppExports.R0000644000176200001440000000035713200622006022164 0ustar liggesusers# Generated by using Rcpp::compileAttributes() -> do not edit by hand # Generator token: 10BE3573-1514-4C36-9D1C-5A225CD40393 rcpp_test_attributes <- function() { 
.Call('_testDllRcpp_rcpp_test_attributes', PACKAGE = 'testDllRcpp') } devtools/tests/testthat/testDllRcpp/R/rcpp_hello_world.R0000644000176200001440000000000013200625223023216 0ustar liggesusersdevtools/tests/testthat/testDllRcpp/DESCRIPTION0000644000176200001440000000040213200625223021050 0ustar liggesusersPackage: testDllRcpp Title: Test package for compiling DLLs that link to Rcpp License: GPL-2 Description: Author: Hadley Maintainer: Hadley Version: 0.1 Depends: Rcpp (>= 0.10.0) LinkingTo: Rcpp RoxygenNote: 6.0.1 devtools/tests/testthat/test-description.r0000644000176200001440000000030613200623656020600 0ustar liggesuserscontext("DESCRIPTION checks") test_that("Parse DESCRIPTION file", { pkg <- as.package("testNamespace") expect_identical("0.1", pkg$version) expect_identical("testNamespace", pkg$package) }) devtools/tests/testthat/rtools-gcc493/0000755000176200001440000000000013201030625017417 5ustar liggesusersdevtools/tests/testthat/rtools-gcc493/Rtools.txt0000644000176200001440000000632312724305435021463 0ustar liggesusers Rtools Collection 3.3.0.1959 This is the Rtools.txt file, which will be installed in the main Rtools directory. See also the README.txt file there, which describes the origin of some of the tools. The tools installed in the Rtools\mingw_32, and Rtools\mingw_64 directories are from the MinGW-w64 distribution. CYGWIN Some of the R tools use the Cygwin DLLs, which are included. If you already have Cygwin installed, you should not install these (but see "EXISTING CYGWIN INSTALLATIONS" below). REMAINING TASKS This installer doesn't install all of the tools necessary to build R or R packages, because of license or size limitations. The remaining tools are all available online (at no charge) as described below. TO BUILD R PACKAGES, you may optionally want item 1 below (LaTeX). TO BUILD R, you do need item 1, and item 2 (Inno Setup) is optional if you would like to build the installer. As of R 3.2.0, the manuals are optional. 
To build them you will need item 3 below. The Rtools installer will optionally edit your PATH variable as follows: PATH=c:\Rtools\bin;c:\Rtools\gcc-4.6.3\bin; (where you will substitute appropriate directories for the ones listed above, but please keep the path in the same order as shown. LaTeX and R itself should be installed among the "others".) REMAINING ITEMS 1. You may install LaTeX, available from http://www.miktex.org LaTeX is used to build .pdf forms of documentation. 2. You need the Inno Setup installer, available from http://www.innosetup.com to build the R installer. 3. You will need Perl, e.g. from http://strawberryperl.com/ to build the manuals. VERSIONS This installer includes a multilib build of gcc 4.6.3, compiled by Brian Ripley, and separate 32- and 64-bit builds of gcc 4.9.3 and mingw-w64 v3 compiled by Jeroen Ooms and others. For use with the latter it also includes a copy of libicu55. The Cygwin tools and DLLs were updated on November 19, 2013. They are 32 bit versions taken from base-cygwin 3.3-1 coreutils 8.23-4 cygwin 1.7.33-1 diffutils 3.3-2 findutils 4.5.12-1 gawk 4.1.1-1 grep 2.21-1 gzip 1.6-1 texinfo 4.13 (used for R 3.1.x and earlier) and 5.2 (used for R 3.2.x and later). Tcl/Tk is version 8.5.8. tar is a locally modified version of tar version 1.21. EXISTING CYGWIN INSTALLATIONS If you already have a full 32 bit Cygwin installation, then you should not install our Cygwin DLLs in the Rtools/bin directory. You should make sure your existing cygwin/bin directory is on the path (*after* all the other entries listed above) and use the DLLs from there. However, this may not work if your Cygwin installation is too old. In that case the Rtools utilities will fail to run. To fix this, you should update the Cygwin installation, or (with great care!) replace the DLLs with the ones from the Rtools distribution. Be very careful, because if you have incompatible DLLs, your Cygwin tools will stop working. 
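The `rtools-gcc493` fixture reproduces the Rtools 3.3 layout this README describes: a `VERSION.txt`, a `bin/` directory, and per-architecture `mingw_32`/`mingw_64` toolchains. A toy sketch, under the assumption that detection amounts to checking for those files (the helper name is made up):

```r
# Build a throwaway fixture mirroring rtools-gcc493/ ...
root <- file.path(tempdir(), "rtools-gcc493-demo")
dir.create(file.path(root, "bin"), recursive = TRUE, showWarnings = FALSE)
dir.create(file.path(root, "mingw_64", "bin"), recursive = TRUE,
           showWarnings = FALSE)
writeLines("Rtools version 3.3.0.1959", file.path(root, "VERSION.txt"))
file.create(file.path(root, "mingw_64", "bin", "gcc.exe"))

# ... and a naive validity check for a given architecture.
looks_like_rtools <- function(root, arch) {
  file.exists(file.path(root, "VERSION.txt")) &&
    file.exists(file.path(root, paste0("mingw_", arch), "bin", "gcc.exe"))
}

stopifnot(looks_like_rtools(root, "64"))   # 64-bit toolchain present
stopifnot(!looks_like_rtools(root, "32"))  # no mingw_32 in this fixture
```

This is why the tests above can drive `scan_path_for_rtools()` purely from fixture directories plus a mocked `file.info`.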
devtools/tests/testthat/rtools-gcc493/VERSION.txt0000644000176200001440000000003312724305435021316 0ustar liggesusersRtools version 3.3.0.1959 devtools/tests/testthat/rtools-gcc493/bin/0000755000176200001440000000000012724305435020204 5ustar liggesusersdevtools/tests/testthat/rtools-gcc493/bin/ls.exe0000644000176200001440000000000012724305435021313 0ustar liggesusersdevtools/tests/testthat/rtools-gcc493/mingw_32/0000755000176200001440000000000012724305435021061 5ustar liggesusersdevtools/tests/testthat/rtools-gcc493/mingw_32/bin/0000755000176200001440000000000012724305435021631 5ustar liggesusersdevtools/tests/testthat/rtools-gcc493/mingw_32/bin/gcc.exe0000644000176200001440000000000012724305435023056 0ustar liggesusersdevtools/tests/testthat/rtools-gcc493/mingw_64/0000755000176200001440000000000012724305435021066 5ustar liggesusersdevtools/tests/testthat/rtools-gcc493/mingw_64/bin/0000755000176200001440000000000012724305435021636 5ustar liggesusersdevtools/tests/testthat/rtools-gcc493/mingw_64/bin/gcc.exe0000644000176200001440000000000012724305435023063 0ustar liggesusersdevtools/tests/testthat/testImportVersion/0000755000176200001440000000000013200623656020636 5ustar liggesusersdevtools/tests/testthat/testImportVersion/NAMESPACE0000644000176200001440000000001213200623656022046 0ustar liggesusersexport(a) devtools/tests/testthat/testImportVersion/R/0000755000176200001440000000000013200623656021037 5ustar liggesusersdevtools/tests/testthat/testImportVersion/R/b.r0000644000176200001440000000000713200623656021440 0ustar liggesusersb <- 2 devtools/tests/testthat/testImportVersion/R/a.r0000644000176200001440000000000713200623656021437 0ustar liggesusersa <- 1 devtools/tests/testthat/testImportVersion/DESCRIPTION0000644000176200001440000000040613200623656022344 0ustar liggesusersPackage: testImportVersion Title: Tools to make developing R code easier License: GPL-2 Description: Author: Hadley Maintainer: Hadley Version: 0.1 Imports: grid (>= 100.0), compiler (>= 2.0-0) 
Collate: a.r b.rdevtools/NAMESPACE0000644000176200001440000001224113200625264013330 0ustar liggesusers# Generated by roxygen2: do not edit by hand S3method("[",remotes) S3method(as.character,quietError) S3method(format,bitbucket_remote) S3method(format,check_results) S3method(format,cran_remote) S3method(format,git_remote) S3method(format,github_remote) S3method(format,local_remote) S3method(format,remotes) S3method(format,revdep_check_result) S3method(format,url_remote) S3method(github_resolve_ref,"NULL") S3method(github_resolve_ref,default) S3method(github_resolve_ref,github_pull) S3method(github_resolve_ref,github_release) S3method(print,check_results) S3method(print,doctor) S3method(print,maintainers) S3method(print,package_deps) S3method(print,packages_info) S3method(print,platform_info) S3method(print,revdep_check_result) S3method(print,session_info) S3method(print,spellcheck) S3method(remote_download,bioc_remote) S3method(remote_download,bitbucket_remote) S3method(remote_download,cran_remote) S3method(remote_download,git_remote) S3method(remote_download,github_remote) S3method(remote_download,local_remote) S3method(remote_download,svn_remote) S3method(remote_download,url_remote) S3method(remote_metadata,bioc_remote) S3method(remote_metadata,bitbucket_remote) S3method(remote_metadata,cran_remote) S3method(remote_metadata,git_remote) S3method(remote_metadata,github_remote) S3method(remote_metadata,local_remote) S3method(remote_metadata,package) S3method(remote_metadata,svn_remote) S3method(remote_metadata,url_remote) S3method(remote_package_name,bioc_remote) S3method(remote_package_name,bitbucket_remote) S3method(remote_package_name,cran_remote) S3method(remote_package_name,git_remote) S3method(remote_package_name,github_remote) S3method(remote_package_name,local_remote) S3method(remote_package_name,svn_remote) S3method(remote_package_name,url_remote) S3method(remote_sha,bioc_remote) S3method(remote_sha,bitbucket_remote) S3method(remote_sha,cran_remote) 
S3method(remote_sha,git_remote) S3method(remote_sha,github_remote) S3method(remote_sha,local_remote) S3method(remote_sha,svn_remote) S3method(remote_sha,url_remote) S3method(replay_stop,default) S3method(replay_stop,error) S3method(replay_stop,list) S3method(update,package_deps) export(RCMD) export(add_path) export(as.package) export(bash) export(build) export(build_github_devtools) export(build_vignettes) export(build_win) export(check) export(check_built) export(check_cran) export(check_failures) export(check_man) export(clean_dll) export(clean_source) export(clean_vignettes) export(compile_dll) export(compiler_flags) export(create) export(create_description) export(dev_example) export(dev_help) export(dev_meta) export(dev_mode) export(dev_package_deps) export(dev_packages) export(devtest) export(document) export(dr_devtools) export(dr_github) export(eval_clean) export(evalq_clean) export(find_rtools) export(find_topic) export(get_path) export(github_pat) export(github_pull) export(github_release) export(has_devel) export(has_tests) export(imports_env) export(in_dir) export(inst) export(install) export(install_bioc) export(install_bitbucket) export(install_cran) export(install_deps) export(install_dev_deps) export(install_git) export(install_github) export(install_local) export(install_svn) export(install_url) export(install_version) export(is.package) export(lint) export(load_all) export(load_code) export(load_data) export(load_dll) export(loaded_packages) export(missing_s3) export(ns_env) export(on_path) export(package_deps) export(package_file) export(parse_deps) export(parse_ns_file) export(pkg_env) export(r_env_vars) export(release) export(release_checks) export(reload) export(revdep) export(revdep_check) export(revdep_check_print_problems) export(revdep_check_reset) export(revdep_check_resume) export(revdep_check_save_summary) export(revdep_email) export(revdep_maintainers) export(run_examples) export(session_info) export(set_path) export(setup) 
export(setup_rtools) export(show_news) export(source_gist) export(source_url) export(spell_check) export(submit_cran) export(system_check) export(system_output) export(test) export(uninstall) export(unload) export(update_packages) export(use_appveyor) export(use_build_ignore) export(use_code_of_conduct) export(use_coverage) export(use_cran_badge) export(use_cran_comments) export(use_data) export(use_data_raw) export(use_dev_version) export(use_git) export(use_git_hook) export(use_github) export(use_github_links) export(use_gpl3_license) export(use_mit_license) export(use_news_md) export(use_package) export(use_package_doc) export(use_rcpp) export(use_readme_md) export(use_readme_rmd) export(use_revdep) export(use_rstudio) export(use_test) export(use_testthat) export(use_travis) export(use_vignette) export(uses_testthat) export(wd) export(with_collate) export(with_debug) export(with_envvar) export(with_lib) export(with_libpaths) export(with_locale) export(with_makevars) export(with_options) export(with_par) export(with_path) importFrom(memoise,memoise) importFrom(stats,update) importFrom(utils,available.packages) importFrom(utils,contrib.url) importFrom(utils,download.packages) importFrom(utils,install.packages) importFrom(utils,installed.packages) importFrom(utils,modifyList) importFrom(utils,packageDescription) importFrom(utils,packageVersion) importFrom(utils,remove.packages) devtools/NEWS.md0000644000176200001440000021010113200625404013176 0ustar liggesusers# devtools 1.13.4 * Fix test errors for upcoming testthat release. # devtools 1.13.3 * Workaround a change in how Rcpp::compileAttributes stores the symbol names that broke tests. # devtools 1.13.2 * Workaround a regression in Rcpp::compileAttributes. Add trimws implementation for R 3.1 support. # devtools 1.13.1 * Bugfix for installing from git remote and not passing git2r credentials (@james-atkins, #1498) * Fix `test()` compatibility with testthat versions 1.0.2 (#1503). 
* `install_version()`, `install_bitbucket()`, `install_local()`, `install_url()`, `install_svn()`, and `install_bioc()` gain `quiet` arguments and properly pass them to internal functions. (#1502) # devtools 1.13.0 ## New Features * `spell_check()` gains a `dict` argument to set a custom language or dictionary. * `release()` now checks documentation for spelling errors by default. * New `use_gpl3_license()` sets the license field in `DESCRIPTION` and includes a copy of the license in `LICENSE`. ## Revdep check improvements * Various minor improvements around checking of reverse dependencies (#1284, @krlmlr). All packages involved are listed at the start, and the whole process is now more resilient against package installation failures. * `revdep_check()` and `revdep_check_resume()` gain a `skip` argument which takes a character vector of packages to skip. * `revdep_check()` and `check_cran()` gain a `quiet_check` argument. You can use `quiet_check = FALSE` to see the actual text of R CMD check as it runs (not recommended with multiple threads). * `revdep_check_resume()` now takes `...`, which can be used to override settings from `revdep_check()`. For debugging a problem with package checks, try `revdep_check(threads = 1, quiet_check = FALSE)`. * `revdep_check()` collects timing information in `timing.md` (#1319, @krlmlr). * Package names and examples are sorted in case-insensitive C collation (#1322, @krlmlr). * `use_revdep()` adds a `.gitignore` entry for the check database (#1321, @krlmlr). * Own package is installed in a temporary library for revdep checking (#1338, @krlmlr). * Automated revdep check e-mails can now use the new `my_version` and `you_cant_install` variables. The e-mail template has been updated to use these variables (#1285, @krlmlr). * Installation failures are logged during revdep checking, by default in `revdep/install`. Once an installation has failed, it is not attempted a second time (#1300, @krlmlr).
* Print summary table in README.md and problems.md (#1284, @krlmlr).

## Bug fixes and minor improvements

* Handle case of an un-installed package being passed to `session_info()` (#1281).

* Use authentication when accessing the GitHub package name (#1262, @eriknil).

* `spell_check()` checks for hunspell before running (#1475, @jimvine).

* `add_desc_package()` checks for package dependencies correctly (#1463, @thomasp85).

* Remove deprecated `args` argument from `install_git()` to allow passthrough to `install` (#1373, @ReportMort).

* Added a `quiet` argument to `install_bitbucket()`, with a default value of `FALSE` (#1345, @plantarum).

* `update_packages()` allows for override of the interactive prompt (#1260, @pkq).

* `use_test()` template no longer includes useless comments (#1349).

* Add encoding support in the `test_dir()` call by adding a reference to `pkg$encoding` (#1306, @hansharhoff).

* Parse valid Git remote URLs that lack a trailing `.git`, e.g. GitHub browser URLs (#1253, @jennybc).

* Add a `check_bioconductor()` internal function to automatically install BiocInstaller if it is not installed and the user wants to do so.

* Improve Git integration. `use_git_ignore()` and `use_git_config()` gain a `quiet` argument, and tests work without setting the `user.name` and `user.email` Git configuration settings (#1320, @krlmlr).

* Improve Git status checks used in `release()` (#1205, @krlmlr).

* Improved handling of local `file://` repositories in `install()` (#1284, @krlmlr).

* `setup()` and `create()` gain a new `quiet` argument (#1284, @krlmlr).

* Avoid unnecessary query of `available_packages()` (#1269, @krlmlr).

* Add cache setting to AppVeyor template (#1290, @krlmlr).

* Fix AppVeyor test by manually installing `curl` (#1301).

* `install(dependencies = FALSE)` doesn't query the available packages anymore (@krlmlr, #1269).

* `use_travis()` now opens a webpage in your browser to more easily activate a repo.
* `use_readme_rmd()` and `use_readme()` share a common template with sections for package overview, GitHub installation (if applicable), and an example (@jennybc, #1287).

* `test()` doesn't load helpers twice anymore (@krlmlr, #1256).

* Fix auto download method selection for `install_github()` on R 3.1, which lacks "libcurl" in `capabilities()`. (@kiwiroy, #1244)

* Fix removal of vignette files by not trying to remove files twice anymore (#1291)

# devtools 1.12.0

## New features

* New `install_bioc()` function and bioc remote to install Bioconductor packages from their SVN repository.

* `install_dev_deps()` gets everything you need to start development on a source package - it installs all dependencies, and roxygen2 (#1193).

* `use_dev_version()` automates the process of switching from a release version number by tweaking the `DESCRIPTION`, adding a heading to `NEWS.md` (if present), and checking into git (if you use it) (#1076).

* `use_github()` accepts a `host` argument, similar to `install_github()` (@ijlyttle, #1101)

## Bug fixes and minor improvements

* Update with Rtools 3.4 information (@jimhester)

* devtools now uses https to access the RStudio CRAN mirror if it will work on your system (#1059)

* Handle case when a GitHub request returns a non-JSON error response. (@jimhester, #1204, #1211)

* Suggested packages, including those specified as `Remotes:`, are now installed after package installation. This allows you to use circular `Remotes:` dependencies for two related packages as long as one of the dependencies is a Suggested package. (@jimhester, #1184, hadley/dplyr#1809)

* Bug fix for installation of binary packages on Windows: they must be installed directly from a zip file. (@jimhester, #1191, #1192)

* `build_vignettes()` will now only install the "VignetteBuilder" package if it's not present, not try to upgrade it if it is (#1139).

* `clean_dll()` only removes `package_name.def` files and now operates recursively.
(@jimhester, #1175, #1159, #1161)

* `check_man()` now prints a message if no problems are found (#1187).

* `install_*` functions and `update_packages()` refactored to allow updating of packages installed using any of the install methods. (@jimhester, #1067)

* `install_github()` now uses `https://api.github.com` as the host argument, so users can specify 'http:' or other protocols if needed. (@jimhester, #1131, #1200)

* `load_all()` runs package hooks before sourcing test helper files, allowing test helpers to make use of objects created when a package is loaded or attached. (@imanuelcostigan, #1146)

* `revdep_check()` will now create the `revdep/` directory if it does not already exist (#1178).

* `source_gist()` gains a `filename` argument to specify a particular file to source from a GitHub gist. (@ateucher, #1172)

* Add a default codecov.yml file to turn off commenting with `use_coverage()` (@jimhester, #1188)

* Bug fix for 'nchar(text) : invalid multibyte string' errors when running `write_dcf()` on DESCRIPTION files with non-ASCII encodings (#1224, @jimhester).

# devtools 1.11.1

* Bug fix in `search_path_for_rtools()` using the gcc-4.9.3 toolchain when there is no rtools setting in the Windows registry. (@jimhester, #1155)

# devtools 1.11.0

## Infrastructure helpers

* `create_description()` now sets `Encoding: UTF-8`. This helps non-English package authors (#1123).

* All `use_` functions have been overhauled to be more consistent, particularly around notification. Most functions now also ask to overwrite if a file already exists (#1074).

* `use_coverage()` now adds covr to "Suggests", rather than recommending you install it explicitly in `.travis.yml`.

* `use_cran_badge()` now uses an HTTPS URL (@krlmlr, #1124).

* `use_github()` now confirms that you've picked a good title and description (#1092) and prints the url of the repo (#1063).

* `use_news()` and `use_test()` open the files in RStudio (if you're using it and have the rstudioapi package installed).
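As a sketch of the infrastructure helpers mentioned above (the package path and test name here are hypothetical, and argument names may differ slightly between devtools versions):

```r
library(devtools)

use_testthat("mypkg")                  # set up tests/testthat/ and add testthat to Suggests
use_test("myfunction", pkg = "mypkg")  # create tests/testthat/test-myfunction.R
use_news_md("mypkg")                   # add a basic NEWS.md template
```

Each helper reports what it is doing and asks before overwriting an existing file.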
* `use_testthat()` tells you what it's doing (#1056).

* `use_travis()` generates a template compatible with the newest R-travis.

* `use_readme_md()` creates a basic `README.md` template (#1064).

* `use_revdep()` has an updated template for the new revdep check system (#1090, @krlmlr).

* Removed the deprecated `use_coveralls()`, `add_rstudio_project()`, `add_test_infrastructure()`, and `add_travis()`.

## Checks and release()

* `check()` now always succeeds (instead of throwing an error when `R CMD check` finds an `ERROR`), returning an object that summarises the check failures.

* `check()` gains `run_dont_test` and `manual` arguments to control whether or not `\donttest{}` tests are tested, or manuals are built. These default to `FALSE`, but `release()` runs check with them set to `TRUE` (#1071; #1087, @krlmlr).

* The `cleanup` argument to `check()` is deprecated: it now always returns the path to the check directory.

* `check_built()` allows you to run `R CMD check` on an already built package.

* `check_cran()` suppresses X11 with `DISPLAY = ""`.

* `release()` has been tweaked to improve the order of the questions, and to ensure that you're ok with problems. It warns if both `inst/NEWS.Rd` and `NEWS.md` exist (@krlmlr, #1135), and doesn't throw an error if the Git head is detached (@krlmlr, #1136).

* `release()` gains an `args` argument to control build options, e.g. to allow passing `args = "--compact-vignettes=both"` for packages with heavy PDF vignettes (@krlmlr, #1077).

* `system_check()` gains new arguments: `path`, to control the working directory of the command, and `throw`, to control whether or not it throws an error on command failure. `env` has been renamed to the more explicit `env_vars`.

## Revdep checks

`revdep_check()` has been overhauled. All `revdep_` functions now work like other devtools functions, taking a path to the package as the first argument.
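The typical workflow matches the `revdep.R` template that devtools ships in `inst/templates` (run from the package's root directory):

```r
library("devtools")

revdep_check()                  # run R CMD check on all reverse dependencies
revdep_check_save_summary()     # write summarised results under revdep/
revdep_check_print_problems()   # list problems for pasting into cran-comments.md
```

This script is what `use_revdep()` drops into `revdep/` for you.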
`revdep_check()` now saves its results to disk as `check/check.rds`, and the other `revdep_` functions read from that cache. This also allows you to resume a partial run with `revdep_check_resume()`. This should be a big time saver if something goes unexpectedly wrong in the middle of the checks. You can blow away the cache and start afresh with `revdep_check_reset()`.

`revdep_check_save_summary()` now creates `README.md` to save one level of clicking on GitHub. It also creates a `problems.md` that contains results only for packages that had warnings or errors. Each problem is limited to at most 25 lines of output - this avoids lengthy output for failing examples.

`revdep_check_print_problems()` prints a bulleted list of problems, suitable for inclusion in your `cran-comments.md`.

Summary results are reported as they come in; every ten messages you'll get a message giving elapsed and estimated remaining time.

An experimental `revdep_email()` emails individual maintainers with their `R CMD check` summary results (#1014). See testthat and dplyr for example usage.

There were a handful of smaller fixes:

* `revdep_check()` doesn't complain about the missing `git2r` package anymore (#1068, @krlmlr).

* Package index caches for `revdep_check()` now time out after 30 minutes.

* `revdep_check_save_logs()` has been removed - it is just not that useful.

* `revdep_check_summary()` has been removed - it never should have been part of the exported API.

## Other improvements

* Devtools now uses the new gcc toolchain on Windows, if installed (@jimhester).

* `install_git()` now allows you to pass specific ssh credentials to git2r (@onlymee, #982)

* `load_all()` now sources all test helpers if you use testthat. This makes it much easier to interactively run tests (#1125). `load_all()` also correctly handles `unix` and `windows` subdirectories within `R` (@gaborcsardi, #1102)

* `build_win()` defaults to only R-devel, since this is most commonly what you want.
* Help shims now inform you that you're using development documentation (#1049).

* `git_sha1()`: fix fetching the latest git commit so that it also works for shallow git clones, i.e. clones that use `depth`. (#1048, #1046, @nparley)

# devtools 1.10.0

## New features

* `curl`, `evaluate`, `roxygen2` and `rversions` have been moved from Imports to Suggests to lighten the dependency load of devtools. If you run a function that needs one of these packages, you'll be prompted to install it (#962, @jimhester).

* Devtools uses a new strategy for detecting Rtools on Windows: it now only looks for Rtools if you need to `load_all()` or `build()` a package with compiled code. This should make it easier to work with devtools if you're developing pure R packages (#947).

* `package_file()` lets you find files inside a package. It starts by finding the root directory of the package (i.e. the directory that contains `DESCRIPTION`) (#985).

* `use_news_md()` adds a basic `NEWS.md` template (#957).

* `use_mit_license()` writes the necessary infrastructure to declare and release an R package under the MIT license in a CRAN-compliant way. (#995, @kevinushey)

* `check(cran = TRUE)` adds `--run-donttest` since you do need to test code in `\donttest{}` for CRAN submission (#1002).

## Package installation

* `install()` installs packages specified in the `Additional_repositories` field, such as drat repositories (#907, #1028, @jimhester). It correctly installs missing dependencies (#1013, @gaborcsardi). If called on a Bioconductor package, it includes the Bioconductor repositories if they are not already set (#895, @jimhester).

* `install()` gains a `metadata` argument which lets you add extra fields to the `DESCRIPTION` on install. (#1027, @rmflight)

* `install_github()` and `install_git()` only download and install the package if the remote SHA1 reference differs from the currently installed reference (#903, @jimhester).
* `install_local()` captures git and github information and stores it in the `DESCRIPTION` (#1027, @rmflight).

* `install_version()` is more robust when handling multiple repos (#943, #1030, @jimhester).

* Bugfix for the `Remotes:` feature that prevented it from working if devtools was not attached, as is done in travis-r (#936, @jimhester).

## Bug fixes and minor improvements

* `check_dev_versions()` checks only package dependencies (#983).

* `check_man()` replaces `check_doc()` (since most other functions are named after the corresponding directory). `check_doc()` will hang around as an alias for the foreseeable future (#958).

* `create()` produces a dummy namespace with a fake comment so roxygen2 will overwrite it silently (#1016).

* `create()` and `setup()` are more permissive -- they now accept a path to either a new directory or an empty directory. (#966, @kevinushey)

* `document()` now only runs `update_collate()` once.

* `load_all()` resolves a longstanding lazy load database corruption issue when reloading packages which define S3 methods on generics from base or other packages (#1001, @jimhester).

* `release_checks()` gains two new checks:

    * `check_vignette_titles()` checks that your vignette titles aren't the default "Vignette Title" (#960, @jennybc).

    * `check_news_md()` checks that `NEWS.md` isn't in your `.Rbuildignore` (since it's now supported by CRAN, #1042).

* `revdep_check()`:

    * More verbose about which package is installed (#926, @krlmlr)

    * Verifies the integrity of already downloaded package archives (#930, @krlmlr)

    * Is now more tolerant of errors when retrieving the summary for a checked package (#929, @krlmlr).

    * When `ncpus > 1`, it includes the package name in the output so you know which package has failed and can start looking at the output without needing to wait for all packages to finish (@mattdowle).

    * Uses the proper repository when `BiocInstaller::useDevel(TRUE)` (#937, @jimhester).
* Shimmed `system.file()` now respects `mustWork = TRUE` and throws an error if the file does not exist (#1034).

* `use_appveyor()` template now creates a `failure.zip` artifact instead of polluting the logs with `R CMD check` output (#1017, @krlmlr, @HenrikBengtsson).

* `use_cran_comments()` template has been improved (#1038).

* `use_data()` now warns when trying to save the same object twice, and stops if there is no object to save (#948, @krlmlr).

* `use_revdep_check()` no longer includes `revdep_check_save_logs` in the default template. I found I never used the logs and they just cluttered up the package directory (#1003).

* `with_*()` functions have moved into the withr package, and the devtools functions have been deprecated (#925, @jimhester).

# devtools 1.9.1

* Avoid importing heavy dependencies to speed up loading (#830, @krlmlr).

* Remove explicit `library(testthat)` call in `test()` (#798, @krlmlr).

* `as.package()` and `load_all()` gain a new argument `create`. Like other functions with a `pkg` argument, `load_all()` looks for a `DESCRIPTION` file in parent directories - if `create = TRUE` it will be automatically created if there's an `R/` or `data/` directory (#852, @krlmlr).

* `build_vignettes()` gains a `dependencies` argument (#825, @krlmlr).

* `build_win()` now uses `curl` instead of `RCurl` for ftp upload.

* `build_win()` asks for consent to receive e-mail at the maintainer address in interactive mode (#800, @krlmlr).

* `check()` now uses a better strategy when `cran = TRUE`. Instead of attempting to simulate `--as-cran` behaviour by turning on certain env vars, it now uses `--as-cran` and turns off problematic checks with env vars (#866). The problematic `cran_env_vars()` function has been removed.

* `find_rtools()` now looks for registry keys in both HKCU (user) and HKLM (admin) locations (@Kevin-Jin, #844)

* `install()` can now install dependencies from remote repositories by specifying them as `Remotes` in the `DESCRIPTION` file (#902, @jimhester).
See `vignette("dependencies")` for more details.

* `install_*()` detects if called on a Bioconductor package and, if so, automatically includes the Bioconductor repositories if needed (#895, @jimhester).

* `install_deps()` now automatically upgrades out of date dependencies. This is typically what you want when you're working on a development version of a package. To suppress this behaviour, set `upgrade_dependencies = FALSE` (#863). `install_deps()` is more careful with `...` - this means additional arguments to `install_*` are more likely to work (#870).

* `install_gitorious()` has been removed since gitorious no longer exists (#913).

* `load_all()` no longer fails if a `useDynLib()` entry in the NAMESPACE is incorrect. This should make it easy to recover from an incorrect `@useDynLib`, because re-documenting should now succeed.

* `release()` works for packages not located at the root of a git repository (#845, #846, @mbjones).

* `revdep_check()` now installs _suggested_ packages by default (#808), and sets the `NOT_CRAN` env var to `false` (#809). This makes testing more similar to CRAN so that more packages should pass cleanly. It also sets `RGL_USE_NULL` to `true` to stop rgl windows from popping up during testing (#897). It also downloads all source packages at the beginning - this makes life a bit easier if you're on a flaky internet connection (#906).

* New `uninstall()` removes an installed package (#820, @krlmlr).

* Add `use_coverage()` function to add codecov.io or coveralls.io to a project; deprecate `use_coveralls()` (@jimhester, #822, #818).

* `use_cran_badge()` uses the canonical url form preferred by CRAN.

* `use_data()` also works with data from the parent frame (#829, @krlmlr).
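A sketch of `use_data()` as described above (run from a package directory; the dataset name is hypothetical):

```r
library(devtools)

mydata <- data.frame(x = 1:3, y = c("a", "b", "c"))
use_data(mydata)                   # saves data/mydata.rda (exported dataset)
use_data(mydata, internal = TRUE)  # saves R/sysdata.rda (internal data)
```

Calling it a second time with the same object triggers the new warning.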
* `use_git_hook()` now creates `.git/hooks` if needed (#888) * GitHub integration extended: `use_github()` gains a `protocol` argument (ssh or https), populates URL and BugReports fields of DESCRIPTION (only if non-existent or empty), pushes to the newly created GitHub repo, and sets a remote tracking branch. `use_github_links()` is a new exported function. `dr_github()` diagnoses more possible problems. (#642, @jennybc). * `use_travis()`: Default travis script leaves notifications on default settings. * `uses_testthat()` and `check_failures()` are now exported (#824, #839, @krlmlr). * `use_readme_rmd()` uses `uses_git()` correctly (#793). * `with_debug()` now uses `with_makevars()` rather than `with_env()`, because R reads compilation variables from the Makevars rather than the environment (@jimhester, #788). * Properly reset library path after `with_lib()` (#836, @krlmlr). * `remove_s4classes()` performs a topological sort of the classes (#848, #849, @famuvie). * `load_all()` warns (instead of failing) if importing symbols, methods, or classes from `NAMESPACE` fails (@krlmlr, #921). # devtools 1.8.0 ## Helpers * New `dr_devtools()` runs some common diagnostics: are you using the latest version of R and devtools? It is run automatically by `release()` (#592). * `use_code_of_conduct()` adds a contributor code of conduct from http://contributor-covenant.org. (#729) * `use_coveralls()` allows you to easily add test coverage with coveralls (@jimhester, #680, #681). * `use_git()` sets up a package to use git, initialising the repo and checking the existing files. * `use_test()` adds a new test file (#769, @krlmlr). * New `use_cran_badge()` adds a CRAN status badge that you can copy into a README file. Green indicates package is on CRAN. Packages not yet submitted or accepted to CRAN get a red badge. ## Package installation and info * `build_vignettes()` automatically installs the VignetteBuilder package, if necessary (#736). 
* `install()` and `install_deps()` gain a `...` argument, so additional arguments can be passed to `utils::install.packages()` (@jimhester, #712). `install_svn()` optionally accepts a revision (@lev-kuznetsov, #739). `install_version()` now knows how to look in multiple repos (#721).

* `package_deps()` (and `dev_package_deps()`) determines all recursive dependencies and whether or not they're up-to-date (#663). Use `update(package_deps("xyz"))` to update out of date dependencies. This code is used in `install_deps()` and `revdep_check()` - it's slightly more aggressive than the previous code (i.e. it forces you to use the latest version), which should avoid problems when you go to submit to CRAN.

* New `update_packages()` will install a package (and its dependencies) only if they are missing or out of date (#675).

* `session_info()` can now take a vector of package names, in which case it will print the version of those packages and their dependencies (#664).

## Git and github

* Devtools now uses the git2r package to inspect git properties and install remote git packages with `install_git()`. This should be considerably more reliable than the previous strategy, which involved calling the command line `git` client. It has two small downsides: `install_git()` no longer accepts additional `args`, and must do a deep clone when installing.

* `dr_github()` checks for common problems with git/github setup (#643).

* If you use git, `release()` now warns you if you have uncommitted changes, or if you've forgotten to synchronise with the remote (#691).

* `install_github()` warns if the repository contains submodules (@ashander, #751).

## Bug fixes and minor improvements

* Previously, devtools ran all external R processes with `R --vanilla`. Now it only suppresses user profiles, and constructs a custom `.Rprofile` to override the default. Currently, this `.Rprofile` sets up the `repos` option.
Among others, this enables the cyclic dependency check in `devtools::release` (#602, @krlmlr).

* `R_BROWSER` and `R_PDFVIEWER` environment variables are set to "false" to suppress random windows opening during checks.

* Devtools correctly identifies Rtools 3.1 and 3.2 (#738), and preserves continuation lines in the `DESCRIPTION` (#709).

* `dev_help()` now uses `normalizePath()`. Hopefully this will make it more likely to work if you're on Windows and have a space in the path.

* `lint()` gains a `cache` argument (@jimhester, #708).

* Fixed namespace issues related to `stats::setNames()` (#734, #772) and `utils::unzip()` (#761, @robertzk).

* `release()` now reminds you to check the existing CRAN check results page (#613) and shows the file size before uploading to CRAN (#683, @krlmlr).

* `RCMD()` and `system_check()` are now exported so they can be used by other packages. (@jimhester, #699).

* `revdep_check()` creates directories if needed (#759).

* `system_check()` combines arguments with ` `, not `, `. (#753)

* `test()` gains a `...` argument so that additional arguments can be passed to `testthat::test_dir` (@jimhester, #747)

* `use_travis()` now suggests you link to the svg icon since that looks a little sharper. The default template sets `CRAN: http://cran.rstudio.com/` to enable the cyclic dependency check.

* The `NOT_CRAN` envvar no longer overrides an externally set variable.

* `check(check_version = TRUE)` also checks spelling of the `DESCRIPTION`; if no spell checker is installed, a warning is given (#784, @krlmlr).
The documentation for `revdep_check()` is much improved, and should be more useful (#635).

I recommend that you specify a library to use when checking with `options("devtools.revdep.libpath")`. (This should be a directory that already exists.) This should be different from your default library to keep the revdep environment isolated from your development environment.

I've also tweaked the output of `revdep_maintainers()` so it's easier to copy and paste into an email (#634). This makes life a little easier pre-release.

## New helpers

* `lint()` runs `lintr::lint_package()` to check style consistency and errors in a package. (@jimhester, #694)

* `use_appveyor()` sets up a package for testing with AppVeyor (@krlmlr, #549).

* `use_cran_comments()` creates a `cran-comments.md` template and adds it to `.Rbuildignore` to help with CRAN submissions. (#661)

* `use_git_hook()` allows you to easily add a git hook to a package.

* `use_readme_rmd()` sets up a template to generate a `README.md` from a `README.Rmd` with knitr.

## Minor improvements

* The deprecated `doc_clean` argument to `check()` has been removed.

* The initial package version in `create()` is now `0.0.0.9000` (#632). `create()` and `create_description()` check that the package name is valid (#610).

* `load_all()` runs `roxygen2::update_collate()` before loading code. This ensures that files are sourced in the way you expect, as defined by roxygen `@include` tags. If you don't have any `@include` tags, the collate will not be touched (#623).

* `session_info()` gains an `include_base` argument to also display loaded/attached base packages (#646).

* `release()` no longer asks if you've read the CRAN policies since the CRAN submission process now asks the same question (#692). `release(check = TRUE)` now runs some additional custom checks. These include:

    * Checking that you don't depend on a development version of a package.

    * Checking that the version number has exactly three components (#633).
`release()` now builds packages without the `--no-manual` switch, both for checking and for actually building the release package (#603, @krlmlr). `build()` gains an additional argument `manual`, defaulting to `FALSE`, and `check()` passes `...` unmodified to `build()`.

* `use_travis()` now sets an environment variable so that any WARNING will also cause the build to fail (#570).

* `with_debug()` and `compiler_flags()` set `CFLAGS` etc instead of `PKG_CFLAGS`. `PKG_*` are for packages to use; the raw values are for users to set. (According to http://cran.rstudio.com/doc/manuals/r-devel/R-exts.html#Using-Makevars)

* New `setup()` works like `create()` but assumes an existing, not necessarily empty, directory (#627, @krlmlr).

## Bug fixes

* When installing a pull request, `install_github()` now uses the repository associated with the pull request's branch (and not the repository of the user who created the pull request) (#658, @krlmlr).

* `missing_s3()` works once again (#672)

* Fixed scoping issues with `unzip()`.

* `load_code()` now executes the package's code with the package's root as the working directory, just like `R CMD build` et al. (#640, @krlmlr).

# devtools 1.6.1

* Don't set non-portable compiler flags on Solaris.

* The file `template.Rproj` is now correctly installed and the function `use_rstudio` works as it should. (#595, @hmalmedal)

* The function `use_rcpp` will now create the file `src/.gitignore` with the correct wildcards. (@hmalmedal)

* The functions `test`, `document`, `load_all`, `build`, `check` and any function that applies to some package directory will work from subdirectories of a package (like the "R" or "inst/tests" directories). (#616, @robertzk)

# devtools 1.6

## Tool templates and `create()`

* `create()` no longer generates a `man/` directory - roxygen2 now does this automatically. It also no longer generates a package-level doc template. If you want this, use `use_package_doc()`.
It also makes a dummy namespace so that you can build & reload without running `document()` first.

* New `use_data()` makes it easy to include data in a package, either in `data/` (for exported datasets) or in `R/sysdata.rda` (for internal data). (#542)

* New `use_data_raw()` creates a `data-raw/` directory for reproducible generation of `data/` files (#541).

* New `use_package()` allows you to set dependencies (#559).

* New `use_package_doc()` sets up a Roxygen template for package-level docs.

* New `use_rcpp()` sets up a package to use Rcpp.

* `use_travis()` now figures out your github username and repo so it can construct the markdown for the build image. (#546)

* New `use_vignette()` creates a draft vignette using Rmarkdown (#572).

* Renamed `add_rstudio_project()` to `use_rstudio()`, `add_travis()` to `use_travis()`, `add_build_ignore()` to `use_build_ignore()`, and `add_test_infrastructure()` to `use_testthat()` (old functions are aliased to the new ones).

## The release process

* You can add arbitrary extra questions to `release()` by defining a function `release_questions()` in your package. Your `release_questions()` should return a character vector of questions to ask (#451).

* `release()` uses the new CRAN submission process, as implemented by `submit_cran()` (#430).

## Package installation

* All `install_*` functions now use the same code and store much useful metadata. Currently only `session_info()` takes advantage of this information, but it will allow the development of future tools like generic update functions.

* Vignettes are no longer installed by default because they potentially require all suggested packages to also be installed. Use `build_vignettes = TRUE` to force building and to install all suggested packages (#573).

* `install_bitbucket()` has been brought into alignment with `install_github()`: this means you can now specify repos with the compact `username/repo@ref` syntax. The `username` argument is now deprecated.
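The compact reference syntax can be sketched as follows (the repository names are hypothetical):

```r
library(devtools)

install_bitbucket("user/repo@branch")  # user, repo and ref in one string
install_github("user/repo@v1.0.0")     # a tag
install_github("user/repo#123")        # a pull request
install_github("user/repo@*release")   # the latest release
```

The same `username/repo@ref` form works across the `install_*` family.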
* `install_git()` has been simplified and many of the arguments have changed names for consistency with metadata for other package installs.

* `install_github()` has been considerably improved:

    * `username` is deprecated - please include the user in the repo name: `rstudio/shiny`, `hadley/devtools` etc.

    * `dependencies = TRUE` is no longer forced (regression in 1.5) (@krlmlr, #462).

    * The deprecated parameters `auth_user`, `branch`, `pull` and `password` have all been removed.

    * New `host` argument which allows you to install packages from GitHub Enterprise (#410, #506).

    * The GitHub API is used to download the archive file (@krlmlr, #466) - this makes it less likely to break in the future.

    * To download a specified pull request, use `ref = github_pull(...)` (@krlmlr, #509). To install the latest release, use `"user/repo@*release"` or `ref = github_release()` (@krlmlr, #350).

* `install_gitorious()` has been brought into alignment with `install_github()`: this means you can now specify repos with the compact `username/repo@ref` syntax. You must now always supply the user (project) name and repo.

* `install_svn()` lets you install an R package from a subversion repository (assuming you have subversion installed).

* `decompress()` and hence `install_url()` now work when the downloaded file decompresses without an additional top-level directory (#537).

## Other minor improvements and bug fixes

* If you're using RStudio, and you're trying to build a binary package without the necessary build tools, RStudio will prompt to download and install the right thing. (#488)

* Commands are no longer run with `LC_ALL=C` - this no longer seems necessary (#507).

* `build(binary = TRUE)` creates an even-more-temporary package library to avoid conflicts (#557).

* `check_dir()` no longer fails on UNC paths (#522).

* `check_devtools()` also checks for dependencies on development versions of packages (#534).
* `load_all()` no longer fails on partial loading of a package containing S4 or RC classes (#577).

* On Windows, `find_rtools()` is now run on package load, not package attach.

* `help()`, `?`, and `system.file()` are now made available when a package is loaded with `load_all()`, even if the devtools package isn't attached.

* `httr` 0.3 is now required (@krlmlr, #466).

* `load_all()` no longer gives an error when objects listed as exports are missing.

* Shim added for `library.dynam.unload()`.

* `loaded_packages()` now returns the package name and the path it was loaded from. (#486)

* The `parenvs()` function has been removed from devtools, because it is now in the pryr package.

* `missing_s3()` uses a better heuristic for determining if a function is an S3 method (#393).

* New `session_info()` provides useful information about your R session. It's a little more focussed than `sessionInfo()` and includes where packages were installed from (#526).

* The `rstudioapi` package moved from suggests to imports, since it's always needed (its job is to figure out if RStudio is available, #458)

* Implemented own version of `utils::unzip()` that throws an error if the command fails and doesn't print unneeded messages on non-Windows platforms (#540).

* Wrote own version of `write.dcf()` that doesn't butcher whitespace and fieldnames.

## Removed functionality

* The `fresh` argument to `test()` has been removed - this is best done by the editor, since it can run the tests in a completely clean environment by starting a new R session.

* `compile_dll()` can now build packages located in R's `tempdir()` directory (@richfitz, #531).

# devtools 1.5

Four new functions make it easier to add useful infrastructure to packages:

* `add_test_infrastructure()` will create test infrastructure for a new package. It is called automatically from `test()` if no test directories are found, the session is interactive and you agree.

* `add_rstudio_project()` adds an RStudio project file to your package.
  `create()` gains an `rstudio` argument which will automatically create an
  RStudio project in the package directory. It defaults to `TRUE`: if you
  don't use RStudio, just delete the file.

* `add_travis()` adds a basic travis template to your package. `.travis.yml`
  is automatically added to `.Rbuildignore` to avoid including it in the
  built package.

* `add_build_ignore()` makes it easy to add files to `.Rbuildignore`,
  correctly escaping special characters.

Two dependencies were incremented:

* devtools requires at least R version 3.0.2.

* `document()` requires at least roxygen2 version 3.0.0.

## Minor improvements

* `build_win()` now builds R-release and R-devel by default (@krlmlr, #438).
  It also gains parameter `args`, which is passed on to `build()`
  (@krlmlr, #421).

* `check_doc()` now runs `document()` automatically.

* `install()` gains a `threads` argument which allows you to install multiple
  packages in parallel (@mllg, #401). The `threads` argument to `check_cran()`
  now defaults to `getOption("Ncpus")`.

* `install_deps(deps = T)` no longer installs all dependencies of
  dependencies (#369).

* `install_github()` now prefers personal access tokens supplied to
  `auth_token` rather than passwords (#418, @jeroenooms).

* `install_github()` now defaults to `dependencies = TRUE` so you definitely
  get all the packages you need to build from source.

* devtools supplies its own version of `system.file()` so that when the
  function is called from the R console, it will have special behavior for
  packages loaded with devtools.

* devtools supplies its own versions of `help` and `?`, which will search
  devtools-loaded packages as well as normally-loaded packages.

## Bug fixes

* `check_devtools()` is no longer called by `check()` because the relevant
  functionality is now included in `R CMD check` and it was causing false
  positives (#446).
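A sketch of the personal-access-token support mentioned above. The repo name and token string are placeholders, and the call is commented out since it needs network access:

```r
library(devtools)

# Prefer a GitHub personal access token over a password when installing
# from a private repo; "abc123" and the repo name are placeholders.
# install_github("user/private-repo", auth_token = "abc123")
```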
* `install_deps(TRUE)` now includes packages listed in `VignetteBuilder`
  (#396)

* `build()` no longer checks for `pdflatex` when building vignettes, as many
  modern vignettes don't need it (#398). It also uses `--no-build-vignettes`
  for >3.0.0 compatibility (#391).

* `release()` does a better job of opening your email client if you're inside
  of RStudio (#433).

* `check()` now correctly reports the location of the `R CMD check` output
  when called with a custom `check_dir`. (Thanks to @brentonk)

* `check_cran()` records check times for each package tested.

* Improved default `DESCRIPTION` file created by `create_description()`.
  (Thanks to @ncarchedi, #428)

* Fixed bug in `install_github()` that prevented installing a pull request by
  supplying `repo = "username/repo#pull"`. (#388)

* Explicitly specify user agent when querying user name and ref for pull
  request in `install_github()`. (Thanks to Kirill Müller, #405)

* `install_github()` now removes blank lines found in a package `DESCRIPTION`
  file, protecting users from the vague `error: contains a blank line`
  error. (#394)

* `with_options()` now works, instead of throwing an error
  (Thanks to @krlmlr, #434)

# devtools 1.4.1

* Fixed bug in `wd()` when `path` was omitted. (#374)

* Fixed bug in `dev_help()` that prevented it from working when not using
  RStudio.

* `source_gist()` respects new GitHub policy by sending a user agent
  (hadley/devtools)

* `install_github()` now takes repo names of the form
  `[username/]repo[/subdir][@ref|#pull]` - this is now the recommended form
  to specify username, subdir, ref and/or pull for `install_github()`.
  (Thanks to Kirill Müller, #376)

# devtools 1.4

## Installation improvements

* `install()` now respects the global option `keep.source.pkgs`.

* `install()` gains a `build_vignettes` argument which defaults to `TRUE`,
  and ensures that vignettes are built even when doing a local install. It
  does this by forcing `local = FALSE` if the package has vignettes, so
  `R CMD build` can follow the usual process.
  (#344)

* `install_github()` now takes repo names of the form `username/repo` - this
  is now the recommended form for `install_github()` if your username is not
  hadley ;)

* `install_github()` now adds details on the source of the installed package
  (e.g. repository, SHA1, etc.) to the package `DESCRIPTION` file.
  (Thanks to JJ Allaire)

* Adjusted `install_version()` to the new metadata structure on CRAN.
  (Thanks to Kornelius Rohmeyer)

* Fixed bug so that `install_version()` works with version numbers that
  contain hyphens. (Thanks to Kornelius Rohmeyer)

* `install_deps()` is now exported, making it easier to install the
  dependencies of a package.

## Other minor improvements

* `build(binary = TRUE)` now no longer installs the package as a
  side-effect. (#335)

* `build_github_devtools()` is a new function which makes it easy for Windows
  users to upgrade to the development version of devtools.

* `create_description()` does a better job of combining defaults and
  user-specified options. (#332)

* `install()` also installs the dependencies that do not have the required
  versions; besides, the argument `dependencies` now works like
  `install.packages()` (in previous versions, it was essentially
  `c("Depends", "Imports", "LinkingTo")`) (thanks, Yihui Xie, #355)

* `check()` and `check_cran()` gain a new `check_dir` argument to control
  where checking takes place (#337)

* `check_devtools()` no longer incorrectly complains about a `vignettes/`
  directory

* Decompression of zip files now respects `getOption("unzip")` (#326)

* `dev_help()` will now use the RStudio help pane, if you're using a recent
  version of RStudio (#322)

* Release is now a little bit smarter: if it's a new package, it'll ask you
  to read and agree to the CRAN policies; it will only ask about dependencies
  if it has any.

* `source_url()` (and `source_gist()`) accept SHA1 prefixes.

* `source_gist()` uses the GitHub API to reliably locate the raw gist.
  Additionally it now only attempts to source files with `.R` or `.r`
  extensions, and gains a `quiet` argument. (#348)

* Safer installation of source packages, which were previously extracted
  directly into the temp directory; this could be a problem if directory
  names collided. Instead, source packages are now extracted into unique
  subdirectories.

# devtools 1.3

## Changes to best practices

* The documentation for many devtools functions has been considerably
  expanded, aiming to give the novice package developer more hints about what
  they should be doing and why.

* `load_all()` now defaults to `reset = TRUE` so that changes to the
  `NAMESPACE` etc. are incorporated. This makes it slightly slower (but
  hopefully not noticeably so), generally more accurate, and a better
  simulation of the install + restart + reload cycle.

* `test()` now looks in both `inst/tests` and `tests/testthat` for unit
  tests. It is recommended to use `tests/testthat` because it allows users to
  choose whether or not to install tests. If you move your tests from
  `inst/tests` to `tests/testthat`, you'll also need to change
  `tests/test-all.R` to run `test_check()` instead of `test_package()`. This
  change requires testthat 0.8, which will be available on CRAN shortly.

* New devtools guarantee: if, because of a devtools bug, a CRAN maintainer
  yells at you, I'll send you a hand-written apology note. Just forward me
  the email and your address.

## New features

* New `install_local()` function for installing local package files (as zip,
  tar, tgz, etc.) (Suggested by landroni)

* `parse_deps()`, which parses R's package dependency strings, is now
  exported.

* All package and user events (e.g. load, unload, attach and detach) are now
  called in the correct place.

## Minor improvements and bug fixes

* `build()` gains an `args` parameter allowing you to add additional
  arbitrary arguments, and `check()` gains a similar `build_args` parameter.
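The `tests/test-all.R` migration described in the 1.3 notes above amounts to something like this sketch, where `"yourpackage"` is a placeholder:

```r
# Old style, with tests living in inst/tests:
# library(testthat)
# test_package("yourpackage")

# New style, with tests living in tests/testthat:
library(testthat)
test_check("yourpackage")
```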
* `install_git()` gains a `git_arg` parameter allowing you to add arbitrary
  additional arguments.

* Files are now loaded in a way that preserves source references - this means
  that you will get much better locations in error messages, which should
  considerably aid debugging.

* Fixed bug in `build_vignettes()` which prevented files in `inst/doc` from
  being updated

* `as.package()` no longer uses the full path, which should make for nicer
  error messages.

* More flexibility when installing package dependencies with the
  `dependencies` argument to `install_*()` (thanks to Martin Studer)

* The deprecated `show_rd()` function has now been removed.

* `install_bitbucket()` gains `auth_user` and `password` params so that you
  can install from private repos (thanks to Brian Bolt)

* Better git detection on Windows (thanks to Mikhail Titov)

* Fix bug so that `document()` will automatically create the `man/` directory

* Default `DESCRIPTION` gains `LazyData: true`

* `create_description()` now checks that the directory is probably a package
  by looking for `R/`, `data/` or `src/` directories

* Rolled back required R version from 3.0 to 2.15.

* Add missing import for `digest()`

* Bump max compatible version of R with Rtools 3.0, and add details for
  Rtools 3.1

# devtools 1.2

## Better installation

* `install()` gains a `local` option for installing the package from the
  local package directory, rather than from a built tar.gz. This is now used
  by default for all package installations. If you want to guarantee a clean
  build, run with `local = FALSE`

* `install()` now uses option `devtools.install.args` for default
  installation arguments. This allows you to set any useful defaults
  (e.g. `--no-multiarch`) in your Rprofile.

* `install_git()` gains a `branch` argument to specify a branch or tag
  (Fixes #255)

## Clean sessions

* `run_examples()` and `test()` gain a `fresh` argument which forces them to
  run in a fresh R session.
  This completely insulates the examples/tests from your current session but
  means that interactive code (like `browser()`) won't work. (Fixes #258)

* New functions `eval_clean()` and `evalq_clean()` make it easy to evaluate
  code in a clean R session.

* `clean_source()` loses the `vanilla` argument (which did not work) and
  gains a `quiet` argument

## New features

* `source_url()` and `source_gist()` now allow you to specify a SHA, so you
  can make sure that files you source from the internet don't change without
  you knowing about it. (Fixes #259)

* `build_vignettes()` builds using `buildVignette()` and moves/copies outputs
  using the same algorithm as `R CMD build`. This means that
  `build_vignettes()` now exactly mimics R's regular behaviour, including
  building non-Sweave vignettes (#277), building in the correct
  directory (#231), using make files (if present), and copying over extra
  files.

* devtools now sets best-practice compiler flags: from `check()`,
  `-Wall -pedantic` and from `load_all()`, `-Wall -pedantic -g -O0 -UNDEBUG`.
  These are prefixed to existing environment variables so that you can
  override them if desired. (Fixes #257)

* If there's no `DESCRIPTION` file present, `load_all()` will automatically
  create one using `create_description()`. You can set options in your
  `.Rprofile` to control what it contains: see `package?devtools` for more
  details.

## Minor improvements

* `check()` now also sets the environment variable
  `_R_CHECK_CODE_DATA_INTO_GLOBALENV_` to `TRUE` (to match current
  `--as-cran` behaviour) (Fixes #256)

* Improved default email sent by `release()`, eliminating `create.post()`
  boilerplate

* `revdep()` includes LinkingTo by default.

* Fixed regular expression problem that caused Rtools `3.0.*` to fail to be
  found on Windows.

* `load_data()` got an overhaul and now respects `LazyData` and correctly
  exports datasets by default (Fixes #242)

* `with_envvar()` gains the option to either replace, prefix or suffix
  existing environment variables.
The default is to replace, which was the previous behaviour. * `check_cran` includes `sessionInfo()` in the summary output (Fixes #273) * `create()` gains a `check` argument which defaults to FALSE. * `with_env` will be deprecated in 1.2 and removed in 1.3 * When `load_all()` calls `.onAttach()` and `.onLoad()`, it now passes the lib path to those functions. # devtools 1.1 * `source_gist()` has been updated to accept new gist URLs with username. (Fixes #247) * `test()` and `document()` now set environment variables, including NOT_CRAN. * Test packages have been renamed to avoid conflicts with existing packages on CRAN. This bug prevented devtools 1.0 from passing check on CRAN for some platforms. * Catch additional case in `find_rtools()`: previously installed, but directory empty/deleted (Fixes #241) # devtools 1.0 ## Improvements to package loading * Rcpp attributes are now automatically compiled during build. * Packages listed in depends are `require()`d (Fixes #161, #178, #192) * `load_all` inserts a special version of `system.file` into the package's imports environment. This tries to simulate the behavior of `base::system.file` but gives modified results because the directory structure of installed packages and uninstalled source packages is different. (Fixes #179). In other words, `system.file` should now just work even if the package is loaded with devtools. * Source files are only recompiled if they've changed since the last run, and the recompile will be clean (`--preclean`) if any exported header files have changed. (Closes #224) * The compilation process now performs a mock install instead of using `R CMD SHLIB`. This means that `Makevars` and makefiles will now be respected and generally there should be fewer mismatches between `load_all` and the regular install and reload process. * S4 classes are correctly loaded and unloaded. 
## Windows

* Rtools detection on Windows has been substantially overhauled: it should be
  more reliable, and when it fails it gives more information about what is
  wrong with your install.

* If you don't have Rtools installed, devtools now automatically sets the
  `TAR` environment variable to `internal` so you can still build packages.

## Minor features

* `check_cran()` now downloads packages from cran.rstudio.com.

* `check()` now makes the CRAN version check optional, and off by default.
  The `release()` function still checks the version number against CRAN.

* In `check()`, it is optional to require suggested packages, using the
  `force_suggests` option.

* When `check()` is called, the new default behavior is to not delete
  existing `.Rd` files from `man/`. This behavior can be set with the
  `"devtools.cleandoc"` option.

* `install_bitbucket()` now always uses lowercase repo names.
  (Thanks to mnel)

* New function `with_lib()`, which runs an expression with a library path
  prepended to the existing library paths. It differs slightly from
  `with_libpaths()`, which replaces the existing library paths.

* New function `install_git()` installs a package directly from a git
  repository. (Thanks to David Coallier)

* If `pdflatex` isn't available, don't try to build vignettes with
  `install()` or `check()`. (Fixes #173)

* `install_github()` now downloads from a new URL, to reflect changes in how
  files are hosted on GitHub.

* `build()` now has a `vignettes` option to turn off rebuilding vignettes.

* `install(quick = TRUE)` now builds the package without rebuilding
  vignettes. (Fixes #167)

* All R commands called from devtools now have the environment variable
  `NOT_CRAN` set, so that you can perform tasks when you know your code is
  definitely not running on CRAN. (Closes #227)

* Most devtools functions accept a `quiet` argument that suppresses output.
  This is particularly useful for testing.

## Bug fixes

* Fixed path issue when looking for Rtools on Windows when the registry entry
  is not present.
  (Fixes #201)

* Reloading a package that requires a forced-unload of the namespace now
  works.

* When reloading a package that another loaded package depends on, if there
  was an error loading the code, devtools would print out something about an
  error in `unloadNamespace`, which was confusing. It now gives more useful
  errors.

* An intermittent error in `clear_topic_index` related to using `rm()` has
  been fixed. (Thanks to Gregory Jefferis)

* `revdep()` now lists "Suggests" packages, in addition to "Depends" and
  "Imports".

* `revdep_check()` now correctly passes the `recursive` argument to
  `revdep()`.

* The collection of check results at the end of `check_cran()` previously did
  not remove existing results, but now it does.

* When a package is loaded with `load_all()`, it now passes the name of the
  package to the `.onLoad()` function. (Thanks to Andrew Redd)

# devtools 0.8.0

## New features

* `create()` function makes it easier to create a package skeleton using
  devtools standards.

* `install_github()` can now install from a pull request -- it installs the
  branch referenced in the pull request.

* `install_github()` now accepts `auth_user` and `password` arguments if you
  want to install a package in a private GitHub repo. You only need to
  specify `auth_user` if it's not your package (i.e. it's not your
  `username`) (Fixes #116)

* new `dev_help()` function replaces `show_rd()` and makes it easy to get
  help on any topic in a development package (i.e. a package loaded with
  `load_all()`) (Fixes #110)

* `dev_example()` runs the examples for one in-development package.
  (Fixes #108)

* `build_vignettes()` now looks in the modern location for vignettes
  (`vignettes/`) and warns if vignettes are found in the old location
  (`inst/doc`). Building now occurs in a temporary directory (to avoid
  polluting the package with build artefacts) and only final PDF files are
  copied over.
* new `clean_vignettes()` function to remove PDFs in `inst/doc` that were
  built from vignettes in `vignettes/`

* `load_all()` does a much, much better job at simulating package loading
  (see LOADING section). It also compiles and loads C/C++/Fortran code.

* `unload()` is now an exported function, which unloads a package, trying
  harder than just `detach()`. It now also unloads DLLs. (Winston Chang.
  Fixes #119)

* `run_examples()` now has parameters `show`, `test`, `run` to control which
  of `\dontrun{}`, `\dontshow{}`, `\donttest{}` and `\testonly{}` are
  commented out. The `strict` parameter has been removed since it is no
  longer necessary because `load_all()` can respect namespaces. (Fixes #118)

* `build()`, `check()`, `install()` etc. now run R in `--vanilla` mode, which
  prevents it from reading any of your site or personal configuration files.
  This should prevent inconsistencies between the environment in which the
  package is run on your computer and on other computers (e.g. the CRAN
  server) (Fixes #145)

* All system R commands now print the full command used, to make it easier to
  understand what's going on.

## Package paths

* `as.package()` no longer uses `~/.Rpackages`.

* `as.package()` provides more informative error messages when the path does
  not exist, isn't a directory, or doesn't contain a `DESCRIPTION` file.

* New function `inst()` takes the name of a package and returns the installed
  path of that package. (Winston Chang. Fixes #130). This makes it possible
  to use devtools functions (e.g. `unload()`) with regular installed
  packages, not just in-development source packages.

* New function `devtest()` returns paths to the internal testing packages in
  devtools.

## Loading

* Development packages are now loaded into a namespace environment, and the
  objects in the namespace are then copied to the package environment. This
  more accurately simulates how packages are normally loaded. However, all of
  the objects (not just the exported ones) are still copied to the package
  environment.
  (Winston Chang. Fixes #3, #60, and #125)

* Packages listed in Imports and Depends are now loaded into an imports
  environment, with name attribute "imports:xxxx", which is the parent of the
  namespace environment. The imports environment is in turn a child of the
  base namespace environment, which is a child of the global environment.
  This more accurately simulates how packages are normally loaded. These
  packages previously were loaded and attached. (Winston Chang. Fixes #85)

* The NAMESPACE file is now used for loading imports, instead of the
  DESCRIPTION file. Previously, `load_all()` loaded all objects from the
  packages listed in DESCRIPTION. Now it loads the packages (and, when
  `importFrom` is used, specific objects from packages) listed in NAMESPACE.
  This more closely simulates normal package loading. It still checks version
  numbers of packages listed in DESCRIPTION. (Winston Chang)

* `load_all()` can now be used to properly reload devtools. It does this by
  creating a copy of the devtools namespace environment, and calling
  `load_all()` from that environment. (Winston Chang)

* The `.onLoad()` and `.onAttach()` functions for a development package are
  now both called when loading a package for the first time, or with
  `reset = TRUE`, and the order more correctly simulates normal package
  loading (create the namespace, call `.onLoad()`, copy objects to the
  package environment, and then call `.onAttach()`). (Winston Chang)

* `load_all()` will now throw a warning if a dependency package does not
  satisfy the version requirement listed in DESCRIPTION. (Winston Chang.
  Fixes #109)

* The package environment now has a 'path' attribute, similar to a package
  loaded the normal way. (Winston Chang)

* `load_all()` now has an option `export_all`. When set to `FALSE`, only the
  objects listed as exports in NAMESPACE are exported. (Winston Chang)

* `load_all()` now compiles C files in the `src/` directory.
  (Winston Chang)

* New functions `compile_dll()` and `clean_dll()`, which compile C/C++/Fortran
  source code, and clean up the compiled objects, respectively.
  (Winston Chang. Fixes #131)

## Bug fixes

* `load_code()` now properly skips missing files. (Winston Chang)

* Add `--no-resave-data` to default build command.

* The subject line of the email created by `release()` is now
  "CRAN submission [package] [version]", per CRAN repository policy.

* `install_bitbucket()` properly installs zip files of projects stored in
  Mercurial repositories. (Winston Chang. Fixes #148)

* `build()` now builds vignettes because `install()` does not. (Fixes #155)

## Introspection

* New function `loaded_packages()`, which returns the names of packages that
  are loaded and attached.

* Packages loaded with `load_all()` now store devtools metadata in their
  namespace environment, in a variable called `.__DEVTOOLS__`. This can be
  accessed with the `dev_meta()` function. (Winston Chang. Fixes #128)

* `dev_mode()` now stores the path it uses in the option `dev_path`. That
  makes it easy for other applications to detect when it is on and to act
  accordingly.

* New function `parse_ns_file()`, which parses a NAMESPACE file for a
  package.

* New function `parenvs()`, which returns the parent environments of an
  object. (Winston Chang)

# devtools 0.7.1

* bump dependency to R 2.15

* `load_code()` now also looks for files ending in `.q` - this is not
  recommended, but is needed for some older packages

# devtools 0.7

## Installation

* `install_bitbucket()` installs R packages from Bitbucket.

* `install()` now uses `--with-keep.source` to make debugging a little
  easier.

* All remote install functions give better error messages in the case of
  HTTP errors (Fixes #82).

* `install()` has a new `quick` option to make package installation faster,
  by sacrificing documentation, demos and multi-architecture binaries.
  (Fixes #77)

* `install_url()`, `install_github()` and `install_gitorious()` gain a
  `subdir` argument which makes it possible to install packages that are
  contained within a sub-directory of a repository or compressed file.
  (Fixes #64)

## Checking

* `with_debug()` function temporarily sets environment variables so that
  compilation is performed with the appropriate debugging flags set.
  Contributed by Andrew Redd.

* `revdep()`, `revdep_maintainers()` and `revdep_check()` for calculating
  reverse dependencies, finding their maintainers and running `R CMD check`.
  (Fixes #78)

* `check_cran()` has received a massive overhaul: it now checks multiple
  packages, installs dependencies (in a user-specified library), and parses
  check output to extract errors and warnings

* `check()` uses the new `--as-cran` option to make checking as close to CRAN
  as possible (fixes #68)

## Other new features

* devtools now uses the option `devtools.path` to set the default path to use
  with dev mode, and `github.user` to set the default user when installing
  packages from GitHub.

* If no package is supplied, and no package has been worked with previously,
  all functions will now try the working directory. (Fixes #87)

* On Windows, devtools now looks in the registry to find where Rtools is
  installed, and does a better job of locating gcc. (Contributed by
  Andrew Redd)

* `show_rd()` passes `...` on to `Rd2txt` - this is useful if you're checking
  how build-time `\Sexpr`s are generated.

* A suite of `with` functions that allow you to temporarily alter the
  environment in which code is run: `in_dir`, `with_collate`, `with_locale`,
  `with_options`, `with_path`, ... (Fixes #89)

* `release()` asks more questions and randomises the correct answers so you
  really need to read them (Fixes #79)

* `source_gist()` now accepts default URLs such as
  "https://gist.github.com/nnn"

* New system path manipulation functions, `get_path`, `set_path`, `add_path`
  and `on_path`, contributed by Andrew Redd.
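The `with` functions described above all follow the same shape: run an expression with a setting temporarily changed, then restore the old value. A minimal sketch, assuming devtools is attached:

```r
library(devtools)

# Evaluate an expression in a different working directory;
# the previous directory is restored afterwards.
in_dir(tempdir(), getwd())

# Likewise for options: `digits` is only changed inside the call.
with_options(list(digits = 3), print(pi))

# with_collate(), with_locale() and with_path() follow the same pattern.
```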
* If you're on Windows, devtools now suppresses the unimportant warning from
  CYGWIN about DOS-style file paths

## Bug fixes

* `decompress()` now uses the target directory as defined in the function
  call when expanding a compressed file. (Fixes #84)

* `document()` is always run in a C locale so that `NAMESPACE` sort order is
  consistent across platforms.

* `install()` now quotes `libpath` and the build path so paths with embedded
  spaces work (Fixes #73 and #76)

* `load_data()` now also loads `.RData` files (Fixes #81)

* `install()` now has an `args` argument to pass additional command line
  arguments on to `R CMD INSTALL` (replaces `...` which didn't actually do
  anything). (Fixes #69)

* `load_code()` does a better job of reconciling files in the DESCRIPTION
  collate field with files that actually exist in the R directory.
  (Fixes #14)

# devtools 0.6

## New features

* `test()` function takes a `filter` argument which allows you to restrict
  which tests are run

* `check()` runs with example timings, as is done on CRAN. Run with the new
  param `cleanup = F` to access the timings.

* `missing_s3()` function to help figure out if you've forgotten to export
  any S3 methods

* `check_cran()` downloads and checks a CRAN package - this is useful to run
  as part of the testing process of your package if you want to check the
  dependencies of your package

* `strict` mode for `run_examples()`, which runs each example in a clean
  environment. This is much slower than the default (running in the current
  environment), but ensures that each example works standalone.

* `dev_mode()` now updates the prompt to indicate that it's active (Thanks to
  Kohske Takahashi)

* new `source_url()` function for sourcing a script on a remote server via
  protocols other than HTTP (e.g. HTTPS or FTP). (Thanks to Kohske Takahashi)

* new `source_gist()` function to source R code stored in a GitHub gist.
  (Thanks to Kohske Takahashi)

* `load_all()` now also loads all package dependencies (including
  suggestions) - this works around some bugs in the way that devtools
  attaches the development environment into the search path, which fails to
  recreate what happens normally during package loading.

## Installation

* remote installation will ensure the configure file is executable.

* all external package installation functions are vectorised, so you can
  install multiple packages at a time

* new `install_gitorious()` function installs packages from Gitorious repos.

* new `install_url()` function for installing a package from an arbitrary URL

* include `install_version()` function from Jeremy Stephens for installing a
  specific version of a CRAN package from the archive.

## Better Windows behaviour

* better check for OS type (thanks to Brian Ripley)

* better default paths for 64-bit R on Windows (Fixes #35)

* check to see if Rtools is already available before trying to mess with the
  paths. (Fixes #55)

## Bug fixes

* if an error occurs when loading R files, the cache will be automatically
  cleared so that all files are loaded again next time you try (Fixes #55)

* functions that run R now do so with `R_LIBS` set to the current
  `.libPaths()` - this will ensure that checking uses the development library
  if you are in development mode. `R_ENVIRON_USER` is set to an empty file to
  avoid your existing settings overriding this.

* `load_data()` (called by `load_all()`) will also load data defined in R
  files in the data directory. (Fixes #45)

* `dev_mode()` performs some basic tests to make sure you're not setting your
  development library to a directory that's not already an R library.
  (Fixes #25)

# devtools 0.5.1

* Fix error that was causing R commands to fail on Windows.

# devtools 0.5

## New functions

* new `show_rd()` function that will show the development version of a help
  file.
## Improvements and bug fixes

* external R commands always run in locale `C`, because that's what the CRAN
  servers do.

* `clean_source()` sources an R script into a fresh R environment, ensuring
  that it can run independently of your current working environment.
  Optionally (`vanilla = T`), it will source in a vanilla R environment which
  ignores all local environment settings.

* On Windows, devtools will also add the path to `mingw` on startup.
  (Thanks to a pointer from Dave Lovell)

# devtools 0.4

## New functions

* new `wd()` function to change the working directory to a package
  subdirectory.

* `check_doc()` now checks package documentation as a whole, in the same way
  that `R CMD check` does, rather than doing low-level syntax checking, which
  is done by `roxygen2`. DESCRIPTION checking has been moved into
  `load_all()`. `check_rd()` has been removed.

* `build()` is now exported, and defaults to building in the package's parent
  directory. It also gains a new `binary` parameter that controls whether a
  binary or a source version (with no vignettes or manuals) is built.
  Confusingly, binary packages are built with `R CMD INSTALL`.

* `build_win()` sends your package to the R Windows builder, allowing you to
  make a binary version of your package for Windows users if you're using
  Linux or a Mac (if you're using Windows already, use `build(binary = T)`)

## Improvements and bug fixes

* if using the `.Rpackages` config file, the default function is used last,
  not first.

* on Windows, devtools now checks for the presence of Rtools on startup, and
  will automatically add it to the path if needed.

* `document()` uses `roxygen2` instead of `roxygen`. It now loads package
  dependencies so that they're available when roxygen executes the package
  source code.

* `document()` has a new parameter `clean` which clears all roxygen caches
  and removes all existing man files. `check()` now runs `document()` in this
  mode.

* `dev_mode()` will create directories recursively, and complain if it can't
  create them.
  It should also work better on Windows.

* `install_github()` now allows you to specify which branch to download, and
  automatically reloads the package if needed.

* `reload()` now will only reload if the package is already loaded.

* `release()` gains a `check` parameter that allows you to skip the package
  check (if you've just done it).

* `test()` automatically reloads code so you never run tests on old code

# devtools 0.3

* new `bash()` function that starts a bash shell in the package directory.
  Useful if you want to use git etc.

* removed inelegant `update_src()` since now superseded by `bash()`

* fix bug in FTP upload that was adding extraneous space

* `build()` function builds a package in a specified directory. `install()`,
  `check()` and `release()` now all use this function.

* `build()`, `install()`, `check()` and `release()` are better about cleaning
  up after themselves - they always try to both work in the session temporary
  directory and delete any files/directories that they create

# devtools 0.2

* `install_github()` now uses `RCurl` instead of external `wget` to retrieve
  packages. This should make it more robust with respect to external
  dependencies.

* `load_all()` will skip missing files with a warning (thanks to a suggestion
  from Jeff Laake)

* `check()` automatically deletes the `.Rcheck` directory on successful
  completion

* Quote the path to R so it works even if there are spaces in the path.

# devtools 0.1

* Check for presence of `DESCRIPTION` when loading packages to avoid false
  positives

* `install()` now works correctly with dev mode to install packages in your
  development library

* `release()` prints NEWS so you can more easily check it

* All `R CMD xxx` functions now use the current R, not the first R found on
  the system path.

#' Experimental email notification system.
#' #' This currently assumes that you use GitHub and Gmail, and that you have a #' \code{revdep/email.md} email template. #' #' @inheritParams revdep_check #' @param date Date the package will be submitted to CRAN #' @param version Version which will be used for the CRAN submission (usually #' different from the current package version) #' @param author Name used to sign the email #' @param draft If \code{TRUE}, creates a draft email; if \code{FALSE}, #' sends immediately. #' @param template Path of the template to use #' @param only_problems Only inform authors with problems? #' @param unsent If some emails failed to send in a previous call, pass the #' returned results here to retry only the unsent emails. #' @keywords internal #' @export revdep_email <- function(pkg = ".", date, version, author = getOption("devtools.name"), draft = TRUE, unsent = NULL, template = "revdep/email.md", only_problems = TRUE) { pkg <- as.package(pkg) force(date) force(version) if (is.null(author)) { stop("Please supply `author`", call. = FALSE) } if (is.null(unsent)) { results <- readRDS(revdep_check_path(pkg))$results } else { results <- unsent } if (only_problems) { results <- Filter(has_problems, results) } if (length(results) == 0) { message("No emails to send") return(invisible()) } template_path <- file.path(pkg$path, template) if (!file.exists(template_path)) { stop("`", template, "` does not exist", call.
= FALSE) } template <- readLines(template_path) maintainers <- vapply(results, function(x) x$maintainer, character(1)) orphaned <- grepl("ORPHAN", maintainers) if (any(orphaned)) { orphans <- paste(names(results)[orphaned], collapse = ", ") message("Dropping ", sum(orphaned), " orphaned packages: ", orphans) results <- results[!orphaned] maintainers <- maintainers[!orphaned] } gh <- github_info(pkg$path) data <- lapply(results, maintainer_data, pkg = pkg, version = version, gh = gh, date = date, author = author) bodies <- lapply(data, whisker::whisker.render, template = template) subjects <- lapply(data, function(x) { paste0(x$your_package, " and " , x$my_package, " ", x$my_version, " release") }) emails <- Map(maintainer_email, maintainers, bodies, subjects) message("Testing first email") send_email(emails[[1]], draft = TRUE) if (yesno("Did first draft email look ok?")) return(invisible()) sent <- vapply(emails, send_email, draft = draft, FUN.VALUE = logical(1)) if (all(sent)) { message("All emails successfully sent") } else { message(sum(!sent), " failed. 
Call again with unsent = .Last.value") } results <- results[!sent] invisible(results) } send_email <- function(email, draft = TRUE) { send <- if (draft) gmailr::create_draft else gmailr::send_message msg <- if (draft) "Drafting" else "Sending" tryCatch( { message(msg, ": ", gmailr::subject(email)) send(email) TRUE }, interrupt = function(e) { message("Aborted by user") invokeRestart("abort") }, error = function(e) { message("Failed") FALSE } ) } maintainer_data <- function(result, pkg, version, gh, date, author) { problems <- result$results summary <- indent(paste(trunc_middle(unlist(problems)), collapse = "\n\n")) list( your_package = result$package, your_version = result$version, your_summary = summarise_check_results(problems), your_results = summary, you_have_problems = length(unlist(problems)) > 0, you_cant_install = any(grepl("Rcheck/00install[.]out", problems$errors)), me = author, date = date, my_package = pkg$package, my_version = version, my_github = gh$fullname ) } maintainer_email <- function(to, body, subject) { gmailr::mime(To = to, Subject = subject, body = body) } devtools/R/inst.r0000644000176200001440000000165513200623655013463 0ustar liggesusers#' Get the installation path of a package #' #' Given the name of a package, this returns a path to the installed #' copy of the package, which can be passed to other devtools functions. #' #' It searches for the package in \code{\link{.libPaths}()}. If multiple #' dirs are found, it will return the first one. #' #' @param name the name of a package. #' #' @examples #' inst("devtools") #' inst("grid") #' \dontrun{ #' # Can be passed to other devtools functions #' unload(inst("ggplot2")) #' } #' @export inst <- function(name) { # It would be nice to use find.package or system.file, but they # also search in the directory in the 'path' attribute of the # package environment. 
# Look in the library paths paths <- file.path(.libPaths(), name) paths <- paths[dir.exists(paths)] if (length(paths) > 0) { # If multiple matches, return the first one return(normalizePath(paths[1])) } else { return(NULL) } } devtools/R/install-url.r0000644000176200001440000000330413200623655014745 0ustar liggesusers#' Install a package from a url #' #' This function is vectorised so you can install multiple packages in #' a single command. #' #' @param url location of package on internet. The url should point to a #' zip file, a tar file or a bzipped/gzipped tar file. #' @param subdir subdirectory within url bundle that contains the R package. #' @param config additional configuration argument (e.g. proxy, #' authentication) passed on to \code{\link[httr]{GET}}. #' @param ... Other arguments passed on to \code{\link{install}}. #' @param quiet if \code{TRUE} suppresses output from this function. #' @export #' @family package installation #' @examples #' \dontrun{ #' install_url("https://github.com/hadley/stringr/archive/master.zip") #' } install_url <- function(url, subdir = NULL, config = list(), ..., quiet = FALSE) { remotes <- lapply(url, url_remote, subdir = subdir, config = config) install_remotes(remotes, ..., quiet = quiet) } url_remote <- function(url, subdir = NULL, config = list()) { remote("url", url = url, subdir = subdir, config = config ) } #' @export remote_download.url_remote <- function(x, quiet = FALSE) { if (!quiet) { message("Downloading package from url: ", x$url) } bundle <- tempfile(fileext = paste0(".", file_ext(x$url))) download(bundle, x$url, x$config) } #' @export remote_metadata.url_remote <- function(x, bundle = NULL, source = NULL) { list( RemoteType = "url", RemoteUrl = x$url, RemoteSubdir = x$subdir ) } #' @export remote_package_name.url_remote <- function(remote, ...) { NA_character_ } #' @export remote_sha.url_remote <- function(remote, ...) { NA_character_ } #' @export format.url_remote <- function(x, ...) 
{ "URL" } devtools/R/install-local.r0000644000176200001440000000413713171407310015235 0ustar liggesusers#' Install a package from a local file #' #' This function is vectorised so you can install multiple packages in #' a single command. #' #' @param path path to local directory, or compressed file (tar, zip, tar.gz #' tar.bz2, tgz2 or tbz) #' @inheritParams install_url #' @export #' @examples #' \dontrun{ #' dir <- tempfile() #' dir.create(dir) #' pkg <- download.packages("testthat", dir, type = "source") #' install_local(pkg[, 2]) #' } install_local <- function(path, subdir = NULL, ..., quiet = FALSE) { remotes <- lapply(path, local_remote, subdir = subdir) install_remotes(remotes, ..., quiet = quiet) } local_remote <- function(path, subdir = NULL, branch = NULL, args = character(0)) { remote("local", path = path, subdir = subdir ) } #' @export remote_download.local_remote <- function(x, quiet = FALSE) { # Already downloaded - just need to copy to tempdir() bundle <- tempfile() dir.create(bundle) file.copy(x$path, bundle, recursive = TRUE) # file.copy() creates directory inside of bundle dir(bundle, full.names = TRUE)[1] } #' @export remote_metadata.local_remote <- function(x, bundle = NULL, source = NULL) { res <- list( RemoteType = "local", RemoteUrl = x$path ) res$RemoteSha <- remote_sha.local_remote(x) if (uses_git(x$path)) { res$RemoteBranch <- git_branch(path = x$path) } if (uses_github(x$path)) { info <- github_info(x$path) res$RemoteUsername <- info$username res$RemoteRepo <- info$repo } res } #' @export remote_metadata.package <- remote_metadata.local_remote #' @export remote_package_name.local_remote <- function(remote, ...) { description_path <- file.path(remote$path, "DESCRIPTION") read_dcf(description_path)$Package } #' @export remote_sha.local_remote <- function(remote, ...) 
{ if (uses_git(remote$path)) { if (git_uncommitted(remote$path)) { return(NA_character_) } tryCatch({ git_sha1(path = remote$path) }, error = function(e) NA_character_) } else { read_dcf(file.path(remote$path, "DESCRIPTION"))$Version } } #' @export format.local_remote <- function(x, ...) { "local" } devtools/R/check-cran.r0000644000176200001440000001435313200623655014503 0ustar liggesusers#' Check a package from CRAN. #' #' Internal function used to power \code{\link{revdep_check}()}. #' #' This function does not clean up after itself, but does work in a #' session-specific temporary directory, so all files will be removed #' when your current R session ends. #' #' @param pkgs Vector of package names - note that unlike other \pkg{devtools} #' functions this is the name of a CRAN package, not a path. #' @param libpath Path to library to store dependencies packages - if you #' you're doing this a lot it's a good idea to pick a directory and stick #' with it so you don't have to download all the packages every time. #' @param srcpath Path to directory to store source versions of dependent #' packages - again, this saves a lot of time because you don't need to #' redownload the packages every time you run the package. #' @param check_libpath Path to library used for checking, should contain #' the top-level library from \code{libpath}. #' @param bioconductor Include bioconductor packages in checking? #' @param type binary Package type to test (source, mac.binary etc). Defaults #' to the same type as \code{\link{install.packages}()}. #' @param threads Number of concurrent threads to use for checking. #' It defaults to the option \code{"Ncpus"} or \code{1} if unset. #' @param check_dir,install_dir Directory to store check and installation #' results. #' @param quiet_check If \code{TRUE}, suppresses individual \code{R CMD #' check} output and only prints summaries. Set to \code{FALSE} for #' debugging. 
#' @return Returns (invisibly) the directory where check results are stored. #' @keywords internal #' @inheritParams check #' @export check_cran <- function(pkgs, libpath = file.path(tempdir(), "R-lib"), srcpath = libpath, check_libpath = libpath, bioconductor = FALSE, type = getOption("pkgType"), threads = getOption("Ncpus", 1), check_dir = tempfile("check_cran"), install_dir = tempfile("check_cran_install"), env_vars = NULL, quiet_check = TRUE) { stopifnot(is.character(pkgs)) if (length(pkgs) == 0) return() # For installing from GitHub, if git2r is not installed in the check_libpath requireNamespace("git2r", quietly = TRUE) rule("Checking ", length(pkgs), " CRAN packages", pad = "=") if (!file.exists(check_dir)) dir.create(check_dir) message("Results saved in ", check_dir) old <- options(warn = 1) on.exit(options(old), add = TRUE) # Create and use temporary library lapply(libpath, dir.create, showWarnings = FALSE, recursive = TRUE) libpath <- normalizePath(libpath) # Add the temporary library and remove on exit withr::with_libpaths(libpath, { message("Package library: ", paste(.libPaths(), collapse = ", ")) repos <- c(CRAN = cran_mirror()) if (bioconductor) { check_bioconductor() repos <- c(repos, BiocInstaller::biocinstallRepos()) } rule("Installing dependencies") # ------------------------------------------ deps <- package_deps(pkgs, repos = repos, type = type, dependencies = TRUE) dir.create(install_dir, showWarnings = FALSE, recursive = TRUE) needed <- deps$diff != CURRENT if (any(needed)) { message("Installing ", sum(needed), " packages: ", comma(pkgs)) update(deps, Ncpus = threads, quiet = FALSE, out_dir = install_dir, skip_if_log_exists = TRUE) } # Download source packages available_src <- available_packages(repos, "source") urls <- lapply(pkgs, package_url, repos = repos, available = available_src) ok <- vapply(urls, function(x) !is.na(x$name), logical(1)) if (any(!ok)) { message( "Skipping ", sum(!ok), " packages without source:", comma(pkgs[!ok]) ) urls 
<- urls[ok] pkgs <- pkgs[ok] } local_urls <- file.path(srcpath, vapply(urls, `[[`, "name", FUN.VALUE = character(1))) remote_urls <- vapply(urls, `[[`, "url", FUN.VALUE = character(1)) needs_download <- !vapply(local_urls, is_source_pkg, logical(1)) if (any(needs_download)) { message( "Downloading ", sum(needs_download), " source packages: ", comma(pkgs[needs_download]) ) Map(utils::download.file, remote_urls[needs_download], local_urls[needs_download], quiet = TRUE) } }) # Add the temporary library and remove on exit withr::with_libpaths(check_libpath, { rule("Checking packages") # -------------------------------------------------- message("Checking ", length(pkgs), " packages: ", comma(pkgs)) message("Package library: ", paste(.libPaths(), collapse = ", ")) check_start <- Sys.time() pkg_names <- format(pkgs) check_pkg <- function(i) { start_time <- Sys.time() res <- check_built( local_urls[i], args = "--no-multiarch --no-manual --no-codoc", env_vars = env_vars, check_dir = check_dir, quiet = quiet_check ) end_time <- Sys.time() message("Checked ", pkg_names[i], ": ", summarise_check_results(res, colour = TRUE)) status_update(i, length(pkgs), check_start) write_check_time( i, pkgs[i], elapsed_time = as.numeric(end_time - start_time, units = "secs"), file.path(check_dir, paste0(pkgs[i], ".Rcheck"), "check-time.txt")) NULL } if (length(pkgs) == 0) return() if (identical(as.integer(threads), 1L)) { lapply(seq_along(pkgs), check_pkg) } else { parallel::mclapply(seq_along(pkgs), check_pkg, mc.preschedule = FALSE, mc.cores = threads) } }) invisible(check_dir) } status_update <- function(i, n, start_time) { if (i %% 10 != 0) return() hm <- function(x) { sprintf("%02i:%02i", x %/% 3600, x %% 3600 %/% 60) } elapsed <- as.numeric(Sys.time() - start_time, units = "secs") estimated <- elapsed / i * (n - i) msg <- sprintf( "Checked %i/%i. Elapsed %s. 
Remaining ~%s", i, n, hm(elapsed), hm(estimated) ) message(msg) } write_check_time <- function(i, pkgs, elapsed_time, path) { writeLines(sprintf("%d %s %.1f", i, pkgs, elapsed_time), path) } parse_check_time <- function(path) { utils::read.table(file = path, header = FALSE)[[3]] } devtools/R/download-method.R0000644000176200001440000000226413171407310015523 0ustar liggesusers# Adapted from: # https://github.com/rstudio/rstudio/blob/master/src/cpp/session/modules/SessionPackages.R # # Copyright (C) 2009-12 by RStudio, Inc. download_method <- function(x) { if (identical(x, "auto")) { auto_download_method() } else { x } } auto_download_method <- function() { if (isTRUE(capabilities("libcurl"))) { "libcurl" } else if (isTRUE(capabilities("http/ftp"))) { "internal" } else if (nzchar(Sys.which("wget"))) { "wget" } else if (nzchar(Sys.which("curl"))) { "curl" } else { "" } } download_method_secure <- function(method = getOption("download.file.method", "auto")) { method <- download_method(method) if (method %in% c("wininet", "libcurl", "wget", "curl")) { # known good methods TRUE } else if (identical(method, "internal")) { # if internal then see if were using windows internal with inet2 identical(Sys.info()[["sysname"]], "Windows") && utils::setInternet2(NA) } else { # method with unknown properties (e.g. "lynx") or unresolved auto FALSE } } cran_mirror <- function() { if (download_method_secure()) { "https://cran.rstudio.com" } else { "http://cran.rstudio.com" } } devtools/R/check.r0000644000176200001440000001445313200623655013563 0ustar liggesusers#' Build and check a package, cleaning up automatically on success. #' #' \code{check} automatically builds and checks a source package, using all #' known best practices. \code{check_built} checks an already built package. #' #' Passing \code{R CMD check} is essential if you want to submit your package #' to CRAN: you must not have any ERRORs or WARNINGs, and you want to ensure #' that there are as few NOTEs as possible. 
If you are not submitting to CRAN, #' at least ensure that there are no ERRORs or WARNINGs: these typically #' represent serious problems. #' #' \code{check} automatically builds a package before calling \code{check_built} #' as this is the recommended way to check packages. Note that this process #' runs in an independent realisation of R, so nothing in your current #' workspace will affect the process. #' #' @section Environment variables: #' #' Devtools does its best to set up an environment that combines best practices #' with how check works on CRAN. This includes: #' #' \itemize{ #' #' \item The standard environment variables set by devtools: #' \code{\link{r_env_vars}}. Of particular note for package tests is the #' \code{NOT_CRAN} env var which lets you know that your tests are not #' running on cran, and hence can take a reasonable amount of time. #' #' \item Debugging flags for the compiler, set by #' \code{\link{compiler_flags}(FALSE)}. #' #' \item If \code{aspell} is found \code{_R_CHECK_CRAN_INCOMING_USE_ASPELL_} #' is set to \code{TRUE}. If no spell checker is installed, a warning is #' issued.) #' #' \item env vars set by arguments \code{check_version} and #' \code{force_suggests} #' } #' #' @return An object containing errors, warnings, and notes. #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @param document if \code{TRUE} (the default), will update and check #' documentation before running formal check. #' @param build_args Additional arguments passed to \code{R CMD build} #' @param ... Additional arguments passed on to \code{\link{build}()}. #' @param cleanup Deprecated. #' @seealso \code{\link{release}} if you want to send the checked package to #' CRAN. 
#' @export check <- function(pkg = ".", document = TRUE, build_args = NULL, ..., manual = FALSE, cran = TRUE, check_version = FALSE, force_suggests = FALSE, run_dont_test = FALSE, args = NULL, env_vars = NULL, quiet = FALSE, check_dir = tempdir(), cleanup = TRUE) { pkg <- as.package(pkg) if (!missing(cleanup)) { warning("`cleanup` is deprecated", call. = FALSE) } if (document) { document(pkg) } if (!quiet) { show_env_vars(compiler_flags(FALSE)) rule("Building ", pkg$package) } withr::with_envvar(compiler_flags(FALSE), action = "prefix", { built_path <- build(pkg, tempdir(), quiet = quiet, args = build_args, manual = manual, ...) on.exit(unlink(built_path), add = TRUE) }) check_built( built_path, cran = cran, check_version = check_version, force_suggests = force_suggests, run_dont_test = run_dont_test, manual = manual, args = args, env_vars = env_vars, quiet = quiet, check_dir = check_dir ) } #' @export #' @rdname check #' @param path Path to built package. #' @param cran if \code{TRUE} (the default), check using the same settings as #' CRAN uses. #' @param check_version Sets \code{_R_CHECK_CRAN_INCOMING_} env var. #' If \code{TRUE}, performs a number of checks related #' to version numbers of packages on CRAN. #' @param force_suggests Sets \code{_R_CHECK_FORCE_SUGGESTS_}. If #' \code{FALSE} (the default), check will proceed even if all suggested #' packages aren't found. #' @param run_dont_test Sets \code{--run-donttest} so that tests surrounded in #' \code{\\donttest\{\}} are also tested. This is important for CRAN #' submission. #' @param manual If \code{FALSE}, don't build and check the manual #' (\code{--no-manual}). #' @param args Additional arguments passed to \code{R CMD check} #' @param env_vars Environment variables set during \code{R CMD check} #' @param check_dir the directory in which the package is checked #' @param quiet if \code{TRUE} suppresses output from this function.
check_built <- function(path = NULL, cran = TRUE, check_version = FALSE, force_suggests = FALSE, run_dont_test = FALSE, manual = FALSE, args = NULL, env_vars = NULL, check_dir = tempdir(), quiet = FALSE) { pkgname <- gsub("_.*?$", "", basename(path)) args <- c("--timings", args) if (cran) { args <- c("--as-cran", args) } if (run_dont_test) { args <- c("--run-donttest", args) } if (manual && !has_latex(verbose = TRUE)) { manual <- FALSE } if (!manual) { args <- c(args, "--no-manual") } env_vars <- check_env_vars(cran, check_version, force_suggests, env_vars) if (!quiet) { show_env_vars(env_vars) rule("Checking ", pkgname) } R(c(paste("CMD check", shQuote(path)), args), path = check_dir, env_vars = env_vars, quiet = quiet, throw = FALSE ) package_path <- file.path( normalizePath(check_dir), paste(pkgname, ".Rcheck", sep = "") ) if (!file.exists(package_path)) { stop("Check failed: '", package_path, "' doesn't exist", call. = FALSE) } # Record successful completion writeLines("OK", file.path(package_path, "COMPLETE")) log_path <- file.path(package_path, "00check.log") parse_check_results(log_path) } check_env_vars <- function(cran = FALSE, check_version = FALSE, force_suggests = TRUE, env_vars) { c( aspell_env_var(), "_R_CHECK_CRAN_INCOMING_" = as.character(check_version), "_R_CHECK_FORCE_SUGGESTS_" = as.character(force_suggests), env_vars ) } aspell_env_var <- function() { tryCatch({ utils::aspell(NULL) c("_R_CHECK_CRAN_INCOMING_USE_ASPELL_" = "TRUE") }, error = function(e) character()) } show_env_vars <- function(env_vars) { rule("Setting env vars") message(paste0(format(names(env_vars)), ": ", unname(env_vars), collapse = "\n")) } devtools/R/topic-index.r0000644000176200001440000000434413200623655014727 0ustar liggesusers# Tools for indexing package documentation by alias, and for finding # the rd file for a given topic (alias). 
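The topic index built below is just an inverted alias table: each Rd file lists the aliases it documents, and `invert()` flips that list so lookups go from alias to file. A minimal standalone sketch of that inversion, using made-up file names that are not part of the package source (`lengths()` stands in for the package's `lapply(L, length)`):

```r
# Map each Rd file to the aliases it documents (hypothetical example data).
rd_aliases <- list(
  "check.Rd" = c("check", "check_built"),
  "build.Rd" = "build"
)

# Same steps as the invert() helper below: flatten the alias values,
# relabel each with its file of origin, then group file names by alias.
t1 <- unlist(rd_aliases)
names(t1) <- rep(names(rd_aliases), lengths(rd_aliases))
index <- tapply(names(t1), t1, c)

index[["check_built"]]
#> [1] "check.Rd"
```

With the index in hand, `find_pkg_topic()` can resolve a help topic to its Rd file with a single lookup instead of re-parsing every man page.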
# @return path to rd file within package find_pkg_topic <- function(pkg = ".", topic) { pkg <- as.package(pkg) # First see if a man file of that name exists man <- file.path(pkg$path, "man", topic) if (file.exists(man)) return(basename(man)) # Next, look in index index <- topic_index(pkg) if (topic %in% names(index)) return(index[[topic]]) # Finally, try adding .Rd to name man_rd <- file.path(pkg$path, "man", paste0(topic, ".Rd")) if (file.exists(man_rd)) return(basename(man_rd)) NULL } #' Find the rd file that documents a topic. #' #' Only packages loaded by devtools are searched. #' #' @param topic The topic, a string. #' @return A named string. The values gives the path to file; the name gives #' the path to package. #' @export #' @keywords internal #' @examples #' find_topic("help") find_topic <- function(topic) { if (is.null(topic) || topic == "") return(NULL) pieces <- strsplit(topic, "::")[[1]] if (length(pieces) == 1) { pkgs <- dev_packages() } else { pkgs <- pieces[1] topic <- pieces[2] } for (pkg in pkgs) { path <- getNamespaceInfo(pkg, "path") rd <- find_pkg_topic(path, topic) if (!is.null(rd)) return(stats::setNames(file.path(path, "man", rd), path)) } NULL } topic_indices <- new.env(parent = emptyenv()) topic_index <- function(pkg = ".") { pkg <- as.package(pkg) if (!exists(pkg$package, topic_indices)) { topic_indices[[pkg$package]] <- build_topic_index(pkg) } topic_indices[[pkg$package]] } clear_topic_index <- function(pkg = ".") { pkg <- as.package(pkg) if (exists(pkg$package, topic_indices)) { rm(list = pkg$package, envir = topic_indices) } invisible(TRUE) } build_topic_index <- function(pkg = ".") { pkg <- as.package(pkg) rds <- rd_files(pkg) aliases <- function(path) { parsed <- tools::parse_Rd(path) tags <- vapply(parsed, function(x) attr(x, "Rd_tag")[[1]], character(1)) unlist(parsed[tags == "\\alias"]) } invert(lapply(rds, aliases)) } invert <- function(L) { if (length(L) == 0) return(L) t1 <- unlist(L) names(t1) <- rep(names(L), lapply(L, 
length)) tapply(names(t1), t1, c) } devtools/R/build.r0000644000176200001440000001102613200623655013576 0ustar liggesusers#' Build package. #' #' Building converts a package source directory into a single bundled file. #' If \code{binary = FALSE} this creates a \code{tar.gz} package that can #' be installed on any platform, provided they have a full development #' environment (although packages without source code can typically be #' install out of the box). If \code{binary = TRUE}, the package will have #' a platform specific extension (e.g. \code{.zip} for windows), and will #' only be installable on the current platform, but no development #' environment is needed. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @param path path in which to produce package. If \code{NULL}, defaults to #' the parent directory of the package. #' @param binary Produce a binary (\code{--binary}) or source ( #' \code{--no-manual --no-resave-data}) version of the package. #' @param vignettes,manual For source packages: if \code{FALSE}, don't build PDF #' vignettes (\code{--no-build-vignettes}) or manual (\code{--no-manual}). #' @param args An optional character vector of additional command #' line arguments to be passed to \code{R CMD build} if \code{binary = FALSE}, #' or \code{R CMD install} if \code{binary = TRUE}. #' @param quiet if \code{TRUE} suppresses output from this function. 
#' @export #' @family build functions #' @return a string giving the location (including file name) of the built #' package build <- function(pkg = ".", path = NULL, binary = FALSE, vignettes = TRUE, manual = FALSE, args = NULL, quiet = FALSE) { pkg <- as.package(pkg) if (is.null(path)) { path <- dirname(pkg$path) } check_build_tools(pkg) compile_rcpp_attributes(pkg) if (binary) { args <- c("--build", args) cmd <- paste0("CMD INSTALL ", shQuote(pkg$path), " ", paste0(args, collapse = " ")) if (.Platform$OS.type == "windows") { ext <- ".zip" } else if (grepl("darwin", R.version$os)) { ext <- ".tgz" } else { ext <- paste0("_R_", Sys.getenv("R_PLATFORM"), ".tar.gz") } } else { args <- c(args, "--no-resave-data") if (manual && !has_latex(verbose = TRUE)) { manual <- FALSE } if (!manual) { args <- c(args, "--no-manual") } if (!vignettes) { args <- c(args, "--no-build-vignettes") } cmd <- paste0("CMD build ", shQuote(pkg$path), " ", paste0(args, collapse = " ")) ext <- ".tar.gz" } # Run in temporary library to ensure that default library doesn't get # contaminated withr::with_temp_libpaths(R(cmd, path, quiet = quiet)) targz <- paste0(pkg$package, "_", pkg$version, ext) file.path(path, targz) } #' Build windows binary package. #' #' This function works by bundling source package, and then uploading to #' \url{http://win-builder.r-project.org/}. Once building is complete you'll #' receive a link to the built package in the email address listed in the #' maintainer field. It usually takes around 30 minutes. As a side effect, #' win-build also runs \code{R CMD check} on the package, so \code{build_win} #' is also useful to check that your package is ok on windows. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @inheritParams build #' @param version directory to upload to on the win-builder, controlling #' which version of R is used to build the package. 
Possible options are #' listed on \url{http://win-builder.r-project.org/}. Defaults to R-devel. #' @export #' @family build functions build_win <- function(pkg = ".", version = c("R-release", "R-devel"), args = NULL, quiet = FALSE) { pkg <- as.package(pkg) if (missing(version)) { version <- "R-devel" } else { version <- match.arg(version, several.ok = TRUE) } if (!quiet) { message("Building windows version of ", pkg$package, " for ", paste(version, collapse = ", "), " with win-builder.r-project.org.\n") if (interactive() && yesno("Email results to ", maintainer(pkg)$email, "?")) { return(invisible()) } } built_path <- build(pkg, tempdir(), args = args, quiet = quiet) on.exit(unlink(built_path)) url <- paste0("ftp://win-builder.r-project.org/", version, "/", basename(built_path)) lapply(url, upload_ftp, file = built_path) if (!quiet) { message("Check ", maintainer(pkg)$email, " for a link to the built package", if (length(version) > 1) "s" else "", " in 30-60 mins.") } invisible() } devtools/R/check-doc.r0000644000176200001440000000324512740754321014326 0ustar liggesusers#' Check documentation, as \code{R CMD check} does. #' #' This function attempts to run the documentation related checks in the #' same way that \code{R CMD check} does. Unfortunately it can't run them #' all because some tests require the package to be loaded, and the way #' they attempt to load the code conflicts with how devtools does it. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @return Nothing. This function is called purely for it's side effects: if # no errors there will be no output. 
#' @export #' @examples #' \dontrun{ #' check_man("mypkg") #' } check_man <- function(pkg = ".") { pkg <- as.package(pkg) document(pkg) old <- options(warn = -1) on.exit(options(old)) message("Checking documentation...") ok <- man_message(("tools" %:::% ".check_package_parseRd")(dir = pkg$path)) && man_message(("tools" %:::% ".check_Rd_metadata")(dir = pkg$path)) && man_message(("tools" %:::% ".check_Rd_xrefs")(dir = pkg$path)) && man_message(("tools" %:::% ".check_Rd_contents")(dir = pkg$path)) && man_message(tools::checkDocFiles(dir = pkg$path)) # Can't run because conflicts with how devtools loads code # print_if_not_null(checkDocStyle(dir = pkg$path)) # print_if_not_null(checkReplaceFuns(dir = pkg$path)) # print_if_not_null(checkS3methods(dir = pkg$path)) # print(undoc(dir = pkg$path)) if (ok) { message("No issues detected") } invisible() } man_message <- function(x) { if ("bad" %in% names(x) && length(x$bad) == 0) { # Returned by check_Rd_xrefs() TRUE } else if (length(x) == 0) { TRUE } else { print(x) FALSE } } devtools/R/release.r0000644000176200001440000002252213200623655014122 0ustar liggesusers#' Release package to CRAN. #' #' Run automated and manual tests, then ftp to CRAN. #' #' The package release process will: #' #' \itemize{ #' #' \item Confirm that the package passes \code{R CMD check} #' \item Ask if you've checked your code on win-builder #' \item Confirm that news is up-to-date #' \item Confirm that DESCRIPTION is ok #' \item Ask if you've checked packages that depend on your package #' \item Build the package #' \item Submit the package to CRAN, using comments in "cran-comments.md" #' } #' #' You can also add arbitrary extra questions by defining an (un-exported) #' function called \code{release_questions()} that returns a character vector #' of additional questions to ask. #' #' You also need to read the CRAN repository policy at #' \url{https://cran.r-project.org/web/packages/policies.html} and make #' sure you're in line with the policies. 
\code{release} tries to automate as #' many of polices as possible, but it's impossible to be completely #' comprehensive, and they do change in between releases of devtools. #' #' @section Guarantee: #' #' If a devtools bug causes one of the CRAN maintainers to treat you #' impolitely, I will personally send you a handwritten apology note. #' Please forward me the email and your address, and I'll get a card in #' the mail. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @param check if \code{TRUE}, run checking, otherwise omit it. This #' is useful if you've just checked your package and you're ready to #' release it. #' @param args An optional character vector of additional command #' line arguments to be passed to \code{R CMD build}. #' @param spelling language or dictionary file to spell check documentation. #' See \code{\link{spell_check}}. Set to \code{NULL} to skip spell checking. #' @export release <- function(pkg = ".", check = TRUE, args = NULL, spelling = "en_US") { pkg <- as.package(pkg) # Figure out if this is a new package cran_version <- cran_pkg_version(pkg$package) new_pkg <- is.null(cran_version) dr_d <- dr_devtools() if (!dr_d) { print(dr_d) if (yesno("Proceed anyway?")) return(invisible()) } if (uses_git(pkg$path)) { git_checks(pkg) if (yesno("Were Git checks successful?")) return(invisible()) } if (length(spelling)) { cat("Spell checking documentation...\n") print(spell_check(pkg, dict = spelling)) cat("\n") if (yesno("Is documentation free of spelling errors? 
(you can ignore false positives)")) return(invisible()) } if (check) { rule("Building and checking ", pkg$package, pad = "=") check(pkg, cran = TRUE, check_version = TRUE, manual = TRUE, build_args = args, run_dont_test = TRUE) } if (yesno("Was R CMD check successful?")) return(invisible()) release_checks(pkg) if (yesno("Were devtool's checks successful?")) return(invisible()) if (!new_pkg) { cran_url <- paste0(cran_mirror(), "/web/checks/check_results_", pkg$package, ".html") if (yesno("Have you fixed all existing problems at \n", cran_url, " ?")) return(invisible()) } if (has_src(pkg)) { if (yesno("Have you run R CMD check with valgrind?")) return(invisible()) } deps <- if (new_pkg) 0 else length(revdep(pkg$package)) if (deps > 0) { msg <- paste0("Have you checked the ", deps ," packages that depend on ", "this package (with revdep_check())?") if (yesno(msg)) return(invisible()) } if (yesno("Have you checked on win-builder (with build_win())?")) return(invisible()) if (yesno("Have you updated your NEWS file?")) return(invisible()) rule("DESCRIPTION") cat(readLines(file.path(pkg$path, "DESCRIPTION")), sep = "\n") cat("\n") if (yesno("Is DESCRIPTION up-to-date?")) return(invisible()) release_questions <- pkg_env(pkg)$release_questions if (!is.null(release_questions)) { questions <- release_questions() for (question in questions) { if (yesno(question)) return(invisible()) } } rule("cran-comments.md") cat(cran_comments(pkg), "\n\n") if (yesno("Are the CRAN submission comments correct?")) return(invisible()) if (yesno("Is your email address ", maintainer(pkg)$email, "?")) return(invisible()) built_path <- build_cran(pkg, args = args) if (yesno("Ready to submit?")) return(invisible()) upload_cran(pkg, built_path) if (uses_git(pkg$path)) { message("Don't forget to tag the release when the package is accepted!") } invisible(TRUE) } release_email <- function(name, new_pkg) { paste( "Dear CRAN maintainers,\n", "\n", if (new_pkg) { paste("I have uploaded a new package, ", 
name, ", to CRAN. ", "I have read and agree to the CRAN policies.\n", sep = "") } else { paste("I have just uploaded a new version of ", name, " to CRAN.\n", sep = "") }, "\n", "Thanks!\n", "\n", getOption("devtools.name"), "\n", sep = "") } yesno <- function(...) { yeses <- c("Yes", "Definitely", "For sure", "Yup", "Yeah", "I agree", "Absolutely") nos <- c("No way", "Not yet", "I forget", "No", "Nope", "Uhhhh... Maybe?") cat(paste0(..., collapse = "")) qs <- c(sample(yeses, 1), sample(nos, 2)) rand <- sample(length(qs)) menu(qs[rand]) != which(rand == 1) } # http://tools.ietf.org/html/rfc2368 email <- function(address, subject, body) { url <- paste( "mailto:", utils::URLencode(address), "?subject=", utils::URLencode(subject), "&body=", utils::URLencode(body), sep = "" ) tryCatch({ utils::browseURL(url, browser = email_browser())}, error = function(e) { message("Sending failed with error: ", e$message) cat("To: ", address, "\n", sep = "") cat("Subject: ", subject, "\n", sep = "") cat("\n") cat(body, "\n", sep = "") } ) invisible(TRUE) } email_browser <- function() { if (!identical(.Platform$GUI, "RStudio")) return (getOption("browser")) # Use default browser, even if RStudio running if (.Platform$OS.type == "windows") return (NULL) browser <- Sys.which(c("xdg-open", "open")) browser[nchar(browser) > 0][[1]] } maintainer <- function(pkg = ".") { pkg <- as.package(pkg) authors <- pkg$`authors@r` if (!is.null(authors)) { people <- eval(parse(text = authors)) if (is.character(people)) { maintainer <- utils::as.person(people) } else { maintainer <- Find(function(x) "cre" %in% x$role, people) } } else { maintainer <- pkg$maintainer if (is.null(maintainer)) { stop("No maintainer defined in package.", call. 
= FALSE) } maintainer <- utils::as.person(maintainer) } list( name = paste(maintainer$given, maintainer$family), email = maintainer$email ) } cran_comments <- function(pkg = ".") { pkg <- as.package(pkg) path <- file.path(pkg$path, "cran-comments.md") if (!file.exists(path)) { warning("Can't find cran-comments.md.\n", "This file gives CRAN volunteers comments about the submission,\n", "and it must exist. Create it with use_cran_comments().\n", call. = FALSE) return(character()) } paste0(readLines(path, warn = FALSE), collapse = "\n") } cran_submission_url <- "http://xmpalantir.wu.ac.at/cransubmit/index2.php" #' Submit a package to CRAN. #' #' This uses the new CRAN web-form submission process. After submission, you #' will receive an email asking you to confirm submission - this is used #' to check that the package is submitted by the maintainer. #' #' It's recommended that you use \code{\link{release}()} rather than this #' function as it performs more checks prior to submission. #' #' @param pkg package description, can be path or package name. 
See #' \code{\link{as.package}} for more information #' @inheritParams release #' @export #' @keywords internal submit_cran <- function(pkg = ".", args = NULL) { built_path <- build_cran(pkg, args = args) upload_cran(pkg, built_path) } build_cran <- function(pkg, args) { message("Building") built_path <- build(pkg, tempdir(), manual = TRUE, args = args) message("Submitting file: ", built_path) message("File size: ", format(as.object_size(file.info(built_path)$size), units = "auto")) built_path } upload_cran <- function(pkg, built_path) { pkg <- as.package(pkg) maint <- maintainer(pkg) comments <- cran_comments(pkg) # Initial upload --------- message("Uploading package & comments") body <- list( pkg_id = "", name = maint$name, email = maint$email, uploaded_file = httr::upload_file(built_path, "application/x-gzip"), comment = comments, upload = "Upload package" ) r <- httr::POST(cran_submission_url, body = body) httr::stop_for_status(r) new_url <- httr::parse_url(r$url) new_url$query$strErr # Confirmation ----------- message("Confirming submission") body <- list( pkg_id = new_url$query$pkg_id, name = maint$name, email = maint$email, policy_check = "1/", submit = "Submit package" ) r <- httr::POST(cran_submission_url, body = body) httr::stop_for_status(r) new_url <- httr::parse_url(r$url) if (new_url$query$submit == "1") { message("Package submission successful.\n", "Check your email for confirmation link.") } else { stop("Package failed to upload.", call. 
= FALSE) } invisible(TRUE) } as.object_size <- function(x) structure(x, class = "object_size") devtools/R/vignette-r.r0000644000176200001440000000454413171407310014565 0ustar liggesusers# Modified from src/library/tools/R/build.R # # Copyright (C) 1995-2013 The R Core Team # Copyright (C) 2013 Hadley Wickham # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # A copy of the GNU General Public License is available at # http://www.r-project.org/Licenses/ copy_vignettes <- function(pkg) { pkg <- as.package(pkg) doc_dir <- file.path(pkg$path, "inst", "doc") if (!file.exists(doc_dir)) { dir.create(doc_dir, recursive = TRUE, showWarnings = FALSE) } vigns <- tools::pkgVignettes(dir = pkg$path, output = TRUE, source = TRUE) if (length(vigns$docs) == 0) return(invisible()) out_mv <- c(vigns$outputs, unique(unlist(vigns$sources, use.names = FALSE))) out_cp <- vigns$docs message("Moving ", paste(basename(out_mv), collapse = ", "), " to inst/doc/") file.copy(out_mv, doc_dir, overwrite = TRUE) file.remove(out_mv) message("Copying ", paste(basename(out_cp), collapse = ", "), " to inst/doc/") file.copy(out_cp, doc_dir, overwrite = TRUE) # Copy extra files, if needed extra_files <- find_vignette_extras(pkg) if (length(extra_files) == 0) return(invisible()) message("Copying extra files ", paste(basename(extra_files), collapse = ", "), " to inst/doc/") file.copy(extra_files, doc_dir, recursive = TRUE) invisible() } find_vignette_extras <- function(pkg = ".") { pkg <- as.package(pkg) vig_path <- file.path(pkg$path, "vignettes") extras_file <- 
file.path(vig_path, ".install_extras") if (!file.exists(extras_file)) return(character()) extras <- readLines(extras_file, warn = FALSE) if (length(extras) == 0) return(character()) withr::with_dir(vig_path, { allfiles <- dir(all.files = TRUE, full.names = TRUE, recursive = TRUE, include.dirs = TRUE) }) inst <- rep(FALSE, length(allfiles)) for (e in extras) { inst <- inst | grepl(e, allfiles, perl = TRUE, ignore.case = TRUE) } normalizePath(file.path(vig_path, allfiles[inst])) } devtools/R/bash.r0000644000176200001440000000042512656131112013411 0ustar liggesusers#' Open bash shell in package directory. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @export bash <- function(pkg = ".") { pkg <- as.package(pkg) withr::with_dir(pkg$path, system("bash")) } devtools/R/imports-env.r0000644000176200001440000000643113200623655014766 0ustar liggesusers#' Return imports environment for a package #' #' Contains objects imported from other packages. Is the parent of the #' package namespace environment, and is a child of , #' which is a child of R_GlobalEnv. #' @keywords internal #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information. #' @seealso \code{\link{ns_env}} for the namespace environment that #' all the objects (exported and not exported). #' @seealso \code{\link{pkg_env}} for the attached environment that contains #' the exported objects. #' @export imports_env <- function(pkg = ".") { pkg <- as.package(pkg) if (!is_loaded(pkg)) { stop("Namespace environment must be created before accessing imports environment.") } env <- parent.env(ns_env(pkg)) if (attr(env, 'name') != imports_env_name(pkg)) { stop("Imports environment does not have attribute 'name' with value ", imports_env_name(pkg), ". 
This probably means that the namespace environment was not created correctly.") } env } # Generate name of package imports environment # Contains imported objects imports_env_name <- function(pkg = ".") { pkg <- as.package(pkg) paste("imports:", pkg$package, sep = "") } #' Load all of the imports for a package #' #' The imported objects are copied to the imports environment, and are not #' visible from R_GlobalEnv. This will automatically load (but not attach) #' the dependency packages. #' #' @keywords internal load_imports <- function(pkg = ".") { pkg <- as.package(pkg) # Get data frame of dependency names and versions deps <- parse_deps(pkg$imports) if (is.null(deps) || nrow(deps) == 0) return(invisible()) # If we've already loaded imports, don't load again (until load_all # is run with reset=TRUE). This is to avoid warnings when running # process_imports() if (length(ls(imports_env(pkg))) > 0) return(invisible(deps)) mapply(check_dep_version, deps$name, deps$version, deps$compare) process_imports(pkg) invisible(deps) } # Load imported objects # The code in this function is taken and adapted from base::loadNamespace # Setup variables were added and the for loops put in a tryCatch block # https://github.com/wch/r-source/blob/tags/R-3-3-0/src/library/base/R/namespace.R#L397-L427 # This wraps the inner for loop iterations in a tryCatch wrap_inner_loop <- function(x) { inner <- x[[4]] x[[4]] <- call("tryCatch", error = quote(warning), inner) x } onload_assign("process_imports", { make_function(alist(pkg = "."), bquote({ package <- pkg$name vI <- ("tools" %:::% ".split_description")(("tools" %:::% ".read_description")(file.path(pkg$path, "DESCRIPTION")))$Imports nsInfo <- parse_ns_file(pkg) ns <- ns_env(pkg) lib.loc <- NULL .(for1) .(for2) .(for3) }, list( for1 = wrap_inner_loop( extract_lang(body(loadNamespace), comp_lang, y = quote(for(i in nsInfo$imports) NULL), idx = 1:3)), for2 = wrap_inner_loop(extract_lang(body(loadNamespace), comp_lang, y = quote(for(imp in 
nsInfo$importClasses) NULL), idx = 1:3)), for3 = wrap_inner_loop(extract_lang(body(loadNamespace), comp_lang, y = quote(for(imp in nsInfo$importMethods) NULL), idx = 1:3)) )), asNamespace("devtools")) }) devtools/R/shims.r0000644000176200001440000001107213200623655013623 0ustar liggesusers# Insert shim objects into a package's imports environment # # @param pkg A path or package object insert_imports_shims <- function(pkg = ".") { pkg <- as.package(pkg) imp_env <- imports_env(pkg) imp_env$system.file <- shim_system.file imp_env$library.dynam.unload <- shim_library.dynam.unload } # Create a new environment as the parent of global, with devtools versions of # help, ?, and system.file. insert_global_shims <- function() { # If shims already present, just return if ("devtools_shims" %in% search()) return() e <- new.env() e$help <- shim_help e$`?` <- shim_question e$system.file <- shim_system.file base::attach(e, name = "devtools_shims", warn.conflicts = FALSE) } #' Replacement version of system.file #' #' This function is meant to intercept calls to \code{\link[base]{system.file}}, #' so that it behaves well with packages loaded by devtools. It is made #' available when a package is loaded with \code{\link{load_all}}. #' #' When \code{system.file} is called from the R console (the global #' environment), this function detects if the target package was loaded with #' \code{\link{load_all}}, and if so, it uses a customized method of searching #' for the file. This is necessary because the directory structure of a source #' package is different from the directory structure of an installed package. #' #' When a package is loaded with \code{load_all}, this function is also inserted #' into the package's imports environment, so that calls to \code{system.file} #' from within the package namespace will use this modified version. If this #' function were not inserted into the imports environment, then the package #' would end up calling \code{base::system.file} instead. 
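The shims above work because of R's lexical scoping: the imports environment is a parent of the package namespace, so a binding placed there shadows `base::system.file` for unqualified calls made from package code. The following standalone sketch demonstrates that mechanism; all names in it (`fake_imports`, `pkg_ns`, `shim_sys_file`, `f`) are invented for illustration and are not devtools internals.

```r
# Mimic the environment chain devtools sets up:
#   package namespace -> imports environment -> base
fake_imports <- new.env(parent = baseenv())
pkg_ns <- new.env(parent = fake_imports)

# A stand-in shim that marks its result so we can see that it ran
# (it looks under inst/, loosely like shim_system.file does).
shim_sys_file <- function(...) paste("shim saw:", file.path("inst", ...))
fake_imports$system.file <- shim_sys_file

# A "package function" whose enclosing environment is the namespace.
# Its unqualified call to system.file() is resolved by walking
# pkg_ns -> fake_imports, where it finds the shim, not the base function.
f <- eval(quote(function() system.file("extdata", "x.csv")), pkg_ns)
f()
#> [1] "shim saw: inst/extdata/x.csv"
```

This is why `insert_imports_shims()` only has to assign into `imports_env(pkg)`: no package code needs to change for the shim to take effect.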
#' @inheritParams base::system.file #' #' @usage # system.file(..., package = "base", lib.loc = NULL, mustWork = FALSE) #' @rdname system.file #' @name system.file #' @usage system.file(..., package = "base", lib.loc = NULL, mustWork = FALSE) shim_system.file <- function(..., package = "base", lib.loc = NULL, mustWork = FALSE) { # If package wasn't loaded with devtools, pass through to base::system.file. # If package was loaded with devtools (the package loaded with load_all) # search for files a bit differently. if (!(package %in% dev_packages())) { base::system.file(..., package = package, lib.loc = lib.loc, mustWork = mustWork) } else { pkg_path <- find.package(package) # First look in inst/ files_inst <- file.path(pkg_path, "inst", ...) present_inst <- file.exists(files_inst) # For any files that weren't present in inst/, look in the base path files_top <- file.path(pkg_path, ...) present_top <- file.exists(files_top) # Merge them together. Here are the different possible conditions, and the # desired result. NULL means to drop that element from the result. # # files_inst: /inst/A /inst/B /inst/C /inst/D # present_inst: T T F F # files_top: /A /B /C /D # present_top: T F T F # result: /inst/A /inst/B /C NULL # files <- files_top files[present_inst] <- files_inst[present_inst] # Drop cases where not present in either location files <- files[present_inst | present_top] if (length(files) > 0) { # Make sure backslashes are replaced with slashes on Windows normalizePath(files, winslash = "/") } else { if (mustWork) { stop("No file found", call. = FALSE) } else { "" } } # Note that the behavior isn't exactly the same as base::system.file with an # installed package; in that case, C and D would not be installed and so # would not be found. Some other files (like DESCRIPTION, data/, etc) would # be installed. To fully duplicate R's package-building and installation # behavior would be complicated, so we'll just use this simple method. 
} } shim_library.dynam.unload <- function(chname, libpath, verbose = getOption("verbose"), file.ext = .Platform$dynlib.ext) { # If package was loaded by devtools, we need to unload the dll ourselves # because libpath works differently from installed packages. if (!is.null(dev_meta(chname))) { try({ pkg <- as.package(libpath) unload_dll(pkg) }) return() } # Should only reach this in the rare case that the devtools-loaded package is # trying to unload a different package's DLL. base::library.dynam.unload(chname, libpath, verbose, file.ext) } devtools/R/source.r0000644000176200001440000000143112416621515013777 0ustar liggesuserssource_many <- function(files, envir = parent.frame()) { stopifnot(is.character(files)) stopifnot(is.environment(envir)) oop <- options( keep.source = TRUE, show.error.locations = TRUE, topLevelEnvironment = as.environment(envir)) on.exit(options(oop)) for (file in files) { source_one(file, envir = envir) } invisible() } source_one <- function(file, envir = parent.frame()) { stopifnot(file.exists(file)) stopifnot(is.environment(envir)) lines <- readLines(file, warn = FALSE) srcfile <- srcfilecopy(file, lines, file.info(file)[1, "mtime"], isFile = TRUE) exprs <- parse(text = lines, n = -1, srcfile = srcfile) n <- length(exprs) if (n == 0L) return(invisible()) for (i in seq_len(n)) { eval(exprs[i], envir) } invisible() } devtools/R/file-cache.r0000644000176200001440000000142413200623655014460 0ustar liggesusers# Generate checksums for a vector of file paths. 
# @keywords internal md5 <- function(paths) { unlist(lapply(paths, tools::md5sum)) } make_cache <- function() { .file_cache <- character() make <- function(paths) { paths <- path.expand(paths) new_hash <- md5(paths) old_hash <- .file_cache[paths] changed <- is.na(old_hash) | new_hash != old_hash .file_cache[paths[changed]] <<- new_hash[changed] paths[changed] } clear <- function() { .file_cache <<- character() } list(make = make, clear = clear) } .cache <- make_cache() # Given vector of paths, return only those paths that have changed since the # last invocation. # @keywords internal changed_files <- .cache$make # Clear file cache. # @keywords internal clear_cache <- .cache$clear devtools/R/spell-check.R0000644000176200001440000000541113200623655014632 0ustar liggesusers#' Spell checking #' #' Runs a spell check on text fields in the package description file and #' manual pages. Hunspell includes dictionaries for \code{en_US} and \code{en_GB} #' by default. Other languages require installation of a custom dictionary, see #' the \href{https://cran.r-project.org/package=hunspell/vignettes/intro.html#system_dictionaries}{hunspell vignette} #' for details. #' #' @export #' @rdname spell_check #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @param ignore character vector with words to ignore. See #' \code{\link[hunspell:hunspell]{hunspell}} for more information #' @param dict a dictionary object or language string. 
See #' \code{\link[hunspell:hunspell]{hunspell}} for more information spell_check <- function(pkg = ".", ignore = character(), dict = "en_US"){ check_suggested("hunspell") pkg <- as.package(pkg) ignore <- c(pkg$package, hunspell::en_stats, ignore) # Check Rd manual files rd_files <- list.files(file.path(pkg$path, "man"), "\\.Rd$", full.names = TRUE) rd_lines <- lapply(sort(rd_files), spell_check_rd, ignore = ignore, dict = dict) # Check 'DESCRIPTION' fields pkg_fields <- c("title", "description") pkg_lines <- lapply(pkg_fields, function(x){ spell_check_file(textConnection(pkg[[x]]), ignore = ignore, dict = dict) }) # Combine all_sources <- c(rd_files, pkg_fields) all_lines <- c(rd_lines, pkg_lines) words_by_file <- lapply(all_lines, names) bad_words <- sort(unique(unlist(words_by_file))) # Find all occurrences for each word out <- lapply(bad_words, function(word) { index <- which(vapply(words_by_file, `%in%`, x = word, logical(1))) reports <- vapply(index, function(i){ paste0(basename(all_sources[i]), ":", all_lines[[i]][word]) }, character(1)) }) structure(out, names = bad_words, class = "spellcheck") } #' @export print.spellcheck <- function(x, ...){ words <- names(x) fmt <- paste0("%-", max(nchar(words)) + 3, "s") pretty_names <- sprintf(fmt, words) cat(sprintf(fmt, " WORD"), " FOUND IN\n", sep = "") for(i in seq_along(x)){ cat(pretty_names[i]) cat(paste(x[[i]], collapse = ", ")) cat("\n") } } spell_check_text <- function(text, ignore, dict){ bad_words <- hunspell::hunspell(text, ignore = ignore, dict = dict) vapply(sort(unique(unlist(bad_words))), function(word) { line_numbers <- which(vapply(bad_words, `%in%`, x = word, logical(1))) paste(line_numbers, collapse = ",") }, character(1)) } spell_check_file <- function(file, ignore, dict){ spell_check_text(readLines(file), ignore = ignore, dict = dict) } spell_check_rd <- function(rdfile, ignore, dict){ text <- tools::RdTextFilter(rdfile) spell_check_text(text, ignore = ignore, dict = dict) } 
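A usage sketch of the spell-checking entry point defined above. The package path, ignored words, and flagged words shown here are hypothetical; running this requires the hunspell package to be installed, and the printed layout follows `print.spellcheck()` above.

```r
# Hypothetical session: spell check a source package, suppressing some
# domain-specific vocabulary. Paths and reported words are invented.
library(devtools)

bad <- spell_check("path/to/mypkg", ignore = c("devtools", "Rd"), dict = "en_US")
print(bad)
#>  WORD         FOUND IN
#> anlysis      mypkg-package.Rd:12, description:1
#> paramter     fit_model.Rd:8
```

Each entry lists the file and line numbers where the misspelled word occurs, which is why `spell_check_text()` collapses the matching line numbers into a comma-separated string per word.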
devtools/R/doctor.R0000644000176200001440000000762212740754326013740 0ustar liggesusers# Suppress R CMD check note #' @importFrom memoise memoise NULL .rstudio_release <- function() { url <- "http://s3.amazonaws.com/rstudio-server/current.ver" numeric_version(readLines(url, warn = FALSE)) } rstudio_release <- memoise::memoise(.rstudio_release) .r_release <- function() { check_suggested("rversions") R_system_version(rversions::r_release()$version) } r_release <- memoise::memoise(.r_release) #' Diagnose potential devtools issues #' #' This checks to make sure you're using the latest release of R, #' the released version of RStudio (if you're using it as your GUI), #' and the latest version of devtools and its dependencies. #' #' @family doctors #' @export #' @examples #' \dontrun{ #' dr_devtools() #' } dr_devtools <- function() { msg <- character() if (getRversion() < r_release()) { msg[["R"]] <- paste0( "* R is out of date (", getRversion(), " vs ", r_release(), ")" ) } deps <- package_deps("devtools", dependencies = NA) old <- deps$diff < 0 if (any(old)) { msg[["devtools"]] <- paste0( "* Devtools or dependencies out of date: \n", paste(deps$package[old], collapse = ", ") ) } if (rstudioapi::isAvailable()) { rel <- rstudio_release() cur <- rstudioapi::getVersion() if (cur < rel) { msg[["rstudio"]] <- paste0( "* RStudio is out of date (", cur, " vs ", rel, ")" ) } } doctor("devtools", msg) } #' Diagnose potential GitHub issues #' #' @param path Path to repository to check. 
Defaults to current working #' directory #' @family doctors #' @export #' @examples #' \donttest{ #' dr_github() #' } dr_github <- function(path = ".") { if (!uses_git(path)) { return(doctor("github", "Path is not a git repository")) } if (!uses_github(path)) { return(doctor("github", "Path is not a GitHub repository")) } msg <- character() r <- git2r::repository(path, discover = TRUE) config <- git2r::config(r) config_names <- names(modifyList(config$global, config$local)) if (!uses_github(path)) msg[["github"]] <- " * cannot detect that this repo is connected to GitHub" if (!("user.name" %in% config_names)) msg[["name"]] <- "* user.name config option not set" if (!("user.email" %in% config_names)) msg[["user"]] <- "* user.email config option not set" if (!file.exists("~/.ssh/id_rsa")) msg[["ssh"]] <- "* SSH private key not found" if (identical(Sys.getenv("GITHUB_PAT"), "")) msg[["PAT"]] <- paste("* GITHUB_PAT environment variable not set", "(this is not necessary unless you want to install private repos", "or connect local repos to GitHub)") desc_path <- file.path(path, "DESCRIPTION") desc <- read_dcf(desc_path) field_empty <- function(d, f) is.null(d[[f]]) || identical(d[[f]], "") field_no_re <- function(d, f, re) !grepl(re, d[[f]]) re <- "https://github.com/(.*?)/(.*)" if (field_empty(desc, "URL")) { msg[["URL_empty"]] <-"* empty URL field in DESCRIPTION" } else if (field_no_re(desc, "URL", re)) { msg[["URL"]] <-"* no GitHub repo link in URL field in DESCRIPTION" } re <- paste0(re, "/issues") if (field_empty(desc, "BugReports")) { msg[["BugReports_empty"]] <-"* empty BugReports field in DESCRIPTION" } else if (field_no_re(desc, "BugReports", re)) { msg[["BugReports"]] <-"* no GitHub Issues link in URL field in DESCRIPTION" } doctor("github", msg) } # Doctor class ------------------------------------------------------------ doctor <- function(name, messages) { structure( length(messages) == 0, doctor = paste0("DR_", toupper(name)), messages = messages, class = 
"doctor" ) } #' @export print.doctor <- function(x, ...) { if (x) { message(attr(x, "doctor"), " SAYS YOU LOOK HEALTHY") return() } warning(attr(x, "doctor"), " FOUND PROBLEMS", call. = FALSE, immediate. = TRUE) messages <- strwrap(attr(x, "messages"), exdent = 2) message(paste(messages, collapse = "\n")) } devtools/R/dev-help.r0000644000176200001440000001460713200623655014213 0ustar liggesusers#' Read the in-development help for a package loaded with devtools. #' #' Note that this only renders a single documentation file, so that links #' to other files within the package won't work. #' #' @param topic name of help to search for. #' @param stage at which stage ("build", "install", or "render") should #' \\Sexpr macros be executed? This is only important if you're using #' \\Sexpr macro's in your Rd files. #' @param type of html to produce: \code{"html"} or \code{"text"}. Defaults to #' your default documentation type. #' @export #' @examples #' \dontrun{ #' library("ggplot2") #' help("ggplot") # loads installed documentation for ggplot #' #' load_all("ggplot2") #' dev_help("ggplot") # loads development documentation for ggplot #' } dev_help <- function(topic, stage = "render", type = getOption("help_type")) { message("Using development documentation for ", topic) path <- find_topic(topic) if (is.null(path)) { dev <- paste(dev_packages(), collapse = ", ") stop("Could not find topic ", topic, " in: ", dev) } pkg <- basename(names(path)[1]) path <- normalizePath(path, winslash = "/") if (rstudioapi::hasFun("previewRd")) { rstudioapi::callFun("previewRd", path) } else { view_rd(path, pkg, stage = stage, type = type) } } view_rd <- function(path, package, stage = "render", type = getOption("help_type")) { if (is.null(type)) type <- "text" type <- match.arg(type, c("text", "html")) out_path <- paste(tempfile("Rtxt"), type, sep = ".") if (type == "text") { tools::Rd2txt(path, out = out_path, package = package, stages = stage) file.show(out_path, title = paste(package, 
basename(path), sep = ":")) } else if (type == "html") { tools::Rd2HTML(path, out = out_path, package = package, stages = stage, no_links = TRUE) css_path <- file.path(tempdir(), "R.css") if (!file.exists(css_path)) { file.copy(file.path(R.home("doc"), "html", "R.css"), css_path) } utils::browseURL(out_path) } } #' Drop-in replacements for help and ? functions #' #' The \code{?} and \code{help} functions are replacements for functions of the #' same name in the utils package. They are made available when a package is #' loaded with \code{\link{load_all}}. #' #' The \code{?} function is a replacement for \code{\link[utils]{?}} from the #' utils package. It will search for help in devtools-loaded packages first, #' then in regular packages. #' #' The \code{help} function is a replacement for \code{\link[utils]{help}} from #' the utils package. If \code{package} is not specified, it will search for #' help in devtools-loaded packages first, then in regular packages. If #' \code{package} is specified, then it will search for help in devtools-loaded #' packages or regular packages, as appropriate. #' #' @inheritParams utils::help utils::`?` #' @param topic A name or character string specifying the help topic. #' @param package A name or character string specifying the package in which #' to search for the help topic. If NULL, search all packages. #' @param e1 First argument to pass along to \code{utils::`?`}. #' @param e2 Second argument to pass along to \code{utils::`?`}. #' @param ... Additional arguments to pass to \code{\link[utils]{help}}. #' #' @rdname help #' @name help #' @usage # help(topic, package = NULL, ...) #' #' @examples #' \dontrun{ #' # This would load devtools and look at the help for load_all, if currently #' # in the devtools source directory. 
#' load_all() #' ?load_all #' help("load_all") #' } #' #' # To see the help pages for utils::help and utils::`?`: #' help("help", "utils") #' help("?", "utils") #' #' \dontrun{ #' # Examples demonstrating the multiple ways of supplying arguments #' # NB: you can't do pkg <- "ggplot2"; help("ggplot2", pkg) #' help(lm) #' help(lm, stats) #' help(lm, 'stats') #' help('lm') #' help('lm', stats) #' help('lm', 'stats') #' help(package = stats) #' help(package = 'stats') #' topic <- "lm" #' help(topic) #' help(topic, stats) #' help(topic, 'stats') #' } shim_help <- function(topic, package = NULL, ...) { # Reproduce help's NSE for topic - try to eval it and see if it's a string topic_name <- substitute(topic) is_char <- FALSE try(is_char <- is.character(topic) && length(topic) == 1L, silent = TRUE) if (is_char) { topic_str <- topic topic_name <- as.name(topic) } else if (missing(topic_name)) { # Leave the vars missing } else if (is.null(topic_name)) { topic_str <- NULL topic_name <- NULL } else { topic_str <- deparse(substitute(topic)) } # help's NSE for package is slightly simpler package_name <- substitute(package) if (is.name(package_name)) { package_str <- as.character(package_name) } else if (is.null(package_name)) { package_str <- NULL } else { package_str <- package package_name <- as.name(package) } use_dev <- (!is.null(package_str) && package_str %in% dev_packages()) || (is.null(package_str) && !is.null(find_topic(topic_str))) if (use_dev) { dev_help(topic_str) } else { # This is similar to list(), except that one of the args is a missing var, # it will replace it with an empty symbol instead of trying to evaluate it. 
as_list <- function(..., .env = parent.frame()) { dots <- match.call(expand.dots = FALSE)$`...` lapply(dots, function(var) { is_missing <- eval(substitute(missing(x), list(x = var)), .env) if (is_missing) { quote(expr=) } else { eval(var, .env) } }) } call <- substitute( utils::help(topic, package, ...), as_list(topic = topic_name, package = package_name) ) eval(call) } } #' @usage #' # ?e2 #' # e1?e2 #' #' @rdname help #' @name ? shim_question <- function(e1, e2) { # Get string version of e1, for find_topic e1_expr <- substitute(e1) if (is.name(e1_expr)) { # Called with a bare symbol, like ?foo e1_str <- deparse(e1_expr) } else if (is.call(e1_expr)) { if (e1_expr[[1]] == "?") { # Double question mark, like ??foo e1_str <- NULL } else { # Called with function arguments, like ?foo(12) e1_str <- deparse(e1_expr[[1]]) } } else { # If we got here, it's probably a string e1_str <- e1 } # Search for the topic in devtools-loaded packages. # If not found, call utils::`?`. if (!is.null(find_topic(e1_str))) { dev_help(e1_str) } else { eval(as.call(list(utils::`?`, substitute(e1), substitute(e2)))) } } devtools/R/wd.r0000644000176200001440000000100012416621515013101 0ustar liggesusers#' Set working directory. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @param path path within package. Leave empty to change working directory #' to package directory. #' @export wd <- function(pkg = ".", path = "") { pkg <- as.package(pkg) path <- file.path(pkg$path, path) if (!file.exists(path)) { stop(path, " does not exist", call. = FALSE) } message("Changing working directory to ", path) setwd(path) } devtools/R/infrastructure-git.R0000644000176200001440000002070413200623655016303 0ustar liggesusers#' Initialise a git repository. #' #' @param message Message to use for first commit. #' @param pkg Path to package. See \code{\link{as.package}} for more #' information. 
#' @family git infrastructure #' @export #' @examples #' \dontrun{use_git()} use_git <- function(message = "Initial commit", pkg = ".") { use_git_with_config(message = message, pkg = pkg) } use_git_with_config <- function(message, pkg, add_user_config = FALSE, quiet = FALSE) { pkg <- as.package(pkg) if (uses_git(pkg$path)) { message("* Git is already initialized") return(invisible()) } if (!quiet) { message("* Initialising repo") } r <- git2r::init(pkg$path) if (add_user_config) { git2r::config(r, global = FALSE, user.name = "user", user.email = "user@email.xx") } use_git_ignore(c(".Rproj.user", ".Rhistory", ".RData"), pkg = pkg, quiet = quiet) if (!quiet) { message("* Adding files and committing") } paths <- unlist(git2r::status(r)) git2r::add(r, paths) git2r::commit(r, message) invisible() } #' Connect a local repo with GitHub. #' #' If the current repo does not use git, calls \code{\link{use_git}} #' automatically. \code{\link{use_github_links}} is called to populate the #' \code{URL} and \code{BugReports} fields of DESCRIPTION. #' #' @section Authentication: #' #' A new GitHub repo will be created via the GitHub API, therefore you must #' provide a GitHub personal access token (PAT) via the argument #' \code{auth_token}, which defaults to the value of the \code{GITHUB_PAT} #' environment variable. Obtain a PAT from #' \url{https://github.com/settings/tokens}. The "repo" scope is required, #' which is one of the default scopes for a new PAT. #' #' The argument \code{protocol} reflects how you wish to authenticate with #' GitHub for this repo in the long run. For either \code{protocol}, a remote #' named "origin" is created, an initial push is made using the specified #' \code{protocol}, and a remote tracking branch is set. The URL of the #' "origin" remote has the form \code{git@@github.com:<username>/<repo>.git} #' (\code{protocol = "ssh"}, the default) or #' \code{https://github.com/<username>/<repo>.git} (\code{protocol = 
For \code{protocol = "ssh"}, it is assumed that public and #' private keys are in the default locations, \code{~/.ssh/id_rsa.pub} and #' \code{~/.ssh/id_rsa}, respectively, and that \code{ssh-agent} is configured #' to manage any associated passphrase. Alternatively, specify a #' \code{\link[git2r]{cred_ssh_key}} object via the \code{credentials} #' parameter. #' #' @inheritParams use_git #' @param auth_token Provide a personal access token (PAT) from #' \url{https://github.com/settings/tokens}. Defaults to the \code{GITHUB_PAT} #' environment variable. #' @param private If \code{TRUE}, creates a private repository. #' @param host GitHub API host to use. Override with the endpoint-root for your #' GitHub enterprise instance, for example, #' "https://github.hostname.com/api/v3". #' @param protocol transfer protocol, either "ssh" (the default) or "https" #' @param credentials A \code{\link[git2r]{cred_ssh_key}} specifying specific #' ssh credentials or NULL for default ssh key and ssh-agent behaviour. #' Default is NULL. 
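#'   For example (illustrative key paths; any
#'   \code{\link[git2r]{cred_ssh_key}} object will do):
#'   \code{use_github(credentials = git2r::cred_ssh_key("~/.ssh/other_rsa.pub",
#'   "~/.ssh/other_rsa"))}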
#' @family git infrastructure #' @export #' @examples #' \dontrun{ #' ## to use default ssh protocol #' create("testpkg") #' use_github(pkg = "testpkg") #' #' ## or use https #' create("testpkg2") #' use_github(pkg = "testpkg2", protocol = "https") #' } use_github <- function(auth_token = github_pat(), private = FALSE, pkg = ".", host = "https://api.github.com", protocol = c("ssh", "https"), credentials = NULL) { if (is.null(auth_token)) { stop("GITHUB_PAT required to create new repo") } protocol <- match.arg(protocol) pkg <- as.package(pkg) use_git(pkg = pkg) if (uses_github(pkg$path)) { message("* GitHub is already initialized") return(invisible()) } message("* Checking title and description") message(" Title: ", pkg$title) message(" Description: ", pkg$description) if (yesno("Are title and description ok?")) { return(invisible()) } message("* Creating GitHub repository") create <- github_POST( "user/repos", pat = auth_token, body = list( name = jsonlite::unbox(pkg$package), description = jsonlite::unbox(gsub("\n", " ", pkg$title)), private = jsonlite::unbox(private) ), host = host ) message("* Adding GitHub remote") r <- git2r::repository(pkg$path) origin_url <- switch(protocol, https = create$clone_url, ssh = create$ssh_url) git2r::remote_add(r, "origin", origin_url) message("* Adding GitHub links to DESCRIPTION") use_github_links(pkg$path, auth_token = auth_token, host = host) if (git_uncommitted(pkg$path)) { git2r::add(r, "DESCRIPTION") git2r::commit(r, "Add GitHub links to DESCRIPTION") } message("* Pushing to GitHub and setting remote tracking branch") if (protocol == "ssh") { ## [1] push via ssh required for success setting remote tracking branch ## [2] to get passphrase from ssh-agent, you must use NULL credentials git2r::push(r, "origin", "refs/heads/master", credentials = credentials) } else { ## protocol == "https" ## in https case, when GITHUB_PAT is passed as password, ## the username is immaterial, but git2r doesn't know that cred <- 
git2r::cred_user_pass("EMAIL", auth_token) git2r::push(r, "origin", "refs/heads/master", credentials = cred) } git2r::branch_set_upstream(git2r::head(r), "origin/master") message("* View repo at ", create$html_url) invisible(NULL) } #' Add a git hook. #' #' @param hook Hook name. One of "pre-commit", "prepare-commit-msg", #' "commit-msg", "post-commit", "applypatch-msg", "pre-applypatch", #' "post-applypatch", "pre-rebase", "post-rewrite", "post-checkout", #' "post-merge", "pre-push", "pre-auto-gc". #' @param script Text of script to run #' @inheritParams use_git #' @export #' @family git infrastructure #' @keywords internal use_git_hook <- function(hook, script, pkg = ".") { pkg <- as.package(pkg) git_dir <- file.path(pkg$path, ".git") if (!file.exists(git_dir)) { stop("This project doesn't use git", call. = FALSE) } hook_dir <- file.path(git_dir, "hooks") if (!file.exists(hook_dir)) { dir.create(hook_dir) } hook_path <- file.path(hook_dir, hook) writeLines(script, hook_path) Sys.chmod(hook_path, "0744") } use_git_ignore <- function(ignores, directory = ".", pkg = ".", quiet = FALSE) { pkg <- as.package(pkg) paths <- paste0("`", ignores, "`", collapse = ", ") if (!quiet) { message("* Adding ", paths, " to ", file.path(directory, ".gitignore")) } path <- file.path(pkg$path, directory, ".gitignore") union_write(path, ignores) invisible(TRUE) } #' Add GitHub links to DESCRIPTION. #' #' Populates the URL and BugReports fields of DESCRIPTION with #' \code{https://github.com//} AND #' \code{https://github.com///issues}, respectively, unless #' those fields already exist. #' #' @inheritParams use_git #' @param auth_token Provide a personal access token (PAT) from #' \url{https://github.com/settings/tokens}. Defaults to the \code{GITHUB_PAT} #' environment variable. #' @param host GitHub API host to use. Override with the endpoint-root for your #' GitHub enterprise instance, for example, #' "https://github.hostname.com/api/v3". 
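#'
#' @examples
#' \dontrun{
#' ## Illustrative: fill in URL and BugReports for the package in the
#' ## current directory, using the PAT from the GITHUB_PAT env var
#' use_github_links(".")
#' }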
#' @family git infrastructure #' @keywords internal #' @export use_github_links <- function(pkg = ".", auth_token = github_pat(), host = "https://api.github.com") { if (!uses_github(pkg)) { stop("Cannot detect that package already uses GitHub.\n", "You might want to run use_github().") } gh_info <- github_info(pkg) pkg <- as.package(pkg) desc_path <- file.path(pkg$path, "DESCRIPTION") desc <- new_desc <- read_dcf(desc_path) path_to_repo <- paste("repos", gh_info$fullname, sep = "/") res <- github_GET(path = path_to_repo, pat = auth_token, host = host) github_URL <- res$html_url fill <- function(d, f, filler) { if (is.null(d[[f]]) || identical(d[[f]], "")) { d[[f]] <- filler } else { message("Existing ", f, " field found and preserved") } d } new_desc <- fill(new_desc, "URL", github_URL) new_desc <- fill(new_desc, "BugReports", file.path(github_URL, "issues")) if (!identical(desc, new_desc)) write_dcf(desc_path, new_desc) new_desc[c("URL", "BugReports")] } devtools/R/clean.r0000644000176200001440000000320013200623655013554 0ustar liggesusers#' Sources an R file in a clean environment. #' #' Opens up a fresh R environment and sources file, ensuring that it works #' independently of the current working environment. #' #' @param path path to R script #' @param quiet If \code{FALSE}, the default, all input and output will be #' displayed, as if you'd copied and paste the code. If \code{TRUE} #' only the final result and the any explicitly printed output will be #' displayed. #' @export clean_source <- function(path, quiet = FALSE) { stopifnot(file.exists(path)) opts <- paste("--quiet --file=", shQuote(path), sep = "") if (quiet) opts <- paste(opts, "--slave") R(opts, dirname(path)) } #' Evaluate code in a clean R session. #' #' @export #' @param expr an R expression to evaluate. For \code{eval_clean} this should #' already be quoted. For \code{evalq_clean} it will be quoted for you. 
#' @param quiet if \code{TRUE}, the default, only the final result and the #' any explicitly printed output will be displayed. If \code{FALSE}, all #' input and output will be displayed, as if you'd copied and paste the code. #' @return An invisible \code{TRUE} on success. #' @examples #' x <- 1 #' y <- 2 #' ls() #' evalq_clean(ls()) #' evalq_clean(ls(), FALSE) #' eval_clean(quote({ #' z <- 1 #' ls() #' })) eval_clean <- function(expr, quiet = TRUE) { stopifnot(is.language(expr)) tmp <- tempfile() on.exit(unlink(tmp)) text <- deparse(expr) writeLines(text, tmp) suppressMessages(clean_source(tmp, quiet = quiet)) invisible(TRUE) } #' @export #' @rdname eval_clean evalq_clean <- function(expr, quiet = TRUE) { eval_clean(substitute(expr), quiet = quiet) } devtools/R/package.r0000644000176200001440000000521412724305435014077 0ustar liggesusers#' Coerce input to a package. #' #' Possible specifications of package: #' \itemize{ #' \item path #' \item package object #' } #' @param x object to coerce to a package #' @param create only relevant if a package structure does not exist yet: if #' \code{TRUE}, create a package structure; if \code{NA}, ask the user #' (in interactive mode only) #' @export #' @keywords internal as.package <- function(x = NULL, create = NA) { if (is.package(x)) return(x) x <- package_file(path = x) load_pkg_description(x, create = create) } #' Find file in a package. #' #' It always starts by finding by walking up the path until it finds the #' root directory, i.e. a directory containing \code{DESCRIPTION}. If it #' cannot find the root directory, or it can't find the specified path, it #' will throw an error. #' #' @param ... Components of the path. #' @param path Place to start search for package directory. #' @export #' @examples #' \dontrun{ #' package_file("figures", "figure_1") #' } package_file <- function(..., path = ".") { if (!is.character(path) || length(path) != 1) { stop("`path` must be a string.", call. 
= FALSE) } path <- strip_slashes(normalizePath(path, mustWork = FALSE)) if (!file.exists(path)) { stop("Can't find '", path, "'.", call. = FALSE) } if (!file.info(path)$isdir) { stop("'", path, "' is not a directory.", call. = FALSE) } # Walk up to root directory while (!has_description(path)) { path <- dirname(path) if (is_root(path)) { stop("Could not find package root.", call. = FALSE) } } file.path(path, ...) } has_description <- function(path) { file.exists(file.path(path, 'DESCRIPTION')) } is_root <- function(path) { identical(path, dirname(path)) } strip_slashes <- function(x) { x <- sub("/*$", "", x) x } # Load package DESCRIPTION into convenient form. load_pkg_description <- function(path, create) { path_desc <- file.path(path, "DESCRIPTION") if (!file.exists(path_desc)) { if (is.na(create)) { if (interactive()) { message("No package infrastructure found in ", path, ". Create it?") create <- (menu(c("Yes", "No")) == 1) } else { create <- FALSE } } if (create) { setup(path = path) } else { stop("No description at ", path_desc, call. = FALSE) } } desc <- as.list(read.dcf(path_desc)[1, ]) names(desc) <- tolower(names(desc)) desc$path <- path structure(desc, class = "package") } #' Is the object a package? #' #' @keywords internal #' @export is.package <- function(x) inherits(x, "package") # Mockable variant of interactive interactive <- function() .Primitive("interactive")() devtools/R/with-debug.r0000644000176200001440000000401213200623655014533 0ustar liggesusers#' Temporarily set debugging compilation flags. #' #' @param code to execute. #' @param CFLAGS flags for compiling C code #' @param CXXFLAGS flags for compiling C++ code #' @param FFLAGS flags for compiling Fortran code. #' @param FCFLAGS flags for Fortran 9x code. 
#' @inheritParams withr::with_envvar
#' @inheritParams compiler_flags
#' @family debugging flags
#' @export
#' @examples
#' flags <- names(compiler_flags(TRUE))
#' with_debug(Sys.getenv(flags))
#'
#' \dontrun{
#' install("mypkg")
#' with_debug(install("mypkg"))
#' }
with_debug <- function(code,
                       CFLAGS = NULL, CXXFLAGS = NULL,
                       FFLAGS = NULL, FCFLAGS = NULL,
                       debug = TRUE) {
  defaults <- compiler_flags(debug = debug)

  flags <- c(
    CFLAGS = CFLAGS,
    CXXFLAGS = CXXFLAGS,
    FFLAGS = FFLAGS,
    FCFLAGS = FCFLAGS
  )
  flags <- unlist(modifyList(as.list(defaults), as.list(flags)))

  withr::with_makevars(flags, code)
}

#' Default compiler flags used by devtools.
#'
#' These default flags enforce good coding practice by ensuring that
#' \env{CFLAGS} and \env{CXXFLAGS} are set to \code{-Wall -pedantic}.
#' These checks are run by CRAN and are generally considered to be good
#' practice.
#'
#' By default \code{\link{compile_dll}} is run with \code{compiler_flags(TRUE)},
#' and check with \code{compiler_flags(FALSE)}. If you want to avoid the
#' possible performance penalty from the debug flags, install the package.
#'
#' @param debug If \code{TRUE}, adds \code{-g -O0} to all flags
#'   (also setting \env{FFLAGS} and \env{FCFLAGS}).
#' @family debugging flags
#' @export
#' @examples
#' compiler_flags()
#' compiler_flags(TRUE)
compiler_flags <- function(debug = FALSE) {
  if (Sys.info()[["sysname"]] == "SunOS") {
    c(
      CFLAGS = "-g",
      CXXFLAGS = "-g"
    )
  } else if (debug) {
    c(
      CFLAGS = "-UNDEBUG -Wall -pedantic -g -O0",
      CXXFLAGS = "-UNDEBUG -Wall -pedantic -g -O0",
      FFLAGS = "-g -O0",
      FCFLAGS = "-g -O0"
    )
  } else {
    c(
      CFLAGS = "-Wall -pedantic",
      CXXFLAGS = "-Wall -pedantic"
    )
  }
}
devtools/R/install-svn.r0000644000176200001440000001120013200623655014743 0ustar liggesusers
#' Install a package from a SVN repository
#'
#' This function requires \code{svn} to be installed on your system in order to
#' be used.
#'
#' It is vectorised so you can install multiple packages with
#' a single command.
#'
#' @inheritParams install_git
#' @param subdir A sub-directory within a svn repository that may contain the
#'   package we are interested in installing. By default, this
#'   points to the 'trunk' directory.
#' @param args A character vector providing extra arguments to pass on to
#'   svn.
#' @param revision svn revision; if omitted, updates to the latest revision.
#' @param branch Name of branch or tag to use, if not trunk.
#' @param ... Other arguments passed on to \code{\link{install}}
#' @export
#' @family package installation
#' @examples
#' \dontrun{
#' install_svn("https://github.com/hadley/stringr")
#' install_svn("https://github.com/hadley/httr", branch = "oauth")
#' }
install_svn <- function(url, subdir = NULL, branch = NULL,
                        args = character(0), ..., revision = NULL,
                        quiet = FALSE) {
  remotes <- lapply(url, svn_remote, svn_subdir = subdir, branch = branch,
    revision = revision, args = args)

  install_remotes(remotes, ..., quiet = quiet)
}

svn_remote <- function(url, svn_subdir = NULL, branch = NULL, revision = NULL,
                       args = character(0)) {
  remote("svn",
    url = url,
    svn_subdir = svn_subdir,
    branch = branch,
    revision = revision,
    args = args
  )
}

#' @export
remote_download.svn_remote <- function(x, quiet = FALSE) {
  if (!quiet) {
    message("Downloading svn repo ", x$url)
  }

  bundle <- tempfile()
  svn_binary_path <- svn_path()
  args <- c("co", x$args, full_svn_url(x), bundle)
  message(shQuote(svn_binary_path), " ", paste0(args, collapse = " "))
  request <- system2(svn_binary_path, args, stdout = FALSE, stderr = FALSE)

  # system2() returns the exit status, so anything above 0 is an error
  if (request > 0) {
    stop("There seems to be a problem retrieving this SVN-URL.", call. = FALSE)
  }

  withr::with_dir(bundle, {
    if (!is.null(x$revision)) {
      request <- system2(svn_binary_path, paste("update -r", x$revision))
      if (request > 0) {
        stop("There was a problem switching to the requested SVN revision", call.
= FALSE) } } }) bundle } #' @export remote_metadata.svn_remote <- function(x, bundle = NULL, source = NULL) { if (!is.null(bundle)) { withr::with_dir(bundle, { revision <- svn_revision() }) } else { revision <- NULL } list( RemoteType = "svn", RemoteUrl = x$url, RemoteSvnSubdir = x$svn_subdir, RemoteBranch = x$branch, RemoteArgs = if (length(x$args) > 0) paste0(deparse(x$args), collapse = " "), RemoteRevision = revision, RemoteSha = revision # for compatibility with other remotes ) } svn_path <- function(svn_binary_name = NULL) { # Use user supplied path if (!is.null(svn_binary_name)) { if (!file.exists(svn_binary_name)) { stop("Path ", svn_binary_name, " does not exist", .call = FALSE) } return(svn_binary_name) } # Look on path svn_path <- Sys.which("svn")[[1]] if (svn_path != "") return(svn_path) # On Windows, look in common locations if (.Platform$OS.type == "windows") { look_in <- c( "C:/Program Files/Svn/bin/svn.exe", "C:/Program Files (x86)/Svn/bin/svn.exe" ) found <- file.exists(look_in) if (any(found)) return(look_in[found][1]) } stop("SVN does not seem to be installed on your system.", call. = FALSE) } #' @export remote_package_name.svn_remote <- function(remote, ...) { description_url <- file.path(full_svn_url(remote), "DESCRIPTION") tmp_file <- tempfile() on.exit(rm(tmp_file)) response <- system2(svn_path(), paste("cat", description_url), stdout = tmp_file) if (!identical(response, 0L)) { stop("There was a problem retrieving the current SVN revision", call. = FALSE) } read_dcf(tmp_file)$Package } #' @export remote_sha.svn_remote <- function(remote, ...) { svn_revision(full_svn_url(remote)) } svn_revision <- function(url = NULL, svn_binary_path = svn_path()) { request <- system2(svn_binary_path, paste("info --xml", url), stdout = TRUE) if (!is.null(attr(request, "status")) && !identical(attr(request, "status"), 0L)) { stop("There was a problem retrieving the current SVN revision", call. 
= FALSE)
  }

  # Extract the revision attribute from the <commit revision="..."> element
  # in the XML printed by `svn info --xml`
  gsub(".*<commit[[:space:]]+revision=\"([0-9]+)\".*", "\\1",
    paste(collapse = "\n", request))
}

full_svn_url <- function(x) {
  if (!is.null(x$branch)) {
    url <- file.path(x$url, "branches", x$branch)
  } else {
    url <- file.path(x$url, "trunk")
  }
  if (!is.null(x$svn_subdir)) {
    url <- file.path(url, x$svn_subdir)
  }

  url
}

format.svn_remote <- function(x, ...) {
  "SVN"
}
devtools/R/deps.R0000755000176200001440000003216413200623655013401 0ustar liggesusers
#' Find all dependencies of a CRAN or dev package.
#'
#' Find all the dependencies of a package and determine whether they are ahead
#' or behind CRAN. A \code{print()} method identifies mismatches (if any)
#' between local and CRAN versions of each dependent package; an
#' \code{update()} method installs outdated or missing packages from CRAN.
#'
#' @param pkg A character vector of package names. If missing, defaults to
#'   the name of the package in the current directory.
#' @param dependencies Which dependencies do you want to check?
#'   Can be a character vector (selecting from "Depends", "Imports",
#'   "LinkingTo", "Suggests", or "Enhances"), or a logical vector.
#'
#'   \code{TRUE} is shorthand for "Depends", "Imports", "LinkingTo" and
#'   "Suggests". \code{NA} is shorthand for "Depends", "Imports" and "LinkingTo"
#'   and is the default. \code{FALSE} is shorthand for no dependencies (i.e.
#'   just check this package, not its dependencies).
#' @param quiet If \code{TRUE}, suppress output.
#' @param upgrade If \code{TRUE}, also upgrade any out-of-date dependencies.
#' @param repos A character vector giving repositories to use.
#' @param type Type of package to \code{update}. If "both", will switch
#'   automatically to "binary" to avoid interactive prompts during package
#'   installation.
#'
#' @param object A \code{package_deps} object.
#' @param bioconductor Install Bioconductor dependencies if the package has a
#'   BiocViews field in the DESCRIPTION.
#' @param ... Additional arguments passed to \code{install_packages}.
#' #' @return #' #' A \code{data.frame} with columns: #' #' \tabular{ll}{ #' \code{package} \tab The dependent package's name,\cr #' \code{installed} \tab The currently installed version,\cr #' \code{available} \tab The version available on CRAN,\cr #' \code{diff} \tab An integer denoting whether the locally installed version #' of the package is newer (1), the same (0) or older (-1) than the version #' currently available on CRAN.\cr #' } #' #' @export #' @examples #' \dontrun{ #' package_deps("devtools") #' # Use update to update any out-of-date dependencies #' update(package_deps("devtools")) #' } package_deps <- function(pkg, dependencies = NA, repos = getOption("repos"), type = getOption("pkgType")) { if (identical(type, "both")) { type <- "binary" } if (length(repos) == 0) repos <- character() repos[repos == "@CRAN@"] <- cran_mirror() if (missing(pkg)) { pkg <- as.package(".")$package } # It is important to not extract available_packages() to a variable, # for the case when pkg is empty (e.g., install(dependencies = FALSE) ). 
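  # Illustrative result shape (values depend on your library and CRAN):
  #   package_deps("devtools")
  # returns a data frame with columns package/installed/available/diff,
  # where diff uses the UNINSTALLED/BEHIND/CURRENT/AHEAD/UNAVAILABLE codes
  # defined later in this file.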
deps <- sort_ci(find_deps(pkg, available_packages(repos, type), top_dep = dependencies)) # Remove base packages inst <- installed.packages() base <- unname(inst[inst[, "Priority"] %in% c("base", "recommended"), "Package"]) deps <- setdiff(deps, base) # get remote types remote <- structure(lapply(deps, package2remote, repos = repos, type = type), class = "remotes") inst_ver <- vapply(deps, local_sha, character(1)) cran_ver <- vapply(remote, remote_sha, character(1)) cran_remote <- vapply(remote, inherits, logical(1), "cran_remote") diff <- compare_versions(inst_ver, cran_ver, cran_remote) res <- structure( data.frame( package = deps, installed = inst_ver, available = cran_ver, diff = diff, stringsAsFactors = FALSE ), class = c("package_deps", "data.frame") ) res$remote <- remote res } #' @export #' @rdname package_deps dev_package_deps <- function(pkg = ".", dependencies = NA, repos = getOption("repos"), type = getOption("pkgType"), bioconductor = TRUE) { pkg <- as.package(pkg) repos <- c(repos, parse_additional_repositories(pkg)) dependencies <- tolower(standardise_dep(dependencies)) dependencies <- intersect(dependencies, names(pkg)) parsed <- lapply(pkg[tolower(dependencies)], parse_deps) deps <- unlist(lapply(parsed, `[[`, "name"), use.names = FALSE) if (isTRUE(bioconductor) && is_bioconductor(pkg)) { check_bioconductor() bioc_repos <- BiocInstaller::biocinstallRepos() missing_repos <- setdiff(names(bioc_repos), names(repos)) if (length(missing_repos) > 0) repos[missing_repos] <- bioc_repos[missing_repos] } res <- filter_duplicate_deps( package_deps(deps, repos = repos, type = type), # We set this cache in install() so we can run install_deps() twice without # having to re-query the remotes installing$remote_deps %||% remote_deps(pkg)) # Only keep dependencies we actually want to use res[res$package %in% deps, ] } filter_duplicate_deps <- function(cran_deps, remote_deps, dependencies) { deps <- rbind(cran_deps, remote_deps) # Only keep the remotes that are 
specified in the cran_deps # Keep only the Non-CRAN remotes if there are duplicates as we want to install # the development version rather than the CRAN version. The remotes will # always be specified after the CRAN dependencies, so using fromLast will # filter out the CRAN dependencies. deps[!duplicated(deps$package, fromLast = TRUE), ] } ## -2 = not installed, but available on CRAN ## -1 = installed, but out of date ## 0 = installed, most recent version ## 1 = installed, version ahead of CRAN ## 2 = package not on CRAN UNINSTALLED <- -2L BEHIND <- -1L CURRENT <- 0L AHEAD <- 1L UNAVAILABLE <- 2L compare_versions <- function(inst, remote, is_cran) { stopifnot(length(inst) == length(remote) && length(inst) == length(is_cran)) compare_var <- function(i, c, cran) { if (!cran) { if (identical(i, c)) { return(CURRENT) } else { return(BEHIND) } } if (is.na(c)) return(UNAVAILABLE) # not on CRAN if (is.na(i)) return(UNINSTALLED) # not installed, but on CRAN i <- package_version(i) c <- package_version(c) if (i < c) { BEHIND # out of date } else if (i > c) { AHEAD # ahead of CRAN } else { CURRENT # most recent CRAN version } } vapply(seq_along(inst), function(i) compare_var(inst[[i]], remote[[i]], is_cran[[i]]), integer(1)) } parse_one_remote <- function(x) { pieces <- strsplit(x, "::", fixed = TRUE)[[1]] if (length(pieces) == 1) { type <- "github" repo <- pieces } else if (length(pieces) == 2) { type <- pieces[1] repo <- pieces[2] } else { stop("Malformed remote specification '", x, "'", call. = FALSE) } fun <- tryCatch(get(paste0(tolower(type), "_remote"), envir = asNamespace("devtools"), mode = "function", inherits = FALSE), error = function(e) stop("Unknown remote type: ", type, call. 
= FALSE)) fun(repo) } split_remotes <- function(x) { trim_ws(unlist(strsplit(x, ",[[:space:]]*"))) } remote_deps <- function(pkg) { pkg <- as.package(pkg) if (!has_dev_remotes(pkg)) { return(NULL) } dev_packages <- split_remotes(pkg[["remotes"]]) remote <- lapply(dev_packages, parse_one_remote) package <- vapply(remote, remote_package_name, character(1), USE.NAMES = FALSE) installed <- vapply(package, local_sha, character(1), USE.NAMES = FALSE) available <- vapply(remote, remote_sha, character(1), USE.NAMES = FALSE) diff <- installed == available diff <- ifelse(!is.na(diff) & diff, CURRENT, BEHIND) res <- structure( data.frame( package = package, installed = installed, available = available, diff = diff, stringsAsFactors = FALSE ), class = c("package_deps", "data.frame")) res$remote <- structure(remote, class = "remotes") res } has_dev_remotes <- function(pkg) { pkg <- as.package(pkg) !is.null(pkg[["remotes"]]) } #' @export print.package_deps <- function(x, show_ok = FALSE, ...) { class(x) <- "data.frame" ahead <- x$diff > CURRENT behind <- x$diff < CURRENT same_ver <- x$diff == CURRENT x$diff <- NULL x[] <- lapply(x, format) if (any(behind)) { cat("Needs update -----------------------------\n") print(x[behind, , drop = FALSE], row.names = FALSE, right = FALSE) } if (any(ahead)) { cat("Not on CRAN ----------------------------\n") print(x[ahead, , drop = FALSE], row.names = FALSE, right = FALSE) } if (show_ok && any(same_ver)) { cat("OK ---------------------------------------\n") print(x[same_ver, , drop = FALSE], row.names = FALSE, right = FALSE) } } #' @export #' @rdname package_deps #' @importFrom stats update update.package_deps <- function(object, ..., quiet = FALSE, upgrade = TRUE) { non_cran <- !vapply(object$remote, inherits, logical(1), "cran_remote") unavailable <- object$diff == UNAVAILABLE & non_cran if (any(unavailable)) { if (upgrade) { install_remotes(object$remote[unavailable], ..., quiet = quiet) } else if (!quiet) { 
message(sprintf(ngettext(sum(unavailable), "Skipping %d unavailable package: %s", "Skipping %d unavailable packages: %s" ), sum(unavailable), paste(object$package[unavailable], collapse = ", "))) } } ahead <- object$diff == AHEAD & non_cran if (any(ahead)) { if (upgrade) { install_remotes(object$remote[ahead], ..., quiet = quiet) } else if (!quiet) { message(sprintf(ngettext(sum(ahead), "Skipping %d package ahead of CRAN: %s", "Skipping %d packages ahead of CRAN: %s" ), sum(ahead), paste(object$package[ahead], collapse = ", "))) } } if (upgrade) { behind <- object$diff < CURRENT } else { behind <- is.na(object$installed) } install_remotes(object$remote[behind], ..., quiet = quiet) } install_packages <- function(pkgs, repos = getOption("repos"), type = getOption("pkgType"), ..., dependencies = FALSE, quiet = NULL) { if (identical(type, "both")) type <- "binary" if (is.null(quiet)) quiet <- !identical(type, "source") message(sprintf(ngettext(length(pkgs), "Installing %d package: %s", "Installing %d packages: %s" ), length(pkgs), paste(pkgs, collapse = ", "))) # if type is 'source' and on windows add Rtools to the path this assumes # setup_rtools() has already run and set the rtools path if (type == "source" && !is.null(get_rtools_path())) { old <- add_path(get_rtools_path(), 0) on.exit(set_path(old)) } utils::install.packages(pkgs, repos = repos, type = type, dependencies = dependencies, quiet = quiet) } find_deps <- function(pkgs, available = available.packages(), top_dep = TRUE, rec_dep = NA, include_pkgs = TRUE) { if (length(pkgs) == 0 || identical(top_dep, FALSE)) return(character()) top_dep <- standardise_dep(top_dep) rec_dep <- standardise_dep(rec_dep) top <- tools::package_dependencies(pkgs, db = available, which = top_dep) top_flat <- unlist(top, use.names = FALSE) if (length(rec_dep) != 0 && length(top_flat) > 0) { rec <- tools::package_dependencies(top_flat, db = available, which = rec_dep, recursive = TRUE) rec_flat <- unlist(rec, use.names = FALSE) } else 
{ rec_flat <- character() } unique(c(if (include_pkgs) pkgs, top_flat, rec_flat)) } standardise_dep <- function(x) { if (identical(x, NA)) { c("Depends", "Imports", "LinkingTo") } else if (isTRUE(x)) { c("Depends", "Imports", "LinkingTo", "Suggests") } else if (identical(x, FALSE)) { character(0) } else if (is.character(x)) { x } else { stop("Dependencies must be a boolean or a character vector", call. = FALSE) } } #' Update packages that are missing or out-of-date. #' #' Works similarly to \code{install.packages()} but doesn't install packages #' that are already installed, and also upgrades out dated dependencies. #' #' @param pkgs Character vector of packages to update. IF \code{TRUE} all #' installed packages are updated. If \code{NULL} user is prompted to #' confirm update of all installed packages. #' @inheritParams package_deps #' @seealso \code{\link{package_deps}} to see which packages are out of date/ #' missing. #' @export #' @examples #' \dontrun{ #' update_packages("ggplot2") #' update_packages(c("plyr", "ggplot2")) #' } update_packages <- function(pkgs = NULL, dependencies = NA, repos = getOption("repos"), type = getOption("pkgType")) { if (isTRUE(pkgs)) { pkgs <- installed.packages()[, "Package"] } else if (is.null(pkgs)) { if (!yesno("Are you sure you want to update all installed packages?")) { pkgs <- installed.packages()[, "Package"] } else { return(invisible()) } } pkgs <- package_deps(pkgs, dependencies = dependencies, repos = repos, type = type) update(pkgs) } has_additional_repositories <- function(pkg) { pkg <- as.package(pkg) "additional_repositories" %in% names(pkg) } parse_additional_repositories <- function(pkg) { pkg <- as.package(pkg) if (has_additional_repositories(pkg)) { strsplit(pkg[["additional_repositories"]], "[,[:space:]]+")[[1]] } } #' @export `[.remotes` <- function(x,i,...) 
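# Subsetting preserves the "remotes" class and other attributes, so e.g.
# (illustrative) `package_deps("devtools")$remote[1:2]` is still a "remotes"
# object rather than being downgraded to a bare list.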
{ r <- NextMethod("[") mostattributes(r) <- attributes(x) r } devtools/R/github.R0000644000176200001440000000557712740754321013742 0ustar liggesusersgithub_auth <- function(token) { if (is.null(token)) { NULL } else { httr::authenticate(token, "x-oauth-basic", "basic") } } github_response <- function(req) { text <- httr::content(req, as = "text") parsed <- jsonlite::fromJSON(text, simplifyVector = FALSE) if (httr::status_code(req) >= 400) { stop(github_error(req)) } parsed } github_error <- function(req) { text <- httr::content(req, as = "text", encoding = "UTF-8") parsed <- tryCatch(jsonlite::fromJSON(text, simplifyVector = FALSE), error = function(e) { list(message = text) }) errors <- vapply(parsed$errors, `[[`, "message", FUN.VALUE = character(1)) structure( list( call = sys.call(-1), message = paste0(parsed$message, " (", httr::status_code(req), ")\n", if (length(errors) > 0) { paste("* ", errors, collapse = "\n") }) ), class = c("condition", "error", "github_error")) } github_GET <- function(path, ..., pat = github_pat(), host = "https://api.github.com") { url <- httr::parse_url(host) url$path <- paste(url$path, path, sep = "/") ## May remove line below at release of httr > 1.1.0 url$path <- gsub("^/", "", url$path) ## req <- httr::GET(url, github_auth(pat), ...) github_response(req) } github_POST <- function(path, body, ..., pat = github_pat(), host = "https://api.github.com") { url <- httr::parse_url(host) url$path <- paste(url$path, path, sep = "/") ## May remove line below at release of httr > 1.1.0 url$path <- gsub("^/", "", url$path) ## req <- httr::POST(url, body = body, github_auth(pat), encode = "json", ...) 
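  # Illustrative use (assumes a valid PAT, e.g. from the GITHUB_PAT env var):
  #   github_POST("user/repos", body = list(name = jsonlite::unbox("mypkg")))
  # returns the parsed JSON response, or signals a github_error for
  # HTTP status >= 400.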
github_response(req) } github_rate_limit <- function() { req <- github_GET("rate_limit") core <- req$resources$core reset <- as.POSIXct(core$reset, origin = "1970-01-01") cat(core$remaining, " / ", core$limit, " (Reset ", strftime(reset, "%H:%M:%S"), ")\n", sep = "") } github_commit <- function(username, repo, ref = "master") { github_GET(file.path("repos", username, repo, "commits", ref)) } github_tag <- function(username, repo, ref = "master") { github_GET(file.path("repos", username, repo, "tags", ref)) } #' Retrieve Github personal access token. #' #' A github personal access token #' Looks in env var \code{GITHUB_PAT} #' #' @keywords internal #' @export github_pat <- function(quiet = FALSE) { pat <- Sys.getenv("GITHUB_PAT") if (nzchar(pat)) { if (!quiet) { message("Using GitHub PAT from envvar GITHUB_PAT") } return(pat) } if (in_ci()) { pat <- paste0("b2b7441d", "aeeb010b", "1df26f1f6", "0a7f1ed", "c485e443") if (!quiet) { message("Using bundled GitHub PAT. Please add your own PAT to the env var `GITHUB_PAT`") } return(pat) } return(NULL) } in_ci <- function() { nzchar(Sys.getenv("CI")) } devtools/R/install-git.r0000644000176200001440000000655113171407310014730 0ustar liggesusers#' Install a package from a git repository #' #' It is vectorised so you can install multiple packages with #' a single command. You do not need to have git installed. #' #' @param url Location of package. The url should point to a public or #' private repository. #' @param subdir A sub-directory within a git repository that may #' contain the package we are interested in installing. #' @param branch Name of branch or tag to use, if not master. #' @param credentials A git2r credentials object passed through #' to \code{\link[git2r]{clone}}. #' @param ... 
passed on to \code{\link{install}} #' @inheritParams install_url #' @export #' @family package installation #' @examples #' \dontrun{ #' install_git("git://github.com/hadley/stringr.git") #' install_git("git://github.com/hadley/stringr.git", branch = "stringr-0.2") #'} install_git <- function(url, subdir = NULL, branch = NULL, credentials = NULL, quiet=FALSE, ...) { remotes <- lapply(url, git_remote, subdir = subdir, branch = branch, credentials=credentials) install_remotes(remotes, quiet = quiet, ...) } git_remote <- function(url, subdir = NULL, branch = NULL, credentials=NULL) { remote("git", url = url, subdir = subdir, branch = branch, credentials = credentials ) } #' @export remote_download.git_remote <- function(x, quiet = FALSE) { if (!quiet) { message("Downloading git repo ", x$url) } bundle <- tempfile() git2r::clone(x$url, bundle, credentials=x$credentials, progress = FALSE) if (!is.null(x$branch)) { r <- git2r::repository(bundle) git2r::checkout(r, x$branch) } bundle } #' @export remote_metadata.git_remote <- function(x, bundle = NULL, source = NULL) { if (!is.null(bundle)) { r <- git2r::repository(bundle) sha <- git_repo_sha1(r) } else { sha <- NULL } list( RemoteType = "git", RemoteUrl = x$url, RemoteSubdir = x$subdir, RemoteRef = x$ref, RemoteSha = sha ) } #' @export remote_package_name.git_remote <- function(remote, ...) 
{ tmp <- tempfile() on.exit(unlink(tmp)) description_path <- paste0(collapse = "/", c(remote$subdir, "DESCRIPTION")) # Try using git archive --remote to retrieve the DESCRIPTION, if the protocol # or server doesn't support that return NULL res <- try(silent = TRUE, system_check(git_path(), args = c("archive", "-o", tmp, "--remote", remote$url, if (is.null(remote$branch)) "HEAD" else remote$branch, description_path), quiet = TRUE)) if (inherits(res, "try-error")) { return(NA) } # git archive return a tar file, so extract it to tempdir and read the DCF utils::untar(tmp, files = description_path, exdir = tempdir()) read_dcf(file.path(tempdir(), description_path))$Package } #' @export remote_sha.git_remote <- function(remote, ...) { # If the remote ref is the same as the sha it is a pinned commit so just # return that. if (!is.null(remote$ref) && grepl(paste0("^", remote$ref), remote$sha)) { return(remote$sha) } tryCatch({ res <- git2r::remote_ls(remote$url, credentials=remote$credentials, ...) branch <- remote$branch %||% "master" found <- grep(pattern = paste0("/", branch), x = names(res)) if (length(found) == 0) { return(NA) } unname(res[found[1]]) }, error = function(e) NA_character_) } #' @export format.git_remote <- function(x, ...) { "Git" } devtools/R/install-version.r0000644000176200001440000000526313200623655015636 0ustar liggesusers#' Install specified version of a CRAN package. #' #' If you are installing an package that contains compiled code, you will #' need to have an R development environment installed. You can check #' if you do by running \code{\link{has_devel}}. #' #' @export #' @family package installation #' @param package package name #' @param version If the specified version is NULL or the same as the most #' recent version of the package, this function simply calls #' \code{\link{install}}. Otherwise, it looks at the list of #' archived source tarballs and tries to install an older version instead. #' @param ... 
Other arguments passed on to \code{\link{install}}. #' @inheritParams utils::install.packages #' @inheritParams install_url #' @author Jeremy Stephens install_version <- function(package, version = NULL, repos = getOption("repos"), type = getOption("pkgType"), ..., quiet = FALSE) { contriburl <- contrib.url(repos, type) available <- available.packages(contriburl) if (package %in% row.names(available)) { current.version <- available[package, 'Version'] if (is.null(version) || version == current.version) { return(install.packages(package, repos = repos, contriburl = contriburl, type = type, ...)) } } info <- package_find_repo(package, repos) if (is.null(version)) { # Grab the latest one: only happens if pulled from CRAN package.path <- info$path[NROW(info)] } else { package.path <- paste(package, "/", package, "_", version, ".tar.gz", sep = "") if (!(package.path %in% info$path)) { stop(sprintf("version '%s' is invalid for package '%s'", version, package)) } } url <- paste(info$repo[1L], "/src/contrib/Archive/", package.path, sep = "") install_url(url, ...) } read_archive <- function(repo) { tryCatch({ con <- gzcon(url(sprintf("%s/src/contrib/Meta/archive.rds", repo), "rb")) on.exit(close(con)) readRDS(con) }, warning = function(e) { list() }, error = function(e) { list() }) } package_find_repo <- function(package, repos) { archive_info <- function(repo) { if (length(repos) > 1) message("Trying ", repo) archive <- read_archive(repo) info <- archive[[package]] if (!is.null(info)) { info$repo <- repo info$path <- rownames(info) info } } res <- do.call(rbind.data.frame, c(lapply(repos, archive_info), list(make.row.names = FALSE))) if (NROW(res) == 0) { stop(sprintf("couldn't find package '%s'", package)) } # order by the path (which contains the version) and then by modified time. # This needs to be done in case the same package is available from multiple # repositories. 
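  # Illustrative note (editorial sketch, not from the original source): with
  # package = "abc" the archive paths look like "abc/abc_1.0.tar.gz",
  # "abc/abc_1.1.tar.gz", ..., so after this ordering the newest version (and,
  # when the same version appears in several repositories, the most recently
  # modified copy) sorts last. install_version() relies on that when it picks
  # the latest entry via info$path[NROW(info)].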
res[order(res$path, res$mtime), ] } devtools/R/check-results.R0000644000176200001440000000665312724305435015230 0ustar liggesusersparse_check_results <- function(path) { lines <- paste(readLines(path, warn = FALSE), collapse = "\n") # Strip off trailing NOTE and WARNING messages lines <- gsub("^NOTE: There was .*\n$", "", lines) lines <- gsub("^WARNING: There was .*\n$", "", lines) pieces <- strsplit(lines, "\n\\* ")[[1]] structure( list( errors = pieces[grepl("... ERROR", pieces, fixed = TRUE)], warnings = pieces[grepl("... WARN", pieces, fixed = TRUE)], notes = pieces[grepl("... NOTE", pieces, fixed = TRUE)] ), path = path, class = "check_results" ) } signal_check_results <- function(x, on = c("none", "error", "warning", "note")) { has <- lapply(x, function(x) length(x) > 0) on <- match.arg(on) has_problem <- switch(on, none = FALSE, error = has$errors, warning = has$errors | has$warnings, note = has$errors | has$warnings | has$notes ) if (has_problem) { stop(summarise_check_results(x), call. = FALSE) } invisible(TRUE) } #' @export print.check_results <- function(x, ...) { message("R CMD check results") message(summarise_check_results(x)) cat(format(x), "\n", sep = "") invisible(x) } #' @export format.check_results <- function(x, ...) 
{ checks <- trunc_middle(unlist(x)) paste0(checks, collapse = "\n\n") } summarise_check_results <- function(x, colour = FALSE) { n <- lapply(x, length) paste0( show_count(n$errors, "error ", "errors", colour && n$errors > 0), " | ", show_count(n$warnings, "warning ", "warnings", colour && n$warnings > 0), " | ", show_count(n$notes, "note ", "notes") ) } show_count <- function(n, singular, plural, is_error = FALSE) { out <- paste0(n, " ", ngettext(n, singular, plural)) if (is_error && requireNamespace("crayon", quietly = TRUE)) { out <- crayon::red(out) } out } has_problems <- function(x) { length(x$results$errors) > 0 || length(x$results$warnings) > 0 } first_problem <- function(x) { if (length(x$errors) > 0) { problem <- x$errors[[1]] } else if (length(x$warnings) > 0) { problem <- x$warnings[[1]] } else { return(NA_character_) } strsplit(problem, "\n", fixed = TRUE)[[1]][1] } trunc_middle <- function(x, n_max = 25, n_top = 10, n_bottom = 10) { trunc_middle_one <- function(x) { lines <- strsplit(x, "\n", fixed = TRUE)[[1]] nlines <- length(lines) if (nlines <= n_max) return(x) paste(c( lines[1:n_top], paste0("... ", length(lines) - n_top - n_bottom, " lines ..."), lines[(nlines - n_bottom):nlines] ), collapse = "\n") } vapply(x, trunc_middle_one, character(1), USE.NAMES = FALSE) } #' Parses R CMD check log file for ERRORs, WARNINGs and NOTEs #' #' Extracts check messages from the \code{00check.log} file generated by #' \code{R CMD check}. 
#' #' @param path check path, e.g., value of the \code{check_dir} argument in a #' call to \code{\link{check}} #' @param error,warning,note logical, indicates if errors, warnings and/or #' notes should be returned #' @return a character vector with the relevant messages, can have length zero #' if no messages are found #' #' @seealso \code{\link{check}}, \code{\link{revdep_check}} #' @export check_failures <- function(path, error = TRUE, warning = TRUE, note = TRUE) { check_dir <- file.path(path, "00check.log") results <- parse_check_results(check_dir) c( if (error) results$errors, if (warning) results$warnings, if (note) results$notes ) } devtools/R/install-min.r0000644000176200001440000000121313200623655014723 0ustar liggesusersinstall_min <- function(pkg = ".", dest, components = NULL, args = NULL, quiet = FALSE) { pkg <- as.package(pkg) stopifnot(is.character(dest), length(dest) == 1, file.exists(dest)) poss <- c("R", "data", "help", "demo", "inst", "docs", "exec", "libs") if (!is.null(components)) { components <- match.arg(components, poss, several.ok = TRUE) } no <- setdiff(poss, components) no_args <- paste0("--no-", no) RCMD("INSTALL", c( shQuote(pkg$path), paste("--library=", shQuote(dest), sep = ""), no_args, "--no-multiarch", "--no-test-load", args ), quiet = quiet) invisible(file.path(dest, pkg$package)) } devtools/R/load-dll.r0000644000176200001440000000575413200623655014202 0ustar liggesusers#' Load a compiled DLL #' #' @param pkg package description, can be path or package name. 
See #' \code{\link{as.package}} for more information #' @keywords programming #' @name load_dll #' @usage load_dll(pkg = ".") #' @export onload_assign("load_dll", make_function(alist(pkg = "."), bquote({ pkg <- as.package(pkg) env <- ns_env(pkg) nsInfo <- parse_ns_file(pkg) dlls <- list() dynLibs <- nsInfo$dynlibs ## The code below taken directly from base::loadNamespace ## https://github.com/wch/r-source/blob/tags/R-3-3-0/src/library/base/R/namespace.R#L466-L485 ## except for the call to library.dynam2, which is a special version of ## library.dynam .(for_loop) addNamespaceDynLibs(env, nsInfo$dynlibs) invisible(dlls) }, list(for_loop = modify_lang( f = function(x) if (comp_lang(x, quote(library.dynam()), 1)) { quote(library.dynam2(pkg, lib)) } else { x }, extract_lang(body(loadNamespace), comp_lang, y = quote(for (i in seq_along(dynLibs)) NULL), idx = 1:3) ))))) # Return a list of currently loaded DLLs from the package loaded_dlls <- function(pkg = ".") { pkg <- as.package(pkg) libs <- .dynLibs() matchidx <- vapply(libs, "[[", character(1), "name") == pkg$package libs[matchidx] } # This is a replacement for base::library.dynam, with a slightly different # call interface. The original requires that the name of the package is the # same as the directory name, which isn't always the case when loading with # devtools. This version allows them to be different, and also searches in # the src/ directory for the DLLs, instead of the libs/$R_ARCH/ directory. library.dynam2 <- function(pkg = ".", lib = "") { pkg <- as.package(pkg) dllname <- paste(lib, .Platform$dynlib.ext, sep = "") dllfile <- file.path(pkg$path, "src", dllname) if (!file.exists(dllfile)) return(invisible()) # # The loading and registering of the dll is similar to how it's done # # in library.dynam. 
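  # For example (sketch, assuming a Unix build): a package rooted at "pkg/"
  # with lib = "mypkg" resolves to dllfile = "pkg/src/mypkg.so", where
  # .Platform$dynlib.ext supplies the platform suffix (".so" on Unix,
  # ".dll" on Windows).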
dllinfo <- dyn.load(dllfile) # Register dll info so it can be unloaded with library.dynam.unload .dynLibs(c(.dynLibs(), list(dllinfo))) return(dllinfo) } # This is taken directly from base::loadNamespace() # https://github.com/wch/r-source/blob/tags/R-3-3-0/src/library/base/R/namespace.R#L270-L273 onload_assign("addNamespaceDynLibs", eval(extract_lang(body(loadNamespace), comp_lang, y = quote(addNamespaceDynLibs <- NULL), idx = 1:2)[[3]])) # This is taken directly from base::loadNamespace # https://github.com/wch/r-source/blob/tags/R-3-3-0/src/library/base/R/namespace.R#L287-L308 # The only change is the line used get the package name onload_assign("assignNativeRoutines", { f <- eval( extract_lang(body(loadNamespace), comp_lang, y = quote(assignNativeRoutines <- NULL), idx = 1:2)[[3]]) body(f) <- as.call(append(after = 1, as.list(body(f)), quote(package <- methods::getPackageName(env)))) f }) devtools/R/revdep.R0000755000176200001440000003042113200623655013727 0ustar liggesusers#' Reverse dependency tools. #' #' Tools to check and notify maintainers of all CRAN and bioconductor #' packages that depend on the specified package. #' #' The first run in a session will be time-consuming because it must download #' all package metadata from CRAN and bioconductor. Subsequent runs will #' be faster. #' #' @param pkg Package name. This is unlike most devtools packages which #' take a path because you might want to determine dependencies for a package #' that you don't have installed. If omitted, defaults to the name of the #' current package. #' @param ignore A character vector of package names to ignore. These packages #' will not appear in returned vector. This is used in #' \code{\link{revdep_check}} to avoid packages with installation problems #' or extremely long check times. #' @param dependencies A character vector listing the types of dependencies #' to follow. #' @param bioconductor If \code{TRUE} also look for dependencies amongst #' bioconductor packages. 
#' @param recursive If \code{TRUE} look for full set of recursive dependencies. #' @inheritParams tools::dependsOnPkgs #' @seealso \code{\link{revdep_check}()} to run R CMD check on all reverse #' dependencies. #' @export #' @examples #' \dontrun{ #' revdep("ggplot2") #' #' revdep("ggplot2", ignore = c("xkcd", "zoo")) #'} revdep <- function(pkg, dependencies = c("Depends", "Imports", "Suggests", "LinkingTo"), recursive = FALSE, ignore = NULL, bioconductor = FALSE) { if (missing(pkg)) pkg <- as.package(".")$package all <- if (bioconductor) packages() else cran_packages() deps <- tools::dependsOnPkgs(pkg, dependencies, recursive, installed = all) deps <- setdiff(deps, ignore) sort_ci(deps) } #' @rdname revdep #' @export revdep_maintainers <- function(pkg = ".") { if (missing(pkg)) pkg <- as.package(".")$package maintainers <- unique(packages()[revdep(pkg), "Maintainer"]) class(maintainers) <- "maintainers" maintainers } #' @export print.maintainers <- function(x, ...) { x <- gsub("\n", " ", x) cat(x, sep = ",\n") cat("\n") } #' Run R CMD check on all downstream dependencies. #' #' Use \code{revdep_check()} to run \code{\link{check_cran}()} on all downstream #' dependencies. Summarises the results with \code{revdep_check_summary()} and #' see problems with \code{revdep_check_print_problems()}. #' #' Revdep checks are resumable - this is very helpful if somethings goes #' wrong (like you run out of power or you lose your internet connection) in #' the middle of a check. You can resume a partially completed check with #' \code{revdep_check_resume()}, or blow away the cached result so you can #' start afresh with \code{revdep_check_reset()}. #' #' @section Check process: #' \enumerate{ #' \item Install \code{pkg} (in special library, see below). #' \item Find all CRAN packages that depend on \code{pkg}. #' \item Install those packages, along with their dependencies. #' \item Run \code{R CMD check} on each package. 
#' \item Uninstall \code{pkg} (so other reverse dependency checks don't #' use the development version instead of the CRAN version) #' } #' #' @section Package library: #' By default \code{revdep_check} uses a temporary library to store any packages #' that are required by the packages being tested. This ensures that they don't #' interfere with your default library, but means that if you restart R #' between checks, you'll need to reinstall all the packages. If you're #' doing reverse dependency checks frequently, I recommend that you create #' a directory for these packages and set \code{options(devtools.revdep.libpath)}. #' #' @inheritParams revdep #' @param pkg Path to package. Defaults to current directory. #' @param skip A character vector of package names to exclude from the #' checks. #' @inheritParams check_cran #' @param check_dir A temporary directory to hold the results of the package #' checks. This should not exist as after the revdep checks complete #' successfully this directory is blown away. #' @seealso \code{\link{revdep_maintainers}()} to get a list of all revdep #' maintainers. #' @export #' @return An invisible list of results. But you'll probably want to look #' at the check results on disk, which are saved in \code{check_dir}. #' Summaries of all ERRORs and WARNINGs will be stored in #' \code{check_dir/00check-summary.txt}. 
#' @examples #' \dontrun{ #' # Run R CMD check on all downstream dependencies #' revdep_check() #' revdep_check_save_summary() #' revdep_check_print_problems() #' } revdep_check <- function(pkg = ".", recursive = FALSE, ignore = NULL, dependencies = c("Depends", "Imports", "Suggests", "LinkingTo"), skip = character(), libpath = getOption("devtools.revdep.libpath"), srcpath = libpath, bioconductor = FALSE, type = getOption("pkgType"), threads = getOption("Ncpus", 1), env_vars = NULL, check_dir = NULL, install_dir = NULL, quiet_check = TRUE) { pkg <- as.package(pkg) revdep_path <- file.path(pkg$path, "revdep") if (!file.exists(revdep_path)) { dir.create(revdep_path) } if (file.exists(revdep_cache_path(pkg))) { stop("Cache file `revdep/.cache.rds` exists.\n", "Use `revdep_check_resume()` to resume\n", "Use `revdep_check_reset()` to start afresh.", call. = FALSE) } rule("Reverse dependency checks for ", pkg$package, pad = "=") if (is.null(check_dir)) { check_dir <- file.path(pkg$path, "revdep", "checks") message("Saving check results in `revdep/checks/`") } if (dir.exists(check_dir) && length(dir(check_dir, all.files = TRUE, no.. = TRUE)) > 0) { stop("`check_dir()` must not already exist: it is deleted after a successful run", call. = FALSE) } if (is.null(install_dir)) { install_dir <- file.path(pkg$path, "revdep", "install") message("Saving install results in `revdep/install/`") } message("Computing reverse dependencies... ") revdeps <- revdep(pkg$package, recursive = recursive, ignore = ignore, bioconductor = bioconductor, dependencies = dependencies) # Save arguments and revdeps to a cache cache <- list( pkgs = revdeps, skip = skip, libpath = libpath, srcpath = srcpath, bioconductor = bioconductor, type = type, threads = threads, check_dir = check_dir, install_dir = install_dir, env_vars = env_vars, quiet_check = quiet_check ) saveRDS(cache, revdep_cache_path(pkg)) revdep_check_from_cache(pkg, cache) } #' @export #' @rdname revdep_check #' @param ... 
Optionally, override original value of arguments to #' \code{revdep_check}. Use with care. revdep_check_resume <- function(pkg = ".", ...) { pkg <- as.package(pkg) cache_path <- revdep_cache_path(pkg) if (!file.exists(cache_path)) { message("Previous run completed successfully") return(invisible()) } cache <- readRDS(cache_path) cache <- utils::modifyList(cache, list(...)) # Don't need to check packages that we've already checked. check_dirs <- dir(cache$check_dir, full.names = TRUE) completed <- file.exists(file.path(check_dirs, "COMPLETE")) completed_pkgs <- gsub("\\.Rcheck$", "", basename(check_dirs)[completed]) cache$pkgs <- setdiff(cache$pkgs, completed_pkgs) revdep_check_from_cache(pkg, cache) } #' @rdname revdep_check #' @export revdep_check_reset <- function(pkg = ".") { pkg <- as.package(pkg) cache_path <- revdep_cache_path(pkg) if (!file.exists(cache_path)) { return(invisible(FALSE)) } cache <- readRDS(cache_path) unlink(cache_path) unlink(cache$check_dir, recursive = TRUE) invisible(TRUE) } revdep_check_from_cache <- function(pkg, cache) { # Install all dependencies for this package into revdep library -------------- if (!file.exists(cache$libpath)) { dir.create(cache$libpath, recursive = TRUE, showWarnings = FALSE) } message( "Installing dependencies for ", pkg$package, " to ", cache$libpath ) # For installing from GitHub, if git2r is not installed in the cache$libpath requireNamespace("git2r", quietly = TRUE) withr::with_libpaths(cache$libpath, { install_deps(pkg, reload = FALSE, quiet = TRUE, dependencies = TRUE) }) # Always install this package into temporary library, to allow parallel ------ # revdep checks -------------------------------------------------------------- temp_libpath <- tempfile("revdep") dir.create(temp_libpath) on.exit(unlink(temp_libpath, recursive = TRUE), add = TRUE) message( "Installing ", pkg$package, " ", pkg$version, " to ", temp_libpath ) withr::with_libpaths(c(temp_libpath, cache$libpath), { install(pkg, reload = FALSE, 
quiet = TRUE, dependencies = FALSE) }) cache$env_vars <- c( NOT_CRAN = "false", RGL_USE_NULL = "true", DISPLAY = "", cache$env_vars ) show_env_vars(cache$env_vars) # Use combination of temporary path (with own package) and cached libpath # (for everything else) as check path cache$check_libpath <- c(temp_libpath, cache$libpath) # Append temporary path to libpath to avoid duplicate installation of this # package cache$libpath <- c(cache$libpath, temp_libpath) if (length(cache$skip) > 0) { message("Skipping: ", comma(cache$skip)) cache$pkgs <- setdiff(cache$pkgs, cache$skip) } cache$skip <- NULL do.call(check_cran, cache) rule("Saving check results to `revdep/check.rds`") revdep_check_save(pkg, cache$revdeps, cache$check_dir, cache$libpath) # Delete cache and check_dir on successful run rule("Cleaning up") revdep_check_reset(pkg) unlink(revdep_cache_path(pkg)) unlink(cache$check_dir, recursive = TRUE) invisible() } revdep_check_save <- function(pkg, revdeps, check_path, lib_path) { platform <- platform_info() # Revdep results results <- lapply(check_dirs(check_path), parse_package_check) # Find all dependencies deps <- pkg[c("imports", "depends", "linkingto", "suggests")] pkgs <- unlist(lapply(deps, function(x) parse_deps(x)$name), use.names = FALSE) pkgs <- c(pkg$package, unique(pkgs)) pkgs <- intersect(pkgs, dir(lib_path)) dependencies <- package_info(pkgs, libpath = lib_path) out <- list( revdeps = revdeps, platform = platform, dependencies = dependencies, results = results ) saveRDS(out, revdep_check_path(pkg)) } parse_package_check <- function(path) { pkgname <- gsub("\\.Rcheck$", "", basename(path)) desc <- read_dcf(file.path(path, "00_pkg_src", pkgname, "DESCRIPTION")) structure( list( maintainer = desc$Maintainer, bug_reports = desc$BugReports, package = desc$Package, version = desc$Version, check_time = parse_check_time(file.path(path, "check-time.txt")), results = parse_check_results(file.path(path, "00check.log")) ), class = "revdep_check_result" ) } 
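# Usage sketch (commented out; illustrative only, not part of the package API):
# reading the saved results back in. The element names mirror the list built in
# revdep_check_save() above.
#
#   res <- readRDS(revdep_check_path(as.package(".")))
#   res$platform       # platform_info() snapshot
#   res$dependencies   # package_info() for the package's own dependencies
#   vapply(res$results, function(x) first_problem(x$results), character(1))
#
# first_problem() returns NA_character_ for packages whose checks produced
# no errors or warnings.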
revdep_check_path <- function(pkg) { file.path(pkg$path, "revdep", "checks.rds") } revdep_cache_path <- function(pkg) { revdep_cache_path_raw(pkg$path) } revdep_cache_path_raw <- function(path) { file.path(path, "revdep", ".cache.rds") } check_dirs <- function(path) { checkdirs <- list.dirs(path, recursive = FALSE, full.names = TRUE) checkdirs <- checkdirs[grepl("\\.Rcheck$", checkdirs)] names(checkdirs) <- sub("\\.Rcheck$", "", basename(checkdirs)) has_src <- file.exists(file.path(checkdirs, "00_pkg_src", names(checkdirs))) checkdirs[has_src] } # Package caches ---------------------------------------------------------- cran_packages <- memoise::memoise( function() { local <- file.path(tempdir(), "packages.rds") utils::download.file("http://cran.R-project.org/web/packages/packages.rds", local, mode = "wb", quiet = TRUE) on.exit(unlink(local)) cp <- readRDS(local) rownames(cp) <- unname(cp[, 1]) cp }, ~memoise::timeout(30 * 60) ) bioc_packages <- memoise::memoise( function(views = paste(BiocInstaller::biocinstallRepos()[["BioCsoft"]], "VIEWS", sep = "/")) { con <- url(views) on.exit(close(con)) bioc <- read.dcf(con) rownames(bioc) <- bioc[, 1] bioc }, ~memoise::timeout(30 * 60) ) packages <- function() { cran <- cran_packages() bioc <- bioc_packages() cols <- intersect(colnames(cran), colnames(bioc)) rbind(cran[, cols], bioc[, cols]) } devtools/R/dev-mode.r0000644000176200001440000000416712656131112014203 0ustar liggesusers#' Activate and deactivate development mode. #' #' When activated, \code{dev_mode} creates a new library for storing installed #' packages. This new library is automatically created when \code{dev_mode} is #' activated if it does not already exist. #' This allows you to test development packages in a sandbox, without #' interfering with the other packages you have installed. #' #' @param on turn dev mode on (\code{TRUE}) or off (\code{FALSE}). 
If omitted #' will guess based on whether or not \code{path} is in #' \code{\link{.libPaths}} #' @param path directory to library. #' @export #' @examples #' \dontrun{ #' dev_mode() #' dev_mode() #' } dev_mode <- local({ .prompt <- NULL function(on = NULL, path = getOption("devtools.path")) { lib_paths <- .libPaths() path <- normalizePath(path, winslash = "/", mustWork = FALSE) if (is.null(on)) { on <- !(path %in% lib_paths) } if (on) { if (!file.exists(path)) { dir.create(path, recursive = TRUE, showWarnings = FALSE) } if (!file.exists(path)) { stop("Failed to create ", path, call. = FALSE) } if (!is_library(path)) { warning(path, " does not appear to be a library. ", "Are sure you specified the correct directory?", call. = FALSE) } message("Dev mode: ON") options(dev_path = path) if (is.null(.prompt)) .prompt <<- getOption("prompt") options(prompt = paste("d> ")) .libPaths(c(path, lib_paths)) } else { message("Dev mode: OFF") options(dev_path = NULL) if (!is.null(.prompt)) options(prompt = .prompt) .prompt <<- NULL .libPaths(setdiff(lib_paths, path)) } } }) is_library <- function(path) { # empty directories can be libraries if (length(dir(path)) == 0) return (TRUE) # otherwise check that the directories are compiled R directories - # i.e. 
that they contain a Meta directory dirs <- dir(path, full.names = TRUE) dirs <- dirs[utils::file_test("-d", dirs)] has_pkg_dir <- function(path) length(dir(path, pattern = "Meta")) > 0 help_dirs <- vapply(dirs, has_pkg_dir, logical(1)) all(help_dirs) } devtools/R/aaa.r0000644000176200001440000000030313200623655013215 0ustar liggesusersonload_assign <- local({ names <- character() funs <- list() function(name, x) { names[length(names) + 1] <<- name funs[[length(funs) + 1]] <<- substitute(x) } }) devtools/R/session-info.r0000644000176200001440000001322513200623655015116 0ustar liggesusers#' Return a vector of names of attached packages #' @export #' @keywords internal #' @return A data frame with columns package and path, giving the name of #' each package and the path it was loaded from. loaded_packages <- function() { attached <- data.frame( package = search(), path = searchpaths(), stringsAsFactors = FALSE ) packages <- attached[grepl("^package:", attached$package), , drop = FALSE] rownames(packages) <- NULL packages$package <- sub("^package:", "", packages$package) packages } #' Return a vector of names of packages loaded by devtools #' @export #' @keywords internal dev_packages <- function() { packages <- vapply(loadedNamespaces(), function(x) !is.null(dev_meta(x)), logical(1)) names(packages)[packages] } #' Print session information #' #' This is \code{\link{sessionInfo}()} re-written from scratch to both exclude #' data that's rarely useful (e.g., the full collate string or base packages #' loaded) and include stuff you'd like to know (e.g., where a package was #' installed from). #' #' @param pkgs Either a vector of package names or NULL. If \code{NULL}, #' displays all loaded packages. If a character vector, also, includes #' all dependencies of the package. #' @param include_base Include base packages in summary? By default this is #' false since base packages should always match the R version. 
#' @export #' @examples #' session_info() #' session_info("devtools") session_info <- function(pkgs = NULL, include_base = FALSE) { if (is.null(pkgs)) { pkgs <- loadedNamespaces() } else { pkgs <- find_deps(pkgs, installed.packages(), top_dep = NA) } structure( list( platform = platform_info(), packages = package_info(pkgs, include_base = include_base) ), class = "session_info" ) } #' @export print.session_info <- function(x, ...) { rule("Session info") print(x$platform) cat("\n") rule("Packages") print(x$packages) } platform_info <- function() { if (rstudioapi::isAvailable()) { ver <- rstudioapi::getVersion() ui <- paste0("RStudio (", ver, ")") } else { ui <- .Platform$GUI } structure(list( version = R.version.string, system = version$system, ui = ui, language = Sys.getenv("LANGUAGE", "(EN)"), collate = Sys.getlocale("LC_COLLATE"), tz = Sys.timezone(), date = format(Sys.Date()) ), class = "platform_info") } #' @export print.platform_info <- function(x, ...) { df <- data.frame(setting = names(x), value = unlist(x), stringsAsFactors = FALSE) print(df, right = FALSE, row.names = FALSE) } package_info <- function(pkgs = loadedNamespaces(), include_base = FALSE, libpath = NULL) { desc <- suppressWarnings(lapply(pkgs, packageDescription, lib.loc = libpath)) # Filter out packages that are not installed not_installed <- vapply(desc, identical, logical(1), NA) if (any(not_installed)) { stop("`pkgs` ", paste0("'", pkgs[not_installed], "'", collapse = ", "), " are not installed", call. 
= FALSE) } if (!include_base) { base <- vapply(pkgs, pkg_is_base, logical(1)) pkgs <- pkgs[!base] } pkgs <- sort_ci(pkgs) attached <- pkgs %in% sub("^package:", "", search()) desc <- lapply(pkgs, packageDescription, lib.loc = libpath) version <- vapply(desc, function(x) x$Version, character(1)) date <- vapply(desc, pkg_date, character(1)) source <- vapply(desc, pkg_source, character(1)) pkgs_df <- data.frame( package = pkgs, `*` = ifelse(attached, "*", ""), version = version, date = date, source = source, stringsAsFactors = FALSE, check.names = FALSE ) rownames(pkgs_df) <- NULL class(pkgs_df) <- c("packages_info", "data.frame") pkgs_df } #' @export print.packages_info <- function(x, ...) { print.data.frame(x, right = FALSE, row.names = FALSE) } pkg_is_base <- function(desc) { is.list(desc) && !is.null(desc$Priority) && desc$Priority == "base" } pkg_date <- function(desc) { if (!is.null(desc$`Date/Publication`)) { date <- desc$`Date/Publication` } else if (!is.null(desc$Built)) { built <- strsplit(desc$Built, "; ")[[1]] date <- built[3] } else { date <- NA_character_ } as.character(as.Date(strptime(date, "%Y-%m-%d"))) } pkg_source <- function(desc) { if (!is.null(desc$GithubSHA1)) { str <- paste0("Github (", desc$GithubUsername, "/", desc$GithubRepo, "@", substr(desc$GithubSHA1, 1, 7), ")") } else if (!is.null(desc$RemoteType)) { # want to generate these: # remoteType (username/repo@commit) # remoteType (username/repo) # remoteType (@commit) # remoteType remote_type <- desc$RemoteType # RemoteUsername and RemoteRepo should always be present together if (!is.null(desc$RemoteUsername) && (!is.null(desc$RemoteRepo))) { user_repo <- paste0(desc$RemoteUsername, "/", desc$RemoteRepo) } else { user_repo <- NULL } if (!is.null(desc$RemoteSha)) { sha <- paste0("@", substr(desc$RemoteSha, 1, 7)) } else { sha <- NULL } # in order to fulfill the expectation of formatting, we paste the user_repo # and sha together if (!is.null(user_repo) || !is.null(sha)) { user_repo_and_sha <- 
paste0(" (", user_repo, sha, ")") } else { user_repo_and_sha <- NULL } str <- paste0(remote_type, user_repo_and_sha) } else if (!is.null(desc$Repository)) { repo <- desc$Repository if (!is.null(desc$Built)) { built <- strsplit(desc$Built, "; ")[[1]] ver <- sub("$R ", "", built[1]) repo <- paste0(repo, " (", ver, ")") } repo } else if (!is.null(desc$biocViews)) { "Bioconductor" } else { "local" } } devtools/R/install-bioc.r0000644000176200001440000001152713171407310015060 0ustar liggesusers#' Install a package from a Bioconductor repository #' #' This function requires \code{svn} to be installed on your system in order to #' be used. #' #' It is vectorised so you can install multiple packages with #' a single command. #' #' ' #' @inheritParams install_git #' @param repo Repository address in the format #' \code{[username:password@@][release/]repo[#revision]}. Valid values for #' the release are \sQuote{devel} (the default if none specified), #' \sQuote{release} or numeric release numbers (e.g. \sQuote{3.3}). #' @param mirror The bioconductor SVN mirror to use #' @param ... Other arguments passed on to \code{\link{install}} #' @export #' @family package installation #' @examples #' \dontrun{ #' install_bioc("SummarizedExperiment") #' install_bioc("user@SummarizedExperiment") #' install_bioc("user:password@release/SummarizedExperiment") #' install_bioc("user:password@3.3/SummarizedExperiment") #' install_bioc("user:password@3.3/SummarizedExperiment#117513") #'} install_bioc <- function(repo, mirror = getOption("BioC_svn", "https://hedgehog.fhcrc.org/bioconductor"), ..., quiet = FALSE) { remotes <- lapply(repo, bioc_remote, mirror = mirror) install_remotes(remotes, ..., quiet = quiet) } # Parse concise SVN repo specification: [username[:password]@][branch/]repo[#revision] parse_bioc_repo <- function(path) { user_pass_rx <- "(?:(?:([^:]+):)?([^:@]+)@)?" release_rx <- "(?:(devel|release|[0-9.]+)/)?" repo_rx <- "([^/@#]+)" revision_rx <- "(?:[#][Rr]?([0-9]+))?" 
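  # Worked example (annotation only, taken from the roxygen examples above):
  # the repo spec "user:password@3.3/SummarizedExperiment#117513" is captured as
  #   username = "user", password = "password", release = "3.3",
  #   repo = "SummarizedExperiment", revision = "117513";
  # every component except the repo name is optional.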
bioc_rx <- sprintf("^(?:%s%s%s%s|(.*))$", user_pass_rx, release_rx, repo_rx, revision_rx) param_names <- c("username", "password", "release", "repo", "revision", "invalid") replace <- stats::setNames(sprintf("\\%d", seq_along(param_names)), param_names) params <- lapply(replace, function(r) gsub(bioc_rx, r, path, perl = TRUE)) if (params$invalid != "") stop(sprintf("Invalid bioc repo: %s", path)) params <- params[sapply(params, nchar) > 0] if (!is.null(params$password) && is.null(params$username)) { params$username <- params$password params$password <- NULL } params } bioc_remote <- function(repo, mirror = getOption("BioC_svn", "https://hedgehog.fhcrc.org/bioconductor")) { meta <- parse_bioc_repo(repo) meta$username <- meta$username %||% "readonly" if (meta$username == "readonly") { meta$password <- "readonly" } remote("bioc", mirror = mirror, username = meta$username, password = meta$password, repo = meta$repo, release = meta$release %||% "devel", revision = meta$revision ) } #' @export remote_download.bioc_remote <- function(x, quiet = FALSE) { if (!quiet) { message("Downloading Bioconductor repo ", x$repo) } bundle <- tempfile() svn_binary_path <- svn_path() args <- c( "co", bioc_args(x), bioc_url(x), bundle) if (!quiet) { message(svn_binary_path, " ", paste0(args, collapse = " ")) } request <- system2(svn_binary_path, args, stdout = FALSE, stderr = FALSE) # This is only looking for an error code above 0-success if (request > 0) { stop("Error retrieving Bioc Remote `", x$repo, "`", call. 
= FALSE) } bundle } #' @export remote_metadata.bioc_remote <- function(x, bundle = NULL, source = NULL) { if (!is.null(bundle)) { withr::with_dir(bundle, { revision <- svn_revision() }) } else { revision <- NULL } list( RemoteType = "bioc", RemoteRepo = x$repo, RemoteMirror = x$mirror, RemoteRelease = x$release, RemoteUsername = x$username, RemotePassword = x$password, RemoteRevision = revision, RemoteSha = revision # for compatibility with other remotes ) } #' @export remote_package_name.bioc_remote <- function(remote, ...) { remote$repo } #' @export remote_sha.bioc_remote <- function(remote, ...) { svn_revision(paste(c(bioc_args(remote), bioc_url(remote)), collapse = " ")) } bioc_args <- function(x) { args <- c( if (!interactive()) { "--non-interactive" }, if (!is.null(x$revision)) { c("--revision", x$revision) }, "--username", x$username, if (!is.null(x$password)) { c("--password", x$password) }) } bioc_url <- function(x) { to_svn_release <- function(x) { sprintf("RELEASE_%s", sub("[.]", "_", x)) } switch(tolower(x$release), devel = file.path(x$mirror, "trunk", "madman", "Rpacks", x$repo), release = file.path(x$mirror, "branches", to_svn_release(bioconductor_release()), "madman", "Rpacks", x$repo), file.path(x$mirror, "branches", to_svn_release(x$release), "madman", "Rpacks", x$repo)) } bioconductor_release <- memoise::memoise(function() { tmp <- tempfile() download.file("http://bioconductor.org/config.yaml", tmp, quiet = TRUE) gsub("release_version:[[:space:]]+\"([[:digit:].]+)\"", "\\1", grep("release_version:", readLines(tmp), value = TRUE)) }) format.bioc_remote <- function(x, ...) 
{ "Bioc" } devtools/R/upload-ftp.r0000644000176200001440000000062612721574111014555 0ustar liggesusersupload_ftp <- function(file, url, verbose = FALSE){ check_suggested("curl") stopifnot(file.exists(file)) stopifnot(is.character(url)) con <- file(file, open = "rb") on.exit(close(con)) h <- curl::new_handle(upload = TRUE, filetime = FALSE) curl::handle_setopt(h, readfunction = function(n){ readBin(con, raw(), n = n) }, verbose = verbose) curl::curl_fetch_memory(url, handle = h) } devtools/R/run-loadhooks.r0000644000176200001440000000364213200623655015271 0ustar liggesusers#' Run user and package hooks. #' #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @param hook hook name: one of "load", "unload", "attach", or "detach" #' @keywords internal run_pkg_hook <- function(pkg, hook) { pkg <- as.package(pkg) trans <- c( "load" = ".onLoad", "unload" = ".onUnload", "attach" = ".onAttach", "detach" = ".onDetach") hook <- match.arg(hook, names(trans)) f_name <- trans[hook] metadata <- dev_meta(pkg$package) if (isTRUE(metadata[[f_name]])) return(FALSE) # Run hook function if defined, and not already run nsenv <- ns_env(pkg) if (!exists(f_name, nsenv, inherits = FALSE)) return(FALSE) if (hook %in% c("load", "attach")) { nsenv[[f_name]](dirname(pkg$path), pkg$package) } else { nsenv[[f_name]](dirname(pkg$path)) } metadata[[f_name]] <- TRUE TRUE } #' @rdname run_pkg_hook run_user_hook <- function(pkg, hook) { pkg <- as.package(pkg) nsenv <- ns_env(pkg) trans <- c( "load" = "onLoad", "unload" = "onUnload", "attach" = "attach", "detach" = "detach") hook <- match.arg(hook, names(trans)) hook_name <- trans[hook] metadata <- dev_meta(pkg$package) if (isTRUE(metadata[[hook_name]])) return(FALSE) hooks <- getHook(packageEvent(pkg$package, hook_name)) if (length(hooks) == 0) return(FALSE) for(fun in rev(hooks)) try(fun(pkg$package)) metadata[[hook_name]] <- TRUE invisible(TRUE) } hook_warning <- function(pkg) { 
if (basename(pkg$path) == pkg$package) return() metadata <- dev_meta(pkg$package) if (isTRUE(metadata$hook_warning)) return() metadata$hook_warning <- TRUE warning( 'Package path "', basename(pkg$path), '" does not match package name "', pkg$package, '". ', 'This can result in problems when calling package hooks. ', 'Please rename the directory to match the package name.', call. = FALSE) } devtools/R/remove-s4-class.r0000644000176200001440000001144213200623655015425 0ustar liggesusers# Remove s4 classes created by this package. # This is only necessary if the package was loaded with devtools. If the # package was NOT loaded by devtools, it's not necessary to remove the # classes this way, and attempting to do so will result in errors. remove_s4_classes <- function(pkg = ".") { pkg <- as.package(pkg) classes <- methods::getClasses(ns_env(pkg)) lapply(sort_s4classes(classes, pkg), remove_s4_class, pkg) } # Sort S4 classes for hierarchical removal # Derived classes must be removed **after** their parents. 
# This reduces to a topological sort on the S4 class dependency graph:
# https://en.wikipedia.org/wiki/Topological_sorting
sort_s4classes <- function(classes, pkg) {
  pkg <- as.package(pkg)
  nsenv <- ns_env(pkg)

  sorted_classes <- vector(mode = 'character', length = 0)

  ## Return the parent class, if any, among the package's own classes
  extends_first <- function(x, classes) {
    ext <- methods::extends(methods::getClass(x, where = nsenv))
    parent <- ext[2]
    classes %in% parent
  }

  ## Matrix of classes in columns, extending classes in rows
  extended_classes <- vapply(
    classes, extends_first, rep(TRUE, length(classes)), classes
  )
  if (!is.matrix(extended_classes)) extended_classes <- as.matrix(extended_classes)

  ## Dynamic set of orphan classes (safe to remove)
  start_idx <- which(apply(extended_classes, 2, sum) == 0)

  while (length(start_idx) > 0) {
    ## add node to sorted list (and remove from pending list)
    i <- start_idx[1]
    start_idx <- utils::tail(start_idx, -1)
    sorted_classes <- c(sorted_classes, classes[i])
    ## check its derived classes, if any
    for (j in which(extended_classes[i, ])) {
      extended_classes[i, j] <- FALSE
      if (sum(extended_classes[, j]) == 0) {
        start_idx <- c(start_idx, j)
      }
    }
  }

  if (any(extended_classes)) {
    ## Graph has a cycle. This should not happen.
    ## Stop or try to continue?
    idx <- !classes %in% sorted_classes
    sorted_classes <- c(sorted_classes, classes[idx])
  }
  return(sorted_classes)
}

# Remove an S4 class from a package loaded by devtools
#
# For classes loaded with devtools, this is necessary so that R doesn't try to
# modify superclasses that don't have references to this class. For example,
# suppose you have package pkgA with class A, and pkgB with class B, which
# contains A. If pkgB is loaded with load_all(), then class B will have a
# reference to class A, and unloading pkgB the normal way, with
# unloadNamespace("pkgB"), will result in some errors.
They happen because R # will look at B, see that it is a superclass of A, then it will try to modify # A by removing subclass references to B. # # This function sidesteps the problem by modifying B. It finds all the classes # in B@contains which also have references back to B, then modifes B to keep # references to those classes, but remove references to all other classes. # Finally, it removes B. Calling removeClass("B") tells the classes referred to # in B@contains to remove their references back to B. # # It is entirely possible that this code is necessary only because of bugs in # R's S4 implementation. # # @param classname The name of the class. # @param pkg The package object which contains the class. remove_s4_class <- function(classname, pkg) { pkg <- as.package(pkg) nsenv <- ns_env(pkg) # Make a copy of the class class <- methods::getClassDef(classname, package = pkg$package, inherits = FALSE) # Find all the references to classes that (this one contains/extends AND # have backreferences to this class) so that R doesn't try to modify them. keep_idx <- contains_backrefs(classname, pkg$package, class@contains) class@contains <- class@contains[keep_idx] # Assign the modified class back into the package methods::assignClassDef(classname, class, where = nsenv) # Remove the class. methods::removeClass(classname, where = nsenv) } # Given a list of SClassExtension objects, this returns a logical vector of the # same length. Each element is TRUE if the corresponding object has a reference # to this class, FALSE otherwise. contains_backrefs <- function(classname, pkgname, contains) { # If class_a in pkg_a has class_b in pkg_b as a subclass, return TRUE, # otherwise FALSE. 
has_subclass_ref <- function(class_a, pkg_a, class_b, pkg_b) { x <- methods::getClassDef(class_a, package = pkg_a) if (is.null(x)) return(FALSE) subclass_ref <- x@subclasses[[class_b]] if (!is.null(subclass_ref) && subclass_ref@package == pkg_b) { return(TRUE) } FALSE } if (length(contains) == 0) { return() } # Get a named vector of 'contains', where each item's name is the class, # and the value is the package. contain_pkgs <- sapply(contains, "slot", "package") mapply(has_subclass_ref, names(contain_pkgs), contain_pkgs, classname, pkgname) } devtools/R/package-env.r0000644000176200001440000000552313200623655014665 0ustar liggesusers# Create the package environment where exported objects will be copied to attach_ns <- function(pkg = ".") { pkg <- as.package(pkg) nsenv <- ns_env(pkg) if (is_attached(pkg)) { stop("Package ", pkg$package, " is already attached.") } # This should be similar to attachNamespace pkgenv <- base::attach(NULL, name = pkg_env_name(pkg)) attr(pkgenv, "path") <- getNamespaceInfo(nsenv, "path") } # Invoke namespace load actions. 
According to the documentation for setLoadActions
# these actions should be called at the end of processing of S4 metadata, after
# dynamically linking any libraries, the call to .onLoad(), if any, and caching
# method and class definitions, but before the namespace is sealed
run_ns_load_actions <- function(pkg = ".") {
  nsenv <- ns_env(pkg)
  actions <- methods::getLoadActions(nsenv)
  for (action in actions) action(nsenv)
}

# Copy over the objects from the namespace env to the package env
export_ns <- function(pkg = ".") {
  pkg <- as.package(pkg)
  nsenv <- ns_env(pkg)
  pkgenv <- pkg_env(pkg)

  nsInfo <- parse_ns_file(pkg)
  exports <- getNamespaceExports(nsenv)
  importIntoEnv(pkgenv, exports, nsenv, exports)

  # If lazydata is true, manually copy data objects in $lazydata to package
  # environment
  if (!is.null(pkg$lazydata) && tolower(pkg$lazydata) %in% c("true", "yes")) {
    copy_env(src = nsenv$.__NAMESPACE__.$lazydata, dest = pkgenv)
  }
}

#' Return package environment
#'
#' This is an environment like \code{<package:pkgname>}. The package
#' environment contains the exported objects from a package. It is
#' attached, so it is an ancestor of \code{R_GlobalEnv}.
#'
#' When a package is loaded the normal way, using \code{\link{library}},
#' this environment contains only the exported objects from the
#' namespace. However, when loaded with \code{\link{load_all}}, this
#' environment will contain all the objects from the namespace, unless
#' \code{load_all} is used with \code{export_all = FALSE}.
#'
#' If the package is not attached, this function returns \code{NULL}.
#'
#' @param pkg package description, can be path or package name. See
#'   \code{\link{as.package}} for more information
#' @keywords internal
#' @seealso \code{\link{ns_env}} for the namespace environment that
#'   contains all the objects (exported and not exported).
#' @seealso \code{\link{imports_env}} for the environment that contains
#'   imported objects for the package.
#' @export pkg_env <- function(pkg = ".") { pkg <- as.package(pkg) name <- pkg_env_name(pkg) if (!is_attached(pkg)) return(NULL) as.environment(name) } # Generate name of package environment # Contains exported objects pkg_env_name <- function(pkg = ".") { pkg <- as.package(pkg) paste("package:", pkg$package, sep = "") } # Reports whether a package is loaded and attached is_attached <- function(pkg = ".") { pkg_env_name(pkg) %in% search() } devtools/R/env-utils.r0000644000176200001440000000072213200623655014426 0ustar liggesusers# Copy all objects from one environment to another. # # Returns the destination environment # # @param dest Destination environment. If not specified, create a new # environment. # @param ignore Names of objects that should not be copied. copy_env <- function(src, dest = new.env(parent = emptyenv()), ignore = NULL) { srclist <- as.list(src, all.names = TRUE) srclist <- srclist[ !(names(srclist) %in% ignore) ] list2env(srclist, envir = dest) dest } devtools/R/decompress.r0000644000176200001440000000543313145046770014655 0ustar liggesusers# Decompress pkg, if needed source_pkg_info <- function(path, subdir = NULL) { if (!file.info(path)$isdir) { bundle <- path outdir <- tempfile(pattern = "devtools") dir.create(outdir) path <- decompress(path, outdir) } else { bundle <- NULL } pkg_path <- if (is.null(subdir)) path else file.path(path, subdir) # Check it's an R package if (!file.exists(file.path(pkg_path, "DESCRIPTION"))) { stop("Does not appear to be an R package (no DESCRIPTION)", call. 
= FALSE) } list(pkg_path = pkg_path, bundle = bundle) } is_source_pkg <- function(path, subdir = NULL) { tryCatch({ source_pkg_info(path = path, subdir = subdir) TRUE }, error = function(e) return(FALSE)) } source_pkg <- function(path, subdir = NULL, before_install = NULL) { info <- source_pkg_info(path = path, subdir = subdir) # Check configure is executable if present config_path <- file.path(info$pkg_path, "configure") if (file.exists(config_path)) { Sys.chmod(config_path, "777") } # Call before_install for bundles (if provided) if (!is.null(info$bundle) && !is.null(before_install)) before_install(info$bundle, info$pkg_path) info$pkg_path } decompress <- function(src, target) { stopifnot(file.exists(src)) if (grepl("\\.zip$", src)) { my_unzip(src, target) outdir <- getrootdir(as.vector(utils::unzip(src, list = TRUE)$Name)) } else if (grepl("\\.tar$", src)) { utils::untar(src, exdir = target) outdir <- getrootdir(utils::untar(src, list = TRUE)) } else if (grepl("\\.(tar\\.gz|tgz)$", src)) { utils::untar(src, exdir = target, compressed = "gzip") outdir <- getrootdir(utils::untar(src, compressed = "gzip", list = TRUE)) } else if (grepl("\\.(tar\\.bz2|tbz)$", src)) { utils::untar(src, exdir = target, compressed = "bzip2") outdir <- getrootdir(utils::untar(src, compressed = "bzip2", list = TRUE)) } else { ext <- gsub("^[^.]*\\.", "", src) stop("Don't know how to decompress files with extension ", ext, call. 
= FALSE)
  }
  file.path(target, outdir)
}

# Returns everything before the last slash in a filename
# getdir("path/to/file") returns "path/to"
# getdir("path/to/dir/") returns "path/to/dir"
getdir <- function(path) sub("/[^/]*$", "", path)

# Given a list of files, returns the root (the topmost folder)
# getrootdir(c("path/to/file", "path/to/other/thing")) returns "path/to"
getrootdir <- function(file_list) {
  slashes <- nchar(gsub("[^/]", "", file_list))
  if (min(slashes) == 0) return("")
  getdir(file_list[which.min(slashes)])
}

my_unzip <- function(src, target, unzip = getOption("unzip")) {
  if (unzip == "internal") {
    return(utils::unzip(src, exdir = target))
  }

  args <- paste(
    "-oq", shQuote(src),
    "-d", shQuote(target)
  )

  system_check(unzip, args, quiet = TRUE)
}

devtools/R/namespace-env.r

#' Return the namespace environment for a package.
#'
#' Contains all (exported and non-exported) objects, and is a descendant of
#' \code{R_GlobalEnv}. The hierarchy is \code{<namespace:pkgname>},
#' \code{<imports:pkgname>}, \code{<namespace:base>}, and then
#' \code{R_GlobalEnv}.
#'
#' If the package is not loaded, this function returns \code{NULL}.
#'
#' @param pkg package description, can be path or package name. See
#'   \code{\link{as.package}} for more information
#' @keywords internal
#' @seealso \code{\link{pkg_env}} for the attached environment that
#'   contains the exported objects.
#' @seealso \code{\link{imports_env}} for the environment that contains
#'   imported objects for the package.
#' @export ns_env <- function(pkg = ".") { pkg <- as.package(pkg) if (!is_loaded(pkg)) return(NULL) asNamespace(pkg$package) } # Create the namespace environment for a package create_ns_env <- function(pkg = ".") { pkg <- as.package(pkg) if (is_loaded(pkg)) { stop("Namespace for ", pkg$package, " already exists.") } env <- makeNamespace(pkg$package, pkg$version) methods::setPackageName(pkg$package, env) # Create devtools metadata in namespace create_dev_meta(pkg$package) setNamespaceInfo(env, "path", pkg$path) setup_ns_imports(pkg) env } # This is taken directly from base::loadNamespace() # https://github.com/wch/r-source/blob/tags/R-3-3-0/src/library/base/R/namespace.R#L235-L258 onload_assign("makeNamespace", eval( modify_lang( extract_lang(body(loadNamespace), # Find makeNamespace definition comp_lang, y = quote(makeNamespace <- NULL), idx = 1:2)[[3]], # Replace call to .Internal(registerNamespace()) is replaced by a call to # register_namespace function(x) { if (comp_lang(x, quote(.Internal(registerNamespace(name, env))))) { quote(register_namespace(name, env)) } else { x } })) ) # Read the NAMESPACE file and set up the imports metdata. # (which is stored in .__NAMESPACE__.) setup_ns_imports <- function(pkg = ".") { pkg <- as.package(pkg) nsInfo <- parse_ns_file(pkg) setNamespaceInfo(pkg$package, "imports", nsInfo$imports) } # Read the NAMESPACE file and set up the exports metdata. This must be # run after all the objects are loaded into the namespace because # namespaceExport throw errors if the objects are not present. setup_ns_exports <- function(pkg = ".", export_all = FALSE) { pkg <- as.package(pkg) nsInfo <- parse_ns_file(pkg) nsenv <- ns_env(pkg) if (export_all) { exports <- ls(nsenv, all.names = TRUE) # Make sure to re-export objects that are imported from other packages but # not copied. exports <- union(exports, nsInfo$exports) # List of things to ignore is from loadNamespace. There are also a # couple things to ignore from devtools. 
ignoreidx <- exports %in% c( ".__NAMESPACE__.", ".__S3MethodsTable__.", ".packageName", ".First.lib", ".onLoad", ".onAttach", ".conflicts.OK", ".noGenerics", ".__DEVTOOLS__", ".cache") exports <- exports[!ignoreidx] } else { # This code is from base::loadNamespace exports <- nsInfo$exports for (p in nsInfo$exportPatterns) exports <- c(ls(nsenv, pattern = p, all.names = TRUE), exports) exports <- add_classes_to_exports(ns = nsenv, package = pkg$package, exports = exports, nsInfo = nsInfo) } # Don't try to export objects that are missing from the namespace and imports ns_and_imports <- c(ls(nsenv, all.names = TRUE), ls(imports_env(pkg), all.names = TRUE)) extra_exports <- setdiff(exports, ns_and_imports) if (length(extra_exports) > 0) { warning("Objects listed as exports, but not present in namespace: ", paste(extra_exports, collapse = ", ")) exports <- intersect(ns_and_imports, exports) } # Update the exports metadata for the namespace with base::namespaceExport # It will throw warnings if objects are already listed in the exports # metadata, so catch those warnings and ignore them. suppressWarnings(namespaceExport(nsenv, exports)) invisible() } # Lookup S4 classes for export # # This function uses code from base::loadNamespace. Previously this code was # copied directly, now it is dynamically looked up instead, to prevent drift as # base::loadNamespace changes. onload_assign("add_classes_to_exports", make_function(alist(ns =, package =, exports =, nsInfo =), call("{", extract_lang( f = comp_lang, y = quote(if (.isMethodsDispatchOn() && .hasS4MetaData(ns) && !identical(package, "methods")) NULL), idx = 1:2, modify_lang(body(base::loadNamespace), strip_internal_calls, "methods")), quote(exports)), asNamespace("methods"))) #' Parses the NAMESPACE file for a package #' #' @param pkg package description, can be path or package name. 
See #' \code{\link{as.package}} for more information #' @examples #' if (has_tests()) { #' parse_ns_file(devtest("testLoadHooks")) #' } #' @keywords internal #' @export parse_ns_file <- function(pkg = ".") { pkg <- as.package(pkg) parseNamespaceFile(basename(pkg$path), dirname(pkg$path), mustExist = FALSE) } # Register the S3 methods for this package register_s3 <- function(pkg = ".") { pkg <- as.package(pkg) nsInfo <- parse_ns_file(pkg) # Adapted from loadNamespace registerS3methods(nsInfo$S3methods, pkg$package, ns_env(pkg)) } # Reports whether a package is loaded into a namespace. It may be # attached or not attached. is_loaded <- function(pkg = ".") { pkg <- as.package(pkg) pkg$package %in% loadedNamespaces() } # Returns the namespace registry ns_registry <- function() { (get(".Internal", envir = baseenv(), mode = "function"))(getNamespaceRegistry()) } # To avoid a note about getNamespaceRegistry being missing utils::globalVariables("getNamespaceRegistry") # Register a namespace register_namespace <- function(name = NULL, env = NULL) { # Be careful about what we allow if (!is.character(name) || name == "" || length(name) != 1) stop("'name' must be a non-empty character string.") if (!is.environment(env)) stop("'env' must be an environment.") if (name %in% loadedNamespaces()) stop("Namespace ", name, " is already registered.") # Add the environment to the registry nsr <- ns_registry() nsr[[name]] <- env env } # unregister a namespace - should be used only if unloadNamespace() # fails for some reason unregister_namespace <- function(name = NULL) { # Be careful about what we allow if (!is.character(name) || name == "" || length(name) != 1) stop("'name' must be a non-empty character string.") if (!(name %in% loadedNamespaces())) stop(name, " is not a registered namespace.") # Remove the item from the registry do.call(rm, args = list(name, envir = ns_registry())) invisible() } devtools/R/compile-dll.r0000644000176200001440000000717613200623655014713 0ustar 
liggesusers#' Compile a .dll/.so from source. #' #' \code{compile_dll} performs a fake R CMD install so code that #' works here should work with a regular install (and vice versa). #' #' During compilation, debug flags are set with #' \code{\link{compiler_flags}(TRUE)}. #' #' Invisibly returns the names of the DLL. #' #' @note If this is used to compile code that uses Rcpp, you will need to #' add the following line to your \code{Makevars} file so that it #' knows where to find the Rcpp headers: #' \code{PKG_CPPFLAGS=`$(R_HOME)/bin/Rscript -e 'Rcpp:::CxxFlags()'`} #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @param quiet if \code{TRUE} suppresses output from this function. #' @seealso \code{\link{clean_dll}} to delete the compiled files. #' @export compile_dll <- function(pkg = ".", quiet = FALSE) { pkg <- as.package(pkg) old <- withr_with_envvar(compiler_flags(TRUE), { if (!needs_compile(pkg)) return(invisible()) compile_rcpp_attributes(pkg) # Mock install the package to generate the DLL if (!quiet) message("Re-compiling ", pkg$package) install_dir <- tempfile("devtools_install_") dir.create(install_dir) inst <- install_min(pkg, install_dir, components = "libs", args = if (needs_clean(pkg)) "--preclean", quiet = quiet) invisible(dll_path(pkg)) }, "prefix") } #' Remove compiled objects from /src/ directory #' #' Invisibly returns the names of the deleted files. #' #' @param pkg package description, can be path or package name. 
See #' \code{\link{as.package}} for more information #' @seealso \code{\link{compile_dll}} #' @export clean_dll <- function(pkg = ".") { pkg <- as.package(pkg) # Clean out the /src/ directory files <- dir(file.path(pkg$path, "src"), pattern = sprintf("\\.(o|sl|so|dylib|a|dll)$|(%s\\.def)$", pkg$package), full.names = TRUE, recursive = TRUE) unlink(files) invisible(files) } # Returns the full path and name of the DLL file dll_path <- function(pkg = ".") { pkg <- as.package(pkg) name <- paste(pkg$package, .Platform$dynlib.ext, sep = "") file.path(pkg$path, "src", name) } mtime <- function(x) { x <- x[file.exists(x)] if (length(x) == 0) return(NULL) max(file.info(x)$mtime) } # List all source files in the package sources <- function(pkg = ".") { pkg <- as.package(pkg) srcdir <- file.path(pkg$path, "src") dir(srcdir, "\\.(c.*|f)$", recursive = TRUE, full.names = TRUE) } # List all header files in the package headers <- function(pkg = ".") { pkg <- as.package(pkg) incldir <- file.path(pkg$path, "inst", "include") srcdir <- file.path(pkg$path, "src") c( dir(srcdir, "^Makevars.*$", recursive = TRUE, full.names = TRUE), dir(srcdir, "\\.h.*$", recursive = TRUE, full.names = TRUE), dir(incldir, "\\.h.*$", recursive = TRUE, full.names = TRUE) ) } # Does the package need recompiling? # (i.e. is there a source or header file newer than the dll) needs_compile <- function(pkg = ".") { pkg <- as.package(pkg) source <- mtime(c(sources(pkg), headers(pkg))) # no source files, so doesn't need compile if (is.null(source)) return(FALSE) dll <- mtime(dll_path(pkg)) # no dll, so needs compile if (is.null(dll)) return(TRUE) source > dll } # Does the package need a clean compile? # (i.e. 
is there a header or Makevars newer than the dll) needs_clean <- function(pkg = ".") { pkg <- as.package(pkg) headers <- mtime(headers(pkg)) # no headers, so never needs clean compile if (is.null(headers)) return(FALSE) dll <- mtime(dll_path(pkg)) # no dll, so needs compile if (is.null(dll)) return(TRUE) headers > dll } devtools/R/utils.r0000644000176200001440000001534313200625264013643 0ustar liggesusers# Given the name or vector of names, returns a named vector reporting # whether each exists and is a directory. dir.exists <- function(x) { res <- file.exists(x) & file.info(x)$isdir stats::setNames(res, x) } compact <- function(x) { is_empty <- vapply(x, function(x) length(x) == 0, logical(1)) x[!is_empty] } "%||%" <- function(a, b) if (!is.null(a)) a else b "%:::%" <- function(p, f) { get(f, envir = asNamespace(p)) } rule <- function(..., pad = "-") { if (nargs() == 0) { title <- "" } else { title <- paste0(..., " ") } width <- max(getOption("width") - nchar(title) - 1, 0) message(title, paste(rep(pad, width, collapse = ""))) } # check whether the specified file ends with newline ends_with_newline <- function(path) { conn <- file(path, open = "rb", raw = TRUE) on.exit(close(conn)) seek(conn, where = -1, origin = "end") lastByte <- readBin(conn, "raw", n = 1) lastByte == 0x0a } render_template <- function(name, data = list()) { path <- system.file("templates", name, package = "devtools") template <- readLines(path) whisker::whisker.render(template, data) } is_installed <- function(pkg, version = 0) { installed_version <- tryCatch(utils::packageVersion(pkg), error = function(e) NA) !is.na(installed_version) && installed_version >= version } check_bioconductor <- function() { if (is_installed("BiocInstaller")) { return() } msg <- paste0("'BiocInstaller' must be installed to install Bioconductor packages") if (!interactive()) { stop(msg, call. = FALSE) } message( msg, ".\n", "Would you like to install it? ", "This will source ." 
)
  if (menu(c("Yes", "No")) != 1) {
    stop("'BiocInstaller' not installed", call. = FALSE)
  }

  suppressMessages(
    source("https://bioconductor.org/biocLite.R")
  )
}

check_suggested <- function(pkg, version = NULL, compare = NA) {
  if (is.null(version)) {
    if (!is.na(compare)) {
      stop("Cannot set ", sQuote(compare), " without setting ",
           sQuote(version), call. = FALSE)
    }

    dep <- suggests_dep(pkg)
    version <- dep$version
    compare <- dep$compare
  }

  if (!is_installed(pkg) || !check_dep_version(pkg, version, compare)) {
    msg <- paste0(sQuote(pkg),
                  if (is.na(version)) "" else paste0(" >= ", version),
                  " must be installed for this functionality.")

    if (interactive()) {
      message(msg, "\nWould you like to install it?")
      if (menu(c("Yes", "No")) == 1) {
        install.packages(pkg)
      } else {
        stop(msg, call. = FALSE)
      }
    } else {
      stop(msg, call. = FALSE)
    }
  }
}

suggests_dep <- function(pkg) {
  suggests <- read_dcf(system.file("DESCRIPTION", package = "devtools"))$Suggests
  deps <- parse_deps(suggests)

  found <- which(deps$name == pkg)[1L]
  if (is.na(found)) {
    stop(sQuote(pkg), " is not in Suggests: for devtools!", call. = FALSE)
  }
  deps[found, ]
}

read_dcf <- function(path) {
  fields <- colnames(read.dcf(path))
  as.list(read.dcf(path, keep.white = fields)[1, ])
}

write_dcf <- function(path, desc) {
  desc <- unlist(desc)

  # Add back in continuation characters
  desc <- gsub("\n[ \t]*\n", "\n .\n ", desc, perl = TRUE, useBytes = TRUE)
  desc <- gsub("\n \\.([^\n])", "\n .\\1", desc, perl = TRUE, useBytes = TRUE)

  starts_with_whitespace <- grepl("^\\s", desc, perl = TRUE, useBytes = TRUE)
  delimiters <- ifelse(starts_with_whitespace, ":", ": ")

  text <- paste0(names(desc), delimiters, desc, collapse = "\n")

  # If the description file has a declared encoding, set it so nchar() works
  # properly.
  if ("Encoding" %in% names(desc)) {
    Encoding(text) <- desc[["Encoding"]]
  }

  if (substr(text, nchar(text), nchar(text)) != "\n") {
    text <- paste0(text, "\n")
  }

  cat(text, file = path)
}

dots <- function(...)
{ eval(substitute(alist(...))) } first_upper <- function(x) { substr(x, 1, 1) <- toupper(substr(x, 1, 1)) x } download <- function(path, url, ...) { request <- httr::GET(url, ...) httr::stop_for_status(request) writeBin(httr::content(request, "raw"), path) path } download_text <- function(url, ...) { request <- httr::GET(url, ...) httr::stop_for_status(request) httr::content(request, "text") } last <- function(x) x[length(x)] # Modified version of utils::file_ext. Instead of always returning the text # after the last '.', as in "foo.tar.gz" => ".gz", if the text that directly # precedes the last '.' is ".tar", it will include also, so # "foo.tar.gz" => ".tar.gz" file_ext <- function (x) { pos <- regexpr("\\.((tar\\.)?[[:alnum:]]+)$", x) ifelse(pos > -1L, substring(x, pos + 1L), "") } is_bioconductor <- function(x) { x$package != "BiocInstaller" && !is.null(x$biocviews) } trim_ws <- function(x) { gsub("^[[:space:]]+|[[:space:]]+$", "", x) } is_dir <- function(x) file.info(x)$isdir indent <- function(x, spaces = 4) { ind <- paste(rep(" ", spaces), collapse = "") paste0(ind, gsub("\n", paste0("\n", ind), x, fixed = TRUE)) } is_windows <- isTRUE(.Platform$OS.type == "windows") all_named <- function (x) { if (length(x) == 0) return(TRUE) !is.null(names(x)) && all(names(x) != "") } make_function <- function (args, body, env = parent.frame()) { args <- as.pairlist(args) stopifnot(all_named(args), is.language(body)) eval(call("function", args, body), env) } comp_lang <- function(x, y, idx = seq_along(y)) { if (is.symbol(x) || is.symbol(y)) { return(identical(x, y)) } if (length(x) < length(idx)) return(FALSE) identical(x[idx], y[idx]) } extract_lang <- function(x, f, ...) 
{ recurse <- function(y) { unlist(compact(lapply(y, extract_lang, f = f, ...)), recursive = FALSE) } # if x matches predicate return it if (isTRUE(f(x, ...))) { return(x) } if (is.call(x)) { res <- recurse(x)[[1]] if (top_level_call <- identical(sys.call()[[1]], as.symbol("extract_lang")) && is.null(res)) { warning("Devtools is incompatible with the current version of R. `load_all()` may function incorrectly.") } return(res) } NULL } modify_lang <- function(x, f, ...) { recurse <- function(x) { lapply(x, modify_lang, f = f, ...) } x <- f(x, ...) if (is.call(x)) { as.call(recurse(x)) } else if (is.function(x)) { formals(x) <- modify_lang(formals(x), f, ...) body(x) <- modify_lang(body(x), f, ...) } else { x } } strip_internal_calls <- function(x, package) { if (is.call(x) && identical(x[[1L]], as.name(":::")) && identical(x[[2L]], as.name(package))) { x[[3L]] } else { x } } sort_ci <- function(x) { withr::with_collate("C", x[order(tolower(x), x)]) } comma <- function(x, at_most = 20) { if (length(x) > at_most) { x <- c(x[seq_len(at_most)], "...") } paste(x, collapse = ", ") } menu <- function(...) { utils::menu(...) } devtools/R/load-depends.r0000644000176200001440000000053713200623655015043 0ustar liggesusersload_depends <- function(pkg = ".") { pkg <- as.package(pkg) # Get data frame of dependency names and versions deps <- parse_deps(pkg$depends) if (is.null(deps) || nrow(deps) == 0) return(invisible()) mapply(check_dep_version, deps$name, deps$version, deps$compare) lapply(deps$name, require, character.only = TRUE) invisible(deps) } devtools/R/run-examples.r0000644000176200001440000000532213200623655015121 0ustar liggesusers#' Run all examples in a package. #' #' One of the most frustrating parts of `R CMD check` is getting all of your #' examples to pass - whenever one fails you need to fix the problem and then #' restart the whole process. This function makes it a little easier by #' making it possible to run all examples from an R function. 
#' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @param start Where to start running the examples: this can either be the #' name of \code{Rd} file to start with (with or without extensions), or #' a topic name. If omitted, will start with the (lexicographically) first #' file. This is useful if you have a lot of examples and don't want to #' rerun them every time you fix a problem. #' @family example functions #' @param show if \code{TRUE}, code in \code{\\dontshow{}} will be commented #' out #' @param test if \code{TRUE}, code in \code{\\donttest{}} will be commented #' out. If \code{FALSE}, code in \code{\\testonly{}} will be commented out. #' @param run if \code{TRUE}, code in \code{\\dontrun{}} will be commented #' out. #' @param fresh if \code{TRUE}, will be run in a fresh R session. This has #' the advantage that there's no way the examples can depend on anything in #' the current session, but interactive code (like \code{\link{browser}}) #' won't work. #' @keywords programming #' @export run_examples <- function(pkg = ".", start = NULL, show = TRUE, test = FALSE, run = TRUE, fresh = FALSE) { pkg <- as.package(pkg) document(pkg) files <- rd_files(pkg) if (!is.null(start)) { start_path <- find_pkg_topic(pkg, start) if (is.null(start_path)) { stop("Couldn't find start position ", start, call. 
= FALSE) } start_pos <- which(names(files) == start_path) if (length(start_pos) == 1) { files <- files[- seq(1, start_pos - 1)] } } if (length(files) == 0) return() rule("Running ", length(files), " example files in ", pkg$package) if (fresh) { to_run <- substitute(devtools::run_examples(path), list(path = pkg$path)) eval_clean(to_run) } else { load_all(pkg, reset = TRUE, export_all = FALSE) on.exit(load_all(pkg, reset = TRUE)) lapply(files, run_example, show = show, test = test, run = run) } invisible() } # If an error occurs, should print out the suspect line of code, and offer # the following options: # * skip to the next example # * quit # * browser # * rerun example and rerun # * reload code and rerun rd_files <- function(pkg) { path_man <- file.path(pkg$path, "man") files <- dir(path_man, pattern = "\\.[Rr]d$", full.names = TRUE) names(files) <- basename(files) sort_ci(files) } devtools/R/load-code.r0000644000176200001440000000320713200623655014330 0ustar liggesusers#' Load R code. #' #' Load all R code in the \code{R} directory. The first time the code is #' loaded, \code{.onLoad} will be run if it exists. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @keywords programming #' @export load_code <- function(pkg = ".") { pkg <- as.package(pkg) env <- ns_env(pkg) r_files <- find_code(pkg) paths <- changed_files(r_files) if (length(paths) == 0L) return() success <- FALSE cleanup <- function() { if (success) return() clear_cache() unload(pkg) } on.exit(cleanup()) withr_with_dir(file.path(pkg$path), source_many(paths, env)) success <- TRUE invisible(r_files) } # Parse collate string into vector of function names. parse_collate <- function(string) { con <- textConnection(string) on.exit(close(con)) scan(con, "character", sep = " ", quiet = TRUE) } # Find all R files in given directory. 
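# Illustrative sketch of the collate handling above (not executed; the
# Collate field and file names here are hypothetical): given a DESCRIPTION
# Collate field of "'zzz.r' 'aaa.r'", parse_collate() returns
# c("zzz.r", "aaa.r") because scan() splits on spaces and honours the
# quoting; find_code() below then puts those files first in the result,
# messaging about any collate entries missing on disk and any R files
# missing from the Collate field.
#
#   parse_collate("'zzz.r' 'aaa.r'")
#   #> [1] "zzz.r" "aaa.r"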
find_code <- function(pkg = ".") { path_r <- file.path(pkg$path, "R") r_files <- withr_with_collate( "C", tools::list_files_with_type(path_r, "code", full.names = TRUE) ) if (!is.null(pkg$collate)) { collate <- file.path(path_r, parse_collate(pkg$collate)) missing <- setdiff(collate, r_files) files <- function(x) paste(basename(x), collapse = ", ") if (length(missing) > 0) { message("Skipping missing files: ", files(missing)) } collate <- setdiff(collate, missing) extra <- setdiff(r_files, collate) if (length(extra) > 0) { message("Adding files missing in collate: ", files(extra)) } r_files <- union(collate, r_files) } r_files } devtools/R/show-news.r0000644000176200001440000000141312656131112014424 0ustar liggesusers#' Show package news #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @param latest if \code{TRUE}, only show the news for the most recent #' version. #' @param ... other arguments passed on to \code{news} #' @export show_news <- function(pkg = ".", latest = TRUE, ...) { pkg <- as.package(pkg) news_path <- file.path(pkg$path, "NEWS") if (!file.exists(news_path)) { stop("No NEWS found", call. = FALSE) } out <- utils::news(..., db = ("tools" %:::% ".news_reader_default")(news_path)) if (latest) { ver <- numeric_version(out$Version) recent <- ver == max(ver) structure(out[recent, ], class = class(out), bad = attr(out, "bad")[recent]) } else { out } } devtools/R/install-github.r0000644000176200001440000002276313200623655015437 0ustar liggesusers#' Attempts to install a package directly from GitHub. #' #' This function is vectorised on \code{repo} so you can install multiple #' packages in a single command. #' #' @param repo Repository address in the format #' \code{username/repo[/subdir][@@ref|#pull]}. Alternatively, you can #' specify \code{subdir} and/or \code{ref} using the respective parameters #' (see below); if both are specified, the values in \code{repo} take #' precedence. 
#' @param username User name. Deprecated: please include username in the #' \code{repo} #' @param ref Desired git reference. Could be a commit, tag, or branch #' name, or a call to \code{\link{github_pull}}. Defaults to \code{"master"}. #' @param subdir subdirectory within repo that contains the R package. #' @param auth_token To install from a private repo, generate a personal #' access token (PAT) in \url{https://github.com/settings/tokens} and #' supply to this argument. This is safer than using a password because #' you can easily delete a PAT without affecting any others. Defaults to #' the \code{GITHUB_PAT} environment variable. #' @param host GitHub API host to use. Override with your GitHub enterprise #' hostname, for example, \code{"github.hostname.com/api/v3"}. #' @param quiet if \code{TRUE} suppresses output from this function. #' @param ... Other arguments passed on to \code{\link{install}}. #' @details #' Attempting to install from a source repository that uses submodules #' raises a warning. Because the zipped sources provided by GitHub do not #' include submodules, this may lead to unexpected behaviour or compilation #' failure in source packages. In this case, cloning the repository manually #' using \code{\link{install_git}} with \code{args="--recursive"} may yield #' better results. #' @export #' @family package installation #' @seealso \code{\link{github_pull}} #' @examples #' \dontrun{ #' install_github("klutometis/roxygen") #' install_github("wch/ggplot2") #' install_github(c("rstudio/httpuv", "rstudio/shiny")) #' install_github(c("hadley/httr@@v0.4", "klutometis/roxygen#142", #' "mfrasca/r-logging/pkg")) #' #' # Update devtools to the latest version, on Linux and Mac #' # On Windows, this won't work - see ?build_github_devtools #' install_github("hadley/devtools") #' #' # To install from a private repo, use auth_token with a token #' # from https://github.com/settings/tokens. You only need the #' # repo scope. 
Best practice is to save your PAT in env var called #' # GITHUB_PAT. #' install_github("hadley/private", auth_token = "abc") #' #' } install_github <- function(repo, username = NULL, ref = "master", subdir = NULL, auth_token = github_pat(quiet), host = "https://api.github.com", quiet = FALSE, ...) { remotes <- lapply(repo, github_remote, username = username, ref = ref, subdir = subdir, auth_token = auth_token, host = host) install_remotes(remotes, quiet = quiet, ...) } github_remote <- function(repo, username = NULL, ref = NULL, subdir = NULL, auth_token = github_pat(), sha = NULL, host = "https://api.github.com") { meta <- parse_git_repo(repo) meta <- github_resolve_ref(meta$ref %||% ref, meta) if (is.null(meta$username)) { meta$username <- username %||% getOption("github.user") %||% stop("Unknown username.") warning("Username parameter is deprecated. Please use ", username, "/", repo, call. = FALSE) } remote("github", host = host, repo = meta$repo, subdir = meta$subdir %||% subdir, username = meta$username, ref = meta$ref, sha = sha, auth_token = auth_token ) } #' @export remote_download.github_remote <- function(x, quiet = FALSE) { dest <- tempfile(fileext = paste0(".zip")) if (missing_protocol <- !grepl("^[^:]+?://", x$host)) { x$host <- paste0("https://", x$host) } src_root <- paste0(x$host, "/repos/", x$username, "/", x$repo) src <- paste0(src_root, "/zipball/", x$ref) if (!quiet) { message("Downloading GitHub repo ", x$username, "/", x$repo, "@", x$ref, "\nfrom URL ", src) } if (!is.null(x$auth_token)) { auth <- httr::authenticate( user = x$auth_token, password = "x-oauth-basic", type = "basic" ) } else { auth <- NULL } if (github_has_remotes(x, auth)) warning("GitHub repo contains submodules, may not function as expected!", call. 
= FALSE) download_github(dest, src, auth) } github_has_remotes <- function(x, auth = NULL) { src_root <- paste0(x$host, "/repos/", x$username, "/", x$repo) src_submodules <- paste0(src_root, "/contents/.gitmodules?ref=", x$ref) response <- httr::HEAD(src_submodules, , auth) identical(httr::status_code(response), 200L) } #' @export remote_metadata.github_remote <- function(x, bundle = NULL, source = NULL) { # Determine sha as efficiently as possible if (!is.null(bundle)) { # Might be able to get from zip archive sha <- git_extract_sha1(bundle) } else { # Otherwise can lookup with remote_ls sha <- remote_sha(x) } list( RemoteType = "github", RemoteHost = x$host, RemoteRepo = x$repo, RemoteUsername = x$username, RemoteRef = x$ref, RemoteSha = sha, RemoteSubdir = x$subdir, # Backward compatibility for packrat etc. GithubRepo = x$repo, GithubUsername = x$username, GithubRef = x$ref, GithubSHA1 = sha, GithubSubdir = x$subdir ) } #' GitHub references #' #' Use as \code{ref} parameter to \code{\link{install_github}}. #' Allows installing a specific pull request or the latest release. 
#' #' @param pull The pull request to install #' @seealso \code{\link{install_github}} #' @rdname github_refs #' @export github_pull <- function(pull) structure(pull, class = "github_pull") #' @rdname github_refs #' @export github_release <- function() structure(NA_integer_, class = "github_release") github_resolve_ref <- function(x, params) UseMethod("github_resolve_ref") #' @export github_resolve_ref.default <- function(x, params) { params$ref <- x params } #' @export github_resolve_ref.NULL <- function(x, params) { params$ref <- "master" params } #' @export github_resolve_ref.github_pull <- function(x, params) { # GET /repos/:user/:repo/pulls/:number path <- file.path("repos", params$username, params$repo, "pulls", x) response <- github_GET(path) params$username <- response$head$user$login params$ref <- response$head$ref params } # Retrieve the ref for the latest release #' @export github_resolve_ref.github_release <- function(x, params) { # GET /repos/:user/:repo/releases path <- paste("repos", params$username, params$repo, "releases", sep = "/") response <- github_GET(path) if (length(response) == 0L) stop("No releases found for repo ", params$username, "/", params$repo, ".") params$ref <- response[[1L]]$tag_name params } # Parse concise git repo specification: [username/]repo[/subdir][#pull|@ref|@*release] # (the *release suffix represents the latest release) parse_git_repo <- function(path) { username_rx <- "(?:([^/]+)/)?" repo_rx <- "([^/@#]+)" subdir_rx <- "(?:/([^@#]*[^@#/]))?" 
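  # Illustrative parses of the concise spec (a sketch based on the
  # install_github() examples; not executed):
  #   "hadley/devtools"          -> username = "hadley", repo = "devtools"
  #   "hadley/httr@v0.4"         -> ...plus ref = "v0.4"
  #   "klutometis/roxygen#142"   -> ...plus pull = "142" (wrapped in github_pull())
  #   "mfrasca/r-logging/pkg"    -> ...plus subdir = "pkg"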
ref_rx <- "(?:@([^*].*))" pull_rx <- "(?:#([0-9]+))" release_rx <- "(?:@([*]release))" ref_or_pull_or_release_rx <- sprintf("(?:%s|%s|%s)?", ref_rx, pull_rx, release_rx) github_rx <- sprintf("^(?:%s%s%s%s|(.*))$", username_rx, repo_rx, subdir_rx, ref_or_pull_or_release_rx) param_names <- c("username", "repo", "subdir", "ref", "pull", "release", "invalid") replace <- stats::setNames(sprintf("\\%d", seq_along(param_names)), param_names) params <- lapply(replace, function(r) gsub(github_rx, r, path, perl = TRUE)) if (params$invalid != "") stop(sprintf("Invalid git repo: %s", path)) params <- params[sapply(params, nchar) > 0] if (!is.null(params$pull)) { params$ref <- github_pull(params$pull) params$pull <- NULL } if (!is.null(params$release)) { params$ref <- github_release() params$release <- NULL } params } #' @export remote_package_name.github_remote <- function(remote, url = "https://raw.githubusercontent.com", ...) { tmp <- tempfile() path <- paste(c( remote$username, remote$repo, remote$ref, remote$subdir, "DESCRIPTION"), collapse = "/") if (!is.null(remote$auth_token)) { auth <- httr::authenticate( user = remote$auth_token, password = "x-oauth-basic", type = "basic" ) } else { auth <- NULL } req <- httr::GET(url, path = path, httr::write_disk(path = tmp), auth) if (httr::status_code(req) >= 400) { return(NA_character_) } read_dcf(tmp)$Package } #' @export remote_sha.github_remote <- function(remote, url = "https://github.com", ...) { # If the remote ref is the same as the sha it is a pinned commit so just # return that. if (!is.null(remote$ref) && !is.null(remote$sha) && grepl(paste0("^", remote$ref), remote$sha)) { return(remote$sha) } tryCatch({ res <- git2r::remote_ls( paste0(url, "/", remote$username, "/", remote$repo, ".git"), ...) found <- grep(pattern = paste0("/", remote$ref), x = names(res)) if (length(found) == 0) { return(NA_character_) } unname(res[found[1]]) }, error = function(e) NA_character_) } #' @export format.github_remote <- function(x, ...) 
{
  "GitHub"
}

download_github <- function(path, url, ...) {
  request <- httr::GET(url, ...)
  if (httr::status_code(request) >= 400) {
    stop(github_error(request))
  }

  writeBin(httr::content(request, "raw"), path)
  path
}
devtools/R/install-cran.r0000644000176200001440000000427313200623655015074 0ustar liggesusers#' Attempts to install a package from CRAN.
#'
#' This function is vectorised on \code{pkgs} so you can install multiple
#' packages in a single command.
#'
#' @param pkgs Character vector of packages to install.
#' @inheritParams package_deps
#' @export
#' @family package installation
#' @examples
#' \dontrun{
#' install_cran("ggplot2")
#' install_cran(c("httpuv", "shiny"))
#' }
install_cran <- function(pkgs, repos = getOption("repos"),
                         type = getOption("pkgType"), ..., quiet = FALSE) {
  remotes <- lapply(pkgs, cran_remote, repos = repos, type = type)
  install_remotes(remotes, quiet = quiet, ...)
}

cran_remote <- function(pkg, repos, type) {
  remote("cran",
    name = pkg,
    repos = repos,
    pkg_type = type
  )
}

#' @export
#' @importFrom utils download.packages
remote_download.cran_remote <- function(x, quiet = FALSE) {
  dest_dir <- tempdir()

  # download.packages() doesn't fully respect "quiet" argument
  if (quiet) {
    sink_file <- tempfile()
    path <- withr::with_message_sink(
      sink_file,
      download_cran(x, quiet, dest_dir)
    )
  } else {
    path <- download_cran(x, quiet, dest_dir)
  }

  # Make sure we return a copy which can be deleted later on
  # (e.g., for local repositories)
  if (dirname(normalizePath(path)) != normalizePath(dest_dir)) {
    file.copy(path, dest_dir)
    path <- file.path(dest_dir, basename(path))
  }
  path
}

download_cran <- function(x, quiet, dest_dir) {
  download.packages(x$name, destdir = dest_dir, repos = x$repos,
    type = x$pkg_type, quiet = quiet)[1, 2]
}

#' @export
remote_metadata.cran_remote <- function(x, bundle = NULL, source = NULL) {
  version <- read_dcf(file.path(source, "DESCRIPTION"))$Version
  list(
    RemoteType = "cran",
    RemoteSha = trimws(version),
    RemoteRepos =
paste0(deparse(x$repos), collapse = ""),
    RemotePkgType = x$pkg_type
  )
}

#' @export
remote_package_name.cran_remote <- function(remote, ...) {
  remote$name
}

#' @export
remote_sha.cran_remote <- function(remote, url = "https://github.com", ...) {
  cran <- available_packages(remote$repos, remote$pkg_type)
  trimws(unname(cran[, "Version"][match(remote$name, rownames(cran))]))
}

#' @export
format.cran_remote <- function(x, ...) {
  "CRAN"
}
devtools/R/dev-example.r0000644000176200001440000000115613200623655014711 0ustar liggesusers#' Run examples for an in-development function.
#'
#' @inheritParams run_examples
#' @param topic Name or topic (or name of Rd) file to run examples for
#' @export
#' @family example functions
#' @examples
#' \dontrun{
#' # Runs installed example:
#' library("ggplot2")
#' example("ggplot")
#'
#' # Runs development example:
#' load_all("ggplot2")
#' dev_example("ggplot")
#' }
dev_example <- function(topic) {
  path <- find_topic(topic)
  if (is.null(path)) {
    stop("Can't find development example for topic ", topic, call. = FALSE)
  }

  pkg <- as.package(names(path)[[1]])
  load_all(pkg)
  run_example(path)
}
devtools/R/document.r0000644000176200001440000000246513200623655014322 0ustar liggesusers#' Use roxygen to document a package.
#'
#' This function is a wrapper for the \code{\link[roxygen2]{roxygenize}()}
#' function from the roxygen2 package. See the documentation and vignettes of
#' that package to learn how to use roxygen.
#'
#' @param pkg package description, can be path or package name. See
#'   \code{\link{as.package}} for more information
#' @param clean,reload Deprecated.
#' @inheritParams roxygen2::roxygenise
#' @seealso \code{\link[roxygen2]{roxygenize}},
#'   \code{browseVignettes("roxygen2")}
#' @export
document <- function(pkg = ".", clean = NULL, roclets = NULL, reload = TRUE) {
  check_suggested("roxygen2")

  if (!missing(clean)) {
    warning("`clean` is deprecated: roxygen2 now automatically cleans up",
      call.
= FALSE) } if (!missing(reload)) { warning("`reload` is deprecated: code is now always reloaded", call. = FALSE) } pkg <- as.package(pkg) message("Updating ", pkg$package, " documentation") load_all(pkg) if (packageVersion("roxygen2") > "4.1.1") { roclets <- roclets %||% roxygen2::load_options(pkg$path)$roclets # collate updated by load_all() roclets <- setdiff(roclets, "collate") } withr::with_envvar(r_env_vars(), roxygen2::roxygenise(pkg$path, roclets = roclets, load_code = ns_env) ) clear_topic_index(pkg) invisible() } devtools/R/lint.r0000644000176200001440000000204512721574111013445 0ustar liggesusers#' Lint all source files in a package. #' #' The default lintings correspond to the style guide at #' \url{http://r-pkgs.had.co.nz/r.html#style}, however it is possible to #' override any or all of them using the \code{linters} parameter. #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @param cache store the lint results so repeated lints of the same content #' use the previous results. #' @param ... additional arguments passed to \code{\link[lintr]{lint_package}} #' @seealso \code{\link[lintr]{lint_package}}, \code{\link[lintr]{lint}} #' @details #' The lintr cache is by default stored in \code{~/.R/lintr_cache/} (this can #' be configured by setting \code{options(lintr.cache_directory)}). It can be #' cleared by calling \code{\link[lintr]{clear_cache}}. #' @export lint <- function(pkg = ".", cache = TRUE, ...) { check_suggested("lintr") pkg <- as.package(pkg) message("Linting ", pkg$package, appendLF = FALSE) lintr::lint_package(pkg$path, cache = cache, ...) } devtools/R/infrastructure.R0000644000176200001440000005460713200623655015533 0ustar liggesusers#' Add useful infrastructure to a package. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information. 
#' @name infrastructure #' @family infrastructure NULL #' @section \code{use_testthat}: #' Add testing infrastructure to a package that does not already have it. #' This will create \file{tests/testthat.R}, \file{tests/testthat/} and #' add \pkg{testthat} to the suggested packages. This is called #' automatically from \code{\link{test}} if needed. #' @rdname infrastructure #' @export use_testthat <- function(pkg = ".") { pkg <- as.package(pkg) check_suggested("testthat") if (uses_testthat(pkg = pkg)) { message("* testthat is already initialized") return(invisible(TRUE)) } message("* Adding testthat to Suggests") add_desc_package(pkg, "Suggests", "testthat") use_directory("tests/testthat", pkg = pkg) use_template( "testthat.R", "tests/testthat.R", data = list(name = pkg$package), pkg = pkg ) invisible(TRUE) } #' @section \code{use_test}: #' Add a test file, also add testing infrastructure if necessary. #' This will create \file{tests/testthat/test-.R} with a user-specified #' name for the test. Will fail if the file exists. #' @rdname infrastructure #' @export use_test <- function(name, pkg = ".") { pkg <- as.package(pkg) check_suggested("testthat") if (!uses_testthat(pkg = pkg)) { use_testthat(pkg = pkg) } use_template("test-example.R", sprintf("tests/testthat/test-%s.R", name), data = list(test_name = name), open = TRUE, pkg = pkg ) invisible(TRUE) } #' @export #' @rdname infrastructure use_rstudio <- function(pkg = ".") { pkg <- as.package(pkg) use_template( "template.Rproj", paste0(pkg$package, ".Rproj"), pkg = pkg ) use_git_ignore(c(".Rproj.user", ".Rhistory", ".RData"), pkg = pkg) use_build_ignore(c("^.*\\.Rproj$", "^\\.Rproj\\.user$"), escape = FALSE, pkg = pkg) invisible(TRUE) } #' @section \code{use_vignette}: #' Adds needed packages to \code{DESCRIPTION}, and creates draft vignette #' in \code{vignettes/}. It adds \code{inst/doc} to \code{.gitignore} #' so you don't accidentally check in the built vignettes. 
#' @param name File name to use for new vignette. Should consist only of #' numbers, letters, _ and -. I recommend using lower case. #' @export #' @rdname infrastructure use_vignette <- function(name, pkg = ".") { pkg <- as.package(pkg) check_suggested("rmarkdown") add_desc_package(pkg, "Suggests", "knitr") add_desc_package(pkg, "Suggests", "rmarkdown") add_desc_package(pkg, "VignetteBuilder", "knitr") use_directory("vignettes", pkg = pkg) use_git_ignore("inst/doc", pkg = pkg) path <- file.path(pkg$path, "vignettes", paste0(name, ".Rmd")) rmarkdown::draft(path, "html_vignette", "rmarkdown", create_dir = FALSE, edit = FALSE) open_in_rstudio(path) } #' @section \code{use_rcpp}: #' Creates \code{src/} and adds needed packages to \code{DESCRIPTION}. #' @export #' @rdname infrastructure use_rcpp <- function(pkg = ".") { pkg <- as.package(pkg) check_suggested("Rcpp") message("Adding Rcpp to LinkingTo and Imports") add_desc_package(pkg, "LinkingTo", "Rcpp") add_desc_package(pkg, "Imports", "Rcpp") use_directory("src/", pkg = pkg) message("* Ignoring generated binary files.") ignore_path <- file.path(pkg$path, "src", ".gitignore") union_write(ignore_path, c("*.o", "*.so", "*.dll")) message( "Next, include the following roxygen tags somewhere in your package:\n\n", "#' @useDynLib ", pkg$package, "\n", "#' @importFrom Rcpp sourceCpp\n", "NULL\n\n", "Then run document()" ) } #' @rdname infrastructure #' @section \code{use_travis}: #' Add basic travis template to a package. Also adds \code{.travis.yml} to #' \code{.Rbuildignore} so it isn't included in the built package. #' @param browse open a browser window to enable Travis builds for the package #' automatically. 
#' @export #' @aliases add_travis use_travis <- function(pkg = ".", browse = interactive()) { pkg <- as.package(pkg) use_template("travis.yml", ".travis.yml", ignore = TRUE, pkg = pkg) gh <- github_info(pkg$path) travis_url <- file.path("https://travis-ci.org", gh$fullname) message("Next: \n", " * Add a travis shield to your README.md:\n", "[![Travis-CI Build Status]", "(https://travis-ci.org/", gh$fullname, ".svg?branch=master)]", "(https://travis-ci.org/", gh$fullname, ")\n", " * Turn on travis for your repo at ", travis_url, "\n" ) if (browse) { utils::browseURL(travis_url) } invisible(TRUE) } #' @rdname infrastructure #' @param type CI tool to use. Currently supports codecov and coverall. #' @section \code{use_coverage}: #' Add test code coverage to basic travis template to a package. #' @export use_coverage <- function(pkg = ".", type = c("codecov", "coveralls")) { pkg <- as.package(pkg) check_suggested("covr") path <- file.path(pkg$path, ".travis.yml") if (!file.exists(path)) { use_travis() } message("* Adding covr to Suggests") add_desc_package(pkg, "Suggests", "covr") gh <- github_info(pkg$path) type <- match.arg(type) message("Next:") switch(type, codecov = { use_template("codecov.yml", "codecov.yml", ignore = TRUE, pkg = pkg) message("* Add to `README.md`: \n", "[![Coverage Status]", "(https://img.shields.io/codecov/c/github/", gh$fullname, "/master.svg)]", "(https://codecov.io/github/", gh$fullname, "?branch=master)" ) message("* Add to `.travis.yml`:\n", "after_success:\n", " - Rscript -e 'covr::codecov()'" ) }, coveralls = { message("* Turn on coveralls for this repo at https://coveralls.io/repos/new") message("* Add to `README.md`: \n", "[![Coverage Status]", "(https://img.shields.io/coveralls/", gh$fullname, ".svg)]", "(https://coveralls.io/r/", gh$fullname, "?branch=master)" ) message("* Add to `.travis.yml`:\n", "after_success:\n", " - Rscript -e 'covr::coveralls()'" ) }) invisible(TRUE) } #' @rdname infrastructure #' @section \code{use_appveyor}: 
#' Add basic AppVeyor template to a package. Also adds \code{appveyor.yml} to #' \code{.Rbuildignore} so it isn't included in the built package. #' @export use_appveyor <- function(pkg = ".") { pkg <- as.package(pkg) use_template("appveyor.yml", ignore = TRUE, pkg = pkg) gh <- github_info(pkg$path) message("Next: \n", " * Turn on AppVeyor for this repo at https://ci.appveyor.com/projects\n", " * Add an AppVeyor shield to your README.md:\n", "[![AppVeyor Build Status]", "(https://ci.appveyor.com/api/projects/status/github/", gh$username, "/", gh$repo, "?branch=master&svg=true)]", "(https://ci.appveyor.com/project/", gh$username, "/", gh$repo, ")" ) invisible(TRUE) } #' @rdname infrastructure #' @section \code{use_package_doc}: #' Adds a roxygen template for package documentation #' @export use_package_doc <- function(pkg = ".") { pkg <- as.package(pkg) use_template( "packagename-package.r", file.path("R", paste(pkg$package, "-package.r", sep = "")), data = list(name = pkg$package), open = TRUE, pkg = pkg ) } #' Use specified package. #' #' This adds a dependency to DESCRIPTION and offers a little advice #' about how to best use it. #' #' @param package Name of package to depend on. #' @param type Type of dependency: must be one of "Imports", "Depends", #' "Suggests", "Enhances", or "LinkingTo" (or unique abbreviation) #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information. #' @family infrastructure #' @export #' @examples #' \dontrun{ #' use_package("ggplot2") #' use_package("dplyr", "suggests") #' #' } use_package <- function(package, type = "Imports", pkg = ".") { stopifnot(is.character(package), length(package) == 1) stopifnot(is.character(type), length(type) == 1) if (!is_installed(package)) { stop(package, " must be installed before you can take a dependency on it", call. 
= FALSE) } types <- c("Imports", "Depends", "Suggests", "Enhances", "LinkingTo") names(types) <- tolower(types) type <- types[[match.arg(tolower(type), names(types))]] message("* Adding ", package, " to ", type) add_desc_package(pkg, type, package) msg <- switch(type, Imports = paste0("Refer to functions with ", package, "::fun()"), Depends = paste0("Are you sure you want Depends? Imports is almost always", " the better choice."), Suggests = paste0("Use requireNamespace(\"", package, "\", quietly = TRUE)", " to test if package is installed,\n", "then use ", package, "::fun() to refer to functions."), Enhances = "", LinkingTo = show_includes(package) ) message("Next: ") message(msg) invisible() } show_includes <- function(package) { incl <- system.file("include", package = package) h <- dir(incl, "\\.(h|hpp)$") if (length(h) == 0) return() message("Possible includes are:\n", paste0("#include <", h, ">", collapse = "\n")) } add_desc_package <- function(pkg = ".", field, name) { pkg <- as.package(pkg) desc_path <- file.path(pkg$path, "DESCRIPTION") desc <- read_dcf(desc_path) old <- desc[[field]] if (is.null(old)) { new <- name changed <- TRUE } else { if (!grepl(paste0('\\b', name, '\\b'), old)) { new <- paste0(old, ",\n ", name) changed <- TRUE } else { changed <- FALSE } } if (changed) { desc[[field]] <- new write_dcf(desc_path, desc) } invisible(changed) } #' Use data in a package. #' #' This function makes it easy to save package data in the correct format. #' #' @param ... Unquoted names of existing objects to save. #' @param pkg Package where to store data. Defaults to package in working #' directory. #' @param internal If \code{FALSE}, saves each object in individual #' \code{.rda} files in the \code{data/} directory. These are available #' whenever the package is loaded. If \code{TRUE}, stores all objects in #' a single \code{R/sysdata.rda} file. These objects are only available #' within the package. 
#' @param overwrite By default, \code{use_data} will not overwrite existing #' files. If you really want to do so, set this to \code{TRUE}. #' @param compress Choose the type of compression used by \code{\link{save}}. #' Should be one of "gzip", "bzip2" or "xz". #' @export #' @family infrastructure #' @examples #' \dontrun{ #' x <- 1:10 #' y <- 1:100 #' #' use_data(x, y) # For external use #' use_data(x, y, internal = TRUE) # For internal use #' } use_data <- function(..., pkg = ".", internal = FALSE, overwrite = FALSE, compress = "bzip2") { pkg <- as.package(pkg) objs <- get_objs_from_dots(dots(...)) if (internal) { dir_name <- file.path(pkg$path, "R") paths <- file.path(dir_name, "sysdata.rda") objs <- list(objs) } else { dir_name <- file.path(pkg$path, "data") paths <- file.path(dir_name, paste0(objs, ".rda")) } check_data_paths(paths, overwrite) message("Saving ", paste(unlist(objs), collapse = ", "), " as ", paste(basename(paths), collapse = ", "), " to ", dir_name) envir <- parent.frame() mapply(save, list = objs, file = paths, MoreArgs = list(envir = envir, compress = compress)) invisible() } get_objs_from_dots <- function(.dots) { if (length(.dots) == 0L) { stop("Nothing to save", call. = FALSE) } is_name <- vapply(.dots, is.symbol, logical(1)) if (any(!is_name)) { stop("Can only save existing named objects", call. = FALSE) } objs <- vapply(.dots, as.character, character(1)) duplicated_objs <- which(stats::setNames(duplicated(objs), objs)) if (length(duplicated_objs) > 0L) { objs <- unique(objs) warning("Saving duplicates only once: ", paste(names(duplicated_objs), collapse = ", "), call. 
= FALSE) } objs } check_data_paths <- function(paths, overwrite) { data_path <- dirname(paths[[1]]) if (!file.exists(data_path)) dir.create(data_path) if (!overwrite) { paths_exist <- which(stats::setNames(file.exists(paths), paths)) if (length(paths_exist) > 0L) { paths_exist <- unique(names(paths_exist)) existing_names <- basename(paths_exist) stop(paste(existing_names, collapse = ", "), " already exists in ", dirname(paths_exist[[1L]]), ". ", "Use overwrite = TRUE to overwrite", call. = FALSE) } } } #' Use \code{data-raw} to compute package datasets. #' #' @param pkg Package where to create \code{data-raw}. Defaults to package in #' working directory. #' @export #' @family infrastructure use_data_raw <- function(pkg = ".") { pkg <- as.package(pkg) use_directory("data-raw", ignore = TRUE, pkg = pkg) message("Next: \n", "* Add data creation scripts in data-raw\n", "* Use devtools::use_data() to add data to package") } #' Add a file to \code{.Rbuildignore} #' #' \code{.Rbuildignore} has a regular expression on each line, but it's #' usually easier to work with specific file names. By default, will (crudely) #' turn a filename into a regular expression that will only match that #' path. Repeated entries will be silently removed. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @param files Name of file. #' @param escape If \code{TRUE}, the default, will escape \code{.} to #' \code{\\.} and surround with \code{^} and \code{$}. #' @return Nothing, called for its side effect. #' @export #' @aliases add_build_ignore #' @family infrastructure #' @keywords internal use_build_ignore <- function(files, escape = TRUE, pkg = ".") { pkg <- as.package(pkg) if (escape) { files <- paste0("^", gsub("\\.", "\\\\.", files), "$") } path <- file.path(pkg$path, ".Rbuildignore") union_write(path, files) invisible(TRUE) } #' Create README files. 
#' #' Creates skeleton README files with sections for #' \itemize{ #' \item a high-level description of the package and its goals #' \item R code to install from GitHub, if GitHub usage detected #' \item a basic example #' } #' Use \code{Rmd} if you want a rich intermingling of code and data. Use #' \code{md} for a basic README. \code{README.Rmd} will be automatically #' added to \code{.Rbuildignore}. The resulting README is populated with default #' YAML frontmatter and R fenced code blocks (\code{md}) or chunks (\code{Rmd}). #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @export #' @examples #' \dontrun{ #' use_readme_rmd() #' use_readme_md() #' } #' @family infrastructure use_readme_rmd <- function(pkg = ".") { pkg <- as.package(pkg) if (uses_github(pkg$path)) { pkg$github <- github_info(pkg$path) } pkg$Rmd <- TRUE use_template("omni-README", save_as = "README.Rmd", data = pkg, ignore = TRUE, open = TRUE, pkg = pkg) use_build_ignore("^README-.*\\.png$", escape = FALSE, pkg = pkg) if (uses_git(pkg$path) && !file.exists(pkg$path, ".git", "hooks", "pre-commit")) { message("* Adding pre-commit hook") use_git_hook("pre-commit", render_template("readme-rmd-pre-commit.sh"), pkg = pkg) } invisible(TRUE) } #' @export #' @rdname use_readme_rmd use_readme_md <- function(pkg = ".") { pkg <- as.package(pkg) if (uses_github(pkg$path)) { pkg$github <- github_info(pkg$path) } use_template("omni-README", save_as = "README.md", data = pkg, open = TRUE, pkg = pkg) } #' Use NEWS.md #' #' This creates \code{NEWS.md} from a template. #' #' @param pkg package description, can be path or package name. 
See #' \code{\link{as.package}} for more information
#' @export
#' @family infrastructure
use_news_md <- function(pkg = ".") {
  pkg <- as.package(pkg)
  use_template("NEWS.md", data = pkg, open = TRUE, pkg = pkg)
}

#' @rdname infrastructure
#' @section \code{use_revdep}:
#' Add \code{revdep} directory and basic check template.
#' @export
use_revdep <- function(pkg = ".") {
  pkg <- as.package(pkg)
  use_directory("revdep", ignore = TRUE, pkg = pkg)
  use_template(
    "revdep.R",
    "revdep/check.R",
    data = list(name = pkg$package),
    pkg = pkg
  )
  use_git_ignore(revdep_cache_path_raw(""), pkg = pkg)
}

#' @rdname infrastructure
#' @section \code{use_cran_comments}:
#' Add \code{cran-comments.md} template.
#' @export
use_cran_comments <- function(pkg = ".") {
  pkg <- as.package(pkg)
  use_template(
    "cran-comments.md",
    data = list(rversion = paste0(version$major, ".", version$minor)),
    ignore = TRUE,
    open = TRUE,
    pkg = pkg
  )
  invisible()
}

#' @rdname infrastructure
#' @section \code{use_code_of_conduct}:
#' Add a code of conduct from \url{http://contributor-covenant.org}.
#' @export
use_code_of_conduct <- function(pkg = ".") {
  pkg <- as.package(pkg)
  use_template(
    "CONDUCT.md",
    ignore = TRUE,
    pkg = pkg
  )

  message("* Don't forget to describe the code of conduct in your README.md:")
  message("Please note that this project is released with a ",
    "[Contributor Code of Conduct](CONDUCT.md). 
", "By participating in this ", "project you agree to abide by its terms.") } add_build_ignore <- function(pkg = ".", files, escape = TRUE) { use_build_ignore(files, escape = escape, pkg = pkg) } union_write <- function(path, new_lines) { if (file.exists(path)) { lines <- readLines(path, warn = FALSE) } else { lines <- character() } all <- union(lines, new_lines) writeLines(all, path) } #' @rdname infrastructure #' @section \code{use_cran_badge}: #' Add a badge to show CRAN status and version number on the README #' @export use_cran_badge <- function(pkg = ".") { pkg <- as.package(pkg) message( " * Add a CRAN status shield by adding the following line to your README:\n", "[![CRAN_Status_Badge](http://www.r-pkg.org/badges/version/", pkg$package, ")](https://cran.r-project.org/package=", pkg$package, ")" ) invisible(TRUE) } #' @rdname infrastructure #' @section \code{use_mit_license}: #' Adds the necessary infrastructure to declare your package as #' distributed under the MIT license. #' @param copyright_holder The copyright holder for this package. Defaults to #' \code{getOption("devtools.name")}. #' @export use_mit_license <- function(pkg = ".", copyright_holder = getOption("devtools.name", "")) { pkg <- as.package(pkg) # Update the DESCRIPTION message("* Updating license field in DESCRIPTION.") descPath <- file.path(pkg$path, "DESCRIPTION") DESCRIPTION <- read_dcf(descPath) DESCRIPTION$License <- "MIT + file LICENSE" write_dcf(descPath, DESCRIPTION) use_template( "mit-license.txt", "LICENSE", data = list( year = format(Sys.Date(), "%Y"), copyright_holder = copyright_holder ), open = identical(copyright_holder, ""), pkg = pkg ) } #' @rdname infrastructure #' @section \code{use_gpl3_license}: #' Adds the necessary infrastructure to declare your package as #' distributed under the GPL v3. 
#' @export
use_gpl3_license <- function(pkg = ".") {
  pkg <- as.package(pkg)

  # Update the DESCRIPTION
  message("* Updating license field in DESCRIPTION.")
  descPath <- file.path(pkg$path, "DESCRIPTION")
  DESCRIPTION <- read_dcf(descPath)
  DESCRIPTION$License <- "GPL-3 + file LICENSE"
  write_dcf(descPath, DESCRIPTION)

  use_template("gpl-v3.md", "LICENSE", pkg = pkg)
}

#' @section \code{use_dev_version}:
#' This adds ".9000" to the package \code{DESCRIPTION}, adds a new heading to
#' \code{NEWS.md} (if it exists), and then checks the result into git.
#' @rdname infrastructure
#' @export
use_dev_version <- function(pkg = ".") {
  pkg <- as.package(pkg)

  if (uses_git(pkg$path) && git_uncommitted(pkg$path)) {
    stop(
      "Uncommitted changes. Please commit to git before continuing",
      call. = FALSE
    )
  }

  message("* Adding .9000 to version")
  desc_path <- file.path(pkg$path, "DESCRIPTION")
  DESCRIPTION <- read_dcf(desc_path)
  if (length(unlist(package_version(DESCRIPTION$Version))) > 3) {
    stop("Already has development version", call. = FALSE)
  }
  DESCRIPTION$Version <- paste0(DESCRIPTION$Version, ".9000")
  write_dcf(desc_path, DESCRIPTION)

  news_path <- file.path(pkg$path, "NEWS.md")
  if (file.exists(news_path)) {
    message("* Adding new heading to NEWS.md")
    news <- readLines(news_path)
    news <- c(
      paste0("# ", pkg$package, " ", DESCRIPTION$Version),
      "",
      news
    )
    writeLines(news, news_path)
  }

  if (uses_git(pkg$path)) {
    message("* Checking into git")
    r <- git2r::init(pkg$path)
    paths <- unlist(git2r::status(r))
    git2r::add(r, paths)
    git2r::commit(r, "Use development version")
  }

  invisible(TRUE)
}

# Utilities ---------------------------------------------------------------

use_directory <- function(path, ignore = FALSE, pkg = ".") {
  pkg <- as.package(pkg)
  pkg_path <- file.path(pkg$path, path)

  if (file.exists(pkg_path)) {
    if (!is_dir(pkg_path)) {
      stop("`", path, "` exists but is not a directory.", call.
= FALSE) } } else { message("* Creating `", path, "`.") dir.create(pkg_path, showWarnings = FALSE, recursive = TRUE) } if (ignore) { message("* Adding `", path, "` to `.Rbuildignore`.") use_build_ignore(path, pkg = pkg) } invisible(TRUE) } use_template <- function(template, save_as = template, data = list(), ignore = FALSE, open = FALSE, pkg = ".") { pkg <- as.package(pkg) path <- file.path(pkg$path, save_as) if (!can_overwrite(path)) { stop("`", save_as, "` already exists.", call. = FALSE) } template_path <- system.file("templates", template, package = "devtools", mustWork = TRUE) template_out <- whisker::whisker.render(readLines(template_path), data) message("* Creating `", save_as, "` from template.") writeLines(template_out, path) if (ignore) { message("* Adding `", save_as, "` to `.Rbuildignore`.") use_build_ignore(save_as, pkg = pkg) } if (open) { message("* Modify `", save_as, "`.") open_in_rstudio(path) } invisible(TRUE) } open_in_rstudio <- function(path) { if (!rstudioapi::isAvailable()) return() if (!rstudioapi::hasFun("navigateToFile")) return() rstudioapi::navigateToFile(path) } can_overwrite <- function(path) { name <- basename(path) if (!file.exists(path)) { TRUE } else if (interactive() && !yesno("Overwrite `", name, "`?")) { TRUE } else { FALSE } } devtools/R/check-devtools.r0000644000176200001440000000673513200623655015424 0ustar liggesusers#' Custom devtools release checks. #' #' This function performs additional checks prior to release. It is called #' automatically by \code{\link{release}()}. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information. 
#' @keywords internal #' @export release_checks <- function(pkg = ".", built_path = NULL) { pkg <- as.package(pkg) message("Running additional devtools checks for ", pkg$package) check_version(pkg) check_dev_versions(pkg) check_vignette_titles(pkg) check_news_md(pkg) check_remotes(pkg) } check_dev_versions <- function(pkg = ".") { pkg <- as.package(pkg) dep_list <- pkg[tolower(standardise_dep(TRUE))] deps <- do.call("rbind", unname(compact(lapply(dep_list, parse_deps)))) deps <- deps[!is.na(deps$version), , drop = FALSE] parsed <- lapply(deps$version, function(x) unlist(numeric_version(x))) lengths <- vapply(parsed, length, integer(1)) last_ver <- vapply(parsed, function(x) x[[length(x)]], integer(1)) is_dev <- lengths == 4 & last_ver >= 9000 check_status( !any(is_dev), "dependencies don't rely on dev versions", paste( "depends on devel versions of: ", paste0(deps$name[is_dev], collapse = ", ") ) ) return(invisible(FALSE)) } check_version <- function(pkg = ".") { pkg <- as.package(pkg) ver <- unlist(numeric_version(pkg$version)) check_status(length(ver) == 3, "version number has three components", paste0("version (", pkg$version, ") should have exactly three components") ) } check_vignette_titles <- function(pkg = ".") { pkg <- as.package(pkg) vigns <- tools::pkgVignettes(dir = pkg$path) if (length(vigns$docs) == 0) return() has_vignette_title <- function(v, n) { h <- readLines(v, n = n) any(grepl("Vignette Title", h)) } v <- stats::setNames(vigns$docs, basename(vigns$docs)) has_vt <- vapply(v, has_vignette_title, logical(1), n = 30) check_status( !any(has_vt), "vignette titles are not placeholders", paste0( "placeholder 'Vignette Title' detected in 'title' field and/or ", "'VignetteIndexEntry' for: ", paste(names(has_vt)[has_vt], collapse = ",") ) ) } check_news_md <- function(pkg) { pkg <- as.package(pkg) news_path <- file.path(pkg$path, "NEWS.md") if (!file.exists(news_path)) return() ignore_path <- file.path(pkg$path, ".Rbuildignore") if 
(!file.exists(ignore_path)) { ignore_lines <- character() } else { ignore_lines <- readLines(ignore_path) } has_news <- grepl("NEWS\\.md", ignore_lines, fixed = TRUE) | grepl("NEWS.md", ignore_lines, fixed = TRUE) check_status(!any(has_news), "NEWS.md is not ignored", "NEWS.md now supported by CRAN and doesn't need to be ignored." ) news_rd_path <- file.path(pkg$path, "inst/NEWS.Rd") check_status( !file.exists(news_rd_path), "NEWS.Rd does not exist", "NEWS.md now supported by CRAN, NEWS.Rd can be removed." ) } check_remotes <- function(pkg) { check_status(!has_dev_remotes(pkg), "DESCRIPTION doesn't have Remotes field", "Remotes field should be removed before CRAN submission." ) } check_status <- function(status, name, warning) { cat("Checking ", name, "...", sep = "") status <- tryCatch( if (status) { cat(" OK\n") } else { cat("\n") message("WARNING: ", warning) }, error = function(e) { cat("\n") message("ERROR: ", conditionMessage(e)) FALSE } ) invisible(status) } devtools/R/dev-meta.r0000644000176200001440000000206513200623655014204 0ustar liggesusers#' Return devtools metadata environment #' #' If the package was not loaded with devtools, returns \code{NULL}. #' #' @param name The name of a loaded package #' @keywords internal #' @examples #' dev_meta("stats") # NULL #' #' if (has_tests()) { #' # Load the test package in directory "testLoadHooks" #' load_all(devtest("testLoadHooks")) #' #' # Get metdata for the package #' x <- dev_meta("testLoadHooks") #' as.list(x) #' #' # Clean up. #' unload(devtest("testLoadHooks")) #' } #' @export dev_meta <- function(name) { ns <- .getNamespace(name) if (is.null(ns)) { stop("Namespace not found for ", name, ". Is it loaded?") } if (is.null(ns$.__DEVTOOLS__)) { return(NULL) } ns$.__DEVTOOLS__ } # Create the devtools metadata environment for a package. # This should be run when packages are loaded by devtools. 
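# A quick usage sketch (illustrative only -- "mypkg" is a placeholder
# package name; this assumes the package was loaded with load_all(),
# since normally-loaded packages have no devtools metadata):
#
#   devtools::load_all("mypkg")
#   meta <- dev_meta("mypkg")       # NULL if "mypkg" was loaded normally
#   if (!is.null(meta)) as.list(meta)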
create_dev_meta <- function(name) { ns <- .getNamespace(name) if (!is.null(ns$.__DEVTOOLS__)) { stop("devtools metadata for package ", name, " already exists.") } ns$.__DEVTOOLS__ <- new.env(parent = ns) ns$.__DEVTOOLS__ } devtools/R/unload.r0000644000176200001440000001044213200623655013762 0ustar liggesusers#' Unload a package #' #' This function attempts to cleanly unload a package, including unloading #' its namespace, deleting S4 class definitions and unloading any loaded #' DLLs. Unfortunately S4 classes are not really designed to be cleanly #' unloaded, and so we have to manually modify the class dependency graph in #' order for it to work - this works on the cases for which we have tested #' but there may be others. Similarly, automated DLL unloading is best tested #' for simple scenarios (particularly with \code{useDynLib(pkgname)} and may #' fail in other cases. If you do encounter a failure, please file a bug report #' at \url{http://github.com/hadley/devtools/issues}. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' #' @examples #' \dontrun{ #' # Unload package that is in current directory #' unload(".") #' #' # Unload package that is in ./ggplot2/ #' unload("ggplot2/") #' #' # Can use inst() to find the path of an installed package #' # This will load and unload the installed ggplot2 package #' library(ggplot2) #' unload(inst("ggplot2")) #' } #' @export unload <- function(pkg = ".") { pkg <- as.package(pkg) if (pkg$package == "compiler") { # Disable JIT compilation as it could interfere with the compiler # unloading. Also, if the JIT was kept enabled, it would cause the # compiler package to be loaded again soon, anyway. Note if we # restored the JIT level after the unloading, the call to # enableJIT itself would load the compiler again. 
oldEnable <- compiler::enableJIT(0) if (oldEnable != 0) { warning("JIT automatically disabled when unloading the compiler.") } } # This is a hack to work around unloading devtools itself. The unloading # process normally makes other devtools functions inaccessible, # resulting in "Error in unload(pkg) : internal error -3 in R_decompress1". # If we simply force them first, then they will remain available for use # later. if (pkg$package == "devtools") { eapply(ns_env(pkg), force, all.names = TRUE) } # If the package was loaded with devtools, any s4 classes that were created # by the package need to be removed in a special way. if (!is.null(dev_meta(pkg$package))) { remove_s4_classes(pkg) } if (pkg$package %in% loadedNamespaces()) { # unloadNamespace will throw an error if it has trouble unloading. # This can happen when there's another package that depends on the # namespace. # unloadNamespace will also detach the package if it's attached. # # unloadNamespace calls onUnload hook and .onUnload try(unloadNamespace(pkg$package), silent = TRUE) } else { stop("Package ", pkg$package, " not found in loaded packages or namespaces") } # Sometimes the namespace won't unload with detach(), like when there's # another package that depends on it. If it's still around, force it # to go away. # loadedNamespaces() and unloadNamespace() often don't work here # because things can be in a weird state. if (!is.null(.getNamespace(pkg$package))) { message("unloadNamespace(\"", pkg$package, "\") not successful, probably because another loaded package depends on it.", "Forcing unload. If you encounter problems, please restart R.") unregister_namespace(pkg$package) } # Clear so that loading the package again will re-read all files clear_cache() # Do this after detach, so that packages that have an .onUnload function # which unloads DLLs (like MASS) won't try to unload the DLL twice. 
unload_dll(pkg) } # This unloads dlls loaded by either library() or load_all() unload_dll <- function(pkg = ".") { pkg <- as.package(pkg) # Always run garbage collector to force any deleted external pointers to # finalise gc() # Special case for devtools - don't unload DLL because we need to be able # to access nsreg() in the DLL in order to run makeNamespace. This means # that changes to compiled code in devtools can't be reloaded with # load_all -- it requires a reinstallation. if (pkg$package == "devtools") { return(invisible()) } pkglibs <- loaded_dlls(pkg) for (lib in pkglibs) { dyn.unload(lib[["path"]]) } # Remove the unloaded dlls from .dynLibs() libs <- .dynLibs() .dynLibs(libs[!(libs %in% pkglibs)]) invisible() } devtools/R/create.r0000644000176200001440000000752513200623655013753 0ustar liggesusers#' Creates a new package, following all devtools package conventions. #' #' Similar to \code{\link{package.skeleton}}, except that it only creates #' the standard devtools directory structures; it doesn't try and create #' source code and data files by inspecting the global environment. #' #' \code{create} requires that the directory doesn't exist yet; it will be #' created but deleted upon failure. \code{setup} assumes an existing #' directory from which it will infer the package name. #' #' @param path location to create new package. The last component of the path #' will be used as the package name. #' @param description list of description values to override default values or #' add additional values. #' @param check if \code{TRUE}, will automatically run \code{\link{check}} #' @param rstudio Create an RStudio project file? #' (with \code{\link{use_rstudio}}) #' @param quiet if \code{FALSE}, the default, prints informative messages. 
#' @seealso \code{\link{package.skeleton}}
#' @export
#' @examples
#' \dontrun{
#' # Create a package using all defaults:
#' path <- file.path(tempdir(), "myDefaultPackage")
#' create(path)
#'
#' # Override a description attribute.
#' path <- file.path(tempdir(), "myCustomPackage")
#' my_description <- list("Maintainer" =
#'   "'Yoni Ben-Meshulam' ")
#' create(path, my_description)
#' }
create <- function(path, description = getOption("devtools.desc"),
                   check = FALSE, rstudio = TRUE, quiet = FALSE) {
  check_package_name(path)

  # ensure the parent directory exists
  parent_dir <- normalizePath(dirname(path), winslash = "/", mustWork = FALSE)
  if (!file.exists(parent_dir)) {
    stop("Parent directory '", parent_dir, "' does not exist", call. = FALSE)
  }

  # allow 'create' to create a new directory, or populate
  # an empty directory, as long as the parent directory exists
  if (!file.exists(path)) {
    if (!dir.create(path)) {
      stop("Failed to create package directory '", basename(path), "'",
        call. = FALSE)
    }
  }

  # if the directory exists but is not empty, bail
  files <- list.files(path)
  if (length(files)) {
    valid <- length(files) == 1 && tools::file_ext(files) == "Rproj"
    if (!valid) stop("Directory exists and is not empty", call.
= FALSE) } path <- normalizePath(path, winslash = "/", mustWork = TRUE) setup(path = path, description = description, rstudio = rstudio, check = check, quiet = quiet) invisible(TRUE) } #' @rdname create #' @export setup <- function(path = ".", description = getOption("devtools.desc"), check = FALSE, rstudio = TRUE, quiet = FALSE) { check_package_name(path) parent_dir <- normalizePath(dirname(path), winslash = "/", mustWork = TRUE) if (!quiet) { message("Creating package '", extract_package_name(path), "' in '", parent_dir, "'") } dir.create(file.path(path, "R"), showWarnings = FALSE) create_description(path, extra = description, quiet = quiet) create_namespace(path) if (rstudio) use_rstudio(path) if (check) check(path) invisible(TRUE) } extract_package_name <- function(path) { basename(normalizePath(path, mustWork = FALSE)) } check_package_name <- function(path) { name <- extract_package_name(path) if (!valid_name(name)) { stop( name, " is not a valid package name: it should contain only\n", "ASCII letters, numbers and dot, have at least two characters\n", "and start with a letter and not end in a dot.", call. = FALSE ) } } valid_name <- function(x) { grepl("^[[:alpha:]][[:alnum:].]+$", x) && !grepl("\\.$", x) } create_namespace <- function(path) { ns_path <- file.path(path, "NAMESPACE") if (file.exists(ns_path)) return() cat( '# Generated by roxygen2: fake comment so roxygen2 overwrites silently.\n', 'exportPattern("^[^\\\\.]")\n', sep = "", file = ns_path ) } devtools/R/load-data.r0000644000176200001440000000240313200623655014324 0ustar liggesusers#' Load data. #' #' Loads all \code{.RData} files in the data subdirectory. #' #' @param pkg package description, can be path or package name. 
See #' \code{\link{as.package}} for more information #' @keywords programming #' @export load_data <- function(pkg = ".") { # Note that this simulates normal R package loading by placing the data sets # in the .__NAMESPACE__.$lazydata environment, but unlike with proper lazy # loading via lazyLoad(), we'll need to manually copy these objects over to # the package environment later. pkg <- as.package(pkg) nsenv <- ns_env(pkg) lazydata_env <- nsenv$.__NAMESPACE__.$lazydata objs <- character() sysdata <- file.path(pkg$path, "R", "sysdata.rda") if (file.exists(sysdata)) { objs <- c(objs, load(sysdata, envir = nsenv)) } path_data <- file.path(pkg$path, "data") if (file.exists(path_data)) { paths <- dir(path_data, "\\.[rR][dD]a(ta)?$", full.names = TRUE) paths <- changed_files(paths) objs <- c(objs, unlist(lapply(paths, load, envir = lazydata_env))) paths <- dir(path_data, "\\.[rR]$", full.names = TRUE) paths <- changed_files(paths) objs <- c(objs, unlist(lapply(paths, sys.source, envir = lazydata_env, chdir = TRUE, keep.source = TRUE))) } invisible(objs) } devtools/R/has-tests.r0000644000176200001440000000023112416621515014407 0ustar liggesusers#' Was devtools installed with tests? 
#' #' @keywords internal #' @export has_tests <- function() { system.file("tests", package = "devtools") != "" } devtools/R/git.R0000644000176200001440000001137113171407310013220 0ustar liggesusersuses_git <- function(path = ".") { !is.null(git2r::discover_repository(path, ceiling = 0)) } # sha of most recent commit git_repo_sha1 <- function(r) { rev <- git2r::head(r) if (is.null(rev)) { return(NULL) } if (git2r::is_commit(rev)) { rev@sha } else { git2r::branch_target(rev) } } git_sha1 <- function(n = 10, path = ".") { r <- git2r::repository(path, discover = TRUE) sha <- git_repo_sha1(r) substr(sha, 1, n) } git_uncommitted <- function(path = ".") { r <- git2r::repository(path, discover = TRUE) st <- vapply(git2r::status(r), length, integer(1)) any(st != 0) } git_sync_status <- function(path = ".", check_ahead = TRUE, check_behind = TRUE) { r <- git2r::repository(path, discover = TRUE) r_head <- git2r::head(r) if (!methods::is(r_head, "git_branch")) { stop("HEAD is not a branch", call. = FALSE) } upstream <- git2r::branch_get_upstream(r_head) if (is.null(upstream)) { stop("No upstream branch", call. = FALSE) } git2r::fetch(r, git2r::branch_remote_name(upstream)) c1 <- git2r::lookup(r, git2r::branch_target(r_head)) c2 <- git2r::lookup(r, git2r::branch_target(upstream)) ab <- git2r::ahead_behind(c1, c2) # if (ab[1] > 0) # message(ab[1], " ahead of remote") # if (ab[2] > 0) # message(ab[2], " behind remote") is_ahead <- ab[[1]] != 0 is_behind <- ab[[2]] != 0 check <- (check_ahead && is_ahead) || (check_behind && is_behind) check } # Retrieve the current running path of the git binary. # @param git_binary_name The name of the binary depending on the OS. 
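# Usage sketch (illustrative): git_path() resolves the binary by trying a
# user-supplied path, then Sys.which("git"), then, on Windows, two common
# install locations. For example:
#
#   git <- git_path()                          # errors if git can't be found
#   system2(git, "--version", stdout = TRUE)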
git_path <- function(git_binary_name = NULL) {
  # Use user supplied path
  if (!is.null(git_binary_name)) {
    if (!file.exists(git_binary_name)) {
      stop("Path ", git_binary_name, " does not exist", call. = FALSE)
    }
    return(git_binary_name)
  }

  # Look on path
  git_path <- Sys.which("git")[[1]]
  if (git_path != "") return(git_path)

  # On Windows, look in common locations
  if (.Platform$OS.type == "windows") {
    look_in <- c(
      "C:/Program Files/Git/bin/git.exe",
      "C:/Program Files (x86)/Git/bin/git.exe"
    )
    found <- file.exists(look_in)
    if (any(found)) return(look_in[found][1])
  }

  stop("Git does not seem to be installed on your system.", call. = FALSE)
}

git_branch <- function(path = ".") {
  r <- git2r::repository(path, discover = TRUE)
  if (git2r::is_detached(r)) {
    return(NULL)
  }
  git2r::head(r)@name
}

# GitHub ------------------------------------------------------------------

uses_github <- function(path = ".") {
  if (!uses_git(path)) return(FALSE)

  r <- git2r::repository(path, discover = TRUE)
  r_remote_urls <- git2r::remote_url(r)

  any(grepl("github", r_remote_urls))
}

github_info <- function(path = ".", remote_name = NULL) {
  if (!uses_github(path)) return(github_dummy)

  r <- git2r::repository(path, discover = TRUE)
  r_remote_urls <- grep("github", remote_urls(r), value = TRUE)

  if (!is.null(remote_name) && !remote_name %in% names(r_remote_urls))
    stop("no github-related remote named ", remote_name, " found")

  remote_name <- c(remote_name, "origin", names(r_remote_urls))
  x <- r_remote_urls[remote_name]
  x <- x[!is.na(x)][1]

  github_remote_parse(x)
}

github_dummy <- list(username = "", repo = "", fullname = "/")

remote_urls <- function(r) {
  remotes <- git2r::remotes(r)
  stats::setNames(git2r::remote_url(r, remotes), remotes)
}

github_remote_parse <- function(x) {
  if (length(x) == 0) return(github_dummy)
  if (!grepl("github", x)) return(github_dummy)

  if (grepl("^(https|git)", x)) {
    # https://github.com/hadley/devtools.git
    # https://github.com/hadley/devtools
    # git@github.com:hadley/devtools.git
    re <-
"github[^/:]*[/:]([^/]+)/(.*?)(?:\\.git)?$" } else { stop("Unknown GitHub repo format", call. = FALSE) } m <- regexec(re, x) match <- regmatches(x, m)[[1]] list( username = match[2], repo = match[3], fullname = paste0(match[2], "/", match[3]) ) } # Extract the commit hash from a git archive. Git archives include the SHA1 # hash as the comment field of the zip central directory record # (see https://www.kernel.org/pub/software/scm/git/docs/git-archive.html) # Since we know it's 40 characters long we seek that many bytes minus 2 # (to confirm the comment is exactly 40 bytes long) git_extract_sha1 <- function(bundle) { # open the bundle for reading conn <- file(bundle, open = "rb", raw = TRUE) on.exit(close(conn)) # seek to where the comment length field should be recorded seek(conn, where = -0x2a, origin = "end") # verify the comment is length 0x28 len <- readBin(conn, "raw", n = 2) if (len[1] == 0x28 && len[2] == 0x00) { # read and return the SHA1 rawToChar(readBin(conn, "raw", n = 0x28)) } else { NULL } } devtools/R/vignettes.r0000644000176200001440000000457213200623655014517 0ustar liggesusers#' Build package vignettes. #' #' Builds package vignettes using the same algorithm that \code{R CMD build} #' does. This means including non-Sweave vignettes, using makefiles (if #' present), and copying over extra files. You need to ensure that these #' files are not included in the built package - ideally they should not #' be checked into source, or at least excluded with \code{.Rbuildignore} #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @inheritParams install_deps #' @keywords programming #' @seealso \code{\link{clean_vignettes}} to remove the pdfs in #' \file{inst/doc} created from vignettes #' @export #' @seealso \code{\link{clean_vignettes}} to remove build tex/pdf files. 
build_vignettes <- function(pkg = ".", dependencies = "VignetteBuilder") { pkg <- as.package(pkg) vigns <- tools::pkgVignettes(dir = pkg$path) if (length(vigns$docs) == 0) return() install_deps(pkg, dependencies, upgrade = FALSE) message("Building ", pkg$package, " vignettes") tools::buildVignettes(dir = pkg$path, tangle = TRUE) copy_vignettes(pkg) invisible(TRUE) } #' Clean built vignettes. #' #' This uses a fairly rudimentary algorithm where any files in \file{inst/doc} #' with a name that exists in \file{vignettes} are removed. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @export clean_vignettes <- function(pkg = ".") { pkg <- as.package(pkg) vigns <- tools::pkgVignettes(dir = pkg$path) if (basename(vigns$dir) != "vignettes") return() message("Cleaning built vignettes from ", pkg$package) doc_path <- file.path(pkg$path, "inst", "doc") vig_candidates <- dir(doc_path, full.names = TRUE) vig_rm <- vig_candidates[file_name(vig_candidates) %in% file_name(vigns$docs)] extra_candidates <- file.path(doc_path, basename(find_vignette_extras(pkg))) extra_rm <- extra_candidates[file.exists(extra_candidates)] to_remove <- c(vig_rm, extra_rm) if (length(to_remove) > 0) { message("Removing ", paste(basename(to_remove), collapse = ", ")) file.remove(to_remove) } invisible(TRUE) } ext_variations <- function(path, valid_ext) { unique(c(outer(file_name(path), valid_ext, FUN = paste, sep = "."))) } file_name <- function(x) { if (length(x) == 0) return(NULL) tools::file_path_sans_ext(basename(x)) } devtools/R/test.r0000644000176200001440000000653113200623655013463 0ustar liggesusers#' Execute all \pkg{test_that} tests in a package. #' #' Tests are assumed to be located in either the \code{inst/tests/} or #' \code{tests/testthat} directory (the latter is recommended). 
#' See \code{\link[testthat]{test_dir}} for the naming convention of test #' scripts within one of those directories and #' \code{\link[testthat]{test_check}} for the folder structure conventions. #' #' If no testing infrastructure is present #' (detected by the \code{uses_testthat} function), you'll be asked if you want #' devtools to create it for you (in interactive sessions only). See #' \code{\link{use_test}} for more details. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @param ... additional arguments passed to \code{\link[testthat]{test_dir}} #' @inheritParams testthat::test_dir #' @inheritParams run_examples #' @export test <- function(pkg = ".", filter = NULL, ...) { check_suggested("testthat") pkg <- as.package(pkg) if (!uses_testthat(pkg) && interactive()) { message("No testing infrastructure found. Create it?") if (menu(c("Yes", "No")) == 1) { use_testthat(pkg) } return(invisible()) } test_path <- find_test_dir(pkg$path) test_files <- dir(test_path, "^test.*\\.[rR]$") if (length(test_files) == 0) { message("No tests: no files in ", test_path, " match '^test.*\\.[rR]$'") return(invisible()) } # Need to attach testthat so that (e.g.) context() is available # Update package dependency to avoid explicit require() call (#798) if (pkg$package != "testthat") { pkg$depends <- paste0("testthat, ", pkg$depends) if (grepl("^testthat, *$", pkg$depends)) pkg$depends <- "testthat" } # Run tests in a child of the namespace environment, like # testthat::test_package message("Loading ", pkg$package) ns_env <- load_all(pkg, quiet = TRUE)$env message("Testing ", pkg$package) Sys.sleep(0.05); utils::flush.console() # Avoid misordered output in RStudio env <- new.env(parent = ns_env) testthat_args <- list(test_path, filter = filter, env = env, ... = ...) 
if (packageVersion("testthat") > "1.0.2") { testthat_args <- c(testthat_args, load_helpers = FALSE, encoding = pkg$encoding %||% "unknown") } withr::with_envvar(r_env_vars(), do.call(testthat::test_dir, testthat_args)) } find_test_dir <- function(path) { testthat <- file.path(path, "tests", "testthat") if (dir.exists(testthat)) return(testthat) inst <- file.path(path, "inst", "tests") if (dir.exists(inst)) return(inst) stop("No testthat directories found in ", path, call. = FALSE) } #' Return the path to one of the packages in the devtools test dir #' #' Devtools comes with some simple packages for testing. This function #' returns the path to them. #' #' @param package Name of the test package. #' @keywords internal #' @examples #' if (has_tests()) { #' devtest("testData") #' } #' @export devtest <- function(package) { stopifnot(has_tests()) path <- system.file(package = "devtools", "tests", "testthat", package) if (path == "") stop(package, " not found", call. = FALSE) path } #' @inheritParams test #' @rdname test #' @export uses_testthat <- function(pkg = ".") { pkg <- as.package(pkg) paths <- c( file.path(pkg$path, "inst", "tests"), file.path(pkg$path, "tests", "testthat") ) any(dir.exists(paths)) } devtools/R/zzz.r0000644000176200001440000000600313200625264013331 0ustar liggesusers#' @importFrom utils available.packages contrib.url install.packages #' installed.packages modifyList packageDescription #' packageVersion remove.packages NULL #' Package development tools for R. #' #' @section Package options: #' #' Devtools uses the following \code{\link{options}} to configure behaviour: #' #' \itemize{ #' \item \code{devtools.path}: path to use for \code{\link{dev_mode}} #' #' \item \code{devtools.name}: your name, used when signing draft #' emails. #' #' \item \code{devtools.install.args}: a string giving extra arguments passed #' to \code{R CMD install} by \code{\link{install}}. 
#'
#' \item \code{devtools.desc.author}: a string providing a default Authors@@R
#' string to be used in new \file{DESCRIPTION}s. Should be R code, and
#' look like \code{"Hadley Wickham [aut, cre]"}. See
#' \code{\link[utils]{as.person}} for more details.
#'
#' \item \code{devtools.desc.license}: a default license string to use for
#' new packages.
#'
#' \item \code{devtools.desc.suggests}: a character vector listing packages
#' to add to Suggests by default for new packages.
#'
#' \item \code{devtools.desc}: a named list listing any other
#' extra options to add to \file{DESCRIPTION}
#'
#' }
#' @docType package
#' @name devtools
NULL

#' Deprecated Functions
#'
#' These functions are Deprecated in this release of devtools; they will be
#' marked as Defunct and removed in a future version.
#' @name devtools-deprecated
#' @keywords internal
NULL

onload_assign("trimws",
  if (getRversion() < "3.2.0") {
    function(x, which = c("both", "left", "right")) {
      switch(match.arg(which),
        left = sub("^[ \t\r\n]+", "", x, perl = TRUE),
        right = sub("[ \t\r\n]+$", "", x, perl = TRUE),
        both = trimws(trimws(x, "left"), "right")
      )
    }
  } else {
    base::trimws
  }
)

.onLoad <- function(libname, pkgname) {
  op <- options()
  op.devtools <- list(
    devtools.path = "~/R-dev",
    devtools.install.args = "",
    devtools.name = "Your name goes here",
    devtools.desc.author = 'person("First", "Last", email = "first.last@example.com", role = c("aut", "cre"))',
    devtools.desc.license = "What license is it under?",
    devtools.desc.suggests = NULL,
    devtools.desc = list(),
    devtools.revdep.libpath = file.path(tempdir(), "R-lib")
  )
  toset <- !(names(op.devtools) %in% names(op))
  if (any(toset)) options(op.devtools[toset])

  # These withr functions are used in load_all() so need to exist in the
  # devtools namespace so the withr namespace is not prematurely loaded by `::`
  # during a load_all() call
  env <- asNamespace(pkgname)
  assign("withr_with_dir", withr::with_dir, envir = env)
  assign("withr_with_collate",
withr::with_collate, envir = env) assign("withr_with_envvar", withr::with_envvar, envir = env) nms <- environment(onload_assign)$names funs <- environment(onload_assign)$funs for (i in seq_along(nms)) { assign(nms[[i]], eval(funs[[i]], envir = env), envir = env) } invisible() } devtools/R/install-bitbucket.r0000644000176200001440000000621313171407310016114 0ustar liggesusers#' Install a package directly from bitbucket #' #' This function is vectorised so you can install multiple packages in #' a single command. #' #' @inheritParams install_github #' @param auth_user your account username if you're attempting to install #' a package hosted in a private repository (and your username is different #' to \code{username}) #' @param password your password #' @param ref Desired git reference; could be a commit, tag, or branch name. #' Defaults to master. #' @seealso Bitbucket API docs: #' \url{https://confluence.atlassian.com/bitbucket/use-the-bitbucket-cloud-rest-apis-222724129.html} #' @family package installation #' @export #' @examples #' \dontrun{ #' install_bitbucket("sulab/mygene.r@@default") #' install_bitbucket("dannavarro/lsr-package") #' } install_bitbucket <- function(repo, username, ref = "master", subdir = NULL, quiet = FALSE, auth_user = NULL, password = NULL, ...) { remotes <- lapply(repo, bitbucket_remote, username = username, ref = ref, subdir = subdir, auth_user = auth_user, password = password) install_remotes(remotes, ..., quiet = quiet) } bitbucket_remote <- function(repo, username = NULL, ref = NULL, subdir = NULL, auth_user = NULL, password = NULL, sha = NULL) { meta <- parse_git_repo(repo) meta$ref <- meta$ref %||% ref %||% "master" if (is.null(meta$username)) { meta$username <- username %||% stop("Unknown username.") warning("Username parameter is deprecated. Please use ", username, "/", repo, call. 
= FALSE) } remote("bitbucket", repo = meta$repo, subdir = meta$subdir %||% subdir, username = meta$username, ref = meta$ref %||% ref, sha = sha, auth_user = auth_user, password = password ) } #' @export remote_download.bitbucket_remote <- function(x, quiet = FALSE) { if (!quiet) { message("Downloading bitbucket repo ", x$username, "/", x$repo, "@", x$ref) } dest <- tempfile(fileext = paste0(".zip")) src <- paste("https://bitbucket.org/", x$username, "/", tolower(x$repo), "/get/", x$ref, ".zip", sep = "") if (!is.null(x$password)) { auth <- httr::authenticate( user = x$auth_user %||% x$username, password = x$password, type = "basic") } else { auth <- NULL } download(dest, src, auth) } #' @export remote_metadata.bitbucket_remote <- function(x, bundle = NULL, source = NULL) { if (!is.null(bundle)) { # Might be able to get from zip archive sha <- git_extract_sha1(bundle) } else { # Otherwise can lookup with remote_ls sha <- remote_sha(x) } list( RemoteType = "bitbucket", RemoteRepo = x$repo, RemoteUsername = x$username, RemoteRef = x$ref, RemoteSha = sha, RemoteSubdir = x$subdir ) } #' @export remote_package_name.bitbucket_remote <- function(remote, ...) { remote_package_name.github_remote(remote, url = "https://bitbucket.org", ...) } #' @export remote_sha.bitbucket_remote <-function(remote, ...) { remote_sha.github_remote(remote, url = "https://bitbucket.org", ...) } #' @export format.bitbucket_remote <- function(x, ...) { "Bitbucket" } devtools/R/system.r0000644000176200001440000000626413171407310014026 0ustar liggesusers#' Run a system command and check if it succeeds. #' #' @param cmd Command to run. Will be quoted by \code{\link{shQuote}()}. #' @param args A character vector of arguments. #' @param env_vars A named character vector of environment variables. #' @param path Path in which to execute the command #' @param quiet If \code{FALSE}, the command to be run will be echoed. #' @param throw If \code{TRUE}, will throw an error if the command fails #' (i.e. 
the return value is not 0). #' @param ... additional arguments passed to \code{\link[base]{system}} #' @keywords internal #' @export #' @return The exit status of the command, invisibly. system_check <- function(cmd, args = character(), env_vars = character(), path = ".", quiet = FALSE, throw = TRUE, ...) { full <- paste(shQuote(cmd), " ", paste(args, collapse = " "), sep = "") if (!quiet) { message(wrap_command(full)) message() } result <- suppressWarnings(withr::with_dir(path, withr::with_envvar(env_vars, system(full, intern = quiet, ignore.stderr = quiet, ...) ))) if (quiet) { status <- attr(result, "status") %||% 0L } else { status <- result } ok <- identical(as.character(status), "0") if (throw && !ok) { stop("Command failed (", status, ")", call. = FALSE) } invisible(status) } #' @noRd #' @param out_file Path of file to which output is written if \code{quiet} is #' \code{TRUE} system2_check <- function(cmd, args = character(), env_vars = character(), path = ".", quiet = FALSE, throw = TRUE, out_file = NULL, ...) { full <- paste(shQuote(cmd), " ", paste(args, collapse = " "), sep = "") if (!quiet) { message(wrap_command(full)) message() } if (quiet) std <- TRUE else std <- "" result <- suppressWarnings(withr::with_dir(path, withr::with_envvar(env_vars, system2(cmd, args, stdout = std, stderr = std, ...) ))) if (quiet) { if (!is.null(out_file)) { writeLines(result, out_file) } status <- attr(result, "status") %||% 0L } else { status <- result } ok <- identical(as.character(status), "0") if (throw && !ok) { stop("Command failed (", status, ")", call. = FALSE) } invisible(status) } #' Run a system command and capture the output. #' #' @inheritParams system_check #' @param ... additional arguments passed to \code{\link[base]{system}} #' @return command output if the command succeeds, an error will be thrown if #' the command fails. 
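Both checkers above treat a command as successful only when its exit status stringifies to `"0"`, after defaulting a missing `status` attribute (the `NULL` returned for a successful `system(intern = TRUE)` call) to `0L`. A hedged sketch — `status_ok` is an illustrative name, not a devtools function:

```r
# Mirrors the status handling in system_check()/system2_check(): a NULL
# status attribute means success, and any non-"0" status is a failure.
status_ok <- function(status) {
  status <- if (is.null(status)) 0L else status
  identical(as.character(status), "0")
}
```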
#' @keywords internal #' @export system_output <- function(cmd, args = character(), env_vars = character(), path = ".", quiet = FALSE, ...) { full <- paste(shQuote(cmd), " ", paste(args, collapse = " "), sep = "") if (!quiet) { message(wrap_command(full), "\n") } result <- withCallingHandlers(withr::with_dir(path, withr::with_envvar(env_vars, system(full, intern = TRUE, ignore.stderr = quiet, ...) )), warning = function(w) stop(w)) result } wrap_command <- function(x) { lines <- strwrap(x, getOption("width") - 2, exdent = 2) continue <- c(rep(" \\", length(lines) - 1), "") paste(lines, continue, collapse = "\n") } devtools/R/package-deps.r0000644000176200001440000000475613200623655015037 0ustar liggesusers#' Parse package dependency strings. #' #' @param string to parse. Should look like \code{"R (>= 3.0), ggplot2"} etc. #' @return list of two character vectors: \code{name} package names, #' and \code{version} package versions. If version is not specified, #' it will be stored as NA. #' @keywords internal #' @export #' @examples #' parse_deps("httr (< 2.1),\nRCurl (>= 3)") #' # only package dependencies are returned #' parse_deps("utils (== 2.12.1),\ntools,\nR (>= 2.10),\nmemoise") parse_deps <- function(string) { if (is.null(string)) return() stopifnot(is.character(string), length(string) == 1) if (grepl("^\\s*$", string)) return() pieces <- strsplit(string, "[[:space:]]*,[[:space:]]*")[[1]] # Get the names names <- gsub("\\s*\\(.*?\\)", "", pieces) names <- gsub("^\\s+|\\s+$", "", names) # Get the versions and comparison operators versions_str <- pieces have_version <- grepl("\\(.*\\)", versions_str) versions_str[!have_version] <- NA compare <- sub(".*\\((\\S+)\\s+.*\\)", "\\1", versions_str) versions <- sub(".*\\(\\S+\\s+(.*)\\)", "\\1", versions_str) # Check that non-NA comparison operators are valid compare_nna <- compare[!is.na(compare)] compare_valid <- compare_nna %in% c(">", ">=", "==", "<=", "<") if(!all(compare_valid)) { stop("Invalid comparison operator 
in dependency: ", paste(compare_nna[!compare_valid], collapse = ", ")) } deps <- data.frame(name = names, compare = compare, version = versions, stringsAsFactors = FALSE) # Remove R dependency deps[names != "R", ] } #' Check that the version of an imported package satisfies the requirements #' #' @param dep_name The name of the package with objects to import #' @param dep_ver The version of the package #' @param dep_compare The comparison operator to use to check the version #' @keywords internal check_dep_version <- function(dep_name, dep_ver = NA, dep_compare = NA) { if (!requireNamespace(dep_name, quietly = TRUE)) { stop("Dependency package ", dep_name, " not available.") } if (xor(is.na(dep_ver), is.na(dep_compare))) { stop("dep_ver and dep_compare must be both NA or both non-NA") } else if(!is.na(dep_ver) && !is.na(dep_compare)) { compare <- match.fun(dep_compare) if (!compare( as.numeric_version(getNamespaceVersion(dep_name)), as.numeric_version(dep_ver))) { warning("Need ", dep_name, " ", dep_compare, " ", dep_ver, " but loaded version is ", getNamespaceVersion(dep_name)) } } return(TRUE) } devtools/R/reload.r0000644000176200001440000000211213200623655013741 0ustar liggesusers#' Unload and reload package. #' #' This attempts to unload and reload a package. If the package is not loaded #' already, it does nothing. It's not always possible to cleanly unload a #' package: see the caveats in \code{\link{unload}} for some of the #' potential failure points. If in doubt, restart R and reload the package #' with \code{\link{library}}. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @param quiet if \code{TRUE} suppresses output from this function. 
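The regular expressions that `parse_deps()` applies above can be traced on a small two-element dependency string (a sketch; the variable names mirror the function's internals):

```r
# The splitting and extraction steps of parse_deps(), run standalone.
string <- "httr (>= 1.0),\nRCurl"
pieces <- strsplit(string, "[[:space:]]*,[[:space:]]*")[[1]]  # split on commas
names <- gsub("\\s*\\(.*?\\)", "", pieces)                    # drop the "(...)" clause
names <- gsub("^\\s+|\\s+$", "", names)                       # trim whitespace
compare <- sub(".*\\((\\S+)\\s+.*\\)", "\\1", pieces)         # operator, e.g. ">="
versions <- sub(".*\\(\\S+\\s+(.*)\\)", "\\1", pieces)        # version, e.g. "1.0"
# Elements without a "(...)" clause pass through sub() unchanged, which is
# why parse_deps() first records which elements actually carry a version.
```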
#' @examples #' \dontrun{ #' # Reload package that is in current directory #' reload(".") #' #' # Reload package that is in ./ggplot2/ #' reload("ggplot2/") #' #' # Can use inst() to find the package path #' # This will reload the installed ggplot2 package #' reload(inst("ggplot2")) #' } #' @export reload <- function(pkg = ".", quiet = FALSE) { pkg <- as.package(pkg) if (is_attached(pkg)) { if (!quiet) message("Reloading installed ", pkg$package) unload(pkg) require(pkg$package, character.only = TRUE, quietly = TRUE) } } devtools/R/check-git.r0000644000176200001440000000162413200623655014340 0ustar liggesusers#' Git checks. #' #' This function performs Git checks checks prior to release. It is called #' automatically by \code{\link{release}()}. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information. #' @keywords internal git_checks <- function(pkg = ".") { pkg <- as.package(pkg) message("Running Git checks for ", pkg$package) check_uncommitted(pkg) check_sync_status(pkg) } check_uncommitted <- function(pkg) { check_status(!git_uncommitted(pkg$path), "uncommitted files", "All files should be committed before release. Please add and commit." ) } check_sync_status <- function(pkg) { check_status(!git_sync_status(pkg$path, check_ahead = FALSE), "if synched with remote branch", "Local branch should contain all commits of remote branch before release. Please pull." 
) } devtools/R/revdep-summarise.R0000644000176200001440000001154413200623655015734 0ustar liggesusers#' @rdname revdep_check #' @export revdep_check_save_summary <- function(pkg = ".") { pkg <- as.package(pkg) revdep_check_save_readme(pkg) revdep_check_save_problems(pkg) revdep_check_save_timing(pkg) } revdep_check_save_readme <- function(pkg) { md_all <- revdep_check_summary_md(pkg) writeLines(md_all, file.path(pkg$path, "revdep", "README.md")) } revdep_check_save_problems <- function(pkg) { md_bad <- revdep_check_summary_md(pkg, has_problem = TRUE) writeLines(md_bad, file.path(pkg$path, "revdep", "problems.md")) } revdep_check_save_timing <- function(pkg) { md_timing <- revdep_check_timing_md(pkg) writeLines(md_timing, file.path(pkg$path, "revdep", "timing.md")) } revdep_check_summary_md <- function(pkg, has_problem = FALSE) { check_suggested("knitr") check <- readRDS(revdep_check_path(pkg)) paste0( revdep_setup_md(check), "\n\n", revdep_check_results_md(check$results, has_problem) ) } revdep_setup_md <- function(check) { paste0( "# Setup\n\n", revdep_platform_md(check$platform), revdep_packages_md(check$dependencies) ) } revdep_platform_md <- function(platform) { paste0( "## Platform\n\n", paste(revdep_platform_kable(platform), collapse = "\n"), "\n\n" ) } revdep_platform_kable <- function(platform) { plat_df <- data.frame( setting = names(platform), value = unlist(platform) ) rownames(plat_df) <- NULL knitr::kable(plat_df) } revdep_packages_md <- function(dependencies) { paste0( "## Packages\n\n", paste(knitr::kable(dependencies), collapse = "\n") ) } revdep_check_results_md <- function(results, has_problem) { if (has_problem) { problems <- vapply(results, has_problems, logical(1)) results <- results[problems] msg <- "packages with problems" } else { msg <- "packages" } summary_table <- paste0( paste0(revdep_check_results_kable(results), collapse = "\n"), "\n\n") summaries <- vapply(results, format, character(1)) paste0( "# Check results\n\n", 
paste0(length(summaries), " ", msg, "\n\n"), summary_table, paste0(summaries, collapse = "\n") ) } revdep_check_results_kable <- function(results) { if (length(results) == 0) return(character()) summary_df <- data.frame( package = I(names(results)), version = I(vapply(results, function(x) x$version, character(1))), errors = vapply(results, function(x) length(x$results$errors), integer(1)), warnings = vapply(results, function(x) length(x$results$warnings), integer(1)), notes = vapply(results, function(x) length(x$results$notes), integer(1)) ) rownames(summary_df) <- NULL knitr::kable(summary_df) } revdep_check_timing_md <- function(pkg) { check_suggested("knitr") check <- readRDS(revdep_check_path(pkg)) paste0( "# Check times\n\n", paste0(revdep_check_timing_kable(check$results), collapse = "\n"), "\n\n" ) } revdep_check_timing_kable <- function(results) { if (length(results) == 0) return(character()) timing_df <- data.frame( package = I(names(results)), version = I(vapply(results, function(x) x$version, character(1))), check_time = I(vapply(results, function(x) x$check_time, numeric(1))) ) rownames(timing_df) <- NULL timing_df <- timing_df[order(-timing_df$check_time), ] knitr::kable(timing_df) } #' @export format.revdep_check_result <- function(x, ...) { meta <- c( "Maintainer" = x$maintainer, "Bug reports" = x$bug_reports ) meta_string <- paste(names(meta), ": ", meta, collapse = " \n", sep = "") header <- paste0( "## ", x$package, " (", x$version, ")\n", meta_string, "\n" ) summary <- summarise_check_results(x$results) if (length(unlist(x$results)) > 0) { checks <- paste0("\n```\n", format(x$results), "\n```\n") } else { checks <- "" } paste0(header, "\n", summary, "\n", checks) } #' @export print.revdep_check_result <- function(x, ...) 
{ cat(format(x, ...), "\n", sep = "") } #' @rdname revdep_check #' @export revdep_check_print_problems <- function(pkg = ".") { pkg <- as.package(pkg) summaries <- readRDS(revdep_check_path(pkg))$results problems <- vapply(summaries, function(x) first_problem(x$results), character(1)) problems <- problems[!is.na(problems)] dep_fail <- grepl("checking package dependencies", problems, fixed = TRUE) inst_fail <- grepl("checking whether package .+ can be installed", problems) pkgs <- names(problems) if (any(dep_fail)) { bad <- paste(pkgs[dep_fail], collapse = ", ") cat("* Failed to install dependencies for: ", bad, "\n", sep = "") } if (any(inst_fail)) { bad <- paste(pkgs[inst_fail], collapse = ", ") cat("* Failed to install: ", bad, "\n", sep = "") } if (length(problems) > 0) { other <- problems[!inst_fail & !dep_fail] cat(paste0("* ", names(other), ": ", other, "\n"), sep = "") } else { cat("No ERRORs or WARNINGs found :)\n") } } devtools/R/path.r0000644000176200001440000000254512656131112013435 0ustar liggesusers#' Get/set the PATH variable. #' #' @param path character vector of paths #' @return \code{set_path} invisibly returns the old path. #' @name path #' @family path #' @seealso \code{\link[withr]{with_path}} to temporarily set the path for a block #' of code #' @examples #' path <- get_path() #' length(path) #' old <- add_path(".") #' length(get_path()) #' set_path(old) #' length(get_path()) NULL #' @export #' @rdname path get_path <- function() { strsplit(Sys.getenv("PATH"), .Platform$path.sep)[[1]] } #' @export #' @rdname path set_path <- function(path) { path <- normalizePath(path, mustWork = FALSE) old <- get_path() path <- paste(path, collapse = .Platform$path.sep) Sys.setenv(PATH = path) invisible(old) } #' @export #' @rdname path #' @param after for \code{add_path}, the place on the PATH where the new paths #' should be added add_path <- function(path, after = Inf) { set_path(append(get_path(), path, after)) } #' Test if an object is on the path. 
#' #' @param ... Strings indicating the executables to check for on the path. #' @family path #' @keywords internal #' @export #' @examples #' on_path("R") #' on_path("gcc") #' on_path("foo", "bar") # FALSE in most cases #' withr::with_path(tempdir(), on_path("gcc")) on_path <- function(...) { commands <- c(...) stopifnot(is.character(commands)) unname(Sys.which(commands) != "") } devtools/R/load.r0000644000176200001440000002132113200623655013415 0ustar liggesusers#' Load complete package. #' #' \code{load_all} loads a package. It roughly simulates what happens #' when a package is installed and loaded with \code{\link{library}}. #' #' Currently \code{load_all}: #' #' \itemize{ #' \item Loads all data files in \code{data/}. See \code{\link{load_data}} #' for more details. #' #' \item Sources all R files in the R directory, storing results in #' environment that behaves like a regular package namespace. See #' below and \code{\link{load_code}} for more details. #' #' \item Compiles any C, C++, or Fortran code in the \code{src/} directory #' and connects the generated DLL into R. See \code{\link{compile_dll}} #' for more details. #' #' \item Runs \code{.onAttach()}, \code{.onLoad()} and \code{.onUnload()} #' functions at the correct times. #' #' \item If you use \pkg{testthat}, will load all test helpers so you #' can access them interactively. #' #' } #' #' @section Namespaces: #' The namespace environment \code{}, is a child of #' the imports environment, which has the name attribute #' \code{imports:pkgname}. It is in turn is a child of #' \code{}, which is a child of the global environment. #' (There is also a copy of the base namespace that is a child of the empty #' environment.) #' #' The package environment \code{} is an ancestor of the #' global environment. Normally when loading a package, the objects #' listed as exports in the NAMESPACE file are copied from the namespace #' to the package environment. 
However, \code{load_all} by default will
#' copy all objects (not just the ones listed as exports) to the package
#' environment. This is useful during development because it makes all
#' objects easy to access.
#'
#' To export only the objects listed as exports, use
#' \code{export_all=FALSE}. This more closely simulates behavior when
#' loading an installed package with \code{\link{library}}, and can be
#' useful for checking for missing exports.
#'
#' @section Shim files:
#' \code{load_all} also inserts shim functions into the imports environment
#' of the loaded package. It presently adds a replacement version of
#' \code{system.file} which returns different paths from
#' \code{base::system.file}. This is needed because installed and uninstalled
#' package sources have different directory structures. Note that this is not
#' a perfect replacement for \code{base::system.file}.
#'
#' @param pkg package description, can be path or package name. See
#'   \code{\link{as.package}} for more information.
#' @param reset clear package environment and reset file cache before loading
#'   any pieces of the package. This is equivalent to running
#'   \code{\link{unload}} and is the default. Using \code{reset = FALSE} may be
#'   faster for large code bases, but is a significantly less accurate
#'   approximation.
#' @param recompile force a recompile of DLL from source code, if present.
#'   This is equivalent to running \code{\link{clean_dll}} before
#'   \code{load_all}.
#' @param export_all If \code{TRUE} (the default), export all objects.
#'   If \code{FALSE}, export only the objects that are listed as exports
#'   in the NAMESPACE file.
#' @param quiet if \code{TRUE} suppresses output from this function.
#' @inheritParams as.package #' @keywords programming #' @examples #' \dontrun{ #' # Load the package in the current directory #' load_all("./") #' #' # Running again loads changed files #' load_all("./") #' #' # With reset=TRUE, unload and reload the package for a clean start #' load_all("./", TRUE) #' #' # With export_all=FALSE, only objects listed as exports in NAMESPACE #' # are exported #' load_all("./", export_all = FALSE) #' } #' @export load_all <- function(pkg = ".", reset = TRUE, recompile = FALSE, export_all = TRUE, quiet = FALSE, create = NA) { pkg <- as.package(pkg, create = create) check_suggested("roxygen2") if (!quiet) message("Loading ", pkg$package) if (pkg$package == "compiler") { # Disable JIT while loading the compiler package to avoid interference # (otherwise the compiler package would be loaded as a side effect of # JIT compilation and it would be locked before we can insert shims into # it). oldEnabled <- compiler::enableJIT(0) on.exit(compiler::enableJIT(oldEnabled), TRUE) } roxygen2::update_collate(pkg$path) # Refresh the pkg structure with any updates to the Collate entry # in the DESCRIPTION file pkg$collate <- as.package(pkg$path)$collate # Forcing all of the promises for the loaded namespace now will avoid lazy-load # errors when the new package is loaded overtop the old one. # # Reloading devtools is a special case. Normally, objects in the # namespace become inaccessible if the namespace is unloaded before the # object has been accessed. Instead we force the object so they will still be # accessible. 
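The promise-forcing described in the comment above can be demonstrated on a throwaway environment (a minimal sketch, independent of devtools):

```r
# delayedAssign() installs a promise; eapply(..., force, all.names = TRUE)
# is the same call load_all() makes on the old namespace, evaluating every
# binding so nothing lazy is left pointing at code about to be replaced.
e <- new.env()
delayedAssign("x", 1 + 1, assign.env = e)
invisible(eapply(e, force, all.names = TRUE))
# e$x has now been evaluated to 2.
```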
if (is_loaded(pkg)) { eapply(ns_env(pkg), force, all.names = TRUE) } # Check description file is ok check <- ("tools" %:::% ".check_package_description")( file.path(pkg$path, "DESCRIPTION")) if (length(check) > 0) { msg <- utils::capture.output(("tools" %:::% "print.check_package_description")(check)) message("Invalid DESCRIPTION:\n", paste(msg, collapse = "\n")) } # If installed version of package loaded, unload it if (is_loaded(pkg) && is.null(dev_meta(pkg$package))) { unload(pkg) } # Unload dlls unload_dll(pkg) if (reset) { clear_cache() if (is_loaded(pkg)) unload(pkg) } if (recompile) clean_dll(pkg) # Compile dll if it exists compile_dll(pkg, quiet = quiet) # Set up the namespace environment ---------------------------------- # This mimics the procedure in loadNamespace if (!is_loaded(pkg)) create_ns_env(pkg) out <- list(env = ns_env(pkg)) # Load dependencies load_depends(pkg) load_imports(pkg) # Add shim objects to imports environment insert_imports_shims(pkg) out$data <- load_data(pkg) out$code <- load_code(pkg) register_s3(pkg) out$dll <- load_dll(pkg) # Run namespace load hooks run_pkg_hook(pkg, "load") run_ns_load_actions(pkg) run_user_hook(pkg, "load") # Set up the exports in the namespace metadata (this must happen after # the objects are loaded) setup_ns_exports(pkg, export_all) # Set up the package environment ------------------------------------ # Create the package environment if needed if (!is_attached(pkg)) attach_ns(pkg) # Copy over objects from the namespace environment export_ns(pkg) # Run hooks run_pkg_hook(pkg, "attach") run_user_hook(pkg, "attach") # Source test helpers into package environment if (uses_testthat(pkg)) { testthat::source_test_helpers(find_test_dir(pkg$path), env = pkg_env(pkg)) } # Replace help and ? in utils package environment insert_global_shims() invisible(out) } #' Create a default DESCRIPTION file for a package. 
#' #' @details #' To set the default author and licenses, set \code{options} #' \code{devtools.desc.author} and \code{devtools.desc.license}. I use #' \code{options(devtools.desc.author = '"Hadley Wickham [aut,cre]"', #' devtools.desc.license = "GPL-3")}. #' @param path path to package root directory #' @param extra a named list of extra options to add to \file{DESCRIPTION}. #' Arguments that take a list #' @param quiet if \code{TRUE}, suppresses output from this function. #' @export create_description <- function(path = ".", extra = getOption("devtools.desc"), quiet = FALSE) { # Don't call check_dir(path) here (#803) desc_path <- file.path(path, "DESCRIPTION") if (file.exists(desc_path)) return(FALSE) subdir <- file.path(path, c("R", "src", "data")) if (!any(file.exists(subdir))) { stop("'", path, "' does not look like a package: no R/, src/ or data directories", call. = FALSE) } desc <- build_description(extract_package_name(path), extra) if (!quiet) { message("No DESCRIPTION found. Creating with values:\n\n") write_dcf("", desc) } write_dcf(desc_path, desc) TRUE } build_description <- function(name, extra = list()) { check_package_name(name) defaults <- compact(list( Package = name, Title = "What the Package Does (one line, title case)", Version = "0.0.0.9000", "Authors@R" = getOption("devtools.desc.author"), Description = "What the package does (one paragraph).", Depends = paste0("R (>= ", as.character(getRversion()) ,")"), License = getOption("devtools.desc.license"), Suggests = getOption("devtools.desc.suggests"), Encoding = "UTF-8", LazyData = "true" )) # Override defaults with user supplied options desc <- modifyList(defaults, extra) # Collapse all vector arguments to single strings desc <- lapply(desc, function(x) paste(x, collapse = ", ")) desc } devtools/R/R.r0000644000176200001440000000670413200623655012707 0ustar liggesusers# R("-e 'str(as.list(Sys.getenv()))' --slave") R <- function(args, path = tempdir(), env_vars = character(), fun = system_check, 
            ...) {
  r <- file.path(R.home("bin"), "R")

  stopifnot(is.character(args))
  args <- c(
    "--no-site-file", "--no-environ", "--no-save", "--no-restore",
    "--quiet",
    args
  )

  stopifnot(is.character(env_vars))
  env_vars <- c(r_profile(), r_env_vars(), env_vars)

  # If rtools has been detected, add it to the path only when running R...
  if (!is.null(get_rtools_path())) {
    old <- add_path(get_rtools_path(), 0)
    on.exit(set_path(old))
  }

  fun <- match.fun(fun)
  fun(r, args = args, env_vars = env_vars, path = path, ...)
}

#' Run R CMD xxx from within R
#'
#' @param cmd one of the R tools available from the R CMD interface.
#' @param options a character vector of options to pass to the command
#' @param path the directory to run the command in.
#' @param env_vars environment variables to set before running the command.
#' @param ... additional arguments passed to \code{\link{system_check}}
#' @return \code{TRUE} if the command succeeds, throws an error if the command
#'   fails.
#' @export
RCMD <- function(cmd, options, path = tempdir(), env_vars = character(), ...) {
  options <- paste(options, collapse = " ")
  R(paste("CMD", cmd, options), path = path, env_vars = env_vars, ...)
}

#' Environment variables to set when calling R
#'
#' Devtools sets a number of environment variables to ensure consistency
#' between the current R session and the new session, and to ensure that
#' everything behaves the same across systems. It also suppresses a common
#' warning on windows, and sets \code{NOT_CRAN} so you can tell that your
#' code is not running on CRAN. If \code{NOT_CRAN} has been set externally, it
#' is not overwritten.
#'
#' @keywords internal
#' @return a named character vector
#' @export
r_env_vars <- function() {
  vars <- c(
    "R_LIBS" = paste(.libPaths(), collapse = .Platform$path.sep),
    "CYGWIN" = "nodosfilewarning",
    # When R CMD check runs tests, it sets R_TESTS. When the tests
    # themselves run R CMD xxxx, as is the case with the tests in
    # devtools, having R_TESTS set causes errors because it confuses
    # the R subprocesses. Un-setting it here avoids those problems.
    "R_TESTS" = "",
    "R_BROWSER" = "false",
    "R_PDFVIEWER" = "false",
    "TAR" = auto_tar())

  if (is.na(Sys.getenv("NOT_CRAN", unset = NA))) {
    vars[["NOT_CRAN"]] <- "true"
  }
  vars
}

# Create a temporary .Rprofile based on the current "repos" option
# and return a named vector that corresponds to environment variables
# that need to be set to use this .Rprofile
r_profile <- function() {
  tmp_user_profile <- file.path(tempdir(), "Rprofile-devtools")
  tmp_user_profile_con <- file(tmp_user_profile, "w")
  on.exit(close(tmp_user_profile_con), add = TRUE)
  writeLines("options(repos =", tmp_user_profile_con)
  dput(getOption("repos"), tmp_user_profile_con)
  writeLines(")", tmp_user_profile_con)
  c(R_PROFILE_USER = tmp_user_profile)
}

# Determine the best setting for the TAR environment variable.
# This is needed for R <= 2.15.2 to use internal tar. Later versions don't need
# this workaround, and they use R_BUILD_TAR instead of TAR, so this has no
# effect on them.
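The fallback described in the comment above reduces to a small decision rule (an illustrative reimplementation — `pick_tar` is not a devtools function; the real check lives in `auto_tar()`):

```r
# An explicit TAR environment variable always wins; Windows without Rtools
# falls back to R's internal tar; everything else leaves TAR empty.
pick_tar <- function(tar, windows, has_rtools) {
  if (!is.na(tar)) return(tar)
  if (windows && !has_rtools) "internal" else ""
}
```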
auto_tar <- function() { tar <- Sys.getenv("TAR", unset = NA) if (!is.na(tar)) return(tar) windows <- .Platform$OS.type == "windows" no_rtools <- is.null(get_rtools_path()) if (windows && no_rtools) "internal" else "" } devtools/R/with.r0000644000176200001440000000530213200623655013452 0ustar liggesusers#' @rdname devtools-deprecated #' @section \code{in_dir}: #' working directory #' @export in_dir <- function(new, code) { .Deprecated(new = "withr::with_dir", package = "devtools") withr::with_dir(new = new, code = code) } #' @rdname devtools-deprecated #' @section \code{with_collate}: #' collation order #' @export with_collate <- function(new, code) { .Deprecated(new = "withr::with_collate", package = "devtools") withr::with_collate(new = new, code = code) } #' @rdname devtools-deprecated #' @section \code{with_envvar}: #' environmental variables #' @export with_envvar <- function(new, code, action = "replace") { .Deprecated(new = "withr::with_envvar", package = "devtools") withr::with_envvar(new = new, code = code, action = action) } #' @rdname devtools-deprecated #' @section \code{with_lib}: #' library paths, prepending to current libpaths #' @export with_lib <- function(new, code) { .Deprecated(new = "withr::with_libpaths", package = "devtools") withr::with_libpaths(new = new, code = code, action = "prefix") } #' @rdname devtools-deprecated #' @section \code{with_libpaths}: #' library paths, replacing current libpaths #' @export with_libpaths <- function(new, code) { .Deprecated(new = "withr::with_libpaths", package = "devtools") withr::with_libpaths(new = new, code = code, action = "replace") } #' @rdname devtools-deprecated #' @section \code{with_locale}: #' any locale setting #' @export with_locale <- function(new, code) { .Deprecated(new = "withr::with_locale", package = "devtools") withr::with_locale(new = new, code = code) } #' @rdname devtools-deprecated #' @section \code{with_makevars}: #' Temporarily change contents of an existing Makevars file. 
#' @export with_makevars <- function(new, code, path = file.path("~", ".R", "Makevars")) { .Deprecated(new = "withr::with_makevars", package = "devtools") withr::with_makevars(new = new, code = code, path = path) } #' @rdname devtools-deprecated #' @section \code{with_options}: #' options #' @export with_options <- function(new, code) { .Deprecated(new = "withr::with_options", package = "devtools") withr::with_options(new = new, code = code) } #' @rdname devtools-deprecated #' @section \code{with_par}: #' graphics parameters #' @export with_par <- function(new, code) { .Deprecated(new = "withr::with_par", package = "devtools") withr::with_par(new = new, code = code) } #' @rdname devtools-deprecated #' @section \code{with_path}: #' PATH environment variable #' @export with_path <- function(new, code, add = TRUE) { .Deprecated(new = "withr::with_path", package = "devtools") action <- if (isTRUE(add)) { "suffix" } else { "replace" } withr::with_path(new = new, code = code, action = action) } devtools/R/rcpp-attributes.r0000644000176200001440000000142713200625223015624 0ustar liggesusers # Call the Rcpp::compileAttributes function for a package (only do so if the # package links to Rcpp and a recent enough version of Rcpp in installed). compile_rcpp_attributes <- function(pkg) { # Only scan for attributes in packages explicitly linking to Rcpp if (links_to_rcpp(pkg)) { check_suggested("Rcpp") # We need to use use with_dir here to work around a regression in Rcpp # 0.12.11 (https://github.com/RcppCore/Rcpp/pull/697) withr_with_dir(pkg$path, { Rcpp::compileAttributes() }) } } # Does this package have a compilation dependency on Rcpp? links_to_rcpp <- function(pkg) { "Rcpp" %in% pkg_linking_to(pkg) } # Get the LinkingTo field of a package as a character vector pkg_linking_to <- function(pkg) { parse_deps(pkg$linkingto)$name } devtools/R/install-remote.R0000644000176200001440000001540313200623655015401 0ustar liggesusers#' Install a remote package. 
#' #' This: #' \enumerate{ #' \item checks if source bundle is different from installed version #' \item checks if log of a past installation failure exists #' \item downloads source bundle #' \item decompresses & checks that it's a package #' \item adds metadata to DESCRIPTION #' \item calls install #' } #' @noRd install_remote <- function(remote, ..., force = FALSE, quiet = FALSE, out_dir = NULL, skip_if_log_exists = FALSE) { stopifnot(is.remote(remote)) remote_sha <- remote_sha(remote) package_name <- remote_package_name(remote) local_sha <- local_sha(package_name) if (!isTRUE(force) && !different_sha(remote_sha = remote_sha, local_sha = local_sha)) { if (!quiet) { message( "Skipping install of '", package_name, "' from a ", sub("_remote", "", class(remote)[1L]), " remote,", " the SHA1 (", substr(remote_sha, 1L, 8L), ") has not changed since last install.\n", " Use `force = TRUE` to force installation") } return(invisible(FALSE)) } if (!is.null(out_dir)) { out_file <- file.path(out_dir, paste0(package_name, ".out")) if (skip_if_log_exists && file.exists(out_file)) { message("Skipping ", package_name, ", installation failed before, see log in ", out_file) return(invisible(FALSE)) } } if (is_windows && inherits(remote, "cran_remote")) { install_packages(package_name, repos = remote$repos, type = remote$pkg_type, dependencies = NA, ..., quiet = quiet, out_dir = out_dir, skip_if_log_exists = skip_if_log_exists) return(invisible(TRUE)) } bundle <- remote_download(remote, quiet = quiet) on.exit(unlink(bundle), add = TRUE) source <- source_pkg(bundle, subdir = remote$subdir) on.exit(unlink(source, recursive = TRUE), add = TRUE) metadata <- remote_metadata(remote, bundle, source) install(source, ..., quiet = quiet, metadata = metadata, out_dir = out_dir, skip_if_log_exists = skip_if_log_exists) } try_install_remote <- function(..., quiet) { tryCatch( install_remote(..., quiet = quiet), error = function(e) { if (!quiet) { message("Installation failed: ", 
conditionMessage(e)) } FALSE } ) } install_remotes <- function(remotes, ...) { invisible(vapply(remotes, try_install_remote, ..., FUN.VALUE = logical(1))) } # Add metadata add_metadata <- function(pkg_path, meta) { # During installation, the DESCRIPTION file is read and an package.rds file # created with most of the information from the DESCRIPTION file. Functions # that read package metadata may use either the DESCRIPTION file or the # package.rds file, therefore we attempt to modify both of them, and return an # error if neither one exists. source_desc <- file.path(pkg_path, "DESCRIPTION") binary_desc <- file.path(pkg_path, "Meta", "package.rds") if (file.exists(source_desc)) { desc <- read_dcf(source_desc) desc <- modifyList(desc, meta) write_dcf(source_desc, desc) } if (file.exists(binary_desc)) { pkg_desc <- base::readRDS(binary_desc) desc <- as.list(pkg_desc$DESCRIPTION) desc <- modifyList(desc, meta) pkg_desc$DESCRIPTION <- stats::setNames(as.character(desc), names(desc)) base::saveRDS(pkg_desc, binary_desc) } if (!file.exists(source_desc) && !file.exists(binary_desc)) { stop("No DESCRIPTION found!", call. = FALSE) } } # Modify the MD5 file - remove the line for DESCRIPTION clear_description_md5 <- function(pkg_path) { path <- file.path(pkg_path, "MD5") if (file.exists(path)) { text <- readLines(path) text <- text[!grepl(".*\\*DESCRIPTION$", text)] writeLines(text, path) } } remote <- function(type, ...) 
{ structure(list(...), class = c(paste0(type, "_remote"), "remote")) } is.remote <- function(x) inherits(x, "remote") different_sha <- function(remote = NULL, remote_sha = NULL, local_sha = NULL) { if (is.null(remote_sha)) { remote_sha <- remote_sha(remote) } if (is.null(local_sha)) { local_sha <- local_sha(remote_package_name(remote)) } same <- remote_sha == local_sha same <- isTRUE(same) && !is.na(same) !same } local_sha <- function(name) { if (!is_installed(name)) { return(NA_character_) } package2remote(name)$sha %||% NA_character_ } remote_download <- function(x, quiet = FALSE) UseMethod("remote_download") remote_metadata <- function(x, bundle = NULL, source = NULL) UseMethod("remote_metadata") remote_package_name <- function(remote, ...) UseMethod("remote_package_name") remote_sha <- function(remote, ...) UseMethod("remote_sha") package2remote <- function(name, repos = getOption("repos"), type = getOption("pkgType")) { x <- tryCatch(packageDescription(name, lib.loc = .libPaths()), error = function(e) NA, warning = function(e) NA) # will be NA if not installed if (identical(x, NA)) { return(remote("cran", name = name, repos = repos, pkg_type = type, sha = NA_character_)) } if (is.null(x$RemoteType)) { # Packages installed with install.packages() or locally without devtools return(remote("cran", name = x$Package, repos = repos, pkg_type = type, sha = x$Version)) } switch(x$RemoteType, github = remote("github", host = x$RemoteHost, repo = x$RemoteRepo, subdir = x$RemoteSubdir, username = x$RemoteUsername, ref = x$RemoteRef, sha = x$RemoteSha), git = remote("git", url = x$RemoteUrl, ref = x$RemoteRef, sha = x$RemoteSha, subdir = x$RemoteSubdir), bitbucket = remote("bitbucket", host = x$RemoteHost, repo = x$RemoteRepo, username = x$RemoteUsername, ref = x$RemoteRef, sha = x$RemoteSha, subdir = x$RemoteSubdir), svn = remote("svn", url = x$RemoteUrl, svn_subdir = x$RemoteSvnSubdir, branch = x$RemoteBranch, sha = x$RemoteRevision, args = x$RemoteArgs), local = remote("local", path
= x$RemoteUrl, branch = x$RemoteBranch, subdir = x$RemoteSubdir, sha = x$RemoteSha %||% x$Version, username = x$RemoteUsername, repo = x$RemoteRepo), url = remote("url", url = x$RemoteUrl, subdir = x$RemoteSubdir, config = x$RemoteConfig), bioc = remote("bioc", repo = x$RemoteRepo, mirror = x$RemoteMirror, release = x$RemoteRelease, username = x$RemoteUsername, password = x$RemotePassword, revision = x$RemoteRevision, sha = x$RemoteSha), # packages installed with install_cran cran = remote("cran", name = x$Package, repos = eval(parse(text = x$RemoteRepos)), pkg_type = x$RemotePkgType, sha = x$RemoteSha)) } #' @export format.remotes <- function(x, ...) { vapply(x, format, character(1)) } devtools/R/has-devel.r0000644000176200001440000000236713200623655014357 0ustar liggesusers# The checking code looks for the objects in the package namespace, so defining # dll here removes the following NOTE # Registration problem: # Evaluating 'dll$foo' during check gives error # 'object 'dll' not found': # .C(dll$foo, 0L) # See https://github.com/wch/r-source/blob/d4e8fc9832f35f3c63f2201e7a35fbded5b5e14c/src/library/tools/R/QC.R#L1950-L1980 # Setting the class is needed to avoid a note about returning the wrong class. # The local object is found first in the actual call, so current behavior is # unchanged. dll <- list(foo = structure(list(), class = "NativeSymbolInfo")) #' Check if you have a development environment installed. #' #' Thanks to the suggestion of Simon Urbanek. #' #' @return TRUE if your development environment is correctly set up, otherwise #' returns an error. 
#' @export #' @examples #' has_devel() has_devel <- function() { foo_path <- file.path(tempdir(), "foo.c") cat("void foo(int *bar) { *bar=1; }\n", file = foo_path) on.exit(unlink(foo_path)) R("CMD SHLIB foo.c", tempdir()) dylib <- file.path(tempdir(), paste("foo", .Platform$dynlib.ext, sep='')) on.exit(unlink(dylib), add = TRUE) dll <- dyn.load(dylib) on.exit(dyn.unload(dylib), add = TRUE) stopifnot(.C(dll$foo, 0L)[[1]] == 1L) TRUE } devtools/R/run-source.r0000644000176200001440000001207713200623655014610 0ustar liggesusers#' Run a script through some protocols such as http, https, ftp, etc. #' #' If a SHA-1 hash is specified with the \code{sha1} argument, then this #' function will check the SHA-1 hash of the downloaded file to make sure it #' matches the expected value, and throw an error if it does not match. If the #' SHA-1 hash is not specified, it will print a message displaying the hash of #' the downloaded file. The purpose of this is to improve security when running #' remotely-hosted code; if you have a hash of the file, you can be sure that #' it has not changed. For convenience, it is possible to use a truncated SHA1 #' hash, down to 6 characters, but keep in mind that a truncated hash won't be #' as secure as the full hash. #' #' @param url url #' @param ... other options passed to \code{\link{source}} #' @param sha1 The (prefix of the) SHA-1 hash of the file at the remote URL. 
#' @export #' @examples #' \dontrun{ #' #' source_url("https://gist.github.com/hadley/6872663/raw/hi.r") #' #' # With a hash, to make sure the remote file hasn't changed #' source_url("https://gist.github.com/hadley/6872663/raw/hi.r", #' sha1 = "54f1db27e60bb7e0486d785604909b49e8fef9f9") #' #' # With a truncated hash #' source_url("https://gist.github.com/hadley/6872663/raw/hi.r", #' sha1 = "54f1db27e60") #' } source_url <- function(url, ..., sha1 = NULL) { stopifnot(is.character(url), length(url) == 1) temp_file <- tempfile() on.exit(unlink(temp_file)) request <- httr::GET(url) httr::stop_for_status(request) writeBin(httr::content(request, type = "raw"), temp_file) file_sha1 <- digest::digest(file = temp_file, algo = "sha1") if (is.null(sha1)) { message("SHA-1 hash of file is ", file_sha1) } else { if (nchar(sha1) < 6) { stop("Supplied SHA-1 hash is too short (must be at least 6 characters)") } # Truncate file_sha1 to length of sha1 file_sha1 <- substr(file_sha1, 1, nchar(sha1)) if (!identical(file_sha1, sha1)) { stop("SHA-1 hash of downloaded file (", file_sha1, ")\n  does not match expected value (", sha1, ")", call. = FALSE) } } source(temp_file, ...) } #' Run a script on gist #' #' \dQuote{Gist is a simple way to share snippets and pastes with others. #' All gists are git repositories, so they are automatically versioned, #' forkable and usable as a git repository.} #' \url{https://gist.github.com/} #' #' @param id either full url (character), gist ID (numeric or character of #' numeric). #' @param ... other options passed to \code{\link{source}} #' @param filename if there is more than one R file in the gist, which one to #' source (filename ending in '.R')? Default \code{NULL} will source the #' first file. #' @param sha1 The SHA-1 hash of the file at the remote URL. This is highly #' recommended, as it prevents you from accidentally running code that's not #' what you expect. See \code{\link{source_url}} for more information on #' using a SHA-1 hash.
#' @param quiet if \code{FALSE}, the default, prints informative messages. #' @export #' @examples #' \dontrun{ #' # You can run gists given their id #' source_gist(6872663) #' source_gist("6872663") #' #' # Or their html url #' source_gist("https://gist.github.com/hadley/6872663") #' source_gist("gist.github.com/hadley/6872663") #' #' # It's highly recommended that you run source_gist with the optional #' # sha1 argument - this will throw an error if the file has changed since #' # you first ran it #' source_gist(6872663, sha1 = "54f1db27e60") #' # Wrong hash will result in error #' source_gist(6872663, sha1 = "54f1db27e61") #' #' #' # You can specify a particular R file in the gist #' source_gist(6872663, filename = "hi.r") #' source_gist(6872663, filename = "hi.r", sha1 = "54f1db27e60") #' } source_gist <- function(id, ..., filename = NULL, sha1 = NULL, quiet = FALSE) { stopifnot(length(id) == 1) url_match <- "((^https://)|^)gist.github.com/([^/]+/)?([0-9a-f]+)$" if (grepl(url_match, id)) { # https://gist.github.com/kohske/1654919, https://gist.github.com/1654919, # or gist.github.com/1654919 id <- regmatches(id, regexec(url_match, id))[[1]][5] url <- find_gist(id, filename) } else if (is.numeric(id) || grepl("^[0-9a-f]+$", id)) { # 1654919 or "1654919" url <- find_gist(id, filename) } else { stop("Unknown id: ", id) } if (!quiet) message("Sourcing ", url) source_url(url, ..., sha1 = sha1) } find_gist <- function(id, filename) { files <- github_GET(sprintf("gists/%s", id))$files r_files <- files[grepl("\\.[rR]$", names(files))] if (length(r_files) == 0) { stop("No R files found in gist", call.
= FALSE) } if (!is.null(filename)) { if (!is.character(filename) || length(filename) > 1 || !grepl("\\.[rR]$", filename)) { stop("'filename' must be NULL, or a single filename ending in .R/.r") } which <- match(tolower(filename), tolower(names(r_files))) if (is.na(which)) { stop("You have specified a file that is not in this gist.") } } else { if (length(r_files) > 1) { warning("Multiple R files in gist, using first.") } which <- 1 } r_files[[which]]$raw_url } devtools/R/build-dependencies.R0000644000176200001440000000176413200623655016172 0ustar liggesusershas_src <- function(pkg = ".") { pkg <- as.package(pkg) src_path <- file.path(pkg$path, "src") file.exists(src_path) } check_build_tools <- function(pkg = ".") { if (!has_src(pkg)) { return(TRUE) } # RStudio provides a dialog that can prompt the user to install the tools. check <- getOption("buildtools.check", NULL) if (!is.null(check)) { can_build <- check("Building R package from source") setup_rtools() } else { # Outside of RStudio, check on Windows, otherwise assume they're present can_build <- setup_rtools() } if (!can_build) { stop("Could not find build tools necessary to build ", pkg$package, call. = FALSE) } } has_latex <- function(verbose = FALSE) { has <- nzchar(Sys.which("pdflatex")) if (!has && verbose) { message("pdflatex not found! Not building PDF manual or vignettes.\n", "If you are planning to release this package, please run a check with ", "manual and vignettes beforehand.\n") } has } devtools/R/missing-s3.r0000644000176200001440000000126213200623655014474 0ustar liggesusers#' Find missing S3 exports. #' #' The method is heuristic - looking for objects with a period in their name. #' #' @param pkg package description, can be path or package name.
See #' \code{\link{as.package}} for more information #' @export missing_s3 <- function(pkg = ".") { pkg <- as.package(pkg) check_suggested("roxygen2") loaded <- load_all(pkg) # Find all S3 methods in package objs <- ls(envir = loaded$env) is_s3 <- function(x) roxygen2::is_s3_method(x, env = loaded$env) s3_objs <- Filter(is_s3, objs) # Find all S3 methods in NAMESPACE ns <- parse_ns_file(pkg) exports <- paste(ns$S3methods[, 1], ns$S3methods[, 2], sep = ".") setdiff(s3_objs, exports) } devtools/R/rtools.r0000644000176200001440000002520113200623655014021 0ustar liggesusersusing_gcc49 <- function() { isTRUE(sub("^gcc[^[:digit:]]+", "", Sys.getenv("R_COMPILED_BY")) >= "4.9.3") } gcc_arch <- function() if (Sys.getenv("R_ARCH") == "/i386") "32" else "64" # Need to check for existence so load_all doesn't override known rtools location if (!exists("set_rtools_path")) { set_rtools_path <- NULL get_rtools_path <- NULL local({ rtools_paths <- NULL set_rtools_path <<- function(rtools) { if (is.null(rtools)) { rtools_paths <<- NULL return() } stopifnot(is.rtools(rtools)) path <- file.path(rtools$path, version_info[[rtools$version]]$path) # If using gcc49 and _without_ a valid BINPREF already set if (using_gcc49() && is.null(rtools$valid_binpref)) { Sys.setenv(BINPREF = file.path(rtools$path, "mingw_$(WIN)", "bin", "/")) } rtools_paths <<- path } get_rtools_path <<- function() { rtools_paths } }) } #' Find rtools. #' #' To build binary packages on Windows, Rtools (found at #' \url{https://cran.r-project.org/bin/windows/Rtools/}) needs to be on #' the path. The default installation process does not add it, so this #' script finds it (looking first on the path, then in the registry). #' It also checks that the version of Rtools matches the version of R. #' #' @section Acknowledgements: #' This code borrows heavily from RStudio's code for finding rtools. #' Thanks JJ! #' @param cache if \code{TRUE} will use the cached version of Rtools.
#' @param debug if \code{TRUE} prints a lot of additional information to #' help in debugging. #' @return Either a visible \code{TRUE} if rtools is found, or an invisible #' \code{FALSE} with a diagnostic \code{\link{message}}. #' As a side-effect the internal package variable \code{rtools_path} is #' updated to the paths to rtools binaries. #' @keywords internal #' @export setup_rtools <- function(cache = TRUE, debug = FALSE) { # Non-windows users don't need rtools if (.Platform$OS.type != "windows") return(TRUE) # Don't look again, if we've already found it if (!cache) { set_rtools_path(NULL) } if (!is.null(get_rtools_path())) return(TRUE) # First try the path from_path <- scan_path_for_rtools(debug) if (is_compatible(from_path)) { set_rtools_path(from_path) return(TRUE) } if (!is.null(from_path)) { # Installed if (is.null(from_path$version)) { # but not from rtools if (debug) cat("gcc and ls on path, assuming set up is correct\n") return(TRUE) } else { # Installed, but not compatible message("WARNING: Rtools ", from_path$version, " found on the path", " at ", from_path$path, " is not compatible with R ", getRversion(), ".\n\n", "Please download and install ", rtools_needed(), " from ", rtools_url, " and remove the incompatible version from your PATH.") return(invisible(FALSE)) } } # Not on path, so try registry registry_candidates <- scan_registry_for_rtools(debug) if (length(registry_candidates) == 0) { # Not on path or in registry, so not installed message("WARNING: Rtools is required to build R packages, but is not ", "currently installed.\n\n", "Please download and install ", rtools_needed(), " from ", rtools_url, ".") return(invisible(FALSE)) } from_registry <- Find(is_compatible, registry_candidates, right = TRUE) if (is.null(from_registry)) { # In registry, but not compatible.
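  # For example (illustrative comment, based on the version_info table at
  # the bottom of this file): running under R 3.4.x, is_compatible() accepts
  # only a registry candidate whose declared range covers 3.4 -- i.e. the
  # "3.4" entry with version_min 3.3.0 and version_max 3.4.99 -- so a
  # machine that only has Rtools 3.3 registered (version_max 3.3.99)
  # reaches this branch.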
versions <- vapply(registry_candidates, function(x) x$version, character(1)) message("WARNING: Rtools is required to build R packages, but no version ", "of Rtools compatible with R ", getRversion(), " was found. ", "(Only the following incompatible version(s) of Rtools were found:", paste(versions, collapse = ","), ")\n\n", "Please download and install ", rtools_needed(), " from ", rtools_url, ".") return(invisible(FALSE)) } installed_ver <- installed_version(from_registry$path, debug = debug) if (is.null(installed_ver)) { # Previously installed version now deleted message("WARNING: Rtools is required to build R packages, but the ", "version of Rtools previously installed in ", from_registry$path, " has been deleted.\n\n", "Please download and install ", rtools_needed(), " from ", rtools_url, ".") return(invisible(FALSE)) } if (installed_ver != from_registry$version) { # Installed version doesn't match registry version message("WARNING: Rtools is required to build R packages, but no version ", "of Rtools compatible with R ", getRversion(), " was found. 
", "Rtools ", from_registry$version, " was previously installed in ", from_registry$path, " but now that directory contains Rtools ", installed_ver, ".\n\n", "Please download and install ", rtools_needed(), " from ", rtools_url, ".") return(invisible(FALSE)) } # Otherwise it must be ok :) set_rtools_path(from_registry) TRUE } scan_path_for_rtools <- function(debug = FALSE, gcc49 = using_gcc49(), arch = gcc_arch()) { if (debug) cat("Scanning path...\n") # First look for ls and gcc ls_path <- Sys.which("ls") if (ls_path == "") return(NULL) if (debug) cat("ls :", ls_path, "\n") # We have a candidate installPath install_path <- dirname(dirname(ls_path)) if (gcc49) { find_gcc49 <- function(path) { if (!file.exists(path)) { path <- paste0(path, ".exe") } file_info <- file.info(path) # file_info$exe should be win32 or win64 respectively if (!file.exists(path) || file_info$exe != paste0("win", arch)) { return(character(1)) } path } # First check if gcc set by BINPREF/CC is valid and use that is so cc_path <- RCMD("config", "CC", fun = system_output, quiet = !debug) # remove '-m64' from tail if it exists cc_path <- sub("[[:space:]]+-m[[:digit:]]+$", "", cc_path) gcc_path <- find_gcc49(cc_path) if (nzchar(gcc_path)) { return(rtools(install_path, NULL, valid_binpref = TRUE)) } # if not check default location Rtools/mingw_{32,64}/bin/gcc.exe gcc_path <- find_gcc49(file.path(install_path, paste0("mingw_", arch), "bin", "gcc.exe")) if (!nzchar(gcc_path)) { return(NULL) } } else { gcc_path <- Sys.which("gcc") if (gcc_path == "") return(NULL) } if (debug) cat("gcc:", gcc_path, "\n") install_path2 <- dirname(dirname(dirname(gcc_path))) # If both install_paths are not equal if (tolower(install_path2) != tolower(install_path)) return(NULL) version <- installed_version(install_path, debug = debug) if (debug) cat("Version:", version, "\n") rtools(install_path, version) } scan_registry_for_rtools <- function(debug = FALSE) { if (debug) cat("Scanning registry...\n") keys <- NULL try(keys 
<- utils::readRegistry("SOFTWARE\\R-core\\Rtools", hive = "HCU", view = "32-bit", maxdepth = 2), silent = TRUE) if (is.null(keys)) try(keys <- utils::readRegistry("SOFTWARE\\R-core\\Rtools", hive = "HLM", view = "32-bit", maxdepth = 2), silent = TRUE) if (is.null(keys)) return(NULL) rts <- vector("list", length(keys)) for (i in seq_along(keys)) { version <- names(keys)[[i]] key <- keys[[version]] if (!is.list(key) || is.null(key$InstallPath)) next; install_path <- normalizePath(key$InstallPath, mustWork = FALSE, winslash = "/") if (debug) cat("Found", install_path, "for", version, "\n") rts[[i]] <- rtools(install_path, version) } Filter(Negate(is.null), rts) } installed_version <- function(path, debug) { if (!file.exists(file.path(path, "Rtools.txt"))) return(NULL) # Find the version path version_path <- file.path(path, "VERSION.txt") if (debug) { cat("VERSION.txt\n") cat(readLines(version_path), "\n") } if (!file.exists(version_path)) return(NULL) # Rtools is in the path -- now crack the VERSION file contents <- NULL try(contents <- readLines(version_path), silent = TRUE) if (is.null(contents)) return(NULL) # Extract the version contents <- gsub("^\\s+|\\s+$", "", contents) version_re <- "Rtools version (\\d\\.\\d+)\\.[0-9.]+$" if (!grepl(version_re, contents)) return(NULL) m <- regexec(version_re, contents) regmatches(contents, m)[[1]][2] } is_compatible <- function(rtools) { if (is.null(rtools)) return(FALSE) if (is.null(rtools$version)) return(FALSE) stopifnot(is.rtools(rtools)) info <- version_info[[rtools$version]] if (is.null(info)) return(FALSE) r_version <- getRversion() r_version >= info$version_min && r_version <= info$version_max } rtools <- function(path, version, ...) 
{ structure(list(version = version, path = path, ...), class = "rtools") } is.rtools <- function(x) inherits(x, "rtools") # Rtools metadata -------------------------------------------------------------- rtools_url <- "http://cran.r-project.org/bin/windows/Rtools/" version_info <- list( "2.11" = list( version_min = "2.10.0", version_max = "2.11.1", path = c("bin", "perl/bin", "MinGW/bin") ), "2.12" = list( version_min = "2.12.0", version_max = "2.12.2", path = c("bin", "perl/bin", "MinGW/bin", "MinGW64/bin") ), "2.13" = list( version_min = "2.13.0", version_max = "2.13.2", path = c("bin", "MinGW/bin", "MinGW64/bin") ), "2.14" = list( version_min = "2.13.0", version_max = "2.14.2", path = c("bin", "MinGW/bin", "MinGW64/bin") ), "2.15" = list( version_min = "2.14.2", version_max = "2.15.1", path = c("bin", "gcc-4.6.3/bin") ), "2.16" = list( version_min = "2.15.2", version_max = "3.0.0", path = c("bin", "gcc-4.6.3/bin") ), "3.0" = list( version_min = "2.15.2", version_max = "3.0.99", path = c("bin", "gcc-4.6.3/bin") ), "3.1" = list( version_min = "3.0.0", version_max = "3.1.99", path = c("bin", "gcc-4.6.3/bin") ), "3.2" = list( version_min = "3.1.0", version_max = "3.2.99", path = c("bin", "gcc-4.6.3/bin") ), "3.3" = list( version_min = "3.2.0", version_max = "3.3.99", path = if (using_gcc49()) { "bin" } else { c("bin", "gcc-4.6.3/bin") } ), "3.4" = list( version_min = "3.3.0", version_max = "3.4.99", path = if (using_gcc49()) { "bin" } else { c("bin", "gcc-4.6.3/bin") } ) ) rtools_needed <- function() { r_version <- getRversion() for(i in rev(seq_along(version_info))) { version <- names(version_info)[i] info <- version_info[[i]] ok <- r_version >= info$version_min && r_version <= info$version_max if (ok) return(paste("Rtools", version)) } "the appropriate version of Rtools" } #' @rdname setup_rtools #' @usage NULL #' @export find_rtools <- setup_rtools devtools/R/cran.r0000644000176200001440000000211413200623655013420 0ustar liggesusersavailable_packages <- 
memoise::memoise(function(repos, type) { suppressWarnings(available.packages(contrib.url(repos, type), type = type)) }) package_url <- function(package, repos, available = available_packages(repos, "source")) { ok <- (available[, "Package"] == package) ok <- ok & !is.na(ok) if (!any(ok)) { return(list(name = NA_character_, url = NA_character_)) } vers <- package_version(available[ok, "Version"]) keep <- vers == max(vers) keep[duplicated(keep)] <- FALSE ok[ok][!keep] <- FALSE name <- paste(package, "_", available[ok, "Version"], ".tar.gz", sep = "") url <- file.path(available[ok, "Repository"], name) list(name = name, url = url) } # Return the version of a package on CRAN (or other repository) # @param package The name of the package. # @param available A matrix of information about packages. cran_pkg_version <- function(package, available = available.packages()) { idx <- available[, "Package"] == package if (any(idx)) { as.package_version(available[package, "Version"]) } else { NULL } } devtools/R/uninstall.r0000644000176200001440000000151213200623655014507 0ustar liggesusers#' Uninstall a local development package. #' #' Uses \code{remove.packages} to uninstall the package. #' To uninstall a package from a non-default library, #' use \code{\link[withr]{with_libpaths}}. #' #' @inheritParams install #' @param unload if \code{TRUE} (the default), will automatically unload the #' package prior to uninstalling. #' @param ... additional arguments passed to \code{\link{remove.packages}}. #' @export #' @family package installation #' @seealso \code{\link{with_debug}} to install packages with debugging flags #' set. uninstall <- function(pkg = ".", unload = TRUE, quiet = FALSE, ...)
{ pkg <- as.package(pkg) if (unload && pkg$package %in% loaded_packages()$package) { unload(pkg) } if (!quiet) { message("Uninstalling ", pkg$package) } remove.packages(pkg$package) invisible(TRUE) } devtools/R/run-example.r0000644000176200001440000000437513200623655014745 0ustar liggesusersrun_example <- function(path, show = TRUE, test = FALSE, run = TRUE, env = new.env(parent = globalenv())) { check_suggested("evaluate") rd <- tools::parse_Rd(path) ex <- rd[rd_tags(rd) == "examples"] code <- process_ex(ex, show = show, test = test, run = run) if (is.null(code)) return() rule("Running examples in ", basename(path)) code <- paste(code, collapse = "") results <- evaluate::evaluate(code, env) replay_stop(results) } process_ex <- function(rd, show = TRUE, test = FALSE, run = TRUE) { tag <- rd_tag(rd) recurse <- function(rd) { unlist(lapply(rd, process_ex, show = show, test = test, run = run)) } if (is.null(tag) || tag == "examples") { return(recurse(rd)) } # Base case if (tag %in% c("RCODE", "COMMENT", "TEXT", "VERB")) { return(rd[[1]]) } # Conditional execution if (tag %in% c("dontshow", "dontrun", "donttest", "testonly")) { out <- recurse(rd) if ((tag == "dontshow" && show) || (tag == "dontrun" && run) || (tag == "donttest" && test) || (tag == "testonly" && !test)) { type <- paste("\n# ", toupper(tag), "\n", sep = "") out <- c(type, out) out <- gsub("\n", "\n# ", out) } return(out) } if (tag %in% c("dots", "ldots")) { return("...") } warning("Unknown tag ", tag, call. 
= FALSE) tag } rd_tag <- function(x) { tag <- attr(x, "Rd_tag") if (is.null(tag)) return() gsub("\\", "", tag, fixed = TRUE) } rd_tags <- function(x) { vapply(x, function(x) rd_tag(x) %||% "", character(1)) } remove_tag <- function(x) { attr(x, "Rd_tag") <- NULL x } replay_stop <- function(x) UseMethod("replay_stop", x) #' @export replay_stop.error <- function(x) { stop(quiet_error(x$message, x$call)) } #' @export replay_stop.default <- function(x) evaluate::replay(x) #' @export replay_stop.list <- function(x) { invisible(lapply(x, replay_stop)) } quiet_error <- function(message, call = NULL) { structure(list(message = as.character(message), call = call), class = c("quietError", "error", "condition")) } #' @export as.character.quietError <- function(x, ...) { if (is.null(x$call)) { paste("Error: ", x$message, sep = "") } else { call <- deparse(x$call) paste("Error in ", call, ": ", x$message, sep = "") } } devtools/R/build-github-devtools.r0000644000176200001440000000445113200623655016717 0ustar liggesusers#' Build the development version of devtools from GitHub. #' #' This function is especially useful for Windows users who want to upgrade #' their version of devtools to the development version hosted on on GitHub. #' In Windows, it's not possible to upgrade devtools while the package is loaded #' because there is an open DLL, which in Windows can't be overwritten. This #' function allows you to build a binary package of the development version of #' devtools; then you can restart R (so that devtools isn't loaded) and install #' the package. #' #' Mac and Linux users don't need this function; they can use #' \code{\link{install_github}} to install devtools directly, without going #' through the separate build-restart-install steps. #' #' This function requires a working development environment. On Windows, it #' needs \url{https://cran.r-project.org/bin/windows/Rtools/}. #' #' @param outfile The name of the output file. 
If NULL (the default), it uses #' ./devtools.tgz (Mac and Linux), or ./devtools.zip (Windows). #' @return a string giving the location (including file name) of the built #' package #' @examples #' \dontrun{ #' library(devtools) #' build_github_devtools() #' #' #### Restart R before continuing #### #' install.packages("./devtools.zip", repos = NULL) #' #' # Remove the package after installation #' unlink("./devtools.zip") #' } #' @export build_github_devtools <- function(outfile = NULL) { .Deprecated(msg = "`build_github_devtools()` is deprecated, you can simply use `install_github(\"hadley/devtools\")`") if (!has_devel()) { stop("This requires a working development environment.") } ext <- if (.Platform$OS.type == "windows") "zip" else "tgz" outfile <- paste0("./devtools.", ext) url <- "https://github.com/hadley/devtools/archive/master.zip" message("Downloading devtools from ", url) bundle <- file.path(tempdir(), "devtools-master.zip") # Download package file request <- httr::GET(url) httr::stop_for_status(request) writeBin(httr::content(request, "raw"), bundle) on.exit(unlink(bundle)) utils::unzip(bundle, exdir = tempdir()) # Build binary package pkgdir <- file.path(tempdir(), "devtools-master") built_pkg <- devtools::build(pkgdir, binary = TRUE) message("Renaming file to ", outfile) file.rename(built_pkg, outfile) invisible(outfile) } devtools/R/install.r0000644000176200001440000002103313200623655014144 0ustar liggesusers#' Install a local development package. #' #' Uses \code{R CMD INSTALL} to install the package. Will also try to install #' dependencies of the package from CRAN, if they're not already installed. #' #' By default, installation takes place using the current package directory. #' If you have compiled code, this means that artefacts of compilation will be #' created in the \code{src/} directory. If you want to avoid this, you can #' use \code{local = FALSE} to first build a package bundle and then install #' it from a temporary directory. 
This is slower, but keeps the source #' directory pristine. #' #' If the package is loaded, it will be reloaded after installation. This is #' not always completely possible, see \code{\link{reload}} for caveats. #' #' To install a package in a non-default library, use \code{\link[withr]{with_libpaths}}. #' #' @param pkg package description, can be path or package name. See #' \code{\link{as.package}} for more information #' @param reload if \code{TRUE} (the default), will automatically reload the #' package after installing. #' @param quick if \code{TRUE} skips docs, multiple-architectures, #' demos, and vignettes, to make installation as fast as possible. #' @param local if \code{FALSE} \code{\link{build}}s the package first: #' this ensures that the installation is completely clean, and prevents any #' binary artefacts (like \file{.o}, \code{.so}) from appearing in your local #' package directory, but is considerably slower, because every compile has #' to start from scratch. #' @param args An optional character vector of additional command line #' arguments to be passed to \code{R CMD install}. This defaults to the #' value of the option \code{"devtools.install.args"}. #' @param quiet if \code{TRUE} suppresses output from this function. #' @param dependencies \code{logical} indicating to also install uninstalled #' packages which this \code{pkg} depends on/links to/suggests. See #' argument \code{dependencies} of \code{\link{install.packages}}. #' @param upgrade_dependencies If \code{TRUE}, the default, will also update #' any out of date dependencies. #' @param build_vignettes if \code{TRUE}, will build vignettes. Normally it is #' \code{build} that's responsible for creating vignettes; this argument makes #' sure vignettes are built even if a build never happens (i.e. because #' \code{local = TRUE}). #' @param keep_source If \code{TRUE} will keep the srcrefs from an installed #' package. This is useful for debugging (especially inside of RStudio). 
#' It defaults to the option \code{"keep.source.pkgs"}. #' @param threads number of concurrent threads to use for installing #' dependencies. #' It defaults to the option \code{"Ncpus"} or \code{1} if unset. #' @param force_deps whether to force installation of dependencies even if their #' SHA1 reference hasn't changed from the currently installed version. #' @param metadata Named list of metadata entries to be added to the #' \code{DESCRIPTION} after installation. #' @param out_dir Directory to store installation output in case of failure. #' @param skip_if_log_exists If the \code{out_dir} is defined and contains #' a file named \code{package.out}, no installation is attempted. #' @param ... additional arguments passed to \code{\link{install.packages}} #' when installing dependencies. \code{pkg} is installed with #' \code{R CMD INSTALL}. #' @export #' @family package installation #' @seealso \code{\link{with_debug}} to install packages with debugging flags #' set. install <- function(pkg = ".", reload = TRUE, quick = FALSE, local = TRUE, args = getOption("devtools.install.args"), quiet = FALSE, dependencies = NA, upgrade_dependencies = TRUE, build_vignettes = FALSE, keep_source = getOption("keep.source.pkgs"), threads = getOption("Ncpus", 1), force_deps = FALSE, metadata = remote_metadata(as.package(pkg)), out_dir = NULL, skip_if_log_exists = FALSE, ...) { pkg <- as.package(pkg) check_build_tools(pkg) # Forcing all of the promises for the current namespace now will avoid lazy-load # errors when the new package is installed overtop the old one. 
# https://stat.ethz.ch/pipermail/r-devel/2015-December/072150.html if (is_loaded(pkg)) { eapply(ns_env(pkg), force, all.names = TRUE) } root_install <- is.null(installing$packages) if (root_install) { on.exit(installing$packages <- NULL, add = TRUE) } if (pkg$package %in% installing$packages) { if (!quiet) { message("Skipping ", pkg$package, ", it is already being installed.") } return(invisible(FALSE)) } if (!is.null(out_dir)) { out_file <- file.path(out_dir, paste0(pkg$package, ".out")) if (skip_if_log_exists && file.exists(out_file)) { message("Skipping ", pkg$package, ", installation failed before, see log in ", out_file) return(invisible(FALSE)) } } else { out_file <- NULL } installing$packages <- c(installing$packages, pkg$package) if (!quiet) { message("Installing ", pkg$package) } # If building vignettes, make sure we have all suggested packages too. if (build_vignettes && missing(dependencies)) { dependencies <- standardise_dep(TRUE) } else { dependencies <- standardise_dep(dependencies) } initial_deps <- dependencies[dependencies != "Suggests"] final_deps <- dependencies[dependencies == "Suggests"] # cache the Remote: dependencies here so we don't have to query them each # time we call install_deps installing$remote_deps <- remote_deps(pkg) on.exit(installing$remote_deps <- NULL, add = TRUE) install_deps(pkg, dependencies = initial_deps, upgrade = upgrade_dependencies, threads = threads, force_deps = force_deps, quiet = quiet, ..., out_dir = out_dir, skip_if_log_exists = skip_if_log_exists) # Build the package. 
Only build locally if it doesn't have vignettes has_vignettes <- length(tools::pkgVignettes(dir = pkg$path)$docs > 0) if (local && !(has_vignettes && build_vignettes)) { built_path <- pkg$path } else { built_path <- build(pkg, tempdir(), vignettes = build_vignettes, quiet = quiet) on.exit(unlink(built_path), add = TRUE) } opts <- c( paste("--library=", shQuote(.libPaths()[1]), sep = ""), if (keep_source) "--with-keep.source", "--install-tests" ) if (quick) { opts <- c(opts, "--no-docs", "--no-multiarch", "--no-demo") } opts <- paste(paste(opts, collapse = " "), paste(args, collapse = " ")) built_path <- normalizePath(built_path, winslash = "/") R(paste("CMD INSTALL ", shQuote(built_path), " ", opts, sep = ""), fun = system2_check, quiet = quiet || !is.null(out_file), out_file = out_file) # Remove immediately upon success unlink(out_file) install_deps(pkg, dependencies = final_deps, upgrade = upgrade_dependencies, threads = threads, force_deps = force_deps, quiet = quiet, ..., out_dir = out_dir, skip_if_log_exists = skip_if_log_exists) if (length(metadata) > 0) { add_metadata(inst(pkg$package), metadata) } if (reload) { reload(pkg, quiet = quiet) } invisible(TRUE) } # A environment to hold which packages are being installed so packages with # circular dependencies can be skipped the second time. installing <- new.env(parent = emptyenv()) #' Install package dependencies if needed. #' #' \code{install_deps} is used by \code{install_*} to make sure you have #' all the dependencies for a package. \code{install_dev_deps()} is useful #' if you have a source version of the package and want to be able to #' develop with it: it installs all dependencies of the package, and it #' also installs roxygen2. #' #' @inheritParams install #' @inheritParams package_deps #' @param ... additional arguments passed to \code{\link{install.packages}}. 
#' @export #' @examples #' \dontrun{install_deps(".")} install_deps <- function(pkg = ".", dependencies = NA, threads = getOption("Ncpus", 1), repos = getOption("repos"), type = getOption("pkgType"), ..., upgrade = TRUE, quiet = FALSE, force_deps = FALSE) { pkg <- dev_package_deps(pkg, repos = repos, dependencies = dependencies, type = type) update(pkg, ..., Ncpus = threads, quiet = quiet, upgrade = upgrade) invisible() } #' @rdname install_deps #' @export install_dev_deps <- function(pkg = ".", ...) { update_packages("roxygen2") install_deps(pkg, ..., dependencies = TRUE, upgrade = FALSE, bioc_packages = TRUE) } devtools/vignettes/0000755000176200001440000000000013200656425014124 5ustar liggesusersdevtools/vignettes/dependencies.Rmd0000644000176200001440000000502213200623656017215 0ustar liggesusers--- title: "Devtools dependencies" author: "Jim Hester, Hadley Wickham" date: "`r Sys.Date()`" output: rmarkdown::html_vignette vignette: > %\VignetteIndexEntry{Devtools dependencies} %\VignetteEngine{knitr::rmarkdown} %\VignetteEncoding{UTF-8} --- # Package remotes Devtools version 1.9 supports package dependency installation for packages not yet in a standard package repository such as [CRAN](https://cran.r-project.org) or [Bioconductor](http://bioconductor.org). You can mark any regular dependency defined in the `Depends`, `Imports`, `Suggests` or `Enhances` fields as being installed from a remote location by adding the remote location to `Remotes` in your `DESCRIPTION` file. This will cause devtools to download and install them prior to installing your package (so they won't be installed from CRAN). The remote dependencies specified in `Remotes` should be described in the following form: ``` Remotes: [type::]<repository>, [type2::]<repository2> ``` The `type` is an optional parameter. If the type is missing, the default is to install from GitHub. Additional remote dependencies should be separated by commas, just like normal dependencies elsewhere in the `DESCRIPTION` file. 
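For example, a `DESCRIPTION` that takes one dependency from GitHub and the rest from CRAN might look like the following sketch (the package name and `Imports` entries here are illustrative, not from the vignette):

```yaml
Package: mypackage
Imports:
    httr,
    testthat
Remotes: hadley/testthat
```

Here `testthat` is still listed in a regular dependency field; the `Remotes` entry only changes *where* it is installed from.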
### GitHub Because GitHub is the most commonly used unofficial package distribution in R, it's the default: ```yaml Remotes: hadley/testthat ``` You can also specify a specific hash, tag, or pull request (using the same syntax as `install_github()`) if you want a particular commit. Otherwise the latest commit on the master branch is used. ```yaml Remotes: hadley/httr@v0.4, klutometis/roxygen#142, hadley/testthat@c67018fa4970 ``` A type of 'github' can be specified, but is not required: ```yaml Remotes: github::hadley/ggplot2 ``` ### Other sources All of the currently supported install sources are available; see the 'See Also' section in `?install` for a complete list. ```yaml # Git Remotes: git::https://github.com/hadley/ggplot2.git # Bitbucket Remotes: bitbucket::sulab/mygene.r@default, dannavarro/lsr-package # Bioconductor Remotes: bioc::3.3/SummarizedExperiment#117513, bioc::release/Biobase # SVN Remotes: svn::https://github.com/hadley/stringr # URL Remotes: url::https://github.com/hadley/stringr/archive/master.zip # Local Remotes: local::/pkgs/testthat # Gitorious Remotes: gitorious::r-mpc-package/r-mpc-package ``` ### CRAN submission When you submit your package to CRAN, all of its dependencies must also be available on CRAN. For this reason, `release()` will warn you if you try to release a package with a `Remotes` field. 
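Putting the vignette's workflow together: once the `Remotes` field is in place, no special arguments are needed at install time. A minimal sketch (the package path is hypothetical):

```r
library(devtools)

# install_deps() reads the DESCRIPTION, resolves each entry in Remotes to
# its remote source, and installs those before the ordinary CRAN
# dependencies.
install_deps("path/to/mypackage", dependencies = TRUE)

# Installing the package itself triggers the same dependency resolution.
install("path/to/mypackage")
```

The same resolution happens for any of the `install_*` entry points, so packages under development can depend on each other without ever touching CRAN.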
devtools/README.md0000644000176200001440000001346413200623655013402 0ustar liggesusers# devtools [![Build Status](https://travis-ci.org/hadley/devtools.svg?branch=master)](https://travis-ci.org/hadley/devtools) [![AppVeyor Build Status](https://ci.appveyor.com/api/projects/status/github/hadley/devtools?branch=master&svg=true)](https://ci.appveyor.com/project/hadley/devtools) [![Coverage Status](https://codecov.io/github/hadley/devtools/coverage.svg?branch=master)](https://codecov.io/github/hadley/devtools?branch=master) [![CRAN_Status_Badge](http://www.r-pkg.org/badges/version/devtools)](https://cran.r-project.org/package=devtools) The aim of `devtools` is to make package development easier by providing R functions that simplify common tasks. An R package is actually quite simple. A package is a template or set of conventions that structures your code. This not only makes sharing code easy, it reduces the time and effort required to complete your project: following a template removes the need to think about how to organize things and paves the way for the creation of standardised tools that can further accelerate your progress. While package development in R can feel intimidating, `devtools` does everything it can to make it less so. In fact, `devtools` comes with a small guarantee: if you get an angry e-mail from an R-core member because of a bug in `devtools`, forward me the email and your address and I'll mail you a card with a handwritten apology. `devtools` is opinionated about package development. It requires that you use `roxygen2` for documentation and `testthat` for testing. Not everyone would agree with this approach, and they are by no means perfect. But they have evolved out of the experience of writing over 30 R packages. I'm always happy to hear about what doesn't work for you and where `devtools` gets in your way. 
Either send an email to the [rdevtools mailing list](http://groups.google.com/group/rdevtools) or file an [issue at the GitHub repository](http://github.com/hadley/devtools/issues). ## Updating to the latest version of devtools You can track (and contribute to) the development of `devtools` at https://github.com/hadley/devtools. To install it: 1. Install the release version of `devtools` from CRAN with `install.packages("devtools")`. 2. Make sure you have a working development environment. * **Windows**: Install [Rtools](https://cran.r-project.org/bin/windows/Rtools/). * **Mac**: Install Xcode from the Mac App Store. * **Linux**: Install a compiler and various development libraries (details vary across different flavors of Linux). 3. Install the development version of devtools. ```R devtools::install_github("hadley/devtools") ``` ## Package development tools All `devtools` functions accept a path as an argument, e.g. `load_all("path/to/path/mypkg")`. If you don't specify a path, `devtools` will look in the current working directory - this is recommended practice. Frequent development tasks: * `load_all()` simulates installing and reloading your package, loading R code in `R/`, compiled shared objects in `src/` and data files in `data/`. During development you usually want to access all functions so `load_all()` ignores the package `NAMESPACE`. `load_all()` will automatically create a `DESCRIPTION` if needed. * `document()` updates documentation, file collation and `NAMESPACE`. * `test()` reloads your code, then runs all `testthat` tests. Building and installing: * `install()` reinstalls the package, detaches the currently loaded version then reloads the new version with `library()`. Reloading a package is not guaranteed to work: see the documentation to `unload()` for caveats. * `build()` builds a package file from package sources. You can use it to build a binary version of your package. 
* `install_*` functions install an R package: * `install_github()` from github, * `install_bitbucket()` from bitbucket, * `install_url()` from an arbitrary url and * `install_local()` from a local file on disk. * `install_version()` installs a specified version from cran. Check and release: * `check()` updates the documentation, then builds and checks the package. `build_win()` builds a package using [win-builder](http://win-builder.r-project.org/), allowing you to easily check your package on windows. * `run_examples()` will run all examples to make sure they work. This is useful because example checking is the last step of `R CMD check`. * `check_man()` runs most of the documentation checking components of `R CMD check` * `release()` makes sure everything is ok with your package (including asking you a number of questions), then builds and uploads to CRAN. It also drafts an email to let the CRAN maintainers know that you've uploaded a new package. ## Other tips I recommend adding the following code to your `.Rprofile`: ```R .First <- function() { options( repos = c(CRAN = "https://cran.rstudio.com/"), browserNLdisabled = TRUE, deparse.max.lines = 2) } if (interactive()) { suppressMessages(require(devtools)) } ``` See the complete list in `?devtools` This will set up R to: * always install packages from the RStudio CRAN mirror * ignore newlines when `browse()`ing * give minimal output from `traceback()` * automatically load `devtools` in interactive sessions There are also a number of options you might want to set (in `.Rprofile`) to customise the default behaviour when creating packages and drafting emails: * `devtools.name`: your name, used to sign emails * `devtools.desc.author`: your R author string, in the form of `"Hadley Wickham [aut, cre]"`. Used when creating default `DESCRIPTION` files. 
* `devtools.desc.license`: a default license used when creating new packages # Code of conduct Please note that this project is released with a [Contributor Code of Conduct](CONDUCT.md). By participating in this project you agree to abide by its terms. devtools/MD50000644000176200001440000006212113201030625012413 0ustar liggesusersb5812d8e2544ba6e61ab6259d842629b *DESCRIPTION 4ae9d5ebe0696cce2990b516bc309e81 *NAMESPACE 23d79328a8ade32454f08eb39e055f97 *NEWS.md 752f58de5622cdaeca45ec6a16ac8295 *R/R.r 4e9dcebb7f5130f7fe646c6925a70f0b *R/aaa.r 4cae753257b3807c1b41086da55eeb2b *R/bash.r 1c20a025a7354b123c365d39c68f46a5 *R/build-dependencies.R e07a7199f2e83d6e3b587390208d0d91 *R/build-github-devtools.r b3febe00d1a4a07586d434823b29252f *R/build.r 03b44e73dd4ca1a4dfe71105b5ee9e95 *R/check-cran.r a2940e5504839990c4e8a78309d6ab72 *R/check-devtools.r 9ecd93991624d7dc48087d6acd40c08f *R/check-doc.r a66655e2f7276c4721600f7f9878bf0a *R/check-git.r 5c1ce110f18bf1e2d5448f882e321f4e *R/check-results.R 13eaf88a41310084e946d13f47edfb10 *R/check.r 5f1a531c16341ebaf92723656c9d7c63 *R/clean.r 277efbac4341325ba5ff08d9cd41d0d7 *R/compile-dll.r e06a70cecdeaefba8b6210b916b541a8 *R/cran.r ccde0a5d2d9e9e74abbeb3fb2245375f *R/create.r 8ae4cb774842ba0c8522506aa3ac84d2 *R/decompress.r a620421cb085060e34904c71051d3c14 *R/deps.R 55daaf5d9406ca18533c0573e54147c5 *R/dev-example.r c96397494dc6150160f72b6f47a563fc *R/dev-help.r 9ad8363047abe472db37b24a20cb6434 *R/dev-meta.r 71da847b77f30d8369f65ac2c9a20dca *R/dev-mode.r 18225084a3bc93f42f0bdc93ba56c13c *R/doctor.R 2bb4396e0ba0a40f3780fd6aed13d5e1 *R/document.r 90b0e27ab465352e899c1de4c5251b1d *R/download-method.R f4d2d1338a5782f6fbed109348871374 *R/env-utils.r ec20041c1e398e82c95c105a8ea8b542 *R/file-cache.r 22fa012ac7ba500f786b001a5e45ea3e *R/git.R 1a6dcc688571b2675333140247278b41 *R/github.R b5ba5c0b4a78f291d16dec56bfc1654a *R/has-devel.r 4e5fc50cfdc279ab70244882a3a525bf *R/has-tests.r 38792f9822e1c6171eb62d1ccb0771b9 *R/imports-env.r 
49596da52897907d864b862a2ccc8779 *R/infrastructure-git.R bf776b45f38c81450a7e63fa3fdec34f *R/infrastructure.R 134e0376f9cb38d09922b4c1e63c9f91 *R/inst.r a5bce974d6f07e622a201e93e978a4fc *R/install-bioc.r 02e10d27f22ba28b85438ede9196ee85 *R/install-bitbucket.r 368bdb94e8c19926a2df5ba878b5296b *R/install-cran.r 32ab4d68d68e116c94afb9f321f6acda *R/install-git.r 06c9115d4bc33040612591147c5ec8b1 *R/install-github.r 194bc8abe23876114f48ccdcf704e898 *R/install-local.r b9a6e2a000c337e2f29c83d7b91907b9 *R/install-min.r 430d0b01e48c14308917b02c6bbbec62 *R/install-remote.R 27a8b7d4092f7c4c16935407655fc609 *R/install-svn.r fa618dcd49447eaf79b963ce24dabbe3 *R/install-url.r 02676c67a077b1b2eb83619764c00cae *R/install-version.r 2ae46d4d96a1bd674ac60031464dc792 *R/install.r c5ffe043e3d2291573fdc51e68600c4c *R/lint.r edfb4d03b3afe630c97160c3e39b604d *R/load-code.r b4789380cf08e313880b4e6442d44cf3 *R/load-data.r 1dd72917e95c30a6bb777a8158eed735 *R/load-depends.r 3c09e99b8bf2d5df65df98a5670d5864 *R/load-dll.r d894d32231bb1a5173e860d6c975d8ec *R/load.r 4c6dd8bdb02df2fbf2afa9cdedcdf539 *R/missing-s3.r 85ed6227a71d711f215767bba828a183 *R/namespace-env.r 3b2414b3ad6f9bf89066eb6b21aefbde *R/package-deps.r cea8bf8399ecd3f2d29ed907a228a227 *R/package-env.r 4156070bddc7dc6ff769b72861c5ccc5 *R/package.r b63bebe50176fe9db76e98edb4ce1fe8 *R/path.r e7796869095b54d8690dfae65c052d58 *R/rcpp-attributes.r 4d77922f3b6d40846342a780e481eeef *R/release.r b267a126e1a4d1ad95cd68abcb6a89b4 *R/reload.r 839d900006685369439cb865f4740ac9 *R/remove-s4-class.r 3c0d1232fd60273e3ae27722f1519d86 *R/revdep-email.R 43609fc20c5568d0d59a146251da5b30 *R/revdep-summarise.R 3185a0d932f1282000528c5b1e4f4edb *R/revdep.R b944a1015fc01819f38d53929cf45600 *R/rtools.r 4f89537e6deaa8aa4498c5bb6656102c *R/run-example.r b41d337c5102b16a664a120a021c22c9 *R/run-examples.r 203af4e1879833c284f5ff4505290fb3 *R/run-loadhooks.r 248e66d68dd6bf362309402c97a3639c *R/run-source.r bf01727a6e9ddaf6bc9f69fc7a8072eb *R/session-info.r 
15f8388807558eaa17f454fa1c8563f7 *R/shims.r 8aea9796d525fd50c4f5d070da12ab5a *R/show-news.r 80280ec376c42db51ba51ec7fa2d6231 *R/source.r 4eb2a534c9e5e0fa7549f008b2d3b6a6 *R/spell-check.R 2cfd7b7b3a5a41722b63c718d917438f *R/system.r 3468834d225765817fd036ae4d5ad4ce *R/test.r e3c9324d5802a76e6aca87b089052958 *R/topic-index.r be25ef26721fc32a511af54ea9447f2a *R/uninstall.r 189762ee27d367c30871c24108edaac4 *R/unload.r 6a110b0250b2f863254059cac101964d *R/upload-ftp.r 5e2ceeda1e042e72824d0fda80310151 *R/utils.r f1b050a25b6c3fc6ffd431d99e6716bd *R/vignette-r.r e7cd30f87e06a509ef03e075c362e264 *R/vignettes.r 78e6a47190fdb758c534a2ad90b86050 *R/wd.r 3c4b0c893679e5ca1fc9d832b5d4eac1 *R/with-debug.r 316466f5f022e4f74a02f9b8ec7249d2 *R/with.r 77f3b6ea5066dd3bcec443fb2e8fc6fd *R/zzz.r f8cc1bac8dc6b84491ba0293042fc614 *README.md 88c064895a7888475ca94a2da4caeb3d *build/vignette.rds 8b5a68edbe3593efcb2764da9d4c7733 *inst/doc/dependencies.Rmd a98609d38030e7c15894a3acaaab92d2 *inst/doc/dependencies.html f87ef290340322089c32b4e573d8f1e8 *inst/templates/CONDUCT.md fccaad4db171eb4173c05286ad2b4b91 *inst/templates/NEWS.md c311ca55d43ba49ec7e9ff5a2a93a6f6 *inst/templates/appveyor.yml 110cd9eb7f10cca6429772ad69b2ebb9 *inst/templates/codecov.yml e96f570e8c922ff5e07e416e99f305de *inst/templates/cran-comments.md 29a9012941a6bcb26bf0fb4382c5dd75 *inst/templates/gpl-v3.md 77639515db0fda8f3b1bffef4cfe3a74 *inst/templates/mit-license.txt 24058451094682328a71aad210422744 *inst/templates/omni-README 9bfbcb6421618728d2a470a386d2b868 *inst/templates/packagename-package.r 8c4b3f14fb3d6ffc74a28f361771b9c9 *inst/templates/readme-rmd-pre-commit.sh 896187df43e14fab73ca49e82e7f8e2a *inst/templates/revdep.R a22ea92b08e06a4bbbb38cfd5693e080 *inst/templates/template.Rproj 997109045736c35a03cfe4e74f14611e *inst/templates/test-example.R b112494cb8925006b418b0d4634063be *inst/templates/testthat.R 8f31cf9ffaa20f44f61291064ce45719 *inst/templates/travis.yml 8dd7ae7d10240ffd26b25a577c64ffeb *man/RCMD.Rd 
b8d463ae9482ae43916f025ce125d584 *man/as.package.Rd 034f627efc31f86adf0c235bdb0fd981 *man/bash.Rd fe65280fcdd0624e32e9cd76c84bd2f5 *man/build.Rd 10a3cdf5e1ea6772e3940af681273d75 *man/build_github_devtools.Rd 379db85a5906718adc93589ad960bc17 *man/build_vignettes.Rd 225705aaf293c66c9fbd88da3728bf5b *man/build_win.Rd e559f558a07265c2ad340343945f089a *man/check.Rd 5d12d6d29abe94d6a5926383b1b70775 *man/check_cran.Rd e1bb2f36e17e31153575398aad8b67e2 *man/check_dep_version.Rd ba2f01b9eb4a1e4842fd6d82be3c854e *man/check_failures.Rd ec9b6ebdaac71060e5b97b780f11c80f *man/check_man.Rd 161494e6c9a1a9cfebf4f9d7c6c2c47d *man/clean_dll.Rd b7fa6149d553f1c1e8cd9a9e8a76e10d *man/clean_source.Rd b01eb64a6cc023a6a258b7830577a301 *man/clean_vignettes.Rd fef3e98f9dc0483241454ffb69ed57c6 *man/compile_dll.Rd c222dab29ce968a1895fffcabe97aef3 *man/compiler_flags.Rd e7637deedee47a3efe45df6ffab2a8b0 *man/create.Rd 6ca86a19f493723b23fd37e2d72fdc29 *man/create_description.Rd 66e3933383f30f714319e17276b7db84 *man/dev_example.Rd 8b7a3caa0bdcf2b63b6fe301c73c4de4 *man/dev_help.Rd 97c81a94d910caa3bfa148bf1c6159bc *man/dev_meta.Rd 17eb86abc64a9cf65850c28ef838f18e *man/dev_mode.Rd ef55a3901b2b4e79824e9a225916988c *man/dev_packages.Rd a2531dfc05c5e7e2d5ef5a195c5f4465 *man/devtest.Rd 4527aa48a7d0b5df34fc44cc7bcf70c6 *man/devtools-deprecated.Rd 2a5db17bf15c5c2af7ffc042bc123392 *man/devtools.Rd 3dfc7f637f38c6d57d8c4decad508e04 *man/document.Rd 6c40ffc234ce9232c7f49688849dbdd8 *man/dr_devtools.Rd 15896d01c62847762fcfe78e1c09f9e6 *man/dr_github.Rd 26fa5894165ed8799cf9e1be8c950cff *man/eval_clean.Rd ff101a48bb093434cd7a4f617d5dbfc8 *man/find_topic.Rd 9f5d4e2c5862034b0a1f2c30ba56591c *man/git_checks.Rd d07cb6d8cfbffd0b8a0983995f385e00 *man/github_pat.Rd 78b2f68ea5a2f3ff5e0b43a416ffe2a0 *man/github_refs.Rd 6253682ebd92f0d64bdb006831c89056 *man/has_devel.Rd 20939d23ddeff9f9cb355d1f0e35879a *man/has_tests.Rd b3d529ed003d74d5549adfdfcaa439dd *man/help.Rd a7e45d8984793b80deecede433fc6ee3 *man/imports_env.Rd 
6d5ab264122d0c2a8e51d90479ce585d *man/infrastructure.Rd 01a5e21f9623e48e687f40049df34239 *man/inst.Rd b9beffd380be22b5b74ba34d6c08e977 *man/install.Rd 1ba53bb7948442a2b718791b606470a4 *man/install_bioc.Rd e9025cee40d2f108e7bbb4fb2dc10a75 *man/install_bitbucket.Rd ea9ba4638537b1cf084e40fe26e7e677 *man/install_cran.Rd 7e90e76c2ae5439105bb49357a548b1f *man/install_deps.Rd c8496f46c6003ad98fd3d9610f4d00bf *man/install_git.Rd d8ccb10b1efd1b32629d32432e56a9ab *man/install_github.Rd ed4d886fc560fa1569697b145bb7e01b *man/install_local.Rd e0432e0660e6c3cf968f5eaddcf74b57 *man/install_svn.Rd b1deeed2c1c8015a89a738acb3e9a5d1 *man/install_url.Rd 226b84fe4b3a9d4a49146cb97aff8006 *man/install_version.Rd a5cbee92d581de2b7a39e161d26ab712 *man/is.package.Rd f6edaf39f92f6288aa11533f171bca73 *man/lint.Rd af4e9a3907747f0f18ac7f7e9ee13200 *man/load_all.Rd acda85bf284a90e9537eeb197a5edfcf *man/load_code.Rd e0b828ef741f7c444e96c10e0bbd01b7 *man/load_data.Rd 861f62b1c16bdb51525a13930cce93d2 *man/load_dll.Rd f0ee57a248c96f99a1d17cd8dad383c0 *man/load_imports.Rd 95efede4c8add4365ab71e16dfb964f9 *man/loaded_packages.Rd 7f37e8fc6f0dd077b58fa01513485e73 *man/missing_s3.Rd f15d265eefd4a247b7474594f4a978f2 *man/ns_env.Rd 63e03eebf5764a545e054d9e6527b898 *man/on_path.Rd 02dc02c9b36571d1650778e45e486f5e *man/package_deps.Rd 76342a70136bbc1e1b63ad14cefed751 *man/package_file.Rd 5fd592af632e36dd24b34763bfee1f5c *man/parse_deps.Rd 30907a3301f5011ee68d1707acc76d2e *man/parse_ns_file.Rd 903938d1ad51e3a55b51262d79fae091 *man/path.Rd 9248ef9be3d966c9fb376455ba2dae27 *man/pkg_env.Rd adbeec9f47336a2725db3f7c42b613f2 *man/r_env_vars.Rd 69c352ca4eb69cf0de8071f2b0c29757 *man/release.Rd 99540da9eb5283424adb28c6f6d58002 *man/release_checks.Rd c73b8ff5193f52907adb1347d86c77de *man/reload.Rd dc463db5b7b35a61f06e106020f24265 *man/revdep.Rd 5f2bf121e620a6a07561908c25951fe6 *man/revdep_check.Rd 038c24bb5dfbccdff0434f0990d12c28 *man/revdep_email.Rd 941c38330090800b8ec5d30f5c3bbb70 *man/run_examples.Rd 
43fcb67af686fbe6e9e1e109a724abc0 *man/run_pkg_hook.Rd f9e07369a2ca63b0ad0cc8852a56bfe2 *man/session_info.Rd 4377f99737a223ba6f99099ad1d96228 *man/setup_rtools.Rd dcc55dde8da6496fe4b3032877523b73 *man/show_news.Rd 23183c9a08277c226cc44090d686a58d *man/source_gist.Rd d8a8f00263ebf3a06e8ce0545b4b6f50 *man/source_url.Rd 6984f1471cffcdeba9eb569f5d368a1f *man/spell_check.Rd 18352f8dd92ace1a4f037f0ff967d762 *man/submit_cran.Rd cbf5038910cc7ff1d449c950e3ca2a75 *man/system.file.Rd 58d452025c754da86b467b20263ef6f0 *man/system_check.Rd e7ac97534320c32ae64cced8918671f3 *man/system_output.Rd ac6ce56cd35f396b5704b21778eebaaf *man/test.Rd c655a802be9ab81102adeaa222e66abb *man/uninstall.Rd 64489abe28a4084749393937a7105de7 *man/unload.Rd 1cc552a798ebc961d05d5e1933c1e6d9 *man/update_packages.Rd 0e95a9835185df297708a6bca2b9b4ff *man/use_build_ignore.Rd 0fb9ddd46ab4926547de7c6ea984cc10 *man/use_data.Rd 6de78ff78daa0f9c5ca20394ca369a0d *man/use_data_raw.Rd c6394dd96c53004ee6deaa6c35d886cf *man/use_git.Rd 5629ba53dac06f7497ea8da147270c1c *man/use_git_hook.Rd c780b5534633b9029eb89fd4489b8332 *man/use_github.Rd cee91b4186f91e1049922b69e7866518 *man/use_github_links.Rd c958d10273290bf8d0b80941d5d297e7 *man/use_news_md.Rd 669628025c74d2cd501c6376c768b6b9 *man/use_package.Rd 35d4e488621f20b85508660ea48a2df7 *man/use_readme_rmd.Rd 1293831362b0153f87d0fdce3cbc25da *man/wd.Rd a317627e9cbf337c5c0a2df6cc449a2a *man/with_debug.Rd 76f69abebd89cf31cb608ef2fa85b773 *tests/has-devel.R b67bfa8b084622fe9bbef3986b82acbb *tests/test-that.R 604e31b0e928d069d21ea4a8be3919cb *tests/testthat/archive.rds 2913f95b2e0f8b0c189759c4a228f46c *tests/testthat/check-results-note.log a6f3d37b84b56e39ad76d628cf744f17 *tests/testthat/helper-github.R f87ef290340322089c32b4e573d8f1e8 *tests/testthat/infrastructure/CONDUCT.md 97eb3e2290da734b8f02cdf52fb5abe7 *tests/testthat/infrastructure/DESCRIPTION 2d36daba1c0b2d80f4e131eb1b06b24a *tests/testthat/infrastructure/LICENSE f4952a5a758aaeaa98846f193e878264 
*tests/testthat/infrastructure/NAMESPACE c068f877b35bce30a95adf87fb8646f3 *tests/testthat/infrastructure/NEWS.md e42ea595f24337c80eeae1d6deb90834 *tests/testthat/infrastructure/R/infrastructure-package.r 7a0c9e00e8fbc1c5b8a453f43e1f4ce9 *tests/testthat/infrastructure/README.Rmd 10bcd78ddcfae533cbf3997ba9e81032 *tests/testthat/infrastructure/README.md 060c7fbc8440b5bd0ea0a792644e49a8 *tests/testthat/infrastructure/Rbuildignore c311ca55d43ba49ec7e9ff5a2a93a6f6 *tests/testthat/infrastructure/appveyor.yml 110cd9eb7f10cca6429772ad69b2ebb9 *tests/testthat/infrastructure/codecov.yml 1c20e8e4e63f3548259c6db13be5eb74 *tests/testthat/infrastructure/cran-comments.md cc53bd2dccb88d01d3d83e28f66c2ea5 *tests/testthat/infrastructure/data/x.rda a22ea92b08e06a4bbbb38cfd5693e080 *tests/testthat/infrastructure/infrastructure.Rproj 896187df43e14fab73ca49e82e7f8e2a *tests/testthat/infrastructure/revdep/check.R 4f6fc0bfde98057e0b22f37e6e80d68c *tests/testthat/infrastructure/tests/testthat.R 1f3a6efaad71107e1684933e89f2f67f *tests/testthat/infrastructure/tests/testthat/test-test1.R 8f31cf9ffaa20f44f61291064ce45719 *tests/testthat/infrastructure/travis.yml 9c1157c67baaadd6bce038f5f8deb009 *tests/testthat/infrastructure/vignettes/test2.Rmd fc3e744bf34ee2fc62b3700bb66c1563 *tests/testthat/rtools-2.15/Rtools.txt 371bec6ffc0072b07210fcc00608b85a *tests/testthat/rtools-2.15/VERSION.txt d41d8cd98f00b204e9800998ecf8427e *tests/testthat/rtools-2.15/bin/ls.exe d41d8cd98f00b204e9800998ecf8427e *tests/testthat/rtools-2.15/gcc-4.6.3/bin/gcc.exe 45317c5b2548b241603e3e4d6fbd1323 *tests/testthat/rtools-gcc493-winbuilder/Rtools.txt 520f27a9bdb7a57489fdb55231a5ef8f *tests/testthat/rtools-gcc493-winbuilder/VERSION.txt d41d8cd98f00b204e9800998ecf8427e *tests/testthat/rtools-gcc493-winbuilder/bin/ls.exe d41d8cd98f00b204e9800998ecf8427e *tests/testthat/rtools-gcc493-winbuilder/mingw_32/bin/gcc.exe d41d8cd98f00b204e9800998ecf8427e *tests/testthat/rtools-gcc493-winbuilder/mingw_64/bin/gcc.exe 
45317c5b2548b241603e3e4d6fbd1323 *tests/testthat/rtools-gcc493/Rtools.txt 520f27a9bdb7a57489fdb55231a5ef8f *tests/testthat/rtools-gcc493/VERSION.txt d41d8cd98f00b204e9800998ecf8427e *tests/testthat/rtools-gcc493/bin/ls.exe d41d8cd98f00b204e9800998ecf8427e *tests/testthat/rtools-gcc493/mingw_32/bin/gcc.exe d41d8cd98f00b204e9800998ecf8427e *tests/testthat/rtools-gcc493/mingw_64/bin/gcc.exe d41d8cd98f00b204e9800998ecf8427e *tests/testthat/rtools-manual/bin/ls.exe d41d8cd98f00b204e9800998ecf8427e *tests/testthat/rtools-manual/gcc-4.6.3/bin/gcc.exe fc3e744bf34ee2fc62b3700bb66c1563 *tests/testthat/rtools-no-gcc/Rtools.txt 371bec6ffc0072b07210fcc00608b85a *tests/testthat/rtools-no-gcc/VERSION.txt d41d8cd98f00b204e9800998ecf8427e *tests/testthat/rtools-no-gcc/bin/ls.exe 4cf2d64e44205fe628ddd534e1151b58 *tests/testthat/shallowRepo/HEAD 78d7c20c4c3a7fb97c98ab599c923421 *tests/testthat/shallowRepo/config a0a7c3fff21f2aea3cfa1d0316dd816c *tests/testthat/shallowRepo/description 036208b4a1ab4a235d75c181e685e5a3 *tests/testthat/shallowRepo/info/exclude 593fdfcbcc04d475782bc046132edcef *tests/testthat/shallowRepo/objects/pack/pack-c4e0f1d1d68408f260cbbf0a533ad5f6bfd5524e.idx 2554b1bae0dbb84947faff32cf409287 *tests/testthat/shallowRepo/objects/pack/pack-c4e0f1d1d68408f260cbbf0a533ad5f6bfd5524e.pack 2d1cf6b7f2fa49adb6f307e98766c975 *tests/testthat/shallowRepo/packed-refs e09030d939f08dcad3985f33db51c18f *tests/testthat/shallowRepo/refs/tags/v1.10.0 4ef4145c4fc7a472499a55cb0b25e3d3 *tests/testthat/shallowRepo/shallow d5fb1a6b99737ba2127cc6a21a1dcfc7 *tests/testthat/test-bioconductor.r d15c8c57b7182962aebf570568fe57f3 *tests/testthat/test-build.r ec609cc728c2c45771f1a4779f81555b *tests/testthat/test-check.r e83b9ae744fd434d5cf94a8f36c8f8e6 *tests/testthat/test-data.r c271667c19e72f6f1657d5f3467a749d *tests/testthat/test-depend.r 236ef0f93a6e5d49d43a9fae2060d092 *tests/testthat/test-description.r b619f32fc10a3d60a6afad75defab867 *tests/testthat/test-dll.r 
da5d16598430ef93913d03b2091e12e0 *tests/testthat/test-extraction.R 59484d4dd0acd3e7c07f413a3db996f9 *tests/testthat/test-getrootdir.R 6afc2fb4072beab29565936d2fc67346 *tests/testthat/test-git.R 70d5f0bfbbb0bc59dae10a96311d77fc *tests/testthat/test-github-connections.R a72634d2871ac6e40614cdf00db37aab *tests/testthat/test-github.r 25b335f5213575b5fa9c041d6c942fca *tests/testthat/test-help.r 07e0667d971338700300e1c6b6ecaa24 *tests/testthat/test-imports.r 95aa428d8052b1ac09931c90e710ac1d *tests/testthat/test-infrastructure.r 6587344b5b1496caf74a12a99e12a0a4 *tests/testthat/test-install-version.R 7070347c4a6f7687cadfb8e7adce73a1 *tests/testthat/test-install.R 245b245c4c7187b8321e29c9c99b7022 *tests/testthat/test-load-collate.r fb9ff788bb966363a891efd5e1fd8a71 *tests/testthat/test-load-hooks.r cb343cb97288bb37b6233e54bf024217 *tests/testthat/test-load.r 2229534d8dc55e86910105826ce02656 *tests/testthat/test-metadata.r d055c102b85df2323db756da99ccb500 *tests/testthat/test-namespace.r 6cdab9a20cd71f3236eab7b8e53df009 *tests/testthat/test-package.R cb676d532fad4e64eb3ddd4bf65b1a90 *tests/testthat/test-remote-metadata.R 1ea299168309e734fc527b65a2e6a7ff *tests/testthat/test-remotes.r 592fc4fd84e7364bf6b08bcb9b9fd4a4 *tests/testthat/test-rtools.r a976185403ee3c36774d70397aabf18c *tests/testthat/test-s4-export.r 9a1c23da0f0015f81f0ddeade1bb58d8 *tests/testthat/test-s4-sort.r 978fcafea7e4cda9580cedcebd614e82 *tests/testthat/test-s4-unload.r 9f65efe1878c88e929dcfb77e8ab9c1f *tests/testthat/test-session-info.R 0e349eede0a6d7b3949ff4007dd23f99 *tests/testthat/test-shim.r ea40ce7a0c1748bc48c22b4405ed23e3 *tests/testthat/test-sort.R 855ea2be026cf0dd483046ea59e75736 *tests/testthat/test-test.r 9bab3088050470ab8d1783b7fa52cb53 *tests/testthat/test-uninstall.r 6d7ae97fc23271950b480a147e8138b1 *tests/testthat/test-update.R c4a4e100c98d5595cc121ae8599916bf *tests/testthat/test-vignettes.r 872b6d963de006fd1d5f630014d44fd6 *tests/testthat/testCheckExtrafile/DESCRIPTION 
85601cb90b291bb14a88bc9e7349b653 *tests/testthat/testCheckExtrafile/NAMESPACE abd2e85f330c6edb718f21b70c4025fd *tests/testthat/testCheckExtrafile/R/a.r 7dd2e80bb9bc9a2e5b5242cfa0773594 *tests/testthat/testCheckExtrafile/an_extra_file f2156963e20e8b103d1c56ff19952e05 *tests/testthat/testCheckExtrafile/man/a.Rd 190573569db1b89f8a95a78b7a88a7c2 *tests/testthat/testCollateAbsent/DESCRIPTION 870a24794b577e7883169d2d0fa5cbc8 *tests/testthat/testCollateAbsent/R/a.r 0bce505aa6c5dc80635131e13597ae39 *tests/testthat/testCollateAbsent/R/b.r 9428933359013ba999fe5a1b9990022e *tests/testthat/testCollateAbsent/R/c.r 7f23764e802386a39c54a97fccfb005d *tests/testthat/testCollateExtra/DESCRIPTION 870a24794b577e7883169d2d0fa5cbc8 *tests/testthat/testCollateExtra/R/a.r fcb09da8e42f2773858290efd5ef7604 *tests/testthat/testCollateMissing/DESCRIPTION 870a24794b577e7883169d2d0fa5cbc8 *tests/testthat/testCollateMissing/R/a.r d119cb97e3acb0ddacf52c658960de8c *tests/testthat/testCollateMissing/R/b.r d050ad69653ed61d8a36ac8be8ebb59e *tests/testthat/testCollateOrder/DESCRIPTION 012592abf4b40a8d741b299df2ded05a *tests/testthat/testCollateOrder/NAMESPACE c8f75318162ecbd0c54038d1ef899ccb *tests/testthat/testCollateOrder/R/a.r 5dc6a5b47f4272a583e1a76ce68ae884 *tests/testthat/testCollateOrder/R/b.r cec09efd7fb2195e51c58be69e08fc28 *tests/testthat/testData/DESCRIPTION f9ea5333e6efd9c51fc46fe34164c341 *tests/testthat/testData/NAMESPACE d2a5cf8248b95340f49aabf60c1dee3f *tests/testthat/testData/R/sysdata.rda e03377f1cb29a3f1861e89fdb0604ad8 *tests/testthat/testData/data/a.rda aa6661f9ad2794e7d2c6d8a36023e78b *tests/testthat/testData/data/b.r cb7beacf5fecc82dc47093f30438a556 *tests/testthat/testDataLazy/DESCRIPTION f9ea5333e6efd9c51fc46fe34164c341 *tests/testthat/testDataLazy/NAMESPACE d2a5cf8248b95340f49aabf60c1dee3f *tests/testthat/testDataLazy/R/sysdata.rda e03377f1cb29a3f1861e89fdb0604ad8 *tests/testthat/testDataLazy/data/a.rda aa6661f9ad2794e7d2c6d8a36023e78b *tests/testthat/testDataLazy/data/b.r 
fbebb7564fe7bd602015709775aaa604 *tests/testthat/testDependMissing/DESCRIPTION 26813d0f7f8e272af05102662163c53b *tests/testthat/testDependMissing/R/a.r af96180ec981f03fbd476eb546ce0091 *tests/testthat/testDllLoad/DESCRIPTION 44de50ca195021dfec7b25689a9e8249 *tests/testthat/testDllLoad/NAMESPACE 1088899aa7d116bc6360163784e95ada *tests/testthat/testDllLoad/R/a.r 68c4f21a77f2f7cc331e49d03918a326 *tests/testthat/testDllLoad/src/null-test.c 0da7dd55d206c37fd1fedfd552c51b45 *tests/testthat/testDllRcpp/DESCRIPTION 029ade657ecf41cfeabeba27c10f35e7 *tests/testthat/testDllRcpp/NAMESPACE 4d04ffe66b12f29b6d6a24647729b7bb *tests/testthat/testDllRcpp/R/RcppExports.R d41d8cd98f00b204e9800998ecf8427e *tests/testthat/testDllRcpp/R/rcpp_hello_world.R b3f86c20b81b51f5162874977d38784d *tests/testthat/testDllRcpp/src/Makevars 73505498722ad4b07b0ddb3067e92523 *tests/testthat/testDllRcpp/src/Makevars.win 62d05afcefe9766e5a9061ee7682d2f2 *tests/testthat/testDllRcpp/src/RcppExports.cpp cdae79fe661bbb694dd79074547735b9 *tests/testthat/testDllRcpp/src/rcpp_hello_world.cpp 3ef2ad4e721c38bf6b4dbd213ca8e474 *tests/testthat/testError/DESCRIPTION 482ad90a326645dfd29cb4009d87fc77 *tests/testthat/testError/R/error.r 9175fa50c1057fe73ee5629fc9e9267e *tests/testthat/testHelp/DESCRIPTION 922005c2954fc4956e18215a334aae71 *tests/testthat/testHelp/NAMESPACE d9598353fe7c2f0554d195151ef27a52 *tests/testthat/testHelp/R/foofoo.r 0dd736555db841dc8c26e961a40eb421 *tests/testthat/testHelp/man/foofoo.Rd d1f0e2582532c62a03076215d497593e *tests/testthat/testHooks/DESCRIPTION 5f10eb4bf9f2ee689fc932f60cc4a002 *tests/testthat/testHooks/R/a.r 396095258983b3d44ae21a25cb676f18 *tests/testthat/testImportMissing/DESCRIPTION 26813d0f7f8e272af05102662163c53b *tests/testthat/testImportMissing/R/a.r 70f90352135085de8bc1f52b475cfd12 *tests/testthat/testImportVersion/DESCRIPTION 85601cb90b291bb14a88bc9e7349b653 *tests/testthat/testImportVersion/NAMESPACE 26813d0f7f8e272af05102662163c53b *tests/testthat/testImportVersion/R/a.r 
aa6661f9ad2794e7d2c6d8a36023e78b *tests/testthat/testImportVersion/R/b.r f5aa586562104703d2de8955d6b3b0e6 *tests/testthat/testLoadDir/DESCRIPTION 4ee0e944b871a30086d8ea8a9c227d4e *tests/testthat/testLoadDir/R/a.r b026ea81923945b1f0c1422d5a92310f *tests/testthat/testLoadHooks/DESCRIPTION 700c3057dba39073ff2db2421ee90563 *tests/testthat/testLoadHooks/R/a.r 63a6931eb1ef4e0b6ef4d64f32ee6f40 *tests/testthat/testMarkdownVignettes/DESCRIPTION eeac23c800462210e6b7cec8c94ccdee *tests/testthat/testMarkdownVignettes/vignettes/test.Rmd 1494bf61f7e5b6e5a36a6df0faf2c486 *tests/testthat/testMissingNsObject/DESCRIPTION 9fb09fba835e687b843b78452bbeb820 *tests/testthat/testMissingNsObject/NAMESPACE 26813d0f7f8e272af05102662163c53b *tests/testthat/testMissingNsObject/R/a.r 588c5a391cfd1e2bf36aae7a8bd3a20c *tests/testthat/testNamespace/DESCRIPTION f4cc9c3507eb406de70a6a7c6ca63d8e *tests/testthat/testNamespace/NAMESPACE 26813d0f7f8e272af05102662163c53b *tests/testthat/testNamespace/R/a.r aa6661f9ad2794e7d2c6d8a36023e78b *tests/testthat/testNamespace/R/b.r a7e2f077bd7e448a798f153c7c153e94 *tests/testthat/testS4export/DESCRIPTION 2b3de241ad146080f3815429df0666b6 *tests/testthat/testS4export/NAMESPACE f30aea7d3b7e4d5293c35571d647efc1 *tests/testthat/testS4export/R/all.r e4786dbff584b1b4d65b4460c02b0b87 *tests/testthat/testS4import/DESCRIPTION da89f1bf6b87610e418958974b95bb42 *tests/testthat/testS4import/NAMESPACE a0c94dee500c22e01d242a233d5f1039 *tests/testthat/testS4import/R/all.r 471d4a329e0d46f1f7796129e382a5e1 *tests/testthat/testS4sort/DESCRIPTION 77889dc631765bb10f12bdbdfa8542af *tests/testthat/testS4sort/NAMESPACE 43b55f669c6979ed4ed2f6080d9a63ae *tests/testthat/testS4sort/R/classes.r bfc904e6fb6910b903755139720af809 *tests/testthat/testS4union/DESCRIPTION 85f7ace341a0219ec4ae73022413b740 *tests/testthat/testS4union/NAMESPACE 567cddae5d3b69b18264748d7c0c9aaa *tests/testthat/testS4union/R/classes.r 583f3e7f7449571e9921cefb3ddc39cf *tests/testthat/testShim/A.txt 
2aaa7783c966d1455430681fe9f46c7b *tests/testthat/testShim/C.txt 7e54256efc4e214ed32c379a67a55634 *tests/testthat/testShim/DESCRIPTION 14451c613ae2c737269d0062f36eeb26 *tests/testthat/testShim/NAMESPACE 8960211fe22ff697115cc52c5f6616b6 *tests/testthat/testShim/R/a.r 119151c832b23525b2818019573bb463 *tests/testthat/testShim/inst/A.txt 490d1b2eb66ba6d4011ada175d264a41 *tests/testthat/testShim/inst/B.txt 061a5d2b06159eb07c4dc25880832acb *tests/testthat/testTest/DESCRIPTION dc21c19f0d6968ee25d441b2cf46017d *tests/testthat/testTest/NAMESPACE 9d0478fe975946f3ce9fbd3c4e003c67 *tests/testthat/testTest/tests/testthat.R 6686ffeb20a11f2001d31f4f85b3cc62 *tests/testthat/testTest/tests/testthat/test-dummy.R de918500a6a50ddacc42bdbe04ca6a11 *tests/testthat/testTestWithDepends/DESCRIPTION 012592abf4b40a8d741b299df2ded05a *tests/testthat/testTestWithDepends/NAMESPACE d8f7b32e5a6ab4d463e3c33080c80537 *tests/testthat/testTestWithDepends/tests/testthat.R 6686ffeb20a11f2001d31f4f85b3cc62 *tests/testthat/testTestWithDepends/tests/testthat/test-dummy.R 755c328df07127fa8a38f18b61dae04d *tests/testthat/testUseData/DESCRIPTION f9ea5333e6efd9c51fc46fe34164c341 *tests/testthat/testUseData/NAMESPACE d41d8cd98f00b204e9800998ecf8427e *tests/testthat/testUseData/R/a.r 06242e2cea47b189439bf8477f8aab09 *tests/testthat/testVignetteExtras/DESCRIPTION d41d8cd98f00b204e9800998ecf8427e *tests/testthat/testVignetteExtras/NAMESPACE 26813d0f7f8e272af05102662163c53b *tests/testthat/testVignetteExtras/vignettes/a.r 2106801e8a4dfc0552ff0fcf3ae2fe7a *tests/testthat/testVignetteExtras/vignettes/new.Rnw 06242e2cea47b189439bf8477f8aab09 *tests/testthat/testVignettes/DESCRIPTION d41d8cd98f00b204e9800998ecf8427e *tests/testthat/testVignettes/NAMESPACE 2106801e8a4dfc0552ff0fcf3ae2fe7a *tests/testthat/testVignettes/vignettes/new.Rnw f56fba6bb35d06307179111a2375e2b2 *tests/testthat/testVignettesBuilt/DESCRIPTION 343f94d4d7972a1af6cb10fd64bae431 *tests/testthat/testVignettesBuilt/NAMESPACE 
ea67dc1edf62b5901bb4eee40db481a6 *tests/testthat/testVignettesBuilt/R/code.r 4a91678fed5c1ef66c63b3d4ee51520e *tests/testthat/testVignettesBuilt/vignettes/new.Rnw 8b5a68edbe3593efcb2764da9d4c7733 *vignettes/dependencies.Rmd devtools/build/0000755000176200001440000000000013200656425013213 5ustar liggesusersdevtools/build/vignette.rds0000644000176200001440000000031113200656425015545 0ustar liggesusersb```b`fab`b2 1# 'ZZ&/ZVSM !%9h @!@tB5/1Tv 3GZY_Ӄ -3'foHf e2|s mMI,F(WJbI^ZP? L̸devtools/DESCRIPTION0000644000176200001440000000253013201030625013607 0ustar liggesusersPackage: devtools Title: Tools to Make Developing R Packages Easier Version: 1.13.4 Authors@R: c( person("Hadley", "Wickham", , "hadley@rstudio.com", role = c("aut", "cre")), person("Winston", "Chang", role = "aut"), person("RStudio", role = "cph"), person("R Core team", role = "ctb", comment = "Some namespace and vignette code extracted from base R") ) Encoding: UTF-8 Description: Collection of package development tools. URL: https://github.com/hadley/devtools BugReports: https://github.com/hadley/devtools/issues Depends: R (>= 3.0.2) Imports: httr (>= 0.4), utils, tools, methods, memoise (>= 1.0.0), whisker, digest, rstudioapi (>= 0.2.0), jsonlite, stats, git2r (>= 0.11.0), withr Suggests: curl (>= 0.9), crayon, testthat (>= 1.0.2), BiocInstaller, Rcpp (>= 0.10.0), MASS, rmarkdown, knitr, hunspell (>= 2.0), lintr (>= 0.2.1), bitops, roxygen2 (>= 5.0.0), evaluate, rversions, covr, gmailr (> 0.7.0) License: GPL (>= 2) VignetteBuilder: knitr RoxygenNote: 6.0.1 NeedsCompilation: yes Packaged: 2017-11-08 19:37:59 UTC; jhester Author: Hadley Wickham [aut, cre], Winston Chang [aut], RStudio [cph], R Core team [ctb] (Some namespace and vignette code extracted from base R) Maintainer: Hadley Wickham Repository: CRAN Date/Publication: 2017-11-09 10:44:37 UTC devtools/man/0000755000176200001440000000000013200623656012667 5ustar liggesusersdevtools/man/show_news.Rd0000644000176200001440000000076113171407310015170 
0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/show-news.r \name{show_news} \alias{show_news} \title{Show package news} \usage{ show_news(pkg = ".", latest = TRUE, ...) } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{latest}{if \code{TRUE}, only show the news for the most recent version.} \item{...}{other arguments passed on to \code{news}} } \description{ Show package news } devtools/man/install_bitbucket.Rd0000644000176200001440000000343213172203511016652 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/install-bitbucket.r \name{install_bitbucket} \alias{install_bitbucket} \title{Install a package directly from bitbucket} \usage{ install_bitbucket(repo, username, ref = "master", subdir = NULL, quiet = FALSE, auth_user = NULL, password = NULL, ...) } \arguments{ \item{repo}{Repository address in the format \code{username/repo[/subdir][@ref|#pull]}. Alternatively, you can specify \code{subdir} and/or \code{ref} using the respective parameters (see below); if both are specified, the values in \code{repo} take precedence.} \item{username}{User name. Deprecated: please include username in the \code{repo}} \item{ref}{Desired git reference; could be a commit, tag, or branch name. Defaults to master.} \item{subdir}{subdirectory within repo that contains the R package.} \item{quiet}{if \code{TRUE} suppresses output from this function.} \item{auth_user}{your account username if you're attempting to install a package hosted in a private repository (and your username is different to \code{username})} \item{password}{your password} \item{...}{Other arguments passed on to \code{\link{install}}.} } \description{ This function is vectorised so you can install multiple packages in a single command. 
} \examples{ \dontrun{ install_bitbucket("sulab/mygene.r@default") install_bitbucket("dannavarro/lsr-package") } } \seealso{ Bitbucket API docs: \url{https://confluence.atlassian.com/bitbucket/use-the-bitbucket-cloud-rest-apis-222724129.html} Other package installation: \code{\link{install_bioc}}, \code{\link{install_cran}}, \code{\link{install_github}}, \code{\link{install_git}}, \code{\link{install_svn}}, \code{\link{install_url}}, \code{\link{install_version}}, \code{\link{install}}, \code{\link{uninstall}} } devtools/man/devtools.Rd0000644000176200001440000000235712705170137015024 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/zzz.r \docType{package} \name{devtools} \alias{devtools} \alias{devtools-package} \title{Package development tools for R.} \description{ Package development tools for R. } \section{Package options}{ Devtools uses the following \code{\link{options}} to configure behaviour: \itemize{ \item \code{devtools.path}: path to use for \code{\link{dev_mode}} \item \code{devtools.name}: your name, used when signing draft emails. \item \code{devtools.install.args}: a string giving extra arguments passed to \code{R CMD install} by \code{\link{install}}. \item \code{devtools.desc.author}: a string providing a default Authors@R string to be used in new \file{DESCRIPTION}s. Should be R code, and look like \code{"Hadley Wickham [aut, cre]"}. See \code{\link[utils]{as.person}} for more details. \item \code{devtools.desc.license}: a default license string to use for new packages. \item \code{devtools.desc.suggests}: a character vector listing packages to add to suggests by default for new packages. 
\item \code{devtools.desc}: a named list listing any other extra options to add to \file{DESCRIPTION} } } devtools/man/dr_github.Rd0000644000176200001440000000065513172203511015123 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/doctor.R \name{dr_github} \alias{dr_github} \title{Diagnose potential GitHub issues} \usage{ dr_github(path = ".") } \arguments{ \item{path}{Path to repository to check. Defaults to current working directory} } \description{ Diagnose potential GitHub issues } \examples{ \donttest{ dr_github() } } \seealso{ Other doctors: \code{\link{dr_devtools}} } devtools/man/package_deps.Rd0000644000176200001440000000500113200623655015557 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/deps.R \name{package_deps} \alias{package_deps} \alias{dev_package_deps} \alias{update.package_deps} \title{Find all dependencies of a CRAN or dev package.} \usage{ package_deps(pkg, dependencies = NA, repos = getOption("repos"), type = getOption("pkgType")) dev_package_deps(pkg = ".", dependencies = NA, repos = getOption("repos"), type = getOption("pkgType"), bioconductor = TRUE) \method{update}{package_deps}(object, ..., quiet = FALSE, upgrade = TRUE) } \arguments{ \item{pkg}{A character vector of package names. If missing, defaults to the name of the package in the current directory.} \item{dependencies}{Which dependencies do you want to check? Can be a character vector (selecting from "Depends", "Imports", "LinkingTo", "Suggests", or "Enhances"), or a logical vector. \code{TRUE} is shorthand for "Depends", "Imports", "LinkingTo" and "Suggests". \code{NA} is shorthand for "Depends", "Imports" and "LinkingTo" and is the default. \code{FALSE} is shorthand for no dependencies (i.e. just check this package, not its dependencies).} \item{repos}{A character vector giving repositories to use.} \item{type}{Type of package to \code{update}. 
If "both", will switch automatically to "binary" to avoid interactive prompts during package installation.} \item{bioconductor}{Install Bioconductor dependencies if the package has a BiocViews field in the DESCRIPTION.} \item{object}{A \code{package_deps} object.} \item{...}{Additional arguments passed to \code{install_packages}.} \item{quiet}{If \code{TRUE}, suppress output.} \item{upgrade}{If \code{TRUE}, also upgrade any out-of-date dependencies.} } \value{ A \code{data.frame} with columns: \tabular{ll}{ \code{package} \tab The dependent package's name,\cr \code{installed} \tab The currently installed version,\cr \code{available} \tab The version available on CRAN,\cr \code{diff} \tab An integer denoting whether the locally installed version of the package is newer (1), the same (0) or older (-1) than the version currently available on CRAN.\cr } } \description{ Find all the dependencies of a package and determine whether they are ahead or behind CRAN. A \code{print()} method identifies mismatches (if any) between local and CRAN versions of each dependent package; an \code{update()} method installs outdated or missing packages from CRAN. } \examples{ \dontrun{ package_deps("devtools") # Use update to update any out-of-date dependencies update(package_deps("devtools")) } } devtools/man/is.package.Rd0000644000176200001440000000036213171407310015156 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/package.r \name{is.package} \alias{is.package} \title{Is the object a package?} \usage{ is.package(x) } \description{ Is the object a package? 
} \keyword{internal} devtools/man/source_gist.Rd0000644000176200001440000000333113171407310015476 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/run-source.r \name{source_gist} \alias{source_gist} \title{Run a script on gist} \usage{ source_gist(id, ..., filename = NULL, sha1 = NULL, quiet = FALSE) } \arguments{ \item{id}{either a full URL (character) or a gist ID (numeric, or a character string of a numeric).} \item{...}{other options passed to \code{\link{source}}} \item{filename}{if there is more than one R file in the gist, which one to source (filename ending in '.R')? Default \code{NULL} will source the first file.} \item{sha1}{The SHA-1 hash of the file at the remote URL. This is highly recommended as it prevents you from accidentally running code that's not what you expect. See \code{\link{source_url}} for more information on using a SHA-1 hash.} \item{quiet}{if \code{FALSE}, the default, prints informative messages.} } \description{ \dQuote{Gist is a simple way to share snippets and pastes with others. 
All gists are git repositories, so they are automatically versioned, forkable and usable as a git repository.} \url{https://gist.github.com/} } \examples{ \dontrun{ # You can run gists given their id source_gist(6872663) source_gist("6872663") # Or their html url source_gist("https://gist.github.com/hadley/6872663") source_gist("gist.github.com/hadley/6872663") # It's highly recommended that you run source_gist with the optional # sha1 argument - this will throw an error if the file has changed since # you first ran it source_gist(6872663, sha1 = "54f1db27e60") # Wrong hash will result in error source_gist(6872663, sha1 = "54f1db27e61") # You can specify a particular R file in the gist source_gist(6872663, filename = "hi.r") source_gist(6872663, filename = "hi.r", sha1 = "54f1db27e60") } } devtools/man/install_git.Rd0000644000176200001440000000251213172203511015457 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/install-git.r \name{install_git} \alias{install_git} \title{Install a package from a git repository} \usage{ install_git(url, subdir = NULL, branch = NULL, credentials = NULL, quiet = FALSE, ...) } \arguments{ \item{url}{Location of package. The url should point to a public or private repository.} \item{subdir}{A sub-directory within a git repository that may contain the package we are interested in installing.} \item{branch}{Name of branch or tag to use, if not master.} \item{credentials}{A git2r credentials object passed through to \code{\link[git2r]{clone}}.} \item{quiet}{if \code{TRUE} suppresses output from this function.} \item{...}{passed on to \code{\link{install}}} } \description{ It is vectorised so you can install multiple packages with a single command. You do not need to have git installed. 
} \examples{ \dontrun{ install_git("git://github.com/hadley/stringr.git") install_git("git://github.com/hadley/stringr.git", branch = "stringr-0.2") } } \seealso{ Other package installation: \code{\link{install_bioc}}, \code{\link{install_bitbucket}}, \code{\link{install_cran}}, \code{\link{install_github}}, \code{\link{install_svn}}, \code{\link{install_url}}, \code{\link{install_version}}, \code{\link{install}}, \code{\link{uninstall}} } devtools/man/inst.Rd0000644000176200001440000000116713200623655014137 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/inst.r \name{inst} \alias{inst} \title{Get the installation path of a package} \usage{ inst(name) } \arguments{ \item{name}{the name of a package.} } \description{ Given the name of a package, this returns a path to the installed copy of the package, which can be passed to other devtools functions. } \details{ It searches for the package in \code{\link{.libPaths}()}. If multiple dirs are found, it will return the first one. 
} \examples{ inst("devtools") inst("grid") \dontrun{ # Can be passed to other devtools functions unload(inst("ggplot2")) } } devtools/man/infrastructure.Rd0000644000176200001440000001006213200623655016234 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/infrastructure.R \name{infrastructure} \alias{infrastructure} \alias{use_testthat} \alias{use_test} \alias{use_rstudio} \alias{use_vignette} \alias{use_rcpp} \alias{use_travis} \alias{add_travis} \alias{use_coverage} \alias{use_appveyor} \alias{use_package_doc} \alias{use_revdep} \alias{add_travis} \alias{use_cran_comments} \alias{add_travis} \alias{use_code_of_conduct} \alias{add_travis} \alias{use_cran_badge} \alias{use_mit_license} \alias{use_gpl3_license} \alias{use_dev_version} \title{Add useful infrastructure to a package.} \usage{ use_testthat(pkg = ".") use_test(name, pkg = ".") use_rstudio(pkg = ".") use_vignette(name, pkg = ".") use_rcpp(pkg = ".") use_travis(pkg = ".", browse = interactive()) use_coverage(pkg = ".", type = c("codecov", "coveralls")) use_appveyor(pkg = ".") use_package_doc(pkg = ".") use_revdep(pkg = ".") use_cran_comments(pkg = ".") use_code_of_conduct(pkg = ".") use_cran_badge(pkg = ".") use_mit_license(pkg = ".", copyright_holder = getOption("devtools.name", "")) use_gpl3_license(pkg = ".") use_dev_version(pkg = ".") } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information.} \item{name}{File name to use for new vignette. Should consist only of numbers, letters, _ and -. I recommend using lower case.} \item{browse}{open a browser window to enable Travis builds for the package automatically.} \item{type}{CI tool to use. Currently supports codecov and coveralls.} \item{copyright_holder}{The copyright holder for this package. Defaults to \code{getOption("devtools.name")}.} } \description{ Add useful infrastructure to a package. 
} \section{\code{use_testthat}}{ Add testing infrastructure to a package that does not already have it. This will create \file{tests/testthat.R}, \file{tests/testthat/} and add \pkg{testthat} to the suggested packages. This is called automatically from \code{\link{test}} if needed. } \section{\code{use_test}}{ Add a test file, also add testing infrastructure if necessary. This will create \file{tests/testthat/test-<name>.R} with a user-specified name for the test. Will fail if the file exists. } \section{\code{use_vignette}}{ Adds needed packages to \code{DESCRIPTION}, and creates draft vignette in \code{vignettes/}. It adds \code{inst/doc} to \code{.gitignore} so you don't accidentally check in the built vignettes. } \section{\code{use_rcpp}}{ Creates \code{src/} and adds needed packages to \code{DESCRIPTION}. } \section{\code{use_travis}}{ Add a basic Travis template to a package. Also adds \code{.travis.yml} to \code{.Rbuildignore} so it isn't included in the built package. } \section{\code{use_coverage}}{ Add test code coverage reporting to the basic Travis template of a package. } \section{\code{use_appveyor}}{ Add a basic AppVeyor template to a package. Also adds \code{appveyor.yml} to \code{.Rbuildignore} so it isn't included in the built package. } \section{\code{use_package_doc}}{ Adds a roxygen template for package documentation. } \section{\code{use_revdep}}{ Add \code{revdep} directory and basic check template. } \section{\code{use_cran_comments}}{ Add \code{cran-comments.md} template. } \section{\code{use_code_of_conduct}}{ Add a code of conduct from \url{http://contributor-covenant.org}. } \section{\code{use_cran_badge}}{ Add a badge to show CRAN status and version number on the README. } \section{\code{use_mit_license}}{ Adds the necessary infrastructure to declare your package as distributed under the MIT license. } \section{\code{use_gpl3_license}}{ Adds the necessary infrastructure to declare your package as distributed under the GPL v3. 
} \section{\code{use_dev_version}}{ This adds ".9000" to the package \code{DESCRIPTION}, adds a new heading to \code{NEWS.md} (if it exists), and then checks the result into git. } \seealso{ Other infrastructure: \code{\link{use_build_ignore}}, \code{\link{use_data_raw}}, \code{\link{use_data}}, \code{\link{use_news_md}}, \code{\link{use_package}}, \code{\link{use_readme_rmd}} } devtools/man/path.Rd0000644000176200001440000000136413172203511014106 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/path.r \name{path} \alias{path} \alias{get_path} \alias{set_path} \alias{add_path} \title{Get/set the PATH variable.} \usage{ get_path() set_path(path) add_path(path, after = Inf) } \arguments{ \item{path}{character vector of paths} \item{after}{for \code{add_path}, the place on the PATH where the new paths should be added} } \value{ \code{set_path} invisibly returns the old path. } \description{ Get/set the PATH variable. } \examples{ path <- get_path() length(path) old <- add_path(".") length(get_path()) set_path(old) length(get_path()) } \seealso{ \code{\link[withr]{with_path}} to temporarily set the path for a block of code Other path: \code{\link{on_path}} } devtools/man/use_data_raw.Rd0000644000176200001440000000114213200623656015612 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/infrastructure.R \name{use_data_raw} \alias{use_data_raw} \title{Use \code{data-raw} to compute package datasets.} \usage{ use_data_raw(pkg = ".") } \arguments{ \item{pkg}{Package where to create \code{data-raw}. Defaults to package in working directory.} } \description{ Use \code{data-raw} to compute package datasets. 
} \seealso{ Other infrastructure: \code{\link{infrastructure}}, \code{\link{use_build_ignore}}, \code{\link{use_data}}, \code{\link{use_news_md}}, \code{\link{use_package}}, \code{\link{use_readme_rmd}} } devtools/man/use_github.Rd0000644000176200001440000000543013200623656015316 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/infrastructure-git.R \name{use_github} \alias{use_github} \title{Connect a local repo with GitHub.} \usage{ use_github(auth_token = github_pat(), private = FALSE, pkg = ".", host = "https://api.github.com", protocol = c("ssh", "https"), credentials = NULL) } \arguments{ \item{auth_token}{Provide a personal access token (PAT) from \url{https://github.com/settings/tokens}. Defaults to the \code{GITHUB_PAT} environment variable.} \item{private}{If \code{TRUE}, creates a private repository.} \item{pkg}{Path to package. See \code{\link{as.package}} for more information.} \item{host}{GitHub API host to use. Override with the endpoint-root for your GitHub enterprise instance, for example, "https://github.hostname.com/api/v3".} \item{protocol}{transfer protocol, either "ssh" (the default) or "https"} \item{credentials}{A \code{\link[git2r]{cred_ssh_key}} specifying specific ssh credentials or NULL for default ssh key and ssh-agent behaviour. Default is NULL.} } \description{ If the current repo does not use git, calls \code{\link{use_git}} automatically. \code{\link{use_github_links}} is called to populate the \code{URL} and \code{BugReports} fields of DESCRIPTION. } \section{Authentication}{ A new GitHub repo will be created via the GitHub API, therefore you must provide a GitHub personal access token (PAT) via the argument \code{auth_token}, which defaults to the value of the \code{GITHUB_PAT} environment variable. Obtain a PAT from \url{https://github.com/settings/tokens}. The "repo" scope is required which is one of the default scopes for a new PAT. 
The argument \code{protocol} reflects how you wish to authenticate with GitHub for this repo in the long run. For either \code{protocol}, a remote named "origin" is created, an initial push is made using the specified \code{protocol}, and a remote tracking branch is set. The URL of the "origin" remote has the form \code{git@github.com:<USERNAME>/<REPO>.git} (\code{protocol = "ssh"}, the default) or \code{https://github.com/<USERNAME>/<REPO>.git} (\code{protocol = "https"}). For \code{protocol = "ssh"}, it is assumed that public and private keys are in the default locations, \code{~/.ssh/id_rsa.pub} and \code{~/.ssh/id_rsa}, respectively, and that \code{ssh-agent} is configured to manage any associated passphrase. Alternatively, specify a \code{\link[git2r]{cred_ssh_key}} object via the \code{credentials} parameter. } \examples{ \dontrun{ ## to use default ssh protocol create("testpkg") use_github(pkg = "testpkg") ## or use https create("testpkg2") use_github(pkg = "testpkg2", protocol = "https") } } \seealso{ Other git infrastructure: \code{\link{use_git_hook}}, \code{\link{use_github_links}}, \code{\link{use_git}} } devtools/man/use_git.Rd0000644000176200001440000000107313200623656014616 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/infrastructure-git.R \name{use_git} \alias{use_git} \title{Initialise a git repository.} \usage{ use_git(message = "Initial commit", pkg = ".") } \arguments{ \item{message}{Message to use for first commit.} \item{pkg}{Path to package. See \code{\link{as.package}} for more information.} } \description{ Initialise a git repository. 
} \examples{ \dontrun{use_git()} } \seealso{ Other git infrastructure: \code{\link{use_git_hook}}, \code{\link{use_github_links}}, \code{\link{use_github}} } devtools/man/install.Rd0000644000176200001440000001015013172203511014611 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/install.r \name{install} \alias{install} \title{Install a local development package.} \usage{ install(pkg = ".", reload = TRUE, quick = FALSE, local = TRUE, args = getOption("devtools.install.args"), quiet = FALSE, dependencies = NA, upgrade_dependencies = TRUE, build_vignettes = FALSE, keep_source = getOption("keep.source.pkgs"), threads = getOption("Ncpus", 1), force_deps = FALSE, metadata = remote_metadata(as.package(pkg)), out_dir = NULL, skip_if_log_exists = FALSE, ...) } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{reload}{if \code{TRUE} (the default), will automatically reload the package after installing.} \item{quick}{if \code{TRUE} skips docs, multiple-architectures, demos, and vignettes, to make installation as fast as possible.} \item{local}{if \code{FALSE} \code{\link{build}}s the package first: this ensures that the installation is completely clean, and prevents any binary artefacts (like \file{.o}, \code{.so}) from appearing in your local package directory, but is considerably slower, because every compile has to start from scratch.} \item{args}{An optional character vector of additional command line arguments to be passed to \code{R CMD install}. This defaults to the value of the option \code{"devtools.install.args"}.} \item{quiet}{if \code{TRUE} suppresses output from this function.} \item{dependencies}{\code{logical} indicating to also install uninstalled packages which this \code{pkg} depends on/links to/suggests. 
See argument \code{dependencies} of \code{\link{install.packages}}.} \item{upgrade_dependencies}{If \code{TRUE}, the default, will also update any out of date dependencies.} \item{build_vignettes}{if \code{TRUE}, will build vignettes. Normally it is \code{build} that's responsible for creating vignettes; this argument makes sure vignettes are built even if a build never happens (i.e. because \code{local = TRUE}).} \item{keep_source}{If \code{TRUE} will keep the srcrefs from an installed package. This is useful for debugging (especially inside of RStudio). It defaults to the option \code{"keep.source.pkgs"}.} \item{threads}{number of concurrent threads to use for installing dependencies. It defaults to the option \code{"Ncpus"} or \code{1} if unset.} \item{force_deps}{whether to force installation of dependencies even if their SHA1 reference hasn't changed from the currently installed version.} \item{metadata}{Named list of metadata entries to be added to the \code{DESCRIPTION} after installation.} \item{out_dir}{Directory to store installation output in case of failure.} \item{skip_if_log_exists}{If the \code{out_dir} is defined and contains a file named \code{package.out}, no installation is attempted.} \item{...}{additional arguments passed to \code{\link{install.packages}} when installing dependencies. \code{pkg} is installed with \code{R CMD INSTALL}.} } \description{ Uses \code{R CMD INSTALL} to install the package. Will also try to install dependencies of the package from CRAN, if they're not already installed. } \details{ By default, installation takes place using the current package directory. If you have compiled code, this means that artefacts of compilation will be created in the \code{src/} directory. If you want to avoid this, you can use \code{local = FALSE} to first build a package bundle and then install it from a temporary directory. This is slower, but keeps the source directory pristine. 
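A minimal sketch of the two installation modes described above; the package
directory \code{"mypkg"} is hypothetical:

```r
## Assumes a source package in ./mypkg (hypothetical path).
library(devtools)

# Default: install straight from the source directory. Fast, but any
# compilation artefacts (.o, .so) are left behind in mypkg/src/.
install("mypkg")

# local = FALSE: build a bundle first and install from a temporary
# directory. Slower, but keeps mypkg/ pristine.
install("mypkg", local = FALSE)
```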
If the package is loaded, it will be reloaded after installation. This is not always completely possible, see \code{\link{reload}} for caveats. To install a package in a non-default library, use \code{\link[withr]{with_libpaths}}. } \seealso{ \code{\link{with_debug}} to install packages with debugging flags set. Other package installation: \code{\link{install_bioc}}, \code{\link{install_bitbucket}}, \code{\link{install_cran}}, \code{\link{install_github}}, \code{\link{install_git}}, \code{\link{install_svn}}, \code{\link{install_url}}, \code{\link{install_version}}, \code{\link{uninstall}} } devtools/man/on_path.Rd0000644000176200001440000000077613172203511014610 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/path.r \name{on_path} \alias{on_path} \title{Test if an object is on the path.} \usage{ on_path(...) } \arguments{ \item{...}{Strings indicating the executables to check for on the path.} } \description{ Test if an object is on the path. } \examples{ on_path("R") on_path("gcc") on_path("foo", "bar") # FALSE in most cases withr::with_path(tempdir(), on_path("gcc")) } \seealso{ Other path: \code{\link{path}} } \keyword{internal} devtools/man/install_version.Rd0000644000176200001440000000354413200623655016376 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/install-version.r \name{install_version} \alias{install_version} \title{Install specified version of a CRAN package.} \usage{ install_version(package, version = NULL, repos = getOption("repos"), type = getOption("pkgType"), ..., quiet = FALSE) } \arguments{ \item{package}{package name} \item{version}{If the specified version is NULL or the same as the most recent version of the package, this function simply calls \code{\link{install}}. 
Otherwise, it looks at the list of
archived source tarballs and tries to install an older version instead.}

\item{repos}{
character vector, the base URL(s) of the repositories to use, e.g., the
URL of a CRAN mirror such as \code{"https://cloud.r-project.org"}. For
more details on supported URL schemes see \code{\link{url}}.

Can be \code{NULL} to install from local files, directories or URLs: this
will be inferred by extension from \code{pkgs} if of length one.
}

\item{type}{character, indicating the type of package to download and
install. Will be \code{"source"} except on Windows and some macOS builds:
see the section on \sQuote{Binary packages} for those.
}

\item{...}{Other arguments passed on to \code{\link{install}}.}

\item{quiet}{
logical: if true, reduce the amount of output.
}
}
\description{
If you are installing a package that contains compiled code, you will need
to have an R development environment installed. You can check if you do by
running \code{\link{has_devel}}.
}
\seealso{
Other package installation: \code{\link{install_bioc}},
  \code{\link{install_bitbucket}}, \code{\link{install_cran}},
  \code{\link{install_github}}, \code{\link{install_git}},
  \code{\link{install_svn}}, \code{\link{install_url}},
  \code{\link{install}}, \code{\link{uninstall}}
}
\author{
Jeremy Stephens
}
devtools/man/install_cran.Rd0000644000176200001440000000225213200623655015627 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/install-cran.r
\name{install_cran}
\alias{install_cran}
\title{Attempts to install a package from CRAN.}
\usage{
install_cran(pkgs, repos = getOption("repos"), type = getOption("pkgType"),
  ..., quiet = FALSE)
}
\arguments{
\item{pkgs}{Character vector of packages to install.}

\item{repos}{A character vector giving repositories to use.}

\item{type}{Type of package to \code{update}.
If "both", will switch automatically to "binary" to avoid interactive
prompts during package installation.}

\item{...}{Additional arguments passed to \code{install_packages}.}

\item{quiet}{If \code{TRUE}, suppress output.}
}
\description{
This function is vectorised on \code{pkgs} so you can install multiple
packages in a single command.
}
\examples{
\dontrun{
install_cran("ggplot2")
install_cran(c("httpuv", "shiny"))
}
}
\seealso{
Other package installation: \code{\link{install_bioc}},
  \code{\link{install_bitbucket}}, \code{\link{install_github}},
  \code{\link{install_git}}, \code{\link{install_svn}},
  \code{\link{install_url}}, \code{\link{install_version}},
  \code{\link{install}}, \code{\link{uninstall}}
}
devtools/man/compiler_flags.Rd0000644000176200001440000000161613200623655016147 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/with-debug.r
\name{compiler_flags}
\alias{compiler_flags}
\title{Default compiler flags used by devtools.}
\usage{
compiler_flags(debug = FALSE)
}
\arguments{
\item{debug}{If \code{TRUE} adds \code{-g -O0} to all flags (also adding
\env{FFLAGS} and \env{FCFLAGS})}
}
\description{
These default flags enforce good coding practice by ensuring that
\env{CFLAGS} and \env{CXXFLAGS} are set to \code{-Wall -pedantic}.
These checks are run by CRAN and are generally considered to be good
practice.
}
\details{
By default \code{\link{compile_dll}} is run with
\code{compiler_flags(TRUE)}, and check with \code{compiler_flags(FALSE)}.

If you want to avoid the possible performance penalty from the debug
flags, install the package.
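As a sketch of the difference, calling the function with and without
\code{debug} shows how the debugging flags are layered on top of the
defaults (the printed values here are indicative, not exact):

```r
## Compare the two flag sets produced by compiler_flags().
library(devtools)

compiler_flags(debug = FALSE)
# e.g. CFLAGS and CXXFLAGS set to "-Wall -pedantic"

compiler_flags(debug = TRUE)
# the same flags with "-g -O0" appended, plus FFLAGS/FCFLAGS entries
```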
}
\examples{
compiler_flags()
compiler_flags(TRUE)
}
\seealso{
Other debugging flags: \code{\link{with_debug}}
}
devtools/man/with_debug.Rd0000644000176200001440000000155313200623656015303 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/with-debug.r
\name{with_debug}
\alias{with_debug}
\title{Temporarily set debugging compilation flags.}
\usage{
with_debug(code, CFLAGS = NULL, CXXFLAGS = NULL, FFLAGS = NULL,
  FCFLAGS = NULL, debug = TRUE)
}
\arguments{
\item{code}{to execute.}

\item{CFLAGS}{flags for compiling C code}

\item{CXXFLAGS}{flags for compiling C++ code}

\item{FFLAGS}{flags for compiling Fortran code.}

\item{FCFLAGS}{flags for Fortran 9x code.}

\item{debug}{If \code{TRUE} adds \code{-g -O0} to all flags (also adding
\env{FFLAGS} and \env{FCFLAGS})}
}
\description{
Temporarily set debugging compilation flags.
}
\examples{
flags <- names(compiler_flags(TRUE))
with_debug(Sys.getenv(flags))
\dontrun{
install("mypkg")
with_debug(install("mypkg"))
}
}
\seealso{
Other debugging flags: \code{\link{compiler_flags}}
}
devtools/man/load_all.Rd0000644000176200001440000000750013200623655014726 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/load.r
\name{load_all}
\alias{load_all}
\title{Load complete package.}
\usage{
load_all(pkg = ".", reset = TRUE, recompile = FALSE,
  export_all = TRUE, quiet = FALSE, create = NA)
}
\arguments{
\item{pkg}{package description, can be path or package name. See
\code{\link{as.package}} for more information.}

\item{reset}{clear package environment and reset file cache before loading
any pieces of the package. This is equivalent to running
\code{\link{unload}} and is the default. Using \code{reset = FALSE} may be
faster for large code bases, but is a significantly less accurate
approximation.}

\item{recompile}{force a recompile of DLL from source code, if present.
This is equivalent to running \code{\link{clean_dll}} before
\code{load_all}}

\item{export_all}{If \code{TRUE} (the default), export all objects. If
\code{FALSE}, export only the objects that are listed as exports in the
NAMESPACE file.}

\item{quiet}{if \code{TRUE} suppresses output from this function.}

\item{create}{only relevant if a package structure does not exist yet: if
\code{TRUE}, create a package structure; if \code{NA}, ask the user (in
interactive mode only)}
}
\description{
\code{load_all} loads a package. It roughly simulates what happens when a
package is installed and loaded with \code{\link{library}}.
}
\details{
Currently \code{load_all}:

\itemize{
  \item Loads all data files in \code{data/}. See \code{\link{load_data}}
    for more details.

  \item Sources all R files in the R directory, storing results in an
    environment that behaves like a regular package namespace. See below
    and \code{\link{load_code}} for more details.

  \item Compiles any C, C++, or Fortran code in the \code{src/} directory
    and connects the generated DLL into R. See \code{\link{compile_dll}}
    for more details.

  \item Runs \code{.onAttach()}, \code{.onLoad()} and \code{.onUnload()}
    functions at the correct times.

  \item If you use \pkg{testthat}, will load all test helpers so you can
    access them interactively.
}
}
\section{Namespaces}{

The namespace environment \code{<namespace:pkgname>} is a child of the
imports environment, which has the name attribute \code{imports:pkgname}.
It in turn is a child of \code{<namespace:base>}, which is a child of the
global environment. (There is also a copy of the base namespace that is a
child of the empty environment.)

The package environment \code{<package:pkgname>} is an ancestor of the
global environment. Normally when loading a package, the objects listed as
exports in the NAMESPACE file are copied from the namespace to the package
environment. However, \code{load_all} by default will copy all objects
(not just the ones listed as exports) to the package environment.
This is useful during development
because it makes all objects easy to access.

To export only the objects listed as exports, use
\code{export_all = FALSE}. This more closely simulates behavior when
loading an installed package with \code{\link{library}}, and can be useful
for checking for missing exports.
}

\section{Shim files}{

\code{load_all} also inserts shim functions into the imports environment
of the loaded package. It presently adds a replacement version of
\code{system.file} which returns different paths from
\code{base::system.file}. This is needed because installed and uninstalled
package sources have different directory structures. Note that this is not
a perfect replacement for \code{base::system.file}.
}

\examples{
\dontrun{
# Load the package in the current directory
load_all("./")

# Running again loads changed files
load_all("./")

# With reset=TRUE, unload and reload the package for a clean start
load_all("./", TRUE)

# With export_all=FALSE, only objects listed as exports in NAMESPACE
# are exported
load_all("./", export_all = FALSE)
}
}
\keyword{programming}
devtools/man/clean_dll.Rd0000644000176200001440000000067213200623655015077 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/compile-dll.r
\name{clean_dll}
\alias{clean_dll}
\title{Remove compiled objects from src/ directory}
\usage{
clean_dll(pkg = ".")
}
\arguments{
\item{pkg}{package description, can be path or package name. See
\code{\link{as.package}} for more information}
}
\description{
Invisibly returns the names of the deleted files.
} \seealso{ \code{\link{compile_dll}} } devtools/man/as.package.Rd0000644000176200001440000000104413171407310015144 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/package.r \name{as.package} \alias{as.package} \title{Coerce input to a package.} \usage{ as.package(x = NULL, create = NA) } \arguments{ \item{x}{object to coerce to a package} \item{create}{only relevant if a package structure does not exist yet: if \code{TRUE}, create a package structure; if \code{NA}, ask the user (in interactive mode only)} } \description{ Possible specifications of package: \itemize{ \item path \item package object } } \keyword{internal} devtools/man/use_git_hook.Rd0000644000176200001440000000136313200623656015640 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/infrastructure-git.R \name{use_git_hook} \alias{use_git_hook} \title{Add a git hook.} \usage{ use_git_hook(hook, script, pkg = ".") } \arguments{ \item{hook}{Hook name. One of "pre-commit", "prepare-commit-msg", "commit-msg", "post-commit", "applypatch-msg", "pre-applypatch", "post-applypatch", "pre-rebase", "post-rewrite", "post-checkout", "post-merge", "pre-push", "pre-auto-gc".} \item{script}{Text of script to run} \item{pkg}{Path to package. See \code{\link{as.package}} for more information.} } \description{ Add a git hook. } \seealso{ Other git infrastructure: \code{\link{use_github_links}}, \code{\link{use_github}}, \code{\link{use_git}} } \keyword{internal} devtools/man/ns_env.Rd0000644000176200001440000000152613200623655014451 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/namespace-env.r \name{ns_env} \alias{ns_env} \title{Return the namespace environment for a package.} \usage{ ns_env(pkg = ".") } \arguments{ \item{pkg}{package description, can be path or package name. 
See \code{\link{as.package}} for more information}
}
\description{
Contains all (exported and non-exported) objects, and is a descendant of
\code{R_GlobalEnv}. The hierarchy is \code{<namespace:pkg>},
\code{<imports:pkg>}, \code{<namespace:base>}, and then
\code{R_GlobalEnv}.
}
\details{
If the package is not loaded, this function returns \code{NULL}.
}
\seealso{
\code{\link{pkg_env}} for the attached environment that contains the
exported objects.

\code{\link{imports_env}} for the environment that contains imported
objects for the package.
}
\keyword{internal}
devtools/man/loaded_packages.Rd0000644000176200001440000000065013171407310016237 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/session-info.r
\name{loaded_packages}
\alias{loaded_packages}
\title{Return a vector of names of attached packages}
\usage{
loaded_packages()
}
\value{
A data frame with columns package and path, giving the name of each
package and the path it was loaded from.
}
\description{
Return a vector of names of attached packages
}
\keyword{internal}
devtools/man/build_github_devtools.Rd0000644000176200001440000000300013200623655017530 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/build-github-devtools.r
\name{build_github_devtools}
\alias{build_github_devtools}
\title{Build the development version of devtools from GitHub.}
\usage{
build_github_devtools(outfile = NULL)
}
\arguments{
\item{outfile}{The name of the output file. If NULL (the default), it uses
./devtools.tgz (Mac and Linux), or ./devtools.zip (Windows).}
}
\value{
a string giving the location (including file name) of the built package
}
\description{
This function is especially useful for Windows users who want to upgrade
their version of devtools to the development version hosted on GitHub. In
Windows, it's not possible to upgrade devtools while the package is loaded
because there is an open DLL, which in Windows can't be overwritten.
This function allows you to build a binary package of the development version of devtools; then you can restart R (so that devtools isn't loaded) and install the package. } \details{ Mac and Linux users don't need this function; they can use \code{\link{install_github}} to install devtools directly, without going through the separate build-restart-install steps. This function requires a working development environment. On Windows, it needs \url{https://cran.r-project.org/bin/windows/Rtools/}. } \examples{ \dontrun{ library(devtools) build_github_devtools() #### Restart R before continuing #### install.packages("./devtools.zip", repos = NULL) # Remove the package after installation unlink("./devtools.zip") } } devtools/man/parse_ns_file.Rd0000644000176200001440000000076013200623655015771 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/namespace-env.r \name{parse_ns_file} \alias{parse_ns_file} \title{Parses the NAMESPACE file for a package} \usage{ parse_ns_file(pkg = ".") } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} } \description{ Parses the NAMESPACE file for a package } \examples{ if (has_tests()) { parse_ns_file(devtest("testLoadHooks")) } } \keyword{internal} devtools/man/reload.Rd0000644000176200001440000000171613200623656014431 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/reload.r \name{reload} \alias{reload} \title{Unload and reload package.} \usage{ reload(pkg = ".", quiet = FALSE) } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{quiet}{if \code{TRUE} suppresses output from this function.} } \description{ This attempts to unload and reload a package. If the package is not loaded already, it does nothing. 
It's not always possible to cleanly unload a package: see the caveats in
\code{\link{unload}} for some of the potential failure points. If in
doubt, restart R and reload the package with \code{\link{library}}.
}
\examples{
\dontrun{
# Reload package that is in current directory
reload(".")

# Reload package that is in ./ggplot2/
reload("ggplot2/")

# Can use inst() to find the package path
# This will reload the installed ggplot2 package
reload(inst("ggplot2"))
}
}
devtools/man/revdep.Rd0000644000176200001440000000304313171407310014435 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/revdep.R
\name{revdep}
\alias{revdep}
\alias{revdep_maintainers}
\title{Reverse dependency tools.}
\usage{
revdep(pkg, dependencies = c("Depends", "Imports", "Suggests",
  "LinkingTo"), recursive = FALSE, ignore = NULL,
  bioconductor = FALSE)

revdep_maintainers(pkg = ".")
}
\arguments{
\item{pkg}{Package name. This is unlike most devtools functions, which
take a path, because you might want to determine dependencies for a
package that you don't have installed. If omitted, defaults to the name of
the current package.}

\item{dependencies}{A character vector listing the types of dependencies
to follow.}

\item{recursive}{If \code{TRUE}, look for the full set of recursive
dependencies.}

\item{ignore}{A character vector of package names to ignore. These
packages will not appear in the returned vector. This is used in
\code{\link{revdep_check}} to avoid packages with installation problems or
extremely long check times.}

\item{bioconductor}{If \code{TRUE}, also look for dependencies amongst
bioconductor packages.}
}
\description{
Tools to check and notify maintainers of all CRAN and bioconductor
packages that depend on the specified package.
}
\details{
The first run in a session will be time-consuming because it must download
all package metadata from CRAN and bioconductor. Subsequent runs will be
faster.
}
\examples{
\dontrun{
revdep("ggplot2")

revdep("ggplot2", ignore = c("xkcd", "zoo"))
}
}
\seealso{
\code{\link{revdep_check}()} to run R CMD check on all reverse
dependencies.
}
devtools/man/check_cran.Rd0000644000176200001440000000413313200623655015236 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/check-cran.r
\name{check_cran}
\alias{check_cran}
\title{Check a package from CRAN.}
\usage{
check_cran(pkgs, libpath = file.path(tempdir(), "R-lib"), srcpath = libpath,
  check_libpath = libpath, bioconductor = FALSE,
  type = getOption("pkgType"), threads = getOption("Ncpus", 1),
  check_dir = tempfile("check_cran"),
  install_dir = tempfile("check_cran_install"), env_vars = NULL,
  quiet_check = TRUE)
}
\arguments{
\item{pkgs}{Vector of package names - note that unlike other \pkg{devtools}
functions this is the name of a CRAN package, not a path.}

\item{libpath}{Path to library to store dependency packages - if you're
doing this a lot it's a good idea to pick a directory and stick with it so
you don't have to download all the packages every time.}

\item{srcpath}{Path to directory to store source versions of dependent
packages - again, this saves a lot of time because you don't need to
redownload the packages every time you run the check.}

\item{check_libpath}{Path to library used for checking, should contain the
top-level library from \code{libpath}.}

\item{bioconductor}{Include bioconductor packages in checking?}

\item{type}{Package type to test (source, mac.binary etc). Defaults to the
same type as \code{\link{install.packages}()}.}

\item{threads}{Number of concurrent threads to use for checking.
It defaults to the option \code{"Ncpus"} or \code{1} if unset.} \item{check_dir, install_dir}{Directory to store check and installation results.} \item{env_vars}{Environment variables set during \code{R CMD check}} \item{quiet_check}{If \code{TRUE}, suppresses individual \code{R CMD check} output and only prints summaries. Set to \code{FALSE} for debugging.} } \value{ Returns (invisibly) the directory where check results are stored. } \description{ Internal function used to power \code{\link{revdep_check}()}. } \details{ This function does not clean up after itself, but does work in a session-specific temporary directory, so all files will be removed when your current R session ends. } \keyword{internal} devtools/man/compile_dll.Rd0000644000176200001440000000175413200623655015447 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/compile-dll.r \name{compile_dll} \alias{compile_dll} \title{Compile a .dll/.so from source.} \usage{ compile_dll(pkg = ".", quiet = FALSE) } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{quiet}{if \code{TRUE} suppresses output from this function.} } \description{ \code{compile_dll} performs a fake R CMD install so code that works here should work with a regular install (and vice versa). } \details{ During compilation, debug flags are set with \code{\link{compiler_flags}(TRUE)}. Invisibly returns the names of the DLL. } \note{ If this is used to compile code that uses Rcpp, you will need to add the following line to your \code{Makevars} file so that it knows where to find the Rcpp headers: \code{PKG_CPPFLAGS=`$(R_HOME)/bin/Rscript -e 'Rcpp:::CxxFlags()'`} } \seealso{ \code{\link{clean_dll}} to delete the compiled files. 
}
devtools/man/dr_devtools.Rd0000644000176200001440000000074513172203511015500 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/doctor.R
\name{dr_devtools}
\alias{dr_devtools}
\title{Diagnose potential devtools issues}
\usage{
dr_devtools()
}
\description{
This checks to make sure you're using the latest release of R, the
released version of RStudio (if you're using it as your GUI), and the
latest version of devtools and its dependencies.
}
\examples{
\dontrun{
dr_devtools()
}
}
\seealso{
Other doctors: \code{\link{dr_github}}
}
devtools/man/update_packages.Rd0000644000176200001440000000276413200623656016303 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/deps.R
\name{update_packages}
\alias{update_packages}
\title{Update packages that are missing or out-of-date.}
\usage{
update_packages(pkgs = NULL, dependencies = NA,
  repos = getOption("repos"), type = getOption("pkgType"))
}
\arguments{
\item{pkgs}{Character vector of packages to update. If \code{TRUE}, all
installed packages are updated. If \code{NULL}, the user is prompted to
confirm update of all installed packages.}

\item{dependencies}{Which dependencies do you want to check? Can be a
character vector (selecting from "Depends", "Imports", "LinkingTo",
"Suggests", or "Enhances"), or a logical vector.

\code{TRUE} is shorthand for "Depends", "Imports", "LinkingTo" and
"Suggests". \code{NA} is shorthand for "Depends", "Imports" and
"LinkingTo" and is the default. \code{FALSE} is shorthand for no
dependencies (i.e. just check this package, not its dependencies).}

\item{repos}{A character vector giving repositories to use.}

\item{type}{Type of package to \code{update}.
If "both", will switch automatically to "binary" to avoid interactive
prompts during package installation.}
}
\description{
Works similarly to \code{install.packages()} but doesn't install packages
that are already installed, and also upgrades outdated dependencies.
}
\examples{
\dontrun{
update_packages("ggplot2")
update_packages(c("plyr", "ggplot2"))
}
}
\seealso{
\code{\link{package_deps}} to see which packages are out of date or
missing.
}
devtools/man/create_description.Rd0000644000176200001440000000152113200623655017022 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/load.r
\name{create_description}
\alias{create_description}
\title{Create a default DESCRIPTION file for a package.}
\usage{
create_description(path = ".", extra = getOption("devtools.desc"),
  quiet = FALSE)
}
\arguments{
\item{path}{path to package root directory}

\item{extra}{a named list of extra options to add to \file{DESCRIPTION}.
Arguments that take a list}

\item{quiet}{if \code{TRUE}, suppresses output from this function.}
}
\description{
Create a default DESCRIPTION file for a package.
}
\details{
To set the default author and licenses, set \code{options}
\code{devtools.desc.author} and \code{devtools.desc.license}. I use
\code{options(devtools.desc.author = '"Hadley Wickham [aut,cre]"',
devtools.desc.license = "GPL-3")}.
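A short sketch of the workflow described above; the package path
\code{"mypkg"} and the author name are hypothetical:

```r
## Set the default author/license options, then generate a DESCRIPTION
## for a source package living in ./mypkg (hypothetical path).
library(devtools)

options(
  devtools.desc.author = '"Jane Doe [aut, cre]"',
  devtools.desc.license = "GPL-3"
)
create_description("mypkg")
```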
} devtools/man/dev_packages.Rd0000644000176200001440000000047213171407310015567 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/session-info.r \name{dev_packages} \alias{dev_packages} \title{Return a vector of names of packages loaded by devtools} \usage{ dev_packages() } \description{ Return a vector of names of packages loaded by devtools } \keyword{internal} devtools/man/submit_cran.Rd0000644000176200001440000000147313171407310015463 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/release.r \name{submit_cran} \alias{submit_cran} \title{Submit a package to CRAN.} \usage{ submit_cran(pkg = ".", args = NULL) } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{args}{An optional character vector of additional command line arguments to be passed to \code{R CMD build}.} } \description{ This uses the new CRAN web-form submission process. After submission, you will receive an email asking you to confirm submission - this is used to check that the package is submitted by the maintainer. } \details{ It's recommended that you use \code{\link{release}()} rather than this function as it performs more checks prior to submission. } \keyword{internal} devtools/man/system.file.Rd0000644000176200001440000000341713200623656015425 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/shims.r \name{system.file} \alias{system.file} \alias{shim_system.file} \title{Replacement version of system.file} \usage{ # system.file(..., package = "base", lib.loc = NULL, mustWork = FALSE) } \arguments{ \item{...}{character vectors, specifying subdirectory and file(s) within some package. The default, none, returns the root of the package. Wildcards are not supported.} \item{package}{a character string with the name of a single package. 
An error occurs if more than one package name is given.}

\item{lib.loc}{a character vector with path names of \R libraries. See
\sQuote{Details} for the meaning of the default value of \code{NULL}.}

\item{mustWork}{logical. If \code{TRUE}, an error is given if there are no
matching files.}
}
\description{
This function is meant to intercept calls to
\code{\link[base]{system.file}}, so that it behaves well with packages
loaded by devtools. It is made available when a package is loaded with
\code{\link{load_all}}.
}
\details{
When \code{system.file} is called from the R console (the global
environment), this function detects if the target package was loaded with
\code{\link{load_all}}, and if so, it uses a customized method of
searching for the file. This is necessary because the directory structure
of a source package is different from the directory structure of an
installed package.

When a package is loaded with \code{load_all}, this function is also
inserted into the package's imports environment, so that calls to
\code{system.file} from within the package namespace will use this
modified version. If this function were not inserted into the imports
environment, then the package would end up calling
\code{base::system.file} instead.
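The difference described above can be sketched as follows; the package
\code{"mypkg"} and the file \code{"extdata/data.csv"} are hypothetical:

```r
## Assumes a source package in ./mypkg with a file inst/extdata/data.csv.
library(devtools)
load_all("mypkg")

# Resolved by the shim: finds the file under mypkg/inst/extdata/,
# even though the package is not installed.
system.file("extdata", "data.csv", package = "mypkg")

# The unshimmed lookup only searches installed libraries:
base::system.file("extdata", "data.csv", package = "mypkg")
```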
}
devtools/man/revdep_check.Rd0000644000176200001440000001111513200623656015577 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/revdep-summarise.R, R/revdep.R
\name{revdep_check_save_summary}
\alias{revdep_check_save_summary}
\alias{revdep_check_print_problems}
\alias{revdep_check}
\alias{revdep_check_resume}
\alias{revdep_check_reset}
\title{Run R CMD check on all downstream dependencies.}
\usage{
revdep_check_save_summary(pkg = ".")

revdep_check_print_problems(pkg = ".")

revdep_check(pkg = ".", recursive = FALSE, ignore = NULL,
  dependencies = c("Depends", "Imports", "Suggests", "LinkingTo"),
  skip = character(), libpath = getOption("devtools.revdep.libpath"),
  srcpath = libpath, bioconductor = FALSE, type = getOption("pkgType"),
  threads = getOption("Ncpus", 1), env_vars = NULL, check_dir = NULL,
  install_dir = NULL, quiet_check = TRUE)

revdep_check_resume(pkg = ".", ...)

revdep_check_reset(pkg = ".")
}
\arguments{
\item{pkg}{Path to package. Defaults to current directory.}

\item{recursive}{If \code{TRUE}, look for the full set of recursive
dependencies.}

\item{ignore}{A character vector of package names to ignore. These
packages will not appear in the returned vector.
This is used in \code{\link{revdep_check}} to avoid packages
with installation problems or extremely long check times.}

\item{dependencies}{A character vector listing the types of dependencies
to follow.}

\item{skip}{A character vector of package names to exclude from the
checks.}

\item{libpath}{Path to library to store dependency packages - if you're
doing this a lot it's a good idea to pick a directory and stick with it so
you don't have to download all the packages every time.}

\item{srcpath}{Path to directory to store source versions of dependent
packages - again, this saves a lot of time because you don't need to
redownload the packages every time you run the check.}

\item{bioconductor}{If \code{TRUE}, also look for dependencies amongst
bioconductor packages.}

\item{type}{Package type to test (source, mac.binary etc). Defaults to the
same type as \code{\link{install.packages}()}.}

\item{threads}{Number of concurrent threads to use for checking.
It defaults to the option \code{"Ncpus"} or \code{1} if unset.}

\item{env_vars}{Environment variables set during \code{R CMD check}}

\item{check_dir}{A temporary directory to hold the results of the package
checks. This should not already exist; after the revdep checks complete
successfully, this directory is deleted.}

\item{install_dir}{Directory to store check and installation results.}

\item{quiet_check}{If \code{TRUE}, suppresses individual
\code{R CMD check} output and only prints summaries. Set to \code{FALSE}
for debugging.}

\item{...}{Optionally, override the original values of arguments to
\code{revdep_check}. Use with care.}
}
\value{
An invisible list of results. But you'll probably want to look at the
check results on disk, which are saved in \code{check_dir}. Summaries of
all ERRORs and WARNINGs will be stored in
\code{check_dir/00check-summary.txt}.
}
\description{
Use \code{revdep_check()} to run \code{\link{check_cran}()} on all
downstream dependencies.
Summarise the results with \code{revdep_check_save_summary()} and see
problems with \code{revdep_check_print_problems()}.
}
\details{
Revdep checks are resumable - this is very helpful if something goes
wrong (like you run out of power or you lose your internet connection)
in the middle of a check. You can resume a partially completed check
with \code{revdep_check_resume()}, or blow away the cached result so
you can start afresh with \code{revdep_check_reset()}.
}
\section{Check process}{

\enumerate{
\item Install \code{pkg} (in a special library, see below).
\item Find all CRAN packages that depend on \code{pkg}.
\item Install those packages, along with their dependencies.
\item Run \code{R CMD check} on each package.
\item Uninstall \code{pkg} (so other reverse dependency checks don't use
the development version instead of the CRAN version).
}
}

\section{Package library}{

By default \code{revdep_check} uses a temporary library to store any
packages that are required by the packages being tested. This ensures
that they don't interfere with your default library, but means that if
you restart R between checks, you'll need to reinstall all the packages.
If you're doing reverse dependency checks frequently, I recommend that
you create a directory for these packages and set
\code{options(devtools.revdep.libpath)}.
}

\examples{
\dontrun{
# Run R CMD check on all downstream dependencies
revdep_check()
revdep_check_save_summary()
revdep_check_print_problems()
}
}
\seealso{
\code{\link{revdep_maintainers}()} to get a list of all revdep
maintainers.
}
devtools/man/missing_s3.Rd0000644000176200001440000000063013171407310015225 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/missing-s3.r
\name{missing_s3}
\alias{missing_s3}
\title{Find missing S3 exports.}
\usage{
missing_s3(pkg = ".")
}
\arguments{
\item{pkg}{package description, can be path or package name.
See \code{\link{as.package}} for more information}
}
\description{
The method is heuristic - looking for objects with a period in their
name.
}
devtools/man/build_vignettes.Rd0000644000176200001440000000215413200623655016346 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/vignettes.r
\name{build_vignettes}
\alias{build_vignettes}
\title{Build package vignettes.}
\usage{
build_vignettes(pkg = ".", dependencies = "VignetteBuilder")
}
\arguments{
\item{pkg}{package description, can be path or package name. See
\code{\link{as.package}} for more information}

\item{dependencies}{\code{logical} indicating whether to also install
uninstalled packages which this \code{pkg} depends on/links to/suggests.
See argument \code{dependencies} of \code{\link{install.packages}}.}
}
\description{
Builds package vignettes using the same algorithm that \code{R CMD
build} does. This means including non-Sweave vignettes, using makefiles
(if present), and copying over extra files. You need to ensure that
these files are not included in the built package - ideally they should
not be checked into source, or at least excluded with
\code{.Rbuildignore}
}
\seealso{
\code{\link{clean_vignettes}} to remove the PDFs in \file{inst/doc}
created from vignettes, along with any leftover build tex/pdf files.
}
\keyword{programming}
devtools/man/load_dll.Rd0000644000176200001440000000055513200623655014734 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/load-dll.r
\name{load_dll}
\alias{load_dll}
\title{Load a compiled DLL}
\usage{
load_dll(pkg = ".")
}
\arguments{
\item{pkg}{package description, can be path or package name.
See \code{\link{as.package}} for more information}
}
\description{
Load a compiled DLL
}
\keyword{programming}
devtools/man/load_data.Rd0000644000176200001440000000061413200623655015066 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/load-data.r
\name{load_data}
\alias{load_data}
\title{Load data.}
\usage{
load_data(pkg = ".")
}
\arguments{
\item{pkg}{package description, can be path or package name. See
\code{\link{as.package}} for more information}
}
\description{
Loads all \code{.RData} files in the data subdirectory.
}
\keyword{programming}
devtools/man/dev_help.Rd0000644000176200001440000000164113200623655014745 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/dev-help.r
\name{dev_help}
\alias{dev_help}
\title{Read the in-development help for a package loaded with devtools.}
\usage{
dev_help(topic, stage = "render", type = getOption("help_type"))
}
\arguments{
\item{topic}{name of the help topic to search for.}

\item{stage}{at which stage ("build", "install", or "render") should
\\Sexpr macros be executed? This is only important if you're using
\\Sexpr macros in your Rd files.}

\item{type}{type of output to produce: \code{"html"} or \code{"text"}.
Defaults to your default documentation type.}
}
\description{
Note that this only renders a single documentation file, so that links
to other files within the package won't work.
} \examples{ \dontrun{ library("ggplot2") help("ggplot") # loads installed documentation for ggplot load_all("ggplot2") dev_help("ggplot") # loads development documentation for ggplot } } devtools/man/run_pkg_hook.Rd0000644000176200001440000000100313200623656015635 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/run-loadhooks.r \name{run_pkg_hook} \alias{run_pkg_hook} \alias{run_user_hook} \title{Run user and package hooks.} \usage{ run_pkg_hook(pkg, hook) run_user_hook(pkg, hook) } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{hook}{hook name: one of "load", "unload", "attach", or "detach"} } \description{ Run user and package hooks. } \keyword{internal} devtools/man/release.Rd0000644000176200001440000000372013200623655014577 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/release.r \name{release} \alias{release} \title{Release package to CRAN.} \usage{ release(pkg = ".", check = TRUE, args = NULL, spelling = "en_US") } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{check}{if \code{TRUE}, run checking, otherwise omit it. This is useful if you've just checked your package and you're ready to release it.} \item{args}{An optional character vector of additional command line arguments to be passed to \code{R CMD build}.} \item{spelling}{language or dictionary file to spell check documentation. See \code{\link{spell_check}}. Set to \code{NULL} to skip spell checking.} } \description{ Run automated and manual tests, then ftp to CRAN. 
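As a sketch of the extra-question hook mentioned in the details, a package can define an un-exported `release_questions()` function that `release()` calls; the questions below are placeholders, and the exact prompting behaviour is an assumption, not the authoritative spec:

```r
# Define this (un-exported) in your package's R/ directory. release()
# asks each question in turn before submitting; the question text here
# is purely illustrative.
release_questions <- function() {
  c(
    "Have you updated the vignettes?",
    "Have you re-run the reverse dependency checks?"
  )
}
```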
}
\details{
The package release process will:

\itemize{
\item Confirm that the package passes \code{R CMD check}
\item Ask if you've checked your code on win-builder
\item Confirm that the news is up-to-date
\item Confirm that DESCRIPTION is ok
\item Ask if you've checked packages that depend on your package
\item Build the package
\item Submit the package to CRAN, using comments in "cran-comments.md"
}

You can also add arbitrary extra questions by defining an (un-exported)
function called \code{release_questions()} that returns a character
vector of additional questions to ask.

You also need to read the CRAN repository policy at
\url{https://cran.r-project.org/web/packages/policies.html} and make
sure you're in line with the policies. \code{release} tries to automate
as many of the policies as possible, but it's impossible to be
completely comprehensive, and they do change between releases of
devtools.
}
\section{Guarantee}{

If a devtools bug causes one of the CRAN maintainers to treat you
impolitely, I will personally send you a handwritten apology note.
Please forward me the email and your address, and I'll get a card in
the mail.
}
devtools/man/find_topic.Rd0000644000176200001440000000072413200623655015276 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/topic-index.r
\name{find_topic}
\alias{find_topic}
\title{Find the Rd file that documents a topic.}
\usage{
find_topic(topic)
}
\arguments{
\item{topic}{The topic, a string.}
}
\value{
A named string. The value gives the path to the file; the name gives
the path to the package.
}
\description{
Only packages loaded by devtools are searched.
}
\examples{
find_topic("help")
}
\keyword{internal}
devtools/man/github_refs.Rd0000644000176200001440000000072313171407310015453 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/install-github.r
\name{github_pull}
\alias{github_pull}
\alias{github_release}
\title{GitHub references}
\usage{
github_pull(pull)

github_release()
}
\arguments{
\item{pull}{The pull request to install}
}
\description{
Use as the \code{ref} parameter to \code{\link{install_github}}. Allows
installing a specific pull request or the latest release.
}
\seealso{
\code{\link{install_github}}
}
devtools/man/setup_rtools.Rd0000644000176200001440000000212613200623656015721 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/rtools.r
\name{setup_rtools}
\alias{setup_rtools}
\alias{find_rtools}
\title{Find Rtools.}
\usage{
setup_rtools(cache = TRUE, debug = FALSE)
}
\arguments{
\item{cache}{if \code{TRUE}, will use a cached version of Rtools.}

\item{debug}{if \code{TRUE}, prints a lot of additional information to
help in debugging.}
}
\value{
Either a visible \code{TRUE} if Rtools is found, or an invisible
\code{FALSE} with a diagnostic \code{\link{message}}. As a side effect
the internal package variable \code{rtools_path} is updated with the
paths to the Rtools binaries.
}
\description{
To build binary packages on Windows, Rtools (found at
\url{https://cran.r-project.org/bin/windows/Rtools/}) needs to be on
the path. The default installation process does not add it, so this
script finds it (looking first on the path, then in the registry). It
also checks that the version of Rtools matches the version of R.
}
\section{Acknowledgements}{

This code borrows heavily from RStudio's code for finding Rtools.
Thanks JJ!
}
\keyword{internal}
devtools/man/dev_example.Rd0000644000176200001440000000110713200623655015445 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/dev-example.r
\name{dev_example}
\alias{dev_example}
\title{Run examples for an in-development function.}
\usage{
dev_example(topic)
}
\arguments{
\item{topic}{Name or topic (or name of Rd file) to run examples for}
}
\description{
Run examples for an in-development function.
}
\examples{
\dontrun{
# Runs installed example:
library("ggplot2")
example("ggplot")

# Runs development example:
load_all("ggplot2")
dev_example("ggplot")
}
}
\seealso{
Other example functions: \code{\link{run_examples}}
}
devtools/man/unload.Rd0000644000176200001440000000233713200623656014445 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/unload.r
\name{unload}
\alias{unload}
\title{Unload a package}
\usage{
unload(pkg = ".")
}
\arguments{
\item{pkg}{package description, can be path or package name. See
\code{\link{as.package}} for more information}
}
\description{
This function attempts to cleanly unload a package, including unloading
its namespace, deleting S4 class definitions and unloading any loaded
DLLs. Unfortunately S4 classes are not really designed to be cleanly
unloaded, and so we have to manually modify the class dependency graph
in order for it to work - this works in the cases we have tested, but
there may be others. Similarly, automated DLL unloading is best tested
for simple scenarios (particularly with \code{useDynLib(pkgname)}) and
may fail in other cases. If you do encounter a failure, please file a
bug report at \url{http://github.com/hadley/devtools/issues}.
} \examples{ \dontrun{ # Unload package that is in current directory unload(".") # Unload package that is in ./ggplot2/ unload("ggplot2/") # Can use inst() to find the path of an installed package # This will load and unload the installed ggplot2 package library(ggplot2) unload(inst("ggplot2")) } } devtools/man/use_news_md.Rd0000644000176200001440000000111113200623656015460 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/infrastructure.R \name{use_news_md} \alias{use_news_md} \title{Use NEWS.md} \usage{ use_news_md(pkg = ".") } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} } \description{ This creates \code{NEWS.md} from a template. } \seealso{ Other infrastructure: \code{\link{infrastructure}}, \code{\link{use_build_ignore}}, \code{\link{use_data_raw}}, \code{\link{use_data}}, \code{\link{use_package}}, \code{\link{use_readme_rmd}} } devtools/man/run_examples.Rd0000644000176200001440000000316013200623656015660 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/run-examples.r \name{run_examples} \alias{run_examples} \title{Run all examples in a package.} \usage{ run_examples(pkg = ".", start = NULL, show = TRUE, test = FALSE, run = TRUE, fresh = FALSE) } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{start}{Where to start running the examples: this can either be the name of \code{Rd} file to start with (with or without extensions), or a topic name. If omitted, will start with the (lexicographically) first file. This is useful if you have a lot of examples and don't want to rerun them every time you fix a problem.} \item{show}{if \code{TRUE}, code in \code{\\dontshow{}} will be commented out} \item{test}{if \code{TRUE}, code in \code{\\donttest{}} will be commented out. 
If \code{FALSE}, code in \code{\\testonly{}} will be commented out.} \item{run}{if \code{TRUE}, code in \code{\\dontrun{}} will be commented out.} \item{fresh}{if \code{TRUE}, will be run in a fresh R session. This has the advantage that there's no way the examples can depend on anything in the current session, but interactive code (like \code{\link{browser}}) won't work.} } \description{ One of the most frustrating parts of `R CMD check` is getting all of your examples to pass - whenever one fails you need to fix the problem and then restart the whole process. This function makes it a little easier by making it possible to run all examples from an R function. } \seealso{ Other example functions: \code{\link{dev_example}} } \keyword{programming} devtools/man/use_data.Rd0000644000176200001440000000255613200623656014753 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/infrastructure.R \name{use_data} \alias{use_data} \title{Use data in a package.} \usage{ use_data(..., pkg = ".", internal = FALSE, overwrite = FALSE, compress = "bzip2") } \arguments{ \item{...}{Unquoted names of existing objects to save.} \item{pkg}{Package where to store data. Defaults to package in working directory.} \item{internal}{If \code{FALSE}, saves each object in individual \code{.rda} files in the \code{data/} directory. These are available whenever the package is loaded. If \code{TRUE}, stores all objects in a single \code{R/sysdata.rda} file. These objects are only available within the package.} \item{overwrite}{By default, \code{use_data} will not overwrite existing files. If you really want to do so, set this to \code{TRUE}.} \item{compress}{Choose the type of compression used by \code{\link{save}}. Should be one of "gzip", "bzip2" or "xz".} } \description{ This function makes it easy to save package data in the correct format. 
}
\examples{
\dontrun{
x <- 1:10
y <- 1:100

use_data(x, y) # For external use
use_data(x, y, internal = TRUE) # For internal use
}
}
\seealso{
Other infrastructure: \code{\link{infrastructure}},
\code{\link{use_build_ignore}}, \code{\link{use_data_raw}},
\code{\link{use_news_md}}, \code{\link{use_package}},
\code{\link{use_readme_rmd}}
}
devtools/man/use_github_links.Rd0000644000176200001440000000204613200623656016516 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/infrastructure-git.R
\name{use_github_links}
\alias{use_github_links}
\title{Add GitHub links to DESCRIPTION.}
\usage{
use_github_links(pkg = ".", auth_token = github_pat(),
  host = "https://api.github.com")
}
\arguments{
\item{pkg}{Path to package. See \code{\link{as.package}} for more
information.}

\item{auth_token}{Provide a personal access token (PAT) from
\url{https://github.com/settings/tokens}. Defaults to the
\code{GITHUB_PAT} environment variable.}

\item{host}{GitHub API host to use. Override with the endpoint-root for
your GitHub enterprise instance, for example,
"https://github.hostname.com/api/v3".}
}
\description{
Populates the URL and BugReports fields of DESCRIPTION with
\code{https://github.com/<username>/<repo>} and
\code{https://github.com/<username>/<repo>/issues}, respectively,
unless those fields already exist.
} \seealso{ Other git infrastructure: \code{\link{use_git_hook}}, \code{\link{use_github}}, \code{\link{use_git}} } \keyword{internal} devtools/man/use_build_ignore.Rd0000644000176200001440000000211213200623656016470 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/infrastructure.R \name{use_build_ignore} \alias{use_build_ignore} \alias{add_build_ignore} \title{Add a file to \code{.Rbuildignore}} \usage{ use_build_ignore(files, escape = TRUE, pkg = ".") } \arguments{ \item{files}{Name of file.} \item{escape}{If \code{TRUE}, the default, will escape \code{.} to \code{\\.} and surround with \code{^} and \code{$}.} \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} } \value{ Nothing, called for its side effect. } \description{ \code{.Rbuildignore} has a regular expression on each line, but it's usually easier to work with specific file names. By default, will (crudely) turn a filename into a regular expression that will only match that path. Repeated entries will be silently removed. } \seealso{ Other infrastructure: \code{\link{infrastructure}}, \code{\link{use_data_raw}}, \code{\link{use_data}}, \code{\link{use_news_md}}, \code{\link{use_package}}, \code{\link{use_readme_rmd}} } \keyword{internal} devtools/man/test.Rd0000644000176200001440000000233513200623656014140 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/test.r \name{test} \alias{test} \alias{uses_testthat} \title{Execute all \pkg{test_that} tests in a package.} \usage{ test(pkg = ".", filter = NULL, ...) uses_testthat(pkg = ".") } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{filter}{If not \code{NULL}, only tests with file names matching this regular expression will be executed. 
Matching is performed on the file name after it has been stripped of
\code{"test-"} and \code{".R"}.}

\item{...}{additional arguments passed to
\code{\link[testthat]{test_dir}}}
}
\description{
Tests are assumed to be located in either the \code{inst/tests/} or
\code{tests/testthat} directory (the latter is recommended). See
\code{\link[testthat]{test_dir}} for the naming convention of test
scripts within one of those directories and
\code{\link[testthat]{test_check}} for the folder structure conventions.
}
\details{
If no testing infrastructure is present (detected by the
\code{uses_testthat} function), you'll be asked if you want devtools to
create it for you (in interactive sessions only). See
\code{\link{use_test}} for more details.
}
devtools/man/has_devel.Rd0000644000176200001440000000061013200623655015104 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/has-devel.r
\name{has_devel}
\alias{has_devel}
\title{Check if you have a development environment installed.}
\usage{
has_devel()
}
\value{
TRUE if your development environment is correctly set up, otherwise an
error is thrown.
}
\description{
Thanks to the suggestion of Simon Urbanek.
}
\examples{
has_devel()
}
devtools/man/clean_vignettes.Rd0000644000176200001440000000074113171407310016324 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/vignettes.r
\name{clean_vignettes}
\alias{clean_vignettes}
\title{Clean built vignettes.}
\usage{
clean_vignettes(pkg = ".")
}
\arguments{
\item{pkg}{package description, can be path or package name. See
\code{\link{as.package}} for more information}
}
\description{
This uses a fairly rudimentary algorithm where any files in
\file{inst/doc} with a name that exists in \file{vignettes} are removed.
}
devtools/man/eval_clean.Rd0000644000176200001440000000152413200623655015250 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/clean.r
\name{eval_clean}
\alias{eval_clean}
\alias{evalq_clean}
\title{Evaluate code in a clean R session.}
\usage{
eval_clean(expr, quiet = TRUE)

evalq_clean(expr, quiet = TRUE)
}
\arguments{
\item{expr}{an R expression to evaluate. For \code{eval_clean} this
should already be quoted. For \code{evalq_clean} it will be quoted for
you.}

\item{quiet}{if \code{TRUE}, the default, only the final result and any
explicitly printed output will be displayed. If \code{FALSE}, all input
and output will be displayed, as if you'd copied and pasted the code.}
}
\value{
An invisible \code{TRUE} on success.
}
\description{
Evaluate code in a clean R session.
}
\examples{
x <- 1
y <- 2
ls()
evalq_clean(ls())
evalq_clean(ls(), FALSE)
eval_clean(quote({
  z <- 1
  ls()
}))
}
devtools/man/devtools-deprecated.Rd0000644000176200001440000000264213200623655017116 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/with.r, R/zzz.r
\name{in_dir}
\alias{in_dir}
\alias{with_collate}
\alias{with_envvar}
\alias{with_lib}
\alias{with_libpaths}
\alias{with_locale}
\alias{with_makevars}
\alias{with_options}
\alias{with_par}
\alias{with_path}
\alias{devtools-deprecated}
\title{Deprecated Functions}
\usage{
in_dir(new, code)

with_collate(new, code)

with_envvar(new, code, action = "replace")

with_lib(new, code)

with_libpaths(new, code)

with_locale(new, code)

with_makevars(new, code, path = file.path("~", ".R", "Makevars"))

with_options(new, code)

with_par(new, code)

with_path(new, code, add = TRUE)
}
\description{
These functions are deprecated in this release of devtools; they will
be marked as defunct and removed in a future version.
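These deprecated helpers have direct counterparts in the withr package, which follows the same "temporarily change, run code, restore" pattern; a small illustration (withr is a separate package that must be installed):

```r
# withr provides the replacements for the deprecated devtools helpers,
# e.g. with_options() corresponds to withr::with_options():
withr::with_options(
  list(digits = 3),
  print(pi) # printed with the temporary option; it is restored afterwards
)
```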
}
\section{\code{in_dir}}{
working directory
}

\section{\code{with_collate}}{
collation order
}

\section{\code{with_envvar}}{
environment variables
}

\section{\code{with_lib}}{
library paths, prepending to current libpaths
}

\section{\code{with_libpaths}}{
library paths, replacing current libpaths
}

\section{\code{with_locale}}{
any locale setting
}

\section{\code{with_makevars}}{
Temporarily change the contents of an existing Makevars file.
}

\section{\code{with_options}}{
options
}

\section{\code{with_par}}{
graphics parameters
}

\section{\code{with_path}}{
PATH environment variable
}
\keyword{internal}
devtools/man/dev_meta.Rd0000644000176200001440000000115013200623655014736 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/dev-meta.r
\name{dev_meta}
\alias{dev_meta}
\title{Return devtools metadata environment}
\usage{
dev_meta(name)
}
\arguments{
\item{name}{The name of a loaded package}
}
\description{
If the package was not loaded with devtools, returns \code{NULL}.
}
\examples{
dev_meta("stats") # NULL

if (has_tests()) {
# Load the test package in directory "testLoadHooks"
load_all(devtest("testLoadHooks"))

# Get metadata for the package
x <- dev_meta("testLoadHooks")
as.list(x)

# Clean up.
unload(devtest("testLoadHooks"))
}
}
\keyword{internal}
devtools/man/load_code.Rd0000644000176200001440000000071513200623655015071 0ustar liggesusers% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/load-code.r
\name{load_code}
\alias{load_code}
\title{Load R code.}
\usage{
load_code(pkg = ".")
}
\arguments{
\item{pkg}{package description, can be path or package name. See
\code{\link{as.package}} for more information}
}
\description{
Load all R code in the \code{R} directory. The first time the code is
loaded, \code{.onLoad} will be run if it exists.
} \keyword{programming} devtools/man/help.Rd0000644000176200001440000000377213200623655014116 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/dev-help.r \name{help} \alias{help} \alias{shim_help} \alias{?} \alias{shim_question} \title{Drop-in replacements for help and ? functions} \usage{ # help(topic, package = NULL, ...) # ?e2 # e1?e2 } \arguments{ \item{topic}{A name or character string specifying the help topic.} \item{package}{A name or character string specifying the package in which to search for the help topic. If NULL, search all packages.} \item{...}{Additional arguments to pass to \code{\link[utils]{help}}.} \item{e1}{First argument to pass along to \code{utils::`?`}.} \item{e2}{Second argument to pass along to \code{utils::`?`}.} } \description{ The \code{?} and \code{help} functions are replacements for functions of the same name in the utils package. They are made available when a package is loaded with \code{\link{load_all}}. } \details{ The \code{?} function is a replacement for \code{\link[utils]{?}} from the utils package. It will search for help in devtools-loaded packages first, then in regular packages. The \code{help} function is a replacement for \code{\link[utils]{help}} from the utils package. If \code{package} is not specified, it will search for help in devtools-loaded packages first, then in regular packages. If \code{package} is specified, then it will search for help in devtools-loaded packages or regular packages, as appropriate. } \examples{ \dontrun{ # This would load devtools and look at the help for load_all, if currently # in the devtools source directory. 
load_all() ?load_all help("load_all") } # To see the help pages for utils::help and utils::`?`: help("help", "utils") help("?", "utils") \dontrun{ # Examples demonstrating the multiple ways of supplying arguments # NB: you can't do pkg <- "ggplot2"; help("ggplot2", pkg) help(lm) help(lm, stats) help(lm, 'stats') help('lm') help('lm', stats) help('lm', 'stats') help(package = stats) help(package = 'stats') topic <- "lm" help(topic) help(topic, stats) help(topic, 'stats') } } devtools/man/lint.Rd0000644000176200001440000000174613171407310014126 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/lint.r \name{lint} \alias{lint} \title{Lint all source files in a package.} \usage{ lint(pkg = ".", cache = TRUE, ...) } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{cache}{store the lint results so repeated lints of the same content use the previous results.} \item{...}{additional arguments passed to \code{\link[lintr]{lint_package}}} } \description{ The default lintings correspond to the style guide at \url{http://r-pkgs.had.co.nz/r.html#style}, however it is possible to override any or all of them using the \code{linters} parameter. } \details{ The lintr cache is by default stored in \code{~/.R/lintr_cache/} (this can be configured by setting \code{options(lintr.cache_directory)}). It can be cleared by calling \code{\link[lintr]{clear_cache}}. 
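For instance, a full fresh lint run might look like the following sketch; any extra arguments are forwarded to `lintr::lint_package()`:

```r
# Lint the package in the working directory, ignoring any cached results
# so every file is re-examined from scratch.
lint(pkg = ".", cache = FALSE)
```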
} \seealso{ \code{\link[lintr]{lint_package}}, \code{\link[lintr]{lint}} } devtools/man/revdep_email.Rd0000644000176200001440000000201213200623656015605 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/revdep-email.R \name{revdep_email} \alias{revdep_email} \title{Experimental email notification system.} \usage{ revdep_email(pkg = ".", date, version, author = getOption("devtools.name"), draft = TRUE, unsent = NULL, template = "revdep/email.md", only_problems = TRUE) } \arguments{ \item{pkg}{Path to package. Defaults to current directory.} \item{date}{Date package will be submitted to CRAN} \item{version}{Version which will be used for the CRAN submission (usually different from the current package version)} \item{author}{Name used to sign email} \item{draft}{If \code{TRUE}, creates as draft email; if \code{FALSE}, sends immediately.} \item{unsent}{If some emails fail to send, in a previous} \item{template}{Path of template to use} \item{only_problems}{Only inform authors with problems?} } \description{ This currently assumes that you use github and gmail, and you have a \code{revdep/email.md} email template. } \keyword{internal} devtools/man/use_readme_rmd.Rd0000644000176200001440000000221113200623656016125 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/infrastructure.R \name{use_readme_rmd} \alias{use_readme_rmd} \alias{use_readme_md} \title{Create README files.} \usage{ use_readme_rmd(pkg = ".") use_readme_md(pkg = ".") } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} } \description{ Creates skeleton README files with sections for \itemize{ \item a high-level description of the package and its goals \item R code to install from GitHub, if GitHub usage detected \item a basic example } Use \code{Rmd} if you want a rich intermingling of code and data. Use \code{md} for a basic README. 
\code{README.Rmd} will be automatically added to \code{.Rbuildignore}. The resulting README is populated with default YAML frontmatter and R fenced code blocks (\code{md}) or chunks (\code{Rmd}). } \examples{ \dontrun{ use_readme_rmd() use_readme_md() } } \seealso{ Other infrastructure: \code{\link{infrastructure}}, \code{\link{use_build_ignore}}, \code{\link{use_data_raw}}, \code{\link{use_data}}, \code{\link{use_news_md}}, \code{\link{use_package}} } devtools/man/use_package.Rd0000644000176200001440000000164613200623656015434 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/infrastructure.R \name{use_package} \alias{use_package} \title{Use specified package.} \usage{ use_package(package, type = "Imports", pkg = ".") } \arguments{ \item{package}{Name of package to depend on.} \item{type}{Type of dependency: must be one of "Imports", "Depends", "Suggests", "Enhances", or "LinkingTo" (or unique abbreviation)} \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information.} } \description{ This adds a dependency to DESCRIPTION and offers a little advice about how to best use it. } \examples{ \dontrun{ use_package("ggplot2") use_package("dplyr", "suggests") } } \seealso{ Other infrastructure: \code{\link{infrastructure}}, \code{\link{use_build_ignore}}, \code{\link{use_data_raw}}, \code{\link{use_data}}, \code{\link{use_news_md}}, \code{\link{use_readme_rmd}} } devtools/man/install_url.Rd0000644000176200001440000000226613200623655015513 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/install-url.r \name{install_url} \alias{install_url} \title{Install a package from a url} \usage{ install_url(url, subdir = NULL, config = list(), ..., quiet = FALSE) } \arguments{ \item{url}{location of package on internet. 
The url should point to a zip file, a tar file or a bzipped/gzipped tar file.} \item{subdir}{subdirectory within url bundle that contains the R package.} \item{config}{additional configuration argument (e.g. proxy, authentication) passed on to \code{\link[httr]{GET}}.} \item{...}{Other arguments passed on to \code{\link{install}}.} \item{quiet}{if \code{TRUE} suppresses output from this function.} } \description{ This function is vectorised so you can install multiple packages in a single command. } \examples{ \dontrun{ install_url("https://github.com/hadley/stringr/archive/master.zip") } } \seealso{ Other package installation: \code{\link{install_bioc}}, \code{\link{install_bitbucket}}, \code{\link{install_cran}}, \code{\link{install_github}}, \code{\link{install_git}}, \code{\link{install_svn}}, \code{\link{install_version}}, \code{\link{install}}, \code{\link{uninstall}} } devtools/man/system_output.Rd0000644000176200001440000000153513171407310016120 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/system.r \name{system_output} \alias{system_output} \title{Run a system command and capture the output.} \usage{ system_output(cmd, args = character(), env_vars = character(), path = ".", quiet = FALSE, ...) } \arguments{ \item{cmd}{Command to run. Will be quoted by \code{\link{shQuote}()}.} \item{args}{A character vector of arguments.} \item{env_vars}{A named character vector of environment variables.} \item{path}{Path in which to execute the command} \item{quiet}{If \code{FALSE}, the command to be run will be echoed.} \item{...}{additional arguments passed to \code{\link[base]{system}}} } \value{ command output if the command succeeds, an error will be thrown if the command fails. } \description{ Run a system command and capture the output. 
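A hypothetical call to this internal helper, using the argument names from the usage above:

```r
# Run `git --version` in the current directory, echoing the command,
# and capture its standard output as a character vector.
out <- system_output("git", args = "--version", path = ".", quiet = FALSE)
```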
} \keyword{internal} devtools/man/load_imports.Rd0000644000176200001440000000064113200623655015652 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/imports-env.r \name{load_imports} \alias{load_imports} \title{Load all of the imports for a package} \usage{ load_imports(pkg = ".") } \description{ The imported objects are copied to the imports environment, and are not visible from R_GlobalEnv. This will automatically load (but not attach) the dependency packages. } \keyword{internal} devtools/man/build_win.Rd0000644000176200001440000000250713200623655015135 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/build.r \name{build_win} \alias{build_win} \title{Build Windows binary package.} \usage{ build_win(pkg = ".", version = c("R-release", "R-devel"), args = NULL, quiet = FALSE) } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{version}{directory to upload to on win-builder, controlling which version of R is used to build the package. Possible options are listed on \url{http://win-builder.r-project.org/}. Defaults to R-devel.} \item{args}{An optional character vector of additional command line arguments to be passed to \code{R CMD build} if \code{binary = FALSE}, or \code{R CMD install} if \code{binary = TRUE}.} \item{quiet}{if \code{TRUE} suppresses output from this function.} } \description{ This function works by bundling the source package and then uploading it to \url{http://win-builder.r-project.org/}. Once building is complete you'll receive a link to the built package at the email address listed in the maintainer field. It usually takes around 30 minutes. As a side effect, win-builder also runs \code{R CMD check} on the package, so \code{build_win} is also useful for checking that your package is OK on Windows.
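A minimal sketch of the \code{build_win()} workflow described above; the package path is hypothetical, and the results arrive by email rather than as a return value.

```r
library(devtools)
# Submit the package at ~/mypkg (hypothetical path) to win-builder,
# building against R-devel. A link to the built package, plus the
# R CMD check results, are emailed to the address in the DESCRIPTION
# maintainer field; expect roughly a 30-minute wait.
build_win("~/mypkg", version = "R-devel")
```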
} \seealso{ Other build functions: \code{\link{build}} } devtools/man/check_failures.Rd0000644000176200001440000000140113171407310016113 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/check-results.R \name{check_failures} \alias{check_failures} \title{Parses R CMD check log file for ERRORs, WARNINGs and NOTEs} \usage{ check_failures(path, error = TRUE, warning = TRUE, note = TRUE) } \arguments{ \item{path}{check path, e.g., value of the \code{check_dir} argument in a call to \code{\link{check}}} \item{error, warning, note}{logical, indicates if errors, warnings and/or notes should be returned} } \value{ a character vector with the relevant messages; it can have length zero if no messages are found } \description{ Extracts check messages from the \code{00check.log} file generated by \code{R CMD check}. } \seealso{ \code{\link{check}}, \code{\link{revdep_check}} } devtools/man/spell_check.Rd0000644000176200001440000000170513200623656015435 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/spell-check.R \name{spell_check} \alias{spell_check} \title{Spell checking} \usage{ spell_check(pkg = ".", ignore = character(), dict = "en_US") } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{ignore}{character vector with words to ignore. See \code{\link[hunspell:hunspell]{hunspell}} for more information} \item{dict}{a dictionary object or language string. See \code{\link[hunspell:hunspell]{hunspell}} for more information} } \description{ Runs a spell check on text fields in the package description file and manual pages. Hunspell includes dictionaries for \code{en_US} and \code{en_GB} by default. Other languages require installation of a custom dictionary; see the \href{https://cran.r-project.org/package=hunspell/vignettes/intro.html#system_dictionaries}{hunspell vignette} for details.
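For instance, a typical \code{spell_check()} call might look like the following; the ignore words are illustrative project-specific jargon, not part of the original documentation.

```r
library(devtools)
# Spell-check the DESCRIPTION and help pages of the package in the
# current directory, using the US English dictionary and skipping
# a few illustrative jargon words.
spell_check(".", ignore = c("devtools", "roxygen"), dict = "en_US")
```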
} devtools/man/wd.Rd0000644000176200001440000000066113171407310013565 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/wd.r \name{wd} \alias{wd} \title{Set working directory.} \usage{ wd(pkg = ".", path = "") } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{path}{path within package. Leave empty to change working directory to package directory.} } \description{ Set working directory. } devtools/man/build.Rd0000644000176200001440000000324413200623655014257 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/build.r \name{build} \alias{build} \title{Build package.} \usage{ build(pkg = ".", path = NULL, binary = FALSE, vignettes = TRUE, manual = FALSE, args = NULL, quiet = FALSE) } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{path}{path in which to produce package. If \code{NULL}, defaults to the parent directory of the package.} \item{binary}{Produce a binary (\code{--binary}) or source ( \code{--no-manual --no-resave-data}) version of the package.} \item{vignettes, manual}{For source packages: if \code{FALSE}, don't build PDF vignettes (\code{--no-build-vignettes}) or manual (\code{--no-manual}).} \item{args}{An optional character vector of additional command line arguments to be passed to \code{R CMD build} if \code{binary = FALSE}, or \code{R CMD install} if \code{binary = TRUE}.} \item{quiet}{if \code{TRUE} suppresses output from this function.} } \value{ a string giving the location (including file name) of the built package } \description{ Building converts a package source directory into a single bundled file. 
If \code{binary = FALSE} this creates a \code{tar.gz} package that can be installed on any platform, provided it has a full development environment (although packages without source code can typically be installed out of the box). If \code{binary = TRUE}, the package will have a platform-specific extension (e.g. \code{.zip} for Windows), and will only be installable on the current platform, but no development environment is needed. } \seealso{ Other build functions: \code{\link{build_win}} } devtools/man/check.Rd0000644000176200001440000000715713200623655014242 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/check.r \name{check} \alias{check} \alias{check_built} \title{Build and check a package, cleaning up automatically on success.} \usage{ check(pkg = ".", document = TRUE, build_args = NULL, ..., manual = FALSE, cran = TRUE, check_version = FALSE, force_suggests = FALSE, run_dont_test = FALSE, args = NULL, env_vars = NULL, quiet = FALSE, check_dir = tempdir(), cleanup = TRUE) check_built(path = NULL, cran = TRUE, check_version = FALSE, force_suggests = FALSE, run_dont_test = FALSE, manual = FALSE, args = NULL, env_vars = NULL, check_dir = tempdir(), quiet = FALSE) } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{document}{if \code{TRUE} (the default), will update and check documentation before running formal check.} \item{build_args}{Additional arguments passed to \code{R CMD build}} \item{...}{Additional arguments passed on to \code{\link{build}()}.} \item{manual}{If \code{FALSE}, don't build and check manual (\code{--no-manual}).} \item{cran}{if \code{TRUE} (the default), check using the same settings as CRAN uses.} \item{check_version}{Sets \code{_R_CHECK_CRAN_INCOMING_} env var. If \code{TRUE}, performs a number of checks related to version numbers of packages on CRAN.} \item{force_suggests}{Sets \code{_R_CHECK_FORCE_SUGGESTS_}.
If \code{FALSE} (the default), check will proceed even if all suggested packages aren't found.} \item{run_dont_test}{Sets \code{--run-donttest} so that tests surrounded by \code{\\donttest\{\}} are also tested. This is important for CRAN submission.} \item{args}{Additional arguments passed to \code{R CMD check}} \item{env_vars}{Environment variables set during \code{R CMD check}} \item{quiet}{if \code{TRUE} suppresses output from this function.} \item{check_dir}{the directory in which the package is checked} \item{cleanup}{Deprecated.} \item{path}{Path to built package.} } \value{ An object containing errors, warnings, and notes. } \description{ \code{check} automatically builds and checks a source package, using all known best practices. \code{check_built} checks an already built package. } \details{ Passing \code{R CMD check} is essential if you want to submit your package to CRAN: you must not have any ERRORs or WARNINGs, and you want to ensure that there are as few NOTEs as possible. If you are not submitting to CRAN, at least ensure that there are no ERRORs or WARNINGs: these typically represent serious problems. \code{check} automatically builds a package before calling \code{check_built} as this is the recommended way to check packages. Note that this process runs in an independent realisation of R, so nothing in your current workspace will affect the process. } \section{Environment variables}{ Devtools does its best to set up an environment that combines best practices with how check works on CRAN. This includes: \itemize{ \item The standard environment variables set by devtools: \code{\link{r_env_vars}}. Of particular note for package tests is the \code{NOT_CRAN} env var which lets you know that your tests are not running on CRAN, and hence can take a reasonable amount of time. \item Debugging flags for the compiler, set by \code{\link{compiler_flags}(FALSE)}. \item If \code{aspell} is found \code{_R_CHECK_CRAN_INCOMING_USE_ASPELL_} is set to \code{TRUE}.
If no spell checker is installed, a warning is issued. \item env vars set by arguments \code{check_version} and \code{force_suggests} } } \seealso{ \code{\link{release}} if you want to send the checked package to CRAN. } devtools/man/install_bioc.Rd0000644000176200001440000000277413172203511015622 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/install-bioc.r \name{install_bioc} \alias{install_bioc} \title{Install a package from a Bioconductor repository} \usage{ install_bioc(repo, mirror = getOption("BioC_svn", "https://hedgehog.fhcrc.org/bioconductor"), ..., quiet = FALSE) } \arguments{ \item{repo}{Repository address in the format \code{[username:password@][release/]repo[#revision]}. Valid values for the release are \sQuote{devel} (the default if none specified), \sQuote{release} or numeric release numbers (e.g. \sQuote{3.3}).} \item{mirror}{The Bioconductor SVN mirror to use} \item{...}{Other arguments passed on to \code{\link{install}}} \item{quiet}{if \code{TRUE} suppresses output from this function.} } \description{ This function requires \code{svn} to be installed on your system in order to be used. } \details{ It is vectorised so you can install multiple packages with a single command.
} \examples{ \dontrun{ install_bioc("SummarizedExperiment") install_bioc("user@SummarizedExperiment") install_bioc("user:password@release/SummarizedExperiment") install_bioc("user:password@3.3/SummarizedExperiment") install_bioc("user:password@3.3/SummarizedExperiment#117513") } } \seealso{ Other package installation: \code{\link{install_bitbucket}}, \code{\link{install_cran}}, \code{\link{install_github}}, \code{\link{install_git}}, \code{\link{install_svn}}, \code{\link{install_url}}, \code{\link{install_version}}, \code{\link{install}}, \code{\link{uninstall}} } devtools/man/session_info.Rd0000644000176200001440000000153013171407310015645 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/session-info.r \name{session_info} \alias{session_info} \title{Print session information} \usage{ session_info(pkgs = NULL, include_base = FALSE) } \arguments{ \item{pkgs}{Either a vector of package names or NULL. If \code{NULL}, displays all loaded packages. If a character vector, it also includes all dependencies of the packages.} \item{include_base}{Include base packages in summary? By default this is false since base packages should always match the R version.} } \description{ This is \code{\link{sessionInfo}()} re-written from scratch to both exclude data that's rarely useful (e.g., the full collate string or base packages loaded) and include stuff you'd like to know (e.g., where a package was installed from).
} \examples{ session_info() session_info("devtools") } devtools/man/r_env_vars.Rd0000644000176200001440000000120413171407310015311 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/R.r \name{r_env_vars} \alias{r_env_vars} \title{Environment variables to set when calling R} \usage{ r_env_vars() } \value{ a named character vector } \description{ Devtools sets a number of environment variables to ensure consistency between the current R session and the new session, and to ensure that everything behaves the same across systems. It also suppresses a common warning on Windows, and sets \code{NOT_CRAN} so you can tell that your code is not running on CRAN. If \code{NOT_CRAN} has been set externally, it is not overwritten. } \keyword{internal} devtools/man/parse_deps.Rd0000644000176200001440000000125713200623655015307 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/package-deps.r \name{parse_deps} \alias{parse_deps} \title{Parse package dependency strings.} \usage{ parse_deps(string) } \arguments{ \item{string}{to parse. Should look like \code{"R (>= 3.0), ggplot2"} etc.} } \value{ list of two character vectors: \code{name} package names, and \code{version} package versions. If version is not specified, it will be stored as NA. } \description{ Parse package dependency strings. } \examples{ parse_deps("httr (< 2.1),\\nRCurl (>= 3)") # only package dependencies are returned parse_deps("utils (== 2.12.1),\\ntools,\\nR (>= 2.10),\\nmemoise") } \keyword{internal} devtools/man/uninstall.Rd0000644000176200001440000000220013172203511015153 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/uninstall.r \name{uninstall} \alias{uninstall} \title{Uninstall a local development package.} \usage{ uninstall(pkg = ".", unload = TRUE, quiet = FALSE, ...) } \arguments{ \item{pkg}{package description, can be path or package name.
See \code{\link{as.package}} for more information} \item{unload}{if \code{TRUE} (the default), will automatically unload the package prior to uninstalling.} \item{quiet}{if \code{TRUE} suppresses output from this function.} \item{...}{additional arguments passed to \code{\link{remove.packages}}.} } \description{ Uses \code{remove.packages} to uninstall the package. To uninstall a package from a non-default library, use \code{\link[withr]{with_libpaths}}. } \seealso{ \code{\link{with_debug}} to install packages with debugging flags set. Other package installation: \code{\link{install_bioc}}, \code{\link{install_bitbucket}}, \code{\link{install_cran}}, \code{\link{install_github}}, \code{\link{install_git}}, \code{\link{install_svn}}, \code{\link{install_url}}, \code{\link{install_version}}, \code{\link{install}} } devtools/man/install_deps.Rd0000644000176200001440000000350513200623655015641 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/install.r \name{install_deps} \alias{install_deps} \alias{install_dev_deps} \title{Install package dependencies if needed.} \usage{ install_deps(pkg = ".", dependencies = NA, threads = getOption("Ncpus", 1), repos = getOption("repos"), type = getOption("pkgType"), ..., upgrade = TRUE, quiet = FALSE, force_deps = FALSE) install_dev_deps(pkg = ".", ...) } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{dependencies}{\code{logical} indicating to also install uninstalled packages which this \code{pkg} depends on/links to/suggests. See argument \code{dependencies} of \code{\link{install.packages}}.} \item{threads}{number of concurrent threads to use for installing dependencies. It defaults to the option \code{"Ncpus"} or \code{1} if unset.} \item{repos}{A character vector giving repositories to use.} \item{type}{Type of package to \code{update}.
If "both", will switch automatically to "binary" to avoid interactive prompts during package installation.} \item{...}{additional arguments passed to \code{\link{install.packages}}.} \item{upgrade}{If \code{TRUE}, also upgrade any out-of-date dependencies.} \item{quiet}{if \code{TRUE} suppresses output from this function.} \item{force_deps}{whether to force installation of dependencies even if their SHA1 reference hasn't changed from the currently installed version.} } \description{ \code{install_deps} is used by \code{install_*} to make sure you have all the dependencies for a package. \code{install_dev_deps()} is useful if you have a source version of the package and want to be able to develop with it: it installs all dependencies of the package, and it also installs roxygen2. } \examples{ \dontrun{install_deps(".")} } devtools/man/RCMD.Rd0000644000176200001440000000127013200623655013702 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/R.r \name{RCMD} \alias{RCMD} \title{Run R CMD xxx from within R} \usage{ RCMD(cmd, options, path = tempdir(), env_vars = character(), ...) } \arguments{ \item{cmd}{one of the R tools available from the R CMD interface.} \item{options}{a character vector of options to pass to the command} \item{path}{the directory to run the command in.} \item{env_vars}{environment variables to set before running the command.} \item{...}{additional arguments passed to \code{\link{system_check}}} } \value{ \code{TRUE} if the command succeeds; throws an error if the command fails. } \description{ Run R CMD xxx from within R } devtools/man/pkg_env.Rd0000644000176200001440000000216413200623655014611 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/package-env.r \name{pkg_env} \alias{pkg_env} \title{Return package environment} \usage{ pkg_env(pkg = ".") } \arguments{ \item{pkg}{package description, can be path or package name.
See \code{\link{as.package}} for more information} } \description{ This is an environment like \code{<package:pkgname>}. The package environment contains the exported objects from a package. It is attached, so it is an ancestor of \code{R_GlobalEnv}. } \details{ When a package is loaded the normal way, using \code{\link{library}}, this environment contains only the exported objects from the namespace. However, when loaded with \code{\link{load_all}}, this environment will contain all the objects from the namespace, unless \code{load_all} is used with \code{export_all=FALSE}. If the package is not attached, this function returns \code{NULL}. } \seealso{ \code{\link{ns_env}} for the namespace environment that contains all the objects (exported and not exported). \code{\link{imports_env}} for the environment that contains imported objects for the package. } \keyword{internal} devtools/man/devtest.Rd0000644000176200001440000000070613200623655014636 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/test.r \name{devtest} \alias{devtest} \title{Return the path to one of the packages in the devtools test dir} \usage{ devtest(package) } \arguments{ \item{package}{Name of the test package.} } \description{ Devtools comes with some simple packages for testing. This function returns the path to them. } \examples{ if (has_tests()) { devtest("testData") } } \keyword{internal} devtools/man/has_tests.Rd0000644000176200001440000000040413171407310015143 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/has-tests.r \name{has_tests} \alias{has_tests} \title{Was devtools installed with tests?} \usage{ has_tests() } \description{ Was devtools installed with tests?
} \keyword{internal} devtools/man/install_github.Rd0000644000176200001440000000570113200623655016170 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/install-github.r \name{install_github} \alias{install_github} \title{Attempts to install a package directly from GitHub.} \usage{ install_github(repo, username = NULL, ref = "master", subdir = NULL, auth_token = github_pat(quiet), host = "https://api.github.com", quiet = FALSE, ...) } \arguments{ \item{repo}{Repository address in the format \code{username/repo[/subdir][@ref|#pull]}. Alternatively, you can specify \code{subdir} and/or \code{ref} using the respective parameters (see below); if both are specified, the values in \code{repo} take precedence.} \item{username}{User name. Deprecated: please include username in the \code{repo}} \item{ref}{Desired git reference. Could be a commit, tag, or branch name, or a call to \code{\link{github_pull}}. Defaults to \code{"master"}.} \item{subdir}{subdirectory within repo that contains the R package.} \item{auth_token}{To install from a private repo, generate a personal access token (PAT) in \url{https://github.com/settings/tokens} and supply to this argument. This is safer than using a password because you can easily delete a PAT without affecting any others. Defaults to the \code{GITHUB_PAT} environment variable.} \item{host}{GitHub API host to use. Override with your GitHub enterprise hostname, for example, \code{"github.hostname.com/api/v3"}.} \item{quiet}{if \code{TRUE} suppresses output from this function.} \item{...}{Other arguments passed on to \code{\link{install}}.} } \description{ This function is vectorised on \code{repo} so you can install multiple packages in a single command. } \details{ Attempting to install from a source repository that uses submodules raises a warning. 
Because the zipped sources provided by GitHub do not include submodules, this may lead to unexpected behaviour or compilation failure in source packages. In this case, cloning the repository manually using \code{\link{install_git}} with \code{args="--recursive"} may yield better results. } \examples{ \dontrun{ install_github("klutometis/roxygen") install_github("wch/ggplot2") install_github(c("rstudio/httpuv", "rstudio/shiny")) install_github(c("hadley/httr@v0.4", "klutometis/roxygen#142", "mfrasca/r-logging/pkg")) # Update devtools to the latest version, on Linux and Mac # On Windows, this won't work - see ?build_github_devtools install_github("hadley/devtools") # To install from a private repo, use auth_token with a token # from https://github.com/settings/tokens. You only need the # repo scope. Best practice is to save your PAT in env var called # GITHUB_PAT. install_github("hadley/private", auth_token = "abc") } } \seealso{ \code{\link{github_pull}} Other package installation: \code{\link{install_bioc}}, \code{\link{install_bitbucket}}, \code{\link{install_cran}}, \code{\link{install_git}}, \code{\link{install_svn}}, \code{\link{install_url}}, \code{\link{install_version}}, \code{\link{install}}, \code{\link{uninstall}} } devtools/man/source_url.Rd0000644000176200001440000000262513171407310015337 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/run-source.r \name{source_url} \alias{source_url} \title{Run a script through some protocols such as http, https, ftp, etc.} \usage{ source_url(url, ..., sha1 = NULL) } \arguments{ \item{url}{url} \item{...}{other options passed to \code{\link{source}}} \item{sha1}{The (prefix of the) SHA-1 hash of the file at the remote URL.} } \description{ If a SHA-1 hash is specified with the \code{sha1} argument, then this function will check the SHA-1 hash of the downloaded file to make sure it matches the expected value, and throw an error if it does not match. 
If the SHA-1 hash is not specified, it will print a message displaying the hash of the downloaded file. The purpose of this is to improve security when running remotely-hosted code; if you have a hash of the file, you can be sure that it has not changed. For convenience, it is possible to use a truncated SHA-1 hash, down to 6 characters, but keep in mind that a truncated hash won't be as secure as the full hash. } \examples{ \dontrun{ source_url("https://gist.github.com/hadley/6872663/raw/hi.r") # With a hash, to make sure the remote file hasn't changed source_url("https://gist.github.com/hadley/6872663/raw/hi.r", sha1 = "54f1db27e60bb7e0486d785604909b49e8fef9f9") # With a truncated hash source_url("https://gist.github.com/hadley/6872663/raw/hi.r", sha1 = "54f1db27e60") } } devtools/man/imports_env.Rd0000644000176200001440000000135513200623655015526 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/imports-env.r \name{imports_env} \alias{imports_env} \title{Return imports environment for a package} \usage{ imports_env(pkg = ".") } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information.} } \description{ Contains objects imported from other packages. Is the parent of the package namespace environment, and is a child of \code{<namespace:base>}, which is a child of \code{R_GlobalEnv}. } \seealso{ \code{\link{ns_env}} for the namespace environment that contains all the objects (exported and not exported). \code{\link{pkg_env}} for the attached environment that contains the exported objects. } \keyword{internal} devtools/man/git_checks.Rd0000644000176200001440000000071013171407310015251 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/check-git.r \name{git_checks} \alias{git_checks} \title{Git checks.} \usage{ git_checks(pkg = ".") } \arguments{ \item{pkg}{package description, can be path or package name.
See \code{\link{as.package}} for more information.} } \description{ This function performs Git checks prior to release. It is called automatically by \code{\link{release}()}. } \keyword{internal} devtools/man/github_pat.Rd0000644000176200001440000000046413171407310015302 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/github.R \name{github_pat} \alias{github_pat} \title{Retrieve Github personal access token.} \usage{ github_pat(quiet = FALSE) } \description{ A GitHub personal access token. Looks in the env var \code{GITHUB_PAT}. } \keyword{internal} devtools/man/install_svn.Rd0000644000176200001440000000276413200623655015516 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/install-svn.r \name{install_svn} \alias{install_svn} \title{Install a package from an SVN repository} \usage{ install_svn(url, subdir = NULL, branch = NULL, args = character(0), ..., revision = NULL, quiet = FALSE) } \arguments{ \item{url}{Location of package. The url should point to a public or private repository.} \item{subdir}{A sub-directory within an svn repository that may contain the package we are interested in installing. By default, this points to the 'trunk' directory.} \item{branch}{Name of branch or tag to use, if not trunk.} \item{args}{A character vector providing extra arguments to pass on to} \item{...}{Other arguments passed on to \code{\link{install}}} \item{revision}{svn revision, if omitted updates to latest} \item{quiet}{if \code{TRUE} suppresses output from this function.} } \description{ This function requires \code{svn} to be installed on your system in order to be used. } \details{ It is vectorised so you can install multiple packages with a single command.
} \examples{ \dontrun{ install_svn("https://github.com/hadley/stringr") install_svn("https://github.com/hadley/httr", branch = "oauth") } } \seealso{ Other package installation: \code{\link{install_bioc}}, \code{\link{install_bitbucket}}, \code{\link{install_cran}}, \code{\link{install_github}}, \code{\link{install_git}}, \code{\link{install_url}}, \code{\link{install_version}}, \code{\link{install}}, \code{\link{uninstall}} } devtools/man/clean_source.Rd0000644000176200001440000000115613200623655015622 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/clean.r \name{clean_source} \alias{clean_source} \title{Sources an R file in a clean environment.} \usage{ clean_source(path, quiet = FALSE) } \arguments{ \item{path}{path to R script} \item{quiet}{If \code{FALSE}, the default, all input and output will be displayed, as if you'd copied and pasted the code. If \code{TRUE}, only the final result and any explicitly printed output will be displayed.} } \description{ Opens up a fresh R environment and sources the file, ensuring that it works independently of the current working environment. } devtools/man/release_checks.Rd0000644000176200001440000000100013171407310016103 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/check-devtools.r \name{release_checks} \alias{release_checks} \title{Custom devtools release checks.} \usage{ release_checks(pkg = ".", built_path = NULL) } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information.} } \description{ This function performs additional checks prior to release. It is called automatically by \code{\link{release}()}.
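As an example of the \code{clean_source()} behaviour documented above; the script path here is hypothetical.

```r
library(devtools)
# Run the script in a fresh R session so that objects lying around
# in the current workspace cannot mask missing library() calls or
# undefined variables. The path "analysis/report.R" is hypothetical.
clean_source("analysis/report.R", quiet = TRUE)
```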
} \keyword{internal} devtools/man/system_check.Rd0000644000176200001440000000165113171407310015634 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/system.r \name{system_check} \alias{system_check} \title{Run a system command and check if it succeeds.} \usage{ system_check(cmd, args = character(), env_vars = character(), path = ".", quiet = FALSE, throw = TRUE, ...) } \arguments{ \item{cmd}{Command to run. Will be quoted by \code{\link{shQuote}()}.} \item{args}{A character vector of arguments.} \item{env_vars}{A named character vector of environment variables.} \item{path}{Path in which to execute the command} \item{quiet}{If \code{FALSE}, the command to be run will be echoed.} \item{throw}{If \code{TRUE}, will throw an error if the command fails (i.e. the return value is not 0).} \item{...}{additional arguments passed to \code{\link[base]{system}}} } \value{ The exit status of the command, invisibly. } \description{ Run a system command and check if it succeeds. 
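A sketch of how the internal \code{system_check()} helper might be used (accessed with \code{:::} since it is not exported); the example assumes \code{git} is installed and on the \code{PATH}.

```r
# Sketch: run an external command and insist that it succeeds.
# With throw = TRUE (the default), a failing command raises an error;
# with throw = FALSE, the nonzero exit status is returned instead.
status <- devtools:::system_check("git", args = c("--version"),
                                  quiet = TRUE)
# The exit status is returned invisibly; 0 indicates success.
```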
} \keyword{internal} devtools/man/check_dep_version.Rd0000644000176200001440000000112613200623655016627 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/package-deps.r \name{check_dep_version} \alias{check_dep_version} \title{Check that the version of an imported package satisfies the requirements} \usage{ check_dep_version(dep_name, dep_ver = NA, dep_compare = NA) } \arguments{ \item{dep_name}{The name of the package with objects to import} \item{dep_ver}{The version of the package} \item{dep_compare}{The comparison operator to use to check the version} } \description{ Check that the version of an imported package satisfies the requirements } \keyword{internal} devtools/man/bash.Rd0000644000176200001440000000055313171407310014070 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/bash.r \name{bash} \alias{bash} \title{Open bash shell in package directory.} \usage{ bash(pkg = ".") } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} } \description{ Open bash shell in package directory. } devtools/man/document.Rd0000644000176200001440000000171313171407310014770 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/document.r \name{document} \alias{document} \title{Use roxygen to document a package.} \usage{ document(pkg = ".", clean = NULL, roclets = NULL, reload = TRUE) } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} \item{clean, reload}{Deprecated.} \item{roclets}{Character vector of roclet names to use with package. This defaults to \code{NULL}, which will use the \code{roclets} fields in the list provided in the \code{Roxygen} DESCRIPTION field. 
If none are specified, defaults to \code{c("collate", "namespace", "rd")}.} } \description{ This function is a wrapper for the \code{\link[roxygen2]{roxygenize}()} function from the roxygen2 package. See the documentation and vignettes of that package to learn how to use roxygen. } \seealso{ \code{\link[roxygen2]{roxygenize}}, \code{browseVignettes("roxygen2")} } devtools/man/check_man.Rd0000644000176200001440000000136113171407310015061 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/check-doc.r \name{check_man} \alias{check_man} \title{Check documentation, as \code{R CMD check} does.} \usage{ check_man(pkg = ".") } \arguments{ \item{pkg}{package description, can be path or package name. See \code{\link{as.package}} for more information} } \value{ Nothing. This function is called purely for its side effects: if any problems are found, an error is thrown. } \description{ This function attempts to run the documentation-related checks in the same way that \code{R CMD check} does. Unfortunately it can't run them all because some tests require the package to be loaded, and the way they attempt to load the code conflicts with how devtools does it. } \examples{ \dontrun{ check_man("mypkg") } } devtools/man/create.Rd0000644000176200001440000000322313200623655014420 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/create.r \name{create} \alias{create} \alias{setup} \title{Creates a new package, following all devtools package conventions.} \usage{ create(path, description = getOption("devtools.desc"), check = FALSE, rstudio = TRUE, quiet = FALSE) setup(path = ".", description = getOption("devtools.desc"), check = FALSE, rstudio = TRUE, quiet = FALSE) } \arguments{ \item{path}{location to create new package. 
The last component of the path will be used as the package name.} \item{description}{list of description values to override default values or add additional values.} \item{check}{if \code{TRUE}, will automatically run \code{\link{check}}} \item{rstudio}{Create an RStudio project file? (with \code{\link{use_rstudio}})} \item{quiet}{if \code{FALSE}, the default, prints informative messages.} } \description{ Similar to \code{\link{package.skeleton}}, except that it only creates the standard devtools directory structures; it doesn't try to create source code and data files by inspecting the global environment. } \details{ \code{create} requires that the directory doesn't exist yet; it will be created but deleted upon failure. \code{setup} assumes an existing directory from which it will infer the package name. } \examples{ \dontrun{ # Create a package using all defaults: path <- file.path(tempdir(), "myDefaultPackage") create(path) # Override a description attribute. path <- file.path(tempdir(), "myCustomPackage") my_description <- list("Maintainer" = "'Yoni Ben-Meshulam' ") create(path, my_description) } } \seealso{ \code{\link{package.skeleton}} } devtools/man/package_file.Rd0000644000176200001440000000115413171407310015543 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/package.r \name{package_file} \alias{package_file} \title{Find file in a package.} \usage{ package_file(..., path = ".") } \arguments{ \item{...}{Components of the path.} \item{path}{Place to start search for package directory.} } \description{ It works by walking up the path until it finds the root directory, i.e. a directory containing \code{DESCRIPTION}. If it cannot find the root directory, or if it can't find the specified path, it will throw an error. 
} \examples{ \dontrun{ package_file("figures", "figure_1") } } devtools/man/install_local.Rd0000644000176200001440000000145213200623655015777 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/install-local.r \name{install_local} \alias{install_local} \title{Install a package from a local file} \usage{ install_local(path, subdir = NULL, ..., quiet = FALSE) } \arguments{ \item{path}{path to local directory, or compressed file (tar, zip, tar.gz, tar.bz2, tgz or tbz)} \item{subdir}{subdirectory within the bundle that contains the R package.} \item{...}{Other arguments passed on to \code{\link{install}}.} \item{quiet}{if \code{TRUE}, suppresses output from this function.} } \description{ This function is vectorised so you can install multiple packages in a single command. } \examples{ \dontrun{ dir <- tempfile() dir.create(dir) pkg <- download.packages("testthat", dir, type = "source") install_local(pkg[, 2]) } } devtools/man/dev_mode.Rd0000644000176200001440000000146313171407310014736 0ustar liggesusers% Generated by roxygen2: do not edit by hand % Please edit documentation in R/dev-mode.r \name{dev_mode} \alias{dev_mode} \title{Activate and deactivate development mode.} \usage{ dev_mode(on = NULL, path = getOption("devtools.path")) } \arguments{ \item{on}{turn dev mode on (\code{TRUE}) or off (\code{FALSE}). If omitted, will guess based on whether or not \code{path} is in \code{\link{.libPaths}}} \item{path}{directory to library.} } \description{ When activated, \code{dev_mode} creates a new library for storing installed packages. This new library is automatically created when \code{dev_mode} is activated if it does not already exist. This allows you to test development packages in a sandbox, without interfering with the other packages you have installed. } \examples{ \dontrun{ dev_mode() dev_mode() } }
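The \code{check_dep_version()} page above describes comparing an imported package's version against a requirement. A minimal base-R sketch of that kind of check, using only \code{utils::compareVersion()} — the \code{dep_ok()} helper name and its operator set are assumptions for illustration, not devtools' actual implementation:

```r
# Hypothetical helper sketching the comparison a dependency-version check
# performs. utils::compareVersion(a, b) returns -1, 0, or 1 depending on
# whether version string a is older than, equal to, or newer than b.
dep_ok <- function(dep_ver, required, compare = ">=") {
  cmp <- utils::compareVersion(dep_ver, required)
  switch(compare,
    ">=" = cmp >= 0,
    ">"  = cmp > 0,
    "==" = cmp == 0,
    "<=" = cmp <= 0,
    "<"  = cmp < 0,
    stop("unsupported comparison operator: ", compare)  # switch() default
  )
}

dep_ok("1.2.0", "1.0.0")   # version satisfies ">= 1.0.0", so TRUE
dep_ok("0.9", "1.0.0")     # too old, so FALSE
```

Using \code{compareVersion()} rather than string comparison matters because version components are numeric: \code{"1.10"} is newer than \code{"1.9"}, which a lexicographic comparison would get wrong.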